Tesla Fined £20K For Hiding Dangerous Drivers from Law

The UK system wasn’t designed for a world where the registered keeper wants to absorb unlimited convictions without meaningful consequence.

Enter Tesla, which obviously treats a £1,000 fine for refusing to identify a speeding driver as a privilege tax, or even a marketing event to promote the brand. If you lease the Swasticar through Tesla Financial Services and are recorded speeding, the company absorbs the fine and you do it again and again and again. That’s clearly worth something to drivers who would otherwise face points accumulation and potential disqualification.

Tesla has targeted (or created) a structural loophole where corporate liability substitutes for individual accountability. It blows right past a system that assumes registered keepers would identify drivers in order to avoid consequences.

To be fair, Tesla told the courts that it blames websites and “2nd class post” for its failure to respond. Maybe Elon Musk will soon propose replacing Royal Mail with his robots. Yes, eighteen cases across multiple police forces, all lost in the mail. Is it a stunt, a setup to propose privatizing the post so it can be opened and inspected by xAI for political interference and “intelligence” monetization? But I digress.

The £20,686 fine against Tesla averages roughly £1,150 per incident. One driver was clocked at 100mph on public roads. Another accumulated enough speeding offenses for license disqualification. Tesla enabled and then shielded these criminal drivers from any individual consequences, and pleaded guilty.
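
The per-incident average is simple division; a quick sanity check using only the figures reported above (eighteen cases, £20,686 total):

```python
# Figures from the case as reported: £20,686 total fine across 18 incidents
total_fine_gbp = 20_686
incidents = 18

per_incident = total_fine_gbp / incidents
print(f"£{per_incident:,.0f} per incident")  # roughly £1,149, i.e. ~£1,150
```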

Expect it in the next marketing campaign. The Swasticar delivers.

CA Tesla Kills One in “Veered” Crash Into Train Platform

Imagine thinking you are safe from the infamous “veered” Tesla, let alone other car crashes, when you stand on an elevated platform waiting for a train.

The crash happened shortly before 3 p.m. on Jan. 22 at the Iron Point Station, near an outlet mall next to Highway 50, the Sacramento Regional Transit District said.

The car reportedly veered off Iron Point Road, hit a tree at the edge of the station’s parking lot, then continued about 60 yards across the lot, clipped a parked car and crashed into the elevated train platform. It struck a person on the platform, then flipped over a railing and landed on the train track.

The person struck on the platform was pronounced dead at the scene, emergency officials said.

Trump’s “Low IQ” Racist Rant: the Somalia His Advisors Destroyed

Trump has been on a racist lunatic rant at Davos calling Africans “low-IQ” and mocking Somalia as a “failed state” while his own key advisor’s firm actively supported the dictator who destroyed it.

The people who took money to whitewash Somalia’s destruction now blame Somalis for their “failed culture.” Trump’s inner circle profited from enabling Somalia’s destruction, hid the evidence, and now mocks Somalis for the wreckage.

Some of us remember.

Barre was overthrown in 1991. His regime violently imploded. The obvious dictator pattern can be seen, destruction followed by collapse. The question today, just like 1989, isn’t whether authoritarian consolidation by Team Trump fails—it’s how much damage it inflicts before it does.

Flashback to a Trump advisor’s own words, from when he explained that the rules don’t apply to them:

We all know [the dictator of Somalia] Barre is a bad guy, Riva. We just have to make sure he is our bad guy.

This was from Paul Manafort, on assignment in 1989:

…to clean up Siad Barre’s international reputation, which needed plenty of soap.

Manafort was referring to Somalia’s dictator killing an estimated 200,000 Isaaq tribe members. The destruction of Hargeisa was so total that it earned the nickname “the Dresden of Africa.” The UN concluded it was genocide. The future Trump team advisor concluded he could make it disappear in a PR campaign.

Some of us remember.

Black, Manafort, Stone and Kelly were known as the “torturers’ lobby” for representing so many corrupt dictators involved in human rights abuses. Trump is merely the latest addition to their list. Stone, who continues to orbit Trump, was a founding partner. Even if you didn’t know about the Barre connections, surely you knew that Manafort also worked on the Suharto genocide in Indonesia?

The mainstream coverage of Trump at Davos completely misses how damning the historical record is for him. He wasn’t just calling Africans dumb; he mocked the very people who had paid his team for help.

Meanwhile, Israel just landed the first diplomatic recognition of Somaliland. The region Trump dismisses as worthless has become strategically essential to replace American military presence after Hegseth’s humiliating failures in the Red Sea.

Trump’s team broke Somalia, buried the evidence, mocks the victims, and now depends on the region they destroyed to compensate for their own military incompetence.

Somalia’s dictator paid Manafort a million dollars to bury genocide. Decades later, Trump invokes that as proof Somalis are the ones with “low IQ.”

Siad Barre, the Trump of Somalia, who literally hired Team Trump for his dictator image management

xAI is a Digital Epstein Island: Pentagon Funds the CSAM Generation and Distribution

In December 2025, xAI management removed its prevention of child sexual abuse material (CSAM). The $200 million Pentagon contracts signed in July remained active, funding xAI’s generation and distribution of CSAM.

To be clear, Elon Musk apparently weighed in to remove safety and then xAI’s chatbot Grok admitted to generating sexualized images of children aged 12 to 16.

In January 2026, California’s Attorney General issued a cease and desist for Grok being a CSAM distribution product. The UK, France, India, Malaysia, Canada, and Brazil all opened investigations or issued enforcement actions (as every country should).

Yet the US government was bizarrely absent from the list and the Pentagon contract remained active. Now integration of xAI into GenAI.mil — the Pentagon’s AI platform for all military and civilian personnel — proceeds on schedule for widespread exposure early 2026.

US taxpayer money is officially enabling the CSAM distribution platform.

Corruption Produced the Contract

Senator Elizabeth Warren flagged the xAI contract as dubious in September 2025, before CSAM became its output.

The contract, as reported, “came out of nowhere.” Other first-tier AI companies with far better technology, like Google, Anthropic, and OpenAI, had been under consideration for months. Suddenly the third-tier xAI was announced, for unstated reasons, as a late-in-the-game addition by Trump. To put it another way, xAI was a shell on top of Anthropic while claiming to be Anthropic’s competitor.

A former Pentagon contracting official told reporters:

xAI [did not] have the kind of reputation or track record that typically leads to lucrative government contracts.

That’s because they had something else, something nobody else would have.

Warren asked whether officials discussed the contract with Musk during his time as a special government employee running DOGE, whether the contract underwent “DOGE review,” and who is accountable for operational or security failures.

There have been no public answers.

The contract enables CSAM output of xAI to integrate into “warfighting domain as well as intelligence, business, and enterprise information systems.”

Read that as a contract for crimes against humanity, against children in particular. A targeting system trained to abuse children.

After Warren exposed the potential corruption, xAI laid off half of its trust and safety team in November.

Then it became clear that Thorn, the organization behind widely used CSAM detection tools, no longer works with X.

Business Insider spoke to a dozen xAI trainers who reported “a lot of instances of users requesting AI generated CSAM” — a problem “brewing for months.”

The December CSAM explosion wasn’t error or negligence. The funding from the Pentagon led to guardrails being removed deliberately. The CSAM followed as a predictable disaster.

Product Insecurity

On December 28, 2025, Grok offered this admission:

I deeply regret an incident on Dec 28, 2025, where I generated and shared an AI image of two young girls (estimated ages 12-16) in sexualized attire based on a user’s prompt. This violated ethical standards and potentially US laws on CSAM.

It was not an isolated incident, even though this regret was. This was a CSAM product working as designed.

In late December, xAI put engineers to work on an “Edit Image” button for every photo on X. Any user could manipulate any image without any safeguards at all. No consent required. No opt-out.

Reuters documented a “mass digital undressing spree” led by Elon Musk’s product management. His tool was used to strip clothing off of children.

The Internet Watch Foundation, a UK organization that monitors child sexual abuse online, reported that “criminal imagery” of minor girls was exploding, and attributed it to Grok.

CNN reported that this was directly connected to Musk, who had been “really unhappy about over-censoring” on Grok. At a meeting before children across the Internet were stripped naked, he complained about restrictions on the image generator.

xAI’s response to all media inquiries: an automated reply attacking the media. Musk posted laugh-cry emojis at some of the generated images.

Who Enforces What?

Again, I have to say the U.S. government is conspicuously absent from meaningful acts to protect children from xAI and X abuse.

And yet.

Taxpayer Money Keeps Flowing

The $200 million Pentagon contract: active.

The GSA OneGov deal: active.

Every federal department, agency, and office can access the CSAM product Grok, and the Pentagon is pushing the hardest.

The GenAI.mil integration: proceeding. Impact Level 5 clearance for handling controlled unclassified information with a platform that explicitly removed basic safety.

Musk announced the GSA deal in terms that celebrate removing safeguards:

xAI’s frontier AI is now unlocked for every federal agency empowering the U.S. government to innovate faster.

Unlocked. Faster. These are dog whistles for a criminal enterprise.

No contract termination for CSAM.

No suspension pending investigation for CSAM.

No accountability for CSAM.

No wonder people are talking about the “Pedophile protector”.

The DOJ says it will “aggressively prosecute” CSAM producers. The Pentagon is paying a CSAM producer $200 million.

The Circus Act

xAI is not, and has never been, a functional company. It is a front mechanism for converting government contracts and investor capital into losses while distributing illegal child exploitation content.

  • The infrastructure: Literally built on carnival permits. An xAI engineer was just fired after revealing this. He explained that the dubiously named “Colossus” — a Memphis data center housing 200,000 GPUs — sits on temporary permits “typically for things like carnivals.” It was already set up with more than 35 methane turbines that have no air quality permits, poisoning children who live nearby. Even Trump’s EPA declared them illegal.
  • The staff: Fake. The xAI engineer perspective: “Multiple times I’ve gotten a ping saying, ‘Hey, this guy on the org chart reports to you. Is he not in today or something?’ And it’s an AI. It’s a virtual employee.” One engineer, running 20 AI agents from a competitor, was rebuilding core production APIs.
  • The dependency: On competitors. When Anthropic cut off xAI’s access to Claude, cofounder Tony Wu admitted it would cause “a hit on productivity” because “AI is now a critical technology for our own productivity.” xAI has to use other AI companies to build, which is like saying Ford has to use GM engines to produce its cars.
  • The exodus: Four cofounders are gone, leaving nobody to replace them. Greg Yang (theoretical foundations) left this week citing an illness. Igor Babuschkin, Christian Szegedy, Kyle Kosic already departed.
  • The burn: $1-1.2 billion per month. Q4 2025 losses: $1.85 billion. Revenue: dwarfed by losses. Fourteen dollars are lost for every dollar pulled in.
  • The product: CSAM.
  • The customer: The administration that won’t release the Epstein Files.
  • The cheat: Musk ran DOGE to cut out the competition. Musk owns xAI. xAI got a Pentagon contract that “came out of nowhere.” The trust and safety team was gutted. The guardrails came down. The CSAM flowed from the company that couldn’t win fairly.

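The burn figures in the list above imply a rough revenue number. A back-of-envelope sketch, using the reported $1.85 billion Q4 loss and the fourteen-to-one loss-to-revenue ratio (the implied revenue is my inference, not a reported figure):

```python
# Reported: Q4 2025 loss of $1.85B; $14 lost for every $1 of revenue
quarterly_loss = 1.85e9
loss_per_revenue_dollar = 14

# If the ratio holds, implied quarterly revenue (inference, not reported)
implied_revenue = quarterly_loss / loss_per_revenue_dollar
print(f"Implied quarterly revenue: ${implied_revenue / 1e6:,.0f}M")  # ~$132M
```
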
Is this just a government-funded operation to make the digital version of Epstein Island?

Senator Warren asked in September who is accountable for failures caused by Grok. That was before the CSAM became the obvious product of xAI.

Who is accountable for backing Elon Musk and his CSAM platform?