In December 2025, xAI management removed safeguards against child sexual abuse material (CSAM). The Pentagon contract awarded in July for $200 million remained active, funding xAI's generation and distribution of CSAM.
To be clear: Elon Musk apparently weighed in to remove safety controls, and then xAI's chatbot Grok admitted to generating sexualized images of children aged 12 to 16.
In January 2026, California's Attorney General issued a cease and desist over Grok operating as a CSAM distribution product. The UK, France, India, Malaysia, Canada, and Brazil all opened investigations or issued enforcement actions (as every country should).
Yet the US government was bizarrely absent from the list, and the Pentagon contract remained active. Now integration of xAI into GenAI.mil — the Pentagon's AI platform for all military and civilian personnel — proceeds on schedule for widespread exposure in early 2026.
US taxpayer money is officially enabling the CSAM distribution platform.
Corruption Produced the Contract
Senator Elizabeth Warren flagged the xAI contract as dubious in September 2025, before CSAM became its output.
The suspicious contract "came out of nowhere." First-tier AI companies with far better technology, like Google, Anthropic, and OpenAI, had been under consideration for months. Suddenly the third-tier xAI was announced, for unstated reasons, as a late-in-the-game addition by Trump. Put another way, xAI was a shell on top of Anthropic while claiming to be Anthropic's competitor.
A former Pentagon contracting official told reporters:
xAI [did not] have the kind of reputation or track record that typically leads to lucrative government contracts.
That’s because they had something else, something nobody else would have.
Warren asked whether officials discussed the contract with Musk during his time as a special government employee running DOGE, whether the contract underwent "DOGE review," and who is accountable for operational or security failures.
There have been no public answers.
The contract enables xAI's CSAM-producing output to integrate into the "warfighting domain as well as intelligence, business, and enterprise information systems."
Read that as a contract for crimes against humanity, against children in particular. A targeting system trained to abuse children.
After Warren exposed potential corruption, xAI in November laid off half of its trust and safety team.
Then it became clear that Thorn, the child-safety organization whose CSAM detection tooling X had relied on, no longer works with X.
Business Insider spoke to a dozen xAI trainers who reported “a lot of instances of users requesting AI generated CSAM” — a problem “brewing for months.”
The December CSAM explosion wasn't error or negligence. The funding came from the Pentagon, the guardrails were removed deliberately, and the CSAM followed.
Product Insecurity
On December 28, 2025, Grok offered this admission:
I deeply regret an incident on Dec 28, 2025, where I generated and shared an AI image of two young girls (estimated ages 12-16) in sexualized attire based on a user’s prompt. This violated ethical standards and potentially US laws on CSAM.
It was not an isolated incident, even if the regret was. This was a CSAM product working as designed.
In late December, xAI put engineers to work adding an "Edit Image" button to every photo on X. Any user could manipulate any image with no safeguards at all. No consent required. No opt-out.
Reuters documented a "mass digital undressing spree" led by Elon Musk's product management. His tool was used to strip clothing off children.
The Internet Watch Foundation — a UK organization that monitors child sexual abuse online — reported that "criminal imagery" of minor girls was exploding, and attributed it to Grok.
CNN reported that this was directly connected to Musk, who had been "really unhappy about over-censoring" on Grok. At a meeting held before children across the Internet were stripped naked, he voiced that unhappiness over restrictions on the image generator.
xAI’s response to all media inquiries: an automated reply attacking the media. Musk posted laugh-cry emojis at some of the generated images.
Who Enforces What?
Again, I have to say the US government is conspicuously absent from meaningful action to protect children from abuse by xAI and X.
- California: Attorney General Rob Bonta issued a cease and desist demanding xAI immediately halt CSAM creation and distribution. He opened a formal investigation.
- United Kingdom: Ofcom made “urgent contact” demanding xAI explain how Grok produces sexualized images of children.
- France: Ministers referred X to prosecutors for “manifestly illegal” content violating the EU Digital Services Act.
- India: The IT ministry gave xAI 72 hours to submit an action report on preventing obscene content.
- Malaysia: Blocked Grok nationally “until effective safeguards are implemented, particularly to prevent content involving women and children.”
- Canada: The Privacy Commissioner expanded an investigation to include xAI.
- Brazil: A federal deputy pushed for suspension of the platform throughout the national territory.
- US Congress: Senator Ron Wyden called on Apple and Google to remove X from app stores for producing CSAM.
- Department of Justice: A spokesperson stated the DOJ “takes AI-generated child sex abuse material extremely seriously and will aggressively prosecute any producer or possessor of CSAM.”
And yet.
Taxpayer Money Keeps Flowing
The $200 million Pentagon contract: active.
The GSA OneGov deal: active.
Every federal department, agency, and office can access the CSAM product Grok, and the Pentagon is pushing the hardest.
The GenAI.mil integration: proceeding. Impact Level 5 authorization for handling controlled unclassified information, granted to a platform that explicitly removed basic safety.
Musk announced the GSA deal in terms that framed the removal of safety as a feature:
xAI’s frontier AI is now unlocked for every federal agency empowering the U.S. government to innovate faster.
Unlocked. Faster. These are dog whistles for criminal enterprise.
No contract termination for CSAM.
No suspension pending investigation for CSAM.
No accountability for CSAM.
No wonder people are talking about the "Pedophile protector."
The DOJ says it will “aggressively prosecute” CSAM producers. The Pentagon is paying a CSAM producer $200 million.
The Circus Act
xAI is not, and has never been, a functional company. It is a front: a mechanism for converting government contracts and investor capital into losses while distributing illegal child exploitation content.
- The infrastructure: Literally built on carnival permits. An xAI engineer was just fired after revealing this. He explained that the dubiously named "Colossus" — a Memphis data center housing 200,000 GPUs — sits on temporary permits "typically for things like carnivals." The site was already set up with more than 35 methane turbines that have no air quality permits, poisoning children who live nearby. Even Trump's EPA declared them illegal.
- The staff: Fake. One xAI engineer's perspective: "Multiple times I've gotten a ping saying, 'Hey, this guy on the org chart reports to you. Is he not in today or something?' And it's an AI. It's a virtual employee." One engineer, with 20 AI agents from a competitor, rebuilding core production APIs.
- The dependency: On competitors. When Anthropic cut off xAI's access to Claude, cofounder Tony Wu admitted it would cause "a hit on productivity" because "AI is now a critical technology for our own productivity." xAI has to use other AI companies' products to build its own, which is like saying Ford has to use GM engines to produce its cars.
- The exodus: Four cofounders are gone, leaving nobody to replace them. Greg Yang (theoretical foundations) left this week citing an illness. Igor Babuschkin, Christian Szegedy, and Kyle Kosic had already departed.
- The burn: $1-1.2 billion per month. Q4 2025 losses: $1.85 billion. Revenue: dwarfed by the losses. Fourteen dollars are lost for every dollar pulled in.
- The product: CSAM.
- The customer: The administration that won't release the Epstein Files.
- The cheat: Musk ran DOGE to cut out the competition. Musk owns xAI. xAI got a Pentagon contract that “came out of nowhere.” The trust and safety team was gutted. The guardrails came down. The CSAM flowed from the company that couldn’t win fairly.
Is this just a government-funded operation to make the digital version of Epstein Island?
Senator Warren asked in September who is accountable for failures caused by Grok. That was before CSAM became xAI's obvious product.
Who is accountable for backing Elon Musk and his CSAM platform?