The Trump administration is actively suppressing evidence of its connections to child trafficking, while issuing a travel ban on the EU official who tried to enforce laws against child sexual abuse material. Now Trump is in the news for protecting a platform that monetized CSAM generation.
Since the beginning of January, thousands of women and teenagers, including public figures, have reported that their photos published on social media have been “undressed” and put in bikinis by Grok at the request of users. The deepfake tool has prompted investigations from regulators across Europe, including in Brussels, Dublin, Paris and London.
Look at who’s missing from the list of regulators demanding action. Reuters captures the picture like this:
From Europe to Asia, governments and regulators are cracking down on the sexually explicit content generated by Grok, imposing bans and demanding safeguards in a growing global push to curb illegal material.
From Europe to Asia, CSAM regulation. Nothing from Trump?
California has now stepped in to join the EU complaints. Elon Musk has responded to the state by saying he is open to jurisdiction-based restrictions ("where it's illegal") without identifying any jurisdictions, meaning he may continue the harm wherever laws don't explicitly force child protection.
Why not Trump? Consider the reporting about a 14-year-old girl taken to Trump at Mar-a-Lago in 1994. Epstein introduced her by asking, "This is a good one, right?" Trump smiled and nodded in agreement. This isn't new information that emerged from the Epstein files released so far; the release merely confirmed it, despite administration efforts to bury it.
This context matters more than ever now that the EU Commission has explicitly stated that the child sexual exploitation content generated by Elon Musk's platform is illegal, specifically referencing harm to children.
The X (Twitter) response to the EU investigation has been… to monetize the capability, putting it behind a paywall. The Trump administration has done nothing to protect children.
There’s no ambiguity to launder here. Elon Musk didn’t disable the feature. He didn’t add consent verification. He didn’t implement age detection to prevent processing images of minors. He took the regulatory reaction as a value multiplier and ordered a paywall to capitalize on harm to children.
The only functional difference between “free illegal CSAM generation” and “paid illegal CSAM generation” is that the latter creates a revenue stream and a paper trail of subscribers who are paying specifically for the capability to generate this content.
It's actually worse than simply continuing to offer the feature: the paywall creates a business model predicated on demand for the illegal use case, after the illegality itself drove attention to the feature in the first place. They're pricing in the criminal market they just demonstrated exists.
The move is almost a perfect distillation of the emerging Silicon Valley non-compliance playbook: when caught enabling harm, add friction that filters out casual users while preserving access for motivated bad actors, who cleanse themselves with money. "Everything must be okay if someone is willing to pay." It transforms the liability question without addressing the underlying harm. The women and children whose images are being processed without consent aren't protected by the paywall; instead, the harm now generates subscription revenue for the attackers.
The Trump travel ban on EU officials for enforcing the DSA is the context that makes sense of all this. Elon Musk is operating on the assumption that the Trump administration will shield his CSAM generation tools, so "compliance" becomes pure theater: gestures toward process while continuing, and now monetizing, the underlying violation of children.
There’s no hypocrisy, just consistency. They’re protecting each other’s CSAM operations.