Anil Dash’s recent paean to the Model Context Protocol (MCP) as the harbinger of “Web 2.0 2.0” reads like Disney-esque nostalgia for a romanticized past of Lordship that never actually existed. Worse, it actively advocates recreating the exact conditions that led to our current unrepresentative system of repression and taxation by tech Lords.

Slave Plantation Nostalgia
Dash’s core thesis rests on a fundamental misunderstanding of why Web 2.0’s “openness” was sliced and diced into plantations built to incarcerate and exploit their users. He presents it as a simple morality tale: the good guys built open protocols, then the bad guys at Facebook and Twitter killed them out of greed. This conveniently ignores the uncomfortable truth that users choose fraud all the time; they waltz into walled gardens (digital flytraps) because they fall for “it’s easier” or “it’s free” or “it worked better,” with little regard for what happens to them next.
“Arbeit Macht Frei” was placed above Nazi death camp entrances to promise victims a wonderful freedom that was in fact zero. Facebook’s “connecting the world” rhetoric becomes especially sinister when you consider it was literally connecting genocidal mobs in Myanmar and Ethiopia.

Do you want another Auschwitz? Because we can see exactly how Facebook is on track toward one.
Facebook didn’t ruin open social protocols through some nefarious conspiracy; it took advantage of them the way the Nazis took advantage of democratic speech: integrate yourself into democracy in order to destroy democracy and replace it with dictatorship.
Managing your digital identity across seventeen different federated services, with inconsistent interfaces and zero interoperability guarantees, was spun into a false “too hard” narrative, full of fear and loathing, that bashed freedom by offering false hope. The “beautiful” chaos of early Web 2.0 that Dash romanticizes was actually a user-experience innovation challenge, one that worked only for technically sophisticated early adopters.
I speak from experience: this blog you are reading right now dates all the way back to 1995. That means I had several crucial, hands-on, deep-in-the-weeds years of Web development and infrastructure before Anil Dash showed up to have a go at it. I lived through the HyperCard and Gopher collapse. I was on the front line of the failed X.25, IPX/SPX, Token Ring, NetBEUI, AppleTalk, ATM… FDDI battles, to say nothing of the mess of HTML and SSLv2 safety flaws, and many, many more.
The Innovation/Regulation Blindspot
Here’s where Dash’s analysis becomes actively dangerous: his celebration of “janky specs that everybody hurriedly adopted” completely ignores that this same hurried, unregulated adoption is exactly how we ended up with data plantations, digital slavery for value extraction, embedded in our human communications infrastructure.
Consider his own example: OAuth was indeed simple and widely adopted, but not early enough to avoid the consolidation forced by a speculative collapse (the 2000 dot bomb). It thus opened the door to a Trojan horse built by Google and Facebook, who rode in as identity providers for the entire Web and took over our homes. That “simple” protocol became the foundation for tracking users across every website they visit, which Facebook falsely and cruelly argued was protection, saving everyone from themselves (the way President Andrew Jackson argued that Black people were better off as his slaves, alienated from society and worked to death). The simplicity Dash celebrates was subsidized by a predatory privacy invasion that users didn’t understand and couldn’t meaningfully consent to.
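The tracking mechanism is visible in the protocol itself. Here’s a minimal sketch of the OAuth 2.0 authorization-code redirect, with a hypothetical provider endpoint and client values, showing why every “Login with X” button hands the identity provider a record of which site you signed in to and when:

```python
# Minimal sketch of the OAuth 2.0 authorization-code redirect.
# The endpoint and client values below are hypothetical, for illustration only.
from urllib.parse import urlencode

AUTH_ENDPOINT = "https://idp.example.com/oauth/authorize"  # hypothetical identity provider

def login_redirect_url(client_id: str, redirect_uri: str, state: str) -> str:
    """Build the URL a browser is sent to when the user clicks 'Login with X'."""
    params = {
        "response_type": "code",
        "client_id": client_id,        # identifies the relying-party site to the provider
        "redirect_uri": redirect_uri,  # tells the provider exactly which site you came from
        "scope": "openid profile",
        "state": state,                # CSRF protection, round-tripped by the provider
    }
    return f"{AUTH_ENDPOINT}?{urlencode(params)}"

# Every such redirect is a data point the provider can retain:
# user U authenticated to site S at time T.
print(login_redirect_url("news-site-123", "https://news.example.org/callback", "opaque-state"))
```

Nothing in the spec requires the provider to log those redirects; nothing prevents it either, and the consolidation made a couple of providers privy to nearly all of them.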
The women of Harvard who filed formal complaints against Zuckerberg laid out the problem right away. Yet instead of appropriately reining in such clear criminal activity that undermined and exploited vulnerable online users, Harvard saw Facebook as an investment opportunity.
The appropriately regulated credit card industry provides the perfect counter-example to Dash’s thesis. The payment card industry has built a foundation of extensive fraud protections, chargeback mechanisms, and compliance standards that maintain clear harm principles and definitions. This regulation doesn’t make credit cards harder to use or stifle innovation; quite the opposite, it makes them far safer yet easier to use. It’s not perfect, but its ordered, standardized protocols carry a sense of stability and evolution rather than the revolution and chaos that breed harms. Meanwhile, the many “simpler,” less regulated alternatives like cryptocurrency and peer-to-peer payment apps have become the preferred tools for scams, ransomware, and fraud. We all know North Korea and Russia depend on Bitcoin hacks to avoid sanctions, right? Just like South Africa used emeralds and diamonds to keep apartheid running, right?
Embrace-Extend-Extinguish Risks
As much as I love working on a new protocol, and I have been looking at MCP since before day one, Dash’s excitement about big players like OpenAI adopting MCP reads like someone documenting their own mugging in real time. When he writes “it’s cool that other platforms adopted the same spec that Anthropic made,” he is potentially witnessing the exact moment when an open protocol gets co-opted by the worst influences.
The playbook is formulaic at this point:
- Embrace the open standard enthusiastically
- Build the most polished, user-friendly implementation
- Use your massive resources to make it the obvious choice
- Extend with proprietary features that create lock-in
- Extinguish the open alternatives through market dominance
Google demonstrated this with its habit of pushing web standards it could capture and dominate: championing openness while building Chrome’s dominance and promoting APIs that coincidentally favored its business model. Amazon did it with open source infrastructure: embracing open tools while creating proprietary managed services that are “easier” than running the open versions yourself. The point is that there’s supposed to be a balance, where walking into a restaurant to have a kitchen cook for you shouldn’t mean you can never eat again without them.
Instead of teaching someone to fish with a standard pole and reel, some want to teach attachment and dependence through rent-seeking taxation. You will never eat for free again if they can reduce regulations far enough to legalize extraction and exploitation.
Simplicity Can Also Be Known as a Death Spiral
It’s easy to crash a plane; it’s better to fly one. Think of that when you read Dash’s “better is worse” philosophy: the idea that developers should resist improving specs and just implement them faithfully. When engineers are never required to sign a code of ethics pledging to do no harm, you must realize what the “faithful implementation” of Hitler’s specs means in practice. Dash appears to be advocating that engineers reproduce all the worst harms, vulnerabilities, invasions, and horribly immoral power imbalances of a system while avoiding proper inspection and accountability.
He openly admits that MCP, like the plantation South under gag rules prohibiting abolitionist speech, is “a totally opaque system when it comes to what a platform is doing with your data.” He further admits that “security risks are enormously high.” Together these should be a clarion call for innovation or disqualification, not footnotes lost in a celebration of mass suffering.
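To make the opacity concrete, here’s a minimal sketch of an MCP tool invocation as a JSON-RPC 2.0 message, based on the spec’s tools/call method (the tool name and arguments here are hypothetical). Once this request leaves the client, the protocol says nothing about what the server does with the data:

```python
# Minimal sketch of an MCP tool invocation over JSON-RPC 2.0.
# The tool name and arguments are hypothetical, for illustration only.
import json

def mcp_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Serialize a tools/call request; the server's handling of it is a black box."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# The user's private query travels to a third-party server with no
# protocol-level guarantee about logging, retention, or resale.
print(mcp_tool_call(1, "search_email", {"query": "test results from my oncologist"}))
```

The spec defines the envelope, not the ethics: there is no required audit trail, no retention limit, and no disclosure of what happens server-side.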
If you prefer a lighter analogy, how did the soccer player get to pick up the ball with his hands, stuff it under his jersey and walk it into the goal to score? Rules make the game playable.
Yet he waves them away with the blithe assumption that “that stuff won’t get fixed until there are some really egregious violations that get a ton of bad press.” This is demonstrably false. The news is a constant stream of bad press about things that never get fixed. In American news cycles, for just one easy and obvious example, things get fixed when a celebrity complains about harm. The American system is highly tuned to privilege and caste: reporters grind through “sucks to be you” stories until someone with bazillions says they feel danger. Then everyone notices, believes things matter, and asks what can be done to protect the wealthy.
This is exactly the mentality that gave us Enron, Tesla, Cambridge Analytica, the Equifax breach, and SolarWinds. Move fast and break things that other people suffer from, because the masses are expected to bear the cost of elites’ massive technical debt and security failures.
The Missing Ethical Code That Defines Real Engineering
What’s entirely absent from Dash’s analysis is any serious engagement with the moral and societal implications of the systems he’s advocating, despite the obvious gaps. There’s no discussion of how open protocols should handle misinformation, harassment, or coordinated manipulation campaigns. No consideration of how “interoperability” might actually amplify harm by making it easier for bad actors to operate across platforms.
The early web’s relative openness worked because it was a small community of mostly well-intentioned academics and hobbyists, yet even then, by 1995 I was already making a career out of investigating serious harms. That openness breaks down completely when scaled to billions of users, including state actors, organized crime, and people whose entire business model is exploiting others. Google at one point reported firing dozens of its own managers, under pressure from women working there who reported systemic abuses.
The Regulation Imperative
Here’s what Dash fundamentally misses: the choice isn’t between “open” and “closed” systems; it’s between regulated and unregulated ones. The most successful “open” technologies in history, from telecommunications to financial services, succeeded precisely because they were built on robust regulatory frameworks that understood and defined abuse in order to orient the system toward fair access.
The internet protocols Dash celebrates (HTTP, TCP/IP, DNS) work because they’re managed through standards bodies with governance structures, not because they were “janky specs everyone hurriedly adopted.” In fact, having lived through hands-on deployment of all those protocols alongside the many proprietary ones, I know first-hand how, the moment political power entered the equation, technology governance became essential.
Let Historians Be Our Guide
If we want genuinely open, interoperable systems that serve users rather than exploit them, we need to start with regulatory frameworks that have reliably worked before:
- Mandate data portability (so switching costs stay low if not zero; see the sketch after this list)
- Prevent anticompetitive bundling (so open alternatives remain viable)
- Require transparency (algorithmic and otherwise, so users understand what they’re agreeing to)
- Establish liability for harms (so companies can’t externalize all their costs)
- Defend rights of user agency (so “simple” doesn’t become “manipulative”)
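To make the first item concrete, here’s a hypothetical sketch of what a portability mandate could require in practice: a complete, machine-readable export in a documented, versioned schema that a competing service can import. The format and field names are invented for illustration, not drawn from any real regulation:

```python
# Hypothetical sketch of a mandated, machine-readable account export.
# Schema and field names are invented for illustration only.
import json
from dataclasses import dataclass, asdict

@dataclass
class ExportedPost:
    created_at: str            # ISO 8601 timestamp, parseable by any importer
    text: str
    reply_to: str | None = None

def export_account(posts: list[ExportedPost], contacts: list[str]) -> str:
    """Bundle a user's data into one self-describing document they can take elsewhere."""
    return json.dumps({
        "schema": "portable-account/v1",   # published, versioned schema
        "posts": [asdict(p) for p in posts],
        "contacts": contacts,              # stable identifiers, not internal IDs
    }, indent=2)

print(export_account([ExportedPost("2025-01-01T00:00:00Z", "hello, fediverse")],
                     ["alice@example.org"]))
```

The point of a published schema is that the exit door is real: if the export is complete and documented, switching costs collapse and lock-in loses its teeth.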
The developers Dash wants to inspire should be signing a code of ethics and demanding these foundations first, not celebrating the return of the same unregulated expansion and capture for exploitation that led to our current predicament.
Giddyup, Partner, Who’s the Sheriff and Judge in This Town?

Dash’s “Web 2.0 2.0” isn’t a return to some golden age of openness; it’s a push toward a fiction of unregulated conditions, the kind that made slave plantations possible in Texas after Mexico had abolished them years before. His celebration of MCP reads like someone cheering for the very forces that will inevitably capture and exploit the systems he claims to want to protect.
The real tragedy is that genuinely open, user-serving technology is possible—but only if we learn from history instead of romanticizing it. That means starting with strong ethical frameworks and regulatory protections, not hoping that “janky specs” and good intentions will somehow produce different results this time.
The stakes are too high, with technology-led genocide happening as we speak, and the pattern too clear, to fall for dangerous nostalgia and once again ignore what needs to be done.
When you add up the body count from platform-enabled violence, the comparison to historical systems of harm becomes much more concrete. These aren’t just “business practices”; they’re systems that weaponize human psychology for profit while externalizing the deadly costs onto society.
- WhatsApp used both for chat surveillance and to coordinate lynchings in India
- Facebook-enabled ethnic cleansing in Ethiopia
- YouTube’s radicalization pipeline developing and deploying mass shooters
- TikTok’s algorithm pushing self-harm content to vulnerable teens
- Twitter pushing Nazi propaganda while censoring women and LGBTQ people, as well as enabling coordinated harassment campaigns designed to drive victims to suicide
- Tesla AI algorithms causing hundreds of sudden “veered” crashes, killing dozens of passengers
The list goes on and on. It shouldn’t have been ignored then, and it certainly shouldn’t be now. Regulate to innovate.