A Story of Bessie and Bits: Don’t Let Tech Oligarchs Turn Your Life into Their Livestock

In 1941, Congress passed the National Cattle Theft Act to crack down on interstate cattle rustling. Today, tech giants face similar scrutiny for how they handle our personal data – but calling it “handling” is like calling cattle rustling “secret livestock relocation.” The cattle theft law was brutally simple: steal someone’s cow, cross state lines, face up to $5,000 in fines (about $100,000 today) and five years in prison. You either had someone else’s beloved Bessie or you didn’t.

Clean, clear, done.

But data “theft” in our age of artificial intelligence? America’s tech oligarchs have made sure nothing stays that simple. When companies like Microsoft and Google harvest our personal data to train AI systems, they’re not just taking, they’re effectively duplicating and breeding. Every piece of your digital life, from search history to social media posts, from photos to private messages, is treated like human livestock in their “data” centers, endlessly duplicated and exploited across their server farms to maximize growth. Unlike cattle rustlers, who at least had to know how to tie a knot, these digital ranchers have convinced courts and Congress that copying and exploiting your life isn’t really theft at all. It’s just “data sharing.”

As described in a recent Barings Law article, these tech giants are being challenged on whether they can simply repurpose our data for their own benefit. Their defense? You clicked “I agree” on their deliberately incomprehensible terms of service.

It’s like a cattle rustler claiming the cow signed a contract. It’s like the Confederacy publishing books claiming that the slaves liked it (true story, and American politicians to this day still try to corrupt schools into teaching that slavery was good and that accountability for it is bad).

[Florida 2023 law says] in middle school, the standards require students be taught slavery was beneficial to African Americans because it helped them develop skills…

The historical parallel that really fits today’s Big Tech agenda isn’t cattle theft — it’s darker, as in racist slavery darker. Think about how plantation owners viewed human beings as engines of wealth generation, officially designated as “planters” in a system where the colonus (farmer) became the colonizer. Today’s tech giants have built a similar system of value multiplication, turning every scrap of our digital lives into seeds for their AI empires.

When oil prospectors in Texas engaged in highly illegal competitive slant drilling to literally undermine ownership boundaries, at least they were fighting over something finite. But data exploitation? It’s infinite duplication and leverage. Each tweet, each photo, each private message becomes raw material for generating endless new “property,” all owned and controlled by the tech giants.

Have you seen Elon Musk’s latest lawsuit, in which he falsely claims that every user account at his companies is owned solely by him, not by the users who create and use them?

The legal framework around data rights hasn’t evolved by accident. These companies have deliberately constructed a system where “consent” means whatever they want it to mean, as long as it benefits them. Your data isn’t just taken; it’s being cloned, processed, and used to build AI systems that further concentrate power in their hands. Could you even argue that a digital version of you, one they present as authentic, isn’t actually you?

The stakes go far beyond simple questions of data ownership. We’re watching the birth of a new kind of wealth extraction that denies real consent; one that turns human experience itself into corporate property with no liberty or justice for anyone captured.

The historic cattle laws stopped rustlers. The historic oil laws eventually evolved to protect property owners from subsurface theft. Today’s challenge is recognizing and confronting how tech companies have built an empire on an expectation of unlimited exploitation of human lives, simply because those lives are now digital.

As these cases wind through the courts, we’re left with a crucial question: Will we let companies claim perpetual rights to multiply and profit from our digital lives just because we were dragged, against our better judgment, into their gigantic monopolistic services, as if the Magna Carta never happened? Should clicking “I agree” grant infinite rights to extract value from our personal data, creative works, and social connections, as if we were meant to be serfs under a digital kleptocrat?

The answer will shape not just our digital future, but our understanding of fundamental human rights in the age of artificial intelligence.

Get a rope

How Palantir’s “God’s Eye” Created the Very Terrorists It Promised to Find

A Stryker vehicle assigned to 2nd Squadron, 2nd Stryker Cavalry Regiment moves through an Iraqi police checkpoint in Al Rashid, Baghdad, Iraq, April 1, 2008. (U.S. Navy photo by Petty Officer 2nd Class Greg Pierot) (Released)

From 2007 to 2014, Baghdad’s American-designed checkpoints were a daily game of “Russian roulette” for Iraqi civilians. Imagine being stopped, having rifles pointed at your head, being harassed or detained simply because a computer system tagged you as suspicious based on the color of your hat at dawn or the car you drove.

This was the reality created by Palantir Technologies, which sold the U.S. military and intelligence community on the promise of a “God’s Eye” system that could identify terrorists through data analysis. But compelling evidence suggests its unaccountable surveillance system instead helped create the very terrorists it promised to find.

The evidence is stark: In 2007, Baghdad had over 1,000 checkpoints where Iraqis faced daily humiliation — forced to carry fake IDs and even keep different religious songs on their phones to avoid being targeted. By 2014, many of these same areas had become ISIS strongholds.

This wasn’t coincidence.

A pivotal WIRED exposé revealed how Palantir’s system nearly killed an innocent farmer because it misidentified his hat color in dawn lighting. U.S. Army Military Intelligence experts on the ground described their experience, literally, as:

“if you doubt Palantir you’re probably right.”

And here’s the key quote that encapsulates the entire broken system:

“Who has control over Palantir’s Save or Delete buttons?”

The answer: Not the civilians whose lives were being ruined by false targeting.

In 2007, the Institute for War and Peace Reporting documented how these checkpoints created a climate of fear and sectarian division. Civilians were “molested while the real militants get through easily.” The system was so broken that Iraqis had to carry two sets of ID and learn religious customs not their own just to survive daily commutes.

Most damningly, military commanders admitted their targeting data was inadequate and checkpoint personnel had “no explosives detection technology and receive poor, if any, information on suspicious cars or people.” Yet Palantir continued to process and analyze this bad data, creating an automated system of harassment that pushed communities toward radicalization.

When ISIS emerged in 2014, it found fertile ground in the very communities that had faced years of algorithmic targeting and checkpoint harassment. The organization recruited heavily from populations that had endured years of being falsely flagged as threats — a tragic self-fulfilling prophecy. During this period, Palantir’s revenue grew from $250 million to over $1.5 billion — a for-profit terror-generation engine enriching a few who cared little, if at all, about the harm. American taxpayers were being fleeced.

Palantir marketed itself as building a system to find terrorists. Instead, it helped create them by processing bad data through unaccountable algorithms to harass innocent civilians until some became the very thing they were falsely accused of being. The company has never had to answer for this devastating impact.

As we rush to deploy more AI surveillance systems globally, the lesson of Palantir in Iraq stands as a warning: When you build unaccountable systems to find enemies, you may end up creating them instead.

We must ask: How many of today’s conflicts originated not from organic grievances, but from the humiliation and radicalization caused by surveillance systems that promised security while delivering only suspicion, escalating into extrajudicial assassinations?

Palantir profits from failure. Its income is an indicator of the violence it seeds.

Note: This analysis draws on documentation from 2007 to 2014, tracking the relationship between checkpoint systems and the rise of ISIS through contemporary reporting and military documents.

AU Tesla Autopilot Gone Wild: Out-of-Control Robot Attacks Parked Cars, Owners

Saying “Out-of-Control Tesla” seems redundant at this point.

Wild footage captured the moment an out-of-control Tesla hit vehicles in a busy shopping centre carpark, before plummeting off the side and injuring its two occupants. The driver’s dash cam showed a black Tesla T-bone an SUV, causing it to spin on the rooftop carpark at DFO Homebush, in Sydney’s west, at about 9.55am on Saturday. The vehicle kept driving and struck the car with the dash cam. A loud crash was heard from the Tesla as it went over the edge of the carpark to the level below. The Tesla is understood to have been on autopilot…

You don’t want to be anywhere near a Tesla robot, obviously.

Why does Australia even allow them in the country? If they can ban automatic assault rifles, they can ban automatic assault pilots. Tesla is a threat to public safety by design.

FL Tesla Kills One Motorcyclist

Another motorcyclist has been killed by a Tesla, which apparently turned abruptly left in front of them. Boca News Now has reported it as “DEATH BY TESLA”.

According to investigators, the Tesla began to make a left turn into the path of the Kawasaki Ninja. The front of the Ninja impacted the passenger’s side front fender of the Tesla. As the crash ensued, the Tesla rotated counterclockwise and came to an uncontrolled final rest within the outside northbound lane of Seminole Pratt Whitney Road.

Driverless operation is suspected, given that Tesla sensors have a long and tragic history of failing to see motorcycles and violently running into them.