Chinese Companies Abruptly Remove Their Driverless Cars From California Roads

Some are speculating that U.S. national security concerns have spooked the Chinese, causing an abrupt halt to much of the foreign robot testing on California roads.

Didi is not the only Chinese company that appears to be scaling back autonomous vehicle testing in California, or pulling out entirely.

A DMV spokesperson said that five China-based companies — Baidu Apollo, Pony.ai, WeRide, Didi, and AutoX — drove around 130,000 miles on public roads in California between December 2022 and November 2023.

That’s a significant decline from the previous year, when Chinese autonomous vehicle companies conducted over 450,000 miles of testing. Didi’s vehicles only drove 4,000 miles in 2023, per BI’s calculations.

At least three other Chinese firms — Deeproute.AI, QCraft, and Pegasus Technology — which previously had licenses to test in the state, are no longer listed on the California DMV’s site.

A Deeproute.AI spokesperson told NBC that the company stopped testing in California in 2022.

Chinese-owned firms Nio, Black Sesame, and Xmotors.AI do have permits to test autonomous vehicles in California but did not record any testing activity in the last two years.

As I’ve said widely and repeatedly since at least 2016, China doesn’t need to fire ICBMs at American cities if it can just issue a simple “destroy” command to tens of thousands of road robots (driverless cars).

But on the flip side, the Chinese may worry their most advanced deployments might be embarrassingly unable to win a typical SF street brawl.

Waymo apparently was completely surprised this week when SF’s notoriously colorful Chinatown crowds destroyed its robot.

On that note, in 2016 I was personally in a Tesla on SF roads with three other engineers, who together

1) disconnected it from the Tesla servers and instead operated the car on a rogue service,

2) injected hostile map/navigation data to throw the car off course, and

3) confirmed trivial and predictable vision-sensor flaws (e.g. projected lines that force a “veered” crash into a target).
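The map/navigation injection in point 2 was possible because the car accepted routing data without verifying who produced it. A minimal sketch of the missing control, assuming a hypothetical shared key and a simplified JSON route format (this is illustrative, not Tesla’s actual protocol): if navigation payloads carried an authentication tag, a rogue service substituting waypoints would be detected before the data ever reached the planner.

```python
import hashlib
import hmac
import json

# Hypothetical key provisioned to the vehicle at manufacture time.
SECRET_KEY = b"vehicle-provisioned-key"


def sign_route(route: dict, key: bytes = SECRET_KEY) -> str:
    """Compute an HMAC-SHA256 tag over a canonical JSON encoding of the route."""
    payload = json.dumps(route, sort_keys=True).encode()
    return hmac.new(key, payload, hashlib.sha256).hexdigest()


def verify_route(route: dict, tag: str, key: bytes = SECRET_KEY) -> bool:
    """Accept navigation data only if its tag matches (constant-time compare)."""
    return hmac.compare_digest(sign_route(route, key), tag)


# Legitimate route from the real server, with its tag.
route = {"waypoints": [[37.7749, -122.4194], [37.7790, -122.4313]]}
tag = sign_route(route)
assert verify_route(route, tag)

# A rogue service swaps in hostile waypoints but cannot forge the tag.
tampered = {"waypoints": [[37.7749, -122.4194], [37.8000, -122.5000]]}
assert not verify_route(tampered, tag)  # tampering is detected and rejected
```

A symmetric key is the simplest version of the idea; a production design would use per-vehicle asymmetric signatures so a compromise of one car does not let an attacker sign routes for the whole fleet.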

It was painfully obvious then, as a direct witness to the many security vulnerabilities, that Tesla had produced a remotely controlled explosive death trap that no country should allow on its roads. Yet for all the talks I gave and the ink I spilled, I don’t think it made enough of an impression, because Tesla kept sending more and more people to their early graves.

We’ve come a long way now, with the news from SF that a Waymo robot was just subjected to a very public safety test that it almost immediately failed, spectacularly. A sense of national security may finally be forming, inasmuch as the Chinese no longer see California roads as ripe for trivial remote control and exploitation.

Tesla Cybertruck Can’t Handle Rain: Immediately Damaged by Water

A brand new Cybertruck showing moisture decay gets put out with the other garbage in the East Bay
Here is yet more proof that some old white guy, high on drugs and shooting at his own car in the desert, shouldn’t be the one making real-world product engineering decisions, let alone PR claims about “survivability”.

The member wrote: “The advisor specifically mentioned the Cybertrucks develop orange rust marks in the rain and that required the vehicle to be buffed out. … [He] also shared photos of small orange specks of rust on the stainless-steel body, which they claim were taken after a ‘dish soap wash.’”

Can’t get it wet?

Rain degrades the shell?

Have to regularly buff and buff, and buff some more, to prevent rapid deterioration after exposure to moisture?

It’s impossible to put less thought into this vehicle. A baby chimpanzee could probably deliver a more “survivable” design by randomly pushing buttons on a keyboard.

This corrosion problem seems like the very basic kind of stuff the British famously figured out by the 1800s. Somehow Tesla is caught off guard over two centuries later, to the point that owners have to publicly and loudly grouse about rust spots just days after taking delivery of a new car.

Let me put it like this. If Elon Musk had built ships for the British in 1800, they’d all be speaking French today. Tesla peddles fraud, the opposite of survivable.

WaPo Warns of “Veered” FSD Crash and Burn: Tesla Employee First Victim

1) The witness and barely-surviving passenger in a Tesla very clearly stated (based on 911 dispatch recordings and interviews) that the owner had FSD controlling the car when it crashed and killed him.

Rossiter, who survived the crash, told emergency responders that von Ohain was using an “auto-drive feature on the Tesla” that “just ran straight off the road”…

2) The same witness said FSD had been in use on the trip just before the fatal crash, and noted that it had been repeatedly making unsafe movements requiring quick interventions.

3) The dead owner’s widow also stated that the victim, as an employee of Elon Musk, was convinced FSD should be trusted all the time, every time the car was operated… for safety.

Von Ohain used Full Self-Driving nearly every time he got behind the wheel, Bass said, placing him among legions of Tesla boosters heeding Musk’s call to generate data and build the technology’s mastery. While Bass refused to use the feature herself — she said its unpredictability stressed her out — her husband was so confident in all it promised that he even used it with their baby in the car. […] “Now it feels like we were just guinea pigs.”

[…] “Once Hans passed away and time went by, there wasn’t any more discussion about him,” said the former employee, a member of von Ohain’s team who soon resigned. To von Ohain’s widow, Tesla’s silence seemed almost cruel. Though the company eventually helped cover the cost of her move back home to Ohio, Bass said, Tesla’s first communication with the family after the crash was a termination notice she found in her husband’s email.

These three points come through clearly in a new Washington Post article about the late Hans von Ohain, a former Tesla employee.

Tesla owners have long complained of occasionally erratic behavior by the cars’ software, including sudden braking, missed road markings and crashes with parked emergency vehicles. Since federal regulators began requiring automakers to report crashes involving driver-assistance systems in 2021, they have logged more than 900 in Teslas. A Post analysis found at least 40 crashes that resulted in serious or fatal injuries.

[…]

As Rossiter yelled for help on the deserted mountain road, he remembers, his friend was screaming inside the burning car.

Allegedly the Tesla Model 3 FSD software “veered” off a Colorado road straight into a tree without braking. Von Ohain survived the initial impact, but emergency responders and his friend could only watch as he was trapped and burned alive, a typical outcome in Tesla crashes.

Check out Tesladeaths.com if you want to track the rapidly mounting Tesla death toll.

And watch this.