The cars keep destroying themselves, leaving everyone scratching their heads.
“It looks like it was trying to board a ferry and suddenly accelerated into the gate, basically destroying the Tesla,” said McLean. “We don’t know what caused it to happen,” said McLean, adding police are initially looking at either a mechanical issue, or a matter concerning the driver, which may have caused the sudden acceleration.
There’s a twist to the story.
“There was no vessel in the berth at the time of the incident. The vehicle was not attempting to board a ferry.” […] Typically, in order for a vehicle to get to the ferry ramps it would have to have been authorized to board a ferry, so it remains unclear if the vehicle was intending to board at another ramp but ended up accelerating toward one that had no ferry.
It brings to mind the crash video from China that shows brake lights illuminated, while Tesla insisted the brakes were never used.
Tesla also claims that the driver never pressed the brakes. Pictures from public cameras show that this is not true: the brake lights are clearly on at least one occasion without any obstacles ahead…
The big problem with Tesla analysis, of course, is that the logs may simply have no integrity (especially when compared with other brands). The logs are fallible. So when you read a statement like this one, ask yourself whether the log records what the car thought and NOT what the driver actually instructed.
Data shows that the vehicle was traveling at 6 mph when the accelerator pedal was abruptly increased to 100%. Consistent with the driver’s actions, the vehicle applied torque and accelerated as instructed.
That phrase “consistent with the driver’s actions” seems wrong. Why would someone write it that way? It gives the impression that they started with that assumption and then just looked for some sloppy way to prove it.
What if the pedal signal increased to 100% in contradiction to the driver’s actions?
I’m not speaking hypothetically but from experience. I’ve been able to inject bogus commands into cars over the CAN bus, even exploiting race conditions. If you can send a 100% accelerator signal and have it land in the logs, it begs the question of whether any real proof exists of a connection to the physical pedal.
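To make the point concrete, here is a minimal sketch of what a spoofed pedal frame looks like at the wire level, using the Linux SocketCAN classic frame layout. The arbitration ID and payload encoding below are hypothetical illustrations; real Tesla message IDs and encodings are not public and certainly differ.

```python
import struct

# Hypothetical arbitration ID and encoding for an accelerator pedal
# message -- made up for illustration, NOT real Tesla values.
PEDAL_MSG_ID = 0x108
FULL_THROTTLE = 0xFF  # "100% pedal" in this invented encoding

def build_socketcan_frame(can_id: int, data: bytes) -> bytes:
    """Pack a classic SocketCAN frame: 32-bit ID, 8-bit DLC,
    3 pad bytes, then 8 data bytes (zero-padded)."""
    if len(data) > 8:
        raise ValueError("classic CAN payload is at most 8 bytes")
    return struct.pack("<IB3x8s", can_id, len(data), data.ljust(8, b"\x00"))

# A spoofed "pedal at 100%" frame. Anything with bus access could emit
# this, and a downstream logger has no inherent way to tell it came from
# software rather than the physical pedal sensor.
frame = build_socketcan_frame(PEDAL_MSG_ID, bytes([FULL_THROTTLE]))
print(frame.hex())
```

The point is not the specific bytes: it is that the log entry “accelerator pedal at 100%” is just a message that arrived on the bus, and the bus does not authenticate senders.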
You’ve heard of phantom braking. Why not phantom breaking… from unintended acceleration?
NHTSA opened a formal investigation in February 2022 regarding phantom braking incidents. The investigation includes 2021-2022 Tesla Model 3 and Model Y vehicles, and by May 2022, the government knew of more than 750 unintended sudden braking incidents.
Allegedly this is why some Tesla owners think they need a camera on the floor recording their foot positions.
A better solution would be for the logs to go to the owner, with the owner regularly testing the logs and validating integrity controls.
With the brake lights in the video contradicting Tesla’s overconfident statement that the brakes weren’t applied (according to their logs), you see the problem. With the logs always being sent to the owner’s personal data storage, and with regular integrity tests, you’d definitely see the problem.
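What would an owner-side integrity test look like? One simple building block is a hash chain over the log records: each entry’s digest covers its content plus the previous digest, so any after-the-fact edit breaks every digest that follows. This is a minimal sketch under the assumption that records stream to the owner as they are produced; the record fields are invented for illustration.

```python
import hashlib
import json

def chain_records(records):
    """Build a tamper-evident hash chain: each entry's digest covers
    its own content plus the previous entry's digest."""
    digest = "0" * 64  # genesis value
    chained = []
    for rec in records:
        payload = json.dumps(rec, sort_keys=True) + digest
        digest = hashlib.sha256(payload.encode()).hexdigest()
        chained.append({"record": rec, "digest": digest})
    return chained

def verify_chain(chained):
    """Recompute the chain; an edited or deleted record changes every
    digest after it, so silent log rewrites become detectable."""
    digest = "0" * 64
    for entry in chained:
        payload = json.dumps(entry["record"], sort_keys=True) + digest
        digest = hashlib.sha256(payload.encode()).hexdigest()
        if digest != entry["digest"]:
            return False
    return True

# Invented example records, loosely echoing the log statement above.
logs = [
    {"t": 0, "speed_mph": 6, "accel_pct": 4, "brake": False},
    {"t": 1, "speed_mph": 6, "accel_pct": 100, "brake": False},
]
chained = chain_records(logs)
print(verify_chain(chained))  # True for an untouched chain

# Rewriting history -- e.g. flipping the brake flag -- breaks verification.
chained[1]["record"]["brake"] = True
print(verify_chain(chained))  # False once a record has been altered
```

A hash chain alone doesn’t stop a manufacturer from logging a lie in the first place (garbage in, garbage chained), but it does stop anyone from quietly rewriting the record afterward, which is exactly the failure mode the brake-light video suggests.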