Tesla Keeps Crossing Double-Yellow Causing Head-on Collisions

Elon Musk infamously boasts that he makes mistakes and doesn’t respect the rules:

Well, I guess we might make some mistakes. Who knows?

This keeps coming up as bad news for his customers, not to mention anyone around them, when their car acts like the CEO and gloatingly crosses a double yellow.

Source: DC Fire

Another Tesla, another Tesla owner dead:

The preliminary investigation revealed, at approximately 7:50 a.m., a 2019 Tesla Model 3, occupied by five subjects, was traveling southbound in the left travel lane in the 3000 block of Connecticut Avenue, Northwest. The Tesla crossed the solid-double yellow lines into the northbound lane of travel, struck a 2018 Toyota C-HR, and then struck a 2010 Mercedes-Benz ML-350, head-on. […] On Sunday, February 26, 2023, the driver of the Tesla succumbed to his injuries and was pronounced dead.

The road suggests the driver was traveling in a straight line on a road known for speed abuse (e.g. “40 to 60 percent of the people completely disobeyed the speed limit by more than 12 miles per hour”). A related problem with Tesla ignoring double yellow lines has been reported for many years. The algorithm treats the lines as open to cross in almost any case, such as when a vehicle slows ahead.

Source: Google StreetView

I’ll say that again: Tesla engineering allegedly treated a double yellow the same as a dotted white line. Cars were trained to attempt unsafe maneuvers to maintain unnecessary speed.

In this case, when a car ahead slowed for the pedestrian crossing (as it should), a speeding Tesla algorithm likely reacted by accelerating across the double yellow into a devastating head-on crash… by design!

That crash scene reminds me of December 2016 when Uber driverless was caught running red lights, foreshadowing one death a year later that brought criminal charges and shut down their entire driverless program.

Source: Twitter

Tesla driverless software caused a similar fatality at basically the same time, April 2018. Their cruel and unusual reaction was to double down and hire an army of lawyers to make the victims and their story disappear.

…when another vehicle ahead of him changed lanes to avoid the group [of pedestrians], the Model X accelerated and ran into them [killing Umeda].

Source: US District Court. Tesla software accelerated into pedestrians, parked motorcycles and a van. The company has for years manipulated courts and press to cover up this important 2018 crash.

2018.

Tesla software fatally accelerated straight into a hazard after it saw a vehicle ahead evade it. See why this 2023 crash reminds me of 2018?

Did you hear about this death and how bad Tesla was, or just the Uber case? The crashes happened around the same time, and both companies should have been grounded. Yet Tesla instead bitterly sparred with a grieving family and manipulated the news.

Here’s a 2021 Tesla owner forum report showing safety engineering has regressed, with FSD five years later feeling worse than Uber’s cancelled program.

Had a super scary moment today. I was on a two lane road, my m3 with 10.6.1 activated was following a van. Van slowed down due to traffic ahead and my car decided to go around it by crossing the yellow line and on to oncoming traffic. […] The car ahead of me wasn’t idling. We were both moving at 25 mph. It slowed down due to traffic ahead. My car didn’t wait around. It just immediately decided to go around it crossing the yellow line.

A chorus of voices then chimes in to say the same thing happened to them many times, their Tesla attempting to surge illegally into a crash (e.g. “FSD Beta Attempts to Kill Me; Causes Accident”). They’re clearly disappointed the software has been designed to ignore double yellow lines in suicidal head-on acceleration, although some try to call it a bug.

A January 2023 post shows Tesla software is still designed to ignore road safety. This time Tesla ignores red lights and stopped cars, allegedly attempting to surge across a double yellow into the path of an oncoming train!

Source: Tesla Motors Club

2023. A ten-year-old bug? Ignoring all the complaints and fatalities?

You might think this sounds too incompetent to be true, yet Tesla recently was forced to admit it has intentionally been teaching cars to ignore stop signs (as I warned here). That’s not an exaggeration. The company dangerously pushes a highly lethal “free speed extremism” narrative all the way into courts.

A speed limiter is not a safety device.

That’s a literal quote from Tesla, which wanted a court to accept that reducing speed (e.g. why brakes are on cars) has no effect on safety.

As the car approached an intersection and signal, it accelerated, shifted and ran a red light. The driver then lost control… [killing him in yet another Tesla fire]. The NTSB says speeding heightens crash risk by increasing both the likelihood of a crash and the severity of injuries…
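The NTSB’s point about severity follows from basic physics: crash energy grows with the square of speed, so even modest speeding sharply raises injury severity. A minimal sketch, with illustrative speeds of my own choosing (not numbers from any report):

```python
# Kinetic energy scales with the square of speed (KE = 1/2 * m * v^2),
# so mass cancels when comparing the same car at two speeds.
def impact_energy_ratio(v_low: float, v_high: float) -> float:
    """Ratio of crash energy at v_high versus v_low for the same vehicle."""
    return (v_high / v_low) ** 2

# Illustrative: exceeding a 25 mph limit by 15 mph more than
# doubles the energy the car delivers in a crash.
print(f"{impact_energy_ratio(25, 40):.2f}x the crash energy")  # 2.56x
```

That squared relationship is why “free speed extremism” is not a neutral design preference.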

Let me put it this way. The immature impatient design logic at Tesla has been for its software to accelerate across yellow lines, stop signs and through red lights. It’s arguably regressive learning like a criminal, to be more dangerous and take worse risks the more it’s allowed to operate.

…overall [Tesla’s design feeds a] pattern here in Martin County of more aggressive driving, greater speeds and just a general cavalier sense towards their fellow motorists’ safety.

The latest research on Tesla’s acceleration-to-high-speed mentality shows it increased driver crash risk 180 to 340 percent, with a survival rate near zero! Other EV brands such as the Chevy Bolt or Nissan LEAF simply do not have any of this risk and are never cited.

340 percent higher crash risk because of design.

You have to wonder about the Tesla CEO falsely advertising his car would magically be safer than all others on the road… while also boasting he doesn’t know when obvious mistakes are being made. It’s kind of the opposite. The CEO likely is demanding known mistakes be made intentionally, very specific mistakes like crossing the double yellow and accelerating into basically everything instead of slowing and waiting.

People fall for this fraud and keep dying at an alarming rate, often in cases where it appears they might have survived in any other car.

And then you have to really wonder at the Tesla CEO falsely advertising his car would magically drive itself, inhumanely encouraging owners to let it wander unsafely on public roads (e.g. “wife freaks” at crossing double yellow on blind corner) yet never showing up for their funerals.

Let me just gently remind the reader that it was a 2013 Tesla crossing a double yellow and killing someone that Elon Musk boasted was his inspiration to rush “Autopilot” to market and prevent it from ever happening again.

2013.

Ten years later? Technology failures that indicate reckless management with a disregard for life that rises to the level of culpable negligence.

Note there is another big new lawsuit filed this week, although it talks about shareholder value as if that’s all that we can measure.

Tesla finally is starting to be forced by regulators into fixing its “flat wrong” software. The gravestones are proof of the mistakes being made. The grieving friends and families know.

The deceased driver’s profile in this latest crash says he went to Harvard, proving yet again intelligence doesn’t protect you from fraud.

Source: Tesladeaths.com

And that’s not even getting to reports of sudden steering loss from poor manufacturing quality, or wider reports about abuse of customers:

Spending over $20,000 on a $500 repair… a LOT of customers are getting shafted on this…TRUSTING Tesla [to] do this, and they’re failing horribly at the expense of the customer.

If you don’t die in a crash from substandard hardware and software engineering, or grossly negligent designs, Tesla’s big repair estimate scams might kill you.

Why are Teslas allowed on the road?

Elon Musk Just Laid Off The Staff Who Believed His Promise They Wouldn’t Be Laid Off

If you’re still working at Twitter, you might think the lesson from the Titanic is that promotions opened up soon after its skipper drove into an iceberg.

Musk fired Crawford last week, as part of yet another round of layoffs at the company — after Musk promised late last year that there would be no more layoffs.

Crawford apparently thought she would coldly benefit from others being pushed into lifeboats, as if a big promotion suddenly was hers if she slept on the deck or complimented the skipper’s pants.

Layoffs instead came for those who foolishly stayed, believing their CEO’s promise there would be no layoffs.

First they came for PR and she said nothing, then they came for engineering and again she said nothing, then they came for her…

This CEO is constantly saying things immediately proven false. He uses a classic tactic of permanent improvisation, formerly known as white supremacist dictatorship.

Layoffs are unfortunate, but layoffs soon after layoffs while promising no layoffs… that’s executive malfeasance.

Tesla Racing Instructor Warns Sudden Acceleration A Design Flaw: NOT Driver Fault

As I suggested a couple weeks ago, Tesla sudden acceleration has hallmarks of 1980s design flaws.

Now a Tesla Racing Instructor is trying to tell the world it happened even to him.

…nothing hits home as something like this happens to a Tesla fan. Greg Wester, a long-time Tesla owner from San Francisco, just shared a close encounter when his Tesla Model 3 suddenly accelerated in a parking garage. According to his story, the car was stopped when it suddenly bounced forward. Luckily, he had his foot on the brake pedal and was able to “overpower it.” Now, Greg is not a regular driver. He races his Tesla and is also a racing instructor, so he should know when to press the accelerator and when to brake. Greg is trying to comprehend what caused this and is willing to extract the info from the car computer if he finds a way. Nevertheless, he has lost confidence in Tesla now. Following his incident, he wrote on Twitter that he seriously considers installing a 400-volt kill switch next to the steering wheel. He also says that “no pedestrian should ever walk in front of a Tesla that is parking.”

No one is safe around a Tesla; although this guy brags “I can handle dangerous cars,” he also admits a serious flaw:

…willing to extract the info from the car computer if he finds a way…

This is HUGE. A monster clue. He doesn’t trust Tesla and hints here he needs total separation to get the info. It’s his car, his data. Yet Tesla isn’t giving owners a way to present facts about their own lives, because Tesla controls “info” entirely and selfishly. Their management practices have always been a four-alarm data security dumpster fire. Tesla staff can do what they are known to do and lie, with no simple trusted path designed for truth to be found by the people they put in harm’s way.

The driver gave his detailed explanation on Twitter, reminiscent of the 911 call that changed everyone’s mind about Toyota liability (Mark Saylor’s calm, professional tragedy disproved driver fault).

It felt almost like some extra electricity all of a sudden grounded and spun the motor. Very scary. I was sitting close to the steering wheel and my foot was 100% covering the brake, light pressure — not half on like heel-toe shifting and throttle blipping. Wearing flat tennis shoes.

In related news another Tesla suddenly accelerated into a building and caught fire.

It then caught fire again while on the tow truck, injuring the driver and forcing crews to dump its burning toxic wreckage into the street.

That’s almost as bad as the news a Tesla accelerated into a girl scout cookie stand in a Walmart parking lot.

The guy suddenly accelerating into girl scouts was given a sobriety test. His car was not given a similar test. Perhaps he could argue that unless Tesla agrees to independent, standardized third-party tests of its data in real time… then he should be allowed (like Tesla) to test himself days later, in a closed box, and decide whether he thinks he was at fault.

Tesla seems more and more like an airplane that can’t land without crashing. Don’t get in one. Don’t be around one.

Increasing Evidence Tesla Drivers Burn to Death While Unable to Open Any Door

I’ve noticed a string of Tesla reports saying basically the same thing.

Drivers who survive a Tesla crash succumb to smoke and fire in a confusing escape puzzle — they’re killed by design, a planned death-trap, not the impact.

First, to set the stage of accountability: Tesla early on fraudulently tried to claim it would be the safest car on the road.

This was a careless prediction, not a statement of fact, and it of course lulled people into a false sense of security about survival.

Look at the 2016 autopilot crashes and Tesla’s CEO announcement that changes were being made to bring down deaths (they didn’t, Tesla deaths increased dramatically).

Tesla deaths by year. Source: TeslaDeaths.com

A simple example of the disconnect and a long lingering problem was a 2016 high speed crash into the back of a high visibility service vehicle (flashing warning lights etc) and another one that year under a truck (decapitation) — both caused by autopilot design.

Neither is close to what safety tests are designed to rate, and both have continued to happen to Tesla drivers.

I wrote about a tragic 2023 case just the other day that repeats the exact same problem straight out of 2016.

What Tesla seems to have been doing, instead of the actual hard work of progress (engineering survival based on data) and preventing repeated crashes, was… lazily gaming crash tests and calling its own customers dumb.

That inhumane attitude now leads into yet another uniquely Tesla tragedy. Drivers allegedly are being burned to death while unable to exit their car (a repeating and predictable event, also not in the crash test).

Consider a detail in Vancouver’s under-reported investigation. A driver who was fully alert, in a stopped position, noticed smoke spewing from a vent. His new car’s doors and windows failed to open. Young and fit, he immediately kicked out a window. His car was engulfed in flames within minutes.

This was not a crash, it was not a battery issue. It was just another Tesla catching fire without any warning, locking the driver inside by design. Kicking the window gives us the rare survivor perspective. Here’s how he described it:

…Jutha told the North Shore News after the fire he’d never had to use the emergency door release – a latch that the driver can pull up, located under the door panel containing the window controls – in the eight months that he owned the vehicle – a Tesla model Y 2021 – and didn’t know where to find it. Most of the other Tesla owners he spoke with after the fire also had no clue where the mechanical lever is… “What if I was an older person who couldn’t kick out the window?” he said. “It was terrifying.”

A new 2021 car spontaneously burst into flames, becoming a death trap in seconds. Who wants to be told the pinnacle of Tesla design engineering is a messy puzzle to open a door when wasted time means you die from their fire?

This pattern of danger brings to mind one of the early red flags about Tesla engineering culture. A factory wiring mistake caused a car to burst into flames during a dealer test drive. It wasn’t reported enough, especially because Tesla brushed it aside with an obvious “no true Scotsman” fallacy. That’s not normal.

It also brings to mind stories about Teslas sitting in dealer lots, and on dealer trucks, that burst into flames. When dealers and delivery professionals are seeing frequent fires like this, when it’s their entire job to prevent them, you can expect every driver to be in a worse position than they are. Again, not normal.

Is there high alert now for every Tesla on the road? Are drivers running manual exit tests before they start every trip?

If you were a pilot, would you be confident stepping into a 737 MAX just by saying “I read the manual once”? An even better example might be the B-26 “Widowmaker”.

Life Begins With a Checklist…and it May End if You Don’t Use It

Tesla probably should force drivers to watch a safety instructional video before every ride. This is how to unbuckle your belt, emergency exits are on your right and left, this is how to operate a door handle. Watch the whole thing every time you start the car, proving you are willing and capable of operating the exit, or no go.

Sadly Tesla drivers seem to be on the far opposite end with their “it does everything for me” attitude, buying a car to avoid using their delicate hands. Doing things is for people who didn’t drop big fantasy money for magic pumpkin rides. A Tesla consumer profile likely would never accept hours of monthly training and testing just to open a car door.

It’s like they firmly believe buying a lottery ticket from a scam artist won’t be worthless at the time of need.

Disney fantasy princess thinking should be illegal in transit safety, definitely not buoyed by irrelevant safety tests and CEO promises for things that never happen.

Second, on that note, there’s a huge buried lede in the massively over-reported 2021 Texas crash investigation. The driver died because he desperately failed to open a door.

Headlines have coldly been trying to exonerate “driverless” software, completely burying the issue.

The government report explained why an injured Tesla driver was found burned to death in the back seat. The NHTSA soberly indicated he was apparently unable to open any doors.

The frontal impact with the tree resulted in a power loss of the car’s 12-volt system… mechanically opening the rear door during a power loss requires additional steps. According to the owner’s manual, during a loss of 12-volt system power, a rear-seated occupant must locate a small cutout in the carpet beneath the seat cushions and pull the mechanical release cable tab toward the center of the vehicle to manually open the rear door.

For some reason that chilling analysis is not making headlines at all. I haven’t found anyone speaking about the trap except the investigators, who are slowly and methodically raising alarms.

We’re living in a time when journalists rush to print “robot wasn’t in charge when owner died” to catch far more eyeballs than the honest headline and warning:

“Owner died when trusted robot failed to let him escape its fire.”

The report hints at the driver being intoxicated, not just badly injured. That’s important because he wasn’t going to kick out the windows. It’s also important because Tesla may be far more to blame here than the usual drink and drive narrative.

Being too intoxicated to open a door to get out but not too intoxicated to open a door to get in…

In fact, when authorities pulled a sleeping driver from a Tesla, headlines bizarrely tried to credit the car for protecting him after enticing him to get in and go. All the credit, none of the responsibility. The driver stupidly argued he couldn’t be charged because Tesla had sold him a robot so he believed he was his own passenger.

It’s like a fairy tale of “being dumb lucky” was what sold him on the car, so he could gamble with everyone’s lives not just his own.

That brings me to the third point. I’m finding more and more evidence of high-risk owners thinking the Tesla CEO was targeting and enabling them to get crazy.

They want to be sleeping in the car while it travels on dangerous public roads at high speed. And that’s what this CEO said he was selling them.

There’s an especially bad imbalance here. The CEO pressed hard on an “easy” marketing campaign that falsely portrayed the car as safe to get into when drivers are impaired. Sleeping drivers then became a common thing with tragic results (here, here and here to start), raising the question of why Tesla didn’t make it much harder to use.

On the other hand the car maker installs a complex puzzle by design that makes it basically impossible to get out of as it fails unsafe.

Getting in and going fast was engineered extensively to be trivial for someone incoherent who can’t wake up. Yet getting out before smoke and fire kill the driver? Intentionally engineered to be so difficult that even expert crash investigators are stumped after months of trying.

Read the new report from Colorado.

After more than eight months of investigating, the Colorado State Patrol has finished its investigation into a May 2022 Tesla crash and subsequent fire in Evergreen that killed one person. […] The CSP investigator writes in the report, “I am unable to conclusively determine why (Von Ohain) did not exit the vehicle.” […] Madden stressed that everyone who owns and drives an electric vehicle should [read] the manual and know how to get out of the car if the electronics fail.

The passenger in this case was able to get out. NOT the driver.

To be clear, taxpayers funded an eight-month investigation. The driver was awake and alert but unable to open a Tesla door. His passenger exited only to watch the driver burn alive. And the report concludes…

Drivers should read the manual to avoid death in a burning box?

Eight months of digging, exploring all the data, yet safety experts still couldn’t find an answer for why a driver couldn’t exit his Tesla. Something tells me the manual won’t help. It’s a design failure that needs to be scrapped.

Being stuck like this seems exactly backwards for drivers. Getting in and getting it moving should be RTFM (read the fffing manual) whereas stopping and getting out should be designed as push button easy.

Also there’s a data security footnote from Colorado, as necessary logs were destroyed in the fire due to “remoteness”. Another Tesla design failure.

Sales of Tesla obviously are juiced by the ease of drivers getting in and driving so fast they can’t stop, but will they ever be impacted as we find why so many can’t get themselves out after a crash?

Is being a death trap somehow good for business?

And that’s not even speaking to the fact that first responders aren’t given information from Tesla needed to do forced extractions. In some cases there was no Tesla manual for people whose job is to read the manual. In other cases they were told by Tesla to crawl into the fire to find a hidden latch, when the fire was too intense to even break the window.

Anyone who has ever been on an airplane knows the public sentiment on such an important risk model. Getting out alive gets priority over an easy ride. It’s very, very entry-level transport ethics. Fail safe is the most basic level of robotics and engineering, a bar that Tesla never should have been allowed to fall below.

Tesla fails at the most basic tests of ethics, such that a rise in detailed expensive investigations of easily avoidable driver deaths should convince regulators to ban the brand.