Tesla Beta Driver Gets a New Pair of Underpants: FSD 11.3.4 Attempts to Crash Into Oncoming Truck

A screen capture from the video (intersection approach starting at 9:00) clearly shows Tesla’s bug-riddled software suddenly trying to veer left into the path of an oncoming truck (9:05), even though the route planner beside it shows only a forward indicator.

Source: YouTube

If you blinked you might have missed that unexpected blue left arrow… and you’d perhaps then be dead. The driver unmistakably has less than a second to grab the wheel and force his Tesla back into its proper course and lane (9:06).

Source: YouTube

Why did version ELEVEN of the software veer sharply left on a forward route with very obvious oncoming hazards?

Full video:

One theory is that Elon Musk’s rushed 2016 demand that Tesla depend on radar for safety, followed by his rushed 2021 demand that Tesla abandon radar entirely, built a toxic culture of quality implosions and caused safety to decline in later models. This is the result of engineers being overruled by an erratic CEO’s “shiny-object” fantasy vision.

…system is randomly too conservative or too aggressive, i.e., can’t map out the surroundings correctly… I trust it as much as I trust R Kelly at a youth group.

The opposite of trust, the inverse of ethics. When you see any Tesla on the road, be prepared for it to assault others unpredictably. It’s a good reminder of how foolishly society relies on Tesla drivers knowing how, let alone wanting, to be risk averse.
