Was the driver asleep? Failure of the car’s “intelligence” seems the most likely explanation, although we can’t yet rule out a human deciding to drive into oncoming traffic across a bridge at 2:30 am. The driver looks alert and is standing freely, surrounded by CHP:
Here’s the map view of the Fremont Street entrance to the Bay Bridge, with yellow lines to demarcate I-80 and its various tentacles gripping the city. The area has been going through renovations and is confusing enough that Google felt a need to place directional arrows on its map:
Again, it’s tempting to say Tesla has nothing to do with this. Perhaps some will say a human would have had a reason (from being confused to willful disregard) to enter the wrong deck (upper, westbound) while headed east over the bridge. They may even argue the computer could have done a better job.
However, it’s even more likely and more tempting to ask whether a SF driver, asleep like so many in other cases, put too much trust in their car (typical tech worker living in the East Bay taking a Tesla into the city because awesome supercar autopilot is awesome, duh, why don’t you believe in the ubergenius of Musk?).
Officers say he failed a sobriety test but told them it didn’t matter because his car was on auto-pilot.
And then there was Tesla guy asleep while driving south on 101 at 3:30am at 70 mph. CHP put themselves in front of the Tesla and hit the brakes to convince the computer to stop:
CHP could not confirm that the vehicle was on Autopilot, but “considering the vehicle’s ability to slow to a stop when Samek was asleep, it appears the ‘driver assist’ feature may have been active at the time.”
And another Tesla was spotted in LA operating without a driver, apparently because a “little thing” defeated Tesla’s best safety attempts to detect human alertness:
the Tesla driver appeared slumped over with something tied around the steering wheel.
“If his little thing tied around that steering wheel fell off, and he was still sleeping, he would have slammed into somebody going 65 miles per hour,” Miladinovich said.
When the system doesn’t sense adequate torque on the steering wheel, Tesla says…[it does something about it]
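The check described above amounts to a torque-based deadman switch. Here is a minimal sketch of how such a check could work; every name, threshold, and timing value below is an illustrative assumption, not Tesla’s actual implementation:

```python
# Hypothetical sketch of torque-based "hands on wheel" detection.
# All thresholds and names are illustrative assumptions.

TORQUE_THRESHOLD_NM = 0.5   # assumed minimum torque counting as "hands on"
WARN_AFTER_S = 30           # assumed seconds of no torque before a warning
DISENGAGE_AFTER_S = 60      # assumed seconds before driver assist gives up

def monitor(torque_samples_nm, sample_period_s=1.0):
    """Return the action taken for a stream of steering-torque samples."""
    silent_s = 0.0
    for torque in torque_samples_nm:
        if abs(torque) >= TORQUE_THRESHOLD_NM:
            silent_s = 0.0          # any nudge on the wheel resets the timer
        else:
            silent_s += sample_period_s
        if silent_s >= DISENGAGE_AFTER_S:
            return "disengage"      # escalate: slow down / hand back control
        if silent_s >= WARN_AFTER_S:
            print("warning: apply slight torque to steering wheel")
    return "ok"
```

The weakness the tow-truck driver describes follows directly: a weight tied to the wheel supplies constant torque, so the timer keeps resetting and the warning never fires.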
It may turn out Tesla engineers didn’t think about common safety issues for upper and lower deck bridges. That’s what we’re now waiting for CHP to confirm, based on the story so far and that screen grab of the driver.
In the bird’s-eye view you can see the reports of the Tesla going the wrong way at Fremont Street and I-80 put the car right at the start of the upper/lower deck split:
Entering the upper deck means a primitive navigation tool would still register the right path on the map and be unable to react until it was far too late (separated past Treasure Island) and restricted by barriers…continuing about 10 miles into the 880 northbound on the wrong side.
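The failure mode above is easy to see in a toy example: a matcher that snaps the car to road segments by 2D distance alone cannot tell stacked decks apart, because they share nearly the same latitude and longitude and GPS resolves elevation poorly. The segments and coordinates here are illustrative assumptions, not real map data:

```python
# Sketch of why 2D map-matching can't distinguish stacked bridge decks.
# Segment names, directions, and coordinates are illustrative assumptions.
from math import hypot

segments = [
    {"name": "upper deck", "dir": "westbound", "pos": (37.7983, -122.3778)},
    {"name": "lower deck", "dir": "eastbound", "pos": (37.7983, -122.3778)},
]

def match(car_pos):
    """Snap the car to the nearest segment by 2D distance only."""
    return min(segments, key=lambda s: hypot(s["pos"][0] - car_pos[0],
                                             s["pos"][1] - car_pos[1]))

# Both decks are at distance zero, so the matcher just returns the first
# segment in the list, whichever deck the car is physically on:
print(match((37.7983, -122.3778))["name"])
```

A car driving east on the westbound upper deck would thus still appear to be “on route” to any navigation logic that works in two dimensions.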
All of the above raises the question of whether a 2019 computer would allow navigation variances so large that it wouldn’t prevent a car from driving directly into oncoming traffic on the wrong deck of one of the longest bridges in America, close to Tesla HQ.
Tesla engineering has been known to misread road lines, misread road signs, slam into barriers and even spontaneously explode into a fireball…at this point I’ll wager a stacked double-decker bridge entrance was all it took for Tesla AI to willingly start driving the wrong way.