The More Driverless Cars Deployed The Less People Want Them

From 2012 onward I warned, here and in talks, that the biggest problem in big data security was integrity. The LLM zealots didn’t listen.

By 2016 I was delivering a series of talks on the security conference circuit about driverless cars as a huge looming threat to pedestrian safety.

Eventually, after Uber and Tesla both killed pedestrians in April 2018, I warned that putting such low-quality robots on the road would increase conflict and fatalities even further instead of improving safety.

Well, as you might guess from my failure to slow LLM breaches, I had little to no impact on the people in charge of regulating driverless engineering; nowhere near enough influence to stop entirely predictable disasters.

It’s especially frustrating to now read that the NHTSA, which was politically corrupted by Tesla in 2016 into ignoring robot safety and hiding deaths, remains so poorly prepared to prevent driverless cars from causing pedestrian deaths.

Feds Have No Idea How Many Times Cruise Driverless Cars Hit Pedestrians

Speaking of data, confidence in driverless cars has continued to fall as evidence rolls in, a classic product management dilemma of where and how to field safety reports. Welcome to 2012? We could have avoided so much suffering and loss.

Great Disasters of Machine Learning: Predicting Titanic Events in Our Oceans of Math

