TX Tesla Kills One by Plowing Into Rear of Car Stopped at Red Light

“Autopilot” seems a most plausible explanation for how red stoplights at an intersection, as well as the red brake lights on the back of a stopped car, were all conspicuously ignored at very high speed.

A person who was behind the wheel of a Tesla is now dead after reportedly crashing into an Uber in downtown Houston Sunday night. The deadly crash happened on Jefferson Street and La Branch around 11:30 p.m. Police said that the driver of the Tesla rear-ended the Uber, which was stopped at a red light. The Tesla was said to have been traveling at a high rate of speed at the time of the collision. The driver of the Tesla was later pronounced dead…

Source: Google Maps

It could be another case of a suicidal Tesla driver, but the presence of a passenger, not to mention the configuration of four lanes with two lights, makes that less plausible than an Autopilot failure.

My first guess is that the Tesla’s software registered the brake lights of the car ahead but not that the car was stopped, and that it never saw the red lights above the lane it was speeding in. We’ll have to wait for more details.

According to HPD Vehicular Crimes Division Sergeant R. Dallas and Officer T. Syed, the Tesla was traveling eastbound on Jefferson at a high speed when the driver lost control. The Tesla struck a gray Toyota Highlander, which was stopped at a red light at the intersection of Jefferson and La Branch. The Tesla continued through the intersection, left the roadway, and collided with the concrete wall of a parking garage.

Can you imagine a Tesla at high speed on Autopilot ignoring these red lights until it’s too late and then crashing into a parking garage on the right of the intersection? Of course you can. It’s almost certainly another Autopilot crash.

Source: Google Maps

Colorado HB23-1011: AgTech Right to Repair Starts in 2024

Last year the government of Colorado drew a bright line on the right to repair, which just went into effect. Hundreds of thousands of agricultural workers can now legally fix their own technology without unfair interference or fees from manufacturers.

Starting January 1, 2024, the act requires a manufacturer to provide parts, embedded software, firmware, tools, or documentation, such as diagnostic, maintenance, or repair manuals, diagrams, or similar information (resources), to independent repair providers and owners of the manufacturer’s agricultural equipment to allow an independent repair provider or owner to conduct diagnostic, maintenance, or repair services on the owner’s agricultural equipment. A manufacturer’s failure to comply with the requirement to provide resources is a deceptive trade practice.

Here are the bipartisan primary sponsors of the bill, who seem to have done an exceptional job of writing technology concepts into law:

  • Brianna Titone – Software Developer
  • Ron Weinberg – IT Solutions
  • Nick Hinrichsen – Transit Operations
  • Janice Marchman – Teacher

You Can Get Answers Only to Questions You Think to Ask

This phrase seems like the buried lede in an economic analysis of market discontent.

No single poll is definitive, and you can get answers only to questions you think to ask.

It’s a philosophical point, one that for me evokes the wonderful empiricist insights of the late 1700s.

The path towards knowing the right questions to ask can be derived from one of the most famous philosophers in history, Mary Wollstonecraft. A prominent figure in the early suffrage and abolition movements (rights for women and non-whites), she wrote “A Vindication of the Rights of Woman” in 1792, which argued for social and political equality (power) through the process of learning (education).

In the modern context of AI hacking, Wollstonecraft’s emphasis on the societal responsibility that comes with knowledge is very pertinent. She famously stated “I do not wish [women] to have power over men, but over themselves”, which should be required training for every hacker. Her philosophy remains powerfully useful, reminding us that “learning” through questions is a known disruptive power dynamic in society, one that must be guided by ethical considerations.

My favorite way of thinking about it (as a long-time physical security auditor, not to mention encryption key breaker) is that no single lock is definitive, but every prior lock is illustrative. You get past locked doors by thinking through enough of the ways to get in.

Criminal Charges: German Tesla in “Veered” Head-on Collision

A German news report says a Tesla tragically veered head-on into an oncoming car.

Den Fahrer des Tesla erwartet nun ein Strafverfahren wegen fahrlässiger Körperverletzung bei einem Verkehrsunfall.

Negligent (fahrlässiger).

The Tesla driver now faces criminal proceedings for causing negligent bodily harm in a traffic accident.

You can clearly see in the photo from the emergency crew that the road markings were obscured, and that the Tesla failed in the same way as many other Autopilot head-on collisions.

Tesla slides into the oncoming lane. © Feuerwehr Miesbach. Source: Rosenheim24

At this rate, other car brands may need a warning system for when a Tesla is near, so their drivers can prepare for evasive action and assist everyone being negligently harmed by Tesla.

Tesla drivers are involved in more accidents than drivers of any other brand.

The brand is not only a complete fraud with the worst engineering on the road; its faults are causing increasing damage to society. The more Tesla, the more disaster.

Tesla deaths per year only get worse and worse. Source: tesladeaths.com