FTC notice on AI: Tesla “false or unsubstantiated” claims are illegal

In the wake of a Tesla engineer testifying that his CEO allegedly ordered criminally false and unsubstantiated “driverless” claims (planned deception)… the FTC is now warning everyone that this tactic was, and still is, illegal.

…the fact is that some products with AI claims might not even work as advertised in the first place. In some cases, this lack of efficacy may exist regardless of what other harm the products might cause. Marketers should know that — for FTC enforcement purposes — false or unsubstantiated claims about a product’s efficacy are our bread and butter. […] Are you exaggerating what your AI product can do? Or even claiming it can do something beyond the current capability of any AI or automated technology? For example, we’re not yet living in the realm of science fiction, where computers can generally make trustworthy predictions of human behavior. Your performance claims would be deceptive if they lack scientific support or if they apply only to certain types of users or under certain conditions.

We know with absolute certainty that Tesla’s claims in November 2016 were dishonest: its “AI” did not work as advertised.

The Tesla advertisement (video) required multiple takes to have a car follow a pre-mapped route without driver intervention. It was unquestionably a staged video that depended on certain conditions, which were never disclosed to buyers.

Had they presented a vision for the future, with sufficient warnings about a reality gap, that would be one thing. Instead, the official Tesla 2017 report to the California DMV (Disengagement of Autonomous Mode) revealed its “self-driving” tests in all of 2016 totaled only 550 miles and suffered 168 disengagement events (a failure every three miles, on average). In other words, they barely tested on California public roads at all.
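The failure rate follows directly from the two figures in the DMV report; a quick check of the arithmetic (using only the 550 miles and 168 disengagements cited above):

```python
# Figures from Tesla's 2016 Disengagement of Autonomous Mode report
# to the California DMV, as cited above.
autonomous_miles = 550
disengagements = 168

miles_per_disengagement = autonomous_miles / disengagements
print(f"Average miles between disengagements: {miles_per_disengagement:.1f}")
# roughly one failure every 3.3 miles
```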

Such a dismal result should have been the actual message of the video, because ALL of those heavily curated miles went into making a promotional video that claimed the exact opposite.

Tesla, and especially the CEO, plainly branded their video with a grossly misleading claim that the human driver was there only for “legal purposes” (as if implying laws are a nuisance, a theme that has resurfaced recently with Tesla’s latest AI tragically ignoring stop signs and yellow lines).

The Tesla marketing claims were and still are absolutely false: the human was in a required safety role, there to take over as the system frequently and dangerously disengaged.

This disconnect is so bad that the product still does not work as claimed seven years later, as evidenced by a massive recall.

Tesla’s false advertising increasingly seems directly implicated in widespread societal harms, including loss of life (e.g. customers who believed Tesla’s “legal purposes” lie, among many others, have raised the fatality risk for everyone on the road).

Source: Tesladeaths.com

Pilot thought his non-responsive flight instructor was a joke. He had died on takeoff

A pilot showed up at his club under pressure to get in the air. Weather conditions were not ideal that June day in 2022, but he knew he had to put in flight time, so he asked a flight instructor to copilot. A UK safety bulletin has published details of what happened next.

The pilot recalled that shortly after takeoff from Runway 28 the instructor’s head rolled back. The pilot knew the instructor well and thought he was just pretending to take a nap whilst the pilot flew the circuit, so he did not think anything was wrong at this stage. He proceeded to fly the aircraft round the circuit. As he turned onto base leg the instructor slumped over with his head resting on the pilot’s shoulder. The pilot still thought the instructor was just joking with him and continued to fly the approach. He landed normally on Runway 28 and started to taxi back to the apron. However, the instructor was still resting on his shoulder and was not responding, and the pilot realised something was wrong.

It’s an odd story, a sad one for sure, that touches on trust issues in safety operations.

“Fossil fuel industry is actually scared” by Ukraine

A conference in Texas just awkwardly cancelled a Ukrainian energy expert.

…though she hasn’t been told by the energy summit’s organisers why she was barred, she believes that “the fossil fuel industry is actually scared by having someone from Ukraine attend”.

She pointed out that the Stand With Ukraine campaign was not only calling for an end to the “global fossil fuel addiction that feeds Putin’s war machine” but also for countries to stop expansion of coal, oil and gas, and start phasing out. […]

“Of course, we are in stark opposition to the oil and gas lobby, and the push to expand fossil-fuel infrastructure is the opposite of energy security. We will be safe only when public money and state subsidies fully withdraw from the oil and gas industry and get to spend at-scale on renewables and energy efficiency.”

If the conference had let her in, we probably never would have heard about this important point.

It’s worth reporting widely that reducing overly centralized fuel systems would have a direct impact on regional political stability and wars, even though it feels like news from 70 years ago.

Rise in “Ghost” Tankers Delivering Russian Oil to Asia

Someone is buying up old, decrepit tankers, turning off their tracking electronics, and pushing huge amounts of Russian oil into Asia.

Industry insiders estimate the size of that “shadow” fleet at roughly 600 vessels, or about 10% of the global number of large tankers. And numbers continue to climb. …an estimated 25 to 35 vessels are being sold per month into the shadow fleet, according to another senior executive at an oil trading firm. Global Witness, a nonprofit, estimates that a quarter of oil tanker sales between late February 2022 and January this year involved unknown buyers, roughly double the proportion the previous year.
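The quoted estimates can be combined into a rough back-of-the-envelope sizing. A sketch using only the figures in the excerpt above (the fleet share and monthly sales range are the source’s; the derived totals are simple arithmetic, not additional reporting):

```python
# Back-of-the-envelope from the figures quoted above.
shadow_fleet = 600      # estimated shadow-fleet vessels
share_of_global = 0.10  # "about 10% of the global number of large tankers"

implied_global_fleet = shadow_fleet / share_of_global
print(f"Implied global large-tanker fleet: ~{implied_global_fleet:.0f}")
# ~6000 large tankers worldwide

# "25 to 35 vessels are being sold per month into the shadow fleet"
low, high = 25 * 12, 35 * 12
print(f"Vessels joining the shadow fleet per year: {low} to {high}")
# i.e. 300 to 420 per year, against a fleet of roughly 600 today
```

At the low end of that range, the shadow fleet would roughly double in two years, which is consistent with the article’s claim that the numbers “continue to climb.”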

While these vessels are allegedly hard to identify by modern electronic standards, the age of a ship and the fact that it is dedicated to carrying Russian oil make it classically simple to find, track and… disrupt or disable.