Protocol has a fascinating look at how Axon tried to abuse its AI ethics board, which wasn’t having it:
Last week, Axon gave its AI ethics board two days’ notice of its intention to publicly announce the development of Taser drones for schools, after the board had spent about a year vetting a proposal to let law enforcement officials pilot the drones. In doing so, the resigning board members wrote, the company “bypassed Axon’s commitment to consult with the company’s own AI Ethics Board.”
Nine of the twelve board members resigned, which raises the question of what the remaining three were thinking. Reuters gives readers a clue.
Axon had explored the idea of a Taser-equipped drone for police since at least 2016, and Smith depicted how one could stop an active shooter in a graphic novel he wrote. The novel shows a daycare center with what looks like an enlarged smoke alarm, which first recognizes the sound of gunfire and then ejects a drone, identifying and tasing the shooter in two seconds. Axon first approached its ethics board more than a year ago about Taser-equipped drones, and the panel last month voted eight to four against running a limited police pilot of the technology. The company announced the drone idea anyway, saying it wanted to get past “fruitless debates” on guns after the Uvalde shooting, sending shares up nearly 6%.
A graphic novel concept (i.e. comic-book thinking, unrealistic and oversimplified)… one that can send shares up by making the company look impatient and insensitive. Who wouldn’t want to stick around to “stay in that tent”?
Or to put it another way, what’s missing from such a closed-minded, tightly controlled ethics equation in public safety product management?
Ethics board members worried the drones could exacerbate racial injustice, undermine privacy through surveillance and become more lethal if other weapons were added, member Wael Abd-Almageed said in an interview. “What we have right now is just dangerous and irresponsible,” said Abd-Almageed, an engineering research associate professor at University of Southern California.
“Dangerous and irresponsible” sounds like something the company might adopt as its tagline, or publish as its next comic-book take on public safety, given that such thinking is being credited with “shooting” its shares up.
Let’s be honest here. These drones are a terrible idea. As someone who has been regularly breaking AI and talking very publicly about it, I can say the Axon concept sounds like a disaster: it would increase harm instead of lowering it, distract from real solutions, and thus be a total waste of money.