Nine of the twelve ethics board members appointed by Axon to advise on its technology have resigned, citing the company's plans to deploy Taser-equipped drones and widespread surveillance in schools. “After several years of work, the company has fundamentally failed to embrace the values we were trying to instill,” the departing members write. “We have lost faith in Axon’s ability to be a responsible partner.”
Axon (formerly Taser) has grown into a law enforcement software and hardware giant in recent years, providing not only its familiar, formerly eponymous electric weapons but also body cameras and entire digital evidence management platforms. Setting aside for now the risks inherent in privatizing such things, Axon has been fairly thoughtful about its technology, seeking input from the communities in which these tools will be used as well as from the police officers who will carry or operate them.
The AI Ethics Board was created a few years ago, when it became clear that machine learning was an extremely valuable tool but also one that could easily be poorly built, misused, or both. A board of experts, academics, and industry professionals was to weigh in on proposed technologies, recommending safeguards, accountability measures, and so on.
And it was off to a good start, the outgoing members wrote in a statement:
Each of us joined this board in the belief that we could influence the direction of the company in ways that would help mitigate the harms that policing technology can cause and better realize whatever benefits it offers. For a while, we saw that influence reflected in Axon’s decisions. From not equipping any of its products with facial recognition, to withdrawing a new software tool for scraping data from social media sites, to pushing for much-needed legislation governing the use of license plate readers, we saw tangible evidence of the difference we made.
I spoke with CEO Rick Smith back in 2020 and found he had a refreshingly honest view of whether technology is the answer to the ongoing policing crisis.
“Technology is not a panacea. It will not solve these problems for us,” he said. But it is equally true, he continued, that without technology, some of these problems will remain intractable. Body cameras and other digital records of police encounters are not an unqualified good, but how else can we expect such events to be systematically documented? Those who will define these tools are not the police but the companies that make them, and Axon has worked hard to put itself in that position.
But lately, the company may have overreached in its view of how much technology, and what kind, should be used as a deterrent to mass shootings.
“[Axon] intends to develop Taser-equipped drones, pre-position them at potential targets for mass and school shootings, and surround those targets with surveillance cameras with real-time streaming capabilities,” the ethics board wrote in its letter.
“The Taser drone was presented to and discussed with the board as something to be piloted under strict controls, because there are so many unanswered questions about using this kind of equipment. Yet the board very quickly learned that Axon planned to announce this tool as a broadly deployed concept, completely ignoring the AI Ethics Board’s caution,” said Mecole McBride, a former board member and director of advocacy at NYU’s Policing Project, a police oversight group. “If it was that easy to sideline the board on something so important, we had to ask ourselves: what are we doing here?”
The board warned Axon that if it pressed ahead, there would be resignations. It pressed ahead, and they resigned.
Given this, and protests from other members of the community that this may not be an appropriate response to the threat of mass shootings, Smith wrote a blog post acknowledging that the company may have gotten ahead of itself.
“In light of the feedback, we are pausing this project and refocusing on further engagement with key stakeholders to fully explore the best path forward,” he wrote. “A remotely operated, non-lethal TASER-equipped drone in schools is an idea, not a product, and it has a long way to go. We have a lot of work and research ahead to determine whether this technology is even viable and whether the societal concerns can be adequately addressed before moving forward.”
He also said the company would “improve” its process for gathering dissenting opinions, although, as McBride pointed out, it appears to have simply bypassed the existing one. What improvement would stop it from doing the same thing in the future to any body, however well staffed, that plays a purely advisory role while hawks hold the leadership positions? Axon did not answer questions about the board’s future, referring instead to the post above.
Curiously, Smith states in that post that the outgoing ethics board members “chose to withdraw from direct engagement on these issues before we heard or had the opportunity to respond to their technical questions.”
However, Max Isaacs, a Policing Project attorney who worked with Axon and the board on the idea, stated that “for over a year, the Ethics Board has been discussing the parameters of a narrow pilot program with Axon,” suggesting Smith has it backwards. “The company’s breach of its promise to consult the Ethics Board before making such important decisions, and its embrace of persistent mass surveillance, indicate that Axon is not sufficiently committed to developing this technology responsibly,” Isaacs said.
An Axon spokesperson explained that the board had evaluated a Taser-equipped drone for police use, not one “pre-positioned in public places” as briefly proposed for schools. That is thin cover for claiming the board didn’t weigh in: its concerns about police deployments would no doubt apply even more forcefully to school deployments. And, as the resignation letter notes, Axon didn’t give the board much time to respond; more likely, it knew exactly what the response would be.
Whatever the case, the Taser drone plan is on hold, and Axon may think twice before wading into a powder-keg debate with a match in hand. Technology will always play an important role in security and law enforcement, but nobody is served, and serious harm can result, when it moves faster than we can think it through.
Credit: techcrunch.com