Amazon weighs request to stop selling facial recognition software... briefly

As we’ve seen, privacy advocates and some of those interested in criminal justice reform have serious concerns about law enforcement making use of facial recognition software when investigating crimes. Many have called for a ban on such practices by the police and other law enforcement agencies. Apparently, their calls were heard by the executives at Amazon, maker of Rekognition, a facial recognition program the company sells to law enforcement. A vote was recently taken among shareholders to determine whether those sales should be curtailed, and the results went very solidly in one direction. Tough luck, guys. We’re doing business over here. (NY Post)


Amazon.com Inc shareholders overwhelmingly rejected a proposal that the company stop selling facial recognition technology to government agencies, while a resolution to audit the service drew more support, a regulatory filing on Friday showed.

Some 2.4 percent of votes were in favor of the ban. A second proposal that called for a study of the extent to which Amazon’s “Rekognition” service harmed civil rights and privacy garnered 27.5 percent support.

Amazon’s sale of the technology to law enforcement in Oregon and Florida has put the company at the center of a growing US debate over facial recognition, with critics warning of false matches and arrests and proponents arguing it keeps the public safe.

Hey, at least they put it to a vote in a democratic fashion, right? And a stunning 2.4% of their shareholders wanted to take the product off the market. Barely more than a quarter of them were interested in a study to see if the software was violating anyone’s privacy or civil rights. (The study will go forward anyway.)

It’s not as if people shouldn’t have legitimate concerns over the specific capabilities of Rekognition. As we covered here previously, independent testing of the product certainly produced some dodgy results. The application was very good at identifying white males, but that was about it. The error rate for white women was measurably higher, and by the time you got around to black women, it could barely identify them correctly as female. So there’s clearly room for improvement.

But the underlying questions about invasions of privacy still ring hollow to me. When the cops are investigating a crime, if the software spits out a misidentification, what harm is actually taking place? A pair of human eyes should be able to look at the picture of the new “suspect” and usually determine that it’s not a match for the person they’re looking for. In the worst-case scenario, perhaps a detective will need to show up, interview the person, quickly find out that they were nowhere near the scene of the crime, and eliminate them from suspicion.


The facial recognition software is only a tool designed to assist with investigations. It’s not a digital Judge Dredd scanning everyone’s profiles and meting out justice from the ether. And if the cops are interested in using it, somebody will come along to develop it. Might as well be Amazon, I suppose.
