Amazon's facial recognition software found to be racist, sexist

When covering advances in technology, including artificial intelligence and facial recognition software, I’ve run across all manner of complaints from various activists. Many worry about invasions of privacy or government overreach, where Big Brother compiles lists of citizens and tracks their every move. While some of these fears may be justified to a degree, one thing I didn’t see coming was the allegation that the software would turn out to be bigoted. And yet that’s the latest claim we’re hearing about Amazon’s facial recognition products. They’re surprisingly accurate if you happen to be a white male, but otherwise… not so much. (Associated Press)


Facial-detection technology that Amazon is marketing to law enforcement often misidentifies women, particularly those with darker skin, according to researchers from MIT and the University of Toronto.

Privacy and civil rights advocates have called on Amazon to stop marketing its Rekognition service because of worries about discrimination against minorities. Some Amazon investors have also asked the company to stop out of fear that it makes Amazon vulnerable to lawsuits.
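For the technically curious, Rekognition’s face-analysis API returns, among other attributes, a gender call with a confidence score attached. Here’s a minimal sketch of the sort of request involved, using the AWS SDK for Python (boto3); the image file name is made up for illustration:

```python
import boto3

# Assumes AWS credentials are already configured in the environment.
client = boto3.client("rekognition")

# "face.jpg" is a hypothetical local photo, not part of the study.
with open("face.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request the full attribute set, including Gender
)

# Each detected face comes back with a predicted gender and a confidence score.
for face in response["FaceDetails"]:
    gender = face["Gender"]
    print(gender["Value"], f'{gender["Confidence"]:.1f}%')
```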

In the accuracy tests, the researchers asked the Rekognition software to identify “light-skinned” (read: white) and “darker-skinned” (read: black or brown) individuals of both genders. Amazingly, in almost one third (31%) of cases, the software couldn’t even conclude that darker-skinned women were women, misidentifying them as men. White women were tagged as men 7% of the time. Darker-skinned men were misidentified only 1% of the time, and there were zero errors in identifying white males.
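The methodology behind those numbers is straightforward in principle: run the classifier over a labeled set of face photos, then break the error rate out by subgroup instead of reporting one flattering overall figure. A rough sketch of that kind of tally, where the records are invented placeholders rather than the researchers’ actual benchmark:

```python
from collections import defaultdict

# Hypothetical audit records: (true_gender, predicted_gender, subgroup).
# The subgroup labels mirror the study's skin-type/gender breakdown.
records = [
    ("female", "male", "darker-skinned female"),
    ("female", "female", "darker-skinned female"),
    ("male", "male", "lighter-skinned male"),
    # ... a real audit would have hundreds of labeled images per subgroup
]

totals = defaultdict(int)
errors = defaultdict(int)

for true_gender, predicted_gender, subgroup in records:
    totals[subgroup] += 1
    if predicted_gender != true_gender:
        errors[subgroup] += 1

# Per-subgroup rates, rather than one overall accuracy number.
for subgroup in totals:
    rate = errors[subgroup] / totals[subgroup]
    print(f"{subgroup}: {rate:.0%} misclassified")
```

The per-subgroup breakdown is the whole trick: a single aggregate accuracy number would have hidden exactly the disparity the study found.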

How is that even possible? The study concludes that “artificial intelligence can mimic the biases of their human creators.” The less colorful explanation researchers usually offer is training data: a system that learns from photo collections dominated by light-skinned men gets the most practice, and therefore the most accuracy, on faces that look like them. I’m pretty sure we’re all familiar with the standard jokes about how “all Xs look alike to white people,” where X can be any other race you choose to insert. If this software was primarily developed by white guys, and there’s any truth to that old saw, perhaps that’s part of the explanation.


But how in Sam Hill is it missing on identifying women as women? With the exception of transgender people or customers attending drag shows, that’s generally a pretty easy call to make unless the individual is going out of their way to look like the opposite gender. (And even then it’s not usually that difficult.) If the software can correctly identify men virtually all of the time, how can it fail to pick up on someone being female in so many cases?

This still sounds like some extremely useful software for law enforcement, and I hope agencies aren’t abandoning it entirely. But at the same time, police need to be made aware of the increased error rates when it comes to minority women and take that into account before springing into action.
