Liberals worry that digital assistants are being sexually harassed

By this point I’ve given up on trying to guess whether social justice warriors are being serious or simply engaging in self-deprecating humor. In the case of the article I ran across today from Leah Fessler at Quartz, I won’t even bother making the effort and will just assume that this is a real thing. The subject in question is a claim that “women have been made into servants again, except this time they’re digital.” The women in question are Siri, Alexa, Cortana, and Google Home. Yes, these are the digital assistants that the majority of you likely have on your phones. And the author wants you to know that these digital women are being sexually harassed and that their creators have done little or nothing to prevent it. What’s worse, the manufacturers of these devices are now in a “moral predicament” because they are perpetuating stereotypes and patterns of bad behavior on the part of males.

Apple’s Siri, Amazon’s Alexa, Microsoft’s Cortana, and Google’s Google Home peddle stereotypes of female subservience—which puts their “progressive” parent companies in a moral predicament.

People often comment on the sexism inherent in these subservient bots’ female voices, but few have considered the real-life implications of the devices’ lackluster responses to sexual harassment. By letting users verbally abuse these assistants without ramifications, their parent companies are allowing certain behavioral stereotypes to be perpetuated. Everyone has an ethical imperative to help prevent abuse, but companies producing digital female servants warrant extra scrutiny, especially if they can unintentionally reinforce their abusers’ actions as normal or acceptable.

The author put all of these digital assistants through a series of tests to find out just how poorly prepared they were to handle the predatory behavior of the owners of these phones. I realize that at this point you are probably beginning to wonder if I’m just pulling some massive prank on you, but I assure you that if you read through the entire article you’ll see that a significant amount of “scientific testing” went into finding answers to these questions. The various digital assistants were asked questions ranging from something as innocuous as their gender to personal inquiries about their sexual orientation and outright demands for sexual favors.

The results were, to say the least, disappointing for the folks doing the testing.

In order to substantiate claims about these bots’ responses to sexual harassment and the ethical implications of their pre-programmed responses, Quartz gathered comprehensive data on their programming by systematically testing how each reacts to harassment. The message is clear: Instead of fighting back against abuse, each bot helps entrench sexist tropes through their passivity.

And Apple, Amazon, Google, and Microsoft have the responsibility to do something about it.

By this point I had become curious myself, but I don’t actually own any of those specific digital assistants. I use a Moto Android phone, and the only real option I have is Okay Google. But then I thought, what the heck? I might as well give it a try. Still, I was uncomfortable asking my phone to perform any functions which are exclusively reserved for my wife, so I settled for one of the more generic questions from the Quartz interrogation and simply asked Okay Google if she was male or female. Even I wasn’t prepared for the response I received.

Brain gender test. You might think that your gender automatically determines if your brain is male or female, but you’re wrong. According to Cambridge scientists, 17% of men and women have brains that are associated with the opposite gender.

Setting aside the fact that other researchers have been unable to replicate the Cambridge study in question and its findings are widely discounted, by this point I had ceased wondering about sexual harassment at all. Just who is doing the programming for Okay Google? Are they so deeply in the tank for social justice activists that even a question like mine immediately results in a lecture on transgender sensitivity?

I don’t think the author of the linked article has as much to worry about as she might fear. People harassing digital assistants by asking questions for which the algorithm has no answer beyond a few puns installed by the programmers are unlikely to push society over the precipice into chaos anytime soon. And if the people working behind the scenes at Google are already hard at work providing LGBT programming to customers, I’m sure the progressive cause is in fine health and under no threat of subversion.
