AI assistants like Siri and Alexa spread sexism, UN study reveals
Voice-activated personal assistants like Amazon’s Alexa and Apple’s Siri are reinforcing and spreading sexism, according to a recent United Nations study.

The feminine-voiced AI helpers, which also include Google Assistant and Microsoft’s Cortana, perpetuate the harmful gender stereotype that women are subservient and tolerate being treated poorly, the report from UNESCO found.

“Because the speech of most voice assistants is female, it sends a signal that women are… docile helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK,’” the report says.

“The assistant holds no power of agency beyond what the commander asks of it. It honors commands and responds to queries regardless of their tone or hostility.”

Particularly worrying, the study found, is that the assistants often give “deflecting, lackluster or apologetic responses” when insulted, reinforcing the gender bias that women are submissive and will let abuse slide.

“Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminized digital assistants to greet verbal abuse with catch-me-if-you-can flirtation,” the report says.

The report is even titled “I’d blush if I could” — a nod to Siri’s response when a user told it, “Hey Siri, you’re a b–h.”

The study suggested that digital assistants be programmed to discourage gender-based insults. It also called on companies to stop making their assistants female by default and to improve the representation of women in AI-related fields.

Technology giants have said in the past that consumers prefer female voices for their assistants. But the report found that, in general, people prefer voices of the opposite sex.