AI voice assistants fuel sexist gender stereotypes: UN study


Hey Siri, is this sexist?

A new study from the United Nations says that female AI assistants reinforce harmful gender stereotypes.

Apple’s Siri, Google’s Google Assistant, Microsoft’s Cortana and Amazon’s Alexa all come with female voices, though most offer the choice of switching to a male voice setting. The UN report states that when asked, these technologies say they are genderless — but they clearly have female names and voices.

“Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’. The assistant holds no power of agency beyond what the commander asks of it,” the report reads. “It honors commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment.”

Many tech companies have chosen female voices over male ones because women’s voices are perceived as “helpful,” while men’s are perceived as “authoritative.” The report says tech companies are simply giving users what they prefer, but in doing so they further entrench these prevalent sexist stereotypes.

It doesn’t help that AI developers are mostly male. The lack of women in the field builds an inherent bias into the technology, the UN report says.

Another glaring issue is how these female-voiced assistants respond to blatant sexual harassment, the study says. Many of the queries thrown at AI assistants are sexual in nature. But because the responses are scripted by majority-male teams, the assistants answer with flirty, playful jabs instead of shutdowns, reinforcing the idea that such statements are part of everyday conversation rather than forms of harassment.

“When asked, ‘Who’s your daddy?’, Siri answered, ‘You are’. When a user proposed marriage to Alexa, it said, ‘Sorry, I’m not the marrying type’. If asked on a date, Alexa responded, ‘Let’s just be friends’. Similarly, Cortana met come-ons with one-liners like ‘Of all the questions you could have asked…,’ ” the report explains.

As a solution, the report recommends that tech giants remove these feminine tropes from their products and work toward building a “genderless” voice assistant instead.