Female-voice AI reinforces bias, says UN report

AI-powered voice assistants with female voices are perpetuating harmful gender biases, according to a UN study.

These female-voiced helpers are portrayed as “obliging and eager to please”, reinforcing the idea that women are “subservient”, it finds.

Particularly worrying, it says, is how they often give “deflecting, lacklustre or apologetic responses” to insults.

The report calls for technology firms to stop making voice assistants female by default.

The study from Unesco (the United Nations Educational, Scientific and Cultural Organization) is entitled “I’d blush if I could”, a title borrowed from Siri’s response to being called a sexually provocative term.

“Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminised digital assistants to greet verbal abuse with catch-me-if-you-can flirtation,” the report says.

“Because the speech of most voice assistants is female, it sends a signal that women are… docile helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’. The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility,” the report says.
