First UNESCO recommendations to combat gender bias in applications using artificial intelligence

Beginning as early as next year, many people are expected to have more conversations with digital voice assistants than with their spouses.

Today, the vast majority of these assistants, from Amazon’s Alexa to Microsoft’s Cortana, are projected as female in name, voice and ‘personality’.

‘I’d blush if I could’, a new UNESCO publication produced in collaboration with Germany and the EQUALS Skills Coalition, turns a critical lens on this growing and global practice, explaining how it:

  1. reflects, reinforces and spreads gender bias;
  2. models acceptance of sexual harassment and verbal abuse;
  3. sends messages about how women and girls should respond to requests and express themselves;
  4. makes women the ‘face’ of glitches and errors that result from the limitations of hardware and software designed predominantly by men; and
  5. forces a synthetic ‘female’ voice and personality to defer questions and commands to higher (and often male) authorities.