Digital Assistants Enforce Gender Bias – UN report
Report claims female-voiced Siri and Alexa entrench gender biases with submissive responses
Gender inequality is now also being blamed in part on digital voice assistants such as Siri and Alexa, according to a report from a UN agency.
The central thrust of the report is that these digital assistants are helping to entrench harmful gender biases. It says that the often submissive and flirty responses offered by the assistants to many queries – including abusive queries – reinforce ideas of women as subservient.
Digital assistants are an increasingly useful tool for many households, and smart speakers are reportedly the seventh most widely used device on a daily basis. The UK government said in April that smart speakers could now be used to access more than 12,000 pieces of government information.
Gender bias
But the UN report argues that there is a problem with smart speakers and digital assistants.
The report from the UN agency Unesco is entitled ‘I’d Blush if I Could’, which is apparently the response that Siri gives to the phrase: ‘You’re a slut’.
“Siri’s ‘female’ obsequiousness – and the servility expressed by so many other digital assistants projected as young women – provides a powerful illustration of gender biases coded into technology products, pervasive in the technology sector and apparent in digital skills education,” the report said.
“Today and with rare exception, most leading voice assistants are exclusively female or female by default, both in name and in sound of voice,” the report noted. “Amazon has Alexa (named for the ancient library in Alexandria), Microsoft has Cortana (named for a synthetic intelligence in the video game Halo that projects itself as a sensuous unclothed woman), and Apple has Siri (coined by the Norwegian co-creator of the iPhone 4S and meaning ‘beautiful woman who leads you to victory’ in Norse).”
“While Google’s voice assistant is simply Google Assistant and sometimes referred to as Google Home, its voice is unmistakably female,” said the report. “AI technologies and the complex processes that undergird them require extensive guidance and tinkering to project gender and a human-like personality in ways that are familiar and satisfying to customers.”
“To justify the decision to make voice assistants female, companies like Amazon and Apple have cited academic work demonstrating that people prefer a female voice to a male voice,” the report stated. “This rationale brushes away questions of gender bias: companies make a profit by attracting and pleasing customers; customers want their digital assistants to sound like women; therefore digital assistants can make the most profit by sounding female.”
“A related or concurrent explanation for the predominance of female voice assistants may lie in the fact that they are designed by workforces that are overwhelmingly male,” said the report.
Gender neutral
The report went on to say that digital assistants should not be made female by default, and that technology firms should explore the feasibility of developing a neutral machine gender that is neither male nor female.
Research in 2017 found that British parents were still steering young girls away from pursuing jobs in IT, as technology careers were still viewed as better suited to boys.
Parents apparently picked traditional jobs such as doctor (24 percent), teacher (20 percent) and lawyer (17 percent) as the top preferred careers for girls.
Last month it was reported that a global team at Amazon is reviewing audio clips of people speaking to their Alexa-powered Echo smart speakers.