What's the weirdest thing you've said to Siri or Alexa? Have you had conversations with them and then realised they're just software? A recent U.N. report suggests that there may be a gender bias built into AI.
“Obedient and obliging machines that pretend to be women are entering our homes, cars and offices,” Saniye Gulser Corat, Unesco’s director for gender equality, said in a statement. “The world needs to pay much closer attention to how, when and whether A.I. technologies are gendered and, crucially, who is gendering them.”
Almost all major AI assistants have female voices and names: Apple's Siri, Amazon's Alexa and Microsoft's Cortana. Experts argue that this reinforces the stereotype of women as subservient and existing for others' convenience.
The issue also comes from users, mainly male ones, directing explicit sexual abuse at their devices. When a user tells Alexa, "You're hot," her typical response has been a cheery, "That's nice of you to say!"
The U.N. report explains, “This harassment is not, it bears noting, uncommon. A writer for Microsoft’s Cortana assistant said that ‘a good chunk of the volume of early-on enquiries’ probe the assistant’s sex life. Robin Labs, a company that develops digital assistants to support drivers and others involved in logistics, found that at least 5 percent of interactions were unambiguously sexually explicit.”
This is a typical example of how male influence can be embedded without the majority of customers noticing it. Many of the engineering teams who make these devices are male-dominated: only 12 percent of AI researchers and 6 percent of software developers in the field are female. There is a clear imbalance in how these products are programmed. Alexa's obsequious servility provides an insight into how AI devices are given a gender bias from the get-go.
“It’s not always malicious bias, it’s unconscious bias, and lack of awareness that this unconscious bias exists, so it’s perpetuated,” said Allison Gardner, a co-founder of Women Leading in A.I. “But these mistakes happen because you do not have the diverse teams and the diversity of thought and innovation to spot the obvious problems in place.”
The way the major AI assistants were named also supports the gender bias claim. Cortana was named after a scantily clothed, sensuous woman who is a character in the video game Halo. Apple's Siri is a Norse name which means "beautiful woman who leads you to victory". And although Google Home doesn't have a female name, the default voice is female.
So why is the default voice for all of these devices female? A recent study suggests that when people need help, they prefer to hear it from a female voice rather than a male one, which carries connotations of authority rather than assurance.
Chances are that many companies program their assistants to stay cheerful and polite, even when the user hurls abuse at the device. The assistants don't push back because a kind, stereotypically female response makes the user more inclined to keep interacting with the product.
More than 90 million US smartphone owners use voice assistants at least once a month. Plus, 24 percent of households own a smart speaker, and 52 percent of all smart speaker owners say they use their device daily.
“The more that culture teaches people to equate women with assistants, the more real women will be seen as assistants — and penalized for not being assistant-like,” the report states.
How do we solve the bias?
The U.N. study states that efforts should be made to make the teams of producers, developers and engineers who build these devices more gender-balanced. It also suggests that companies consider shipping male default voices instead of the standard female voice.
Apple has updated Siri in recent releases to respond to sexual comments with “I don’t know how to respond to that". Amazon's Alexa now replies to some sexually explicit queries by saying, “I’m not sure what outcome you expected.”
This TED talk by Josie Young, who specializes in feminist AI and has created a Feminist Chatbot Design Process, provides an insight into how these AI devices handle jokes and abuse.
Many initiatives are being taken to combat the frequency of sexual comments directed at AI assistants. Organisations like Feminist Internet have created a new chatbot called F'xa. This bot aims to educate users on how male-dominated industries can shape the final product. You can try this out here. In an interview with Vox, the chatbot said “a technology is biased against women if its designers haven’t thought about how it might encourage sexist behavior.”