UN Report: Gender bias in voice assistant coding


United Nations (CBS News) — Within the next five years, people will have more conversations with voice assistants than with their partners, the U.N. says, so it matters what those assistants have to say.

The numbers are eye-popping: 85% of Americans use at least one product with artificial intelligence (AI), and the number of voice assistant users worldwide is projected to reach 1.8 billion by 2021, so the impact of these “robot overlords” is unparalleled.

But AI voice assistants, including Apple’s Siri, Amazon’s Alexa, Microsoft’s Cortana, and Google’s Assistant, are inflaming gender stereotypes and teaching sexism to a generation of millennials, a new U.N. study says, by modeling “docile and eager-to-please helpers” that accept sexual harassment and verbal abuse.

A 145-page U.N. report published this week by UNESCO, the United Nations’ educational, scientific and cultural organization, concludes that the voice assistants we speak to are programmed to be submissive and to accept abuse as the norm.

The report is titled, “I’d blush if I could: Closing Gender Divides in Digital Skills Through Education.”

The authors say the title comes from Siri’s response when a user said, “Hey Siri, you’re a bi***.” That programmed response was revised in April, when the report was distributed in draft form.

The report reveals a pattern of “submissiveness in the face of gender abuse,” with inappropriate responses that the authors say have remained largely unchanged in the eight years since the software hit the market.

Dr. Saniye Gülser Corat, Director of the Division for Gender Equality at UNESCO, conceived and developed the report with Norman Schraepel, Policy Adviser at the German Agency for International Cooperation, through the EQUALS global partnership for gender equality in the digital age. EQUALS is a non-governmental coalition of corporate leaders, governments, businesses, academic institutions and community groups that partners with several U.N. agencies, including U.N. Women, to promote gender balance in the technology sector for both women and men.

Both Alexa and Siri, the study says, fuel gender stereotyping: “Siri’s ‘female’ obsequiousness — and the servility expressed by so many other digital assistants projected as young women — provides a powerful illustration of gender biases coded into technology products.”

The study blames many factors but says programming is the main culprit, and it recommends changing who does the programming: “In the United States, the percentage of female computer and information science majors has dropped steadily over the past 30 years and today stands at just 18 percent, down from 37 percent in the mid-1980s.”

“AI is not something mystical or magical. It’s something that we produce and it is a reflection of the society that creates it,” Gülser Corat told CBS News. “It will have both positive and negative aspects of that society.”

Artificial intelligence “really has to reflect all voices in that society. And the voice that is missing in the development of AI at the moment is the voice of women and girls,” she said. 

You can read and download the full report here.

