
Siri and Alexa Reinforce Gender Bias, U.N. Finds.

“The servility expressed by digital assistants projected as young women provides a powerful illustration of gender biases coded into tech products,” a @UN report finds.


From left: the Apple HomePod, Google Home and Amazon Alexa. Their voice-activated assistants reinforce problematic gender stereotypes, Unesco says in a new report.

Why do most virtual assistants that are powered by artificial intelligence — like Apple’s Siri and Amazon’s Alexa system — by default have female names, female voices and often a submissive or even flirtatious style?

The problem, according to a new report released this week by Unesco, stems from a lack of diversity within the industry that is reinforcing problematic gender stereotypes.

“Obedient and obliging machines that pretend to be women are entering our homes, cars and offices,” Saniye Gulser Corat, Unesco’s director for gender equality, said in a statement. “The world needs to pay much closer attention to how, when and whether A.I. technologies are gendered and, crucially, who is gendering them.”

One particularly worrying reflection of this is the “deflecting, lackluster or apologetic responses” that these assistants give to insults.

The report borrows its title — “I’d Blush if I Could” — from a standard response from Siri, the Apple voice assistant, when a user hurled a gendered expletive at it. When a user tells Alexa, “You’re hot,” her typical response has been a cheery, “That’s nice of you to say!”

Siri’s response was recently altered to a flatter “I don’t know how to respond to that,” but the report suggests that the technology remains gender-biased, arguing that the problem starts with engineering teams that are staffed overwhelmingly by men.

Here is the full article if you’d like to read more.

As the report puts it, “The more that culture teaches people to equate women with assistants, the more real women will be seen as assistants — and penalized for not being assistant-like.”

 
