
Communications of the ACM

ACM TechNews

Fearful of Bias, Google Blocks Gender-Based Pronouns From New AI Tool


Google refused to take chances at a time when gender issues are reshaping politics and society.

Google's technology will not suggest gender-based pronouns because the company fears the risk is too high that its Smart Compose technology might predict someone's sex or gender identity incorrectly and offend users.

Credit: Toby Melville/REUTERS

In May, Google introduced a new feature for Gmail that automatically completes sentences for users as they type, but the tool will not suggest gender-based pronouns because the risk is too high that the system might predict someone's sex or gender identity incorrectly and offend users.

Google is taking extra precautions because it wants to position itself as understanding the nuances of artificial intelligence (AI) better than its competitors.

Gmail has 1.5 billion users, and its "Smart Compose" feature assists on 11% of messages worldwide sent from Gmail.com.

Google's decision to be cautious on gender follows some high-profile embarrassments for the company's predictive technologies.

The policy of banning gendered pronouns also affects the list of possible responses in Google's Smart Reply service, which allows users to respond instantly to text messages and emails with short phrases such as "sounds good."

From Reuters
View Full Article

 

Abstracts Copyright © 2018 Information Inc., Bethesda, Maryland, USA


