Google AI Won’t Label People ‘Man’ or ‘Woman’ to ‘Avoid Bias’

Google LGBT pride bus (AFP Contributor/Getty)

A Google AI tool used to recognize objects in images will reportedly no longer attach gender labels such as “woman” or “man” to people in pictures, in order to “avoid bias.”

Business Insider reports that a Google AI tool that can recognize and label the contents of an image will no longer assign gender labels to photos of people. Google’s Cloud Vision API is a service that lets developers attach labels to photos identifying their contents.

The tool can detect faces, landmarks, brand logos, and explicit content, and is used across a number of industries, from retailers powering their visual search features to researchers identifying animal species. Google reportedly told developers in an email on Thursday morning that it would no longer use “gendered labels” for its image tags.
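For context, a typical call to the label-detection feature looks something like the following sketch, which assumes the official google-cloud-vision Python client and a hypothetical local file named photo.jpg; after the change described here, a portrait would be expected to return a label such as “Person” rather than “Man” or “Woman.”

```python
# Minimal sketch (not Google's internal code) of a Cloud Vision API
# label-detection request using the google-cloud-vision Python client.
# Assumes Google Cloud credentials are already configured in the environment.
from google.cloud import vision


def label_image(path):
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    # Ask the API for descriptive labels for the image contents.
    response = client.label_detection(image=image)
    # Per the change reported above, photos of people would now yield
    # a label like "Person" instead of "Man" or "Woman".
    for label in response.label_annotations:
        print(label.description, label.score)


label_image("photo.jpg")  # "photo.jpg" is a hypothetical example file
```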

Any images of people will reportedly now be tagged with “non-gendered” labels such as “person.” Google says the change was made because an individual’s gender cannot be determined from appearance alone. The company also cited its ethical rules on AI, stating that gendering photos could add to unfair bias.

The email states: “Given that a person’s gender cannot be inferred by appearance, we have decided to remove these labels in order to align with the Artificial Intelligence Principles at Google, specifically Principle #2: Avoid creating or reinforcing unfair bias.”

Frederike Kaltheuner, a tech policy fellow at Mozilla with expertise in AI bias, told Business Insider that the update was “very positive.” Kaltheuner stated in an email:

Anytime you automatically classify people, whether that’s their gender, or their sexual orientation, you need to decide on which categories you use in the first place — and this comes with lots of assumptions.

Classifying people as male or female assumes that gender is binary. Anyone who doesn’t fit it will automatically be misclassified and misgendered. So this is about more than just bias — a person’s gender cannot be inferred by appearance. Any AI system that tried to do that will inevitably misgender people.

Google states in its own AI principles that algorithms and datasets can reinforce bias: “We will seek to avoid unjust impacts on people, particularly those related to sensitive characteristics such as race, ethnicity, gender, nationality, income, sexual orientation, ability, and political or religious belief.”

So far, only one developer has commented publicly on the change, attributing it to “political correctness.” The developer stated: “I don’t think political correctness has room in APIs. If I can 99% of the times identify if someone is a man or woman, then so can the algorithm. You don’t want to do it? Companies will go to other services.”

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan or email him at lnolan@breitbart.com
