Google Search is the go-to online search engine for billions of people around the world, so needless to say, it processes an astronomical number of queries each day. Among other things, this processing allows the company to come up with Autocomplete predictions (example shown below).

google-autocomplete

Google defines Autocomplete as “a feature within Google Search designed to make it faster to complete searches that you’re beginning to type.”

While that’s definitely convenient, sometimes the kind of predictions Google Search offers may leave a bad taste in the mouth. And worse, there are even times when the predictions you see make you question whether it’s morally okay for a responsible company like Google to offer them to its users.

Case in point: the autocomplete search predictions highlighted by Claudia Zettel on Twitter. Take a look at Claudia’s tweet below and you’ll likely agree that the predictions Google offered are strongly misogynistic.

For those who don’t understand German, here are translations of some of these predictions:

google-autocomplete-1

google-autocomplete-2

google-autocomplete-3

google-autocomplete-4

google-autocomplete-5

google-autocomplete-6

google-autocomplete-7

Of course, the severity varies, but there’s no denying that the predictions are sexist in nature. We cross-verified these autocomplete predictions on our end (in a Firefox incognito window) and saw similar results:

google-autocomplete-verify

Even if we agree that most adults are unlikely to be swayed by these predictions, what about kids? School-goers (boys or girls) whose minds are still developing are the most vulnerable to this kind of prediction.

google-autocomplete-comment

For its part, Google clearly states these are just predictions (and not suggestions). But the question is, are most kids aware of this? No. For them, this is what Google is “suggesting,” so, for them, there’s an element of truth in these “suggestions.”

There are comments on Claudia’s post that point out that this is a problem rampant in our society as a whole.

google-autocomplete-society

google-autocomplete-comment-2

This is true; back in 2013, even UN Women used Google Search’s Autocomplete feature to highlight the rampant sexism in our society.

While this ‘society’ argument may be correct, it doesn’t take any of the sharpness out of Claudia’s point.

The food for thought here is: should Google be highlighting this kind of mindset, however rampant, in its search predictions? Agreed, unless something is unlawful, Google can’t be forced to remove it from its products. But what about content that is morally wrong?

Google has in the past taken action on similar issues brought to the company’s notice. Back in 2016, the search giant removed predictions that completed queries like “are jews,” “are women,” and “are muslims” with words like “evil” and “bad.”

There have been other instances as well. For example, several other Autocomplete-related issues were highlighted by Wired earlier this year. Google immediately swung into action and removed some of the predictions.

We hope the Mountain View, California-based company will take appropriate action in this case as well. In fact, in a follow-up tweet to her post, Claudia says she learned from a friend working at the company that the relevant team is looking into it.

google-looking-autocomplete

If Microsoft’s translation sounds a bit unclear, here’s Google’s translation of Claudia’s tweet:

And before I explain one more type Google. My best friend works in the Google Suggest team and they look at each other very well and intervene. Thanks for the attention.

We have reached out to both Claudia and Google to see if either of them can provide any more relevant details on the matter. The story will be updated as and when there’s new information to share.


