The search results for ‘Bitches Near Me’ may have been a machine learning error, but they reveal a bitter reality.
A couple of days ago, someone tweeted that searching the term ‘Bitches Near Me’ on Google Maps yielded some very shocking results.
When a person searched this term, girls’ schools, girls’ hostels and other female-centric places appeared. I tried the search myself and found results for all the female hostels around my location.
The revelation started on Twitter, with the first tweet appearing from an Indian user who thought this might be an Easter egg intentionally put into the code by engineers at Google.
So how many of you know this but when you search "bitches near me" on @googlemaps it gives the location of girls hostels near you.
Is this a bug @Google or its one of those Easter eggs of @googlemaps— R Gupta (@RGupta26857811) November 25, 2018
And of course, the idea that this might have been intentional pissed a lot of people off and rightly so.
In this day and age, the idea of a company as big as Google using an offensive term for women as part of its search is unthinkable.
If you search "Bitches near me" on Google Maps, it will show you the locations of all the Girl Schools, parlours, hostels etc. near you. Absolutely shocking. Shameful act by Google.#Google #Googlemaps pic.twitter.com/rFkoL8NC75
— Mush speaking (@MushfiqueAhmed4) November 26, 2018
What the fuck. It shows the same here as well. What level of fucked up and disrespectful do you have to be to make your algorithm read that? https://t.co/m3NFYtS2If
— Qalbi (@apkiammi) November 26, 2018
Saw this meme on Reddit saying open your Google maps and search, 'bitches near me'. Out of curiosity, I did it. The results were shocking. @Google needs to work more on its maps!
— Aqeel Khan (@AforAqeel) November 26, 2018
A ‘Bitches Near Me’ Google Search Will Take You to Women’s Hostels. Considering the monopoly and clout that Google enjoys, it must be held accountable for promoting such stereotypes. #GoogleSearch #Goofup #Badword #Monopoly #CORRECTION #CHANGE #RespectWomen
— Sachin (@Sachin55600323) November 27, 2018
Apparently, these results were not limited to the subcontinent.
Sorry for that guys, but it works in Jordan in the same way.
Fix this please @googlemaps https://t.co/koooOehtdP— Osama Jaber أسامة جبر (@jbr_osama) November 26, 2018
Google appears to have taken notice of the issue and corrected it, since searching this term now yields no results. A lot of people were left wondering why this was the case in the first place.
I talked to our very own resident tech geniuses and my coworkers, Daniyal Shahid and Syed Saad. Both of them believed that this was probably not intentional but a result of Google’s machine learning algorithms, which associate search queries with the results users actually click on. So, as you can imagine, enough people have probably searched this term and then clicked on women’s hostels and schools that Google Maps learned to display those results.
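That click-feedback explanation can be illustrated with a toy sketch. Everything here is hypothetical (the `ClickFeedbackRanker` class and the place names are invented for illustration; Google’s real ranking systems are vastly more complex), but it shows how a system that learns purely from user clicks can end up attaching an offensive query to real places:

```python
from collections import defaultdict

class ClickFeedbackRanker:
    """Toy model of click-based relevance learning: each time a user
    clicks a result for a query, that (query, result) pair is reinforced."""

    def __init__(self):
        # query -> {place: number of recorded clicks}
        self.clicks = defaultdict(lambda: defaultdict(int))

    def record_click(self, query, place):
        self.clicks[query.lower()][place] += 1

    def rank(self, query):
        # Places clicked most often for this query rise to the top.
        counts = self.clicks.get(query.lower(), {})
        return sorted(counts, key=counts.get, reverse=True)

ranker = ClickFeedbackRanker()
# If many users searching an offensive term click on women's hostels,
# the system "learns" to surface those places for that term.
for _ in range(5):
    ranker.record_click("offensive term", "Women's Hostel A")
ranker.record_click("offensive term", "Cafe B")

print(ranker.rank("offensive term"))  # ['Women's Hostel A', 'Cafe B']
```

The point of the sketch is that no engineer has to write the offensive mapping by hand: the association emerges from aggregate user behaviour, which is consistent with my coworkers' theory that the results were learned rather than intentional.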
This Twitter user demonstrated how this was indeed very concerning since changing the word ‘bitches’ with ‘dog’ yielded completely different results.
So here's in two screenshots is the difference between [Googling for] "Dogs near me" and "Bitches near me". pic.twitter.com/v5CtBdiRPN
— Prasanto K Roy (@prasanto) November 26, 2018
While the mistake has now been fixed and the search shows a ‘No Results’ message, it brings to light a large oversight on the part of Google. The company has yet to release a statement about the incident, but hopefully it will take responsibility for the mistake and exercise better oversight in the future.
Cover Image Source: @ahappychipmunk via Twitter.com/netpremacy