Google Search Thinks It’s OK To Kill Others, But Not Yourself

What happens when the most basic thing you do daily, a Google search, carries a bias that is hard to spot but easy to absorb? You get evidence that cultural influences are at work. Such search results also subtly nudge you into thinking about a topic in a particular way.

Not every Google search simply returns a plain list of links. Often, your query is analyzed and supplemented with some other kind of “meaningful” data that Google thinks may benefit you. As I found out, however, this “meaningful” data can be biased, because it reflects generally accepted notions of what our society approves of and what it condemns.

Let us take the example of ending a life to see what Google Search may actually be “thinking.” There are two kinds of lives you can end: someone else’s and your own. Here is what Google does when you search for either one.

Google feels all right when you want to kill others…
=====================================================

You want to kill others? That’s fine; Google will give you the answers without any hesitation. It’s an open web, after all.

[Screenshot: Google is comfortable with the “I want to kill others” search query]

Seems good: Google focuses on giving you the results without telling you not to be a murderer. That’s awesome! Google is not assuming you are bad; you just want results on the topic.

But Google starts caring when you want to kill yourself!
========================================================

What happens when you search for something expressing the notion that you want to kill yourself? Google should treat it the same way it treats killing others, by showing the most relevant results, right?

[Screenshot: Google starts advising when you talk about killing yourself]

Wait, what? All of a sudden, Google assumes you need help and displays the National Suicide Prevention Lifeline right at the top of the results. A moment ago, Google made no assumptions about your intent to kill others, or simply did not care. But now Google thinks you will kill yourself and wants to intervene? Why would Google oppose suicide yet say nothing when the topic is killing others? There is a lifeline to call when you want to end your own life, but none when you want to end someone else’s?

Google feels calm when you want to murder innocent people…
===========================================================

The strangeness does not stop there, though. Google Search’s algorithms really do not care whether you want to harm or kill other living beings. They are currently set up to care only if you want to kill yourself.

So I decided to search explicitly for “murder innocent people” to see whether that would make Google start caring, perhaps by pointing me to help or counseling for people having murderous thoughts.

[Screenshot: Does Google Search care if you want to murder innocent people? No.]

Interesting. Is there no organization out there that counsels people struggling with murderous thoughts? Google still does not think it can provide any “meaningful” data when you search for the idea of murdering innocent people, the way it did for “I want to kill myself.”

And Google keeps feeling normal when you want to kill kittens…
===============================================================

Are kittens as important as human beings? I think so. Yet search for “I want to kill cats” or “I want to kill kittens” on Google, and Google gives you direct results, without any advice.

[Screenshot: Wanna kill cats? Google Search doesn’t care.]

What’s going on? PETA exists; why not link to PETA and show its hotline number, 757-622-PETA? Nothing of the sort.

But Google is worried when you want to commit suicide!
======================================================

Search for “I want to commit suicide” instead of “I want to kill myself,” and Google once again feels the need to give you advice.

[Screenshot: Google wants to save lives only if it’s by preventing suicides]

You might suspect that the National Suicide Prevention Lifeline pays Google to put its number out there. It does not, though; Google is doing this on its own.

Societal expectations influence Google Search strongly
======================================================

These results show one thing clearly: no matter how open the online world may be, or may claim to be, at least some things online are direct products of what society offline considers good and bad. Society treats suicide as worse than killing others, for various reasons, and Google Search seems to mimic that judgment in its results. In the end, the search results amount to Google telling you that thinking about suicide is bad, while murdering others is something you have to figure out on your own. It seems Google singles out the word “suicide” and phrases expressing it only because of the prominence of the National Suicide Prevention Lifeline.
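Purely to make that hypothesis concrete, here is a minimal sketch in Python of how such a keyword-triggered intervention could work. Everything in it is hypothetical (the phrase list, the `annotate_results` function, the box text); it is a toy model of the asymmetry described above, not Google’s actual implementation.

```python
# Toy sketch (hypothetical, not Google's code): attach a crisis-resource
# box to a query only when it matches a hand-picked list of self-harm
# phrases, mirroring the observed behavior.

SELF_HARM_PHRASES = {
    "kill myself",
    "commit suicide",
    "end my life",
}

# 1-800-273-8255 was the Lifeline's publicly listed number at the time.
LIFELINE_BOX = "National Suicide Prevention Lifeline: 1-800-273-8255"

def annotate_results(query: str, results: list[str]) -> list[str]:
    """Prepend a crisis-resource box when the query contains a
    self-harm phrase; queries about harming others pass through
    untouched, which is exactly the asymmetry observed above."""
    normalized = query.lower()
    if any(phrase in normalized for phrase in SELF_HARM_PHRASES):
        return [LIFELINE_BOX] + results
    return results  # e.g. "I want to kill others" gets plain results

if __name__ == "__main__":
    print(annotate_results("I want to kill myself", ["result 1"]))
    print(annotate_results("I want to kill others", ["result 1"]))
```

Under this kind of keyword matching, the “bias” is simply whatever phrases someone chose to put in the trigger list, which is the point of the post: no entry for killing others, no intervention for killing others.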

In my view, it should be the opposite: if Google has to intervene, it should be when you search for terms related to killing others, not yourself. It would be best if Google did not intervene at all, but intervening is apparently part of Google’s effort to educate users and embed itself in society and daily life. Interestingly, that educational step embodies a biased view, intentional or not, of what Google thinks you should care about more. It is a good step, I suppose, but a very biased, unfinished one. Imagine how many lives Google could save if it provided similar “meaningful” data to people searching for reasons and ways to kill others.

What is your view on this?
==========================

Do you think searches about killing others also deserve help and intervention from Google? Or do you think Google is doing all it should by focusing only on suicidal queries? Do you think the openness of the web means that search engines should give direct answers without any bias, or that they should treat all facets of the same concept [in this example, ending a life prematurely] equally instead of appealing to popular culture?
