
Is Google's eagerness to answer questions promoting more falsehood online?

Human programmers need to be aware that there can be actual social consequences when they write what can seem like dry, straightforward code

Thomas Maher | The Conversation
Last Updated: Jan 06 2017 | 12:03 PM IST

When people have questions, they often ask Google. They expect high-quality, accurate answers. Late last year, it emerged that the top answer Google gave to “Did the Holocaust happen?” linked to a neo-Nazi, white supremacist, Holocaust-denying website.

The ensuing outcry included people buying Google ads for the US Holocaust Memorial Museum so that it would appear near the top of the results as well. After initial resistance, Google tweaked its algorithm – but only enough to push the false, prejudiced information somewhat farther down in the results.

These responses, however, miss a crucial element of the interplay between the tactics of Holocaust deniers (and conspiracy theorists more broadly) and Google’s search algorithm. Google wants to answer questions and is often very good at it. But when the question itself has a hidden or implicit agenda, like expressing doubt about historical facts, the urge to answer that question shifts from a strength to a weakness.

Sowing doubts about the historical record is the bread and butter of Holocaust denial, and conspiracy theories more broadly. These illegitimate sites claim to be innocently curious, “just asking questions” about historical events and widely held beliefs. They are, of course, much more nefarious, seeking to spread anti-Semitism and right-wing hate.

As a scholar of political sociology and the Holocaust, I can see clearly that sites intentionally presenting misinformation and propaganda are preying upon Google’s eagerness to answer questions. These sites, peddling what is sometimes called “fake news,” capitalise on people’s tendency to ask those questions directly on Google. This is one important example of the real-world effects of how algorithms are written. Human programmers need to be aware that there can be actual social consequences when they write what can seem like dry, straightforward code.

Many sites don’t answer the question

First, and very importantly: Of course, the Holocaust happened. There are mountains of evidence proving it happened. The perpetrators admitted to it. There are documents outlining the transport and extermination process. There is forensic evidence from the extermination sites. And there is abundant corroborating eyewitness testimony.

But code matters: Google’s search algorithm uses more than 200 factors to figure out how to prioritise results so as to give users the information they’re looking for. One of the first things it looks at is how well the site’s content responds to the specific inquiry.

For example, if a person searches for “running shoes,” Google doesn’t know, from the query, exactly what about running shoes the person is hoping to learn. So it will offer results ranging from reviews of running shoes to places selling running shoes.
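To see why this matters, consider a deliberately simplified sketch written in Python. Google’s real ranking code is proprietary and vastly more sophisticated, so the scoring function and example pages below are invented purely for illustration: a toy scorer that simply counts how many of the query’s words appear on a page.

```python
# Toy illustration only: a bag-of-words overlap score standing in for the
# "how well does this page answer the query?" signal. Real search engines
# combine hundreds of signals, synonym handling and machine-learned rankers.

def relevance(query: str, page_text: str) -> float:
    """Return the fraction of query terms that appear in the page text."""
    query_terms = set(query.lower().split())
    page_terms = set(page_text.lower().split())
    if not query_terms:
        return 0.0
    return len(query_terms & page_terms) / len(query_terms)

pages = {
    "review site": "in-depth reviews of this year's best running shoes",
    "online store": "buy running shoes online with free shipping",
    "training blog": "a detailed guide to marathon preparation and pacing",
}

for name, text in pages.items():
    print(name, relevance("running shoes", text))
# The review site and the store both score 1.0 because they use the
# query's own words; the training blog scores 0.0 however useful it is.
```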

But asking whether the Holocaust happened is the equivalent of asking “Did World War II happen?” Understandably, legitimate sites don’t typically engage with the idea that it might not have. Despite being filled with detailed discussions of what, when, where, why, how and to whom the Holocaust happened, the most authoritative sites on the history of the Holocaust don’t address this one direct underlying question: Did it happen? They know it did, and elaborate from there.

However, this appears to suggest to Google’s algorithm that those sites don’t have the most relevant information to answer the specific question a searcher is asking.

This problem is amplified because Google’s algorithm attempts to evaluate sites’ credibility when determining where to include them in search results. When reputable sites don’t seem to provide the answer, less trusted sources that offer direct – though false – responses are able to rise to the top of the search results.
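In the same toy spirit, imagine that a page’s final rank comes from multiplying its relevance by a credibility weight. Both the formula and every number below are assumptions made up for this illustration; the point is the interaction, not the values. A very credible page that never echoes the question’s wording can still be outscored by a far less credible page that answers it head-on.

```python
# Hypothetical combined ranking score: relevance weighted by credibility.
# The formula and all numbers here are invented purely for illustration.

def combined_score(relevance: float, credibility: float) -> float:
    return relevance * credibility

# A reputable history site: highly credible, but it never engages with
# the question "did it happen?" as phrased, so its relevance is low.
reputable = combined_score(relevance=0.1, credibility=0.9)

# A fringe site that restates the question verbatim in its headline.
fringe = combined_score(relevance=0.9, credibility=0.3)

print(round(reputable, 2), round(fringe, 2))
# 0.09 vs 0.27 -- on this toy score, the fringe page ranks higher
```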

Making matters worse, the algorithm uses machine learning to offer related suggestions about what the searcher might be looking for, even if they don’t use the exact search terms. An initial query premised on Holocaust denial will trigger the system to provide more options like it.
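The suggestion mechanism can be sketched the same way. The toy function below, with a made-up list of past queries, simply proposes earlier searches that share enough words with the current one. Real systems learn from enormous logs of queries and clicks, but the feedback loop is similar: a query framed around denial pulls up other queries framed the same way.

```python
# Toy "related searches" sketch with invented data; illustration only.
# Real suggestion systems are trained on huge logs of queries and clicks.

PAST_QUERIES = [
    "did the moon landing happen",
    "is the earth flat",
    "holocaust memorial museum hours",
    "was the holocaust exaggerated",
    "running shoe reviews",
]

def suggest(query: str, past=PAST_QUERIES, min_shared_words=2):
    """Return past queries sharing at least min_shared_words with query."""
    words = set(query.lower().split())
    return [p for p in past if len(words & set(p.split())) >= min_shared_words]

print(suggest("did the holocaust happen"))
# ['did the moon landing happen', 'was the holocaust exaggerated']
```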

A way for experts to respond

Taken together, the way that deniers frame questions and the Google algorithm’s desire to answer specific questions combine into a recipe for spreading conspiracy theories across the internet. However, Google’s emphasis on credibility means that experts have avenues for addressing these issues: public writing, blogging and linking to factually accurate work.

If the Holocaust Museum were to write an article titled “Did the Holocaust happen?” and provide some basic facts, the content and the site’s credibility would move it to the top of the search results. Its current page confronting Holocaust denial could even be quickly modified to add a line saying “Often this denial comes in the form of a question: ‘Did the Holocaust happen?’” That would introduce the keywords that could boost the existing page’s relevance to Google’s algorithm. (Whether due to additional tweaking on Google’s part, or its algorithm’s response to news coverage, Holocaust Museum content is, as of this writing, much more prominently displayed in Google’s results.)
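Returning to the toy relevance scorer from earlier, the effect of that one added sentence is easy to see: the page suddenly contains the query’s own words, and its score jumps. (As before, this is a simplified assumption about how keyword matching works, not Google’s actual computation.)

```python
# Re-using the toy term-overlap scorer from the earlier sketch.
def relevance(query: str, page_text: str) -> float:
    query_terms = set(query.lower().split())
    page_terms = set(page_text.lower().split())
    return len(query_terms & page_terms) / len(query_terms)

query = "did the holocaust happen"

before = ("extensive documentation, survivor testimony and forensic "
          "evidence from the extermination sites")
after = before + (" often this denial comes in the form of a question: "
                  "did the holocaust happen")

print(relevance(query, before))  # 0.25 -- only "the" matches
print(relevance(query, after))   # 1.0  -- the page now echoes the query
```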

To be sure, this will not eliminate Holocaust denier websites from Google’s search results entirely, and perhaps not even from the first page of them. Nor will it deter dedicated deniers from finding information that supports their preconceived notions about history. Holocaust denial is based on a selective interpretation of the historical record and deep-seated anti-Semitic beliefs. No website will correct or uproot these beliefs in one fell swoop.

However, offering accurate information alongside false information may give individuals who have yet to internalise these beliefs pause. And it suggests a useful path to those who seek to disseminate truth and fact in the face of denial and conspiracy theories.

Thomas Maher, Postdoctoral Researcher in Sociology, University of Arizona

This article was originally published on The Conversation. Read the original article.