In 2011, acclaimed tech writer Steven Levy published In The Plex: How Google Thinks, Works, and Shapes Our Lives. The book not only chronicled Google’s rise to tech dominance, but also gave readers an unprecedented look at how the higher-ups at Google think and operate. One of the book’s broader and more interesting themes is that Google is beholden to cold, hard data and finely tuned algorithms.
As applied to its ubiquitous search engine, Google remains steadfast in its belief that search results should be prioritized by its algorithm and its algorithm alone. This is all well and good, but it’s recently been brought to light that a number of hate-oriented websites have managed to game Google’s algorithm such that the top search results for certain questions yield patently false web listings.
As a prominent example, if a user types in “Did the Holocaust happen?”, the top non-paid listing is from a white supremacist, neo-Nazi website, as evidenced in the photo below.
While Google understandably claims that “search is a reflection of the content that exists on the web”, the issue above is particularly problematic: if Google’s stated mission is to “organize the world’s information”, it’s failing miserably when its algorithm is structured such that it directs users to patently false information.
Another problem recently brought to the surface is that some Google queries autocomplete with hateful phrases. For example, the phrases “Are Jews” and “Are women” were, until recently, autocompleted with the word “evil.”
As the issue has picked up more traction in the press, Google now appears intent on figuring out how to address it.
The BBC reports that search engine expert Danny Sullivan recently met with Google to discuss the issue.
“I’m as horrified and disappointed by the results as many people are,” he told the BBC.
However, he said Google – which processes five billion searches a day – was keen to come up with a solution that was broadly applicable across all searches, rather than just those that have been noticed by users.
“It’s very easy to take a search here and there and demand Google change something,” explained Mr Sullivan, “and then the next day you find a different search and say, ‘why didn’t you fix that?’”
Writing on his own site, Sullivan added that Google is aiming to “find solutions that are generally defensible, are rooted in policy and can be implemented through algorithms as much as possible.”
For as much flak as Facebook seems to be getting for disseminating fake news, Google has somehow escaped criticism for directing users to hateful opinion sites masquerading as news sites.