Google autocomplete helps legitimize conspiracy theorists, study says

BURNABY, British Columbia — Google’s autocomplete feature is meant to make browsing the internet and retrieving accurate information easier, but is this supposedly convenient feature helping fuel conspiracy theories and mislead the public? Unfortunately, the answer is yes, according to researchers from Simon Fraser University.

Study authors report that Google’s autocomplete algorithms often attach oversimplified, innocuous subtitles to prominent conspiracy theorists. For example, classifying Alex Jones as an “American radio host” may be technically true, but it leaves out a major part of the story. Researchers argue this trend may seem minor at first, but it could be misleading countless internet users and even helping amplify extremist views.

Another example is Gavin McInnes, the creator of the neo-fascist Proud Boys organization. Officials consider the group a terrorist entity in Canada and a hate group in the United States, yet Google’s algorithm displays a subtitle for Mr. McInnes that reads “Canadian writer.”

Jerad Miller killed multiple people in a 2014 Las Vegas shooting. Google’s algorithm says he was an “American performer.”

In collaboration with The Disinformation Project at the School of Communication at SFU, the research team analyzed the automatic subtitles displayed by Google for 37 alleged conspiracy theorists. They found “in all cases, Google’s subtitle was never consistent with the actor’s conspiratorial behavior.”

No way to change Google’s algorithms?

We’re not just talking about one website or even a single social media platform here. Google is practically synonymous with the internet itself at this point. Considering the sheer volume of daily traffic on Google’s servers, study authors worry the subtitles “can pose a threat by normalizing individuals who spread conspiracy theories, sow dissension and distrust in institutions and cause harm to minority groups and vulnerable individuals,” according to Nicole Stewart, an instructor of communication and PhD student on The Disinformation Project.

For what it’s worth, Google says those subtitles are generated automatically by a series of complex algorithms. In other words, the company cannot accept or create custom subtitles.

Researchers explain that these subtitles are universally either neutral or positive – but never negative, even when a negative label would be appropriate.

“Users’ preferences and understanding of information can be manipulated upon their trust in Google search results, thus allowing these labels to be widely accepted instead of providing a full picture of the harm their ideologies and belief cause,” says Nathan Worku, a Master’s student on The Disinformation Project, in a university release.

This study focused specifically on conspiracy theorists, but study authors say similar results appear when searching for widely known terrorists or mass murderers.

“This study highlights the urgent need for Google to review the subtitles attributed to conspiracy theorists, terrorists, and mass murderers, to better inform the public about the negative nature of these actors, rather than always labelling them in neutral or positive ways,” researchers conclude.

The study is published in M/C Journal.


About the Author

John Anderer

Born blue in the face, John has been writing professionally for over a decade and covering the latest scientific research for StudyFinds since 2019. His work has been featured by Business Insider, Eat This Not That!, MSN, Ladders, and Yahoo!

Studies and abstracts can be confusing and awkwardly worded. He prides himself on making such content easy to read, understand, and apply to one’s everyday life.
