Fake News

(© georgejmclittle - stock.adobe.com)

COLUMBUS, Ohio — It’s a phenomenon that doesn’t seem to be going away anytime soon. “Fake news” and misinformation spread online have quickly become a major problem, not just in the United States but globally. When examining this issue, most people are quick to point fingers at partisan news outlets, lax social media regulations, and automated bots that spread misinformation. However, an eye-opening new study finds another big source of false information at play here: you!

Researchers from Ohio State University say it is very common for people, when given credible, legitimate statistics about a controversial topic, to misremember that information in a way that supports their preconceived beliefs and biases.

A group of participants was presented with the true fact that the number of Mexican immigrants living in the United States has declined in recent years. However, because of many participants’ political allegiances, they tended to remember and recall the exact opposite: that there are more Mexican immigrants in the U.S. today.

Moreover, the study noted that when people then spread the misinformation they’ve created, the story and numbers tend to drift further and further from reality.

“People can self-generate their own misinformation. It doesn’t all come from external sources,” says lead author and assistant professor of communication at OSU Jason Coronel in a release. “They may not be doing it purposely, but their own biases can lead them astray. And the problem becomes larger when they share their self-generated misinformation with others.”

To reach their conclusions, the research team conducted two experiments. In the first, 110 participants were given short written descriptions of four societal issues involving numbers or statistics. For two of those issues, the truthful, accurate information presented was consistent with most participants’ preconceived beliefs on the subject. For the other two, the data largely went against what most participants already believed to be the case.

After reading all four descriptions, participants were asked to write down, from memory, all of the relevant numbers and statistics that had been presented to them. Importantly, when they were originally shown the data, they were never told they would be expected to remember or recall any of it.

After compiling the participants’ responses, the researchers observed that people usually remembered statistics that agreed with and reinforced how they already saw the world, but were much more likely to inaccurately recall data that conflicted with their beliefs.

“We had instances where participants got the numbers exactly correct – 11.7 and 12.8 – but they would flip them around,” Coronel says. “They weren’t guessing – they got the numbers right. But their biases were leading them to misremember the direction they were going.”

Furthermore, the research team used eye-tracking technology to observe participants as they read the four descriptions. This confirmed that participants were genuinely attending to the statistics — and, notably, that they lingered even longer on numbers that contradicted their expectations.

“We could tell when participants got to numbers that didn’t fit their expectations. Their eyes went back and forth between the numbers, as if they were asking ‘what’s going on.’ They generally didn’t do that when the numbers confirmed their expectations,” Coronel explains. “You would think that if they were paying more attention to the numbers that went against their expectations, they would have a better memory for them. But that’s not what we found.”

In the second experiment, the research team was curious to see whether these memory inaccuracies become further distorted as information spreads from person to person. So, they set up a trial similar to the classic children’s game of “telephone.”

One participant was given the true, accurate statistic on Mexican immigrants living in the United States between 2007 and 2014 (down from 12.8 million to 11.7 million) and asked to write down the numbers as best they could remember. Those recalled numbers were then handed to a second participant, who was asked to remember and write them down, and the numbers were passed to a third participant, and so on.

Alarmingly, the results revealed that, on average, the first participant usually flipped the real numbers to falsely indicate that the number of Mexican immigrants in the U.S. had increased. Even worse, by the end of the process, the average participant had written that the number of Mexican immigrants in the U.S. had increased by 4.6 million over those seven years. That’s a far cry from the true statistic: a decrease of 1.1 million.
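The size of that drift can be checked with simple arithmetic, using only the figures quoted in the article (the variable names below are illustrative, not from the study):

```python
# Figures quoted in the article, in millions of Mexican immigrants in the U.S.
start_2007 = 12.8   # true figure for 2007
end_2014 = 11.7     # true figure for 2014

# The real change: a decline of 1.1 million
true_change = round(end_2014 - start_2007, 1)

# The average end-of-chain recollection: an *increase* of 4.6 million
chain_end_change = 4.6

# Total drift between reality and the end of the "telephone" chain
total_drift = round(chain_end_change - true_change, 1)

print(true_change)   # -1.1
print(total_drift)   # 5.7
```

In other words, the final recollections were off not just in magnitude but in direction, landing 5.7 million away from the truth.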

“We need to realize that internal sources of misinformation can possibly be as significant as or more significant than external sources,” comments co-author and doctoral student Shannon Poulsen. “We live with our biases all day, but we only come into contact with false information occasionally.”

The study is published in the scientific journal Human Communication Research.

About John Anderer

Born blue in the face, John has been writing professionally for over a decade and covering the latest scientific research for StudyFinds since 2019. His work has been featured by Business Insider, Eat This Not That!, MSN, Ladders, and Yahoo!

Studies and abstracts can be confusing and awkwardly worded. He prides himself on making such content easy to read, understand, and apply to one’s everyday life.
