Mr. Know It All

(Credit: © Robert Byron | Dreamstime.com)

COLUMBUS, Ohio — The next time you find yourself in a heated argument, absolutely certain of your position, consider this: researchers have discovered that the more confident you feel about your stance, the more likely you are to be working with incomplete information. It’s a psychological quirk that might explain everything from family disagreements to international conflicts.

We’ve all been there: stuck in traffic, grumbling about the “idiot” driving too slowly in front of us or the “maniac” who just zoomed past. But what if that slow driver is carefully transporting a wedding cake, or the speeding car is rushing someone to the hospital? A fascinating new study published in PLOS ONE suggests that these snap judgments stem from what researchers call “the illusion of information adequacy” — our tendency to believe we have enough information to make sound decisions, even when we’re missing crucial details.

“We found that, in general, people don’t stop to think whether there might be more information that would help them make a more informed decision,” explains study co-author Angus Fletcher, a professor of English at The Ohio State University and member of the university’s Project Narrative, in a statement. “If you give people a few pieces of information that seems to line up, most will say ‘that sounds about right’ and go with that.”

In today’s polarized world, where debates rage over everything from vaccines to climate change, understanding why people maintain opposing viewpoints despite access to the same information has never been more critical. This research, conducted by Fletcher, Hunter Gehlbach of Johns Hopkins University, and Carly Robinson of Stanford University, reveals that we rarely pause to consider what information we might be missing before making judgments.

It’s hard to truly make a good decision when you don’t have all the facts. (© Jane – stock.adobe.com)

The researchers conducted an experiment with 1,261 American participants recruited through the online platform Prolific. The study centered on a hypothetical scenario about a school facing a critical decision: whether to merge with another school because a drying aquifer threatened its water supply.

The participants were divided into three groups. One group received complete information about the situation, including arguments both for and against the merger. The other two groups only received partial information – either pro-merger or pro-separation arguments. The remarkable finding? Those who received partial information felt just as competent to make decisions as those who had the full picture.

“Those with only half the information were actually more confident in their decision to merge or remain separate than those who had the complete story,” Fletcher notes. “They were quite sure that their decision was the right one, even though they didn’t have all the information.”

Social media users might recognize this pattern in their own behavior: confidently sharing or commenting on articles after reading only headlines or snippets, feeling fully informed despite missing crucial context. It’s a bit like trying to review a movie after watching only the first half, yet feeling qualified to give it a definitive rating.

“Most interpersonal conflicts aren’t about ideology, they are just misunderstandings in the course of daily life,” says study co-author Angus Fletcher. (© WavebreakMediaMicro – stock.adobe.com)

The study also revealed something interesting about how people respond to new information. When participants who initially received only one side of the story were later presented with the opposing arguments, about 55% maintained their original position on the merger decision — a rate comparable to that of the control group, which had received all the information from the start.

Fletcher notes that this openness to new information might not apply to deeply entrenched ideological issues, where people may either distrust new information or try to reframe it to fit their existing beliefs. “But most interpersonal conflicts aren’t about ideology,” he points out. “They are just misunderstandings in the course of daily life.”

Beyond personal relationships, this finding has profound implications for how we navigate complex social and political issues. When people engage in debates about controversial topics, each side might feel fully informed while missing critical pieces of the puzzle. It’s like two people arguing about a painting while looking at it from different angles: each sees only their perspective but assumes they’re seeing the whole picture.

Fletcher, who studies how people are influenced by the power of stories, emphasizes the importance of seeking complete information before taking a stand. “Your first move when you disagree with someone should be to think, ‘Is there something that I’m missing that would help me see their perspective and understand their position better?’ That’s the way to fight this illusion of information adequacy.”

Paper Summary

Methodology

The study employed a preregistered experimental design with 1,261 participants recruited through Prolific, an online platform. Participants were randomly assigned to different groups: a control group received complete information about a school merger scenario, while treatment groups received either pro-merger or pro-separation arguments. Some participants were then given additional information and asked to reassess their positions, while others completed survey questions about their decision-making confidence and information adequacy perceptions.

Results

Participants who received only half the information felt just as competent to make decisions as those with complete information. Surprisingly, those with partial information were also more confident in their initial decisions. When later exposed to the complete information, most participants maintained their original positions, though their confidence decreased. The study also found evidence of a “false consensus effect”: participants believed others would reach the same conclusions they did.

Limitations

The researchers acknowledge several limitations. The hypothetical nature of the school merger scenario might not have generated the same investment as real-world decisions. Additionally, the study used an online sample, and participants had no way of knowing they were receiving incomplete information. The researchers also note that their broad definition of “adequate” information – encompassing relevance, quantity, importance, trustworthiness, and credibility – might benefit from more focused examination in future studies.

Discussion and Takeaways

The study complements existing research on “naïve realism” – the tendency to believe we see objective reality – by identifying another psychological bias that affects decision-making. The findings suggest that simply making people aware of potential information gaps might improve perspective-taking and reduce conflict in various settings, from personal relationships to political discourse. The researchers propose that developing habits of questioning our information adequacy could lead to more thoughtful and nuanced decision-making.

Funding and Disclosures

The research was supported by start-up funds from Johns Hopkins University School of Education, with the funders having no role in study design, data collection and analysis, decision to publish, or manuscript preparation. The authors declared no competing interests.

