Father playing with baby (© Syda Productions - stock.adobe.com)

In A Nutshell

  • Category recognition starts early: Two-month-old infants’ brains already organize objects into meaningful categories (alive vs. not alive, big vs. small), despite blurry vision and minimal life experience.
  • Development doesn’t follow expected order: High-level brain regions responsible for complex categorization mature before simpler visual processing areas, overturning assumptions about hierarchical brain development.
  • Babies mirror trained AI networks: Infant brain patterns resemble artificial intelligence systems trained on millions of images, suggesting babies either learn extraordinarily fast or come pre-wired with organizational templates.
  • Early framework adapts to culture: While babies appear to have built-in visual organization systems, these templates remain flexible enough to adapt to whatever cultural environment they’re born into by their first birthday.

A 2-month-old can barely make out her mother’s face from across the room. Her vision is blurry, colors are muted, and she has been looking at the world for approximately eight weeks. Yet inside her brain, a sophisticated filing system is already up and running: organizing cats separately from trees, toys differently from food, living things apart from objects.

This doesn’t mean babies consciously understand what these things are yet. It means their visual brains already organize what they see in meaningful ways.

Scientists at Trinity College Dublin just pulled off something remarkable: they got more than 100 babies to lie still enough in an MRI machine to record what their brains were doing. What they found upends much of what we thought about how infant vision develops.

“We found categorical structure present in high-level visual cortex from 2 months of age,” the researchers wrote in their paper, published in Nature Neuroscience. Translation: babies are organizing their visual world into meaningful groups right from the start, not gradually building up to it over many months like we assumed.

The Backwards Brain

The back of your brain handles basic vision (edges, colors, shapes). The bottom and sides handle the complex stuff: recognizing your grandmother’s face, knowing that all dogs are dogs even though they look different, understanding that a chair is for sitting.

Scientists assumed babies would develop in that order. Simple first, complex later. This assumption appears to be wrong.

The brain region responsible for high-level category recognition (the part that lets you instantly know “that’s an animal” or “that’s furniture”) is already functioning in 2-month-olds. Meanwhile, a region that sits earlier in the processing chain doesn’t mature until much later.

The research team showed babies pictures of things they might see in daily life: cats, birds, rubber ducks, shopping carts, trees, food. Each image appeared for three seconds, growing larger on screen to hold the babies’ wandering attention. Nursery rhymes played softly in the background. Getting usable data from 78% of the 2-month-olds represents a nearly unheard-of success rate in infant brain research, where head movement and fussiness typically derail many scans.

When researchers analyzed the brain scans across many infants, they could tell which category of object babies were viewing just by looking at the group patterns of brain activity. More impressively, babies’ brains responded similarly to different examples of the same thing. For example, a cat from the side produced similar brain activity to a cat from the front. They weren’t just memorizing individual pictures. They were recognizing categories.
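The technique behind that claim, decoding a stimulus category from multi-voxel activity patterns, can be sketched with simulated data. This is an illustrative toy, not the authors' actual pipeline: the sizes, the noise levels, and the nearest-centroid classifier are all assumptions chosen for demonstration.

```python
import numpy as np

# Toy sketch of category decoding from simulated "voxel" patterns.
# All data are synthetic; nothing here comes from the study itself.
rng = np.random.default_rng(0)

n_voxels, n_trials = 50, 40
categories = np.array([0, 1] * (n_trials // 2))  # e.g. 0 = animal, 1 = object

# Each category has an underlying pattern; per-trial noise mimics
# different exemplars (a cat from the side vs. a cat from the front).
templates = rng.normal(size=(2, n_voxels))
patterns = templates[categories] + rng.normal(scale=1.0, size=(n_trials, n_voxels))

# Leave-one-out nearest-centroid decoding: hold out one trial, build
# category centroids from the rest, and classify the held-out pattern.
correct = 0
for i in range(n_trials):
    train = np.delete(np.arange(n_trials), i)
    centroids = [patterns[train][categories[train] == c].mean(axis=0) for c in (0, 1)]
    pred = int(np.argmin([np.linalg.norm(patterns[i] - c) for c in centroids]))
    correct += pred == categories[i]

accuracy = correct / n_trials
print(f"decoding accuracy: {accuracy:.2f}")  # well above the 0.5 chance level
```

If accuracy beats chance, the patterns carry category information, which is the same inferential logic as reading out "which category the baby was viewing" from group brain activity.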

Your baby knows you’re not a cat. (Credit: leungchopan on Shutterstock)

Smarter Than Their Eyes

At 2 months old, babies can only focus clearly about a foot in front of them. They’re still figuring out colors. They have maybe 5% of adult visual acuity. And yet their brains are already making the same fundamental distinctions adults do: alive versus not alive, big versus small.

The researchers tested whether babies were just responding to simple visual features. Maybe all the cats happened to be the same size or color. They measured every image’s size, shape, color, and compactness. Even after accounting for all these surface-level similarities, the categorical organization remained. The babies’ brains were responding to something deeper than appearance.
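The logic of that control can be sketched in a few lines: simulate a response driven by both category and a low-level image property, regress the low-level property out, and check whether the category signal survives in the residuals. Everything below is simulated and simplified to a single response value per image; the study's actual feature set (size, shape, color, compactness) and analysis are more involved.

```python
import numpy as np

# Toy version of the low-level-feature control analysis.
# Synthetic data only; "size_feat" stands in for a measured image property.
rng = np.random.default_rng(2)

n = 30
category = rng.integers(0, 2, size=n)      # e.g. 0 = object, 1 = animal
size_feat = rng.normal(size=n)             # a low-level property of each image
# Response driven by both category and image size, plus noise.
response = 2.0 * category + 0.5 * size_feat + rng.normal(scale=0.5, size=n)

# Regress the low-level feature (plus an intercept) out of the response...
X = np.column_stack([np.ones(n), size_feat])
beta, *_ = np.linalg.lstsq(X, response, rcond=None)
residual = response - X @ beta

# ...then check that the category signal remains in what's left over.
r = np.corrcoef(residual, category)[0, 1]
print(f"category correlation after removing the low-level feature: {r:.2f}")
```

A correlation that stays high after the regression mirrors the paper's finding: the categorical organization was not explained away by surface-level similarities.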

Then came the AI comparison. When the team fed the same images into an artificial neural network with random, untrained connections, it barely resembled what baby brains were doing. But a network that had been trained to recognize and classify objects? It matched the infant brain patterns remarkably well, especially in those high-level processing regions.
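The standard tool for this kind of brain-to-network comparison is representational similarity analysis (RSA): build a matrix of how dissimilarly each pair of images is represented, then correlate the brain's matrix with the network's. The sketch below uses simulated data; names like "brain", "trained", and "untrained" are hypothetical stand-ins, not the study's measurements.

```python
import numpy as np

# Toy representational similarity analysis (RSA) on synthetic data.
rng = np.random.default_rng(1)
n_images = 16

def rdm(features):
    """Representational dissimilarity matrix: 1 - correlation between
    the response patterns evoked by each pair of images."""
    return 1 - np.corrcoef(features)

def rsa_score(a, b):
    """Correlate the upper triangles of two RDMs (diagonal excluded)."""
    iu = np.triu_indices(n_images, k=1)
    return np.corrcoef(rdm(a)[iu], rdm(b)[iu])[0, 1]

# Simulated "infant brain" responses with category structure: the first
# half of the images share one pattern family, the second half another.
category = np.repeat([0, 1], n_images // 2)
base = rng.normal(size=(2, 100))
brain = base[category] + rng.normal(scale=0.5, size=(n_images, 100))

# A "trained network" shares the category structure; an "untrained"
# network responds randomly to every image.
trained = base[category] + rng.normal(scale=0.5, size=(n_images, 100))
untrained = rng.normal(size=(n_images, 100))

print("trained:  ", rsa_score(brain, trained))    # high similarity
print("untrained:", rsa_score(brain, untrained))  # much lower
```

The trained network's representational geometry matches the simulated brain far better than the untrained one's, echoing the paper's result that only a trained network resembled infant brain patterns.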

These babies had been looking at the world for eight weeks. The AI had been “looking” at millions of images. Yet they extracted similar kinds of information: the features that make cats different from trees, animals different from objects. Either babies are extraordinarily fast learners, or they come pre-wired with some kind of organizational template.

Born Ready, or Learning Fast?

That’s the question researchers can’t yet answer. Is this sophisticated categorization ability present at birth, or does it emerge lightning-fast during those first two months? The study caught babies at 2 months and again at 9 months, showing that the system refines and strengthens over time. But what’s happening in between birth and that first scan remains a mystery.

“We propose that neural adaptation to the statistics of the visual world may have shaped the brain’s architecture over evolutionary time,” the researchers write. In other words, maybe human brains evolved to expect certain patterns in visual input: faces will be important, animals will move, large objects won’t fit in your mouth. Having this framework pre-installed would give babies a massive head start.

If this early template is shared across humans, it could help explain why people across wildly different cultures develop similar ways of categorizing the world. A baby in Dublin and a baby in rural Indonesia have completely different visual experiences, yet their brains seem to organize objects along the same fundamental lines.

For parents, this research offers a fascinating glimpse into what’s happening behind those unfocused baby eyes. Your infant might not be able to see the family cat clearly, but her brain is already filing it under “living thing,” separate from her toys and blankets. She’s building a mental model of the world faster and earlier than anyone realized.

The question now is what to do with this information. If babies’ brains are this sophisticated this early, what kind of visual environment best supports that development? The researchers note that by 12 months, babies start showing preferences for culturally relevant objects and backgrounds, meaning their pre-installed template is flexible enough to adapt to whatever visual world they’re born into.

The study doesn’t test parenting choices directly, but it suggests babies come equipped to learn from whatever they see, whether it’s board books or real books, toys or household objects, screens or the actual world. The sophisticated categorization system is already there, ready to organize whatever information comes its way.

Paper Notes

Limitations

The study focused on common objects and categories, deliberately avoiding specialized categories like faces that have been studied extensively in previous infant research. While researchers could measure consistent patterns across groups of babies, individual infants showed variable responses, particularly in lateral visual regions. The study included both full-term infants and some born prematurely who spent time in neonatal intensive care, though results were validated in the full-term group alone. Motion during scanning remains a challenge with infant populations, requiring strict quality control procedures that excluded some collected data. The measurements capture brain activity at specific timepoints but cannot determine exactly when during the first two months these representations emerge. The sample size decreased at the 9-month follow-up as some families chose not to continue participation.

Funding and Disclosures

This work was funded by European Research Council Advanced Grant ERC-2017-ADG, FOUNDCOG, 787981, Irish Research Council grants GOIPG/2021/223 and GOIPG/2023/2479, MSCA-IF ‘InterPlay’ 891535, Science Foundation Ireland SFI RC-17/RCPhD/3482-RCPhD and Research Ireland 22/FFP-A/11050. The authors declare no competing interests.

Publication Details

The study “Infants have rich visual categories in ventrotemporal cortex at 2 months of age” was authored by Cliona O’Doherty, Áine T. Dineen, Anna Truzzi, Graham King, Lorijn Zaadnoordijk, Keelin Harrison, Enna-Louise D’Arcy, Jessica White, Chiara Caldinelli, Tamrin Holloway, Anna Kravchenko, Jörn Diedrichsen, Ailbhe Tarrant, Angela T. Byrne, Adrienne Foran, Eleanor J. Molloy, and Rhodri Cusack from Trinity College Institute of Neuroscience and collaborating institutions. The research was published in Nature Neuroscience on February 2, 2026. DOI: 10.1038/s41593-025-02187-8. The study received ethical approval from Trinity College Dublin School of Psychology Research Ethics Committee, the Rotunda Ethics Committee, and the Coombe Ethics Committee. Pseudo-anonymized imaging data from consenting participants are available at OpenNeuro (ds006883).

