Abstract speaker silhouette with bubbles in the head (© Petr Vaclavek - stock.adobe.com)

Study authors from Massachusetts General Hospital explain that this work provides a detailed map of how speech sounds (consonants and vowels) are formed in the mind before someone speaks, as well as how people string them together during language production. More broadly, these insights into the brain cells in charge of language production could eventually improve the understanding and treatment of speech and language disorders.

“Although speaking usually seems easy, our brains perform many complex cognitive steps in the production of natural speech—including coming up with the words we want to say, planning the articulatory movements and producing our intended vocalizations,” says senior author Ziv Williams, MD, an associate professor in Neurosurgery at MGH and Harvard Medical School, in a media release.

“Our brains perform these feats surprisingly fast—about three words per second in natural speech—with remarkably few errors. Yet how we precisely achieve this feat has remained a mystery.”

Researchers used cutting-edge technology called Neuropixels probes to record the activity of individual neurons in the prefrontal cortex, a frontal region of the human brain. This approach allowed Dr. Williams and his colleagues to identify cells involved in language production that may underlie speaking abilities. They also discovered separate groups of neurons in this region dedicated to speaking and to listening.

“The use of Neuropixels probes in humans was first pioneered at MGH. These probes are remarkable—they are smaller than the width of a human hair, yet they also have hundreds of channels that are capable of simultaneously recording the activity of dozens or even hundreds of individual neurons,” comments Dr. Williams, who worked to develop these recording techniques with Sydney Cash, MD, PhD, a professor in Neurology at MGH and Harvard Medical School and study co-leader. “Use of these probes can therefore offer unprecedented new insights into how neurons in humans collectively act and how they work together to produce complex human behaviors such as language.”

Researchers say brain cells can predict what we’re going to say before we say it! (© Prostock-studio – stock.adobe.com)

The study details how neurons in the brain represent some of the most basic elements involved in constructing spoken words, from simple speech sounds (phonemes) to their assembly into more complex strings such as syllables. For instance, the consonant sound “da,” produced by touching the tongue to the hard palate behind the teeth, is necessary to say the word “dog.”

By recording from individual neurons, the researchers discovered that certain cells become active before the speaker utters this phoneme. Other neurons, meanwhile, reflect more complex aspects of word construction, such as the specific assembly of phonemes into syllables.

Thanks to the advanced technology at their disposal, the researchers showed that it is possible to reliably determine the speech sounds an individual will say before they speak. Put another way, scientists can predict which combination of consonants and vowels will be spoken before the words leave one’s lips. This capability could help in building artificial prosthetics or brain-machine interfaces that produce synthetic speech, potentially benefiting countless patients.
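To give a rough sense of what “decoding upcoming speech sounds from neural activity” means in practice, here is a minimal, purely illustrative sketch. Everything in it is an assumption: the data is synthetic, and the nearest-centroid decoder is a stand-in teaching example, not the study’s actual analysis method.

```python
# Hypothetical sketch: predicting an upcoming phoneme from neural firing rates.
# All firing-rate data here is simulated; the study's real decoding pipeline
# is more sophisticated than this nearest-centroid illustration.
import numpy as np

rng = np.random.default_rng(0)
phonemes = ["d", "o", "g"]   # example phonemes, as in the word "dog"
n_neurons = 50               # number of recorded neurons (arbitrary)

# Assume each phoneme evokes a characteristic firing-rate pattern across neurons.
signatures = {p: rng.normal(10.0, 2.0, n_neurons) for p in phonemes}

def simulate_trial(phoneme, noise=0.5):
    """Firing rates recorded just before the phoneme is spoken (synthetic)."""
    return signatures[phoneme] + rng.normal(0.0, noise, n_neurons)

# "Training": average several trials per phoneme into a template (centroid).
centroids = {p: np.mean([simulate_trial(p) for _ in range(20)], axis=0)
             for p in phonemes}

def predict(rates):
    """Nearest-centroid decoding: choose the phoneme whose template is closest."""
    return min(centroids, key=lambda p: np.linalg.norm(rates - centroids[p]))

# Decode one fresh (held-out) trial per phoneme before "speech" occurs.
predicted = [predict(simulate_trial(p)) for p in phonemes]
```

Because the simulated patterns are well separated relative to the noise, the decoder recovers the intended phonemes from activity alone; real neural data is far noisier, which is part of what makes the study’s result notable.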

“Disruptions in the speech and language networks are observed in a wide variety of neurological disorders—including stroke, traumatic brain injury, tumors, neurodegenerative disorders, neurodevelopmental disorders, and more,” explains Arjun Khanna, who is a co-author of the study. “Our hope is that a better understanding of the basic neural circuitry that enables speech and language will pave the way for the development of treatments for these disorders.”

Moving forward, study authors hope to expand on these findings by studying more complex language processes. This will allow them to assess questions related to how people choose the words that they intend to say and how the brain assembles those words into sentences conveying internal thoughts and feelings.

The study is published in the journal Nature.

About John Anderer

Born blue in the face, John has been writing professionally for over a decade and covering the latest scientific research for StudyFinds since 2019. His work has been featured by Business Insider, Eat This Not That!, MSN, Ladders, and Yahoo!

Studies and abstracts can be confusing and awkwardly worded. He prides himself on making such content easy to read, understand, and apply to one’s everyday life.
