(Credit: MART PRODUCTION from Pexels)

Memory and reasoning training showed no protective effect; only speed training with follow-up sessions did

In A Nutshell

  • Older adults who did computerized speed-training exercises plus booster sessions had 25% lower risk of dementia diagnosis over 20 years (49% vs 40% diagnosis rates)
  • Memory and reasoning training showed no protective effect: only speed training worked, and only with follow-up sessions
  • The effective training required 10 initial sessions over 6 weeks, plus booster sessions at 1 year and 3 years
  • Skipping the booster sessions erased all benefits, even for those who completed initial training

Can a handful of training sessions in your 70s reduce your dementia risk over the next two decades? A study tracking more than 2,000 older adults suggests the answer is yes, but there’s a caveat. Only one particular type of cognitive training delivered this lasting benefit, and only when people stuck with it.

Scientists followed older adults for two decades after they completed brain exercises. Those who had completed a computerized speed-training program and attended booster sessions had about 25% lower risk of being diagnosed with Alzheimer’s disease and related dementias over the next 20 years compared to those who received no training. In real numbers: nearly half the control group (about 49%) were eventually diagnosed with dementia, versus roughly 40% of those who did speed training with boosters.

But here’s what makes the finding both exciting and frustrating: researchers tested three different types of brain training, and only one worked. Memory exercises didn’t lower dementia diagnoses. Reasoning puzzles didn’t either. And even the people who did the effective speed training only benefited if they came back for follow-up sessions. Skip those, and the benefit vanished.

The study offers a reality check for anyone who thinks downloading a brain-training app will ward off dementia. Not all cognitive training is created equal, and doing it right matters more than just doing it.

The Training That Worked Was Different from the Start

The study, known as ACTIVE, enrolled adults 65 and older from six U.S. cities starting in 1998. After baseline testing, participants were randomly assigned to one of four groups: memory training, reasoning training, speed training, or no training at all. Each program involved ten hour-long group sessions over about six weeks.

Memory training taught techniques for remembering shopping lists and medication schedules: the kind of strategies you might learn in a memory improvement class. Reasoning training focused on solving pattern-based problems like figuring out bus schedules. Both approaches taught conscious strategies people could deliberately apply.

Speed training took a completely different path. Participants sat at computers doing exercises that flashed objects on screens for increasingly brief moments. The task required identifying what they saw while simultaneously tracking other information in their peripheral vision. No strategies, no mnemonics, just practice at processing information faster while juggling multiple things at once.

The computer automatically ramped up difficulty as people improved, never letting the exercises become easy. That adaptive challenge may explain why this training worked when others didn’t. People weren’t learning tricks; they were rebuilding fundamental processing capacity.

The speed training alone wasn’t enough: returning for tune-up sessions proved key. (Photo by Miljan Zivkovic on Shutterstock)

The Follow-Up Sessions Made or Broke the Effect

About a year after the initial training, and again nearly three years later, participants who had completed at least eight of the ten original sessions were randomly selected to come back for booster sessions: up to four more hours of training.

Among speed-training participants who returned for boosters, dementia diagnoses were about 25% lower compared to the control group over the next two decades. Among those who completed the initial speed training but didn’t receive booster sessions, dementia rates were no different from people who got no training at all.

That’s an important message: one-time training isn’t enough. Like going to the gym once and expecting to stay fit for 20 years, the brain apparently needs reinforcement. The booster sessions, spaced over nearly three years, seem to have locked in benefits that a single round of training couldn’t sustain.

Memory and reasoning training showed no benefits regardless of boosters, suggesting the effect wasn’t just about extra attention or staying engaged with the study. Something specific about speed training combined with follow-up was linked to long-term differences in dementia risk.

Twenty Years of Medicare Records Told the Story

Researchers tracked participants through 2019 by linking them to Medicare claims data, an unusually long follow-up that revealed real-world health outcomes rather than just test scores. In the control group, nearly half the people (48.7%) were eventually diagnosed with dementia. In the speed-training-plus-boosters group, that rate was about 40%: a difference of roughly 9 percentage points in absolute terms, while the study’s adjusted analysis put the relative risk reduction at about 25%.

The approach had limitations. It only captured people who got diagnosed, meaning those who didn’t access healthcare or whose families didn’t notice changes could be missed. The study also excluded people in Medicare Advantage plans because those records aren’t complete. Still, following this many people for this long using objective medical records represents a major achievement in dementia research.

Why Speed Training Succeeded Where Others Failed

Scientists don’t fully understand why speed training was linked to lower dementia diagnoses while memory and reasoning training weren’t. But there are clues and plausible hypotheses.

Researchers suspect that speed training targeted divided attention and rapid processing: fundamental brain capabilities that underlie everything from driving to having conversations. Strengthening these basic systems may protect the brain more broadly than learning specific memory tricks. The training also worked on automatic, unconscious processing rather than conscious strategies that people have to remember to use.

As we age, the controlled processes that require deliberate effort decline faster than automatic ones. One possibility is that training which builds up automatic processing capacity may be hitting the brain’s vulnerabilities at exactly the right level.

There’s also the adaptive difficulty factor. The exercises never became routine because the computer kept pushing participants to their limits. That constant challenge may have driven neuroplasticity (the brain’s ability to form new connections) in ways that static training doesn’t.

What This Means If You’re Worried About Your Brain

The findings are encouraging but come with caveats. The training that worked was specific, structured, and required commitment over several years. Many commercial brain-training apps don’t share these features, and there’s no way to know if they’d produce similar benefits.

The study, published in Alzheimer’s & Dementia, can’t tell us whether the training prevented dementia entirely or delayed diagnosis. Still, even delaying diagnosis by a few years would be valuable.

Questions remain about who benefits most. Would starting earlier or later make a difference? Would more booster sessions at five or ten years extend protection even further? Could similar training help people who already have mild cognitive impairment, or does it only work for prevention?

What we know now is that a relatively brief intervention (weeks of initial training plus periodic tune-ups) was associated with lower dementia diagnoses over two decades.


Paper Notes

Study Limitations

The study excluded participants enrolled in Medicare Advantage at baseline (26% of the matched sample) because these plans don’t provide complete administrative claims data. Individuals in Medicare Advantage tend to be healthier, potentially biasing results toward the null hypothesis. The outcome relied on diagnosed dementia captured in Medicare claims rather than comprehensive cognitive assessments, meaning some cases of dementia may have been missed, especially among individuals with limited healthcare access. Booster training was offered only to participants who completed at least eight of the initial ten training sessions, introducing potential selection bias. The study sample represented approximately 73% of the original ACTIVE participants. Receiving a dementia diagnosis depends on factors including healthcare access, family awareness of cognitive changes, and education level, which could introduce systematic differences in who gets diagnosed.

Funding and Disclosures

The work was supported by grant R01AG056486 from the National Institute on Aging. The ACTIVE Cognitive Training Trial received support from the National Institutes of Health to six field sites and the coordinating center, including Hebrew Senior-Life Boston (NR04507), Indiana University School of Medicine (NR04508), Johns Hopkins University (AG014260), New England Research Institutes (AG014282), Pennsylvania State University (AG14263), University of Alabama at Birmingham (AG14289), and Wayne State University/University of Florida (AG014276). Karlene Ball serves as a consultant and owns stock in Posit Science Inc., which acquired and markets the Useful Field of View Test and speed of processing training software initially developed by the Visual Awareness Research Group, Inc. and used in the ACTIVE clinical trial. Ball continues collaborating on design and testing of these assessment and training programs as a member of the Posit Science Scientific Advisory Board. All other authors reported no disclosures.

Publication Details

Title: Impact of cognitive training on claims-based diagnosed dementia over 20 years: evidence from the ACTIVE study | Authors: Norma B. Coe, Katherine E. M. Miller, Chuxuan Sun, Elizabeth Taggert, Alden L. Gross, Richard N. Jones, Cynthia Felix, Marilyn S. Albert, George W. Rebok, Michael Marsiske, Karlene K. Ball, Sherry L. Willis | Journal: Alzheimer’s & Dementia: Translational Research & Clinical Interventions | Publication Year: 2026 | Volume/Issue: 12:e70197 | DOI: 10.1002/trc2.70197 | Received: October 13, 2025; Revised: December 1, 2025; Accepted: December 3, 2025 | The study represents an open access article under Creative Commons Attribution-NonCommercial-NoDerivs License.

