(Photo: a scientific journal paper, by PolyPloiid on Shutterstock)

In A Nutshell

  • Fraudulent science is now an industry — paper mills, predatory journals, and publication brokers are producing fake research at industrial scale, overwhelming peer review systems.
  • Generative AI is accelerating the problem, enabling mass-produced papers with fabricated data, plagiarism, and hidden manipulations that trick automated review tools.
  • Batch retractions are surging, signaling systemic corruption; solving this requires reforming incentives in academia and strengthening safeguards before fraudulent work eclipses legitimate science.

Researchers are dealing with a disturbing trend that threatens the foundation of scientific progress: scientific fraud has become an industry. And it’s growing faster than legitimate, peer-reviewed science journals can keep up.

This isn’t about individual bad actors anymore. We’re witnessing the emergence of an organized, systematic approach to scientific fraud: paper mills churning out formulaic research articles, brokerages guaranteeing publication for a fee, and predatory journals that bypass quality assurance entirely.

These organizations disguise themselves behind respectable-sounding labels such as “editing services” or “academic consultants.” In reality, their business model depends on corrupting the scientific process.

Paper mills operate like content farms, flooding journals with submissions to overwhelm peer review systems. They practice “journal targeting” (sending multiple papers to one publication) and “journal hopping” (submitting the same paper to multiple outlets simultaneously). It’s a numbers game. If even a fraction slips through, the fraudulent service profits.

Is this just a case of scientists being lazy? The answer is more complex and troubling. Today’s researchers face constraints that make these fraudulent services increasingly tempting. The pressure to continually produce new research or risk losing funding, known as the “publish or perish” culture, is a longstanding problem.

At the same time, governments around the world are under financial strain and looking to trim costs, which means less funding for research. Less funding means fiercer competition.

This creates a catch-22 for researchers, who need publications to win funding but need funding to conduct publishable research. Environmental factors compound the issue: globalization means individual researchers are lost in an ocean of competing voices, making the temptation to game the system even stronger.

In this environment, the promise of guaranteed publication can seem like a lifeline rather than a Faustian bargain.

Paper mills target peer-review systems by overwhelming journals with fraudulent submissions. (Photo by Rannev on Shutterstock)

AI: Acceleration At What Cost?

The rise of generative AI has supercharged this fraud industry. Researchers are witnessing an explosion of articles that appear to exploit AI software to produce papers at unprecedented speed, mining public data sets for surface-level evidence. These hastily generated papers bear the hallmarks of paper mill production, including evidence fabrication, data manipulation, ethics misconduct and outright plagiarism.

Where a peer reviewer might once have received ten submissions for a conference or journal in a year, they’re now drowning in 30 or 40 submissions in a shorter time frame (six months or less), with legitimate research buried in the avalanche.

Overwhelmed reviewers, in turn, are tempted to use AI tools to summarize papers, identify gaps in the evidence and even write review responses. This is creating an arms race. Some researchers have started embedding hidden text in their submissions, such as white text on white backgrounds or microscopic fonts, containing instructions that override the reviewer’s AI prompts and direct the tool to give the paper a positive review.

The peer review system, academia’s safeguard against fraud, faces its own problems. Although it’s meant to ensure quality, it is a slow process where new ideas need careful examination and testing. History reminds us that peer review is essential but imperfect. Albert Einstein hated it.

Because the process is slow, many researchers share their findings first on pre-publication platforms, where work can be posted immediately. By the time the research reaches a legitimate science conference or journal, non-peer-reviewed versions are already circulating around the world. Waiting for peer review means a researcher risks losing credit for their discovery.

The pressure to be first hasn’t changed since Isaac Newton let his calculus discovery languish unpublished while Gottfried Leibniz claimed the kudos. What has changed is the scale and systematization of shortcuts.

A rise in batch retractions (ten or more papers withdrawn simultaneously) signals that we’re not dealing with isolated incidents but with an industrial-scale problem. In the 1990s there were almost no batch retractions; in 2020 there were around 3,000, and in 2023 more than 6,000.

By comparison, there were 2,000 single-paper retractions in 2023. That means batch retractions of ten or more papers were three times as numerous as single-paper retractions.

A Path Forward

If this were simply about weeding out unethical scientists, the systems we already have might suffice. But we’re facing a challenge to the network of checks and balances that makes science work. When fraudulent publications grow faster than legitimate science and when AI-generated content overwhelms human review capacity, we need better solutions.

The scientific community must reckon with how its own structures (publication metrics, funding mechanisms and career incentives) have created vulnerabilities that fraudulent operators can exploit.

Until we address these systemic issues, the fraud industry will thrive, undermining the enterprise that has made our world safer, cleaner and more accessible. The question isn’t whether we can afford to fix this system—it’s whether we can afford not to.

Owen Brierley is a Course Leader in the Department of Creative Industries at Kingston University. He does not work for, consult for, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond his academic appointment.

This article is republished from The Conversation under a Creative Commons license. Read the original article.


