Some AI prompts cause more CO2 emissions than others. (petrmalinak/Shutterstock)
In a nutshell
- Advanced AI models produce significantly more carbon emissions, especially when they use reasoning to generate long, complex responses, sometimes emitting over 2,000 grams of CO₂ equivalent to answer just 1,000 questions.
- The more powerful the model, the higher the environmental cost, but not all big models are equal: some, like Qwen 2.5, achieved strong performance with far lower emissions than similarly sized systems.
- Despite AI’s growing energy demands, only a tiny fraction of research papers mention carbon emissions, highlighting a major blind spot in how we evaluate and design AI systems.
MUNICH — Every time you ask ChatGPT to write an email or have Claude solve a math problem, you’re contributing to a growing carbon footprint. One recent estimate suggests generative AI models now use as much electricity annually as entire countries. New research from Germany reveals the shocking environmental cost of our AI conversations. The numbers might make you think twice about your next chat with a bot.
The study, published in Frontiers in Communication, focused on how much energy large language models actually consume when we use them. According to researchers, the most advanced AI models can emit over 2,000 grams of CO2 equivalent to answer just 1,000 questions.
While we’ve all heard vague warnings about AI’s environmental impact, this study actually measured energy consumption in real time as different AI models worked through problems.
Researchers tested 14 different AI models, ranging from relatively small 7-billion-parameter models to massive 72-billion-parameter systems. The more parameters, the “smarter” the AI, but also the more energy-hungry it is.

Each model tackled 1,000 questions total: 500 multiple-choice questions where they just had to pick A, B, C, or D, and 500 free-response questions where they could write lengthy answers. The questions came from diverse subjects including philosophy, world history, international law, abstract algebra, and high school mathematics.
How AI “Thinks” Matters
Using an NVIDIA A100, the kind of powerful data-center GPU that runs many AI services, the team measured exactly how much electricity each model consumed and converted that into CO2 emissions using a global energy grid average.
Larger models consistently performed better on the tests, but they also consumed dramatically more energy. The top performer, a 70-billion parameter reasoning model called Cogito, achieved 84.9% accuracy but emitted 1,341 grams of CO2 equivalent, nearly 50 times more than the smallest model tested.
There was also a difference between regular AI responses and “reasoning” responses. When AI models were allowed to “think out loud,” showing their work like a student solving a math problem, their energy consumption skyrocketed.
The study found that reasoning-enabled systems generated substantially more emissions than their standard counterparts. In some cases, reasoning modes consumed 4 to 6 times more energy than standard text generation.
When AI Gets Chatty, the Planet Pays
Part of the problem is that advanced AI models can’t seem to keep their answers short. When asked simple multiple-choice questions that should require just a one-letter answer, some reasoning models generated responses with over 14,000 words. One model produced a single answer that was 37,575 words long, longer than many novellas.
This verbosity comes with a cost. The study tracked “tokens,” which are units of text that AI models process and generate. While basic models might use 37 tokens (roughly 30 words) to answer a question, reasoning models averaged over 1,400 tokens per response, with some stretching into the thousands.
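To make the token arithmetic concrete, here is a minimal sketch of the rough tokens-to-words relationship, derived only from the study's own example figures (37 tokens ≈ 30 words); the function name and the ratio's generality are assumptions for illustration, since real tokenizers vary by model and text.

```python
# Rough tokens-per-word ratio implied by the study's example
# (37 tokens ~ 30 words); actual ratios vary by tokenizer and text.
TOKENS_PER_WORD = 37 / 30  # ~1.23 tokens per word

def approx_words(tokens: int) -> int:
    """Rough word-count estimate from a token count (hypothetical helper)."""
    return round(tokens / TOKENS_PER_WORD)

print(approx_words(37))    # a basic model's answer: ~30 words
print(approx_words(1400))  # an average reasoning-model answer: ~1,135 words
```

By this rough conversion, the average reasoning-model response of 1,400+ tokens runs to well over a thousand words, which is where the extra energy goes.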
Different subjects also demanded varying amounts of computational power. Abstract algebra consistently stumped the models and required the most energy, while questions about world history were relatively easier for AI to handle efficiently.
Finding the Sweet Spot
Not all the findings were concerning. The research revealed that some models strike a balance between performance and environmental impact. The Qwen 2.5 model with 72 billion parameters achieved strong 77.6% accuracy while emitting just 427 grams of CO2 equivalent, less than one-third the emissions of comparable reasoning models.
This suggests that AI companies could potentially design models that are both smart and environmentally conscious, though it may require sacrificing some of the advanced reasoning capabilities that make headlines.
With AI chatbots becoming as common as search engines, these energy costs add up quickly. The study notes that generative AI models already consume about 29.3 terawatt-hours annually, equivalent to Ireland’s entire national electricity consumption.
Yet despite growing awareness of climate change, the researchers found that only about 2% of AI research papers even mention carbon emissions or environmental impact. Most studies rely on theoretical estimates rather than real-world measurements like this one.
The research points to several potential solutions for reducing AI’s environmental footprint. Companies could focus on optimizing reasoning efficiency rather than simply maximizing model size and capabilities. The study suggests that developing more efficient reasoning strategies could maintain high accuracy while reducing emissions.
The wide variation in performance across different subject areas also indicates that specialized models designed for specific tasks might be more environmentally friendly than general-purpose reasoning systems.
“If users know the exact CO₂ cost of their AI-generated outputs, such as casually turning themselves into an action figure, they might be more selective and thoughtful about when and how they use these technologies,” says study author Maximilian Dauner from the Munich University of Applied Sciences, in a statement.
How much environmental cost are we willing to accept for smarter artificial intelligence? This research suggests that our current trajectory toward more powerful, reasoning-capable AI comes with steep environmental trade-offs that most users never see.
Paper Summary
Methodology
Researchers from Munich University of Applied Sciences tested 14 large language models with parameter counts ranging from 7 billion to 72 billion. They used the MMLU (Massive Multitask Language Understanding) dataset, selecting 500 questions across five subjects: Philosophy, High School World History, International Law, Abstract Algebra, and High School Mathematics. Each model answered these questions in two formats: multiple-choice (requiring a single-letter answer) and free-response (allowing unlimited length responses). All testing was conducted on an NVIDIA A100 GPU with 80GB memory, with energy consumption measured using the Perun framework and converted to CO2 equivalent emissions using a factor of 480 gCO2/kWh.
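The final conversion step is simple arithmetic; here is a minimal sketch, assuming only the 480 gCO2/kWh factor stated in the methodology. The Perun energy measurements themselves are not reproduced here, and the function name is hypothetical.

```python
# Emission factor used in the study: global grid average, in grams CO2eq per kWh.
EMISSION_FACTOR_G_PER_KWH = 480

def co2_equivalent_grams(energy_kwh: float) -> float:
    """Convert measured GPU energy (kWh) to grams of CO2 equivalent."""
    return energy_kwh * EMISSION_FACTOR_G_PER_KWH

# Working backwards from the paper's headline figure: 1,341 g CO2eq
# implies roughly 2.79 kWh of measured energy for all 1,000 questions.
print(round(1341 / EMISSION_FACTOR_G_PER_KWH, 2))  # ≈ 2.79
```

Because the factor is a global average, the same measured energy would map to very different emissions on, say, a mostly-renewable grid, which is one of the limitations the authors note.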
Results
Larger models consistently achieved higher accuracy but consumed considerably more energy. The best-performing model (Cogito 70B reasoning) reached 84.9% accuracy but emitted 1,341 grams of CO2 equivalent across all 1,000 questions. Reasoning-enabled models consumed 4-6 times more energy than standard models due to increased token generation. The smallest model (Qwen 7B) emitted only 27.7 grams but achieved just 32.9% accuracy. Subject-wise, Abstract Algebra proved most challenging and energy-intensive, while High School World History showed the highest accuracy rates. Token generation varied dramatically, with reasoning models producing up to 37,575 words for single responses.
Limitations
The study’s findings are limited to the specific hardware setup (NVIDIA A100 GPU) and energy profile (480 gCO2/kWh emission factor) used. Results may not be generalizable to other model families due to architectural differences between AI systems. The research did not include models with several hundred billion parameters, limiting conclusions about the largest available AI systems. The emission calculations depend heavily on local energy grids and infrastructure, so results could vary considerably in different geographic locations or with different hardware configurations.
Funding and Disclosures
Authors declared that no financial support was received for the research or publication of this article. They reported no commercial or financial relationships that could constitute a potential conflict of interest. Authors disclosed that generative AI was used only to check for typos and spelling errors in the manuscript preparation.
Publication Information
The study “Energy costs of communicating with AI” was published by Maximilian Dauner and Gudrun Socher from Munich Center for Digital Sciences and AI (MUC.DAI), Hochschule München University of Applied Sciences. It appeared in Frontiers in Communication (Volume 10, Article 1572947) on June 19, 2025. The article is open access under the Creative Commons Attribution License (CC BY).