
Every AI query has a carbon footprint. Here's what we know, what we don't, and what you can do about it.
The promise of artificial intelligence (AI) is efficiency: faster answers, automated workflows, and productivity gains that add up. What rarely makes the pitch deck is the energy required to deliver on that promise.
Behind every ChatGPT response, every AI-generated image, every "let me search that for you" sits a data center consuming electricity at a scale that's becoming impossible to ignore. And as AI embeds itself deeper into how companies operate, the sustainability teams responsible for tracking corporate emissions are facing an uncomfortable question: How do we account for something we can barely measure?
This post breaks down what we actually know about AI's environmental impact, why estimates vary so widely, and what companies can do today to measure and address emissions that are becoming harder to ignore.
To understand AI's carbon footprint, you first need to understand the infrastructure behind it: data centers.
According to the International Energy Agency's (IEA) 2025 "Energy and AI" report, global data centers consumed approximately 415 terawatt-hours (TWh) of electricity in 2024. That's about 1.5% of all electricity used worldwide. By 2030, the IEA projects this figure will more than double to 945 TWh, roughly equivalent to Japan's entire annual electricity consumption.
In the United States, the numbers are even starker. Data centers consumed 183 TWh in 2024, accounting for more than 4% of national electricity use. The IEA projects this will grow 133% by 2030, reaching 426 TWh. Data center electricity consumption is climbing at roughly 15% per year, four times faster than total electricity demand across all other sectors.
To put that in perspective: by the end of this decade, U.S. data centers are expected to consume more electricity than all energy-intensive manufacturing combined, including aluminum, steel, and cement production.
Data centers aren’t distributed evenly across the country. They cluster in regions with favorable conditions: reliable power, low land costs, tax incentives, and network connectivity. The result is that certain states bear a disproportionate share of the load.
Consider Virginia, where data centers now consume roughly 26% of the state's electricity - more than a quarter of all the power the state uses. Northern Virginia alone has become the largest data center market in the world, and the facilities there will need enough energy to power six million homes by 2030.
As data center demand grows, so does the strain on power grids and the pressure to meet that demand with whatever generation is available. Currently, natural gas supplies over 40% of electricity for U.S. data centers, with coal still contributing around 15%.
Data centers have existed for decades. What's different now is artificial intelligence.
Traditional computing workloads (storing files, running websites, processing transactions) are relatively predictable. AI workloads are not. Training a large language model requires running thousands of specialized processors at full capacity for weeks or months. And once the model is trained, every query triggers a cascade of computations that dwarf what a standard database lookup requires.
When researchers first started quantifying AI's carbon footprint, they focused on training. They found that training OpenAI’s GPT-3 model consumed approximately 1,287 megawatt-hours of electricity and produced an estimated 552 metric tons of CO2 equivalent emissions. That's roughly equivalent to 550 passengers flying round-trip from New York to London.
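As a quick sanity check, those two figures imply a grid emissions factor of roughly 0.43 kg CO2e per kilowatt-hour, close to the U.S. average at the time. Here's the arithmetic as a minimal sketch, using only the numbers above:

```python
# Back-of-envelope check on the GPT-3 training figures cited above.
training_energy_kwh = 1_287 * 1_000   # 1,287 MWh, expressed in kWh
training_emissions_kg = 552 * 1_000   # 552 metric tons CO2e, in kg

# Implied carbon intensity of the electricity used for training.
emissions_factor = training_emissions_kg / training_energy_kwh
print(f"Implied grid intensity: {emissions_factor:.2f} kg CO2e/kWh")  # ~0.43
```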
But training happens once. Inference (the industry term for actually using a trained model over and over) happens billions of times.
Google estimates that 60% of AI energy consumption now comes from inference, with training accounting for the remaining 40%. Other researchers suggest the split may be even more skewed, with inference responsible for 80-90% of a model's total lifecycle energy consumption.
This is the shift that sustainability teams need to understand. The environmental cost of AI isn't primarily the processing power required to build the model; it's the compute required to run it, multiplied by every user, every query, every day. Early carbon estimates focused almost entirely on training, which means most published figures dramatically undercount AI's actual footprint.
"For very popular models, such as ChatGPT, it could take just a couple of weeks for the model's usage emissions to exceed its training emissions."
—Dr. Sasha Luccioni, Climate Lead & AI Researcher, Hugging Face
Here's where things get counterintuitive. AI systems are becoming dramatically more efficient on a per-query basis. Google reported that the energy and carbon footprint of a median Gemini text prompt dropped by 33x and 44x respectively over a recent 12-month period.
Yet Google's total data center electricity consumption has doubled since 2020, and the company's greenhouse gas emissions have increased 51% since 2019.
This is Jevons Paradox in action: efficiency gains get swallowed by demand growth. The cheaper and faster AI becomes, the more people use it, and the more use cases emerge. Each individual query gets cleaner while the aggregate footprint balloons.
"The efficiency gains are real, but so is the demand curve. Every improvement in cost-per-query makes new use cases viable, and those use cases add up fast."
—Daniel Kokotov, Chief Technology Officer, CNaught
The question of how to account for something that's constantly changing becomes even harder when the companies building AI won't share emissions data. Most don't.
If you wanted to calculate the carbon footprint of your company's AI usage today, you'd run into three problems almost immediately.
First, competitive secrecy: most generative AI providers don't disclose energy or emissions data for their Large Language Models (LLMs). Second, constant change: as models evolve and efficiency improves, any estimate becomes outdated within months. Third, inconsistent methodologies: when data does exist, different providers measure different things, making comparisons nearly impossible.
The Stanford Foundation Model Transparency Index, published in December 2025, quantified just how wide this gap is.
There are early signs of progress. In August 2025, Google became the first major provider to publish detailed per-query environmental data for its Gemini model: 0.24 watt-hours of energy, 0.03 grams of CO2 equivalent, and 0.26 milliliters of water per median text prompt. The methodology was comprehensive, accounting for idle capacity, cooling overhead, and data center infrastructure.
Hugging Face, a leading AI platform and community that provides open-source tools, models, and datasets for machine learning, has also begun publishing emissions factors for models hosted on its platform, making it one of the few resources available for companies trying to estimate the footprint of open-source AI usage.
But these disclosures remain the exception. OpenAI and Anthropic, two of the most widely used AI providers, have not published sustainability reports or shared emissions data. Companies trying to account for those AI-related emissions are largely left guessing.
Why do estimates vary so widely? It depends on what you count. Does the estimate include the infrastructure overhead of running a data center? The energy consumed by machines sitting idle but ready to handle traffic spikes? The amortized cost of training? The embodied carbon in manufacturing the hardware?
Google's 0.03g figure accounts for all of these factors. Many independent estimates don't have access to the data required to do the same, so they make assumptions that can push the number higher. The takeaway isn't that one estimate is right and others are wrong. It's that without standardized disclosure, corporate sustainability teams have no reliable way to benchmark their AI usage against anything.
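To make the boundary problem concrete, here's an illustrative sketch of how accounting choices inflate a per-query estimate. Every number below is an assumption chosen for demonstration, not a published figure:

```python
# Illustrative only: how accounting boundaries change a per-query estimate.
# All values are assumptions for demonstration, not measured figures.
base_wh = 0.10                # hypothetical chip-level energy per query

pue = 1.2                     # power usage effectiveness (cooling, power delivery)
idle_overhead = 1.3           # capacity held in reserve for traffic spikes
training_amortization = 1.15  # training energy spread across lifetime queries

chip_only = base_wh
full_stack = base_wh * pue * idle_overhead * training_amortization

print(f"Chip-only boundary:  {chip_only:.2f} Wh/query")
print(f"Full-stack boundary: {full_stack:.2f} Wh/query")  # ~1.8x higher
# Same query, different boundaries - one reason published figures diverge.
```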
A 2025 benchmarking study analyzed 30 state-of-the-art models and found dramatic variation: the most energy-intensive models exceeded 29 Wh per long prompt - over 65 times more than the most efficient systems.
Even a relatively modest short query, at 0.42 Wh, adds up at scale. At 700 million queries per day, that's annual electricity consumption equivalent to 35,000 U.S. homes, evaporative water equal to the drinking needs of 1.2 million people, and carbon emissions requiring a Chicago-sized forest to offset.
A standard Google search uses approximately 0.3 watt-hours of energy and produces roughly 0.2 grams of CO2. By most estimates, a ChatGPT-style query uses somewhere between 3x and 10x more energy than a traditional web search. Generating an AI image consumes roughly the same energy as fully charging a smartphone. To put individual impact in perspective, analyst Andy Masley calculates that it would take roughly 1,000 ChatGPT queries in a single day to increase a typical American's daily emissions by 1%.
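Masley's arithmetic is easy to reproduce. The sketch below assumes roughly 0.4 g CO2e per query (near Google's disclosed Gemini figure) and U.S. per-capita emissions of about 15 metric tons of CO2 per year; both inputs are our assumptions for illustration:

```python
# Reproducing the order-of-magnitude claim above (inputs are assumptions).
us_per_capita_t_per_year = 15.0                  # assumed annual tCO2 per person
daily_g = us_per_capita_t_per_year * 1e6 / 365   # ~41,000 g CO2 per day
per_query_g = 0.4                                # assumed, near Google's figure

queries_for_one_percent = (daily_g * 0.01) / per_query_g
print(f"Queries to raise daily emissions 1%: ~{queries_for_one_percent:.0f}")
# ~1,000 queries - consistent with Masley's estimate.
```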
The newest "reasoning" models like OpenAI's o1 series, which perform extended chains of computation before responding, can use 50-100x more energy than standard queries. A research paper from Hochschule München University found that "smarter" LLMs produce up to 50x more carbon emissions than simpler models, and that complex questions generate up to 6x more emissions than those requiring concise answers.
Google's 2024 environmental report acknowledged that greenhouse gas emissions rose 48% compared to 2019, driven primarily by data center energy consumption. The 2025 report showed emissions up 51% since 2019. The company dropped its "carbon neutral" claim in 2023 and now describes its 2030 net-zero goal as "extremely ambitious," noting that "reducing emissions may be challenging due to increasing energy demands from the greater intensity of AI compute."
Microsoft reported a 29% emissions increase since 2020, attributing the growth to "the construction of more data centers that are designed and optimized to support AI workloads." Notably, 97% of Microsoft's emissions footprint comes from Scope 3 (supply chain and construction), which means the company's direct energy purchases tell only a fraction of the story.
OpenAI and Anthropic, by contrast, have disclosed nothing to date.
If AI emissions are hard to measure and providers aren't disclosing data, why should companies bother trying to track them?
Because the numbers are getting too big to ignore.
At 0.001% of your company's emissions, AI doesn't move the needle. At 10%, it becomes one of your largest line items. For companies with significant cloud and AI usage, that threshold is approaching faster than most realize.
The IEA projects that by 2030, AI-specific data center electricity consumption could reach 326 terawatt-hours annually in the US alone. That’s enough to power 22% of American households.
Three additional factors make the case urgent:
First, investors, employees, and customers are asking more pointed questions about corporate climate commitments and the role AI plays in them. Whether a company is developing its own AI products or using AI to drive efficiency, stakeholders increasingly want clarity on how these additional emissions are being measured and addressed.
For companies making public sustainability claims, AI represents a growing blind spot. It's only a matter of time before that blind spot becomes a reputational liability.
Second, regulation is expanding. The EU's Corporate Sustainability Reporting Directive (CSRD) is extending mandatory climate disclosure to tens of thousands of companies. California's SB 253 will require large companies doing business in the state to report emissions across all three scopes, while SB 261 mandates disclosure of climate-related financial risk.
None of these frameworks currently require AI-specific disclosure. But as AI becomes a larger share of corporate cloud and computing usage, it becomes a larger share of Scope 2 (purchased electricity) and Scope 3 (supply chain) emissions. Companies that wait for explicit requirements will find themselves scrambling to gather data they should have been collecting all along.
Third, you can't reduce what you don't measure. Companies that establish baselines now will be positioned to make meaningful improvements: choosing more efficient models, selecting providers with cleaner energy mixes, and optimizing when and how AI gets deployed.
Those improvements matter. Dr. Sasha Luccioni's research has shown that task-specific AI models (built to do one thing well) can use 30x less energy than general-purpose generative models applied to the same task.
"Task-specific models are often much smaller and more efficient, and just as good at any context-specific task."
—Dr. Sasha Luccioni, Climate Lead & AI Researcher, Hugging Face
Waiting for perfect data is a recipe for inaction. Here's a practical framework for getting started.
Any measurement is better than none. The right tool depends on your setup:
If you're using third-party APIs and need per-query benchmarks, the "How Hungry is AI?" dashboard provides model-level estimates for energy, water, and carbon across 30+ models - useful for approximating your footprint based on usage patterns.
For most companies, working backward from usage is the most practical starting point: how many queries are your teams making each day? Using mid-range estimates (2-4g CO₂ per query for text generation), you can develop a rough annual footprint to inform sustainability reporting.
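As a minimal sketch of that calculation - the daily query volume below is a placeholder to replace with your own usage data, and the per-query range is the mid-range estimate cited above:

```python
# Minimal sketch: rough annual AI footprint from daily query volume.
queries_per_day = 50_000        # placeholder - substitute your own usage data
grams_per_query = (2, 4)        # mid-range estimates cited above

for g in grams_per_query:
    tonnes_per_year = queries_per_day * g * 365 / 1_000_000
    print(f"At {g} g CO2e/query: ~{tonnes_per_year:.1f} tCO2e per year")
# 50,000 queries/day lands between ~36.5 and ~73 tCO2e annually - enough to
# decide whether AI usage belongs as a line item in your inventory.
```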
Model selection matters more than most realize. As noted earlier, "reasoning" models like OpenAI's o1 series can use 50-100x more energy than standard queries.
The Hochschule München study cited earlier, published in Frontiers in Communication in June 2025, found that "smarter" reasoning models can produce up to 50x more emissions than concise models for the same question. Complex queries, like abstract algebra or philosophy, generated up to 6x more emissions than straightforward questions. For routine tasks, simpler models aren't just faster - they're dramatically more efficient.
Hugging Face's AI Energy Score Leaderboard rates models on energy efficiency, making it easier to compare options before deployment. While coverage is still limited (closed-source providers have to opt in), it's one of the few independent benchmarks available for sustainability teams looking to measure and compensate for their AI emissions.
Some providers are also rethinking when and how compute happens. Companies like Crusoe Energy run AI workloads on stranded or flared natural gas that would otherwise be wasted, while others are experimenting with shifting non-urgent inference to times when grids are cleaner. These operational choices can meaningfully reduce the carbon intensity of AI workloads beyond just selecting efficient models.
Where your AI runs matters. Data center electricity is 48% more carbon-intensive than the U.S. average, according to MIT Technology Review. That’s largely due to grid location and reliance on peaker plants. A query processed in a facility powered by Quebec hydroelectricity has a different footprint than one processed in a coal-heavy grid.
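To see how much location alone matters, multiply the same per-query energy by different grid intensities. The intensity values below are illustrative assumptions, not measurements for any specific facility:

```python
# Same query energy, different grids (intensities are illustrative assumptions).
query_kwh = 0.0003  # ~0.3 Wh per query, in line with the figures above

grid_intensity_kg_per_kwh = {
    "hydro-heavy grid (e.g., Quebec)": 0.03,
    "U.S. average": 0.39,
    "coal-heavy grid": 0.90,
}

for grid, intensity in grid_intensity_kg_per_kwh.items():
    grams = query_kwh * intensity * 1000
    print(f"{grid}: {grams:.3f} g CO2e per query")
# The same workload can vary ~30x in footprint purely by location.
```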
Look for providers that disclose their energy sources and emissions. Google's Gemini disclosure, whatever its limitations, sets a benchmark that other providers should match. Transparency should factor into procurement decisions alongside performance and cost.
For emissions you can't yet eliminate, carbon credits offer a meaningful bridge. While credits should not replace reduction efforts, they can serve as an impactful complement. Microsoft purchased 3.5 million carbon removal credits in 2025 specifically to address emissions tied to its AI expansion. However, you don't need Microsoft's budget to take action.
The key is quality. Credits can create real impact when they come from projects with rigorous verification, transparent methodologies, and genuine climate impact.
CNaught is the easiest solution for companies of any size to offset their AI-related emissions with a diversified, science-backed portfolio of high-quality carbon credits. Rather than requiring you to vet individual projects or navigate the complexities of the voluntary carbon market, our portfolio approach balances project types, geographies, and methodologies to maximize impact while minimizing risk.
Every project in a CNaught portfolio goes through rigorous due diligence, including third-party ratings from independent agencies like BeZero, Calyx Global, Sylvera, and Renoster. Our team also provides ongoing support for emissions calculations and reporting, so you can account for your AI footprint without adding headcount or consultant fees.
Whether you're looking to offset cloud compute, employee AI usage, or broader operational emissions, the hardest part shouldn't be finding credits you can trust.
How much CO2 does a single AI query produce?
Estimates range from 0.03g to over 2g of CO2 per query, depending on methodology. Google's Gemini (the only model with official per-query disclosure) reports 0.03g CO2e per median text prompt. Most independent research puts ChatGPT-style queries in the 2-4g range when accounting for infrastructure overhead and amortized training costs.
How does AI energy use compare to a traditional web search?
AI queries typically use 3-10x more energy than traditional web searches. Google estimates a standard search uses approximately 0.3 watt-hours. Estimates for generative AI queries range from about 1 Wh to 3 Wh for most models, though Google's Gemini reports 0.24 Wh using a comprehensive methodology that accounts for efficiency optimizations.
Which contributes more, training or inference?
Inference (using AI models) now accounts for 60-90% of total AI energy consumption. While training a model like GPT-3 consumes around 1,287 MWh in a single intensive effort, inference happens billions of times daily across all users, making it the larger contributor to ongoing emissions.
How fast is AI-related electricity demand growing?
The IEA projects data center electricity consumption to double by 2030 (reaching 945 TWh globally). While per-query efficiency continues to improve, demand growth has historically outpaced those gains. Natural gas and coal are expected to meet over 40% of additional electricity demand through 2030, despite growth in renewables.
Can carbon credits address AI-related emissions?
Yes, though credits should complement reduction efforts rather than replace them. Companies like Microsoft are purchasing millions of carbon removal credits specifically to address AI-related emissions. The key is selecting high-quality credits from verified projects with transparent methodologies.
AI's carbon footprint is real, growing, and largely invisible. The companies building these systems have disclosed almost nothing about their environmental impact. The companies using these systems have few tools to measure their own exposure.
That will change. Regulatory pressure, stakeholder expectations, and the sheer scale of projected growth will force transparency. The question is whether your company gets ahead of that curve or scrambles to catch up.
The good news: you don't need perfect data to start. Establish baselines with the estimates available. Make smarter choices about which models to deploy and where. Build AI's carbon cost into procurement decisions. And where emissions remain, offset them with integrity.
The companies that measure, disclose, and act now will be the ones setting the standard everyone else follows.
Ready to get started? CNaught makes it easy to measure, manage, and offset your company's carbon footprint, including the emissions from your AI usage.