Every time you ask ChatGPT to write an email, create an image, or summarize a document, something happens that you probably don’t think about: somewhere, a data center drinks the equivalent of a water bottle and burns enough electricity to keep a lightbulb lit for several minutes.
It sounds dramatic, but it’s true. And as AI becomes woven into the fabric of our daily lives—embedded in our phones, computers, search engines, and workplace tools—the hidden environmental toll is becoming impossible to ignore.
The Shocking Numbers
By 2028, AI systems in the United States alone could consume as much electricity as 28 million households use in an entire year. That’s more than one in five of all U.S. households.
But the numbers get even more startling when you look at water consumption. Research suggests that by 2025, AI’s water footprint could match the entire global consumption of bottled water—somewhere between 312 billion and 765 billion liters. To put that in perspective, the high end of that range is enough water to fill roughly 300,000 Olympic-sized swimming pools.
And the carbon emissions? AI systems could be responsible for putting 32 to 80 million metric tons of carbon dioxide into the atmosphere in 2025 alone—roughly equivalent to the annual emissions of New York City or a small European country like Norway.
Why Does AI Need So Much Water?
You might be wondering: why on earth does asking a computer a question require water?
The answer lies in how data centers work. When you use ChatGPT or any AI tool, your request is processed by massive server farms packed with thousands of powerful computer chips. These chips generate enormous amounts of heat—so much that without cooling, they would quickly overheat and fail.
Most data centers use water-based cooling systems. Cold water circulates through the facility, absorbs the heat from the servers, and then evaporates or gets discharged. According to researchers, for every kilowatt-hour of energy a data center consumes, it typically needs two liters of water for cooling.
Each time you enter a 100-word prompt into an AI system, it uses roughly one bottle of water (about 519 milliliters). That might not sound like much, but with hundreds of millions of people using AI tools every day, those water bottles add up fast.
And here’s the troubling part: approximately 80% of that water simply evaporates. It’s gone. Meanwhile, many of these data centers are being built in water-stressed regions like Nevada and Arizona, where freshwater is already scarce.
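The rules of thumb above (about two liters of cooling water per kilowatt-hour, roughly a bottle per 100-word prompt, 80% of it evaporating) make it easy to sketch the scale yourself. Here is a minimal back-of-the-envelope estimate in Python; all of the inputs are the rough research figures cited in this article, not measured values, and the 10-million-prompt day is a hypothetical volume chosen for illustration:

```python
# Back-of-the-envelope water estimates using the rough figures cited above.
# Real values vary widely by facility, cooling design, and location.

LITERS_PER_KWH = 2.0          # cooling water per kWh of data-center energy
ML_PER_100_WORD_PROMPT = 519  # estimated water per 100-word prompt
EVAPORATION_SHARE = 0.80      # fraction of cooling water lost to evaporation

def cooling_water_liters(energy_kwh: float) -> float:
    """Cooling water needed for a given data-center energy draw."""
    return energy_kwh * LITERS_PER_KWH

def prompt_water_liters(prompts: float) -> float:
    """Estimated water footprint of a number of 100-word prompts."""
    return prompts * ML_PER_100_WORD_PROMPT / 1000  # mL -> L

# Hypothetical day: 10 million 100-word prompts.
total = prompt_water_liters(10_000_000)
evaporated = total * EVAPORATION_SHARE
print(f"{total:,.0f} L consumed, ~{evaporated:,.0f} L of it evaporated")
print(f"Cooling water for 1,000 kWh: {cooling_water_liters(1000):,.0f} L")
```

Even at that modest hypothetical volume, a single day works out to millions of liters, most of it evaporated rather than returned to the watershed.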
The Energy Appetite That Won’t Quit
The electricity demands are equally staggering. A simple ChatGPT query uses nearly 10 times more electricity than a traditional Google search. Creating an AI-generated image? That can use as much electricity as it takes to fully charge your smartphone—per image.
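Those per-query figures are tiny individually but compound quickly at scale. A quick sketch, assuming the commonly cited ~0.3 watt-hours for a conventional search and roughly ten times that for an AI query (both are estimates, not measured values), applied to a hypothetical 100 million queries per day:

```python
# Rough per-query energy comparison. Both figures are commonly cited
# estimates, not measurements, and real per-query costs vary by model.

GOOGLE_SEARCH_WH = 0.3   # estimated Wh per traditional web search
AI_QUERY_WH = 3.0        # roughly 10x, per the comparison cited above

def daily_energy_kwh(queries_per_day: float, wh_per_query: float) -> float:
    """Total daily energy in kWh for a given query volume."""
    return queries_per_day * wh_per_query / 1000  # Wh -> kWh

queries = 100_000_000  # hypothetical: 100 million queries in a day
print(f"Search engine: {daily_energy_kwh(queries, GOOGLE_SEARCH_WH):,.0f} kWh/day")
print(f"AI chatbot:    {daily_energy_kwh(queries, AI_QUERY_WH):,.0f} kWh/day")
```

The tenfold per-query gap carries straight through: the same traffic that a search engine serves on tens of thousands of kilowatt-hours costs an AI service hundreds of thousands.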
ChatGPT alone was using over 500,000 kilowatt-hours of electricity every day as of 2024, roughly the daily consumption of 17,000 U.S. households. And that’s just one AI service.
Lawrence Berkeley National Laboratory forecasts that by 2028, U.S. data centers could consume up to 12% of the nation’s total electricity supply. For context, data centers drew a far smaller share of the grid in 2017, even as they powered the rise of streaming services like Netflix, cloud storage, and social media. What changed? AI.
From 2005 to 2017, data center electricity use remained relatively flat thanks to efficiency improvements. But when AI took off in 2017, everything changed. Data centers started getting built with energy-intensive hardware specifically designed for AI, and electricity consumption began climbing sharply.
The Fossil Fuel Connection
Here’s where things get even more concerning: about 56% of the electricity powering U.S. data centers comes from fossil fuels. When tech companies promise their AI expansions will run on renewable energy, the reality often falls short.
Take Microsoft as an example. The company pledged to be carbon-negative and water-positive by 2030. Yet by Microsoft’s own admission, its total emissions were about 30% higher in 2024 than they were in 2020—largely due to AI rollout.
A Guardian investigation found that between 2020 and 2022, the real emissions from company-owned data centers of Google, Microsoft, Meta, and Apple were more than 600% higher than officially reported. The discrepancy comes from how companies calculate and report their environmental impact.
And here’s the kicker: as demand for AI computing power skyrockets, fossil fuel companies are lining up to profit. In some cases, data centers are turning to natural gas generators to supplement grid power. In Memphis, Tennessee, Elon Musk’s xAI facility was found to be operating at least 18 gas turbines—allegedly without proper permits—spewing pollution into predominantly Black neighborhoods already burdened by air quality issues.
The Human Cost
The environmental impact of AI isn’t just about abstract numbers on a chart. It translates into real consequences for real communities.
Data centers disproportionately impact environmental justice communities. Power plants that feed these facilities are more likely to be located near Black, Brown, and low-income neighborhoods. Residents in places like Randolph, Arizona and Memphis, Tennessee are facing increased air pollution from gas-powered turbines—pollution linked to asthma and lung cancer.
Meanwhile, the massive infrastructure buildout for AI is driving up utility bills for everyday customers, even as data center operators often receive discounts on their electricity use. As one legal fellow at Harvard’s Environmental and Energy Law Program puts it: “Why should we be paying for this infrastructure? Why should we be paying for their power bills?”
In water-scarce regions, competition for limited freshwater is intensifying. Data centers are tapping into the surface water and underground aquifers that communities depend on for survival, deepening droughts and shortages in areas that already have too little water.
The Transparency Problem
One of the biggest challenges in addressing AI’s environmental impact is that we don’t actually know the full extent of the problem—and that’s by design.
There are currently no federal or state regulations requiring tech companies to disclose their AI-specific energy and water consumption. Major companies like Google, Microsoft, Meta, and Amazon publish sustainability reports, but these rarely distinguish between AI and non-AI computing activities.
When pressed, some companies have even argued that they shouldn’t have to report certain environmental impacts. Google, for instance, stated in a recent report that it didn’t want to disclose the water consumed by power plants that supply electricity to its data centers because it doesn’t “fully control” that water use—even though the water consumption directly results from Google’s electricity demand.
This lack of transparency makes it nearly impossible to hold companies accountable or to make informed decisions about AI’s role in our future.
Are There Solutions?
The picture painted so far is pretty bleak, but it’s not entirely hopeless. There are potential paths forward:
Better cooling technology: Novel approaches like direct-to-chip liquid cooling and immersion cooling can significantly reduce both water and energy usage. Some experimental systems could cut water consumption by 32% and carbon emissions by 73% compared to current methods.
Smarter location choices: Building data centers in regions with cleaner electricity grids and better water availability makes a huge difference. European data centers, for example, produce about half the carbon emissions per unit of electricity compared to U.S. facilities because Europe’s power grid is cleaner.
More efficient AI models: Not all AI models are created equal. Research has shown that with smarter design choices, it’s possible to create models that achieve similar results while using a fraction of the resources. The question is whether companies will prioritize efficiency when there’s pressure to constantly release bigger, more powerful models.
Regulatory action: Some legislators are trying to address the problem. Senator Edward Markey of Massachusetts has introduced legislation to set federal standards for measuring AI’s environmental footprint. States like California and Connecticut are considering their own regulations.
What Can You Do?
As an individual user, you’re not powerless. Here are some ways to reduce your AI footprint:
- Be selective: Not every task needs AI. Save AI tools for when they genuinely add value rather than using them for everything.
- Choose text over images: Generating text uses far less energy than creating images or videos. If you don’t need a visual, stick with text.
- Demand transparency: When you see companies touting their AI capabilities, ask them about their environmental impact. Use your voice as a consumer.
- Support regulation: Contact your representatives and express support for legislation that would require tech companies to disclose and reduce their environmental impact.
The Bigger Question
Ultimately, the environmental crisis created by AI forces us to confront a fundamental question: What are we getting in return?
For all the electricity and water consumed, for all the carbon emissions and community impacts, what benefits is AI actually delivering? Are we using this technology to solve critical problems, or are we simply using it because it’s there?
Some AI applications genuinely improve lives—helping doctors diagnose diseases, accelerating scientific research, making information more accessible. But a lot of AI use is far more trivial: generating marketing copy, creating synthetic influencer photos, or adding “smart” features to products that don’t really need them.
As Cornell professor Fengqi You puts it: “This is the build-out moment. The AI infrastructure choices we make this decade will decide whether AI accelerates climate progress or becomes a new environmental burden.”
Right now, we’re at a crossroads. We can continue down the path of unchecked AI expansion, accepting the environmental costs as the price of progress. Or we can demand better—more transparency, more efficiency, more thoughtful deployment of this powerful technology.
The invisible environmental toll of AI won’t stay invisible forever. The water used, the electricity consumed, the emissions produced—they all accumulate. And unlike the ephemeral text and images AI generates, these environmental impacts are very, very real.
The next time you use AI, that awareness might just make you pause for a second. And maybe, just maybe, that pause is exactly what we need.