Large language models don’t just live in the cloud; they live on the grid. That one question you asked ChatGPT? It likely cost more energy than boiling your morning tea1. And it happened a billion times today.
Lifecycle research2 shows that maintaining and retraining large-scale models can generate emissions equivalent to hundreds of thousands of miles driven by a gas-powered vehicle, once infrastructure and inference are factored in. These aren't one-time costs—they're baked into the daily rhythm of how modern AI operates.
The International Energy Agency projects that data-center electricity demand could more than double to 945 TWh by 2030—roughly Japan’s current consumption—and AI is the biggest driver of that spike3. During model training, those loads come in short, brutal surges; according to Hitachi Energy4, these intense bursts of demand are already contributing to instability in some regions, overwhelming grid operators and forcing changes to electricity planning.
Water is another hidden input. That includes both direct water use for cooling and indirect impacts through electricity generation. Most data centers don’t publicly disclose their water sources, making it difficult to assess how this consumption affects already vulnerable aquifers, river systems, and municipal supplies.
What about AI’s potential to help?
AI doesn't have to be inherently extractive. In fact, it's already helping reduce emissions and environmental damage in a range of sectors—just not yet at scale. One compelling example is the use of digital twins: real-time, virtual replicas of physical systems that allow engineers and planners to simulate and optimize performance. In energy infrastructure, digital twins are used to monitor and manage hydropower dams—tracking everything from structural stress to flow dynamics—so operators can fine-tune turbine output, reduce unnecessary water releases, and time discharges to minimize ecological disruption downstream.
Outside of hydropower, digital twins help cities model urban heat islands, manage stormwater systems, and plan for climate adaptation. AI-powered remote sensing is helping track deforestation in near real time, while grid-balancing algorithms and predictive weather models are improving the integration of renewable energy. In agriculture, AI models are helping farmers reduce fertilizer overuse and improve crop rotation by analyzing real-time soil health data and satellite imagery. Precision agriculture tools like AI-enabled drones and smart irrigation systems minimize water and chemical inputs, cutting both emissions and runoff. Meanwhile, IoT sensors embedded in smart buildings and manufacturing systems feed AI models that optimize lighting, heating, and machinery usage, reducing energy waste across sectors. Together, these tools show what regenerative AI could look like in practice.
These applications are promising—but they're still exceptions. The challenge now is to make these value-creating uses the default, not the outlier.
The Problem with Generative AI
To understand the need for regenerative design, we first have to look at what we’re up against.
AI’s footprint is massive—and growing
In 2024, the utility Constellation Energy struck a deal with Microsoft to restart the dormant Three Mile Island Unit 1 reactor5. The goal? A dedicated, carbon-free energy source for Microsoft’s AI data centers. While the reactor isn’t yet back online, the move signals just how extreme energy demands have become, and how far companies are willing to go to meet them.
Training GPT-3 emitted an estimated 552 metric tons of CO₂6—roughly the emissions of driving a gas-powered car around the Earth 60 times. GPT-4’s footprint is believed to be even larger, though details are scarce, and new models will keep coming. With each model leap, we’re not just getting smarter tools—we’re paying for them in energy, emissions, and infrastructure strain. As adoption spreads, billions of daily interactions compound into a global carbon burden. And most users never think about it.
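The car comparison is easy to sanity-check. Here is a back-of-the-envelope calculation; the per-mile emissions figure (roughly 400 g of CO₂ per mile for an average gasoline car, in line with EPA estimates) and Earth's circumference are assumptions I've supplied, not numbers from the sources above:

```python
# Back-of-the-envelope check of the GPT-3 training comparison.
# Assumed inputs: ~400 g CO2 per mile for an average gasoline car,
# Earth's equatorial circumference ~24,901 miles.
TRAINING_EMISSIONS_G = 552 * 1_000_000   # 552 metric tons, in grams
CAR_G_PER_MILE = 400                     # assumed average car emissions
EARTH_CIRCUMFERENCE_MILES = 24_901

equivalent_miles = TRAINING_EMISSIONS_G / CAR_G_PER_MILE
laps_around_earth = equivalent_miles / EARTH_CIRCUMFERENCE_MILES

print(f"{equivalent_miles:,.0f} miles, ~{laps_around_earth:.0f} trips around the Earth")
```

With these assumptions the answer lands in the mid-50s; a slightly dirtier per-mile figure pushes the multiple past 60, so the comparison holds at the order of magnitude that matters.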
Water Justice
Data centers rely on large-scale water-cooled systems to keep high-performance AI clusters from overheating. In 2023, they used over 17 billion gallons of water7—about the same as a mid-sized city, and triple the 2014 total. The use is concentrated in water-stressed areas like Phoenix, Santa Clara, and Eastern Oregon, where it competes with household, agricultural, and ecological needs. It’s mostly hidden from the public, rarely disclosed, and often subsidized; with little oversight or transparency, communities are left to absorb the consequences. The AI boom is now a water justice issue.
It’s not just about efficiency
Yes, companies are racing to build faster, leaner models, and technical strategies to cut AI’s energy consumption have made real progress: quantization lowers the precision of calculations to reduce compute needs; pruning removes unnecessary parts of a model; neuromorphic chips mimic how the brain works to boost efficiency. But these gains are quickly eclipsed by surging demand. The more efficient we get, the more people use AI—bringing us back to where we started. This is Jevons Paradox in action: efficiency alone drives more consumption unless paired with clear limits or a shift in values. Regeneration demands more than better tech—it requires a new direction.
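To make the quantization idea concrete, here is a minimal sketch in plain Python (no ML framework assumed, and the weight values are made up): it maps 32-bit float weights to 8-bit integers plus one shared scale factor, trading a little precision for a 4x reduction in memory, which is the basic bargain quantized models make.

```python
def quantize_int8(weights):
    """Map float weights to int8 values plus a scale (symmetric quantization)."""
    scale = max(abs(w) for w in weights) / 127   # largest weight maps to +/-127
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 representation."""
    return [v * scale for v in q]

# Hypothetical layer weights, for illustration only.
weights = [0.81, -1.27, 0.003, 0.54, -0.92]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)

# float32 stores 4 bytes per weight; int8 stores 1 (plus one shared scale).
fp32_bytes = len(weights) * 4
int8_bytes = len(q) * 1
worst_error = max(abs(a - b) for a, b in zip(weights, restored))
print(fp32_bytes, int8_bytes, worst_error)
```

The rounding error stays below the scale factor, which is why int8 inference usually loses little accuracy while cutting memory traffic, and with it energy per query.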
Agentic AI could silently escalate the problem
Agentic systems don’t just complete tasks—they rewrite workflows, reduce the need for human labor, and multiply energy usage, often with little oversight. I’ve been experimenting with them, and I can tell you they’re powerful. For my small business, they help me stretch in ways I couldn’t if I had to rely on employee labor. But, as discussed in an earlier piece, this raises profound questions about labor, value, and economic participation.
Autonomy isn’t free. One prompt no longer means one process; it might trigger thousands of autonomous actions across distributed systems. Without carbon-aware throttling, emissions tracking, or policy boundaries, these agents could unleash runaway compute loads—amplifying environmental impacts faster than any single model training run. Multi-agent systems and AI swarms, in particular, can operate in parallel across platforms, issuing hundreds or thousands of actions without centralized oversight. Their collective power enables new forms of coordination and creativity—but without constraints, they risk embedding unsustainable behaviors at scale.
Projects like AutoGPT8 and work tracked by Stanford's Center for Research on Foundation Models9 have already demonstrated how quickly agentic behavior can scale. If not governed responsibly, these systems risk locking in inefficiencies before we even recognize their footprint.
RE-generative AI?
“Regenerative” is a paradigm shift borrowed from agriculture. In regenerative farming, the goal isn’t just to do less harm. It’s to do more good: rebuild topsoil, restore biodiversity, and bring depleted systems back to life. The same mindset can—and should—apply to AI.
What if AI systems were designed not just to avoid damage, but to actively repair what’s broken? To contribute more than they consume. To increase not just economic, but ecological and social value too.
That’s the promise of regenerative AI. But we’re not there yet.
What Regeneration Looks Like
In addition to examples already mentioned, like digital twins and AI analysis of IoT data, there are many ways to put AI to regenerative use. Here are just a few:
1. Use AI to optimize itself.
Organizations like Hugging Face are working on energy-efficient model architectures and dataset curation tools10 to cut down on compute needs. The MLCO2 project offers tools like CodeCarbon11 that help track emissions from specific AI workloads—making it possible to tune experiments or delay training until grid carbon intensity is lower.
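The “delay training until grid carbon intensity is lower” idea is simple enough to sketch without any particular library (CodeCarbon handles the actual measurement; the hourly forecast numbers and window-picking helper below are illustrative assumptions of mine):

```python
def cleanest_start_hour(forecast, run_hours):
    """Pick the start hour that minimizes total grid carbon over a training run.

    `forecast` is a list of hourly carbon intensities in g CO2/kWh;
    the values used below are made up for illustration.
    """
    best_start, best_total = 0, float("inf")
    for start in range(len(forecast) - run_hours + 1):
        total = sum(forecast[start:start + run_hours])
        if total < best_total:
            best_start, best_total = start, total
    return best_start, best_total

# Hypothetical day: dirty morning peak, clean midday solar, dirty evening ramp.
hourly = [420, 410, 400, 380, 350, 300, 260, 220, 180,
          150, 130, 120, 125, 140, 170, 220, 290, 360,
          430, 450, 440, 430, 425, 420]
start, total = cleanest_start_hour(hourly, run_hours=4)
print(f"start at hour {start}, total intensity {total}")
```

On this toy forecast, shifting a four-hour job into the midday solar window roughly halves its grid carbon versus running at the evening peak—the same scheduling logic works for real intensity feeds.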
2. Build with transparency.
Meta and Microsoft have begun publishing emissions reports for their AI training cycles, but very few labs report energy use for inference, water consumption, or embodied hardware emissions. MLCommons12 is trying to standardize these metrics. Still, most model and system cards—the transparency documents that describe how AI models work—don’t mention environmental impact at all.
3. Design for restoration.
Regenerative AI applications are gaining traction in agriculture and energy. Companies like Perennial13 are using satellite imagery and ML to monitor soil carbon and enable regenerative practices. Others like John Deere14 have demonstrated that precision spraying with computer vision can reduce herbicide use by over 50%.
Land O’Lakes’ sustainability business Truterra15 has launched a farmer-owned carbon marketplace called TruCarbon. It helps farmers adopt regenerative practices like cover cropping and conservation tillage, then quantify and sell carbon credits to buyers like Microsoft. The system puts farmers at the center and provides new revenue streams tied to ecological stewardship—offering a scalable model for climate-aligned AI applications.
In the energy sector, Google DeepMind’s machine learning for wind farms16 boosted the economic value of wind energy—reportedly by around 20 percent—through better output forecasts. These are small but real examples of AI helping restore balance, not just squeeze out efficiency.
4. Embed community-led governance.
The Green Software Foundation17 is creating open protocols and communities of practice around sustainable software. But outside of high-income countries, few digital rights organizations have the resources to engage with AI governance. Projects like Data Stewardship for Development18 and Digital Public Goods Alliance19 are working to bridge this gap.
What hasn’t worked? Relying on voluntary disclosures and best intentions. Industry sustainability claims often lack independent verification. And techno-solutionism without equity has repeatedly led to extractive systems dressed up as progress.
Ultimately, AI is part of the same system that degraded our soil, polluted our air, and concentrated wealth. The regenerative lens reminds us that environmental repair isn’t just about new tools—it’s about rebalancing relationships between technology, ecosystems, and people. If AI is to serve the planet, it must be accountable not only to market incentives but to the communities and climates it touches.
What You Can Do
If you use AI tools:
Be intentional. Try lighter-weight models when possible. Don’t leave tools running in the background when you’re not using them. Ask the platforms you use to disclose their energy and water impact, and favor those that do and that set reduction targets. If you’re part of a company or nonprofit using AI, push internally for sustainability reviews and greener alternatives.
If you're concerned about tech’s direction:
Use your voice. Ask your elected officials to support transparency rules, emissions labeling, and ethical AI guidelines. Push for policies that ensure AI benefits people and the planet, not just corporations. Pressure your workplace to treat AI use like an environmental decision. Look for civic organizations advocating on these fronts and amplify them.
You don’t need to code to shape AI. Every click, prompt, and subscription is a form of power.
If you work in education, organizing, or community spaces:
Help others understand that AI has a footprint. Raise awareness, create conversations, and include environmental impact in how we talk about digital literacy and tech access.
If you're just paying attention:
Stay curious. Follow the power, the water, and the money. When AI gets hyped, ask what’s being left out—and who’s left bearing the cost.
Bottom Line
AI’s environmental impact isn’t an accident. It’s a choice—and it can be redesigned. But only if we stop thinking about “generative AI” as a magic content machine and start thinking about regenerative AI as a systems-level commitment.
Let’s build intelligence that heals, not harms.
If you found this valuable, subscribe to Anthralytic for more posts at the intersection of AI and social impact and share with your network.
Source estimate based on aggregated media coverage and industry disclosures: MIT Technology Review, Plan A estimates, and OpenAI emissions reports. No peer-reviewed source currently provides a definitive per-query estimate.
https://www.researchgate.net/publication/355843251_Sustainable_AI_Environmental_Implications_Challenges_and_Opportunities
https://www.iea.org/reports/world-energy-employment-2023
https://www.hitachienergy.com/news-and-events/blogs/2025/03/ai-load-impact-on-data-centers-adapting-to-the-future-of-infrastructure
https://newatlas.com/energy/three-mile-island-nuclear-power-microsoft-data-centers/
https://arxiv.org/abs/2104.10350
https://andthewest.stanford.edu/2025/thirsty-for-power-and-water-ai-crunching-data-centers-sprout-across-the-west/?utm_source=chatgpt.com
https://agpt.co/blog
https://crfm.stanford.edu/
https://huggingface.co/AIEnergyScore
https://mlco2.github.io/codecarbon/
https://mlcommons.org/
https://www.perennial.earth/post/perennial-featured-by-nasa-for-innovative-use-of-satellite-technology
https://www.deere.com/en/sprayers/see-spray/
https://www.landolakesinc.com/news-features/how-americas-farmers-are-helping-businesses-reduce-their-carbon-footprint/
https://deepmind.google/discover/blog/machine-learning-can-boost-the-value-of-wind-energy/
https://greensoftware.foundation/
https://medium.com/data-stewards-network
https://www.digitalpublicgoods.net/