Artificial Intelligence (AI) is reshaping our world—from powering voice assistants and chatbots to optimizing supply chains and enabling breakthroughs in medicine. But behind the marvel of machine learning and automation lies a growing environmental concern: the hidden carbon and resource cost of building and running AI systems.
At the heart of AI lies data—and a lot of it. Training modern AI models like large language models (LLMs) requires vast computational power. This power comes from massive data centers, filled with thousands of GPUs and servers, which consume huge amounts of electricity and water.
A 2019 study from the University of Massachusetts Amherst estimated that training a single large AI model can emit as much carbon as five cars over their entire lifetimes, fuel included. More recently, training GPT-3 was estimated to consume 1,287 MWh of electricity and generate over 550 metric tons of CO₂. These figures are likely even higher for more powerful successors like GPT-4.
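As a rough consistency check, the two published GPT-3 figures imply a grid carbon intensity we can compute directly (a sketch: the 1,287 MWh and 552-tonne values are the published estimates; the rest is simple arithmetic):

```python
# Implied carbon intensity of the published GPT-3 training estimates.
TRAINING_MWH = 1287          # estimated electricity used to train GPT-3
TRAINING_TONNES_CO2 = 552    # estimated emissions in metric tons of CO2

def carbon_intensity(mwh: float, tonnes_co2: float) -> float:
    """Kilograms of CO2 emitted per megawatt-hour of electricity."""
    return tonnes_co2 * 1000 / mwh

print(f"{carbon_intensity(TRAINING_MWH, TRAINING_TONNES_CO2):.0f} kg CO2/MWh")
```

The result, roughly 429 kg CO₂ per MWh, is consistent with a grid that still leans heavily on fossil fuels, which is the crux of the concern.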
It’s not just training that uses energy. Once deployed, AI models continue to require substantial computing power to serve millions of users in real time. This is particularly true for large-scale applications like search engines, recommendation systems, and conversational AI.
Even simple queries to an LLM can require 10 to 100 times more energy than a traditional web search. Multiply that by billions of interactions daily, and the energy demand grows fast.
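The scale of that multiplication is easy to sketch. The per-query figures below are illustrative assumptions (a search at a fraction of a watt-hour, an LLM query at roughly ten times that), not measurements:

```python
# Back-of-envelope comparison of daily energy demand for search vs. LLM queries.
WH_PER_WEB_SEARCH = 0.3           # assumed Wh per traditional web search
WH_PER_LLM_QUERY = 3.0            # assumed Wh per LLM query (~10x a search)
QUERIES_PER_DAY = 1_000_000_000   # assumed daily query volume

def daily_mwh(wh_per_query: float, queries_per_day: int) -> float:
    """Total daily energy in megawatt-hours (1 MWh = 1,000,000 Wh)."""
    return wh_per_query * queries_per_day / 1_000_000

search_mwh = daily_mwh(WH_PER_WEB_SEARCH, QUERIES_PER_DAY)
llm_mwh = daily_mwh(WH_PER_LLM_QUERY, QUERIES_PER_DAY)
print(f"Search: {search_mwh:,.0f} MWh/day; LLM: {llm_mwh:,.0f} MWh/day")
```

Under these assumptions, a billion LLM queries a day would draw thousands of megawatt-hours daily, more than the entire one-time training cost of GPT-3 every single day.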
Most AI models are hosted in large data centers, many of which are powered by a mix of renewable and fossil-fuel-based energy. Companies like Google, Microsoft, and Amazon have made commitments to carbon neutrality, but not all AI infrastructure is powered by clean energy.
And there’s more: data centers also consume water for cooling. A 2023 study estimated that training GPT-3 required roughly 700,000 liters of fresh water to cool the servers involved. With global water stress increasing, this raises new concerns.
Beyond energy and water, building the hardware for AI systems—GPUs, semiconductors, and servers—requires rare earth minerals and generates e-waste. Mining and manufacturing these components have well-documented environmental and human rights implications, including deforestation, water pollution, and labor exploitation.
Despite these concerns, AI also holds tremendous potential for environmental good: it can improve the efficiency of power grids and supply chains, sharpen climate and weather modeling, and help monitor deforestation, pollution, and emissions at scale.
The challenge is ensuring the environmental costs of developing AI don’t outweigh its benefits.
To mitigate AI’s environmental footprint, companies and policymakers must prioritize energy-efficient model architectures and training methods, power data centers with clean energy and water-conscious cooling, and report the energy and water costs of training and inference transparently.
AI has the potential to help solve some of humanity’s biggest problems—but only if we build it responsibly. As we race to develop more advanced models, we must also consider their unseen environmental impact. Sustainability should be a core principle—not an afterthought—of the AI revolution.