While ChatGPT continues to reshape how we work, write, and even think, there’s a growing conversation behind the code—and it’s not about productivity. It’s about power. Literally.
Training and running large language models like ChatGPT requires massive amounts of electricity and water. Every time you prompt it, somewhere a data center kicks into high gear, often powered by fossil fuels. A widely cited 2019 study from the University of Massachusetts Amherst estimated that training a single large AI model can emit as much carbon as five cars over their entire lifetimes.
And then there’s the water. Keeping data centers cool consumes millions of gallons, a fact rarely mentioned in Silicon Valley press tours. A study out of the University of California, Riverside found that ChatGPT-style tools can “drink” up to a half-liter of water for every 20–50 prompts. That adds up fast.
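Just how fast it adds up is easy to sanity-check with back-of-envelope arithmetic. A minimal sketch using the half-liter per 20–50 prompts figure above; the daily prompt volume is a purely hypothetical assumption, not a measured number:

```python
# Back-of-envelope estimate of chatbot water use, based on the
# ~0.5 L per 20-50 prompts range cited above. The daily prompt
# volume below is an illustrative assumption, not a measurement.

LITERS_PER_BOTTLE = 0.5
PROMPTS_PER_BOTTLE_LOW, PROMPTS_PER_BOTTLE_HIGH = 20, 50

def daily_water_liters(prompts_per_day: float) -> tuple[float, float]:
    """Return (low, high) daily water-use estimates in liters."""
    low = prompts_per_day / PROMPTS_PER_BOTTLE_HIGH * LITERS_PER_BOTTLE
    high = prompts_per_day / PROMPTS_PER_BOTTLE_LOW * LITERS_PER_BOTTLE
    return low, high

# Hypothetical volume: 100 million prompts per day
low, high = daily_water_liters(100_000_000)
print(f"{low:,.0f} to {high:,.0f} liters per day")
# → 1,000,000 to 2,500,000 liters per day
```

Even at the conservative end of that range, a single day of hypothetical traffic works out to a million liters of water.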
Tech leaders are starting to feel the heat—literally. OpenAI, Google, and Microsoft have all released statements committing to “green AI,” but critics argue that transparency is still lacking. Meanwhile, sustainability advocates are calling for standardized reporting on AI’s energy use and a serious look at decentralized, low-impact alternatives.
AI isn’t going anywhere. But as the tech races forward, the real test will be whether companies can innovate without draining the planet. Intelligence shouldn’t come at Earth’s expense.
