What The Tech: How artificial intelligence is impacting your utility bill
BY JAMEY TUCKER, Consumer Technology Reporter
Most of us never think about the environment when we hop on ChatGPT. But behind every response are massive data centers whose servers run around the clock.
The Power Problem
In the U.S. alone, data centers use roughly 4 percent of all electricity generated, according to a 2023 study, and that share is expected to more than double as new facilities come online.
Consider this: some AI-focused data centers currently under construction could use as much electricity as 2 million homes.
Water: The Other Resource Drain
Electricity is just part of the story. It takes a massive amount of water to keep those data centers cool. According to a Department of Energy study, some centers use millions of gallons every day, equivalent to the water needs of a town of 50,000 people.
And it’s not just large queries that are taxing the system. A UC Riverside study found that every session with an AI chatbot uses roughly a half-liter of fresh water to cool the servers.
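The water equivalences above can be sanity-checked with quick back-of-the-envelope math. This sketch assumes a U.S. household average of about 100 gallons of water per person per day, a figure not given in the article:

```python
# Rough sanity check of the water figures quoted above.
# The 100-gallon per-person daily figure is an assumed U.S. average,
# not a number from the article or the cited studies.
people = 50_000
gallons_per_person_per_day = 100          # assumed average

town_daily_gallons = people * gallons_per_person_per_day
print(town_daily_gallons)                 # 5 million gallons per day

# The UC Riverside estimate: roughly half a liter per chatbot session.
liters_per_session = 0.5
gallons_per_session = liters_per_session / 3.785   # liters to gallons
print(round(gallons_per_session, 2))
```

At these assumed rates, "millions of gallons a day" does line up with a town of 50,000, while each individual chat session draws only a fraction of a gallon; the impact comes from the sheer volume of sessions.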
Growth Is Accelerating
Currently, there are approximately 4,000 data centers in the U.S., with major expansions underway in Virginia, Georgia, Arizona, and Ohio. Environmentalists warn that some new facilities are requesting as much power as small cities, even before they break ground.
Small Actions, Big Footprint
Even the smallest interactions add up. If 3 million people type a simple “thank you” into a chatbot, that could consume around 1,500 kilowatt-hours of electricity, roughly what a typical home uses in a month and a half.
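The "thank you" estimate works out to a tiny amount of energy per message that becomes large in aggregate. This sketch uses the article's figures plus an assumed average household consumption of about 900 kilowatt-hours per month:

```python
# Back-of-the-envelope check of the "thank you" estimate above.
# The 900 kWh/month household figure is an assumed U.S. average,
# not a number from the article.
queries = 3_000_000          # people typing a simple "thank you"
total_kwh = 1_500            # estimated electricity for those messages

wh_per_query = total_kwh * 1_000 / queries   # watt-hours per message
print(wh_per_query)          # 0.5 Wh per "thank you"

months = total_kwh / 900     # equivalent months of household use
print(round(months, 1))      # about a month and a half
```

Half a watt-hour per message is negligible on its own; multiplied by millions of users, it adds up to the month-and-a-half of household electricity the article describes.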
What’s Being Done?
President Trump’s “Ratepayer Protection Pledge” encourages large tech companies to cover their own energy costs, thereby preventing local families from bearing the burden. But as AI continues to grow, the conversation around its environmental impact is only getting started.