The True Cost of AI: Energy, Emissions, and Your Electric Bill
- Eugenie Lewis
- Feb 24
- 2 min read
As artificial intelligence rapidly goes mainstream, the demand for high-quality AI tools and models is matched by companies building warehouse-sized data centers to store and process vast amounts of information. For the average American, AI typically makes daily life easier, whether by organizing tasks, helping with homework, or answering queries. ChatGPT, the most ubiquitous of these tools, is now thought to be the fifth most visited website in the world, and its success has led the White House to collaborate with OpenAI on a $500 billion plan to build new data centers. However, the public is becoming increasingly aware that this technology does not come without a cost, as these data centers demand ever-growing amounts of money and energy to keep running.

Currently, sources estimate that there are about 5,400 data centers in the United States, 326 of them in California. According to CalCCA, California's data centers combined consumed an estimated 5.58 terawatt-hours of electricity in 2024, enough to power roughly 526,000 average US homes for a year. Nationwide, data centers already account for about 4.4% of all electricity consumed, a share projected to climb to around 8.0% in the coming years. Natural gas and coal, both of which generate greenhouse gases, supply about 40% and 15% of the energy used by these data centers, respectively.
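For context, the homes-powered figure follows from a rough conversion, assuming the commonly cited average of roughly 10,600 kilowatt-hours of electricity per US household per year (that per-home figure is an assumption, not stated by CalCCA):

$$
\frac{5.58\ \text{TWh}}{\approx 10{,}600\ \text{kWh per home per year}}
= \frac{5.58 \times 10^{9}\ \text{kWh}}{\approx 10{,}600\ \text{kWh per home per year}}
\approx 526{,}000\ \text{homes}
$$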
As data centers expand, the energy costs of keeping them running will fall on the Americans who foot the bill: electricity bills are estimated to rise 8% nationally, with roughly one-fifth of the American population affected by the price hikes. The responsibility for creating policies to regulate AI growth falls on the government, while citizens must pressure their officials to rein in the energy costs and usage tied to these data centers.
