| Author Name | Willem THORBECKE (Senior Fellow, RIETI) |
|---|---|
| Creation Date/NO. | February 2026 |
| Download / Links | |
This Non Technical Summary does not constitute part of the above-captioned Discussion Paper but has been prepared for the purpose of providing a bold outline of the paper, based on findings from the analysis for the paper and focusing primarily on their implications for policy. For details of the analysis, read the captioned Discussion Paper. Views expressed in this Non Technical Summary are solely those of the individual author(s), and do not necessarily represent the views of the Research Institute of Economy, Trade and Industry (RIETI).
Artificial Intelligence (AI) use is exploding. OpenAI launched the generative AI chatbot ChatGPT in November 2022. By October 2025 it had 800 million weekly users, up from 400 million in February 2025 (Singh, 2025). Monthly visits reached 5.8 billion and the number of daily queries reached 2 billion.
The four leading tech firms (Amazon, Google, Meta, and Microsoft) are riding the AI juggernaut. They have woven AI into their offerings. Amazon uses AI to enhance customer service. Google offers the Gemini AI chatbot. Meta employs AI to enhance its social media and messaging apps. Microsoft partners with OpenAI.
This paper investigates how these four leading firms have performed since ChatGPT launched the AI revolution in November 2022. Their stock market capitalizations have soared. Empirical evidence presented in the paper indicates that their market capitalizations have risen well beyond predicted levels, by roughly 500 billion dollars for Amazon and by trillions of dollars for Meta.
Despite these windfalls, Big Tech companies negotiate aggressively with public utilities to reduce their electricity expenses. This transfers their electricity costs to other customers. Figure 1 shows that, since the launch of ChatGPT in November 2022, average electricity prices in the U.S. have risen more than 15%. Rapier (2025) observed that the leading reason U.S. electricity prices soared in 2025 was energy demand for AI. Saul et al. (2025), analyzing 25,000 Locational Marginal Pricing (LMP) nodes across the U.S. grid, found that 75% of LMPs within 50 miles of data centers experienced electricity price increases between 2020 and 2025. In contrast, they reported that nodes experiencing electricity price declines tended to be located farther from data centers. Electricityrates.com (2025) reported that the costs of new power infrastructure needed to supply data centers are spread across all consumers.
This paper argues that current rate-setting practices must change. Rather than having regulators investigate Big Tech's energy needs and decide at hearings what the grid should do, AI companies should be required to obtain their own power, transmission, and backup. Public utilities would then not have to confront the lack of transparency surrounding AI's energy use. If data centers bore their own costs, ratepayers would be protected from subsidizing Big Tech, the grid would be spared mushrooming data center demand, and AI companies would have an incentive to economize on energy use.
Soaring data center demand also forces utilities to prop up gas-fired and other fossil fuel-powered plants, including their oldest and dirtiest generating units. For instance, Martin and Peskoe (2025) documented how the Mississippi Power Company in 2025, to meet surging demand, propped up a coal plant that it had planned to retire. Utility companies have an incentive to meet the extra demand not by innovating but by increasing the use of gas-fired and other fossil fuel-powered plants. As Hyman and Tilles (2025) noted, the present approach delivers more electrification for AI without decarbonization.
One way to incentivize Big Tech to limit the environmental costs of AI is to demand greater transparency concerning their energy consumption. The leading companies have all committed to reaching net zero. Stakeholders should hold them accountable by demanding more information about their environmental footprints. This would activate what Bhagwati (1988, page 85) labeled the Dracula Effect: "exposing evil to sunlight helps to destroy it." Peer and community pressure could then goad tech firms to be more ecologically responsible.
This paper also considers innovations that reduce data center energy demand. Greater visibility at the transistor level could reduce voltage overmargining. Moving away from the assumption that bigger AI models are better could reduce energy requirements. Placing dielectric fluids close to processing elements or inside packages could improve cooling efficiency. Powering data racks with 800 V distribution systems rather than 48 V systems could reduce current and the concomitant heat loss. Recycling energy within circuits by transforming information in reversible ways could reduce heat loss. Tailoring chips to the tasks at hand could save energy compared to using general-purpose chips. Diffusion caching and low-precision inference could reduce the power required to generate videos.
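The gain from higher-voltage rack distribution follows from basic circuit physics: conduction heat loss is I²R, and at a fixed power draw the current falls in proportion to the voltage. The sketch below illustrates this with assumed numbers (a hypothetical 100 kW rack and a 1 milliohm distribution path, not figures from the paper):

```python
# Illustrative sketch: why 800 V rack distribution loses less energy to heat
# than 48 V at the same delivered power. Conduction loss is P_loss = I^2 * R,
# and I = P / V, so raising V from 48 to 800 cuts current by a factor of
# 800/48 (about 16.7x) and cuts I^2*R heat loss by (800/48)^2 (about 278x).

def conduction_loss_watts(power_w: float, volts: float, resistance_ohm: float) -> float:
    """I^2 * R heat loss when delivering power_w at volts through resistance_ohm."""
    current_a = power_w / volts
    return current_a ** 2 * resistance_ohm

RACK_POWER_W = 100_000       # assumed 100 kW AI rack (hypothetical)
PATH_RESISTANCE_OHM = 0.001  # assumed 1 milliohm distribution path (hypothetical)

loss_48v = conduction_loss_watts(RACK_POWER_W, 48.0, PATH_RESISTANCE_OHM)
loss_800v = conduction_loss_watts(RACK_POWER_W, 800.0, PATH_RESISTANCE_OHM)

print(f"48 V loss:  {loss_48v:,.0f} W")   # ~4,340 W lost as heat
print(f"800 V loss: {loss_800v:,.1f} W")  # ~15.6 W lost as heat
print(f"reduction:  {loss_48v / loss_800v:.0f}x")
```

The quadratic dependence on current is why the loss ratio is the square of the voltage ratio, not merely proportional to it.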
AI can also be used to reduce energy costs. For data centers it can help devise ways to reduce heat, save energy associated with voltage step-downs, choose smaller inference models, reduce energy wastage along cables between racks and servers, and conserve water. Palladino (2025) also noted that AI can reduce energy waste in the production of batteries, steel, glass, hydrogen, ammonia, copper, and other goods. AI can thus promote sustainability.
While industry, government, universities and others pursue these innovations, they should not forget Jevons' Paradox. William Stanley Jevons observed that, as coal use in the 19th century became cheaper and more efficient, coal consumption actually increased. Data center electricity demand is exploding and multiplying CO2 emissions. As data center energy efficiency increases, society must ensure that increased AI use does not harm the environment even more.
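The mechanism behind Jevons' Paradox can be made concrete with a simple rebound calculation. The sketch below uses hypothetical numbers (they are not from the paper): if an efficiency gain halves the energy cost per AI query and demand for queries responds with a price elasticity greater than one, total energy use rises despite the gain.

```python
# Illustrative sketch of Jevons' Paradox as a demand rebound effect,
# with assumed numbers. A constant-elasticity demand curve: when the
# energy cost per query falls, query volume grows by (cost ratio)^(-elasticity).

def total_energy(base_queries: float, wh_per_query: float,
                 elasticity: float, efficiency_gain: float) -> float:
    """Total energy (Wh) after an efficiency gain, under constant-elasticity demand."""
    new_wh_per_query = wh_per_query * (1 - efficiency_gain)
    cost_ratio = new_wh_per_query / wh_per_query
    new_queries = base_queries * cost_ratio ** (-elasticity)  # cheaper -> more queries
    return new_queries * new_wh_per_query

# Hypothetical baseline: 1M queries at 0.3 Wh each = 300,000 Wh.
before = 1_000_000 * 0.3
# A 50% efficiency gain with demand elasticity 1.5 (assumed):
after = total_energy(1_000_000, 0.3, elasticity=1.5, efficiency_gain=0.5)

print(f"before: {before:,.0f} Wh")  # 300,000 Wh
print(f"after:  {after:,.0f} Wh")   # ~424,264 Wh: usage rose despite efficiency
```

With elasticity below one, the same efficiency gain would reduce total energy use; the paradox bites precisely when demand is highly responsive, as AI demand currently appears to be.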
References
- Bhagwati, J.N. 1988. Protectionism. Cambridge: MIT Press.
- Electricityrates.com. 2025. Rising Energy Costs From Data Centers: Who Pays the Price? Weblog, 25 August. Available at: https://electricityrates.com/resources/rising-energy-costs/
- Elsworth, C., Huang, K., Patterson, D., Schneider, I., Sedivy, R., Goodman, S., Townsend, B., Ranganathan, P., Dean, J., Vahdat, A., Gomes, B., and Manyika, J. 2025. Measuring the Environmental Impact of Delivering AI at Google Scale. Google Working Paper. Available at: https://arxiv.org/pdf/2508.15734
- Federal Reserve System. 2025. Beige Book. August. Available at: https://www.federalreserve.gov/monetarypolicy/files/BeigeBook_20250903.pdf
- Hyman, L., and Tilles, W. 2025. The Hidden Cost of Electrification in the United States. Oilprice.com Weblog, 13 October. Available at: https://oilprice.com/Energy/Energy-General/The-Hidden-Cost-of-Electrification-in-the-United-States.html
- Martin, E., and Peskoe, A. 2025. Extracting Profits from the Public: How Utility Ratepayers Are Paying for Big Tech’s Power. Harvard Law School Working Paper. Available at: https://eelp.law.harvard.edu/wp-content/uploads/2025/03/Harvard-ELI-Extracting-Profits-from-the-Public.pdf
- Palladino, C. 2025. How AI Might Save More Energy than it Soaks Up. Financial Times, 17 August.
- Rapier, R. 2025. The Real Reasons Your Power Bill Is Exploding. Oilprice.com Weblog, 22 August. Available at: https://oilprice.com/Energy/Energy-General/The-Real-Reasons-Your-Power-Bill-Is-Exploding.html
- Saul, J., Nicoletti, L., Pogkas, D., Bass, D., and Malik, N. 2025. AI Data Centers Are Sending Power Bills Soaring. Bloomberg, 30 September.
- Singh, S. 2025. Latest ChatGPT Users Stats 2025 (Growth & Usage Report). Demandsage Weblog, 7 October. Available at: https://www.demandsage.com/chatgpt-statistics/