This came up in a conversation with a colleague today, and also at a round table conversation at a conference earlier this week.
I am mostly re-sharing links from Simon Willison's blog, but they are all worth reading.
MIT Tech Review:
https://www.technologyreview.com/2025/05/20/1116327/ai-energy-usage-climate-footprint-big-tech/
Conclusion:
- AI's energy footprint is vastly underestimated, poorly disclosed, and growing fast. Without immediate action, especially transparency and regulation, AI risks becoming one of the largest unsupervised drivers of carbon emissions and energy demand in the modern world. (However, future predictions of inference costs are very hard to make, and probably inaccurate.)
https://about.bnef.com/blog/liebreich-generative-ai-the-power-and-the-glory/
Conclusion:
AI companies need to invest in power sources, but read on to understand why!
https://www.sustainabilitybynumbers.com/p/carbon-footprint-chatgpt
Hannah Ritchie is an editor at Our World in Data.
Conclusions:
- The maximum energy cost of a single AI query is about three watt-hours, which is a trivial amount (a few seconds of running a microwave, a few seconds of a video call).
- One query might emit 2 to 3 grams of carbon: "if we're fretting over a few queries a day while having a beef burger for dinner, heating our homes with a gas boiler, and driving a petrol car, we will get nowhere."
- Energy use might be overestimated by a factor of 10.
- Data centres use around 1 to 2% of the world's electricity; when cryptocurrency is included, it's around 2%. (This contrasts with the MIT Technology Review report above.)
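To make the "few seconds of a microwave" comparison concrete, here is a quick back-of-envelope sketch. The 3 Wh per-query figure is the upper-bound estimate from the article; the microwave wattage is an assumed typical value, not taken from any of the sources.

```python
# Back-of-envelope: convert the ~3 Wh upper-bound per-query energy
# estimate into equivalent seconds of microwave use.
WH_PER_QUERY = 3        # watt-hours per AI query (upper-bound estimate)
MICROWAVE_WATTS = 1000  # assumed typical microwave power draw

joules_per_query = WH_PER_QUERY * 3600          # 1 Wh = 3,600 J
microwave_seconds = joules_per_query / MICROWAVE_WATTS

print(f"{joules_per_query} J per query ≈ {microwave_seconds:.1f} s of microwave use")
```

Roughly ten seconds of microwave use per query, consistent with "a few seconds" at higher microwave wattages.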
https://andymasley.substack.com/p/individual-ai-use-is-not-bad-for
Conclusions:
- 300 GPT queries use about 1 gallon of water, the same amount as 15 minutes of watching TV.
- One hamburger uses 660 gallons of water.
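The two water figures above imply a striking ratio, which is easy to check with a couple of lines of arithmetic using only the numbers quoted in the article:

```python
# Water-use arithmetic from the figures above.
QUERIES_PER_GALLON = 300   # ~300 GPT queries per gallon of water
HAMBURGER_GALLONS = 660    # water footprint of one hamburger

gallons_per_query = 1 / QUERIES_PER_GALLON
queries_per_hamburger = HAMBURGER_GALLONS * QUERIES_PER_GALLON

print(f"{gallons_per_query:.4f} gal per query; "
      f"one hamburger ≈ {queries_per_hamburger:,} queries")
```

That is, one hamburger's water footprint covers nearly 200,000 queries.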
These sources do agree that global AI usage currently consumes roughly as much energy as ~20K US households, which for context is about an additional 0.015% of overall US energy use, and a small fraction of the use of other digital services, such as Zoom or Netflix.
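As a sanity check on that 0.015% figure, note that ~20K households is roughly that share of all US households. The total US household count below is an assumed round figure, not from any of the articles:

```python
# Sanity check: ~20K households as a share of all US households.
AI_HOUSEHOLD_EQUIV = 20_000
US_HOUSEHOLDS = 131_000_000  # assumed rough US household count

share_pct = 100 * AI_HOUSEHOLD_EQUIV / US_HOUSEHOLDS
print(f"≈ {share_pct:.3f}% of US households' worth of energy")
```

The result lands right around 0.015%, matching the figure the sources quote.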
So ...
- Global use is increasing, but it's still relatively small compared to other types of digital consumption we have right now.
- Demand for AI use will be pushed more, but will it become ubiquitous?
- Will small embedded models on the edge prevail? If so, they could be powered more effectively via renewables.
- Diffusion models will likely take over, and their cost of inference will be dramatically lower than inference today.
- Global energy use will go up, so we need to push data centres toward the lowest possible footprint we can, but we need to do this anyway.
- There are significant local downsides to living next door to a data centre built in a region with a low level of consumer protection, but most people don't live in places like that.
- This means that the poor and disadvantaged will continue to get the brunt of things, as usual.