Ian Mulvany

May 13, 2024

Two views on the future energy needs of LLMs


This interview in Vox paints a concerning picture of the future energy use of LLMs:
https://www.vox.com/climate/2024/3/28/24111721/ai-uses-a-lot-of-energy-experts-expect-it-to-double-in-just-a-few-years
It points to this research: https://www.technologyreview.com/2023/12/01/1084189/making-an-image-with-generative-ai-uses-as-much-energy-as-charging-your-phone/

In contrast, this post looking at the cost of inference shows a potentially more optimistic view: https://semaphore.substack.com/p/the-cost-of-reasoning-in-raw-intelligence

I think both can be true at the same time: the cost per inference is going to come down, but overall demand is going to go up.
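To make that arithmetic concrete, here is a toy sketch. All of the numbers below are invented for illustration, not taken from the linked pieces; the point is only that total energy is per-inference energy times inference volume, so a large efficiency gain can still be swamped by faster demand growth.

```python
# Toy illustration: a 10x efficiency gain vs. a 100x growth in demand.
# Every number here is a made-up assumption, not a figure from the
# Vox or Technology Review articles.

energy_per_inference_wh = 3.0    # assumed starting energy per inference (Wh)
efficiency_gain = 10             # assume per-inference energy falls 10x
demand_growth = 100              # assume inference volume grows 100x

baseline_inferences = 1_000_000  # hypothetical daily inference count

before_kwh = baseline_inferences * energy_per_inference_wh / 1000
after_kwh = (baseline_inferences * demand_growth) \
    * (energy_per_inference_wh / efficiency_gain) / 1000

print(f"before: {before_kwh:,.0f} kWh/day")  # 3,000 kWh/day
print(f"after:  {after_kwh:,.0f} kWh/day")   # 30,000 kWh/day, 10x higher
```

Under these assumed numbers, total energy use still rises tenfold even though each individual inference got ten times cheaper.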

As far as I know, data centre energy consumption remains in the single digits as a percentage of our overall energy consumption, but it is going to go up. It is also easy to reason about, because data centres are plugged directly into the grid, in contrast to many of our other energy needs, and it is easy to see how much of that energy use goes on fairly trivial, junky applications. Can the additional compute, and access to proto-intelligence, help move the needle on the wider sustainability challenges that we have, or will it just be an energetic distraction?





About Ian Mulvany

Hi, I'm Ian - I work on academic publishing systems. You can find out more about me at mulvany.net. I'm always interested in engaging with folk on these topics, so if you have made your way here, don't hesitate to reach out if there is anything you want to share, discuss, or ask for help with!