7 January 2025 - working notes
AI is here to stay. And AI is famously compute-intensive, which means it’s energy-intensive, which means its carbon emissions are a worry. I think the emissions are likely to come from: initial training of the model; running the model (inference); data centre operations (mostly energy use for power and cooling); and lifecycle emissions associated with hardware production and disposal.
What’s being done to measure and reduce the emissions?
Stories and sources to read
AI’s emissions are about to skyrocket even further | MIT Technology Review - Dec24
- “For the 12 months ending August 2024, data centers were responsible for 105 million metric tons of CO2, accounting for 2.18% of national emissions (for comparison, domestic commercial airlines are responsible for about 131 million metric tons). About 4.59% of all the energy used in the US goes toward data centers, a figure that’s doubled since 2018.”
- Primarily about energy use in data centres, and links to this, which looks useful: Environmental Burden of United States Data Centers.
- NB: not all data centre use is AI, and US data centres are often located in parts of the country with dirtier electricity generation.
Carbon Emissions in the Tailpipe of Generative AI - Jun24
- “We argue that the field must reframe the scope of machine learning research and development to include carbon and other resource considerations across the lifecycle and supply chain, rather than setting these aside or allowing them to remain on the field’s margins.”
Notes from my own emissions measurement - Feb24:
- “I can’t find any emission numbers published by OpenAI for ChatGPT :( . Many have made estimates, eg this, which estimates 24,860kgCO2e per day for 13m users with 5 queries/day in Feb 2023, or this which estimated electricity use at 77,160kWh per day. Both of those numbers are for the whole system - they don’t reflect your organisation’s use of an LLM. If ChatGPT has 13m daily users, then my estimated use is 24.86t/13m users * working days/5 because I use it about once a week. (For an LLM that does measure carbon, see Hugging Face BLOOM.)”
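The back-of-envelope arithmetic in that quote can be made explicit. A minimal sketch, using only the figures from the Feb 2023 estimate cited above (24,860 kgCO2e/day, 13m daily users, 5 queries per user per day) plus my assumption of roughly one query a week for a light user:

```python
# Back-of-envelope per-user estimate from the system-wide Feb 2023 figures
# quoted above. All inputs are that estimate's assumptions, not measurements.

SYSTEM_KG_PER_DAY = 24_860        # kgCO2e/day for all of ChatGPT (cited estimate)
DAILY_USERS = 13_000_000          # assumed daily users
QUERIES_PER_USER_PER_DAY = 5      # assumed in the cited estimate

# Emissions attributable to a single query, in grams.
g_per_query = SYSTEM_KG_PER_DAY * 1000 / (DAILY_USERS * QUERIES_PER_USER_PER_DAY)

# For a light user making roughly one query a week:
my_queries_per_year = 52
my_g_per_year = g_per_query * my_queries_per_year

print(f"{g_per_query:.2f} gCO2e per query")
print(f"{my_g_per_year:.1f} gCO2e per year")
```

On these assumptions a query comes out under half a gram, and once-a-week use at around 20 gCO2e a year - tiny next to the system-wide total, which is the point of the quote: whole-system numbers don’t reflect your organisation’s use.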
AI’s Growing Carbon Footprint — State of the Planet - Jun23
AI’s carbon footprint is bigger than you think | MIT Technology Review - Dec23
A Computer Scientist Breaks Down Generative AI’s Hefty Carbon Footprint | Scientific American - May23
Nature: The carbon impact of artificial intelligence - Aug20
Company disclosures to look at
Google
Microsoft
OpenAI
- OpenAI - doesn’t disclose emissions but if you ask ChatGPT it will say “A single query to ChatGPT might emit 1–5 grams of CO₂, depending on data center efficiency and energy sources. For perspective, sending an email with a large attachment generates about 50 grams of CO₂”
- and “As of now, OpenAI has not publicly released a detailed carbon emissions report or a specific carbon reduction plan. […] training GPT-3 was estimated to emit approximately 552 metric tons of CO₂, equivalent to the annual emissions of 123 gasoline-powered passenger vehicles.” I’d love to see the detail on those numbers.
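That vehicle equivalence can at least be sanity-checked. A quick check assuming the conversion used the US EPA’s commonly cited average of about 4.6 tCO2 per passenger vehicle per year (my assumption - ChatGPT doesn’t say which figure it used):

```python
# Sanity check on the claimed "552 t of CO2 = 123 passenger vehicles" above.
# The per-vehicle figure is the US EPA's widely cited annual average; it is an
# assumption that this is the conversion behind ChatGPT's answer.

GPT3_TRAINING_TONNES = 552
EPA_TONNES_PER_VEHICLE_YEAR = 4.6

vehicle_years = GPT3_TRAINING_TONNES / EPA_TONNES_PER_VEHICLE_YEAR
print(f"{vehicle_years:.0f} vehicle-years")  # ~120, close to the quoted 123
```

120 vs 123 is close enough that the quoted equivalence looks plausible; the quoted figure implies a slightly lower per-vehicle number (~4.5 t/year), consistent with an older EPA average.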
Other big tech
- Meta, Apple, Amazon, Nvidia etc - to find
Approaches to documenting existing emissions of LLMs
Other/questions
- Will carbon-aware software let LLMs etc be trained during periods of lower-emission energy?
- If the AI action moves to inference, does that change the story? If the inference is at the edge, will we see consumer hardware companies update their Product Environmental Reports to include the emissions of expected AI workloads?
- Will AI drive adoption of renewables? If you squint, big tech are pushing renewable energy and doing nuclear energy deals because of their rapidly increasing data centre energy use, some of which is AI. But you do have to squint, because eg Google missed its emissions targets in 2024 thanks to AI/data centre investment and use.
- Water use is strongly correlated with energy use in AI, eg here
- More energy-efficient hardware/chips might reduce future emissions, eg Vaire and “reversible computing”.
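The carbon-aware-software question above could work roughly like this: a scheduler takes a forecast of grid carbon intensity and defers the training job to the cleanest contiguous window. A minimal sketch - the forecast numbers are invented for illustration; real figures would come from a service such as the UK Carbon Intensity API or Electricity Maps:

```python
# Sketch of carbon-aware scheduling: given an hourly forecast of grid carbon
# intensity (gCO2/kWh), find the contiguous window with the lowest average
# intensity in which to run a fixed-length training job.

def cleanest_window(forecast: list[float], hours_needed: int) -> int:
    """Return the start hour of the lowest-average-intensity window."""
    best_start, best_avg = 0, float("inf")
    for start in range(len(forecast) - hours_needed + 1):
        avg = sum(forecast[start:start + hours_needed]) / hours_needed
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start

# Hypothetical 24-hour forecast: cleaner overnight, dirtier at the evening peak.
forecast = [320, 300, 280, 250, 230, 210, 220, 260, 310, 350, 380, 400,
            390, 370, 360, 380, 420, 450, 430, 400, 370, 340, 330, 325]

start = cleanest_window(forecast, hours_needed=4)
print(f"Run the 4-hour job starting at hour {start}")  # hour 3 in this forecast
```

This only shifts the job in time; whether it shifts emissions depends on how much the grid’s intensity actually varies, which is why the answer differs so much by region.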