Google’s Post

Google reposted this

James Manyika

SVP, Google-Alphabet

There are a number of important questions on AI and energy that we’re working on at Google: How do we use AI to increase energy efficiency? How do we leverage AI to help address environmental and climate challenges? Can those approaches and their beneficial impact be scaled globally? At the same time, a key question is how much energy generative AI tools and apps need, and whether that amount can be reduced.

To help answer this question, my colleagues and I did the math on the energy, carbon, and water footprint per Gemini prompt. This is an area where the state of the art is rapidly evolving, but we believe a clear understanding is important to enable us and others to continue to improve efficiency.

Here’s a quick summary: we estimate the median Gemini Apps text prompt uses 0.24 watt-hours (Wh) of energy, emits 0.03 grams of carbon dioxide equivalent (gCO2e), and consumes 0.26 milliliters (about five drops) of water. One way to put the per-prompt energy impact in perspective: it’s equivalent to watching TV for less than nine seconds.

We’re sharing our findings and methodology in our Technical Paper (link below), aiming to encourage industry-wide progress on more comprehensive reporting of AI’s energy use.

We’ve been making rapid, ongoing progress on the efficiency of our AI systems. Over a period of 12 months, as we continued to deliver higher-quality responses using more capable models, the median energy consumption and carbon footprint per Gemini Apps text prompt decreased by factors of 33x and 44x, respectively. All of this is thanks to our full-stack approach to AI development, our latest data-center efficiencies, decades of experience serving software at scale (details in the Technical Paper), and the work of many teams at Google. To be clear, there is much more still to do.
We’re continuing to improve efficiency at every layer of our operations (hardware, software, models, data centers), and we’re continuing to invest in energy infrastructure, grid improvements, and clean energy (both existing technologies and next-gen initiatives such as geothermal and nuclear).

Thanks to my co-authors on this study: Cooper Elsworth, Tim Huang, Dave Patterson, Ian Schneider, Robert Sedivy, Savannah Goodman, Ben Townsend, Partha Ranganathan, Jeff Dean, Amin Vahdat, and Ben Gomes.

If you’d like to learn more, here are some resources:
Blog post on our approach to energy innovation and AI: https://lnkd.in/gh8HX7zR
Blog post on the math behind the numbers we shared today: https://lnkd.in/gQgq-Eir
Technical Paper: https://lnkd.in/g8T88xMf
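A quick back-of-the-envelope check on the equivalences in the post. This is a minimal sketch: the ~100 W TV power draw and the ~0.05 mL drop volume (the common 20-drops-per-mL convention) are my assumptions, not figures from the post; only the 0.24 Wh and 0.26 mL values come from the summary above.

```python
# Sanity-check the per-prompt equivalences quoted in the post.
# Assumed (not from the post): TV draws ~100 W; one drop is ~0.05 mL.

ENERGY_WH = 0.24    # median energy per Gemini Apps text prompt (from post)
WATER_ML = 0.26     # median water per prompt (from post)
TV_POWER_W = 100.0  # assumed TV power draw
DROP_ML = 0.05      # assumed volume of one water drop

# Energy = power x time, so time = energy / power (x 3600 to get seconds).
tv_seconds = ENERGY_WH / TV_POWER_W * 3600.0

# How many assumed-size drops make up 0.26 mL.
drops = WATER_ML / DROP_ML

print(f"TV equivalent: {tv_seconds:.2f} s")   # ~8.64 s, i.e. "less than nine seconds"
print(f"Water: {drops:.1f} drops")            # ~5.2, i.e. "about five drops"
```

Under these assumptions the arithmetic matches the post's framing; a more or less efficient TV would shift the seconds figure proportionally.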

prakash meena

Sophomore at IIITD | CSE Student | Learning DSA | Passionate About Tech & Problem Solving

10h

 Big respect to Google for leading with transparency and responsibility in AI efficiency 👏. Truly inspiring work toward a sustainable, scalable AI future 🚀🙌

Annabel Chua

Global Commodity Lead (MRO) at GlobalFoundries

1h

Insightful and thanks for sharing, James!

Nicole M.

Strategic Communications Executive | Architect of Narrative, Reputation & Public Trust in Emerging Tech | Meta + Google Alum | Driving Change at the Intersection of Innovation, Engagement & Media Influence

7h

This level of transparency on AI’s energy footprint is market-shaping; thanks for sharing! These metrics begin setting baselines for trust, policy, and how fast responsible adoption can scale.

Patrick Lin

Director of the Ethics + Emerging Sciences Group (Cal Poly)

5h

Interesting work, James! I wonder if it's possible to add an ENERGY-COST COUNTER to Google Gemini and other LLMs for each prompt? Since so many folks are clearly concerned about AI's environmental impact, if your research is showing that's overblown, then it may lessen those concerns to see the energy cost per interaction. Of course, if an LLM developer has a goal of encouraging more or unlimited use of their services, then I can see why they won't want to reveal costs to users. That can deter some/many users from using the LLM more. But maybe that's not a bad thing, if the energy costs become significant and overuse is possible? Seems important for people to be aware of the true cost of things, whether it's LLMs, plastic straws, credit card debt, or whatever...

Imanathi Ncube

Founder @InkubitTech | Co-Founder @Ryntek.AI | Computer Science Student

15h

Appreciate your team’s work on efficient large-model compute. One question I’m exploring: how do you handle the big diurnal swings in energy availability and request volume for Gemini without compromising cost, latency, or reliability?

Vivi Cediel

Head of Google Research Partnerships | AI for Good: Climate, Health, Education | AI for Creativity | AI literacy | Responsible AI | AI Governance | Purpose-Driven Leader | Building Amazing Teams to Scale Impact

16h

Thanks for sharing, James. Very insightful.

Felix Nkama

I help busy CEOs handle their admin tasks, enabling them to focus fully on growth, as an Administrative Virtual Assistant || Transcriber || Data Entry || Audio Data Annotator || Specializing in Workflow Optimization.

15h

Very impressive and well articulated! Efficiency is what lets users enjoy AI tools without complaints or difficulties. Thanks for sharing, James Manyika.

Thank you for sharing, James.

Joshua Woodruff

CEO & Founder, MassiveScale.AI | AI Transformation Expert | Author | CSA Research Fellow | IANS Faculty

10h

Currently designing an AI agent system for sales, I’m looking at energy per completed task, because an efficient failure is still a failure. I’m seeing the best results with narrow agents that nail it on the first try. I’d love to hear more about case studies measuring total energy cost per successful outcome.

Vlad Lukashonok

Software Engineer | Full-Stack | Yii2 | Angular | Golang | CI/CD | 5+ years

8h

Interesting stats! It would be great to see a comparison table with other models.
