Google says its Gemini AI model now processes prompts with far less energy, comparing each request to the cost of watching only nine seconds of television. The company revealed these findings in a new technical paper, stating that Gemini's energy use per prompt has fallen 33-fold and its carbon footprint 44-fold since 2024.
What’s Happening & Why This Matters
Measuring AI Prompts’ Energy
According to Google, running one text prompt through Gemini consumes about 0.24 watt-hours of energy. That equals 0.03 grams of carbon dioxide and roughly five drops of water used to cool data-centre equipment. By comparison, an OpenAI ChatGPT query consumes nearly 3 watt-hours, while a traditional search engine query uses about 0.3 watt-hours.
The company’s study factored in more than direct machine use. It included the idle power of AI chips, supporting IT equipment, and data centre cooling. Google argued that many external reports focus only on raw machine consumption, giving a less accurate picture of real-world operations.
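The "nine seconds of television" comparison can be sanity-checked with back-of-envelope arithmetic. The sketch below uses Google's reported 0.24 watt-hours per prompt; the 100-watt television draw is an assumption of this example, not a figure from the paper.

```python
# Back-of-envelope check of the "nine seconds of TV" comparison.
TV_POWER_W = 100          # assumed TV power draw (illustrative, not from Google's paper)
GEMINI_PROMPT_WH = 0.24   # per-prompt energy reported by Google

# Energy (Wh) divided by power (W) gives hours; multiply by 3600 for seconds.
seconds_of_tv = GEMINI_PROMPT_WH / TV_POWER_W * 3600
print(f"{seconds_of_tv:.1f} seconds of TV")  # ~8.6 seconds, close to Google's figure
```

At an assumed 100 W, one prompt works out to about 8.6 seconds of viewing, which is consistent with Google's rounded claim of nine.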

Global Energy Demands
The International Energy Agency (IEA) warns that AI energy consumption could double in the next five years, reaching 945 terawatt-hours annually, roughly Japan's total power use. Google admits its own emissions have risen 51% since 2019, driven by hardware manufacturing and supply-chain activity.
Still, executives claim Gemini’s per-prompt efficiency gains demonstrate progress. “Our numbers show what is possible at scale, not just in theory,” the company said in a blog post presenting the figures as an industry benchmark.
Missing Data and Comparisons
One gap in Google’s disclosure is scale. The company did not reveal how many prompts Gemini handles each day. That missing detail makes it difficult to calculate the total energy demand of its AI systems. Experts argue that even small per-prompt costs can add up quickly when billions of queries run daily.
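Because Google has not disclosed Gemini's query volume, total demand can only be sketched under assumed volumes. The figures below are illustrative assumptions, not reported numbers; only the 0.24 Wh per-prompt cost comes from Google.

```python
# Hypothetical scaling of per-prompt energy to fleet-wide totals.
# Query volumes are illustrative assumptions; Google has not disclosed them.
GEMINI_PROMPT_WH = 0.24  # per-prompt energy reported by Google

for daily_queries in (100e6, 1e9, 10e9):
    daily_mwh = daily_queries * GEMINI_PROMPT_WH / 1e6  # Wh -> MWh per day
    annual_gwh = daily_mwh * 365 / 1000                 # MWh/day -> GWh per year
    print(f"{daily_queries:>14,.0f} queries/day -> "
          f"{daily_mwh:8.1f} MWh/day, {annual_gwh:7.1f} GWh/yr")
```

Even at an assumed one billion prompts a day, the total is about 240 MWh daily, or roughly 88 GWh a year, which illustrates why per-prompt figures alone cannot settle the question of overall impact.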
This energy focus comes as rivals like Microsoft and OpenAI face growing scrutiny over AI’s climate impact. Data centres worldwide already strain water supplies and grid systems. While Google’s improvements look strong on paper, the environmental implications remain tied to overall adoption rates and infrastructure demands.
TF Summary: What’s Next
Google’s claim that an AI prompt is equivalent to just nine seconds of TV may ease public concern, but questions remain about transparency and total consumption. As AI adoption expands, the company and its peers will need to provide clearer reporting on real-world energy costs. Expect regulators and environmental groups to push for disclosure of query volumes and broader supply chain impacts.
The efficiency improvements suggest progress. But the global conversation around AI’s energy footprint is only growing louder.