
Data centers need an upgraded dashboard to guide their journey to greater energy efficiency, one that shows progress running real-world applications.
The formula for energy efficiency is simple: work done divided by energy used. Applying it to data centers calls for unpacking some details.
Today's most widely used gauge - power usage effectiveness (PUE) - compares the total energy a facility consumes to the amount its computing infrastructure uses. Over the last 17 years, PUE has driven the most efficient operators closer to an ideal where almost no energy is wasted on processes like power conversion and cooling.
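Written out - with illustrative numbers, not measured values - the two formulas look like this:

    \[
      \text{efficiency} = \frac{\text{work done}}{\text{energy used}},
      \qquad
      \text{PUE} = \frac{E_{\text{total facility}}}{E_{\text{IT equipment}}}
    \]

A facility that consumes 12 gigawatt-hours in a year while its IT equipment uses 10 scores a PUE of 12/10 = 1.2; the extra 2 gigawatt-hours go to cooling, power conversion and other overhead. The ideal is a PUE of 1.0.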
Finding the Next Metrics
PUE served data centers well during the rise of cloud computing, and it will continue to be useful. But it's insufficient in today's generative AI era, when workloads and the systems running them have changed dramatically.
That's because PUE doesn't measure the useful output of a data center, only the energy that it consumes. That'd be like measuring the amount of gas an engine uses without noticing how far the car has gone.
Many standards exist for data center efficiency. A 2017 paper lists nearly three dozen of them, several focused on specific targets such as cooling, water use, security and cost.
Understanding What's Watts
When it comes to energy efficiency, the computer industry has a long and somewhat unfortunate history of describing systems and the processors they use in terms of power, typically in watts. It's a worthwhile metric, but many fail to realize that watts only measure input power at a point in time, not the actual energy computers use or how efficiently they use it.
So, when modern systems and processors report rising input power levels in watts, that doesn't mean they're less energy efficient. In fact, they're often much more efficient in the amount of work they do with the amount of energy they use.
Modern data center metrics should focus on energy, measured in kilowatt-hours or joules, and on how much useful work a data center does with that energy.
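A minimal sketch of the distinction, in Python with hypothetical numbers: a system that draws more watts can still consume less energy for the same job if it finishes sooner.

    # Hypothetical numbers for illustration only - not measurements.
    # Watts measure instantaneous input power; energy is power
    # integrated over time (kilowatt-hours or joules).

    def energy_kwh(power_watts: float, runtime_hours: float) -> float:
        """Energy consumed = average power (kW) x runtime (hours)."""
        return (power_watts / 1000.0) * runtime_hours

    # Two made-up systems completing the same fixed job:
    older = energy_kwh(power_watts=400.0, runtime_hours=10.0)  # 4.0 kWh
    newer = energy_kwh(power_watts=700.0, runtime_hours=2.0)   # 1.4 kWh

    # The newer system reports 1.75x the input power in watts, yet it
    # uses roughly 2.9x less energy to do the same work.
    print(f"older: {older:.1f} kWh, newer: {newer:.1f} kWh")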
Reworking What We Call Work
Here again, the industry has a practice of measuring in abstract terms, like processor instructions or math calculations. So, MIPS (millions of instructions per second) and FLOPS (floating point operations per second) are widely quoted.
Only computer scientists care how many of these low-level operations their systems can handle. Users would rather know how much real work their systems put out, but defining useful work can be subjective.
Data centers focused on AI may rely on the MLPerf benchmarks. Supercomputing centers tackling scientific research typically use additional measures of work. Commercial data centers focused on streaming media may want others.
The resulting suite of applications must be allowed to evolve over time to reflect the state of the art and the most relevant use cases. For example, the last MLPerf round added tests using two generative AI models that didn't even exist five years ago.
A Gauge for Accelerated Computing
Ideally, any new benchmarks should measure advances in accelerated computing. This combination of parallel processing hardware, software and methods is running applications dramatically faster and more efficiently than CPUs across many modern workloads.
For example, on scientific applications, the Perlmutter supercomputer at the National Energy Research Scientific Computing Center demonstrated an average 5x gain in energy efficiency using accelerated computing. That's why Perlmutter is among the 39 of the top 50 supercomputers on the Green500 list - including the No. 1 system - that use NVIDIA GPUs.
Because they execute many tasks in parallel, GPUs complete more work in less time than CPUs, saving energy. Companies across many industries report similar results. PayPal, for example, improved real-time fraud detection by 10% and cut server energy consumption nearly 8x with accelerated computing.
The gains are growing with each new generation of GPU hardware and software.
In a recent report, Stanford University's Human-Centered AI group estimated that GPU performance has increased roughly 7,000 times since 2003, and that performance per dollar is 5,600 times greater.
Data centers need a suite of benchmarks to track energy efficiency across their major workloads.

Two Experts Weigh In
Experts see the need for a new energy-efficiency metric, too.
"With today's data centers achieving scores around 1.2 PUE, the metric has run its course," said Christian Belady, a data center engineer who had the original idea for PUE. "It improved data center efficiency when things were bad, but two decades later, they're better, and we need to focus on other metrics more relevant to today's problems."
"Looking forward, the holy grail is a performance metric. You can't compare different workloads directly, but if you segment by workloads, I think there is a better likelihood for success," said Belady, who continues to work on initiatives driving data center sustainability.
Jonathan Koomey, a researcher and author on computer efficiency and sustainability, agreed.
"To make good decisions about efficiency, data center operators need a suite of benchmarks that measure the energy implications of today's most widely used AI workloads," said Koomey.
"Tokens per joule is a great example of what one element of such a suite might be," Koomey added. "Companies will need to engage in open discussions, share information on the nuances of their own workloads and experiments, and agree to realistic test procedures to ensure these metrics accurately characterize energy use for hardware running real-world applications."
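As a sketch of how a tokens-per-joule measurement might be computed - the workload, power draw and token counts below are invented placeholders, not benchmark results:

    # Hypothetical example of the tokens-per-joule idea; all figures
    # are illustrative, not measured values.

    def tokens_per_joule(tokens_generated: int, energy_joules: float) -> float:
        """Useful work (tokens produced) divided by energy used (joules)."""
        return tokens_generated / energy_joules

    # Suppose a server generates 250,000 tokens while averaging 5 kW
    # over 100 seconds: energy = 5,000 W x 100 s = 500,000 J.
    energy_j = 5_000.0 * 100.0
    print(tokens_per_joule(250_000, energy_j))  # 0.5 tokens per joule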
"Finally, we need an open public forum to conduct this important work," he said.
It Takes a Village
Thanks to metrics like PUE an