
Data centers need an upgraded dashboard to guide their journey to greater energy efficiency, one that shows progress running real-world applications.
The formula for energy efficiency is simple: work done divided by energy used. Applying it to data centers calls for unpacking some details.
Today's most widely used gauge - power usage effectiveness (PUE) - compares the total energy a facility consumes to the amount its computing infrastructure uses. Over the last 17 years, PUE has driven the most efficient operators closer to an ideal where almost no energy is wasted on processes like power conversion and cooling.
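The PUE calculation itself is simple division. A minimal sketch (the function name and the example figures are hypothetical, chosen only to illustrate the ratio):

```python
def pue(total_facility_energy_kwh: float, it_equipment_energy_kwh: float) -> float:
    """Power usage effectiveness: total facility energy divided by IT energy.

    A PUE of 1.0 is the ideal, meaning every kilowatt-hour reaches the
    computing equipment; anything above 1.0 is overhead such as cooling
    and power conversion.
    """
    return total_facility_energy_kwh / it_equipment_energy_kwh

# Hypothetical facility: 1,200 kWh drawn in total, 1,000 kWh used by IT gear
print(pue(1200, 1000))  # -> 1.2
```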
Finding the Next Metrics
PUE served data centers well during the rise of cloud computing, and it will continue to be useful. But it's insufficient in today's generative AI era, when workloads and the systems running them have changed dramatically.
That's because PUE doesn't measure the useful output of a data center, only the energy that it consumes. That'd be like measuring the amount of gas an engine uses without noticing how far the car has gone.
Many standards exist for data center efficiency. A 2017 paper lists nearly three dozen of them, several focused on specific targets such as cooling, water use, security and cost.
Understanding What's Watts
When it comes to energy efficiency, the computer industry has a long and somewhat unfortunate history of describing systems and the processors they use in terms of power, typically in watts. It's a worthwhile metric, but many fail to realize that watts measure only input power at a point in time, not the actual energy computers use or how efficiently they use it.
So, when modern systems and processors report rising input power levels in watts, that doesn't mean they're less energy efficient. In fact, they're often much more efficient in the amount of work they do with the amount of energy they use.
Modern data center metrics should focus on energy, which engineers measure in kilowatt-hours or joules. The key is how much useful work a system does with that energy.
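The distinction between power and energy is the whole point: energy is power integrated over time, and a system drawing more watts can still consume less energy if it finishes the work sooner. A minimal sketch, with entirely hypothetical figures:

```python
def energy_kwh(power_watts: float, hours: float) -> float:
    """Energy consumed (kWh) = average power (kW) x time (h)."""
    return power_watts / 1000 * hours

# Hypothetical comparison of two systems completing the same job:
# System A draws 400 W and takes 10 hours.
# System B draws 700 W and takes 3 hours.
a = energy_kwh(400, 10)  # ~4.0 kWh
b = energy_kwh(700, 3)   # ~2.1 kWh

# B reports the higher input power, yet uses roughly half the energy
# for the same work -- the more energy-efficient system.
print(a, b)
```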
Reworking What We Call Work
Here again, the industry has a practice of measuring in abstract terms, like processor instructions or math calculations. So, MIPS (millions of instructions per second) and FLOPS (floating point operations per second) are widely quoted.
Only computer scientists care how many of these low-level jobs their system can handle. Users would prefer to know how much real work their systems put out, but defining useful work is somewhat subjective.
Data centers focused on AI may rely on the MLPerf benchmarks. Supercomputing centers tackling scientific research typically use additional measures of work. Commercial data centers focused on streaming media may want others.
The resulting suite of applications must be allowed to evolve over time to reflect the state of the art and the most relevant use cases. For example, the last MLPerf round added tests using two generative AI models that didn't even exist five years ago.
A Gauge for Accelerated Computing
Ideally, any new benchmarks should measure advances in accelerated computing. This combination of parallel processing hardware, software and methods is running applications dramatically faster and more efficiently than CPUs across many modern workloads.
For example, on scientific applications, the Perlmutter supercomputer at the National Energy Research Scientific Computing Center demonstrated an average 5x gain in energy efficiency using accelerated computing. That's one reason NVIDIA GPUs power 39 of the top 50 supercomputers on the Green500 list, including the No. 1 system.
Because they execute many tasks in parallel, GPUs complete more work in less time than CPUs, saving energy. Companies across many industries report similar results. For example, PayPal improved real-time fraud detection by 10% and cut server energy consumption nearly 8x with accelerated computing.
The gains are growing with each new generation of GPU hardware and software.
In a recent report, Stanford University's Human-Centered AI group estimated GPU performance has increased roughly 7,000x since 2003, and performance per dollar has improved about 5,600x.
Data centers need a suite of benchmarks to track energy efficiency across their major workloads.
Two Experts Weigh In
Experts see the need for a new energy-efficiency metric, too.
"With today's data centers achieving scores around 1.2 PUE, the metric has run its course," said Christian Belady, a data center engineer who had the original idea for PUE. "It improved data center efficiency when things were bad, but two decades later, they're better, and we need to focus on other metrics more relevant to today's problems."
"Looking forward, the holy grail is a performance metric. You can't compare different workloads directly, but if you segment by workloads, I think there is a better likelihood for success," said Belady, who continues to work on initiatives driving data center sustainability.
Jonathan Koomey, a researcher and author on computer efficiency and sustainability, agreed.
"To make good decisions about efficiency, data center operators need a suite of benchmarks that measure the energy implications of today's most widely used AI workloads," said Koomey.
"Tokens per joule is a great example of what one element of such a suite might be," Koomey added. "Companies will need to engage in open discussions, share information on the nuances of their own workloads and experiments, and agree to realistic test procedures to ensure these metrics accurately characterize energy use for hardware running real-world applications."
"Finally, we need an open public forum to conduct this important work," he said.
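A tokens-per-joule measurement of the kind Koomey describes could be sketched as follows; the function name and all figures here are hypothetical, illustrating only the shape of the metric:

```python
def tokens_per_joule(tokens_generated: int, energy_joules: float) -> float:
    """Useful AI work (tokens) per unit of energy (joules); higher is better."""
    return tokens_generated / energy_joules

# Hypothetical inference run: 1,000,000 tokens generated while the server
# draws an average of 5 kW for 200 seconds.
energy = 5000 * 200  # watts x seconds = 1,000,000 joules
print(tokens_per_joule(1_000_000, energy))  # -> 1.0
```

Comparing two systems on the same workload with this ratio captures exactly what PUE misses: the useful output per unit of energy, not just the energy drawn.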
It Takes a Village
Thanks to metrics like PUE an