
It used to be that computing power trickled down from hulking supercomputers to the chips in our pockets.
Over the past 15 years, innovation has changed course: GPUs, born from gaming and scaled through accelerated computing, have surged upstream to remake supercomputing and carry the AI revolution to scientific computing's most rarefied systems.
JUPITER at Forschungszentrum Jülich is the emblem of this new era.
Not only is it among the most efficient supercomputers - producing 63.3 gigaflops per watt - but it's also a powerhouse for AI, delivering 116 AI exaflops, up from 92 at ISC High Performance 2025.
This is the flip in action. In 2019, nearly 70% of the TOP100 high-performance computing systems were CPU-only. Today, that number has plunged below 15%, with 88 of the TOP100 systems accelerated - and 80% of those powered by NVIDIA GPUs.
Across the broader TOP500, 388 systems (78%) now use NVIDIA technology, including 218 GPU-accelerated systems (up 34 systems year over year) and 362 systems connected by high-performance NVIDIA networking. The trend is unmistakable: accelerated computing has become the standard.
But the real revolution is in AI performance. With architectures like NVIDIA Hopper and Blackwell and systems like JUPITER, researchers now have access to orders of magnitude more AI compute than ever.
AI FLOPS have become the new yardstick, enabling breakthroughs in climate modeling, drug discovery and quantum simulation - problems that demand both scale and efficiency.
At SC16, years before today's generative AI wave, NVIDIA founder and CEO Jensen Huang saw what was coming. He predicted that AI would soon reshape the world's most powerful computing systems.
"Several years ago, deep learning came along, like Thor's hammer falling from the sky, and gave us an incredibly powerful tool to solve some of the most difficult problems in the world," Huang declared.
The math of computing power consumption had already made the shift to GPUs inevitable.
But it was the AI revolution, ignited by the NVIDIA CUDA-X computing platform built on those GPUs, that extended the capabilities of these machines dramatically.
Suddenly, supercomputers could deliver meaningful science at double precision (FP64) as well as at mixed precision (FP32, FP16) and even at ultra-efficient formats like INT8 and beyond - the backbone of modern AI.
This flexibility allowed researchers to stretch power budgets further than ever to run larger, more complex simulations and train deeper neural networks, all while maximizing performance per watt.
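To make the precision trade-off concrete, here is a minimal back-of-envelope sketch (not vendor data - just the standard bit widths of the formats named above) showing how many values fit in a fixed memory budget at each precision. The density gain is the first-order reason lower-precision formats stretch power budgets, before any hardware-level throughput advantages are counted.

```python
# Standard bit widths for the formats mentioned above.
BITS = {"FP64": 64, "FP32": 32, "FP16": 16, "INT8": 8}
BUDGET_BYTES = 1 << 30  # a fixed 1 GiB budget, chosen arbitrarily

for fmt, bits in BITS.items():
    # How many values of this format fit in the budget.
    values = BUDGET_BYTES // (bits // 8)
    print(f"{fmt}: {values / 1e9:.2f} billion values")
# INT8 packs 8x as many values as FP64 into the same memory.
```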
But even before AI took hold, the raw numbers had already forced the issue. Power budgets don't negotiate. Supercomputer researchers - inside NVIDIA and across the community - were coming to grips with the road ahead, and it was paved with GPUs.
To reach exascale without a Hoover Dam-sized electric bill, researchers needed acceleration. GPUs delivered far more operations per watt than CPUs. That was the pre-AI tell of what was to come - and why, when the AI boom hit, large-scale GPU systems already had momentum.
The seeds were planted with Titan in 2012 at Oak Ridge National Laboratory, one of the first major U.S. systems to pair CPUs with GPUs at unprecedented scale - showing how hierarchical parallelism could unlock huge application gains.
In Europe in 2013, Piz Daint set a new bar for both performance and efficiency, then proved the point where it matters: in real applications like the COSMO numerical weather prediction model.
By 2017, the inflection was undeniable. Summit at Oak Ridge National Laboratory and Sierra at Lawrence Livermore National Laboratory ushered in a new standard for leadership-class systems: acceleration first. They didn't just run faster; they changed the questions science could ask in climate modeling, genomics, materials science and more.
These systems do much more with much less. On the Green500 list of the most efficient supercomputers, the top eight are NVIDIA-accelerated, with NVIDIA Quantum InfiniBand connecting seven of the top 10.
But the story behind these headline numbers is how AI capability has become the yardstick: JUPITER delivers 116 AI exaflops alongside 1 exaflop of FP64 - a clear signal of how science now blends simulation and AI.
Power efficiency didn't just make exascale attainable; it made AI at exascale practical. And once science had AI at scale, the curve bent sharply upward.
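The figures cited above let us sketch why efficiency and exascale go together. This is only a rough illustration - the Green500 number is measured on a benchmark run, so actual facility draw differs - but dividing JUPITER's ~1 exaflop of FP64 by its 63.3 gigaflops per watt gives a feel for the power such a machine implies.

```python
# Back-of-envelope estimate from the figures cited in the article.
GFLOPS_PER_WATT = 63.3        # JUPITER's Green500 efficiency
FP64_EXAFLOPS = 1.0           # ~1 EF of FP64 performance

total_gflops = FP64_EXAFLOPS * 1e9   # 1 exaflop = 1e9 gigaflops
watts = total_gflops / GFLOPS_PER_WATT
print(f"Implied power at that efficiency: ~{watts / 1e6:.1f} MW")
# At half the efficiency, the same exaflop would need twice the power -
# which is why performance per watt, not peak flops, gates exascale.
```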
What It Means Next
This isn't just about benchmarks. It's about real science:
Faster, more accurate weather and climate models
Breakthroughs in drug discovery and genomics
Simulations of fusion reactors and quantum systems
New frontiers in AI-driven research across every discipline
The shift started as a power-efficiency imperative, became an architectural advantage and has matured into a scientific superpower: simulation and AI, together, at unprecedented scale.
It starts with scientific computing. Now, the rest of computing will follow.