Thousands flocked to Munich this week for a major gathering - not Oktoberfest, but GTC Europe. The conference, now in its third year, is a celebration of groundbreaking GPU-accelerated work across the region. Nearly 300 developers, startups and researchers took the stage, sharing innovative projects. Among them were some of the major science centers in Europe, spanning fields as diverse as particle physics, climate research and neuroscience.
Understanding the Universe
Técnico Lisboa, Portugal
Nuclear energy today is generated through nuclear fission: the process of splitting apart an atom's nucleus, which creates both usable energy and radioactive waste. A cleaner and more powerful alternative is nuclear fusion, the joining together of two nuclei.
But so far, scientists haven't been able to sustain a nuclear fusion reaction long enough to harness its energy. Using deep learning algorithms and a Tesla P100 GPU, university researchers at Portugal's Técnico Lisboa are studying the shape and behavior of the plasma inside a fusion reactor.
Gaining insight into the factors at play during nuclear fusion is essential for physicists. If researchers are able to predict when a reaction is about to be disrupted, they could take preventive action to prolong the reaction until enormous amounts of energy can be captured.
GPUs are essential to make these neural network inferences in real time during a fusion reaction. The deep learning models currently predict disruption with 85 percent accuracy, matching state-of-the-art systems. By adding more probes that collect measurements within the reactor, and using a multi-GPU system, the researchers can reach even higher accuracy levels.
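As a rough illustration of what this kind of real-time inference involves, here is a minimal JAX sketch that scores one snapshot of probe measurements with a small neural network. The network shape, parameter names and probe count are assumptions made for the example; this is not the Técnico Lisboa model.

```python
# Minimal sketch of real-time disruption scoring from reactor probe signals.
# Network shape, parameter names and probe count are illustrative only.
import jax
import jax.numpy as jnp

def init_params(key, n_probes=64, hidden=128):
    k1, k2 = jax.random.split(key)
    w1 = jax.random.normal(k1, (n_probes, hidden)) * 0.01
    w2 = jax.random.normal(k2, (hidden, 1)) * 0.01
    return {"w1": w1, "b1": jnp.zeros(hidden), "w2": w2, "b2": jnp.zeros(1)}

@jax.jit  # compiled once; repeated calls run on the GPU with low latency
def disruption_score(params, probes):
    h = jax.nn.relu(probes @ params["w1"] + params["b1"])
    logit = h @ params["w2"] + params["b2"]
    return jax.nn.sigmoid(logit)  # probability that a disruption is imminent

params = init_params(jax.random.PRNGKey(0))
probes = jnp.zeros(64)                # one snapshot of probe measurements
print(disruption_score(params, probes))
```

JIT-compiling the scoring function keeps per-call latency low and predictable, which is what inference during a live fusion reaction demands.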
European Organization for Nuclear Research, Switzerland
Physicists have long been in search of a theory of everything, a mathematical model that works in every case, even at velocities approaching the speed of light. CERN, the European Organization for Nuclear Research, is a major center for this research.
Best known in recent years for the discovery of the Higgs boson, often called the God particle, the organization uses a machine called the Large Hadron Collider to create collisions between subatomic particles.
The researchers use software first to simulate the interactions they expect to see in the collision, and then to compare the real collision with the original simulation. These experiments require a system that can handle five terabytes of data per second.
"We are working to speed up our software and improve its accuracy, to face at best the challenges of the next Large Hadron Collider phase," said CERN researcher Andrea Bocci. "We are exploring the use of GPUs to accelerate our algorithms and to integrate fast inference of deep learning models in our next-generation real-time data processing system."
Using GPUs will allow CERN to raise the bar for detailed and highly accurate analysis, while running up to 100x faster.
Observatoire de Paris, France
Astronomers use large telescopes to get a closer look at the universe from Earth, scanning the skies for planets outside the solar system. But our planet's atmosphere is turbulent, distorting the images collected by ground-based telescopes.
To counteract this distortion and get a clear picture of stars and exoplanets, large ground-based telescopes use deformable mirrors that can change shape in real time.
A combination of high-performance linear algebra and machine learning algorithms determines how the mirrors must deform to correct for the atmospheric distortion, but it must run extremely quickly, since the distortion changes constantly. The algorithms must predict what the distortion will be at the exact moment the deformable mirrors shift to the corrected shape.
Researchers at the Observatoire de Paris, in collaboration with Subaru Telescope and KAUST ECRC, are using NVIDIA DGX-1 AI supercomputers to run the algorithms' inference at the multi-kHz frame rates required.
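The per-frame computation can be pictured as one large matrix-vector product followed by a temporal filter. The JAX sketch below assumes a precomputed command matrix, illustrative sensor and actuator counts, and a simple leaky integrator; it is a conceptual example, not the Observatoire de Paris pipeline.

```python
# Sketch of a per-frame adaptive-optics control step: a command matrix maps
# wavefront-sensor slopes to deformable-mirror actuator commands. Dimensions
# and the leaky-integrator filter are illustrative assumptions.
import jax
import jax.numpy as jnp

N_SLOPES, N_ACTUATORS = 9600, 5000           # assumed sensor/mirror sizes
key = jax.random.PRNGKey(0)
command_matrix = jax.random.normal(key, (N_ACTUATORS, N_SLOPES)) * 1e-3

@jax.jit  # compiled once; each call is essentially one GPU matrix-vector product
def control_step(prev_commands, slopes, gain=0.5, leak=0.99):
    correction = command_matrix @ slopes      # linear wavefront reconstruction
    return leak * prev_commands - gain * correction

commands = jnp.zeros(N_ACTUATORS)
slopes = jnp.zeros(N_SLOPES)                  # one frame of sensor measurements
commands = control_step(commands, slopes)     # repeated at multi-kHz frame rates
```

Because the whole step reduces to dense linear algebra, it maps naturally onto GPUs, which is what makes the multi-kHz loop feasible.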
Climate Modeling and Natural Disaster Response
Swiss National Supercomputing Center, Switzerland
A new report from the Intergovernmental Panel on Climate Change warns of dire effects if global temperature rises beyond 1.5 degrees Celsius above pre-industrial levels. To make these forecasts, researchers rely on climate models.
Climate models that look years into the future and cover large geographical areas are computationally intensive, so scientists often run them at low resolution, with grid cells spanning multiple square kilometers.
That means climate models have to approximate the impact clouds, sometimes just a few hundred meters across, can have on global temperature. Clouds are a key element of the weather system, so this approximation has a significant effect on a climate model's results.
To better account for clouds in its climate model, the Swiss National Supercomputing Center runs the model on Piz Daint, the fastest supercomputer in Europe. The system is loaded with more than 5,000 Tesla P100 GPUs. Piz Daint allows the center's researchers to run a global climate model at a resolution of one square kilometer, resolving the cloud problem and paving the way for accurate climate modeling to understand the effects of global warming.
DFKI's research analyzes both satellite images and visuals shared on social media to gauge the impact of natural disasters like wildfires.
German Research Centre for Artificial Intelligence, Germany
DFKI, the leading German AI research center, is using deep learning to estimate the damages of natural disasters like floods and wildfires. Its researchers use satellite imagery data as well as multimedia posted on social networks to identify and gauge the scope of crises.
Time is critical for any assessment in a crisis.