What is the metaverse? The metaverse is a shared virtual 3D world, or worlds, that are interactive, immersive and collaborative. Just as the physical universe is a collection of worlds connected in space, the metaverse can be thought of as a collection of worlds, too.
Massive online social games, like battle royale juggernaut Fortnite and user-created virtual worlds like Minecraft and Roblox, reflect some elements of the idea.
Video-conferencing tools, which link far-flung colleagues together amidst the global COVID pandemic, are another hint at what's to come.
But the vision laid out by Neal Stephenson's 1992 classic novel Snow Crash goes well beyond any single game or video-conferencing app.
"The metaverse will become a platform that's not tied to any one app or any single place - digital or real," explains Rev Lebaredian, vice president of simulation technology at NVIDIA.
And just as virtual places will be persistent, so will the objects and identities of those moving through them, allowing digital goods and identities to move from one virtual world to another, and even into our world, with augmented reality.
"Ultimately we're talking about creating another reality, another world, that's as rich as the real world," Lebaredian says.
Those ideas are already being put to work with NVIDIA Omniverse, which, simply put, is a platform for connecting 3D worlds into a shared virtual universe.
Omniverse is in use across a growing number of industries for projects such as design collaboration and creating digital twins, simulations of real-world buildings and factories.
BMW Group uses NVIDIA Omniverse to create a future factory, a perfect digital twin designed entirely digitally and simulated from beginning to end in NVIDIA Omniverse.

How NVIDIA Omniverse Creates, Connects Worlds Within the Metaverse

So how does Omniverse work? We can break it down into three big parts.
NVIDIA Omniverse weaves together the Universal Scene Description interchange framework invented by Pixar with technologies for modeling physics, materials and real-time path tracing. The first part is Omniverse Nucleus, a database engine that connects users and enables the interchange of 3D assets and scene descriptions.
Once connected, designers doing modeling, layout, shading, animation, lighting, special effects or rendering can collaborate to create a scene.
Omniverse Nucleus relies on USD, or Universal Scene Description, an interchange framework invented by Pixar in 2012.
Released as open-source software in 2016, USD provides a rich, common language for defining, packaging, assembling and editing 3D data for a growing array of industries and applications.
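USD scenes can be authored in a human-readable text form (.usda), which gives a feel for that common language. A minimal, hypothetical scene - the prim names and referenced file below are illustrative, not taken from any real project - might look like this:

```usda
#usda 1.0
(
    defaultPrim = "Factory"
)

def Xform "Factory"
{
    # Pull in geometry authored in a separate file, then position it.
    def "ConveyorBelt" (
        references = @./conveyor.usda@
    )
    {
        double3 xformOp:translate = (0, 0, 2.5)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }
}
```

Because USD layers compose non-destructively, several artists can contribute overrides to the same scene without overwriting one another's work.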
Lebaredian and others say USD is to the emerging metaverse what hypertext markup language, or HTML, was to the web - a common language that can be used, and advanced, to support the metaverse.
Multiple users can connect to Nucleus, transmitting and receiving changes to their world as USD snippets.
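The publish-and-fan-out pattern this describes can be sketched in a few lines. To be clear, this is a toy stand-in, not the Nucleus API: the class and attribute names are invented for illustration, and real Nucleus speaks its own protocol over the network.

```python
class SceneHub:
    """Toy stand-in for a Nucleus-style server: holds the shared scene
    state and fans out each client's changes (deltas) to the others."""

    def __init__(self):
        self.scene = {}    # prim path -> attribute dict
        self.clients = []  # connected SceneClient objects

    def connect(self, client):
        self.clients.append(client)
        client.scene = dict(self.scene)  # send current state on join

    def publish(self, sender, delta):
        # Merge the delta into the authoritative scene...
        for path, attrs in delta.items():
            self.scene.setdefault(path, {}).update(attrs)
        # ...and forward it to every other connected client.
        for client in self.clients:
            if client is not sender:
                client.apply(delta)


class SceneClient:
    """Toy editor session: edits prims locally and syncs via the hub."""

    def __init__(self, hub):
        self.scene = {}
        self.hub = hub
        hub.connect(self)

    def edit(self, path, **attrs):
        self.apply({path: attrs})              # apply locally
        self.hub.publish(self, {path: attrs})  # broadcast the delta

    def apply(self, delta):
        for path, attrs in delta.items():
            self.scene.setdefault(path, {}).update(attrs)


hub = SceneHub()
modeler, lighter = SceneClient(hub), SceneClient(hub)

modeler.edit("/Factory/Robot", height=2.0)
lighter.edit("/Factory/KeyLight", intensity=5000)

print(modeler.scene == lighter.scene)  # True: both sessions see both edits
```

The key property mirrored here is that only small deltas travel between sessions, not whole scene files, which is what makes live multi-user editing of large worlds practical.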
The second part of Omniverse is the composition, rendering and animation engine - the simulation of the virtual world.
Simulation of virtual worlds in NVIDIA DRIVE Sim on Omniverse.

Omniverse is a platform built from the ground up to be physically based. Thanks to NVIDIA RTX graphics technologies, it is fully path traced, simulating how each ray of light bounces around a virtual world in real time.
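The core idea of path tracing can be sketched very simply: each light path bounces until it is absorbed or reaches a light, and many random paths are averaged per pixel. The probabilities and albedo in this one-dimensional toy are made up purely for illustration:

```python
import random

def shade(depth=0, max_depth=8, albedo=0.7, emission=1.0):
    """Toy path-tracing step: a ray either reaches a light (returning its
    emission) or bounces off a surface, losing energy to absorption."""
    if depth == max_depth:
        return 0.0               # path terminated without finding a light
    if random.random() < 0.2:    # 20% chance this segment hits a light
        return emission
    # Bounce: recurse, attenuating by the surface albedo.
    return albedo * shade(depth + 1, max_depth, albedo, emission)

random.seed(0)
# Average many random light paths, as a path tracer does per pixel.
radiance = sum(shade() for _ in range(100_000)) / 100_000
print(radiance)
```

Any single path is noisy, but the average converges (here to roughly 0.45); real-time path tracing is hard precisely because RTX hardware must trace and average enormous numbers of such paths every frame.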
Omniverse simulates physics with NVIDIA PhysX. It simulates materials with NVIDIA MDL, or material definition language.
Built in NVIDIA Omniverse, Marbles at Night is a physics-based demo created with dynamic, ray-traced lights and over 100 million polygons. And Omniverse is fully integrated with NVIDIA AI (which is key to advancing robotics; more on that later).
Omniverse is cloud-native, scales across multiple GPUs, runs on any RTX platform and streams remotely to any device.
The third part is NVIDIA CloudXR, which includes client and server software for streaming extended reality content from OpenVR applications to Android and Windows devices, allowing users to portal into and out of Omniverse.
NVIDIA Omniverse promises to blend real and virtual realities. You can teleport into Omniverse with virtual reality, and AIs can teleport out of Omniverse with augmented reality.
Metaverses Made Real

NVIDIA released Omniverse to open beta in December, and NVIDIA Omniverse Enterprise in April. Professionals in a wide variety of industries quickly put it to work.
At Foster + Partners, the legendary design and architecture firm that designed Apple's headquarters and London's famed 30 St Mary Axe office tower - better known as the Gherkin - designers in 14 countries create buildings together in their Omniverse shared virtual space.
Visual effects pioneer Industrial Light & Magic is testing Omniverse to bring together internal and external tool pipelines from multiple studios. Omniverse lets them collaborate, render final shots in real time and create massive virtual sets like holodecks.
Multinational networking and telecommunications company Ericsson uses Omniverse to simulate 5G wave propagation in real time, minimizing multi-path interference in dense city environments.
Infrastructure engineering software company Bentley Systems is using Omniverse to create a suite of applications on the platform. Bentley's iTwin platform creates a 4D infrastructure digital twin to simulate an infrastructure asset's construction, then monitor and optimize its performance throughout its lifecycle.
The Metaverse Can Help Humans and Robots Collaborate

These virtual worlds are ideal for training robots.
One of the essential features of










