Nvidia plans for a more robust Omniverse with avatars, synthetic data


Nvidia has unveiled plans to significantly expand the capabilities of its Omniverse platform for 3D design collaboration and simulated environments. The expansion centers on new synthetic data generation and photorealistic avatar technology intended to make virtual worlds more immersive and productive.

Omniverse is Nvidia’s real-time simulation and collaboration platform built for 3D workflows, letting teams work together in shared virtual spaces using photorealistic simulation. At its annual GTC conference, Nvidia demonstrated how it aims to build out Omniverse with AI-powered avatars and simulated data.

A key part of the expansion is bringing more human-like avatars into Omniverse environments. Nvidia is leveraging generative AI models such as Megatron and real-time rendering powered by RTX GPUs to create avatars that look realistic and move naturally, enabling more lifelike virtual collaboration and training scenarios.

The avatars can represent real people, with facial animation driven by audio, or use entirely synthetic AI-generated faces. They have hair and clothing that move realistically and can make eye contact and gesture. The avatars will be able to hold conversations, understand context, and react appropriately.
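As a rough illustration of how audio-driven facial animation generally works (this is not Nvidia's actual pipeline), a model maps short windows of speech features to per-frame facial blendshape weights. In the sketch below, the trained regressor `model` and its blendshape outputs are hypothetical stand-ins:

```python
import numpy as np
import librosa  # audio feature extraction

def audio_to_blendshapes(wav_path, model, fps=30):
    """Map speech audio to per-frame facial blendshape weights.

    `model` is assumed to be a trained regressor (hypothetical) that takes a
    window of MFCC features and returns weights for blendshapes such as
    jaw_open or mouth_wide. One output vector is produced per video frame.
    """
    audio, sr = librosa.load(wav_path, sr=16000)
    hop = sr // fps  # hop length chosen so feature frames roughly match video frames
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=13, hop_length=hop)  # (13, T)
    frames = []
    for t in range(mfcc.shape[1]):
        features = mfcc[:, t]
        weights = model.predict(features[np.newaxis, :])[0]
        frames.append(np.clip(weights, 0.0, 1.0))  # blendshape weights live in [0, 1]
    return np.stack(frames)  # (num_frames, n_blendshapes)
```

The per-frame weight vectors would then drive the avatar's face rig in the renderer.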

Nvidia is also developing physics-based digital twins of humans that can walk, jump, squat, and move dynamically like real people. This will allow simulating ergonomics and interactions for industrial design and other applications.

To populate Omniverse worlds, Nvidia is using AI to generate synthetic assets such as buildings, furniture, vehicles, and clothing, along with simulated sensor data. Generative adversarial networks produce varied, realistic data samples that can be customized in simulation to avoid costly real-world data collection.
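The practical payoff of simulation-driven data generation is that scene parameters can be randomized and labels come for free from the renderer. The sketch below shows that general domain-randomization pattern; the `scene` object and its `place_object`, `set_lighting`, `render_with_labels`, and `clear` methods are hypothetical placeholders, not the Omniverse API:

```python
import random

# Hypothetical asset and material catalogs used only for illustration.
FURNITURE = ["chair", "sofa", "table", "lamp"]
MATERIALS = ["oak", "steel", "fabric", "plastic"]

def generate_samples(scene, num_samples=1000):
    """Generate labeled synthetic images by randomizing a simulated scene."""
    dataset = []
    for _ in range(num_samples):
        # Randomize scene contents so a trained model sees wide variation
        for _ in range(random.randint(3, 8)):
            scene.place_object(
                asset=random.choice(FURNITURE),
                material=random.choice(MATERIALS),
                position=(random.uniform(-5, 5), 0.0, random.uniform(-5, 5)),
                rotation_deg=random.uniform(0, 360),
            )
        scene.set_lighting(intensity=random.uniform(200, 2000),
                           color_temp_k=random.uniform(2700, 6500))
        # The simulator can emit pixel-perfect labels alongside the image
        image, boxes, masks = scene.render_with_labels()
        dataset.append({"image": image, "boxes": boxes, "masks": masks})
        scene.clear()
    return dataset
```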

For example, Omniverse users will be able to take synthetic LiDAR scans of a digitally created room and furniture to test reinforcement learning for robotics. The AI-generated data saves significant manual effort in creating digital twins.
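To make the LiDAR example concrete, the toy script below simulates a planar scan by ray-marching against 2D boxes standing in for walls and furniture. It illustrates the idea of synthetic sensor data for robotics experiments, not Omniverse's actual sensor models:

```python
import numpy as np

def simulate_lidar_scan(sensor_xy, obstacles, num_beams=360, max_range=10.0):
    """Return (angles, ranges) for a simulated planar LiDAR scan.

    `obstacles` is a list of axis-aligned boxes (xmin, ymin, xmax, ymax)
    describing walls and furniture in a synthetic room.
    """
    angles = np.linspace(0.0, 2.0 * np.pi, num_beams, endpoint=False)
    ranges = np.full(num_beams, max_range)
    step = 0.02  # ray-march resolution in metres
    for i, theta in enumerate(angles):
        direction = np.array([np.cos(theta), np.sin(theta)])
        for dist in np.arange(step, max_range, step):
            point = sensor_xy + dist * direction
            hit = any(xmin <= point[0] <= xmax and ymin <= point[1] <= ymax
                      for (xmin, ymin, xmax, ymax) in obstacles)
            if hit:
                ranges[i] = dist
                break
    return angles, ranges

# Example: a 6 m x 6 m room (walls as thin boxes) with a table in one corner
room = [(-3.0, -3.0, 3.0, -2.9), (-3.0, 2.9, 3.0, 3.0),
        (-3.0, -3.0, -2.9, 3.0), (2.9, -3.0, 3.0, 3.0),
        (1.0, 1.0, 2.0, 2.0)]
angles, ranges = simulate_lidar_scan(np.array([0.0, 0.0]), room)
```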

Nvidia believes this combination of virtual AI characters and simulated data will enable new use cases spanning industries. Companies can set up virtual factories to optimize manufacturing line processes and train digital employees. Automakers can test autonomous vehicle perception against hazardous edge case scenarios.

Retailers can create photorealistic virtual stores to trial new layouts, displays, and signage. Fashion designers can showcase new clothing and model dynamics on digital avatars walking down runways. Workers can onboard and skill up inside detailed simulations of their actual jobs and tools.

The enhanced Omniverse platform aligns with Nvidia’s plans to establish an open metaverse ecosystem. The company believes that blurring the line between the physical and virtual worlds with always-on avatars, robotics, and digital twins will drive the next evolution of computing on its hardware and cloud infrastructure.

Several partnerships will help grow Omniverse usage initially in automotive, architecture, manufacturing, and retail industries:

– BMW – Creating a synthetic factory to simulate manufacturing processes for training and optimization.

– Ericsson – Developing digital twins for designing and testing 5G equipment and networks virtually.

– WPP – Building virtual stores with photorealistic products and materials to evaluate retail experiences.

– Autodesk & Adobe – Integrating 3D design tools like Revit and Substance with Omniverse Studio for enhanced workflows.

To support the upcoming Omniverse expansion, Nvidia is introducing Gertrude, a new conversational AI agent created by Anthropic to interact naturally with users. Gertrude can be customized for different industries and avatars.

Nvidia also discussed enhancements to Modulus, the physics engine behind Omniverse Nucleus simulations. Modulus will gain soft body dynamics, fluid simulation, and other upgrades that make digital worlds behave more realistically.
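For a sense of what soft body dynamics involves at the simplest level, the toy mass-spring integration step below shows the general technique. It is an illustrative sketch under simplifying assumptions (unit masses, explicit integration), not Nvidia's solver:

```python
import numpy as np

def soft_body_step(positions, velocities, springs, rest_lengths,
                   stiffness=50.0, damping=0.98, dt=1.0 / 60.0):
    """Advance a toy mass-spring soft body by one explicit time step.

    `positions` and `velocities` are (N, 3) arrays of particle state and
    `springs` is a list of (i, j) particle index pairs. Unit masses assumed.
    """
    gravity = np.array([0.0, -9.81, 0.0])
    forces = np.tile(gravity, (len(positions), 1))
    for (i, j), rest in zip(springs, rest_lengths):
        delta = positions[j] - positions[i]
        length = np.linalg.norm(delta) + 1e-9
        # Hooke's law: pull/push particles toward the spring's rest length
        f = stiffness * (length - rest) * (delta / length)
        forces[i] += f
        forces[j] -= f
    velocities = damping * (velocities + dt * forces)
    positions = positions + dt * velocities
    return positions, velocities
```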

Overall, Nvidia’s roadmap aims to build out Omniverse into an accessible platform for creating highly realistic shared virtual environments powered by AI-generated synthetic data. The avatars and digital twins will enable businesses to improve designs, streamline operations, and increase efficiencies across the product lifecycle.
