Nvidia’s Isaac robot simulations debut on Omniverse

Nvidia has launched a new version of its Isaac robot simulation engine on Omniverse, the company’s metaverse simulation platform for engineers.

The Omniverse is a virtual tool that allows engineers to collaborate. It was inspired by the science fiction concept of the metaverse, the universe of virtual worlds that are all interconnected, like in novels such as Snow Crash and Ready Player One. The project started years ago as a proprietary Nvidia project called Holodeck, named after the virtual reality simulation in Star Trek.

But it morphed into a more ambitious industry-wide effort based on the plumbing made possible by the Universal Scene Description (USD) technology Pixar developed for making its movies. Nvidia has spent years and hundreds of millions of dollars on the project, and now it’s updating its robot simulations for it.

An open beta

Above: These are simulations of Isaac robots in action. (Image Credit: Nvidia)

The new Isaac simulation engine is now in open beta so companies and designers can test how their robots function in a simulated environment before they make the commitment of manufacturing the robots, said Gerard Andrews, senior product marketing manager at Nvidia, in an interview with VentureBeat.

Andrews showed me some images and videos of robots working in a digital factory being created by BMW as a “digital twin.” Once the factory design is done, the digital design will be replicated in the real world as a physical copy. And now the Isaac-based robots will operate more realistically, based on newly available sensors for the robots and more robust simulations.

The simulation not only creates better photorealistic environments but also streamlines synthetic data generation and domain randomization to build ground-truth datasets to train robots in applications from logistics and warehouses to factories of the future.
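To make that concrete, here is a minimal sketch in plain Python of what one automatically labeled synthetic training sample might contain. The field names and values are hypothetical, not Isaac Sim’s actual schema; the point is that in a simulator, every annotation comes for free because the scene’s ground truth is already known:

```python
from dataclasses import dataclass
from typing import List, Tuple

import numpy as np


@dataclass
class SyntheticSample:
    """Illustrative only: field names are hypothetical, not Isaac Sim's schema."""
    rgb: np.ndarray           # H x W x 3 rendered image
    depth: np.ndarray         # H x W per-pixel depth, in meters
    segmentation: np.ndarray  # H x W integer class IDs
    boxes: List[Tuple[int, int, int, int]]  # (left, top, right, bottom) per object
    labels: List[str]         # class name per box


def make_dummy_sample(h: int = 480, w: int = 640) -> SyntheticSample:
    # Stand-in for a render pass: the simulator can emit all of these
    # channels together because it already knows where every object is.
    return SyntheticSample(
        rgb=np.zeros((h, w, 3), dtype=np.uint8),
        depth=np.ones((h, w), dtype=np.float32),
        segmentation=np.zeros((h, w), dtype=np.int32),
        boxes=[(100, 120, 220, 260)],
        labels=["pallet"],
    )
```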

“Isaac Sim is going into open beta. We’ve had an early adopter program, which has reached thousands of developers in hundreds of individual companies,” Andrews said. “They tried it out and kicked the tires and gave us some good feedback. And we’re proud to take this to the market based on that feedback and a lot of enthusiasm we are seeing from these customers.”

He said Isaac Sim is a realistic simulation, derived from core technologies such as accurate physics, real-time ray tracing, path tracing, and materials that behave like they’re supposed to.

“One of the big problems you have is the sim-to-real gap, where the gap between the virtual world and the real world — if it exceeds a certain amount — then the engineers or developers just won’t use simulation,” Andrews said. “They’ll just abandon it and say it’s not working.”

Andrews said the Isaac Sim running on Omniverse will be a game-changer in the utility of simulators. And he said the simulation has to be good enough that it’s worth the time it takes to learn how to use the tools for the simulation.

“A lot of the use cases we have around manipulation robots, navigating robots, generating synthetic data to train the AI in those robots — we have those use cases built into Isaac already,” Andrews said. “And then finally, the big benefit that we get from being a part of the Omniverse platform is seamless connectivity and interoperability with all these other tools that people may be using in their 3D workloads. We can bring those assets into our simulation environment where we’re developing the robot, training the robot, or testing the robot.”

The Omniverse and Isaac

Above: Dofbot manipulation robot in Isaac Sim. (Image Credit: Nvidia)

The Omniverse is the underlying foundation for Nvidia’s simulators, including the Isaac platform — which now includes several new features.

Built on the Nvidia Omniverse platform, Isaac Sim is a robotics simulation application and synthetic data generation tool. It allows roboticists to train and test their robots more efficiently by providing a realistic simulation of the robot interacting with compelling environments that can expand coverage beyond what is possible in the real world.

This release of Isaac Sim also adds improved multi-camera support and sensor capabilities, and a PTC OnShape CAD importer to make it easier to bring in 3D assets. These new features expand the breadth of robots and environments that can be successfully modeled and deployed at every stage: from design and development of the physical robot, to training the robot, to deploying it in a “digital twin” in which the robot is simulated and tested in an accurate and photorealistic virtual environment.

Developers have long seen the benefits of having a powerful simulation environment for testing and training robots. But all too often, the simulators have had shortcomings that limited their adoption. Isaac Sim addresses these drawbacks, Andrews said.

Realistic simulation

Above: A scene in a BMW digital twin factory. (Image Credit: Nvidia)

I was looking at the images of Isaac robots in the press material, and I thought they were photos. But those are 3D-animated images of robots in the Omniverse.

In order to deliver realistic robotics simulations, Isaac Sim leverages the Omniverse platform’s powerful technologies, including advanced graphics processing unit (GPU)-enabled physics simulation with PhysX 5, photorealism with real-time ray tracing and path tracing, and Material Definition Language (MDL) support for physically based rendering.

Isaac Sim is built to address many of the most common robotics use cases including manipulation, autonomous navigation, and synthetic data generation for training data. Its modular design allows users to easily customize and extend the toolset to accommodate many applications and environments.

“This image is a digital twin of BMW’s new factory that their factory planners worked on. They brought it into the Omniverse world. And the cool thing about being in Omniverse is that I can put my simulated robot right in this world, and collect the training data that I’m going to use for my AI models, do my testing, do all sorts of different scenarios. And that’s kind of one of the beauties of being a part of the Omniverse platform,” Andrews said. “I’ve been challenged to come up with a catchy phrase, and I never really come up with a catchy phrase, but it’s something around the realistic robot models and the complex scenes that they’re going to operate in.”

To me, it’s kind of like designing products inside one of Pixar’s film worlds, only one that is far more realistic.

With Omniverse, Isaac Sim benefits from Omniverse Nucleus and Omniverse Connectors, enabling the collaborative building, sharing, and importing of environments and robot models in Pixar’s Universal Scene Description (USD) standard. Engineers can easily connect the robot’s brain to a virtual world through the Isaac SDK and ROS/ROS2 interfaces, fully featured Python scripting, and plugins for importing robot and environment models.
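As a hedged illustration of the ROS 2 side of that interface, here is a minimal rclpy node that publishes velocity commands a simulated robot could subscribe to. The /cmd_vel topic is a common ROS convention, not something specified in Nvidia’s materials:

```python
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist  # standard ROS 2 message type


class SimDriver(Node):
    """Toy 'robot brain' that streams velocity commands to a simulator."""

    def __init__(self):
        super().__init__("sim_driver")
        # /cmd_vel is a common convention for velocity commands; the topic
        # a given simulated robot actually listens on may differ.
        self.pub = self.create_publisher(Twist, "/cmd_vel", 10)
        self.timer = self.create_timer(0.1, self.tick)  # 10 Hz

    def tick(self):
        msg = Twist()
        msg.linear.x = 0.2  # creep forward at 0.2 m/s
        self.pub.publish(msg)


def main():
    rclpy.init()
    rclpy.spin(SimDriver())
    rclpy.shutdown()


if __name__ == "__main__":
    main()
```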

Synthetic data generation is an important tool that is increasingly used to train the perception models found in today’s robots. Getting real-world, properly labeled data is a time-consuming and costly endeavor. But in the case of robotics, some of the required training data could be too difficult or dangerous to collect in the real world. This is especially true of robots that must operate in close proximity to humans.

Isaac Sim has built-in support for a variety of sensor types that are important in training perception models. These include RGB, depth, bounding boxes, and segmentation, Andrews said.

How realistic should it be?

“You just want to, within reason, close that sim-to-real gap,” Andrews said. “If you have a small error, that can accumulate in your simulation. It can pick up over time, like an error in physics modeling where you don’t do something right with how the wheels [function], then the first time you simulate it, your robot may be fine. But that error builds up and the robot may find itself completely off course in the real world.”

He added, “The closer you can get into the reality, there’s just a better experience you’re going to have when the engineers try to use it. In the world of simulation, you always face this idea of now that I have the real hardware, what’s the value of still using the simulator.”
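Andrews’ wheel example is easy to quantify. Here is a sketch, with made-up numbers, of a differential-drive robot whose simulated left-wheel radius is off by just 1%; dead-reckoning the pose shows the robot ending up far off its predicted course:

```python
import math

# Made-up numbers: a differential-drive robot whose simulated left-wheel
# radius is 1% larger than the real wheel.
WHEEL_BASE = 0.4      # meters between the wheels
R_MODEL = 0.100       # wheel radius the simulator assumes (m)
R_REAL_LEFT = 0.099   # actual left-wheel radius (m)
R_REAL_RIGHT = 0.100  # actual right-wheel radius (m)
OMEGA = 10.0          # commanded wheel speed, both wheels (rad/s)
DT = 0.01             # integration step (s)
STEPS = 10_000        # 100 seconds of "driving straight"


def integrate(r_left: float, r_right: float):
    """Dead-reckon the pose (x, y, heading) under the given wheel radii."""
    x = y = theta = 0.0
    for _ in range(STEPS):
        v = OMEGA * (r_left + r_right) / 2.0         # forward speed
        w = OMEGA * (r_right - r_left) / WHEEL_BASE  # turn rate
        x += v * math.cos(theta) * DT
        y += v * math.sin(theta) * DT
        theta += w * DT
    return x, y


sim_x, sim_y = integrate(R_MODEL, R_MODEL)             # simulator's prediction
real_x, real_y = integrate(R_REAL_LEFT, R_REAL_RIGHT)  # what really happens
print(f"drift: {math.hypot(sim_x - real_x, sim_y - real_y):.1f} m")
```

With these hypothetical numbers, the real robot ends up more than 100 meters from where the model predicted after 100 seconds of commanded straight-line driving — exactly the kind of compounding error Andrews describes.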

Getting better data

Above: Isaac Sim gets into the engineering details for materials. (Image Credit: Nvidia)

In the open beta, Nvidia has added the ability to output synthetic data in the KITTI format. This data can then be used directly with the Nvidia Transfer Learning Toolkit to enhance model performance with use case-specific data, Andrews said.
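For reference, KITTI labels are plain text with one object per line and fifteen whitespace-separated fields. Here is a small sketch of writing a 2D detection label in that layout; the “pallet” class and box coordinates are made up, and the 3D fields are zeroed, as 2D-only pipelines commonly do:

```python
def kitti_label_line(cls: str, left: float, top: float,
                     right: float, bottom: float) -> str:
    # KITTI's 15 fields: class, truncation, occlusion, observation angle,
    # 2D box (left top right bottom), 3D dimensions (h w l),
    # 3D location (x y z), and rotation_y. 3D fields are zeroed here.
    return (f"{cls} 0.00 0 0.00 "
            f"{left:.2f} {top:.2f} {right:.2f} {bottom:.2f} "
            f"0.00 0.00 0.00 0.00 0.00 0.00 0.00")


# Hypothetical example: one "pallet" detection for image frame 000001.
with open("000001.txt", "w") as f:
    f.write(kitti_label_line("pallet", 100.0, 120.0, 220.0, 260.0) + "\n")
```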

Domain randomization varies the parameters that define a simulated scene, such as the lighting, color, and texture of materials in the scene. One of the main objectives of domain randomization is to enhance the training of machine learning (ML) models by exposing the neural network to a wide variety of domain parameters in simulation. This helps the model generalize well when it encounters real-world scenarios. In effect, this technique teaches models what to ignore.

Isaac Sim supports the randomization of many different attributes that help define a given scene. With these capabilities, the ML engineers can ensure that the synthetic dataset contains sufficient diversity to drive robust model performance.
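In plain Python — this is the concept, not Isaac Sim’s actual randomizer API, and the parameter names and ranges are illustrative — domain randomization amounts to re-sampling scene parameters before each rendered frame:

```python
import random


def randomize_scene() -> dict:
    """Sample fresh scene parameters for one synthetic frame. Seeing
    thousands of such variations teaches the perception model to key on
    object shape rather than incidental lighting, color, or texture."""
    return {
        "light_intensity": random.uniform(200.0, 2000.0),
        "light_color": [random.uniform(0.8, 1.0) for _ in range(3)],
        "floor_texture": random.choice(["concrete", "wood", "metal"]),
        "object_hue_shift": random.uniform(-0.1, 0.1),
        "camera_jitter_m": random.uniform(0.0, 0.05),
    }


# One randomized scene per generated training frame:
for frame in range(3):
    print(frame, randomize_scene())
```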

Simulations can save time and other things

Above: BMW Group is using Nvidia’s Omniverse to build a digital factory that will mirror a real-world place. (Image Credit: Nvidia)

In real life, 50 engineers may be working on a project, but they might have only one hardware prototype. With something like Isaac Sim, all 50 software engineers could work on it at the same time, Andrews said. And they no longer all have to be in the same physical place, as they can work on parts of it remotely.

“I was designing processor cores and people always wanted to simulate it before they had the real hardware, but when their chip came back, the simulator was put on the side,” Andrews said. “In the robotics use case, I still feel like there’s value for the simulator, even when you have hardware because the robots themselves are expensive.”

On top of that, it could be dangerous to test a robot in the real world if its controls aren’t right. It might run into a human. But if you test it in the Omniverse, the simulation won’t hurt anybody.

Over time, Nvidia has added things like multi-camera support, a fisheye camera lens, and other sensors that improve the functions of the robot and its ability to sense the environment. The more components are improved in the real world, the more the Isaac simulation can be updated in the Omniverse, Andrews said.
