NVIDIA announces Unity connector for Omniverse at GTC 2022

(Image by NVIDIA)

[Disclaimer: This article has been written as part of a paid collaboration with NVIDIA. It’s been written by me, on a topic I had an interest in, and trying to convey useful information to the community, so I hope you’ll like it]

A few weeks ago, while reading the news about SIGGRAPH, I saw NVIDIA teasing the release of the Omniverse connector for Unity, and as a Unity developer, I found it intriguing. I asked for the opportunity to get more information about it, and I had the pleasure of talking with Dane Johnston, Director of Omniverse Connect at NVIDIA, to ask for more details.

This article is a summary of the most interesting information that came out of our chat… including the mind-blowing moment when I realized that with this technology, people in Unity and Unreal Engine could work together on the same project :O

NVIDIA Omniverse

Omniverse is the collaboration and simulation platform by NVIDIA. I had some difficulty grasping what it does until some people at the company explained it to me in detail. Long story short, Omniverse is a system composed of three parts:

  • A central core, called Nucleus, which holds the representation of a scene in USD format in the cloud and takes care of integrating all the distributed modifications into this common scene;
  • Some connectors, which are used by the people working remotely on the scene. A connector connects a specific local application (e.g. Blender) to the Nucleus server in the cloud and sends it the work that has been done in that application. There are connectors for many applications: people creating 3D models may use the connector for 3ds Max, while people working on materials may use the one for Substance. Nucleus takes care of merging all the assets created by the various users in the various applications into a common scene;
  • Some NVIDIA modules, which can run on top of Nucleus to perform operations on the scene. E.g. you can have a module that performs a complex physics simulation on the scene the team has shaped.

Omniverse lets the people on a team collaborate remotely on the same scene: in this sense, it is a bit like Git, but for 3D scenes. It also offers the possibility of running NVIDIA AI services (e.g. for digital twins) on the scene you created.

Diagram that shows the pipeline of NVIDIA Omniverse (Image by NVIDIA)

Unity connector for Omniverse

At launch, Omniverse was made compatible with Unreal Engine, while support for Unity was lacking. I asked Dane why, and he said that there is no specific reason. Actually, NVIDIA started developing both connectors together, but the UE one was completed much faster, probably due to the greater expertise with it inside NVIDIA.

As a Unity developer, I found this disappointing, because it made Omniverse much less interesting for my professional use. But now, finally, NVIDIA has officially announced the development of a Unity connector for Omniverse at GTC 2022. It will be released in beta before the end of the year, so Unity developers will soon be able to enter the world of Omniverse and start creating scenes together with other professionals.

How to use Unity with Omniverse

I asked Dane how to make Unity work with Omniverse, and I’m a bit sorry for him that I probably asked for too many technical details 🙂 Anyway, this is what I was able to understand.

The first thing needed to use Omniverse is to set up a common Nucleus server (the “repository” in Git terms). You can do this by following the instructions at this link: https://docs.omniverse.nvidia.com/prod_install-guide/prod_nucleus/enterprise/installation/quick_start_tips.html 

Then you have to install Omniverse on your PC, open the Omniverse Launcher, and find the Unity connector in the “Exchange Connectors” section. You install it, and it basically adds a plugin to Unity.

This is the Exchange Connectors section where you can look for the Unity connector (Image by NVIDIA)

This plugin gives you an Omniverse dashboard inside Unity, with which you can choose how you want to collaborate with your peers. There are two ways to start a collaboration, one offline and the other online (I invented these terms… they are not the official ones).

The offline collaboration works in a way similar to version control systems, or like Dropbox, if we make a parallel with document writing. You open the shared project, make some modifications, then save them. When a colleague opens the project, he or she gets your modified scene from the server, works on it, and then saves it again for others to use.

The online collaboration works in a way similar to Google Docs. You and your colleagues decide to work together and start a live session. While you are in the live session, you can work on the scene together, and every modification made by a professional is reflected in real time in the scene seen by the others. So a material artist could create a new material for a sofa in the scene in Substance, push it to Omniverse, and all the other workers in the live session would see it changing immediately in their local version. At the end of the session, the team can see the list of modifications it has made to the scene, decide whether to keep them all or only a part, and then confirm that to Omniverse. After the confirmation, the scene on the servers is updated for all the other employees to use.
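To make the live-session workflow concrete, here is a minimal, purely illustrative Python sketch (this is my own mock-up, not the real Omniverse API): edits show up immediately in the shared live scene, are logged, and at the end the team replays only the approved ones onto the base scene.

```python
# Illustrative sketch only -- NOT the real Omniverse API.
# It mimics a live session: every edit is visible immediately,
# and at the end the team commits only the edits it wants to keep.

class LiveSession:
    def __init__(self, base_scene):
        self.live_scene = dict(base_scene)  # what everyone sees right now
        self.log = []                       # ordered list of all edits

    def edit(self, author, key, value):
        # An edit is reflected in the live scene in real time...
        self.live_scene[key] = value
        # ...and recorded so it can be reviewed at the end.
        self.log.append((author, key, value))

    def commit(self, base, keep_indices):
        # Replay only the approved edits onto the base scene.
        committed = dict(base)
        for i, (_author, key, value) in enumerate(self.log):
            if i in keep_indices:
                committed[key] = value
        return committed


base = {"sofa_material": "gray", "wall_color": "white"}
session = LiveSession(base)
session.edit("artist", "sofa_material", "red_velvet")
session.edit("designer", "wall_color", "neon_green")

# At the end, the team reviews the log and keeps only the first edit.
final = session.commit(base, keep_indices={0})
print(final)
# -> {'sofa_material': 'red_velvet', 'wall_color': 'white'}
```

The point of the sketch is the separation between the live scene (everyone sees everything immediately) and the commit step (only approved changes reach the shared servers).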

USD and Unity

Omniverse exploits Unity’s new support for USD to offer its functionalities. Behind the scenes, for every modification, the system sends a “delta” of the changes to the Nucleus server, which integrates it into the common scene. USD offers this possibility of working with deltas, which makes it perfect for collaborating on a common 3D environment. Furthermore, since only deltas are sent and not the whole scene, the collaboration system is very light on the network.
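If the delta idea sounds abstract, here is a tiny illustrative Python sketch of the concept (again my own mock-up, not the actual USD or Nucleus API): each client sends only the attributes it changed, and the server merges them into the authoritative scene.

```python
# Illustrative sketch only -- NOT the real USD/Nucleus API.
# Clients send small "deltas" (just the changed attributes),
# and the server merges them into the one shared scene.

class NucleusLikeServer:
    """Holds the authoritative scene and merges incoming deltas."""

    def __init__(self, scene):
        # scene: {prim_path: {attribute: value}}
        self.scene = scene

    def apply_delta(self, delta):
        # Only the changed attributes travel over the network,
        # not the whole scene -- that is what keeps it lightweight.
        for prim_path, attrs in delta.items():
            self.scene.setdefault(prim_path, {}).update(attrs)
        return self.scene


server = NucleusLikeServer({
    "/World/Sofa": {"material": "default_gray", "scale": 1.0},
})

# A material artist pushes only the material override...
server.apply_delta({"/World/Sofa": {"material": "red_velvet"}})

# ...while a level designer independently tweaks the scale.
server.apply_delta({"/World/Sofa": {"scale": 1.2}})

print(server.scene["/World/Sofa"])
# -> {'material': 'red_velvet', 'scale': 1.2}
```

The real system composes USD layers rather than merging dictionaries, but the principle is the same: two people touching different attributes of the same object never need to exchange the full scene.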

How teams can use it

I know that Omniverse is mainly used for simulations, but I wondered if it could also be useful for small game studios working together on a common Unity game. Dane told me that yes, this is a possible use: Omniverse is good both for enterprise applications and for making games.

Using Omniverse, a 3D artist and a game designer could collaborate live on the same scene to create a level together, and then save everything when the level is complete. Since I work on creative projects with remote designers and artists, I can tell you that this would be a fantastic tool for working together, because our current workflow doesn’t allow us to truly work on a scene at the same time.

NVIDIA is working with developers to understand how to evolve Omniverse to support them. For instance, Dane told me that some developers like to use Omniverse because it makes it easy to plug NVIDIA AI services into their application, like Audio2Face (which creates facial expressions from a voice) for NPCs. Another feature the company is working on is a “packaging process” for the scenes created with Omniverse. This means that before you build your game, Omniverse “converts” the scene into the native format of your engine, so that the build process can proceed exactly as if you had done everything in Unity without using Omniverse at all.

An open system

Omniverse makes people with different expertise work together on the same project (Image by NVIDIA)

I asked Dane which Omniverse functionalities he likes the most. He said that, at the end of the day, one of his favorite things is that everyone on a team can work with the tool he or she knows best, and everyone’s work is integrated seamlessly into a common scene. So someone working in Substance can create a material and add it to the scene, and the developers working in Unity will see that material automatically converted into a Unity material by the system. Everything integrates seamlessly, so a team of different professionals can work together, each with the tool they work best with.

And a consequence of this openness is that… people in Unity and Unreal could work together! Once there is a connector for Unity, people using Unity could modify a scene that gets automatically updated for the designers working in Unreal, and vice versa. For the first time, people working with different engines could work together on the same project. This shows the power of Omniverse and the USD format.

He added that the idea behind Omniverse is to be open and offer many functionalities, and then let teams decide how they want to use it and how it can improve their current production processes. In the case of Unity, the vision is to mix the advantages of using Unity with those of using Omniverse.

Talking about VR, he also told me that he loves the fact that Omniverse now offers amazing scenes rendered in VR with real-time ray tracing.

How you can try it

The Unity connector for Omniverse was announced at GTC 2022 and will be released in beta at the end of 2022. Be sure to follow Omniverse on Twitter to be informed when it gets released. NVIDIA warns that it is a beta, and it is looking for studios interested in using it and providing feedback, not only about bugs but also about the features that software companies need from it.

And if you try it, please let me know what you think about it! I’m very curious to hear the opinion of gaming professionals about using Omniverse to work with their peers.

A final word…

This post is part of a promotion for the GTC 2022 event, which will be held on September 19-22.

If you register for it using my unique code, you will participate in a raffle to win an RTX3080 Ti! To do that, you have to go through the following procedure:

Step-1: Register for NVIDIA GTC using this link: https://www.nvidia.com/gtc/?ncid=ref-crea-201724. To qualify, registrants need to be residents in the EMEA region (Europe, Middle East, or Africa).

Step-2: Attend GTC sessions, there is even one about VR in Omniverse that you can watch here. NOTE: Prizes will be awarded only to those who register for GTC using the link above and attend at least one session.

Step-3: Wait and hope to win a graphics card!

Good luck 😉

(Header image by NVIDIA)

Skarredghost: AR/VR developer, startupper, zombie killer. Sometimes I pretend I can blog, but actually I've no idea what I'm doing. I tried to change the world with my startup Immotionar, offering super-awesome full-body virtual reality, but now the dream is over. But I'm not giving up: I've started an AR/VR agency called New Technology Walkers, with which we help you realize your XR dreams through our consultancy services (Contact us if you need a project done!)