NVIDIA Announces Unity Connector For Omniverse At GTC 2022
A few weeks back, while reading the news from SIGGRAPH, I saw NVIDIA teasing the launch of the Omniverse connector for Unity. As a Unity creator, I found it intriguing, so I asked for the chance to learn more, and I had the pleasure of having a talk with Dane Johnston, Director of Omniverse Connect at NVIDIA, to ask him for more details.
This article is a summary of the most exciting information that came out of our chat … including the mind-blowing moment when I realized that with this technology, people working in Unity and people working in Unreal Engine could collaborate on the same project :O
NVIDIA Omniverse
Omniverse is a collaboration and simulation platform by NVIDIA. I had some difficulties grasping what it does until some people at the company explained it to me in detail. Long story short, Omniverse is a system composed of three parts:
- A central core, called Nucleus, which holds the representation of a scene in USD format in the cloud and takes care of integrating all the distributed modifications into this common scene;
- Some connectors, which are used by the people working remotely on the scene. A connector links a specific local application (e.g., Blender) to the Nucleus on the cloud and sends it the work that has been done in that application. There are connectors for different applications: people creating 3D models may use the connector for 3ds Max, while people working on materials may use the one for Substance. Nucleus takes care of merging all the assets produced by the different users in the various applications into a common scene;
- Some NVIDIA modules that can be run on top of Nucleus to perform operations on the scene. For example, there could be a module that runs a complex physics simulation on the scene that the team has built.
Omniverse lets the members of a team collaborate remotely on the same scene: in this sense, it is a bit like Git, but for 3D scenes. It also allows running NVIDIA AI services (e.g., for digital humans) on the scene you created.
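To make the three-part architecture above more concrete, here is a purely illustrative toy model in Python. All the class and method names are mine, not the real Omniverse API: a "Nucleus" holds the shared scene, and "connectors" for different applications push partial updates into it.

```python
# Toy model of the Omniverse architecture (hypothetical names, NOT the real API).

class Nucleus:
    """Central store for the shared scene; merges contributions from connectors."""
    def __init__(self):
        self.scene = {}  # prim path -> dict of attributes

    def merge(self, delta):
        # Integrate a partial change into the common scene.
        for path, attrs in delta.items():
            self.scene.setdefault(path, {}).update(attrs)

class Connector:
    """Links one local application (e.g., Blender, 3ds Max) to Nucleus."""
    def __init__(self, app_name, nucleus):
        self.app_name = app_name
        self.nucleus = nucleus

    def publish(self, delta):
        # Send the work done in the local application to the cloud.
        self.nucleus.merge(delta)

nucleus = Nucleus()
modeler = Connector("3ds Max", nucleus)
texturer = Connector("Substance", nucleus)

# Two professionals, two different applications, one common scene:
modeler.publish({"/World/Sofa": {"mesh": "sofa.obj"}})
texturer.publish({"/World/Sofa": {"material": "red_fabric"}})

print(nucleus.scene["/World/Sofa"])
# {'mesh': 'sofa.obj', 'material': 'red_fabric'}
```

The point of the sketch is the merge step: each connector only knows its own application, and Nucleus is the single place where everyone's work converges.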
Unity connector for Omniverse
At launch, Omniverse was made compatible with Unreal Engine, while support for Unity was lacking. I asked Dane why, and he said that there is no specific reason. Actually, NVIDIA began developing both connectors together, but the Unreal Engine one was completed much faster, probably because of the greater expertise with that engine inside NVIDIA.
As a Unity developer, this was disappointing, because it made Omniverse much less appealing to me for professional use. But now, finally, NVIDIA has officially announced the development of a Unity connector for Omniverse at GTC 2022. It will be launched in beta before the end of the year, so Unity developers could soon enter the world of Omniverse and begin building scenes together with other professionals.
How to use Unity with Omniverse
I asked Dane how to make Unity work with Omniverse, and I am a bit sorry for him because I probably asked for too many technical details. Anyway, this is what I understood.
First, you install Omniverse on your computer, open the Omniverse Launcher, and find the Unity connector in the “Exchange” section among the connectors. You install it, and it basically installs a plugin for Unity.
This plugin gives you an Omniverse panel inside Unity, with which you can choose how you want to collaborate with your peers. There are two ways to start a collaboration, one offline and one online (I invented these terms … they are not the official ones).
The offline collaboration works similarly to version control systems, or like Dropbox, if we make a parallel with document writing. You open the common project, make some modifications, and then save the changes. When your partner opens the project, he/she gets your modified scene from the server, works on it, and then saves it again for the others to use.
The online collaboration works in a way similar to Google Docs. You and your colleagues start a live session together. While you are in the live session, you can work on the scene together: every modification made by one professional is reflected in real time in the scene seen by the others. So a material artist could create a new material for a sofa in the scene in Substance, push it to Omniverse, and all the other live session participants would immediately see it changing in their local version. At the end of the session, the team can see the list of changes it has made to the scene, decide whether to keep them all or just a part, and then commit them to Omniverse. After the commit, the server scene is updated for all the other workers to use.
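The live-session flow described above can be sketched as a simple broadcast pattern. Again, this is a conceptual toy (names like `LiveSession` and `broadcast` are my invention, not Omniverse's API): one participant's edit is pushed to every other participant's local copy, while the session keeps a list of pending changes to review and commit at the end.

```python
# Toy sketch of the "online" live-session collaboration (hypothetical API).

class LiveSession:
    def __init__(self):
        self.participants = []
        self.pending_changes = []  # reviewed and committed at session end

    def join(self, participant):
        self.participants.append(participant)

    def broadcast(self, author, change):
        # Record the change and reflect it in everyone else's local scene.
        self.pending_changes.append(change)
        for p in self.participants:
            if p is not author:
                p.local_scene.update(change)

class Participant:
    def __init__(self, name, session):
        self.name = name
        self.local_scene = {}
        self.session = session
        session.join(self)

    def edit(self, change):
        # Apply locally, then push to the session in real time.
        self.local_scene.update(change)
        self.session.broadcast(self, change)

session = LiveSession()
artist = Participant("material artist", session)
designer = Participant("level designer", session)

artist.edit({"/World/Sofa.material": "red_fabric"})

# The designer sees the change without doing anything:
print(designer.local_scene)
# {'/World/Sofa.material': 'red_fabric'}
```

The `pending_changes` list mirrors the review step Dane described: the team can inspect what was changed during the session before confirming it to the server.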
USD and Unity
Omniverse exploits the native support for USD offered by Unity to provide these functionalities. Behind the scenes, for every change, the system sends a “delta” of the changes to the Nucleus server, which integrates it into the common scene. USD offers this possibility of working with deltas, and this makes it perfect for working on a shared 3D environment. Moreover, since only the deltas are sent and not the entire scene, the collaboration system is extremely lightweight on the network.
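The reason deltas are so lightweight is easy to show with a minimal example. This is a conceptual illustration, not USD's actual layer/composition API: only the entries that actually changed travel over the network, while the untouched parts of the scene stay where they are.

```python
# Conceptual illustration of delta-based syncing (not the real USD API).

def compute_delta(old_scene, new_scene):
    """Return only the entries that changed or were added."""
    return {path: value for path, value in new_scene.items()
            if old_scene.get(path) != value}

def apply_delta(scene, delta):
    """Integrate a delta into the common scene, as Nucleus would."""
    merged = dict(scene)
    merged.update(delta)
    return merged

server_scene = {"/World/Sofa": "sofa_v1", "/World/Lamp": "lamp_v1"}
local_scene  = {"/World/Sofa": "sofa_v2", "/World/Lamp": "lamp_v1"}

delta = compute_delta(server_scene, local_scene)
print(delta)  # {'/World/Sofa': 'sofa_v2'} -- one entry, not the whole scene

server_scene = apply_delta(server_scene, delta)
```

However large the lamp, the sofa, or the rest of the level becomes, only the one modified entry is transmitted, which is why the collaboration stays light on the network.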
How teams can use it
I know that Omniverse is mainly used for simulations, but I wondered if it could also be helpful for small game studios working together on a common Unity game. Dane told me that, indeed, it is a possible use: Omniverse is suitable both for enterprise applications and for making games.
Using Omniverse, a 3D artist and a game designer could collaborate live on the same scene to create a level together, and then save everything when the level is complete. Since I work on creative projects with remote designers and remote artists, I can tell you that this would be a great tool for working together, because the current workflow does not allow us to truly work on a scene simultaneously.
NVIDIA is working with developers to understand how to evolve Omniverse to support them. For example, Dane told me that some developers like to use Omniverse because it makes it simple to plug NVIDIA AI services into their application, like Audio2Face (which generates facial expressions from a voice) for NPCs. Another feature the company is working on is a “packaging process” for the scenes created with Omniverse. This means that before you build your game, Omniverse “transforms” the scene into the native format of your engine, so that the build process of the game can proceed exactly as if you had done everything in Unity without using Omniverse at all.
An open system
I asked Dane which features of Omniverse he loves the most. He said that one of his favorite things is that, at the end of the day, everyone in a team can work with the tool they know best, and everyone's work is integrated seamlessly into a common scene. So someone working in Substance can create a material and add it to the scene, and the developers working in Unity will see that material automatically converted to a Unity material by the system. Everything integrates seamlessly, so that a group of professionals can each work with the tool they are most comfortable with.
And one consequence of this openness is that … people in Unity and Unreal can work together! Once there is a connector for Unity, people using Unity can change a scene that gets automatically updated for the designers working in Unreal, and vice versa. It is a unique thing that, for the first time, people working with different engines can work together on the same project. This shows the power of Omniverse and the USD format.
He added that the idea of Omniverse is to be open and offer many functionalities, and then let the teams decide how they want to use it and how it can improve their current production processes. For Unity, the vision is combining the advantages of using Unity with those of using Omniverse.
Talking about VR, he also told me that he loves the fact that Omniverse now offers amazing scenes rendered with real-time ray tracing in VR.
How you can try it
The Unity connector for Omniverse was announced at GTC 2022 and will be launched in beta at the end of 2022. Be sure to follow Omniverse on Twitter to be notified when it launches. NVIDIA stresses that it is a beta, and it is looking for studios interested in using it and providing feedback, not only about bugs but also about which features software companies need from it.
And if you try it, please let me know what you think about it! I’m very curious to hear the opinions of gaming professionals about using Omniverse to work with their peers.
Read the original article on Skarred Ghost.