After a long time with my beloved Unity 2019.4 LTS, I decided it was time to switch to something new, so as not to miss the features Unity has added over these years. I have therefore started using Unity 2021.3 LTS (2021.3.5, to be exact) and experimenting with it. One of the features offered by the new Unity is the ability to build OpenXR applications, and I absolutely wanted to test it! Let me tell you what I discovered in the process: in particular, how to build using OpenXR, and how to run OpenXR applications on your PC.
This tutorial also comes in video format. If you want, instead of reading a wall of text, you can watch the video here below:
OpenXR
A very short primer about OpenXR, in case you forgot what it is: it is a standard, created by the Khronos Group, which has been adopted by all the major XR companies and which allows for interoperability of XR hardware and software from different vendors. That is, an application built for the Oculus runtime can also run on a Valve Index through SteamVR. This is amazing, because it lets us developers build an application once and then deploy it everywhere.
How to build an OpenXR application
Let’s see how to build a Unity 2021 application with OpenXR. Notice that in this written tutorial I assume you are already familiar with Unity. If you are a beginner, please watch the above video, where you can see me performing all the steps in front of your eyes.
Project preparation
Let’s build an application using OpenXR. Instead of creating the application ourselves, I’m going to let you download an already-prepared sample application, so we can skip the whole creation process and get, out of the box, an app that lets us test both the headset and the controller interactions. I’m talking about the XR Interaction Toolkit Example app, which you can download from GitHub (link).
Download the source code, and then open the project that is in its “VR” directory. It is a Unity 2019.4 project, but please force Unity Hub to open it with Unity 2021.3. Unity will show you various dialog windows asking for confirmation: whether you are sure you want to upgrade to a higher version of Unity, whether you want to convert the database, the materials, etc. Always answer that yes, you are sure. There is no turning back…
Browse some memes until Unity has finished the whole import process. After that, open the scene WorldInteractionDemo from the directory Assets/Scenes/. It’s a sample demo of the XR Interaction Toolkit with various interactions with the environment. It uses the new Unity Input System, and all the input is wired through actions. I’m pointing out this detail because I’ve noticed that Action-based XR Origins work better with OpenXR builds than Device-based ones.
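To give an idea of what “wired through actions” means in practice, here is a minimal sketch of reading a controller pose through the new Input System. The class name is mine and the snippet is not taken from the sample project; it just illustrates the idea that with action-based input you bind to an abstract control path, and OpenXR resolves it through whatever interaction profile is active:

```csharp
// Minimal sketch of action-based XR input (illustrative, not from the sample).
using UnityEngine;
using UnityEngine.InputSystem;

public class ActionBasedInputExample : MonoBehaviour
{
    // An action bound to the right-hand controller position. With OpenXR,
    // the binding resolves through the interaction profiles you enable in
    // the project settings, regardless of the actual controller brand.
    private InputAction rightHandPosition;

    private void OnEnable()
    {
        rightHandPosition = new InputAction(
            binding: "<XRController>{RightHand}/devicePosition");
        rightHandPosition.Enable();
    }

    private void Update()
    {
        Vector3 pos = rightHandPosition.ReadValue<Vector3>();
        Debug.Log($"Right controller at {pos}");
    }

    private void OnDisable() => rightHandPosition.Disable();
}
```

The sample project does the same thing at a larger scale: its XR Origin references input action assets instead of polling specific devices, which is why it survives the switch between runtimes.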
The scene is already working, so you don’t have to modify anything. Also, the packages have already been updated by Unity during the project version upgrade, so you don’t have to worry about them at all. We can go directly to the building phase!
How to build for OpenXR – PC
Let’s see how to configure the project to build for OpenXR. Open the build window (File -> Build Settings…) and add the current scene to the scenes to build, if needed. Then click on “Player Settings…” and, in the project settings, go to the XR Plugin Management tab. Select the “PC” platform in it (it is the one with the small icon of a display) and notice that there are various build targets. Click on OpenXR, and wait for the right plugin to load.
After that, you will see that there is an error, because you have to configure the profile for it. Go on the OpenXR subtab and add in the “Interaction Profiles” section the profile of the controllers you are using. Since I use Quest+Link, I have added Oculus Touch Controller Profile. If you plan to build for different headsets, here you have to add all the controllers that you plan to support. And that’s it! Now we can build the application and have the executable that we want.
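For the curious, the same interaction profile can also be enabled from an editor script instead of the settings UI. This is a hedged sketch: it assumes the editor-side API of Unity’s OpenXR plugin (`OpenXRSettings.GetSettingsForBuildTargetGroup` and the `OculusTouchControllerProfile` feature class), and the menu item path is my own invention:

```csharp
// Editor-only sketch: enable the Oculus Touch interaction profile for PC
// builds from code. Assumes Unity's com.unity.xr.openxr package editor API.
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine.XR.OpenXR;
using UnityEngine.XR.OpenXR.Features.Interactions;

public static class EnableTouchProfile
{
    [MenuItem("Tools/Enable Oculus Touch Profile (PC)")] // hypothetical menu path
    public static void Enable()
    {
        // Get the OpenXR settings for the Standalone (PC) build target.
        var settings =
            OpenXRSettings.GetSettingsForBuildTargetGroup(BuildTargetGroup.Standalone);

        // Interaction profiles are just OpenXR "features" that can be toggled.
        var profile = settings.GetFeature<OculusTouchControllerProfile>();
        if (profile != null)
        {
            profile.enabled = true;
            EditorUtility.SetDirty(profile); // make sure the change is saved
        }
    }
}
#endif
```

If you plan to support several headsets, you would enable one such profile class per controller type, which is exactly what adding rows in the “Interaction Profiles” list does.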
In case you also import the Oculus Utilities plugin, you have another flag to check, “OculusXR Feature”, which allows the features contained in the Oculus package to run even under OpenXR. I don’t suggest you do this, though, as we’ll see in a while.
How to build for OpenXR – Android
If you switch the build target to Android, you have to repeat the same steps for Android: select OpenXR as the XR plugin for the build in the Android subtab, and add the interaction profiles of the controllers you plan to support.
Notice that, most probably, when building for a standalone headset (e.g. the Quest), you also have to change other settings, like setting the color space to Linear and the build architecture to ARM64, but these settings are not related to OpenXR, so I won’t cover them here.
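If you prefer, those two Android settings can also be applied from an editor script using Unity’s standard PlayerSettings API; the menu item name below is my own. One detail worth knowing: ARM64 builds require the IL2CPP scripting backend, which is why the sketch sets it as well:

```csharp
// Editor-only sketch: apply the Android settings mentioned above from code.
#if UNITY_EDITOR
using UnityEditor;

public static class AndroidVrBuildSettings
{
    [MenuItem("Tools/Apply Android VR Settings")] // hypothetical menu path
    public static void Apply()
    {
        // Standalone headsets expect the Linear color space.
        PlayerSettings.colorSpace = ColorSpace.Linear;

        // ARM64 is only available with the IL2CPP scripting backend.
        PlayerSettings.SetScriptingBackend(
            BuildTargetGroup.Android, ScriptingImplementation.IL2CPP);
        PlayerSettings.Android.targetArchitectures = AndroidArchitecture.ARM64;
    }
}
#endif
```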
Things to be careful about when building for OpenXR
Some suggestions from my experience:
- Build with OpenXR for PC, but not for standalone VR headsets. Quest support for OpenXR is still not production-ready (as you can see in this table), and I myself had some builds not working properly. Pico doesn’t support OpenXR yet, so you also lose the advantage of deploying to “multiple platforms”. So, for Pico Neo and Quest, I still build with the XR plugin of the destination platform. I hope this will change in the future, but at the time of writing, things are this way;
- Use Action-based XR Rigs when building for OpenXR. I’ve noticed that Device-based Rigs produce incorrectly oriented interaction rays when you use OpenXR as the runtime;
- Perform testing on your project: I’ve noticed that there are still some overall instabilities and incompatibilities when using OpenXR with Unity. I still consider it a beta implementation;
- Remember that when using OpenXR in Unity, at the moment, you cannot use the functionalities of the vendors’ plugins in your projects. This means that if you are using OpenXR to build for Quest, you can’t use the features of the Oculus Utilities package. There is an “OculusXR Feature” flag in the OpenXR settings tab which is meant exactly to allow that, but if you activate it, the system no longer detects the position of the Oculus Touch controllers (this is a bug in the Unity 2021 version; some people suggest that reverting to Unity 2020 solves the issue). To make things work again, you have to deactivate the flag, but then you get an annoying dialog suggesting that you activate it every time you enter play mode. So, my suggestion is not to import the Oculus plugin from the Asset Store at all.
All in all, I think that OpenXR has still a long road to go to become adopted by all engines, headsets, and runtimes in a stable way.
Testing an OpenXR application in play mode
How can you test your OpenXR application in play mode in Unity? Well… hitting the play button… as usual.
But there is one more feature for PC… you can choose the runtime to test on! Since OpenXR works with both the Oculus and SteamVR runtimes, you can select from a drop-down menu which runtime you want your play test to run on. This way, you can easily verify that your game works on all platforms. The dropdown is in the XR Plugin Management -> OpenXR subtab of the project settings: it is called “Play Mode OpenXR Runtime”. Try selecting different options and hitting play, and you will notice that the application is redirected to the right runtime every time.
Building and running an OpenXR application
This is easy peasy, too. You just build the application in the usual way, that is, hitting the “Build” button in the Build Settings window and choosing where you want it to be built.
At this point, you have an executable, you run it, and it (hopefully) works.
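If you like to automate things, the same build can be triggered from an editor script through Unity’s BuildPipeline API. This is just a sketch: the scene path matches the sample project, but the output path and menu item name are assumptions of mine:

```csharp
// Editor-only sketch: a scripted PC build of the sample scene.
#if UNITY_EDITOR
using UnityEditor;

public static class BuildScript
{
    [MenuItem("Tools/Build OpenXR Player")] // hypothetical menu path
    public static void Build()
    {
        var options = new BuildPlayerOptions
        {
            // Scene from the sample project; output path is illustrative.
            scenes = new[] { "Assets/Scenes/WorldInteractionDemo.unity" },
            locationPathName = "Builds/OpenXRDemo/OpenXRDemo.exe",
            target = BuildTarget.StandaloneWindows64,
            options = BuildOptions.None
        };
        BuildPipeline.BuildPlayer(options);
    }
}
#endif
```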
But now I have a question for you: how do you test it with the different runtimes? I mean, before OpenXR, you had different builds for different headsets, and your computer was smart enough to understand whether to open a build with Oculus or SteamVR, but what about now, when there is only one executable file? How do you choose to run it through SteamVR, for instance? Of course, I don’t have only the question, but also the answer…
This point puzzled me a lot, and I had to ask Saint Google for help to solve it, because I kept having the application run with the Quest runtime even when I had SteamVR open, and even when I launched it via Steam shortcuts. It turns out that at the system level, there is only one active OpenXR runtime, and you have to choose which one it is. Whatever you run gets executed by that runtime, which is hopefully your preferred one, that is, the one you use the most for your games. For instance, since I use Oculus Link a lot, my default was the Oculus runtime, and that’s why all my OpenXR builds were executed through it. When in Unity you select “System Default” as your “Play Mode OpenXR Runtime”, this is exactly the runtime that is used in play mode.
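If you want your application itself to tell you which runtime it ended up on, Unity’s OpenXR plugin exposes this information through the static `OpenXRRuntime` class. A minimal sketch (the component name is mine):

```csharp
// Logs which OpenXR runtime the application is actually running on,
// e.g. the Oculus or the SteamVR runtime. Uses Unity's OpenXR plugin.
using UnityEngine;
using UnityEngine.XR.OpenXR;

public class RuntimeLogger : MonoBehaviour
{
    private void Start()
    {
        Debug.Log($"OpenXR runtime: {OpenXRRuntime.name} " +
                  $"version {OpenXRRuntime.version}");
    }
}
```

Dropping a component like this into your scene is a quick way to confirm that the system-level runtime switch described below actually took effect.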
To change your active runtime, you have to open the settings of the runtime you want and set it as the preferred system (under the hood, on Windows, this preference is stored in the registry value HKLM\SOFTWARE\Khronos\OpenXR\1\ActiveRuntime, which the OpenXR loader reads when an application starts).
In Oculus, it means opening the Oculus application on your PC and going to Settings -> General, and clicking the button Set Oculus As Active at the OpenXR Runtime line.
For Steam, first of all, you must have a headset connected; then you can head into the SteamVR settings, activate the Advanced Settings, choose the “Developer” tab, and there you find a button to set SteamVR as the active OpenXR runtime. Of course, only one runtime can be active at a time, so if you activate Steam, you deactivate Oculus, and from then on every OpenXR application you launch will go through SteamVR.
This way, you can play your build with different runtimes. Isn’t it fascinating how a single EXE file can be run in different ways by different headsets and runtimes? For me it is! It is the magic of OpenXR…
I hope this tutorial has been useful for you, and if so, please subscribe to my newsletter and share this article with all your fellow VR game developers!
(Header image containing the Unity logo and the OpenXR logo by the Khronos Group)