Some days ago, I was looking at the code of HitMotion: Reloaded, the fitness game that we at New Technology Walkers developed some years ago for the Vive Focus Plus, and at all the intricate systems I had to build in Unity to have a cross-platform VR game… and I wondered: “Is all of this mess still needed today?” Given my recent experience with Unity development, I could give myself only one clear answer: “NO”. With today’s Unity tools, it is much easier to make cross-platform content, and with OpenXR this will become even better.
So let me show you how easy it is to create cross-platform VR content in Unity! I will develop the wonderful “Unity Cube app” (yes, always that one!) and make it work with the Oculus Quest, the Vive Focus Plus and all other Vive Wave standalone devices… plus all PC SteamVR-compatible headsets! Writing this, I hope that more indie developers will consider porting their games also to other standalone platforms (e.g. Wave), so that on one side they can earn more money because their games will be available in more places, and on the other side they can give more life to the Oculus Store alternatives, so that we can have a more competitive VR landscape!
Remember that most probably a new HTC Vive standalone headset is coming, and it could be an interesting opportunity for all VR devs to publish content for it, so it’s better to know how to create content that is compatible with both the Quest and this new device.
Are you ready to learn how to adapt your content easily to all devices?
How to make a cross-platform VR application in Unity – Video
I made a video about how to develop an Oculus+Wave+SteamVR-compatible Cube application in Unity, and you can find it here below! In it, I develop the app right in front of you and explain step by step the process and the final result. Enjoy it!
If you are a textual person like me, don’t worry… for you there’s also a written tutorial!
How to make a VR application compatible with Oculus, Wave and Steam VR runtimes
I didn’t know about it, because it hasn’t been advertised much, but now Vive Wave is compatible with the Unity XR Plugin Management and the XR Interaction Toolkit. This means that if you use these two frameworks, you can easily make an application that works both with the Quest and the Vive Focus Plus.
Let me show you how to do it. I will guide you step by step in this tutorial, but I will assume you have at least a basic understanding of Unity. The version of Unity I’m going to use is 2019.4.10f1 LTS; for other versions, the procedure may be slightly different.
Create a Unity project
So, let’s start by creating a new Unity project, a 3D one, and let’s give it a cool name, like “CrossUnityCube”. Yes, Unity Cube is love, Unity Cube is life.
Add cross-platform XR management to Unity
Once Unity is open, head to Edit -> Project Settings… and open the settings of the project. Select the “XR Plug-In Management” tab. You should see a big button to install the XR Plug-In Management. Hit it, and let Unity do its stuff. The XR Plug-in Management is Unity’s new framework to manage multiple VR platforms inside your Unity project. It is one of the two components that will make our solution cross-platform in an easy way.
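Just to give you an idea of what the XR Plug-in Management does for you: it selects and initializes the right “loader” (Oculus, Wave, OpenVR…) for the current platform and starts its subsystems. You normally don’t need to write any of this, because the “Initialize XR on Startup” option handles it automatically, but here is a minimal, purely illustrative sketch of mine of driving the XR lifecycle manually (the ManualXRStarter class name is just a hypothetical example):

using System.Collections;
using UnityEngine;
using UnityEngine.XR.Management;

public class ManualXRStarter : MonoBehaviour
{
    IEnumerator Start()
    {
        // Ask the XR Plug-in Management to initialize the first loader that works on this platform
        yield return XRGeneralSettings.Instance.Manager.InitializeLoader();

        if (XRGeneralSettings.Instance.Manager.activeLoader == null)
        {
            Debug.LogError("No XR loader could be initialized: is a headset connected?");
            yield break;
        }

        // Start the display and input subsystems of the active loader (Oculus, Wave, OpenVR, ...)
        XRGeneralSettings.Instance.Manager.StartSubsystems();
    }

    void OnDestroy()
    {
        // Tear everything down when this object is destroyed
        XRGeneralSettings.Instance.Manager.StopSubsystems();
        XRGeneralSettings.Instance.Manager.DeinitializeLoader();
    }
}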
The second component we need is the XR Interaction Toolkit. It is very useful to add all the most basic interactions (grabbing, throwing, teleporting, etc…) to our VR experience in a cross-platform and cross-device way. Thanks to it, we can just specify, for instance, that the left hand can grab objects and that a cube in the scene is grabbable, and we will be able to grab the cube with our left hand and throw it on all the headsets that the XR Interaction Toolkit supports.
To add it, head to Window -> Package Manager… . In the search box of the popup that gets shown (it is in the upper-right corner), write “Interaction”, then click on the listbox next to it (the one with the “Advanced” label) and choose “Show preview packages”. Now on the left side of the popup, you should see the “XR Interaction Toolkit”. Select it, and then hit the “Install” button in the lower-right corner of the window to install it inside your project.
If a popup asks you something about an input system, answer “Yes” and let Unity reboot.
After the reboot, select Edit -> Project Settings… . In the Player tab, “Other Settings” group, look for “Active Input Handling” and choose “Both” and then let Unity reboot. I don’t know if this is necessary, but I usually set it this way while I get used to the new Input System.
Create the sample scene
Unity has created for you a scene in the project called Sample Scene. It should already be open. Let’s add our grabbable cube to it.
In Unity menus, select GameObject -> XR -> Device Based -> Room-Scale XR Rig. This will remove the main camera in the scene and substitute it with a cross-platform XR rig that lets the user move in the room. Notice that it is important that you pick the “Device Based” rig, because I’ve noticed that the Action-based one doesn’t work properly on the Vive Focus Plus. The rig also contains two “Ray Interactors”, one for each hand, meaning that your hands will emit two red rays through which you can interact with objects and UI elements.
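As a side note, the device-based rig reads the controllers through Unity’s cross-platform XR input API, so you never have to touch a vendor SDK for basic input. If you are curious, this is a minimal sketch of how you could query the grip button yourself (GripLogger is just a hypothetical name of mine, and this script is not needed for the tutorial):

using UnityEngine;
using UnityEngine.XR;

public class GripLogger : MonoBehaviour
{
    void Update()
    {
        // Get the device currently mapped to the left hand, whatever the headset is
        InputDevice leftHand = InputDevices.GetDeviceAtXRNode(XRNode.LeftHand);

        // Read the grip button in a cross-platform way
        if (leftHand.isValid &&
            leftHand.TryGetFeatureValue(CommonUsages.gripButton, out bool gripPressed) &&
            gripPressed)
        {
            Debug.Log("Left grip pressed on " + leftHand.name);
        }
    }
}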
Let’s add our fantastic cube. Select GameObject -> 3D Object -> Cube. Then select the cube in the Hierarchy view by clicking on it, and in the Inspector set its position to (0, 0.5, 3). Then, keeping the cube selected, scroll down in the Inspector and click Add Component. In the resulting search box, look for “XR Grab Interactable” and select it. Leave the parameters of the new script as they are, just check that “Throw On Detach” and “Gravity On Detach” are selected. This will make sure that our cube interacts with the two rays coming from your hands: you will be able to grab it and throw it. We had to write zero code; all these basic interactions are courtesy of the Interaction Toolkit.
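By the way, if you ever need to make an object grabbable from code (for instance, for objects spawned at runtime), the same setup can be done with a few lines. This is just a hedged sketch of mine (the MakeGrabbable name is hypothetical), assuming the device-based XR Interaction Toolkit is installed:

using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class MakeGrabbable : MonoBehaviour
{
    void Awake()
    {
        // XRGrabInteractable needs a Rigidbody on the same object
        if (GetComponent<Rigidbody>() == null)
            gameObject.AddComponent<Rigidbody>();

        // Add the same component we just added via the Inspector
        XRGrabInteractable grabInteractable = gameObject.AddComponent<XRGrabInteractable>();

        // Keep the velocity of the hand when the object is released, so it can be thrown
        grabInteractable.throwOnDetach = true;
    }
}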
Select GameObject -> 3D Object -> Plane to add a plane to the scene. Leave it as it is. If you want, you can change the material of the plane so that it doesn’t have the same gray color as the cube. In the video, I gave it the “SpatialMappingWireframe” material, but it is not compatible with Wave, so it’s better that you create your own material based on the Standard Shader and assign it to the plane.
We have just made a simple scene in which there is a cube that you can grab by pointing at it with the rays attached to your hands and pressing the grip button, and then throw by releasing the grip button. Isn’t it cool?
Let’s celebrate this success with a fancy dance.
Add Oculus Support
Support for Oculus devices is integrated inside Unity, so there is nothing to add to enable it! It’s good to relax, isn’t it?
Add Wave VR Support
Something that almost no one knows in the community is that lately, HTC Vive has added support for the Unity XR Plugin Management and Interaction Toolkit inside Vive Wave. This means that it is incredibly easy to make a project that works both on Quest and Focus Plus.
But Wave VR support must be added to the project; it is not available out of the box like the Oculus one. To do this, open the directory of the CrossUnityCube project with your file manager in Windows, and then open the file manifest.json inside the Packages directory. Any text editor should do the trick.
This file is a JSON file that specifies the packages needed by our solution. As it is written in the official Wave guide, add these lines immediately after the opening curly bracket:
"scopedRegistries": [ { "name": "VIVE", "url": "https://npm-registry.vive.com", "scopes": [ "com.htc.upm" ] } ],
then save the file to disk. Doing this, we have just told Unity that, besides its usual package repositories, it should also consider the one by HTC.
Return to Unity and go to Window -> Package Manager. In the upper-left corner, you should see a dropdown with the label “Unity Registry”. Click on it, and in the resulting listbox select “My Registries”. At this point, you should see all the Vive packages that are now available. Select “Vive Wave XR Plugin – Essence”. I always use the “Essence” variant because it also offers me the native capabilities of Wave, but it is not mandatory. Install the plugin by hitting the “Install” button in the lower-right corner of the window.
The Wave SDK should show you a popup asking you to accept some settings. Select “Accept all” and live happily with it!
You can check that the installation has been successful by going to Edit -> Project Settings… -> XR Plug-In Management and checking that in the tab related to Android (the one with the droid icon) there is now a “WaveXR” checkbox.
Add SteamVR support
SteamVR is a bit problematic to add, since the official plugin made by Valve is not compatible with the XR Interaction Toolkit. Luckily, I have found a hack to make it work, and I describe it in great detail in this article. I will avoid copy-pasting all the long procedure here, so please go to my article and follow all the steps of the hack until the very end. You should be able to start directly from the 4th step, since we have already done the first 3 for Oculus and Wave. Be sure also to read all the recent updates to the article that cover the requirements of the newer Interaction Toolkit plugin. Click on the post… and may the Force be with you!
Let’s build everything!
To build everything, head to File -> Build Settings… and check that the current platform is PC. Then click on “Add Open Scenes” to add the current scene to the build pool.
Then click on Player Settings… in the lower-left corner of the window, select again XR Plug-In Management, and check that in the PC tab the “OpenVR Loader” is selected. At this point, return to the build window and select Build. Choose your build location (for PC it is required that you create a new folder for the build) and let Unity do the build for you. When it is finished, you finally have your SteamVR build!
In the Build Settings window, click on Android and then Switch Platform. It will take a while, so go eat some snacks while Unity switches everything to Android. When the process is finished, head again to the XR Plug-In Management settings and select ONLY Oculus. Return to the Build Settings window and build again, specifying the name for the resulting APK. Fantastic, you now have a build for your Quest!
After that, return to the XR Plug-In Management settings and select ONLY Wave XR. If you forget to uncheck Oculus, most probably your app will crash at startup, so be careful not to do that. Return to the build window and build again, specifying a name for the resulting APK that is different from the one you used for the Quest. Fantastic, you now have a build for your Vive Focus Plus!
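If you rebuild for the two Android runtimes often, you can also automate these checkboxes with a small editor script instead of clicking them every time. Take this as a hedged sketch of mine: the XRPackageMetadataStore calls come from the XR Plug-in Management package, but the exact loader type names (especially the Wave one) are an assumption, so verify them against the packages installed in your project. The script must live in an Editor folder.

using UnityEditor;
using UnityEditor.XR.Management;
using UnityEditor.XR.Management.Metadata;
using UnityEngine.XR.Management;

public static class XRLoaderSwitcher
{
    // "Unity.XR.Oculus.OculusLoader" is the usual Oculus loader type; the Wave one
    // below is an assumption, check the actual class name in your Wave XR package
    const string OculusLoaderName = "Unity.XR.Oculus.OculusLoader";
    const string WaveLoaderName = "Wave.XR.Loader.WaveXRLoader";

    [MenuItem("Tools/XR/Android - Use only Oculus")]
    public static void UseOculus() { SetAndroidLoader(OculusLoaderName, WaveLoaderName); }

    [MenuItem("Tools/XR/Android - Use only Wave")]
    public static void UseWave() { SetAndroidLoader(WaveLoaderName, OculusLoaderName); }

    static void SetAndroidLoader(string loaderToAdd, string loaderToRemove)
    {
        // Get the XR settings asset that the XR Plug-in Management stores per build target
        EditorBuildSettings.TryGetConfigObject(XRGeneralSettings.k_SettingsKey,
            out XRGeneralSettingsPerBuildTarget buildTargetSettings);
        XRManagerSettings manager =
            buildTargetSettings.SettingsForBuildTarget(BuildTargetGroup.Android).Manager;

        // Enable only the loader we want for the Android build
        XRPackageMetadataStore.RemoveLoader(manager, loaderToRemove, BuildTargetGroup.Android);
        XRPackageMetadataStore.AssignLoader(manager, loaderToAdd, BuildTargetGroup.Android);
    }
}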
As you have noticed, the build process just required us to change some switches and build. We had to make ZERO changes to the underlying code to cope with the different hardware. Of course, this is a very simple application, and in a real-life scenario most probably some tweaks are needed to adapt the application to the different platforms (e.g. to show a different controller depending on the platform, or to fix a shader that doesn’t work on Android but works on PC, etc…), but what I wanted to show is that the basic interactions of the application, i.e. its skeleton, work out of the box on all platforms! This is the power of the XR Plug-in Management and the XR Interaction Toolkit. Theoretically, you can develop once and then deploy everywhere!
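When you do need those platform-specific tweaks, you don’t necessarily have to keep separate code branches: you can also check at runtime which XR loader is active and branch on it. Here is a minimal sketch of mine (PlatformTweaks is a hypothetical name, and the name checks depend on how the loader assets are called in your project):

using UnityEngine;
using UnityEngine.XR.Management;

public class PlatformTweaks : MonoBehaviour
{
    void Start()
    {
        // The active loader tells us which runtime (Oculus, Wave, OpenVR...) is driving the headset
        XRLoader loader = null;
        if (XRGeneralSettings.Instance != null && XRGeneralSettings.Instance.Manager != null)
            loader = XRGeneralSettings.Instance.Manager.activeLoader;

        if (loader == null)
        {
            Debug.Log("No XR loader is active (e.g. we are in the editor without a headset)");
        }
        else if (loader.name.Contains("Oculus"))
        {
            Debug.Log("Running on the Oculus runtime (e.g. Quest): load the Touch controller model here");
        }
        else if (loader.name.Contains("Wave"))
        {
            Debug.Log("Running on Vive Wave (e.g. Focus Plus): load the Focus controller model here");
        }
        else
        {
            Debug.Log("Running on another runtime (e.g. OpenVR/SteamVR): " + loader.name);
        }
    }
}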
Enjoy your experience!
You can now install and launch your executables on PC, Oculus and Wave devices and see them work in exactly the same way! The Unity Cube is real, and you can grab it and throw it on all your devices! Isn’t this super cool?
I hope you have enjoyed this tutorial, and if that’s the case, please subscribe to my newsletter so you don’t miss my next posts, and consider also donating some money on Patreon to sustain my hard work in informing the community! Thank you!