How to make an Augmented Virtuality app for Quest in Unity
A few days ago I posted this video on my social media channels where I poured some biscuits from a real-world biscuit box into my VR kitchen to have some breakfast:
The video got some interest from the community, and some people asked me how I did it. Since I’m always in favor of sharing knowledge, let me tell you the secrets behind this application, so that you can build your own augmented virtuality app for your Quest, too!
Before we start, let me tell you just a few caveats about this article:
- What I did was a prototype, so the solutions I’ve found are not optimized. I just wanted to experiment with the technology, not make a product.
- I’m not going to do a step-by-step tutorial, but if you are a bit experienced with Unity, you can use the info I’m providing you to create something similar yourself.
Augmented Virtuality and the connections between the real and virtual worlds
One of the things that fascinates me most about mixed reality is creating a connection between the real and the virtual world. I’m seeing many mixed reality demos for the Vision Pro or the Quest where the passthrough is just a background for a virtual game, but I think that’s not the best way to exploit this technology. MR works best when there is a full blend between the real and the virtual elements. And I am particularly fascinated by how we can create a connection between these worlds, and how one can influence the other.
That’s why I decided to do a few experiments on the matter, with the breakfast experience being one of them. The main point of that demo is that the real biscuit box exists in both the real and the virtual worlds, and even though it is a real element, it has agency in the virtual world (it pours biscuits into it).
The experience uses a technique called Augmented Virtuality, and I was inspired to use it by my friend Chris Koomen. I like to classify realities using Milgram’s continuum, and augmented virtuality essentially means that you are in a virtual world, but there are some real elements inside it. The kitchen is the virtual setting, and the box is the real element living in it.
How to create an augmented virtuality experience for Quest in Unity
Creating this experience was easier than I thought, thanks to the facilities offered by the Meta SDK.
Initialization
I launched Unity (I’m using version 2022.3 LTS, in case you are wondering) and created a new URP project. Then I imported the new Meta XR All-in-One SDK to add all the Meta packages to the project.
At that point, I used the new cool tools offered by Meta to set up passthrough in my project. There is now an amazing feature in the Meta SDK that lets you add specific functionalities to your app as “building blocks”, with Meta taking care of their setup and their dependencies. I removed the Main Camera from the scene, then selected the menu item Oculus -> Tools -> Building Blocks and added the camera rig and the passthrough to my project. Just by doing so, I had already set up the whole project as a mixed reality application, with just two clicks. Pretty impressive, considering all the steps I had to do in my tutorial on how to set up a passthrough app on Quest.
After the app was set up for passthrough, it was time to add the virtual elements. Since I was just prototyping, I didn’t want to spend time on asset creation, so I downloaded some very cool free packages from the Asset Store. For the kitchen, I picked up this one, and for the cookies, this other one. I put everything in the scene… now I had everything I needed; I just had to find a way to do Augmented Virtuality.
How to create “holes” in your virtual world
Launching the scene at this point, I could see the kitchen all around me, with no visible passthrough. A little delay in updating the image revealed that the passthrough was actually being rendered behind the kitchen: it was there, but I could not see it because it acted like the skybox of my world. Since I was in a closed VR kitchen room, I could see no background behind it. What I needed was a way to create a “hole” in the kitchen visualization to see the background “skybox”. But how to do it?
Heading to the Passthrough API documentation, you can discover that there are many tools to manipulate the passthrough. What I chose was to work at the shader level to create a hole in the VR world through which the passthrough could be seen. I created a cube in the scene and applied to it a Material based on the “PunchThroughPassthrough” shader that can be found in the Meta XR Core SDK package. Any mesh using that shader becomes a hole in your virtual world that reveals the passthrough behind it.
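I applied the material by hand in the editor, but if you prefer doing it from code, here is a minimal sketch of the idea. It assumes you have already created a Material from the PunchThroughPassthrough shader and dragged it onto the script in the Inspector; the class and field names are just placeholders I made up:

```csharp
using UnityEngine;

// Minimal sketch: turn any mesh into a "hole" that shows the passthrough.
// Assumes a Material created from the PunchThroughPassthrough shader
// (found in the Meta XR Core SDK) is assigned in the Inspector.
public class PassthroughHole : MonoBehaviour
{
    [SerializeField] private Material punchThroughMaterial;

    private void Start()
    {
        // Swap the renderer's material: everything this mesh covers now
        // reveals the passthrough layer rendered behind the virtual world.
        GetComponent<MeshRenderer>().material = punchThroughMaterial;
    }
}
```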
A test in the editor with the simulator showed that it was working, so I had found a way to see real things inside a virtual world, which was exactly my goal for augmented virtuality! But how could I show something meaningful?
How to show a specific object in Augmented Virtuality?
I did not want to show just a random hole, I wanted to see my real box of biscuits in the virtual world, so the hole had to show exactly that box, even when I moved it. But how to do that?
Well, making a hole shaped like the biscuit package is rather easy: I just took the cube from the step before and gave it the same dimensions as the real box. Unity works in meters, so it was very easy to create a mapping between the two sizes.
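To give you an idea, the sizing boils down to a one-liner like the sketch below. The dimensions are placeholders, of course: measure your own box with a ruler.

```csharp
using UnityEngine;

// Minimal sketch: a default Unity cube is 1x1x1 m, so its localScale
// maps directly to real-world meters. The size below is a placeholder;
// measure the physical biscuit box yourself.
public class BoxSizer : MonoBehaviour
{
    // Width (x), height (y), depth (z) of the physical box, in meters.
    [SerializeField] private Vector3 realBoxSizeMeters = new Vector3(0.12f, 0.22f, 0.07f);

    private void Awake()
    {
        transform.localScale = realBoxSizeMeters;
    }
}
```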
Localizing and tracking the biscuit package was a bit more complex. The ideal solution would have been the sort of 3D object tracking that AR SDKs like Vuforia or ARKit offer. The problem is that Meta does not offer this yet. And since we developers do not have access to the camera frames, I could not even think about using some external SDK to implement something similar. So I had to resort to the only tracking options that Meta offers: hand tracking and controller tracking. Since I wanted to do a quick test, I went for the fastest and most reliable one: I taped my right controller on top of the box, so that it could track the box position in the virtual world.
Now I just had to put the 3D cube I generated above as a child of the controller in the Unity scene to have my box tracked in both the real and virtual worlds. The cube had to be placed at a local position and rotation representing the pose of the physical box with respect to the physical controller.
To do that, I used two tricks to make the work easier: first of all, I stood the real box on the table, so that its only rotation was around the Y axis in my global physical coordinates (removing two degrees of freedom of variability); then I placed the controller more or less at the center of the top face of the box, aligned with the orientation of the box. All of this made the alignment between the real and virtual worlds much easier to do. I pressed Play in Unity and made the cube a child of the controller, making sure that its rotation was only around the Y axis in global coordinates and adjusting the local position and Y orientation until they roughly matched the setup above, with the controller sitting more or less in the middle of the top face of the box and sharing a similar orientation with it. It worked fairly well: when I took the controller in hand, the “cube” moved together with it, resembling the shape of the box and creating a hole in the virtual world that was very similar to it. I had my biscuits in augmented virtuality!
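I did this parenting by hand while in Play mode, but if you want to replicate the setup from code, something like the following sketch should work. It assumes the standard OVRCameraRig prefab from the Meta SDK (which exposes a rightControllerAnchor transform); the offset values are the ones you have to eyeball for your own box and controller placement:

```csharp
using UnityEngine;

// Minimal sketch: attach the box-shaped "hole" to the right controller so
// it follows the physical box the controller is taped to. Assumes an
// OVRCameraRig in the scene; localOffset / yRotationOffset are values you
// must tune by hand to match your own box.
public class BoxTracker : MonoBehaviour
{
    [SerializeField] private OVRCameraRig cameraRig;

    // Pose of the box relative to the controller taped on its top face.
    [SerializeField] private Vector3 localOffset = new Vector3(0f, -0.11f, 0f);
    [SerializeField] private float yRotationOffset = 0f;

    private void Start()
    {
        // Parent the hole mesh to the controller anchor, then apply the
        // hand-tuned local pose so it overlaps the physical box.
        transform.SetParent(cameraRig.rightControllerAnchor, worldPositionStays: false);
        transform.localPosition = localOffset;
        transform.localRotation = Quaternion.Euler(0f, yRotationOffset, 0f);
    }
}
```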
There is a problem I want to flag about using your Touch Plus controllers as trackers. Or rather, two problems. The first is that these controllers go on standby pretty quickly to save battery, so if you don’t move them for a while, you lose the tracking of your object. The second is that since the Touch Plus controllers are tracked with a fusion of IR LED tracking and hand tracking, if you do not put your hand around the controller, the tracking can become unstable, and sometimes the system may even start following your hand instead of the controller (especially over the Link connection). That’s why whenever I grabbed the box, I always did it in a way that put my hand around the controller.
Adding biscuits
Adding the biscuits was relatively easy, since it’s just Unity physics. I removed the cube and substituted it with an “open cube box”, that is, a cube without the top face and with thicker lateral faces, with colliders on all of them. Inside this virtual box, I put the cookies, as rigidbodies with colliders. This way, I let the physics engine do everything: when the virtual box was turned upside down, virtual gravity would pull the biscuits out of it.
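I placed my cookies by hand in the editor, but here is a minimal runtime sketch of the same idea, in case it helps. It assumes a cookiePrefab that already has a Rigidbody and a collider on it; all the names and counts are placeholders:

```csharp
using UnityEngine;

// Minimal sketch: spawn the cookies inside the open-topped box as free
// rigidbodies. They are deliberately NOT parented to the box: the box
// walls' colliders push them around through physics instead.
public class CookieSpawner : MonoBehaviour
{
    [SerializeField] private GameObject cookiePrefab;   // has Rigidbody + collider
    [SerializeField] private Transform boxInterior;     // empty marking the box cavity
    [SerializeField] private int cookieCount = 10;

    private void Start()
    {
        for (int i = 0; i < cookieCount; i++)
        {
            // Scatter them slightly so the physics engine settles them in a pile.
            Vector3 pos = boxInterior.position + Random.insideUnitSphere * 0.02f;
            Rigidbody rb = Instantiate(cookiePrefab, pos, Random.rotation)
                               .GetComponent<Rigidbody>();

            // Keep them frozen for now: they get released only once the
            // controller starts being tracked (see the next sketch).
            rb.isKinematic = true;
        }
    }
}
```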
The only thing to be careful about was not to make the biscuits children of the box… their transforms shouldn’t be moved by the parent’s transform; it should be the colliders of the box moving them through physical interactions. Also, they should be “activated” only when the controller starts being tracked: when tracking begins, the controller jumps from the origin of the coordinate system to its first detected position, and this jump makes all the cookies fly away.
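A sketch of that activation logic could look like the following. It assumes the cookies start as kinematic rigidbodies (as in the previous sketch) and uses OVRInput from the Meta SDK to detect when the right controller gets a valid tracked position; if your SDK version exposes tracking differently, any “is tracked” signal works the same way:

```csharp
using UnityEngine;

// Minimal sketch: keep the cookies frozen until the right controller is
// actually tracked, so its initial jump from the origin to the first
// detected position doesn't fling them all over the kitchen.
public class CookieActivator : MonoBehaviour
{
    [SerializeField] private Rigidbody[] cookies; // assign the cookie rigidbodies
    private bool activated;

    private void Update()
    {
        if (activated)
            return;

        if (OVRInput.GetControllerPositionTracked(OVRInput.Controller.RTouch))
        {
            // Hand the cookies over to the physics engine.
            foreach (Rigidbody rb in cookies)
                rb.isKinematic = false;

            activated = true;
        }
    }
}
```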
Testing and building
I want to give you some info about the testing and building process, too. For testing, I’ve found it super useful to run the application via Quest Link. There is a setting in the PC Oculus app (now called the Meta Quest Link app) that lets you stream passthrough data over Link, so that you can press Play in the Unity editor and test your passthrough app there. The passthrough comes through at potato quality, but it’s good enough to check that your app works before doing a long build for the device.
There are some caveats when testing in the editor, though: the controller tracking seemed to me more reliant on hand tracking while on Link, so I always had to put my hand around the controller during tests; I could not just grab the box. Also, the shader I selected for Augmented Virtuality on the box worked only in one eye on PC, while it was perfect in the Quest build. And the whole application ran more choppily over Link, while it was smooth in the build.
As for the building process, it was the same as for any other Quest application. The Meta SDK offers facilities that help you apply the required settings before building, like the OVR Performance Lint Tool or the Project Setup Tool, which are also found in the Oculus -> Tools menu.
And that’s it: I hope this little guide has inspired you to create your augmented virtuality application on Quest! If you do something with it, please let me know, I’m curious to see your experiments…
(…or if you want me to consult for you on building some AV experiences, just contact me)
Disclaimer: this blog contains advertisement and affiliate links to sustain itself. If you click on an affiliate link, I'll be very happy because I'll earn a small commission on your purchase. You can find my boring full disclosure here.