On Friday, I published a video on my social media channels showing that I had been able to create an augmented reality app for the Vive Focus. I promised that today I would show you how I did it, so here I am, writing this article to keep my promise. So, how can you create an augmented reality application for the Vive Focus? Let me tell you the story of how I did it…
My desire to create an application for the Focus started in March when, during the #30DaysInVR project, Enea Le Fons and I often discussed the cool things that the Vive Focus enabled and that we could experiment with. One of the ideas we liked the most was offering augmented reality on the Focus by exploiting the camera stream of the device. The Focus features two frontal cameras that are used for inside-out tracking, and through scripting in Unity it is possible to access their stream. The problem is that the Vive Wave SDK only offers the camera mirror as a single texture containing the frames of both cameras side by side: to obtain an AR mode, the developer has to redirect the left half of this texture only to the left eye and the right half only to the right eye. Then there are distortions and other things to take into account. So, implementing it is not trivial.
I have been super busy in the last few months, so I hadn't found the time to truly experiment with this, but the idea kept running as a thread inside my head and my desire to do it grew over time. So, last week, I forced myself to find the time to do AR on the Focus, and I obtained interesting results. And since I did it inspired by the #30DaysInVR event, I decided to carry on Enea Le Fons's spirit and offer my results completely for free, forever, to the community, so that everyone can develop AR apps on the Focus using my work. Are you interested? Then keep reading!
My idea for developing this augmented reality functionality in Unity was the following:
- Create two identical quads in the scene, completely centered in front of the user’s eyes: let’s call these quads QuadL and QuadR;
- Give QuadL a material that only samples the left half of its associated texture; the other quad, QuadR, should sample only the right half;
- Playing around with Unity layers and the cameras' culling masks, make sure that the left eye camera can't see QuadR and the right eye camera can't see QuadL;
- Grab the cameras' texture each frame and assign it to the materials of the two quads;
- Enjoy the 3D camera stream in front of your eyes;
- Move the quads far away in the scene, so that all the other objects appear in front of them: this way you augment reality by having virtual objects rendered over the stream of the real world in the background.
As you can see, the idea is basically to show the left camera's stream to the left eye and the right camera's stream to the right eye.
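Just to make the idea more concrete, here is a minimal Unity C# sketch of it. This is not the code from my repository or from HTC's sample: the class name, the layer names and the way the side-by-side texture (arTexture) is obtained are all my own assumptions, and in a real Wave project that texture would come from the SDK's camera API.

```csharp
using UnityEngine;

// Rough sketch of the "two quads, one per eye" idea described above.
// The layer names and the arTexture field are placeholders: in a real project
// the side-by-side frame comes from the Vive Wave SDK's camera texture API.
public class StereoCameraQuads : MonoBehaviour
{
    public Camera leftEyeCamera;   // cameras rendering the left and right eye
    public Camera rightEyeCamera;
    public Texture arTexture;      // side-by-side texture coming from the device cameras

    private Material leftMaterial, rightMaterial;

    void Start()
    {
        // Create one quad per eye, far away and centered in front of the user
        leftMaterial = CreateEyeQuad("QuadL", "LeftEyeOnly", 0.0f);
        rightMaterial = CreateEyeQuad("QuadR", "RightEyeOnly", 0.5f);

        // Make each eye camera ignore the other eye's quad via culling masks
        leftEyeCamera.cullingMask &= ~(1 << LayerMask.NameToLayer("RightEyeOnly"));
        rightEyeCamera.cullingMask &= ~(1 << LayerMask.NameToLayer("LeftEyeOnly"));
    }

    Material CreateEyeQuad(string name, string layerName, float uOffset)
    {
        GameObject quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
        quad.name = name;
        quad.layer = LayerMask.NameToLayer(layerName);        // layers must exist in the project
        quad.transform.SetParent(transform, false);           // parent to the head, so it stays in front of the eyes
        quad.transform.localPosition = new Vector3(0, 0, 100f); // far away, so virtual objects render in front of it
        quad.transform.localScale = new Vector3(160f, 100f, 1f); // matches the 640×400 aspect ratio of a camera frame

        Material mat = quad.GetComponent<Renderer>().material;
        mat.mainTextureScale = new Vector2(0.5f, 1f);          // sample only half of the texture...
        mat.mainTextureOffset = new Vector2(uOffset, 0f);      // ...left half for QuadL, right half for QuadR
        return mat;
    }

    void Update()
    {
        // Each frame, feed the latest camera frame to both quads
        if (arTexture != null)
        {
            leftMaterial.mainTexture = arTexture;
            rightMaterial.mainTexture = arTexture;
        }
    }
}
```

The nice thing about the culling-mask trick is that everything stays in a single scene: each eye camera simply refuses to draw the quad that belongs to the other eye.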
I was going to develop exactly that when I opened the CameraTexture_Test scene of the Vive Wave SDK samples to see how to properly obtain the camera stream on the Focus. The sample takes the texture with the cameras' stream and attaches it to a cube, just to show you how to access the stream from code. But in the latest version of the SDK, I noticed that the scene contains an object, disabled by default, called GameObject, which has attached to it a strange behavior called HalfUVSquare. Analyzing the code, I discovered that this script basically did all the things described above. Even better, thanks to a custom shader and a custom rendering technique that lets you show a quad differently to each eye, the process is even optimized. I was quite thrilled: the AR functionality was already implemented, then! I activated the object and launched the project on my Focus, ready to experiment with AR. The result, anyway, was this one:
There was no stream in front of my eyes. Analyzing the code, I discovered that some crucial lines were commented out, so I uncommented them and finally I could see something. It was weird that these lines were commented out… I think that maybe this script is a work in progress and Vive has left it unusable until it is completed. Anyway, with those lines restored the program somewhat worked, but the stream appeared distorted and mirrored; furthermore, after removing the headset and putting it on again, the stream always froze. It was unusable.
So I decided to get my hands dirty and started modifying this code written by HTC: I fixed the UVs so there was no mirroring, adjusted the dimensions of the quad so that the stream had the right aspect ratio, added some callbacks to make the app resume correctly after a pause, and implemented the process to request authorization to use the camera stream. In the end, I managed to see the camera stream correctly in 3D in front of my eyes! It worked!!
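To give an idea of the pause/resume and permission handling, here is a simplified sketch. The actual start/stop calls depend on the Wave SDK's camera API, so StartCameraStream and StopCameraStream below are hypothetical placeholders; the permission request uses Unity's standard Android permission API (on older Unity versions, or to get the proper in-headset dialog, you would use the Wave SDK's own permission manager instead).

```csharp
using UnityEngine;
#if UNITY_ANDROID
using UnityEngine.Android;
#endif

// Simplified sketch: restart the camera stream when the app resumes, and make sure
// the CAMERA permission has been granted. StartCameraStream/StopCameraStream are
// placeholders for the actual Wave SDK calls, not real SDK functions.
public class CameraStreamLifecycle : MonoBehaviour
{
    void Start()
    {
#if UNITY_ANDROID
        // Accessing the tracking cameras requires the camera permission on Android
        if (!Permission.HasUserAuthorizedPermission(Permission.Camera))
            Permission.RequestUserPermission(Permission.Camera);
#endif
        StartCameraStream();
    }

    void OnApplicationPause(bool paused)
    {
        // Without this, the stream stays frozen after taking the headset off and on again
        if (paused)
            StopCameraStream();
        else
            StartCameraStream();
    }

    void StartCameraStream() { /* ask the Wave SDK to start delivering camera frames */ }
    void StopCameraStream()  { /* ask the Wave SDK to stop the camera */ }
}
```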
I was excited… now I only had to add a virtual object: Max proposed adding a little carrot planet that he had created himself during #30DaysInVR, plus some custom animations for when the object is pointed at with the controller. I took a simple event handler from the Vive Wave SDK that made objects rotate when pointed at with the controller and also made them draggable with the controller. I built the experience with the planet in AR and it worked! I could see a little planet and interact with it in augmented reality inside the Vive Focus! It was amazing!
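Just to illustrate the kind of interaction logic involved (the project actually uses the event handler shipped with the Wave SDK samples, not this script), a generic Unity version based on the standard EventSystem pointer callbacks could look like this:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Illustrative only: rotates the object while a pointer (for example the controller's
// ray, routed through a Unity EventSystem raycaster) hovers over it. The real project
// uses the event handler provided with the Vive Wave SDK samples instead.
public class RotateWhenPointed : MonoBehaviour, IPointerEnterHandler, IPointerExitHandler
{
    public float degreesPerSecond = 45f;
    private bool pointed;

    public void OnPointerEnter(PointerEventData eventData) { pointed = true; }
    public void OnPointerExit(PointerEventData eventData)  { pointed = false; }

    void Update()
    {
        if (pointed)
            transform.Rotate(Vector3.up, degreesPerSecond * Time.deltaTime, Space.World);
    }
}
```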
Of course, being just an experiment, this AR mode has a lot of issues, for instance:
- The camera stream is low resolution and only black and white (this is due to the cameras on the device, so it can't be fixed);
- The camera stream is distorted: since the frontal cameras must have the biggest FOV possible, they also have strong radial distortion. Without a proper undistortion stage, the images appear "curved" and make your eyes hurt a bit. Vive has solved this in its passthrough mode, but I haven't had the time to do it in this project (see the sketch after this list);
- The AR stream appears letterboxed, since it doesn't fill the user's whole field of view: this is because the camera frames have a landscape orientation (640×400), while the screen for each eye is portrait (1440×1600). This means that the upper and lower parts of the screen aren't covered by the images of the real world;
- The program is computationally heavy and makes the Focus's fan spin up as if it were about to take off;
- AR objects don't appear completely fixed in space, but float a bit;
- There's no detection of objects, planes, surfaces, or meshes of the surrounding environment;
- Because of the previous point, every time you launch the app, the AR elements will be in a different position.
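About the distortion issue: the usual approach is to remap the texture coordinates with a radial distortion model before sampling the camera frame (in practice you would do this in the quad's fragment shader). Here is a generic sketch of the standard radial model only; the actual calibration of the Focus cameras is not public as far as I know, so the k1 and k2 coefficients below are made-up placeholders.

```csharp
using UnityEngine;

// Generic sketch of a radial undistortion lookup: given normalized coordinates of an
// output pixel, it returns where to sample the distorted camera frame. The coefficients
// are placeholders, not the real calibration of the Focus cameras.
public static class RadialUndistortion
{
    const float k1 = -0.25f; // made-up radial distortion coefficients
    const float k2 = 0.05f;

    // uv in [0,1]x[0,1] over one eye's frame
    public static Vector2 DistortedSampleUV(Vector2 uv)
    {
        Vector2 centered = (uv - new Vector2(0.5f, 0.5f)) * 2f;   // to [-1,1], centered on the image
        float r2 = centered.sqrMagnitude;
        float factor = 1f + k1 * r2 + k2 * r2 * r2;               // standard radial model: 1 + k1*r^2 + k2*r^4
        Vector2 distorted = centered * factor;
        return distorted * 0.5f + new Vector2(0.5f, 0.5f);        // back to [0,1]
    }
}
```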
So, this is not HoloLens nor ARCore; it is just an AR overlay. Anyway, I found it cool to have developed augmented reality superpowers for the Focus… and Max was excited about it too. We also made a video to showcase what we have been able to accomplish. Have a look at it… isn't it awesome?
If you're interested in experimenting with it notwithstanding the above problems, the good news is that I've created a GitHub repo for it. All you have to do is download its content and follow the instructions in the README.md file to add AR to your project. I've made a handy package, so adding the AR functionality is now super easy… you don't have to code anything if you just want to experiment with AR: you import the package, add a prefab to your scene… and voilà, you have AR. You can also start from the sample Unity project and modify it as you wish to implement the logic that you want. Or you can just try the APK to see how AR looks on the Focus. You have plenty of choices. Ah, and of course I've commented the code, and the license for all of this is the MIT License, so basically you can do whatever you want with the code!
So, the answer to the question “How do you do augmented reality with the Focus?” is: download my repository and use it to implement AR. And if you have time, you can also improve it and make its functionality better, for instance by undistorting the camera frames. So, start from my experimental code and evolve it to create the AR that you need for your project.
Of course, HTC can do the same. The Taiwanese company could:
- Use my code to fix the sample's script (or to create a new AR sample);
- Add radial distortion correction in the quad shader so that the stream looks better (HTC knows the parameters of the cameras, so this job should be easy for its engineers);
- In the long run, port the SRWorks SDK, which currently works on the Vive Pro, to the Vive Focus. I know that the Focus has limited horsepower, but only something like SRWorks, which performs environment scanning, can offer true augmented reality on the Focus. AR is at its best when it can understand the environment around us; otherwise, it is just an overlay. I can't wait for the time when this will happen.
And that’s it! If you have a Focus, I recommend you give my project a try and experiment with augmented reality. I’m curious to see what you will create with it! Have fun in AR 😉
UPDATE: Having fun in AR myself, I slightly modified the code, added a shader to the displayed stream of the real world, and obtained a very cool effect: