
Experimenting with Unity instant remote preview on the Vive Focus

These days I’m playing a bit with the two standalone devices that I have in the office: the Vive Focus and the Oculus Go. I like these two toys… they offer easy-to-use virtual reality. But there is one thing that I hate about developing for standalone headsets: the deploy time.

If you have ever developed for mobile, you surely know the pain of having to continuously build to test everything: every time you want to test a modification, you have to hit the “Build and Run” button, wait ages for the program to build and, above all, to deploy on the device, and only then can you finally test the program… maybe just to discover that you made the build with a wrong parameter, so you have to build again to actually perform the test. Ages of development time get spent this way.

That’s why some XR development environments offer an Instant Preview feature. This feature lets you just hit Play inside the Unity Editor to magically see the content of the app inside your XR headset, without building anything. This magic is possible because a companion app gets installed on the device and works together with the Unity editor, so that:

  • Unity actually runs the app on the PC and streams the camera render to the companion app, which displays it on the screen of the device;
  • The companion app reads the headset data (rotation, position, etc.) and sends it to Unity, which can then change the pose of the main camera accordingly (see the rough sketch below).
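
Just to fix ideas, here is a rough sketch of that loop in C# (all the helper names are made up for illustration; this is not a real API):

using UnityEngine;

// Hypothetical instant-preview step: stream frames out, read poses back
void RunPreviewStep(Camera mainCamera)
{
    // PC side: render the scene and push the frame to the companion app
    byte[] frame = CaptureCameraFrame(mainCamera); // hypothetical helper
    SendFrameToDevice(frame);                      // over USB or Wi-Fi

    // Device side: the companion app answers with the latest head pose,
    // which we apply to the main camera before rendering the next frame
    Pose head = ReceiveHeadPose();                 // hypothetical helper
    mainCamera.transform.SetPositionAndRotation(head.position, head.rotation);
}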

I’ve used this awesome feature with HoloLens, where I really found it super-handy for fast prototyping… and I know that it has also been implemented on the Daydream platform. Other platforms, including Gear VR, do not feature this useful trick… that’s why, when developing for Gear VR, it is usually suggested to develop on a PC with a Rift attached (the Oculus Utilities for Unity are the same for all Oculus devices) and then use an actual Gear VR device only for the final refinements and tests.

Me playing with the Vive Focus headset

I tried looking for something similar for the Vive Focus and… I actually found nothing. So, I had this crazy idea: why don’t I write a prototype myself? I’m a mediocre developer and I’m super-lazy, but maybe I could find a way to do something that somewhat works. I felt empowered by the spirit of 30 Days In VR, so I decided to dive into this experimentation, with the idea of creating something cool for virtual reality and then sharing it with the community. So, proud of myself, I spent the last two days experimenting with implementing an instant preview feature for my Vive Focus (actually, Mister President’s one).

Being lazy, I didn’t want to develop everything from scratch, so I looked for something that already did something similar. I discovered with much surprise that Unity already features a super-useful mobile instant preview feature: it is called Unity Remote. You install the Unity Remote 5 app on your Android phone, then set a special Editor Setting inside Unity (the Device dropdown you find under Edit > Project Settings > Editor)… and you can test an app on your phone just by hitting Play inside Unity! And it works like a charm. I spent my entire life without knowing this… I want back the year of my life spent testing things by actually deploying them on Android phones!

Anyway, the problem is that Unity Remote 5 is a standard Android app, so it is not compatible with the Vive Wave ecosystem and can’t run on the Focus. Damn, it looked so good. But luckily, its previous version, Unity Remote 4, is available as a Unity package on the Asset Store. So I downloaded it inside a project, imported the Vive Wave SDK and deployed the app on the Focus. It launched correctly. I pressed Play inside a Vive Wave project in Unity and, with much surprise, noticed that it actually worked! I could see the content of the Unity camera directly from the headset! WOOOW!

This is the output that Unity gives in the Game window for a Vive Wave project: as you can see, it is exactly as it would appear on the screen of the device. So we just need to take these frames and paste them onto the screen of the device and we’re done! The Unity Remote app does exactly this, so it is perfect.

I scored a point… but there was a big problem: standard Android apps usually react only to the coarse orientation of the screen, so the main camera doesn’t rotate following the orientation of the device as it does with VR headsets. I needed a way to rotate the camera in Unity following the orientation of the headset. Luckily, the Unity Remote app already streams the gyroscope data from the device to Unity, so I just had to create a little script that takes the raw data from the gyroscope and applies it to the Unity camera. It worked! I could rotate my head and see the scene update correctly… I could modify the scene inside Unity and see the results immediately in the headset! Wow!
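
If you want to do something similar, the script can be as small as the following sketch (my take on the classic gyro-camera trick; the quaternion juggling converts the device’s right-handed sensor frame into Unity’s left-handed one):

using UnityEngine;

// Rotates the object it is attached to (e.g. the camera) following the
// gyroscope attitude that Unity Remote streams from the device
public class GyroCamera : MonoBehaviour
{
    void Start()
    {
        Input.gyro.enabled = true; // the gyroscope must be enabled explicitly
    }

    void Update()
    {
        Quaternion att = Input.gyro.attitude;
        // Negate z and w to switch handedness, then tilt the reference
        // frame by 90° on X so that the camera looks forward
        transform.localRotation = Quaternion.Euler(90f, 0f, 0f) *
            new Quaternion(att.x, att.y, -att.z, -att.w);
    }
}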

I was happy and ready to share my results with the community, but something in my head told me that I could do better. This is because the solution had various enormous problems:

  • Big lag and bad resolution;
  • No support for the translation of the headset, which is no good for a 6-DOF device;
  • Bad support for the rotation of the headset: the raw data from the gyroscope is pretty unusable and the results were sometimes pretty weird.

So, feeling the spirit of Enea Le Fons inside me, I kept working on it. I decided that instead of taking data from the gyroscope, I should take the data directly from the Vive Wave platform, asking it for the position and rotation of the headset and sending them to Unity. And here my headaches started: Vive Wave is pretty complicated and still very poorly documented, so it is very hard to make it do things outside of what it was conceived for.

Enea Le Fons wearing the Vive Focus

I added the WaveVR_DevicePoseTracker script to the scene that I had used up to that point, hoping that it would give me the pose of the headset, but the script just refused to work: it seemed to need a WaveVR prefab in the scene. But I didn’t want to use the standard Vive Wave rendering method: when I hit Play, Unity already gave me a distorted view of the scene, so I just had to take it and display it on the screen of the device without further distortions or calculations. I wanted to use a standard camera and attach the frame sent by Unity to the screen of the device: the distorted view already provided by Unity, combined with the distortion of the lenses, should give the right visualization. The standard rendering of the WaveVR prefab, instead, renders the whole scene and applies its own distortion to it, which is useful in the standard scenario, but was a problem in this one.

I was going crazy, because I couldn’t find a way out: with the standard camera I could get the correct rendering but no data from the sensors of the device, while with the WaveVR prefab I had the data from the sensors, but no correct rendering. In the end, I discovered that to get updates from the sensors, I had to call some strange things like this one to initialize the device:

WaveVR_Utils.IssueEngineEvent(WaveVR_Utils.EngineEventID.HMD_INITIAILZED);

and then this one in the update loop to refresh the data read from the sensors:

WaveVR.Instance.UpdatePoses(WVR_PoseOriginModel.WVR_PoseOriginModel_OriginOnGround);
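
Put together in a behaviour, the pattern could look like this minimal sketch (the two calls are the ones quoted above; everything else, including the wvr namespace import, is my assumption about the Wave SDK of that time):

using UnityEngine;
using wvr;

public class WaveVRPosePoller : MonoBehaviour
{
    void Start()
    {
        // Initialize the HMD tracking (the constant name, misspelling
        // included, is exactly as it appears in the SDK)
        WaveVR_Utils.IssueEngineEvent(WaveVR_Utils.EngineEventID.HMD_INITIAILZED);
    }

    void Update()
    {
        // Refresh the poses of the tracked devices, with the origin on the ground
        WaveVR.Instance.UpdatePoses(WVR_PoseOriginModel.WVR_PoseOriginModel_OriginOnGround);
    }
}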

BAM! I had the data from the sensors… but unluckily, the call that initializes the HMD makes the screen go all black, so again I had no visuals. Many headaches later, I discovered that to show something on the screens of the device, there were other magic calls. For instance, with these three calls:

// Bind the native pointer of the texture that we want to show
WaveVR_Utils.SetRenderTexture(currentRt.GetNativeTexturePtr());
// Submit it as the right-eye frame and mark the right eye as done
WaveVR_Utils.SendRenderEventNative(WaveVR_Utils.k_nRenderEventID_SubmitR);
WaveVR_Utils.SendRenderEventNative(WaveVR_Utils.k_nRenderEventID_RenderEyeEndR);

I could send a texture directly to the renderer of the right eye, that is, to the right screen of the headset. Uhm, so, what if I took the texture obtained from Unity, split it in two, resized the two halves to the right dimensions of the Vive Focus screens and then sent this data to the two screens? It should work. I actually did it, also with the help of a scaling script found on the Unity Wiki. The result? It worked… but at something like 1 FPS. You read that right! 1 frame per second… when I got it working, I was so happy that I called Max to see it. He said: “Uh, cool! But it is so sloppy that it could be used as a torture method inside Guantanamo!”. Ehm… 1 FPS is maybe not the ideal framerate for a VR application…

Donald Trump on my experimental project (Image from Know Your Meme)

The problem was the fact that I was using the damn GetPixels and SetPixels on the Texture2D objects. If you’ve ever played with textures, you surely know that these functions are daaaaaaaamn slow and can kill the performance of whatever application. And by killing performance, I mean something like a 10x factor: they require moving the texture from the graphics card’s memory to RAM and back, and this is a huge bottleneck. How to avoid it? By doing all the operations on the graphics card.
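
To give you an idea, my first attempt was doing something like this for each eye, every frame (a reconstructed sketch, not my exact code):

// The slow, CPU-side way of cutting out one eye’s half of the frame:
// GetPixels downloads the pixels from GPU memory to RAM, and Apply
// uploads them back, so the texture crosses the bus twice per eye, per frame
Texture2D GetRightHalf(Texture2D frame)
{
    int halfWidth = frame.width / 2;
    Color[] pixels = frame.GetPixels(halfWidth, 0, halfWidth, frame.height);
    Texture2D rightEye = new Texture2D(halfWidth, frame.height);
    rightEye.SetPixels(pixels);
    rightEye.Apply(); // the slow upload back to the graphics card
    return rightEye;
}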

I heard someone mention the word RenderTexture on some forums, so I started experimenting with Materials, Textures and RenderTextures and… in the end I did it! From 1 FPS to maybe 20. The app is still a puking machine, but at least it makes sense as a prototype: it tracks both the position and rotation of the headset and shows you the content of the Unity scene inside the headset!! To be really usable, it needs to reach a decent framerate of at least 60 FPS, to work over Wi-Fi (unluckily, Unity Remote 4 works only with the device connected via USB), and to stop triggering a GL exception every frame… but who cares 🙂 I’m so proud of it that I decided to share it with the whole VR community.
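
In case you’re curious, the GPU-side trick boils down to something like this sketch (again with hypothetical names; the _SubmitL / _RenderEyeEndL constants are my assumption of the left-eye counterparts of the ones seen above):

// Split the frame entirely on the graphics card: Graphics.Blit with a scale
// of half the width copies and resizes one half of the source into each
// per-eye RenderTexture, without the pixels ever leaving GPU memory
RenderTexture leftEyeRt, rightEyeRt; // sized to the per-eye resolution of the Focus

void SubmitFrame(Texture frame)
{
    Graphics.Blit(frame, leftEyeRt, new Vector2(0.5f, 1f), new Vector2(0f, 0f));
    Graphics.Blit(frame, rightEyeRt, new Vector2(0.5f, 1f), new Vector2(0.5f, 0f));

    // Then each half gets submitted to Wave, as seen before for the right eye
    WaveVR_Utils.SetRenderTexture(leftEyeRt.GetNativeTexturePtr());
    WaveVR_Utils.SendRenderEventNative(WaveVR_Utils.k_nRenderEventID_SubmitL);
    WaveVR_Utils.SendRenderEventNative(WaveVR_Utils.k_nRenderEventID_RenderEyeEndL);
    WaveVR_Utils.SetRenderTexture(rightEyeRt.GetNativeTexturePtr());
    WaveVR_Utils.SendRenderEventNative(WaveVR_Utils.k_nRenderEventID_SubmitR);
    WaveVR_Utils.SendRenderEventNative(WaveVR_Utils.k_nRenderEventID_RenderEyeEndR);
}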

And so, now on GitHub there is an open-source project that offers a terrible Instant Preview feature for the Vive Focus. It is heavily commented, so that you can understand how I did it, learn from it and also improve it. It is under the MIT license, so you can do with it whatever you want. If you’re interested, you can find it here. It is my first complete project pushed to GitHub, so be forgiving of all the errors that I’ve made.

https://gfycat.com/EmbellishedWarmheartedChimpanzee

(in this GIF you can see the system working inside Unity and inside the headset… isn’t it cool?)

I really hope that what I did will inspire the Vive Wave developers to take my prototype and turn it into a working solution for this platform: they know their system very well, so I’m sure that they know how to optimize everything and fix the GL exception. Furthermore, if Vive partners with Unity, maybe the preview feature could be integrated directly into the engine, as has happened with HoloLens, for instance.

Anyway, in two weeks Vive will hold a great event during which a lot of big news will be announced (and I’m already hyped for it)… who knows if Instant Preview for the Focus will be one of them… 🙂

While we all wait for this news from HTC, enjoy my little instant preview feature! I hope that it will be useful for someone, and I also hope that what I did will inspire other developers to take their Vive Focus (or whatever other headset) and start doing crazy experiments with it: do something new, have fun and share your results with the community… and if what you do works only at 1 FPS, don’t worry, it will be awesome all the same!

(Header image by Engadget)


Disclaimer: this blog contains advertisement and affiliate links to sustain itself. If you click on an affiliate link, I'll be very happy because I'll earn a small commission on your purchase. You can find my boring full disclosure here.

9 thoughts on “Experimenting with Unity instant remote preview on the Vive Focus”

  1. Hahaha this was a great read. I love hearing about experiments like this, even if they don’t achieve the intended results. Well done, sir!

  2. Great work! These standalone headsets have some great development potential, for sure.

    Here in the UK, we wait patiently for our new headsets: no word from Lenovo (Mirage Solo) or HTC (Vive Focus)… hell, we never even saw the Samsung Odyssey mixed reality headset.

    We are sad: since our stupid Brexit decision, the headset makers have abandoned our country 😫

      1. Yes, June looks hopeful! I’m really looking forward to the Blade Runner: Revelations game 😍 It has already launched in the US along with the Mirage Solo headset.

  3. Nice work! I can see how it could already be usable in the current workflow as soon as controller support comes in! This should definitely become a thing, to save years of lifetime. If the Vive Wave devs don’t take the initiative, I’ll try to help with the further development of this project.
    So far, I’m trying to get used to the Wave SDK, which is… strange and sometimes undocumented.
    I was happy to find even such a trivial thing as the touchpad’s raw axis data.

    1. There is no controller support because mine is currently broken :(. But the replacement is coming, so I may try to add it! I’ve already notified someone from HTC; I hope they’ll take the project to a usable stage.

      Vive Wave needs better documentation and a forum in some Western language. The current one is only in Chinese!

  4. Thanks for the link to the forum… I hadn’t found that. When I looked for how to stream the Focus to an external screen, I found instructions only on the Chinese forum! o_O

    Regarding durability… I haven’t had problems with the lock yet, but I have had problems with the controller, which broke after a short time. My guess is that the Focus is still a work in progress and they still have to fix some things…

