Hand Tracking on Vive Focus 3: all you need to know
After I wrote my super in-depth review of the Vive Focus 3, HTC finally enabled hand tracking on its device, so I decided to tell you about it. In this post, I’m going to show you how to implement hand tracking in Unity for the Vive Focus 3, how to enable it on the device, and how it performs, also compared to the Oculus Quest 2. There is a lot to talk about, so keep reading, and I’m sure you will find this article interesting!
All you need to know about Hand Tracking on Vive Focus 3 – Video
I made a super cool video on the topic, where I highlight:
- How to activate hand tracking in your Vive Wave Unity project
- How to activate it inside your headset
- How it performs
- How it compares with the tracking of the Oculus Quest 2
Have a look at it here below (and please, subscribe to my YouTube channel!):
Otherwise, keep reading for the textual version.
Hand Tracking on the Vive Focus 3
HTC has just enabled hand tracking on the Vive Focus 3 as a developer feature: interested developers can start building applications with it while waiting for it to become a more stable feature of the headset. Hand tracking was already available on the Vive Focus Plus, but a few days ago, HTC released the first package to use it on the Focus 3 as well. The Unity package that supports it is still in preview, and hand tracking is not enabled in the system UI of the headset, so I think HTC still has many steps to take to improve it. But even at this stage, its performance is already far superior to what I reviewed on the Vive Focus Plus some months ago.
How to integrate hand tracking into your Vive Wave Unity project
(This chapter is tailored to developers. If you are not interested in development details, skip directly to the “How to enable Hand Tracking on the Vive Focus 3” section)
To develop for the Vive Focus 3 in Unity, you have to use the Vive Wave XR plugin. Let’s see how to use hand tracking in Wave inside a new Unity project.
How to install the Wave XR Plugin
Something that almost no one in the community knows is that HTC has recently added support for the Unity XR Plug-in Management system and the XR Interaction Toolkit inside Vive Wave. This means that it is incredibly easy to make a project that works on the Vive Focus 3.
But Wave XR support must be added to the project manually; it is not available out of the box like Oculus’s. To add it, open your project’s directory in your file manager in Windows, and then open the file manifest.json inside the Packages directory. Any text editor will do the trick.
This is a JSON file that specifies the packages needed by our project. As written in the official Wave guide, add these lines immediately after the opening curly bracket:
"scopedRegistries": [ { "name": "VIVE", "url": "https://npm-registry.vive.com", "scopes": [ "com.htc.upm" ] } ],
then save the file to disk. By doing this, we have just told Unity that besides its usual package registries, it should also consider the one provided by HTC.
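To make it clearer where these lines go, here is a sketch of how the top of Packages/manifest.json looks after the edit. The dependencies listed below are just placeholders for illustration: yours will contain whatever packages your project already uses, so don’t copy them.

```json
{
  "scopedRegistries": [
    {
      "name": "VIVE",
      "url": "https://npm-registry.vive.com",
      "scopes": [
        "com.htc.upm"
      ]
    }
  ],
  "dependencies": {
    "com.unity.xr.management": "3.2.17"
  }
}
```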
Return to Unity and go to Window -> Package Manager. In the upper-left corner, you should see a dropdown with the label “Unity Registry”. Click on it, and in the resulting list select “My Registries”. At this point, you should see all the Vive packages that are now available. Select “Vive Wave XR Plugin – Essence”. I always use the “Essence” variant because it also offers the native capabilities of Wave, but it is not mandatory.
Before installing the plugin, make sure you are installing the right version. At the time of writing, in Unity 2019, Unity suggests Vive Wave XR 1.0.1, but this version is not suitable for developing for the Vive Focus 3. Click the little arrow next to the “Vive Wave XR Plugin – Essence” label in the left column, and you should see an entry listing all the versions of the plugin (or a “See all versions” button: if you click it, it expands to show all the versions). Select version 4.1.0 or above (at the time of writing, that is 4.1.0-preview.5.1) and then install the plugin by hitting the “Install” button in the lower-right corner of the window.
The Wave SDK should show you a popup asking you to accept some settings. Select “Accept all” and live happily with it!
You can check that the installation has been successful by going to Edit -> Project Settings… -> XR Plug-in Management and checking that in the tab related to Android (the one with the droid icon) there is now a “WaveXR” checkbox. Make sure this checkbox is checked, of course.
A look at the hand tracking SDK
Wave XR usually imports the hand tracking feature into the project automatically.
I won’t dig much into the SDK because it is still in preview, so it is subject to future changes, but having a look at it, I was able to find more or less the same features as every other hand tracking SDK:
- Sample prefabs that you can use out of the box to implement hand tracking in your project
- The possibility of using your own rigged hand model
- Scripts to track every single joint of the hand
- The possibility of getting the detection confidence for each hand
- Facilities to use your hands as input devices, pointing at elements with your fingers and pinching with your index finger and thumb to select them
- The possibility of detecting simple gestures (like OK, pointing, five fingers, etc.)
All in all, it seems pretty good; the only problem is that it is not very well documented yet. You can start with the quick start guide here, but then you have to learn by looking at the samples and by trial and error.
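To give you an idea of what using the SDK looks like, here is a minimal sketch of a Unity script that polls a hand joint every frame. Be warned: since the package is still in preview and poorly documented, the HandManager method and enum names below are my assumptions; verify the exact signatures against the demo scripts shipped with your plugin version.

```csharp
using UnityEngine;
using Wave.Essence.Hand; // namespace of the preview hand tracking package

// A minimal sketch of polling hand tracking data every frame.
// The HandManager calls below are assumptions based on the preview SDK:
// check the demo scenes for the real API before relying on them.
public class HandJointLogger : MonoBehaviour
{
    void Update()
    {
        Vector3 indexTip = Vector3.zero;

        // Hypothetical call: query the position of the left index fingertip
        if (HandManager.Instance != null &&
            HandManager.Instance.GetJointPosition(HandManager.HandJoint.Index_Tip, ref indexTip, true))
        {
            // Hypothetical call: detection confidence of the left hand (0..1)
            float confidence = HandManager.Instance.GetHandConfidence(true);
            Debug.Log($"Left index tip at {indexTip}, confidence {confidence:F2}");
        }
    }
}
```

Attach a script like this to any GameObject in a scene that already contains the hand tracking prefabs, and you should see the joint data flowing in the console.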
Build the hand tracking samples
To give the SDK a test, you can build the samples; in particular, you can start from these two scenes:
- Assets\Wave\Essence\Hand\Model\4.1.0-preview.5.1\Demo\HandModelTest.unity, which just shows your hands in a scene, without giving you the possibility of interacting with anything
- Assets\Wave\Essence\InputModule\4.1.0-preview.5.1\Demo\HandInput.unity, which shows you how to use your hands to interact with some UI elements as if they were two controllers.
Notice that, of course, the two “4.1.0-preview.5.1” directories above may have a different name for you, depending on the version of the plugin you have installed. If, when you open the scenes, you see a popup asking whether to add an Input Manager or something like that, click Cancel and ignore it.
Before building one of the scenes and testing it, don’t forget to do two things:
- Find the HandManager in the scene and, in the Inspector, check the “Initial Start Natural Hand” checkbox, so that hand tracking starts immediately when you launch the application
- Select the menu Wave -> HandTracking and check EnableHandTracking.
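If you prefer to control when tracking starts instead of ticking the “Initial Start Natural Hand” checkbox, you can also start the natural (camera-based) hand tracker from code. Again, the method and enum names in this sketch are assumptions based on the preview SDK: look at how the HandManager is used in the demo scenes for the exact API.

```csharp
using UnityEngine;
using Wave.Essence.Hand;

// A sketch of starting the natural hand tracker from a script instead of
// the Inspector checkbox. Method and enum names are assumptions based on
// the preview package: verify them in your installed plugin version.
public class StartHandTrackingOnLoad : MonoBehaviour
{
    void Start()
    {
        // Hypothetical call: ask the runtime to start camera-based hand tracking
        HandManager.Instance.StartHandTracker(HandManager.TrackerType.Natural);
    }

    void OnDestroy()
    {
        // Stop the tracker when the scene unloads to free tracking resources
        HandManager.Instance.StopHandTracker(HandManager.TrackerType.Natural);
    }
}
```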
After that, build the application and install it on the device. But be careful: if you run it now, it may not work anyway.
How to enable Hand Tracking on the Vive Focus 3
To use hand tracking on the Vive Focus 3, you have to enable it in the system settings.
While in the main lobby of the headset, select Settings -> Connectivity, then look on the right for the hand tracking entry and turn on the switch. After that, it is strongly recommended to reboot your headset (or turn it off and on again).
Cool! Now you can run the samples you have just built and experiment with hand tracking on your device!
Hand tracking on Vive Focus 3 – Review
So how does hand tracking perform on the Focus 3? It is night and day compared to the Vive Focus Plus. Thanks to the XR2 chipset, the system works well and is fast too, so hand tracking is finally usable.
The device detects the overall pose of the hands and also all five fingers. The pose of the fingers is somewhat accurate, but it is not perfectly adherent to your real pose. The pinky and ring fingers especially may have a wrong pose, while the thumb and index finger are usually almost accurate. I tried making my thumb touch each individual finger, and when it touched the ring finger or the pinky, the system detected it as if the last three fingers of the hand were closing together, which was wrong. The tracking quality degrades especially when there are self-occlusions in the hand. Occasionally one of the hands (usually my left one) lost tracking, but it recovered pretty fast.
Anyway, the tracking quality is reasonably good and for sure usable in some applications. The tracking is also quite robust: I tried punching the air, and the tracking of my hands was not lost, so it works even if the hands are moving a bit fast. It also worked across my whole field of view, even if at the periphery the tracking quality was a bit lower.
It doesn’t work at all with two hands together, though: like on the Quest, you must keep your two hands separate, or the tracking of one or both of them gets lost. But this is pretty common on standalone devices: until now, the only engine I’ve found good enough with both hands is the UltraLeap one.
All in all, I would say that the performance of the system is promising. The hands are good enough and already usable for some applications, but they could be better.
Hand Input on Vive Focus 3 – Review
HTC also offers some facilities to use your hands as controllers, through which you can point at objects and then select them by pinching. I have tried it, and I can say that even if it theoretically works, practically speaking, it needs a lot of polish.
This is especially because of how the pointing system works: it doesn’t point in the direction you intend most of the time, and it usually requires you to keep your hands raised, so it is pretty tiring. In comparison, on Oculus you can usually keep your hands in a more relaxed, lower position when pointing at objects on the screen. Pair all of this with the tracking glitches I have described above, and you get a system that is not very usable.
Anyway, as I have said, the SDK is still in preview, so this can be improved in a future update.
Vive Focus 3 vs Oculus Quest 2
I’ve tried both hand tracking engines side by side, and I can confirm that the Quest one is still better. Not only does its SDK look more refined, but its performance is higher. The tracking is slightly more reactive, and it is more precise, especially regarding the ring and pinky fingers. It is also more reliable: I didn’t have one of the hands occasionally losing tracking. The FOV of the two trackers, on the other hand, is comparable.
The Vive hand tracking works better in the punching scenario compared with the LOW-frequency tracking mode of the Quest 2, but it has comparable performance with the HIGH-frequency mode. Neither hand tracking engine works well with two hands interacting together.
So the Vive Focus 3 has decent tracking, but the Quest 2’s is better.
Final thoughts
This hand tracking package is a good start for HTC in the hand tracking realm on the Vive Focus 3. I really can’t wait for the SDK and the runtime to be improved, so that in one year I can evaluate how it compares to the Quest’s performance. I also hope that in the coming months, the Vive Focus 3 will support hand tracking in its system UI, so that users will be able to do simple operations without having to look for controllers every time. I think that in enterprise settings, this could come in very handy.
I hope you have liked this review, and if that is the case, please share it on all your social media channels, to inform as many people as possible in the community about hand tracking on the Focus 3!
Disclaimer: this blog contains advertisement and affiliate links to sustain itself. If you click on an affiliate link, I'll be very happy because I'll earn a small commission on your purchase. You can find my boring full disclosure here.