
Oculus Quest hand tracking review: an impressive first step towards natural UX

Last week, Oculus rolled out runtime v12 for the Oculus Quest, which brought various features to the standalone headset, and among them there is hand tracking support.

I was very curious to try it, and yesterday, as soon as I could, I gave it a look. Do you want to know my hands-on (pun intended) impressions? Then follow me!

Oculus Quest hand tracking video review

If you like YouTube videos, I made a very nice one showcasing hand tracking in action, with my soothing voice commenting on it. In it, I show you:

  • Hand tracking working in the Oculus UI;
  • A stress test of hand tracking;
  • The accuracy of Quest hand tracking, shown by superimposing the real and virtual hands with a smart trick.

I advise you to watch it, especially for the third part. You can find it below.

Oculus Quest hand tracking textual review

If you prefer reading a textual description, here you are.

How does hand tracking work?

To activate hand tracking on the Quest, you have to update it to runtime v12 and then enable hand tracking in the Experimental Features settings menu.

Once you activate it, your controllers disappear, and Oculus provides you with a handy tutorial that teaches you how to use the feature. Basically, at the moment you can use your hands in the following ways:

  • By moving your hand in the menu, you see a pointer that represents where the hand is pointing. Notice that the pointer doesn’t follow your index finger: it moves together with the whole hand, following the direction of your palm;
  • If you pinch by letting your index finger and thumb touch, you perform a click, activating the button the pointer was pointing at;
  • If you keep pinching and move your hand, you don’t click, but scroll inside a menu;
  • If you position your hand with the palm up and pinch in this position, you emulate pressing the Oculus button.
https://gfycat.com/farflungindolentchevrotain-virtual-reality-hands-tracking-oculus-quest
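To make the pinch mechanics above concrete, here is a minimal sketch of how a pinch click can be detected from the distance between the thumb and index fingertips. This is plain Python with made-up names and thresholds, not the actual Oculus SDK API; the hysteresis (a tight threshold to start the pinch, a looser one to release it) is a common trick to stop the click from flickering on and off:

```python
from dataclasses import dataclass
import math

@dataclass
class Vec3:
    """A fingertip position in meters, as a tracking runtime might report it."""
    x: float
    y: float
    z: float

def distance(a: Vec3, b: Vec3) -> float:
    return math.sqrt((a.x - b.x) ** 2 + (a.y - b.y) ** 2 + (a.z - b.z) ** 2)

class PinchDetector:
    """Detects a pinch with hysteresis: the pinch starts when the fingertips
    get closer than start_dist, and only ends once they separate past the
    looser end_dist, so tracking jitter doesn't toggle the click."""

    def __init__(self, start_dist: float = 0.02, end_dist: float = 0.04):
        self.start_dist = start_dist  # 2 cm: fingers touched, "click" begins
        self.end_dist = end_dist      # 4 cm: fingers separated, "click" released
        self.pinching = False

    def update(self, thumb_tip: Vec3, index_tip: Vec3) -> bool:
        d = distance(thumb_tip, index_tip)
        if not self.pinching and d < self.start_dist:
            self.pinching = True
        elif self.pinching and d > self.end_dist:
            self.pinching = False
        return self.pinching
```

A real implementation would also have to distinguish a quick pinch-and-release (a click) from a held pinch plus hand movement (a scroll), exactly as the Quest UI does.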

Notice that while you use hand tracking, you can’t use the controllers at all. And if you try to launch an application requiring controllers (e.g. Beat Saber), a pop-up asks you if you want to disable hand tracking and re-enable the controllers to use it.

At the moment, there are no third-party apps using hand tracking, because the SDK was released only yesterday, so you can use it only inside the Oculus UI.

Hand tracking UX impressions

A full mesh of virtual hands superimposed on the video stream of the user’s hands, showcased at Oculus Connect 6

The hand tracking controls provided by Oculus remind me a lot of the ones of HoloLens v1 (do you remember the air-tap gesture?), just working much better. The detection of the pinch-and-drag gesture works almost every time and is tolerant of different ways of performing the action. The commands are also very intuitive: clicking by making the thumb touch the index finger recalls the action of clicking something, and it also gives you a sense of touch when you click, because your fingers are actually touching. I liked it a lot: it is natural, intuitive, and effective.

There are only three drawbacks:

  • The cursor doesn’t always point in the exact direction you are aiming with your hand. Sometimes it is in a slightly different position; sometimes it is slow in following your hand’s movement because it is smoothed too much. This can be a bit annoying;
  • The virtual hand lags a bit behind the real hand, and the delay is perceivable (according to Oculus, it is around 80 ms);
  • Using hand tracking while keeping your hands in front of the headset is a bit tiresome. I wouldn’t do it for hours every day. It’s perfect for short interactions (like launching a movie), but not for long sessions.
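The cursor behavior in the first drawback is a classic smoothing-versus-latency trade-off. A hypothetical illustration in plain Python (not the actual Oculus implementation) of simple exponential smoothing shows why a heavily smoothed cursor trails the hand:

```python
def smooth(previous: float, measured: float, alpha: float) -> float:
    """One step of exponential smoothing: alpha near 1 follows the raw
    measurement closely (jittery but responsive); alpha near 0 smooths
    heavily (stable but laggy)."""
    return alpha * measured + (1.0 - alpha) * previous

# The raw signal jumps from 0 to 1 (the hand moves): after the same
# number of frames, the heavily smoothed value is still far behind.
responsive, laggy = 0.0, 0.0
for _ in range(5):
    responsive = smooth(responsive, 1.0, alpha=0.8)
    laggy = smooth(laggy, 1.0, alpha=0.2)
# responsive ≈ 0.9997, laggy ≈ 0.672 after five frames
```

Tuning alpha is exactly the dilemma described above: more smoothing hides tracking jitter but makes the cursor feel slow.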

Hand tracking accuracy

I stress-tested the hand tracking of the Quest, trying many different conditions. Oculus shows a virtual hand for each hand that it detects reliably, and makes a hand disappear when it can’t understand its pose.

https://gfycat.com/legitimatelightbushsqueaker-virtual-reality-hands-tracking-oculus-quest
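The show-or-hide behavior just described can be sketched as a simple confidence gate. This is an illustrative guess at the logic, not Oculus code: the per-frame confidence value, the threshold, and the re-appearance delay are all my own assumptions. Hiding instantly but reappearing only after a few good frames keeps the hand from flickering at the edge of trackability:

```python
class HandVisibility:
    """Hides the virtual hand the moment tracking confidence drops, and
    shows it again only after confidence has stayed high for a few
    consecutive frames."""

    def __init__(self, min_confidence: float = 0.5, frames_to_reappear: int = 3):
        self.min_confidence = min_confidence
        self.frames_to_reappear = frames_to_reappear
        self.visible = False
        self._good_frames = 0

    def update(self, confidence: float) -> bool:
        if confidence < self.min_confidence:
            # Lost the pose: hide immediately and reset the streak.
            self.visible = False
            self._good_frames = 0
        else:
            self._good_frames += 1
            if self._good_frames >= self.frames_to_reappear:
                self.visible = True
        return self.visible
```

The asymmetry (hide at once, reappear with a delay) matches what you see in the headset: a hand vanishes as soon as the pose becomes ambiguous and pops back only once tracking is solid again.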

The detected hand shape is impressively accurate: the system can detect the movement of every single finger, even the slightest one, like a finger bending a bit. In my opinion, the tracking is in line with, if not slightly better than, the one of Leap Motion. The detection fails only if you try some very weird poses, like crossing your fingers, or if there are big occlusions, like when you make a fist and raise your pinky finger a bit. As I said before, there’s only a slight lag in the tracking, which is noticeable and breaks the magic, but the accuracy is impressive.

The performance with two hands is a different story. As long as the hands are separate, everything works. But when they touch, or when one is in front of the other, everything comes to a catastrophic end and both hands disappear. This is the biggest limitation of the current hand tracking solution: as humans, we use bimanual interactions a lot, and the fact that you can’t perform any with this framework is highly disappointing. Leap Motion still has the lead here.

A final note on fast movements: the system tracks the hands even if they move fairly fast, but if you move too fast, like when you play a boxing game, the tracking gets lost. The good news is that it recovers very quickly, so maybe in future iterations this will be possible too.

Oculus showcasing future updates that will improve the performance of the hand tracking system

Final impressions

This first version of hand tracking on the Oculus Quest is impressive. Facebook has managed to perform some black magic and make such an algorithm run on a mobile processor. The accuracy is not perfect, but it is very good, and the only problems are a slight tracking lag and the fact that there’s no way to use two hands together.

For the use that Oculus suggests now, that is, interacting with the UI, it is already very good. But remember that using your hands is more tiresome than using the controllers.

To have fully interactive hand-tracked 3D applications, we have to wait for an update that enables the reliable tracking of both hands together. But if this first version is already so good, I’m very confident that the second or third one will enable that.

(Header image by Oculus)


Disclaimer: this blog contains advertisement and affiliate links to sustain itself. If you click on an affiliate link, I'll be very happy because I'll earn a small commission on your purchase. You can find my boring full disclosure here.
