The Importance of Hand Tracking UX Design

Hand tracking on Quest. I predicted it would not be possible, and instead... wow!

Today I take a break from my posts about AWE to host a guest post by Dejan Gajsek, Content Marketing Manager at Circuit Stream. Dejan defines himself as “a huge marketing nerd” who loves finding out what makes companies great and why. On his LinkedIn profile, you can see that he has strong competencies in marketing, SEO, and growth hacking, but also a great passion for AR/VR, which is why I got to know him a long time ago.

Circuit Stream, the company he works for, offers a full platform of AR/VR training and learning services to help people master immersive technology. Established in 2015, Circuit Stream has helped over 40,000 people learn to design and develop virtual reality and augmented reality applications. They are currently offering deals for Black Friday, so if you are reading this article close to when I published it, you can check out their offers here.

The article Dejan has written is a big collection of tutorials, suggestions, and tips about developing applications that use hand tracking. It starts with how you can install Unity and get started with hand tracking development, and then proceeds with some suggestions about hand tracking UX. It is quite long and in some parts very detailed (like the section about how to install Unity), so my suggestion is to use the section titles to skim the article quickly, read the parts you are interested in, and skip the ones you already know about.

That said, have a nice read!


Your hand tracking UX design determines how your users will experience your application. Get it right, and you can rest assured you’ll delight your users.

In this guide, you’ll learn what design questions to ask when working on your applications, and you’ll discover the answers to those questions.

We’ll explore the limitations of hand tracking and its strengths.

In addition, we’ll set up your headset for both hand tracking and developer mode.

So, this article will help you learn:

  • How to set up hand tracking in Unity
  • The different uses for hand tracking in your VR projects
  • A quick guide to hand-tracking design for AR/VR with the Oculus Quest
  • When you should use controllers or hands
  • When and how to use levers
  • Other interesting facts about hand-tracking

If you’re looking to learn more about AR/VR design or development, check out our free XR workshops and courses at Circuit Stream.

First, let’s start by installing Unity and setting up hand tracking.


How to Set Up Hand Tracking in Unity 3D

If you’re going to do a lot of development, you should use Oculus Link through a USB 3.0 cable that supports data transfer. Also, install the Oculus desktop app and the Oculus Developer Hub on your computer.

What would a well-designed hand-tracking user experience look like? This video from Holonautic sheds light on what’s possible.

First, you need an editor to build your application. Let’s start there—let’s download Unity and set it up for hand-tracking.

Follow the guides here to set up your Unity account and turn on Oculus hand-tracking.

Install Unity Using this Guide

Follow these steps to set up or update Unity on your device.

Download Unity from the Unity website. Choose the Individual plan if you’re new to Unity, or the Pro plan if you work with a team.

Install Unity and Create Your Unity ID

Create your Unity ID: Enter a username, your email and create a unique password.

Make sure to use a non-business email address if you’re setting up a personal account. So, you’ll use Gmail, Yahoo, iCloud, or other public email services.

Check the consent and agreement boxes.

Wait for Unity to download to your device. And then click the Create Unity ID button.

Select a Microgame for your first Unity project.

Unity provides you with five microgame options.

Let’s look at each of them.

LEGO Microgame: Lets you build interactive projects with glossy LEGO bricks. It’s a Unity 3D development Microgame.

FPS: Stands for First Person Shooter and is a Unity 3D microgame. The FPS Microgame lets you modify and play first-person shooter games.

Karting: Karting is a Unity 3D Microgame that lets you accelerate your development with its kart, environments, and time-trial mechanic.

Platformer: Platformer is a 2D side-scrolling Microgame. It comes with readymade characters and simple enemies.

Empty 3D: As the name suggests, Empty 3D is a blank Unity 3D template that uses Unity’s 3D renderer. Empty 3D isn’t a complete Microgame but provides developers with a basic setup for a 3D scene.

Click Continue once you’ve named the project and selected a Microgame. First-time users will have to verify their age and agree to the terms of use.

Watch Introductory Video and Confirm Your Unity ID

Although not compulsory, I’d suggest you watch the welcome video—it helps put you in the right frame of mind when developing with Unity.

Once your download is complete, you should see the option to Launch Unity. Click it.

Make sure to confirm your Unity ID.

Turn on Android SDK and NDK Licenses

Sign in to your Unity Hub on your device. Then, turn on the Android build support.

Follow these steps:

Go to Installs, click the three vertical dots on the Unity installation you want to add Android to, and select Add Modules.

Check the Android Build Support checkbox, then read and agree to the terms and license agreements.

Click Done.

You won’t start developing just yet. Let’s consider some pre-Unity steps.

If you’ve installed the Unity app already and need to change your Unity version:

When installing a new version of Unity, turn on the Android options. With a few exceptions, Oculus headsets run on Android.

Go to Installs and then click Add.

Choose the Unity version you want to install.

Click Next and then check the Android box to turn on Android options.

Wait for the install to run.

You will see the Android symbol when you’ve installed Android fully.

Install the Oculus Integration Pack

Oculus Integration helps build hand tracking functions into your Unity applications. Follow the guide here to install this asset.

Go to Projects in your Unity Hub and select New.

Select a template. Name the project. Set the project location and click the Create button.

 Wait for the project to load.

When you’ve loaded the project, you should have a dashboard like the one below.

To get your hand-tracking working, go to the Asset Store.

Note that from Unity 2020.1 onward, you can only access the Asset Store in a browser.

However, you can access and use your purchased assets from the Package Manager: in the Asset Store tab inside Unity, you can scroll down to reach the Package Manager. If you have already used Oculus Integration in the past, you can find it there. Otherwise, if it is your first time, let’s get it from the Asset Store.

Search for the Oculus Integration Kit in the Asset Store.

The Oculus integration kit is free. It’s provided by Oculus and has a lot of features in it.

If you’ve never downloaded it before, click the “Add to My Assets” button; otherwise, just click “Import” to add the asset to your project.

If requested, sign in with your ID and verify with a verification code you’ll find in your inbox.

The installation will take a while to complete. Use that time to read the overview and package information.

Make sure to read and accept Unity’s terms of service, if requested.

When you’ve added Oculus Integration to your assets, you’ll see a pop-up with options to go to your assets or open Oculus Integration in Unity.

When you select Open in Unity, a dialog box pops up and fetches the packages.

However, you can also launch this asset from your Unity assets account online. Let’s see how to do it.

Go to your Unity Assets Store account.

From your account dropdown menu, click My Assets.

See the indications on the image below—click A and then B.

Alternatively, you can access your assets from the navigation menu of your store account when you log in.

On your assets page, you’ll see the Oculus Integration. Click Open in Unity.

A Unity Package Manager will open on your computer screen—if your Unity Hub is already open.

So, make sure to open and log in to your Unity Hub first.

Click on the Download button at the bottom right-hand corner of the Package Manager.

When the download finishes, click Import in the same spot of the dialog box to import the asset’s package into your Unity project.

Another dialog box will pop open. Click on Import again.

Wait for the process to run.

The process might take several minutes to complete.

You may receive a notification to update a plugin in an asset. This scenario is just a possibility; it may not happen when you set up your Oculus Integration in Unity.

The import process might trigger requests for other updates. Read the message on the dialog boxes and decide if accepting the update is suitable for your project.

However, most updates come with better features, and you can always switch to older versions of a plugin if you don’t find the update helpful.

When you make software changes—like updating to a newer or older version—you will have to restart Unity to initiate those changes.

So, I’ll click Restart.

I also accepted an updated spatializer plugin, so I saw a dialog box like the one below. I could choose to upgrade, keep the current version, or even delete the plugin.

But since I can always turn off this new plugin, I’ll accept the upgrade for now.

You’ll get a prompt to restart Unity. Click Restart and wait. In a few seconds, you’ll see a progress bar like the one below.

When the progress bar runs to the end, your Unity project will pop back on your screen. So, now you should see Oculus in your assets.

Now you’ve installed your Unity Hub and Oculus Integration. Next, let’s bring the hand prefabs and camera into your Unity scene. 

Bring the Hand Prefabs and Camera into Your Unity Scene

Under Assets, click Oculus. In the search box, enter “OVRCam” to find the OVRCameraRig.

Drag the OVRCameraRig into your scene. The OVRCameraRig is what lets you do VR. If it’s your first time bringing in the OVRCameraRig, it’ll change some settings in your Unity project.

Add hand tracking by adding the OVRHandPrefab. Under Assets, in Oculus, enter “OVRHand” in your search box.

Let’s add the OVRHandPrefab to the GameObjects that track the hands.

Open up the OVRCameraRig in your scene.

Open up the TrackingSpace in the hierarchy to show the RightHandAnchor and LeftHandAnchor.

Add the OVRHandPrefab to both the LeftHandAnchor and the RightHandAnchor.

So, you’d have these:

Select the OVRHandPrefab under the RightHandAnchor. In the Inspector, open the dropdown of the

  • Hand Type of the OVR Hand script
  • Skeleton Type of the OVR Skeleton script
  • Mesh Type of the OVR Mesh script

and select Hand Right in each case. Then repeat for the OVRHandPrefab under the LeftHandAnchor, selecting Hand Left instead.

So you should end up with something like this:

Now you can press play in Unity if you’ve set up your headset with a link.

Only press play if you’ve connected your link cable to your headset. 

If you’re still wondering, yes, your link is the cable connecting your headset to your computer.


Image credit: Oculus

The link helps you run the application on your computer instead of your headset.

Running Oculus Link to your headset is essential because you need the headset for hand tracking; you can’t use hand tracking with the editor alone, without the headset.

The Link sends the hand tracking data back from the headset, so the editor knows your hands’ positions.

The hand movement data over Link will not be as accurate as when the application runs directly on the headset, but it’s a valuable tool for building your application in Unity.
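Since the Link sends this hand data to the editor, once you press play you can also read the hands’ state from a script. The snippet below is only a minimal sketch, not code from this article: it assumes the OVRHand component from the Oculus Integration package (with its IsTracked and pinch APIs) sits on the same GameObject as the script, for example on the OVRHandPrefab under one of the hand anchors.

```csharp
// Minimal sketch: logs the pinch state of one tracked hand.
// Attach to the OVRHandPrefab under LeftHandAnchor or RightHandAnchor.
using UnityEngine;

public class HandStateLogger : MonoBehaviour
{
    private OVRHand hand;

    void Start()
    {
        hand = GetComponent<OVRHand>();
    }

    void Update()
    {
        // IsTracked is false when the hand is outside the cameras' view
        if (hand == null || !hand.IsTracked)
            return;

        // True while the index finger and thumb are pinched together
        bool isPinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);

        // 0..1 value, useful for analog-style input
        float pinchStrength = hand.GetFingerPinchStrength(OVRHand.HandFinger.Index);

        if (isPinching)
            Debug.Log($"Pinching with strength {pinchStrength:F2}");
    }
}
```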

Now, press play, and when you bring your hands into the field of view of your Oculus, you’ll see something like this:

Follow the guide below if you need some help setting up your Oculus Link.

Install the Oculus App to run on your computer.

In the headset, go to the settings; you’ll find the Oculus Link option there.

Once you plug the Oculus link into the headset and your computer, you’ll see a pop-up asking you if you want to run Oculus Link.

Select “Run Oculus Link” in the settings of the headset. 

The idea is that you should be able to run Oculus and build applications at the same time: you build applications on your local desktop and stream them to your headset.

Secondly, the Link lets you use Rift apps inside the headset, if you have any.

Using Physics Colliders

For experienced programmers who are trying the Oculus Integration SDK for the first time, we’ll explore how to use the physics colliders that come with it.

The physics colliders have a twist, though: if you use the default settings, they won’t operate smoothly, because of a timestep issue.

Follow these steps to set up your colliders:

Select either hand. You can also select both hands simultaneously—hold down the Control or Command key to select multiple items in the hierarchy.

For simplicity, let’s select them individually, starting with the right hand.

Click the OVRHandPrefab under either hand anchor and check the Physics Capsules box (on the OVR Skeleton script) and the OVR Skeleton Renderer box.

So you’ll see how the physics capsules would look relative to the shape of your hands.

This setup shows the actual collider capsules for your hands while the scene is running.

Change the Timestep If You Use an Older Version of Unity

Remember to change the physics timestep if you use an older version of Unity. In the Project Settings, go to the Time section.

Set the Fixed Timestep to 0.005 or smaller, and set the Maximum Allowed Timestep to 0.005.

This timestep issue is fixed in Unity version 2020.3.19f1 or newer.

The timestep values to change

These values won’t noticeably change anything else in your project.

The point is that you have to change them from the Unity defaults in older versions because, otherwise, a bug can sometimes prevent the hands from being tracked.

You can sidestep this issue by installing the latest version of Unity.
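If you prefer to set these values from code rather than from the Project Settings window, here is a minimal sketch using the values above (the component name is illustrative; only needed on the older Unity versions affected by the bug).

```csharp
// Minimal sketch: applies the timestep values from the article at startup.
// Attach to any GameObject in the scene.
using UnityEngine;

public class PhysicsTimestepFix : MonoBehaviour
{
    void Awake()
    {
        Time.fixedDeltaTime = 0.005f;    // Fixed Timestep: 0.005 s (200 Hz physics)
        Time.maximumDeltaTime = 0.005f;  // Maximum Allowed Timestep: 0.005 s
    }
}
```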

Now, let’s run the scene.

Run the Scene

Put something on your scene to interact with.

Right-click in the Hierarchy, select 3D Object, and choose any of the objects you see in the menu.

Let’s say I chose the first item on the list, a cube.

We can use the options in the Inspector to make any changes we want to the object, in this case the cube.

For example, you can use Scale to change the cube’s dimensions, and scroll to the bottom to add a Rigidbody component so you can physically interact with the object you selected.

Once you’ve made the changes to your heart’s content, run the interaction with the object with your headset.

 So click Play in your Unity dashboard, and you can interact with the object you created.
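As an alternative to the Editor steps above, you could also create the same kind of test object from a small script. This is only an illustrative sketch, with arbitrary position, scale, and mass values.

```csharp
// Minimal sketch: spawns a small cube with a Rigidbody in front of the user
// when the scene starts, so the hand physics capsules can push it around.
using UnityEngine;

public class TestCubeSpawner : MonoBehaviour
{
    void Start()
    {
        GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.transform.position = new Vector3(0f, 1.2f, 0.4f); // roughly in front of the user
        cube.transform.localScale = Vector3.one * 0.15f;       // small enough to reach

        // The Rigidbody lets the hand colliders interact with the cube
        Rigidbody body = cube.AddComponent<Rigidbody>();
        body.mass = 0.5f;
    }
}
```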

When to Use Controllers and When to Use Hands

We’ll consider the differences between using Quest controllers and hand tracking, what those differences mean, and what questions you should be asking when building applications.

Controllers Versus Hand-Tracking

In this section, we’ll make a quick side-by-side comparison of hand-tracking and controllers. The table provides you with a summary.

Characteristic      | Controllers | Hand Tracking
--------------------|-------------|--------------
Precision           | High        | Low
Latency             | Low         | High
Input possibilities | Low         | High
Presence            | Low         | High
Paradigm            | Known       | New
Approachability     | Low         | High
Batteries and wires | Present     | Eliminated
Experience          | Good        | Great

Now, let’s go a little deeper—what do these differences mean? And what design implications do they have?

Hand-Tracking and Controllers Differences

Controllers have higher precision than hand tracking because hand tracking relies on approximate hand positions, and it’s not perfect. Controllers, on the other hand, can be positioned with more accurate mathematical calculations.

Because of the extra calculations hand tracking requires, it also has higher latency than controllers.

On the other hand, controllers have input limitations that hand-tracking doesn’t have; your design is limited to the hardware manufacturer’s buttons.

Hand-tracking design allows you to make anything a button, so you have more creative freedom. Also, hand-tracking allows you to create interactions that feel more present.

While controllers are pretty popular already, hand-tracking is a new paradigm.

If you build applications for lots of companies and enterprises, keep in mind that many of their users might never have used VR in their lives, so they won’t know how to use controllers.

Those users just want to come to their meetings and put on their headsets for some training or teaching experience. Moreover, these people know their hands better than they’ll ever know how to use controllers.

Typically, when people use VR, they don’t look at their hands. So you might have to prompt them to look up or bring their hands up into the scene.

Questions When Building Applications: Decide the Input to Use

Precision and latency are the main factors in VR/AR hand-tracking user experience. Although other factors matter, your user experience will depend on these two factors.

Whether you’re working on VR or AR projects, ask yourself these input-side questions:

  • How complex are your interactions?
  • What is the audience of your application?
  • What is the risk of failure?
  • How do you plan to scale the application?
  • How big and diverse is your QA team?

Let’s address each of them.

How Complex are Your Interactions?

One of the biggest limitations of hand-tracking is low precision. So, you can’t do lots of very complex interactions yet. In the future, hand-tracking will accommodate better precision. But for now, it has limitations.

So if you’re creating a Konami code inside of your application, then you’ll do better with controllers.

Secondly, you have latency.

If you have fast and complex interactions, then latency might get in the way of using hand-tracking.

What’s the Audience of Your Application?

If you’re building for first-time VR users, then either controllers or hand-tracking would do well.

However, experienced users who have used VR for a long time are going to prefer controllers—you’d save them the learning curve.

What is the Risk of Failure?

One of the biggest questions to address is risk tolerance.

If your application involves running in a field, the risk of using hand-tracking is significantly higher.

You can’t predict all the scenarios, and everyone has different hands. However, since it doesn’t differentiate skin colors and other unique attributes, the hands would look alike.

Moreover, if a user’s hands have any physical particularities, the tracking may miss them.

If your application has a low tolerance for failure—you’re not just playing around or building something where the input can be redone on a dime—stay with controllers.

How Do You Plan to Scale the Application?

This question favors hand tracking, especially if you’re scaling your application for a large company and rolling it out across something like 500 devices. Hand tracking might be the better way to go—you won’t need controllers or someone on-site to charge batteries or do controller maintenance.

If you plan to put your application on the Oculus store, plan for controllers.

How Big and Diverse is Your QA Team?

Hand-tracking needs a lot of user testing, especially because you’re building for different touchpoints. So you’ll need to be sure it works well for every user.

The more complex your application, the larger the user testing group you’ll need to make sure everything works as you’d expect.

Hand-Tracking Design for Oculus Quest AR/VR


Image Credit: Ultraleap / Gifer

Hand tracking has gone through a series of evolutions.

So you can think of a hand tracking continuum based on those changes over time.

Dimensions of Hand Tracking

The hand tracking continuum goes from one-dimensional hand tracking to three-dimensional hand tracking.

One-dimensional hand tracking happens with hardware that tracks the up-down movement of a button. On the other hand, two-dimensional hand tracking involves multipoint contact—think of phones and computer mice.

And then you have the three-dimensional hand tracking, which works on Oculus Quest today.

Besides these tracking dimensions, you have different levels of tracking. For example, you can have single-point interactions, like tracking the hand motion alone. Alternatively, you can have multiple-point interactions where you can track different parts of the hand.

For example, you have touch screens where you can interact with single-finger motions or multi-point touch.

Thirdly, there’s also a difference between the hardware we used to interact with and the hands-free tracking we use today. And this hardware-to-hands-free spectrum has different scales too.

Characteristics of Quest Hand Tracking

Let’s look closely at the characteristics of hand-tracking in the Quest.

3D Tracking


Image Credit: Giphy

Yes, the Quest supports 3D tracking, but it’s not free-range 3D tracking because you’re limited by what the Quest can see.

The Quest has four cameras on it, so it can only track a certain space in front of you.

Hands-Free


Image Credit: Giphy

The Quest is completely hands-free, though you can still choose to use the Quest controllers. However, you don’t need controllers to interact—you can do pretty much everything hands-free.

Multi-Point Bone Position


Image credit: Tenor

The Quest is not truly multi-point for the whole of your hand. It uses approximate scaling for its multi-point function, inferring where the bones are inside your hands.

In other words, the Quest approximates the scale of your hands, infers where your bones are, and estimates your hands’ positions based on those approximations.

It’s not directly tracking every bone in your hands—it just uses them to approximate your hand’s pose at any given moment.
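If you want to see those approximations yourself, a minimal sketch like the following (an assumption on my side, not code from the article) reads the inferred bone transforms from the OVRSkeleton component on the hand prefab.

```csharp
// Minimal sketch: draws a small ray from each inferred bone of a tracked hand.
// Attach to the OVRHandPrefab (which also carries OVRSkeleton).
using UnityEngine;

public class BoneReader : MonoBehaviour
{
    private OVRSkeleton skeleton;

    void Start()
    {
        skeleton = GetComponent<OVRSkeleton>();
    }

    void Update()
    {
        if (skeleton == null || !skeleton.IsDataValid)
            return;

        foreach (OVRBone bone in skeleton.Bones)
        {
            // Each bone exposes an Id and a Transform with the estimated pose
            Debug.DrawRay(bone.Transform.position, bone.Transform.forward * 0.01f);
        }
    }
}
```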

Ignores Skin Characteristics

Quest ignores your skin texture, blemishes, color, and all other things related to your skin. It purely tries to calculate your hand’s positions.

Not Accommodating for People with Special Hands

If you have lost a finger, have more than five fingers per hand, have webbed hands, or have other special characteristics, the Quest is not very accommodating of those situations.

Hand Representations


Image credit: gfycat

Since the Quest will not track the specific characteristics of hands, you can use an avatar to represent your users’ hands.

User experiences vary based on the type of avatar you use. The closer you get to real, human hands, the more comfortable people will feel with the experience.

However, if the hands look too real, they can reach the point where they feel eerie, and users might react negatively to them.

Gestures: Endless Possibilities


Image credit: Aldin Interactive / gfycat

Hand-tracking gives you lots of opportunities to introduce gestures as a form of interaction. And people familiar with sign languages can also create applications that work based on just hand gestures.

You can even create a sign language app.

Gesture Surprises

You might experience some surprises in gesture recognition:

  • Accidental gestures
  • Complex gestures
  • Meaningful gestures

Accidental gestures could happen when people are communicating, and they make a gesture that accidentally triggers a response.

Some hand gestures are difficult for some people to make. If your design includes those gestures, then your users might run into issues. Test all of your gestures to make sure they accommodate all of your user groups.

If you plan to reach an international audience, keep in mind that some gestures have certain cultural meanings.


Image credit: gfycat

So, ask your people, friends, and community members about their experiences to find out what other cultures and countries think about your gestures.
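On the accidental-gesture problem specifically, one common mitigation is to require the gesture to be held for a short dwell time before it triggers anything. The following is only an illustrative sketch, assuming the OVRHand pinch API; the dwell value and names are hypothetical.

```csharp
// Minimal sketch: a pinch only fires the action after being held for a
// short dwell time, which filters out accidental or fleeting gestures.
using UnityEngine;
using UnityEngine.Events;

public class DwellPinchTrigger : MonoBehaviour
{
    public OVRHand hand;               // drag the hand prefab's OVRHand here
    public float dwellSeconds = 0.4f;  // how long the pinch must be held
    public UnityEvent onConfirmed;     // hook your action up in the Inspector

    private float heldFor;
    private bool fired;

    void Update()
    {
        bool pinching = hand != null
            && hand.IsTracked
            && hand.GetFingerIsPinching(OVRHand.HandFinger.Index);

        if (!pinching)
        {
            heldFor = 0f;
            fired = false;  // releasing the pinch re-arms the trigger
            return;
        }

        heldFor += Time.deltaTime;
        if (!fired && heldFor >= dwellSeconds)
        {
            fired = true;
            onConfirmed?.Invoke();
        }
    }
}
```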

Tracking Volume: Proximity


Image credit: Ultraleap / gfycat

The image above is a Leap Motion hand-tracking demo. Unfortunately, however, Quest has a problem that prevents it from having such accurate tracking.

Leap Motion delivers this quality hand-tracking because of its camera positions—it has a dual-front-camera setup so that it can track the hands quite well.

On the Quest, however, the cameras are not set up the same way. At the moment, Meta suggests that you don’t attach your UI to one hand, because the hands will interfere with each other when they get too close while interacting with one another.

If you’re going to develop anything that involves the two hands interacting with each other, keep it quick and straightforward. Oculus even says to design these interactions with avoidance in mind.

Tracking Volume: Visible Area

If you need someone to reach down and grab something from their tool belt, they’ll have to look down to grab it.

Your headset isn’t going to automatically know your hands’ positions. So you always need to look in the direction of your hands.

Take that into consideration:

Pretty much everything that’s not in front of the user will probably need some design cues to draw their attention to it, or should just be removed from your application.
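One way to implement such a cue, purely as an illustrative sketch with hypothetical object names, is to check how far the hand anchor is from the center of the user’s view and show a hint when it is probably outside the tracked area.

```csharp
// Minimal sketch: shows a "bring your hands up" hint when the hand anchor
// is far from the center of the user's view (and so likely untracked).
using UnityEngine;

public class HandVisibilityHint : MonoBehaviour
{
    public Transform centerEye;   // e.g. OVRCameraRig/TrackingSpace/CenterEyeAnchor
    public Transform handAnchor;  // e.g. .../LeftHandAnchor
    public GameObject hintUI;     // hypothetical prompt object

    [Range(0f, 1f)] public float viewThreshold = 0.6f;

    void Update()
    {
        Vector3 toHand = (handAnchor.position - centerEye.position).normalized;
        float alignment = Vector3.Dot(centerEye.forward, toHand);

        // Below the threshold, the hand is probably outside the cameras' view
        hintUI.SetActive(alignment < viewThreshold);
    }
}
```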

Tracking Volume: Precision

Let’s discuss how you can get around precision issues. Hand interactions can be hit or miss because of poor precision.

For example, you may have challenges playing the drums or doing other activities.

So you can transform the space using levers, decreasing your movement resolution and making it possible for people to interact with smaller items.


Image Source: ultraleap.com

Using levers is a way to transform small actions into big changes.
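The idea boils down to a scaling factor between the hand’s movement and the controlled object’s movement. The following is a minimal, illustrative sketch (the gain value and object names are hypothetical, not from the article).

```csharp
// Minimal sketch: the controlled object moves by a scaled copy of the
// hand's movement. A gain below 1 trades range for precision; a gain
// above 1 turns small hand motions into big changes, like a lever.
using UnityEngine;

public class ScaledHandControl : MonoBehaviour
{
    public Transform handAnchor;       // e.g. RightHandAnchor
    public Transform controlledObject; // the thing being manipulated
    public float gain = 0.25f;         // < 1 = more precision, > 1 = amplification

    private Vector3 lastHandPosition;

    void Start()
    {
        lastHandPosition = handAnchor.position;
    }

    void Update()
    {
        Vector3 delta = handAnchor.position - lastHandPosition;
        controlledObject.position += delta * gain;
        lastHandPosition = handAnchor.position;
    }
}
```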

Comfort and Zone Range

The rule of thumb here is that what’s comfortable for you would probably be comfortable for your users.

Your designs should keep your users’ elbows at a relaxed angle of about 90 degrees, and their hands close to their bodies.

Secondly, there’s the zone range.

Keep most of the interactions right in front of the user. You have the primary, secondary, and tertiary zones.

At a 460mm radius, the primary zone is closest to the user and is best for frequent reaches. The tertiary zone, for occasional reaches, should be at about a 700mm radius.

Infrequent reaches happen at the secondary zone, which should be at about 600mm radius from the user.

These numbers represent the average human size.

Also, do not scatter interactions all over the place. Users should not have to constantly move their hands, and long interactions do not work well.
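If you want to sanity-check your layout against these zones in code, here is a minimal sketch using the radii above (converted to meters; the helper and enum names are hypothetical).

```csharp
// Minimal sketch: classifies an interactable into the reach zones from the
// article, based on its distance from the user's head.
using UnityEngine;

public enum ReachZone { Primary, Secondary, Tertiary, OutOfReach }

public static class ReachZones
{
    const float PrimaryRadius = 0.46f;   // 460 mm: frequent reaches
    const float SecondaryRadius = 0.60f; // 600 mm: infrequent reaches
    const float TertiaryRadius = 0.70f;  // 700 mm: occasional reaches

    public static ReachZone Classify(Transform head, Transform item)
    {
        float distance = Vector3.Distance(head.position, item.position);
        if (distance <= PrimaryRadius) return ReachZone.Primary;
        if (distance <= SecondaryRadius) return ReachZone.Secondary;
        if (distance <= TertiaryRadius) return ReachZone.Tertiary;
        return ReachZone.OutOfReach;
    }
}
```

You could call ReachZones.Classify with the CenterEyeAnchor and an interactable’s transform while laying out your scene to see which zone each element falls into.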

Oculus Store Technical Requirements

Oculus says it will only accept applications that support hand tracking into the Oculus Store if they satisfy its additional technical hand-tracking requirements. For example, Oculus requires your app to support both controllers and hand tracking when you submit it to the Oculus Store.

However, this rule doesn’t apply if you’re using Oculus for your business or deploying the app internally in your company.
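A simple way to support both inputs is to switch your interaction scripts based on whether the hands are currently being tracked. This is only an illustrative sketch, assuming the OVRHand component and its IsTracked property; your own controller and hand interaction logic would go where the comment is.

```csharp
// Minimal sketch: toggles between hand-based and controller-based
// interactions depending on whether either hand is currently tracked.
using UnityEngine;

public class InputModeSwitcher : MonoBehaviour
{
    public OVRHand leftHand;   // drag the left OVRHandPrefab's OVRHand here
    public OVRHand rightHand;  // drag the right one here

    public bool UsingHands { get; private set; }

    void Update()
    {
        bool handsTracked =
            (leftHand != null && leftHand.IsTracked) ||
            (rightHand != null && rightHand.IsTracked);

        if (handsTracked != UsingHands)
        {
            UsingHands = handsTracked;
            // Enable your hand-based interactions here and disable the
            // controller-based ones (or vice versa).
            Debug.Log(UsingHands ? "Switched to hand tracking" : "Switched to controllers");
        }
    }
}
```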

Hand Tracking is the Future of XR

Over the coming years, you can expect hand tracking to become the norm in extended reality—virtual, augmented, and mixed realities.

Hand tracking offers near-infinite interaction possibilities, freeing designers to be as creative as possible. In addition, it’s going to feel more natural to first-time VR/AR users, as they can carry their natural hand gestures over into the virtual environment.

Check out a free video hand tracking workshop if you’re looking for the best place to start. Among other things, the course helps you learn the limitations of hand tracking and how to circumvent them.

You’ll walk away equipped with knowledge on debugging hand tracking in your applications, working with various interaction models, and understanding how hand UI works.

(Header image by Meta)
