Getting started with Oculus Touch and Avatar SDK in Unity [Updated]

Welcome to 2017! I wish you all an astonishing year!

In 2016 I left you with this video showcasing my first experiments with Oculus Touch and the Avatar SDK, in combination with our ImmotionRoom full body system. As you can see in the video, there are female full body avatars made with ImmotionRoom (so, basically, made using a Microsoft Kinect) and epic-bearded blue avatars made using the Oculus Avatar SDK. Then there is some firing, because firing in VR is always cool!

This demo took me more time than expected, due to some problems I had in using the Avatar SDK. That’s why I’m writing this article: to help you develop with this SDK while wasting less time than I did. It will not be a step-by-step tutorial, but I’ll give you some detailed hints. So, ready… go!

[Note: this is an updated version of the original article that takes into account the evolution of the Oculus SDK… so if you’re returning here after a long time and you notice some differences, well, it’s perfectly normal]

Oculus Integration download

To develop for the Oculus platform inside Unity you have to download the Oculus Unity plugins. This tutorial will have you download all the following packages, which are necessary to make Oculus Touch and Oculus Avatars work properly:

  • Oculus Utilities for Unity 5
  • Oculus Avatar SDK
  • Oculus Platform SDK

Facebook has recently released an overall package including all these libraries in a single comfortable place, which you can download from the Unity Asset Store. Just import the Oculus Integration package into your project and voilà, you have all the dependencies you need out of the box.

The upcoming instructions won’t use this facility and will have you download every single package separately, for educational purposes. My personal advice is to follow the tutorial as-is to learn how things work and then use the overall Oculus Integration in all your future projects.

Oculus Touch integration

Oculus Touch integration took me no time. Really, it already works out of the box and this is amazing. Just open your Unity project, import the Oculus Utilities for Unity 5 (you can find them here), remove the Main Camera gameobject, drag and drop an OVRCameraRig or OVRPlayerController into your scene and that’s it! When you run the game, you will just see standard virtual reality without Oculus Touch and so you will think that I’m completely insane, but trust me… you already have Touch integration!

If you look at the OVRCameraRig prefab more closely, you will see that its grandchildren LeftHandAnchor and RightHandAnchor already move according to your Oculus Touch poses. So, for example, if you attach a sphere to your LeftHandAnchor transform, that sphere will follow your left hand exactly, as you can see here:

Even without other plugins, I managed to have a nice sphere attached to my hand… cool, isn’t it? I just attached the sphere to the hand anchor and everything worked like a charm (click to zoom image on a new tab)
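By the way, if you prefer doing this from code instead of dragging things around in the editor, a minimal sketch could look like this (the class name is mine, and it assumes the default OVRCameraRig hierarchy with a child called LeftHandAnchor):

using UnityEngine;

public class HandSphere : MonoBehaviour
{
    void Start()
    {
        // Find the anchor in the default OVRCameraRig hierarchy
        GameObject anchor = GameObject.Find("LeftHandAnchor");
        GameObject sphere = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        sphere.transform.localScale = Vector3.one * 0.1f; // a 10 cm sphere
        // 'false' keeps the local pose, so the sphere sits exactly on the anchor
        sphere.transform.SetParent(anchor.transform, false);
    }
}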

If, when you press Play, you can’t see your app running properly in VR, it is because the latest version of Oculus Utilities often doesn’t check the Virtual Reality Supported flag. So go to Edit -> Project Settings -> Player and make sure that the “Virtual Reality Supported” flag is checked and that the Oculus SDK has been selected.
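If you want a reminder at runtime, here is a tiny sanity-check sketch you could attach to any gameobject (this assumes the Unity 5.x UnityEngine.VR namespace; in Unity 2017.2+ it was renamed to UnityEngine.XR):

using UnityEngine;
using UnityEngine.VR;

public class VRCheck : MonoBehaviour
{
    void Start()
    {
        // If VR isn't active, remind yourself about the Player Settings flag
        if (!VRSettings.enabled)
            Debug.LogWarning("VR not active: check the Virtual Reality Supported flag in Player Settings");
    }
}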

Ok, now you have the Oculus Touch reference frames, but… how about input? Super easy: you use the OVRInput class. OVRInput is very simple to use and it is also very well documented at this link: I strongly advise you to read the docs! (RTFM is always the right advice, you know.) Basically, all you have to do is query OVRInput about the status of some Touch triggers or thumbsticks and then perform some action. In the above video, at 0:07, I move the avatars using the Touch thumbsticks. The underlying code is super-easy: I attached a script to the avatar with this:

void Update () 
{
    // Read the left thumbstick (x and y in [-1, 1]) and scale by frame time
    Vector2 touchAxis = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick) * Time.deltaTime;
    transform.position += new Vector3(touchAxis.x, 0, touchAxis.y);
}

Basically, I ask OVRInput for the status of the primary thumbstick (that is, the left Touch thumbstick) and move the avatar object accordingly.
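OVRInput offers several query flavours beyond the thumbstick; here are a few examples from the documented API that you might find handy:

// True only on the frame the A button goes down
bool aPressed = OVRInput.GetDown(OVRInput.Button.One);

// Analog value of the left index trigger, in [0, 1]
float trigger = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger);

// You can also query a specific controller explicitly
bool rightGrip = OVRInput.Get(OVRInput.Button.PrimaryHandTrigger, OVRInput.Controller.RTouch);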

How about the Kalashnikov? Easy as well: first of all, I attached it to the RightHandAnchor transform, so that it appears in the right hand. Then, in its Update function, I wrote:

if (OVRInput.Get(OVRInput.Button.SecondaryIndexTrigger) && m_deadTime <= 0)
{
    // Spawn a bullet at the muzzle (the "FireStart" child) and push it forward
    GameObject bullet = Instantiate<GameObject>(Bullet);
    GameObject fireStart = transform.FindChild("FireStart").gameObject;
    bullet.transform.position = fireStart.transform.position;
    bullet.transform.rotation *= fireStart.transform.rotation;
    bullet.GetComponent<Rigidbody>().AddForce(fireStart.transform.right * 2.85f, ForceMode.Impulse);
    GetComponent<AudioSource>().Play();
    m_deadTime = DeadTime; // restart the cooldown between shots
}

Here Secondary Index Trigger is the trigger that I push with my right index finger… as if I were firing. If I push it, I just fire a bullet and play a firing sound. It sounds amazing. Nothing difficult, nothing worth giving particular advice about.
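For completeness, the snippet above relies on a few fields and a cooldown decrement that I haven’t shown; my assumption of how they look is something like this:

public GameObject Bullet;      // bullet prefab, assigned in the inspector
public float DeadTime = 0.1f;  // minimum time between two shots, in seconds
private float m_deadTime;      // time left before the gun can fire again

void Update()
{
    m_deadTime -= Time.deltaTime; // tick the cooldown down every frame

    // ...the firing code shown above goes here...
}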

Controller buttons mappings inside OVRInput (Image by Oculus VR)

The problem is that if you use Touch this way, you have no haptic feedback and no avateering… too bad!

Adding haptics

Oculus Touch can give you little vibrations to simulate the haptics of the objects you’re interacting with. Again, this is a simple-to-use feature and it is well documented here. Only two things got me a bit confused:

  • There’s no haptic prefab, nor MonoBehaviour script;
  • Haptics use audio clips. Yes, you’ve read that right: you provide the haptic engine with audio clips that it “plays” through the vibrations of your Touch controllers.

You can’t see it in the video, but I added vibrations to the right Touch controller when the user fires the Kalashnikov, to give my little game a more badass effect. So, I modified my update block to add this feature.

if (OVRInput.Get(OVRInput.Button.SecondaryIndexTrigger) && m_deadTime <= 0)
{
    GameObject bullet = Instantiate<GameObject>(Bullet);
    GameObject fireStart = transform.FindChild("FireStart").gameObject;
    bullet.transform.position = fireStart.transform.position;
    bullet.transform.rotation *= fireStart.transform.rotation;
    bullet.GetComponent<Rigidbody>().AddForce(fireStart.transform.right*2.85f, ForceMode.Impulse);
    GetComponent<AudioSource>().Play();
    m_deadTime = DeadTime;

    OVRHaptics.Channels[1].Mix(new OVRHapticsClip(VibeClip)); // play a vibration on the right Touch
}

The last line is what does the magic. I ask OVRHaptics to play something on Channel 1 (that is, the right hand channel… Channel 0 represents the left hand instead). The clip I ask it to play is VibeClip (an AudioClip object that I pass as a parameter to the script), which I have to box into an OVRHapticsClip object so it can be played through the haptic engine (and yes, I know, instantiating an OVRHapticsClip at every call of the method is not very smart, but this is a simple toy program, so that’s ok). About the chosen method, I preferred Mix over Queue… why? If you ask OVRHaptics to Queue a new clip, it will finish playing the last one and then start the new one. In the case of a machine gun, this is not a smart choice, since every new bullet gets fired while the gun is still vibrating from the last shot: you want the engine to blend the new vibration with the one currently playing, and this is where the Mix method comes in.
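If you want to avoid that per-shot allocation in a real project, a minimal sketch could cache the converted clip once (Fire() here is just a hypothetical stand-in for the firing code above):

public AudioClip VibeClip;           // the same clip as before, set in the inspector
private OVRHapticsClip m_hapticsClip;

void Start()
{
    m_hapticsClip = new OVRHapticsClip(VibeClip); // convert the WAV just once
}

void Fire()
{
    // Channel 1 = right Touch controller; Mix blends with what's already playing
    OVRHaptics.Channels[1].Mix(m_hapticsClip);
}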

How did I create the VibeClip object? Well, I took the audio file of the Kalashnikov bullets, increased its volume and duration a bit using Audacity and that’s it: it’s simply a WAV file. My advice is to start from an existing WAV file and then go by trial and error in Audacity until you obtain the vibration you find optimal. The file gets interpreted so that the device vibrates to mimic the waveform it contains.
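If you don’t want to round-trip through Audacity at all, you could also build a clip sample-by-sample in code. A hedged sketch, assuming the WriteSample(byte) method of OVRHapticsClip (where 0 means no vibration and 255 full strength):

OVRHapticsClip clip = new OVRHapticsClip();

// A short burst that fades out: 40 samples of decreasing amplitude
for (int i = 0; i < 40; i++)
    clip.WriteSample((byte)(255 - i * 6));

OVRHaptics.Channels[1].Mix(clip);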

Adding avatar

Here is where things start to get problematic. The Avatar SDK is not built into the Oculus Utilities and you have to download it separately. So, go back to the Oculus download page and look for the Avatar SDK, download it and unzip it. This project is not only Unity-oriented, so you have to look for the Unity package inside the \Unity folder of the unzipped SDK. Import the OvrAvatar.unityPackage package into your project. Notice this may require some time to compile the avatar shaders: my Unity seemed frozen for some minutes, but actually it was just compiling the self-occlusion shader.

Importing the Avatar SDK may flood your console with errors due to missing references. This is because you also have to download the Platform SDK. The Oculus Platform SDK can be found on the main Oculus download page. Download it, unzip it, then import into your project the Unity Platform plugin that you can find at \Unity\OculusPlatform.unityPackage. This should fix all the errors in your console.

Importing the packages does not suffice to see your virtual hands. To see your avatar, you have to drag-and-drop the Assets\OvrAvatar\Content\Prefabs\LocalAvatar prefab into your scene. Hit Play and move your Touch controllers… you should start seeing some hands moving around. Great! You will also see an error about the Oculus App Id, but we will deal with this later.

The LocalAvatar’s OvrAvatar behaviour will show you lots of options; the most important ones to me are:

  • “StartWithControllers”: if you check this, the avatar will not show bare hands, but hands holding the Touch controllers;
  • “ShowFirstPerson”: select this if the avatar is the player’s own;
  • “ShowThirdPerson”: check this if this is not the player’s avatar. With this option, a simple body and face will be shown for the avatar.
Oooooh, look! There are hands in my Scene View!! (click to zoom image on a new tab)

Ok, now we have a big problem: our OVRCameraRig and Avatar are completely unrelated!!! This is the biggest WTF of the Oculus SDK: your avatar and your camera+Touch reference frames are not related in any way.

To make it work, you can do as in the samples: set the same pose (position + rotation + scale) for the OVRCameraRig and LocalAvatar game objects. Then, in the OVRManager script of the OVRCameraRig, select TrackingOriginType: Floor Level.

If you want your camera rig and your avatar to be coherent, you have to select Floor Level as the origin type. And don’t forget to set the same position for the Camera Rig and Avatar transforms!! I don’t like this approach, but it seems necessary… (click to zoom image on a new tab)
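If you prefer enforcing this from code rather than in the inspector, a minimal sketch (the class name is mine) could be:

using UnityEngine;

public class RigAvatarAligner : MonoBehaviour
{
    public Transform cameraRig;   // the OVRCameraRig
    public Transform localAvatar; // the LocalAvatar

    void Start()
    {
        // Copy the full pose so the two reference frames coincide
        localAvatar.position = cameraRig.position;
        localAvatar.rotation = cameraRig.rotation;
        localAvatar.localScale = cameraRig.localScale;

        // Same effect as setting TrackingOriginType: Floor Level in the inspector
        OVRManager.instance.trackingOriginType = OVRManager.TrackingOrigin.FloorLevel;
    }
}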

Re-run the samples… now you should see that your avatar actually makes sense and that the avatar hands are exactly where they should be!!

But this is a Pyrrhic victory: if you try to move your OVRCameraRig gameobject (to emulate the player moving inside the game), the avatar remains fixed in place! This makes no sense to me (IMHO, if I say that an avatar is a first-person avatar, it should follow the camera rig automatically) and I think it is something that Oculus has to fix. And then, again, if you select Eye Level as the origin type, nothing works, either.

You can solve these issues by putting a common parent above the Rig and the Avatar, but I prefer a more complete approach (also because this is needed in our ImmotionRoom prefab!). The solution I found to keep them related in any case (independently of the parents of both objects and of the origin type) is writing a small script that makes sure that the avatar follows your position and rotation.

using UnityEngine;

public class AvatarCalibrator : MonoBehaviour {

    public GameObject LeftHand;       // LeftHandAnchor of the OVRCameraRig
    public GameObject LeftHandAvatar; // left hand of the avatar

    // Use this for initialization
    void Start () 
    {
        if (LeftHand == null)
            LeftHand = GameObject.Find("LeftHandAnchor");

        if (LeftHandAvatar == null)
            LeftHandAvatar = transform.FindChild("hand_left").gameObject;
    }

    // Update is called once per frame
    void Update () 
    {
        // Shift the whole avatar so its left hand overlaps the rig's left hand anchor
        transform.position += LeftHand.transform.position - LeftHandAvatar.transform.position;
    }
}

Take this script, save it as AvatarCalibrator.cs and attach it to your LocalAvatar gameobject. This is a super-simple script that I wrote in a few minutes and it is far from optimal, but it conveys the idea: the avatar will always make sure that its left hand coincides with the left hand reference frame of the OVRCameraRig. This way it will follow you and will work like a charm even if you use an OVRPlayerController.

Try substituting the OVRCameraRig with an OVRPlayerController (and adding a floor plane, or the player controller will fall to its death forever): you will see that even if you move with your Touch thumbsticks, your avatar will always follow you… awesome!

With this marvelous calibrator, we can even use an OVRPlayerController completely separated from the avatar… and the avatar works all the same!! (click to zoom image on a new tab)

This script only makes the positions equal, so it breaks the magic when you rotate the player. To make it handle rotation as well, you have to add a line to take that into account:

using UnityEngine;

public class AvatarCalibrator : MonoBehaviour {

    public GameObject LeftHand;       // LeftHandAnchor of the OVRCameraRig
    public GameObject LeftHandAvatar; // left hand of the avatar

    // Use this for initialization
    void Start () 
    {
        if (LeftHand == null)
            LeftHand = GameObject.Find("LeftHandAnchor");

        if (LeftHandAvatar == null)
            LeftHandAvatar = transform.FindChild("hand_left").gameObject;
    }

    // Update is called once per frame
    void Update () 
    {
        transform.position += LeftHand.transform.position - LeftHandAvatar.transform.position;
        // New line: copy the rotation of the rig's root so the avatar turns with the player
        transform.rotation = LeftHand.transform.root.rotation;
    }
}

Then you have to go to your OVRPlayerController gameobject and uncheck the “Hmd Rotates Y” flag on the OVRPlayerController behaviour. Try playing again and rotating your player controller and your head… wow! The hands are now always perfect!

Ok, time for a build… save the scene and build the executable (it may be a long process the first time you do it, but don’t worry)… launch the project… and… e-ehm, WHERE ARE MY HANDS???

Well, there is a problem. If you look at the program log, you will see a bazillion exceptions, all like this one:

DllNotFoundException: libovravatar
  at (wrapper managed-to-native) Oculus.Avatar.CAPI:ovrAvatarMessage_Pop ()
  at OvrAvatarSDKManager.Update () [0x00000] in <filename unknown>:0 
 
(Filename:  Line: -1)

The problem is the chosen architecture. Go to your Build Settings (File -> Build Settings…) and pick the x86_64 architecture. I’m running a 64-bit Windows 10 PC, but building for x86 with Oculus had never given me any problem until I started using Oculus Avatars… weird!

Change your Build Settings architecture to x86_64 or you will see no avatar! (click to zoom image on a new tab)

Ok, so re-build and re-launch the program! You should finally see your hands!!

Anyway, if you want more info, the full docs of the Avatar SDK are here.

Adding YOUR avatar

So, everything seems fine, but you can still see only that bald aqua-coloured guy avatar, and unless you are the Brazzers bald man, this can’t satisfy you: you’ve spent hours crafting an epic avatar of yourself and you wanna see your personal avatar. So, how can you do this? Again, this will not be super-easy.

The Oculus documentation gives us precise instructions at this link (where there is a more precise explanation of the Unity Avatar plugin), but why read the official tutorial? Keep reading here and I’ll warmly guide you 🙂

Select your LocalAvatar gameobject and double click on the OvrAvatar script to modify it. The only thing you have to do is:

Find the line:
OvrAvatarSDKManager.Instance.RequestAvatarSpecification(oculusUserID, this.AvatarSpecificationCallback);

Replace oculusUserID with a valid Oculus User ID such as 295109307540267.

Done? (If you are in doubt about which number to use, use the provided one; we’ll return to this later on.)

Ok, now, do you remember that little error I told you to ignore? Well, it’s time to deal with it. You have to provide the Avatar SDK and Platform SDK with your Oculus App Id (this is for security reasons). Go to your Oculus Developer Dashboard and create a new fake app, giving it the name you prefer (I chose TouchTest, since I have a lot of imagination).

My brand new OculusTouch app inside my Oculus Dashboard… Oh, look… there is Time of Conundrum too! (click to zoom image on a new tab)

Go to the details of your newly created app and choose the “Getting Started API” tab: you should see an App ID. Copy it.

This is where you find your newly created app Id inside your dashboard (click to zoom image on a new tab)

Go back to Unity, select Oculus Avatars -> Edit Configuration and then paste your ID into the Oculus Rift App Id textbox that you see in the Inspector.

Time to set our marvelous App Id inside Unity editor (click to zoom image on a new tab)

Now select Oculus Platform -> Edit Settings and paste the same Id into the Oculus Rift App Id and Gear VR App Id fields. Now when you hit Play you should not see the error anymore. But, even better: you can see a new avatar, a black-with-blue-sparkles one! This is the avatar of user 295109307540267, and I have no idea who that is… maybe Palmer Luckey? (Don’t know… there are no flip flops in the avatar… I can’t tell if he’s really Palmer.)

Palmer, is that you?

We’re almost there: we have shown the avatar of Palmer Luckey… but what about showing yours? It’s so simple! You just have to put your Oculus User ID inside that method instead of Palmer’s. But… WHAT THE HECK IS YOUR OCULUS USER ID??

Long story short: I don’t know… searching on Google I didn’t find anything… I tried looking at my Oculus user profile, my Oculus dev profile… and no clue about my ID! So, how do you find your Oculus Id? Honestly, I don’t know.

BUT I can tell you a workaround to obtain it. I found this amazing video by Unity where they explain properly how to use the Avatar SDK.

It’s awesome, and at 21:50 they show you something really useful, that is, the proper way to use OvrAvatar to show the user’s avatar. I’ve taken a screenshot for you.

This is the screenshot that shows you how to properly do stuff (click to zoom image on a new tab)

Basically, you have to create a script to initialize the Oculus Platform with your App Id, then wait for the platform initialization, and only at that point can you create and initialize your avatar, providing it the ID of the logged-in user (that is, you). I used this to trick the system and obtain your User Id.

So, create a new script called PlatformIdManager and add it to an empty gameobject.

using Oculus.Platform;
using Oculus.Platform.Models;
using UnityEngine;

public class PlatformIdManager : MonoBehaviour
{
    // Use this for initialization
    void Start()
    {
        // Initialize the Oculus Platform with your App Id, then ask who is logged in
        Core.Initialize("<your_oculus_app_id>");
        Users.GetLoggedInUser().OnComplete(OnGetUser);
    }

    private void OnGetUser(Message<User> message)
    {
        // This callback fires once the platform answers: the ID is what we're after
        var id = message.Data.ID;
        Debug.Log("My id is " + id);
    }
}

Of course you have to replace <your_oculus_app_id> with your Oculus App Id. Launch the program hitting Play and… bam! Error! Why is that happening? The reason is that Unity can’t connect to the Oculus platform to detect who the logged-in user is. We have to make them communicate.

Let’s get back to your developer dashboard and scroll down the page: below the App Id that you used above, you should find a “User Token” section. Click on “Generate Token” and copy the whole strange string that the portal generates for you.

Generate the user token, so that Unity will be able to communicate with the Oculus Platform (click to zoom image on a new tab)

Then go back to Unity and select Oculus Platform -> Edit Settings. In the Inspector window, expand “Unity Editor Settings”. Select “Use User Token” and paste your user token into the “Oculus User Token” textbox.

This is where you have to paste your User Token to make everything work (click to zoom image on a new tab)

Now save everything and hit Play! You should see your Oculus User ID written in the Unity console! Copy it and replace the standard Palmer Luckey number with this number inside the OvrAvatar.cs script… re-launch everything (you can even remove the PlatformIdManager script now!) and you should see your avatar being used! You can even add a second LocalAvatar with the “Show Third Person” flag checked to see your own face!

Wow, it’s amazing! My epic-bearded avatar!! I’m showing it through a secondary Third-person avatar. (click to zoom image on a new tab)
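For reference, after the swap the edited line in OvrAvatar.cs ends up looking like this (the number below is just a made-up placeholder; use the ID printed in your console):

// The ID below is fake: put your own Oculus User ID here
OvrAvatarSDKManager.Instance.RequestAvatarSpecification(1234567890123456, this.AvatarSpecificationCallback);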

Two more things about this last point:

  1. You will get a warning about clobbering of the ID, because in the PlatformIdManager script you should not call Core.Initialize specifying the App ID again (you’ve already specified it in the Unity menus); but if you don’t do it, you get a null reference exception (I don’t know why, another strange issue). To get rid of the warning, you can go back to Oculus Platform -> Edit Settings and clear the Oculus Rift App Id and Gear VR App Id you wrote some time ago. Anyway, the warning is harmless, so you can even leave everything as is.
  2. As I’ve said, this is the quick-and-dirty way to obtain your avatar. In theory you should do as the video and the official Oculus documentation suggest, so as to make sure that the avatar of the current user is shown in a clean way, whoever the user is (if you distribute your game with the above scripts, everyone will see YOUR avatar!). To do that, there is the issue that you have to wait for the Platform to initialize itself before showing the avatar(s). If you try to specify a user Id before the Platform has been initialized, nothing will work and you will see the aqua bald guy avatar again. One possibility is deactivating the avatar gameobjects and then modifying the PlatformIdManager script as follows:
    using UnityEngine;
    using Oculus.Avatar;
    using Oculus.Platform;
    using Oculus.Platform.Models;
    using System.Collections;
    
    public class PlatformIdManager : MonoBehaviour {
    
        public OvrAvatar myAvatar; // assign your LocalAvatar here in the inspector
    
        void Awake () 
        {
            // Initialize the platform (the App Id comes from the Unity menu settings)
            Oculus.Platform.Core.Initialize();
            Oculus.Platform.Users.GetLoggedInUser().OnComplete(GetLoggedInUserCallback);
            Oculus.Platform.Request.RunCallbacks();  //avoids race condition with OvrAvatar.cs Start().
        }
    
        private void GetLoggedInUserCallback(Message<User> message) 
        {
            if (!message.IsError) {
                // Feed the logged-in user's ID to the avatar before it initializes
                myAvatar.oculusUserID = message.Data.ID;
            }
        }
    }
    

    Then set your LocalAvatar as the parameter of this script. This script makes everything wait until the platform has initialized with the current user; when the user is recognized, the script assigns the avatar the ID of the currently logged-in user (the player). Of course, you also have to go back to the line you changed inside OvrAvatar.cs and restore the line OvrAvatarSDKManager.Instance.RequestAvatarSpecification(oculusUserID, this.AvatarSpecificationCallback);
    Launch everything again: you should see your avatar again… but the code is far better! So, why did I show you the dirty approach first? Because this way I could show you the hack to obtain your Oculus User ID, and knowing that can be useful in some situations.

This is the final setup to load the avatar dynamically (click to zoom image on a new tab)

And with this, my super-long tutorial ends (woah, it’s the longest post ever on my blog!). My final word on the Avatar SDK is that it is cool, but it still needs a bit of refactoring.

Hope you liked this tutorial, since I spent a lot of time writing it: please like it, ask questions about it, comment saying that I’m amazing and share it with all your fellow Virtual Reality devs! And if you want, come say hello to all of us at Immotionar! Happy 2017!
