One of the first AR/VR accessories I had the opportunity to work on was the Leap Motion hand-tracking controller: I made some cool experiments and prototypes with it and the Oculus Rift DK2. Leap Motion was also the first important company I interviewed on this blog. This is why I felt so honored when the company, now called UltraLeap, contacted me to offer a preview of its new hand-tracking runtime, codenamed “Gemini”.
UltraLeap Gemini hand-tracking review video
I’ve made my usual video review of the new UltraLeap Gemini runtime, with some cool in-action shots of me using hand tracking. If you want, you can find it here below!
Gemini
UltraLeap Gemini is the 5th iteration of the Leap Motion/UltraLeap hand-tracking runtime for Windows PCs. UltraLeap has kept evolving its runtime since its initial launch in 2012, and every major release has introduced a big jump forward in tracking accuracy. If you have been in the XR field for a while like me, you surely remember how the Orion runtime finally made Leap Motion usable for creating XR applications, thanks to the big improvement in hand detection when the sensor was attached to a headset.
The cool news is that this runtime update is compatible with all available UltraLeap sensors, both the v1 Controller (that most of us have) and the newer Stereo IR 170 (the one embedded into headsets like XTAL), so you don’t need to change your hardware to appreciate the new tracking fidelity.
Today UltraLeap has released a preview of the Gemini runtime, which should bring:
- Better smoothness, pose fidelity, and robustness (likely to be most apparent on desktop mode)
- Improved hand initialization
- Better performance with two-hand interactions
The preview of the runtime can be downloaded for free by everyone on the UltraLeap website, and the final official release is expected for Q2 2021. While reading the review below, remember that this is a preview release, and it will surely be improved in the coming months.
Tracking accuracy
After the Orion runtime, tracking accuracy with UltraLeap sensors was already good, but I can tell you that the Gemini update brings an unprecedented level of quality. I’ve tested it with my Leap Motion Controller, the Oculus Quest, and a lot of scotch tape, and I was very satisfied with the results.
A single hand now gets tracked very well, however you rotate it. The palm, the back, all the fingers while moving… everything works like a charm. If you make some weird pose, the system doesn’t detect it as-is, but it shows you the closest “normal” position it can imagine, and this is already good enough for most applications. The only problem, if I have to be picky, is that the pose of the pinky finger may be misdetected if it is not clearly seen by the sensor: for instance, if I touch my pinky with my thumb, the system may recognize that I am touching my ring finger with my thumb. Little problems like this may happen, but they are not a big deal.
What really surprised me is the stability of two-hand interactions. For the first time, I’ve been able to cross and interweave the fingers of my two hands, and the tracking kept working reliably. Just to make a comparison with the competition: if your two hands just touch on the Quest, the tracking gets interrupted, while here you can cross all your fingers! The tracking halts when one hand goes completely on top of the other (so one palm over the other palm), and this should probably be solved in a future iteration (the system should “know” that a hand can’t disappear, and so hypothesize its current position even when it is partly occluded). The system also sometimes keeps working when you move your hands so fast that one covers the other for a fraction of a second, so it is more robust. I can fairly say that this is the best hand-tracking system I’ve seen so far on any headset when it comes to interactions between two hands.
Another welcome improvement is the tracking FOV. During Oculus DK2 times, hands were tracked only when they were right in front of your eyes; otherwise they lost tracking. With this new runtime, the system can track a hand across the sensor’s whole field of view. I have to say that at the periphery the tracking is not super-stable, so the hand may get lost and then detected again multiple times, but at least it’s there. This is a great improvement for the usability of hand tracking with UltraLeap. That said, the Leap Motion Controller (v1) has a quite limited field of view, so with the v1 hardware I was not super satisfied with how far I could move my hands, but the new hardware is much better in this regard and should pair very well with this runtime.
I also wanted to evaluate how the new hand tracking could be used for punching things: you know that I’m among the developers of HitMotion, an XR boxing game, and I wanted to test how usable UltraLeap Gemini was for fast movements. I was surprised to see that at least 40% of the time, I could actually perform a punch movement in front of me and see it correctly tracked. Not bad at all, given the current status of hand tracking. I think this opens the road for new rehabilitation experiences.
I’ve also tried the sensor in desktop mode (so without VR), and I experienced similar improvements, exactly as the company promised. The only thing I haven’t liked about the new runtime is the initialization time. UltraLeap talks about “Improved hand initialization”, and for sure the sensor, when it detects a hand the first time, initializes it very well. The problem is that it sometimes takes several seconds to initialize the hand: it seems that it wants to be very sure that it is a hand before actually recognizing it, but this frustrates the user, who doesn’t understand why one or both of their hands don’t get recognized when they are in front of the headset. I think this is a problem that must be ironed out before the final release of Gemini.
Hands-on with UltraLeap demos
Unfortunately, UltraLeap has not released any new demo to highlight the new possibilities enabled by this runtime, especially the two-hand interactions. So I had to test it with the existing applications, namely Cat Explorer, Blocks, Particles, and Paint.
Cat Explorer is mostly about single-hand interaction, so it was the least useful demo to test Gemini… but hey, there is a cute cat, so why shouldn’t I have played with it? It is just too cool…
Paint was the first nice use case because, first of all, it requires you to put your two hands close to each other (one hand is the brush and the other holds the colors), and then, while drawing strokes with your fingers, you immediately realize if tracking has been lost, because the stroke gets interrupted. I can tell you that the tracking with Gemini proved solid in both cases, and I could draw incredibly well. I also had my assistant Miss S try it, and she started drawing immediately, without me explaining anything. She isn’t technical, but she was able to draw some funny things without ever complaining about things not working, apart from the hands sometimes disappearing and requiring a few seconds to show up again. This test with her also made me realize the true power of hand tracking: it is very intuitive, and when it works, you have to give very few instructions to make people use an application of yours. It’s truly a natural way of interacting with technology.
Blocks, an experience where you create cubes and interact with them, gave me some mixed feelings. On one side, the tracking worked well, and I could perform interactions both with one and two hands. On the other side, I noticed that it is quite natural for me to move my head while I interact with objects, to bring them more into focus, and this created problems with the limited FOV of my Leap Motion Controller. When I moved my head to focus on an object I wanted to interact with using one hand, I sometimes pushed the other hand, with which I was already interacting with another cube, into my peripheral view. That other hand soon lost tracking, so I stopped interacting with the other object, which frustrated me a bit. I thus realized that without a v2 sensor it is difficult to have solid and persistent interactions with virtual 3D tools that are not always in front of you. Furthermore, this demo made me realize once more how interacting with objects without a clear sense of haptics and without realistic physics can be an illusion-breaker at the beginning, because you realize that your hands can’t interact with the virtual world as they do with the real one.
UltraLeap Unity SDK
The UltraLeap Unity SDK is the same SDK we have appreciated all these years. There are many prefabs and scripts that act as the building blocks for creating hand-tracking applications. There is also a full package with an interaction system implementing the cool hand UX that Leap Motion has studied for years, which you can use in your apps. There are many demo scenes from which you can learn (and of course copy), so that you can develop faster.
It is a very well-made SDK, and you can get started implementing it in your XR experiences in very little time.
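Just to give you an idea of how simple it is, here is a minimal sketch of reading hand data through the Unity SDK. This is my own illustrative example, not official UltraLeap sample code: it assumes a LeapServiceProvider component is present in the scene (the SDK’s example scenes already set one up) and that the runtime is installed.

```csharp
using Leap;
using Leap.Unity;
using UnityEngine;

// Illustrative sketch: log when a tracked hand performs a pinch.
// Assumes a LeapServiceProvider exists in the scene and is assigned
// in the Inspector.
public class PinchLogger : MonoBehaviour
{
    [SerializeField] private LeapServiceProvider provider;

    void Update()
    {
        // CurrentFrame contains the latest tracking data from the runtime
        Frame frame = provider.CurrentFrame;

        foreach (Hand hand in frame.Hands)
        {
            // PinchStrength ranges from 0 (open hand) to 1 (full pinch)
            if (hand.PinchStrength > 0.9f)
            {
                Debug.Log((hand.IsLeft ? "Left" : "Right") + " hand is pinching");
            }
        }
    }
}
```

With a few lines like these (or, even better, with the ready-made prefabs and the interaction system), you can already start prototyping hand-driven interactions.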
Further References
If you need some additional references to dig deeper into Gemini, you can follow these links:
Final impressions
Leap Motion is a company I’ve always esteemed, and its people are now continuing to do great things at UltraLeap. The new Gemini update works very well and can give you good hand-tracking accuracy even with a Leap Motion Controller that is many years old. I was especially impressed by how well two-hand interactions work now. Of course, it is not perfect, and the road to using hand tracking as an everyday input method is still long, because, as Abrash says, a mouse can’t work only 90% of the time: it must always be reliable. Gemini is anyway a good step in that direction. And my tests, made also with non-techie people, show me that it is already usable.
It’s a pity that all the major companies (like Oculus or HTC) are re-inventing the wheel when it comes to hand tracking while all this expertise is already available, but this new runtime is good news for all the headset manufacturers, like Varjo, XTAL, and Pimax, that have integrated the UltraLeap v2 controller into their headsets.
If you want to try it as well, you can download the runtime and the demos on the UltraLeap website. Of course, if you do, let me know your impressions in the comments here below!