AWE 2022: hands-on Tobii and Ultraleap technologies
My report on AWE 2022 never ends, and today I want to write about my visit to two important companies in our ecosystem: Tobii and Ultraleap.
Tobii
For the first time in my life, I was able to have a chat with the people at Tobii, and it was quite an interesting one.
Tobii software
They had no new hardware for me to review, but they let me have a test with a Neo 3 Pro Eye (a Pico Neo 3 with Tobii eye-tracking incorporated) and showed me a few demos while they explained to me the importance of eye-tracking. The demos were quite simple but effective at showcasing the added value provided by eye-tracking technologies.
The most relevant of them were:
- A demo about foveated rendering: I was shown an environment that could run at full framerate on the Pico Neo 3 because it had no real-time lighting. It is a common trick for Quest/Neo developers to publish scenes with baked lights and no real-time lighting to increase the framerate of the application, and it works, but it makes the scene feel less alive. Then I was asked to press a button, and the scene came to life thanks to real-time lighting, but the framerate dropped a lot and the rendering became choppy. Then I pressed the button again, and the same scene was rendered with real-time lighting and foveated rendering provided by Tobii, and it was alive and running at full framerate. This was meant to show the potential of foveated rendering, which will let us developers create rich scenes (with real-time lights and shadows) also for standalone headsets like Quest in the future (a minimal sketch of the idea follows this list)
- A demo about training: I was asked to perform the procedure that pilots have to follow when they enter an aircraft before turning it on. The procedure included items I had to activate (e.g. buttons to press) and items I had to check (e.g. indicators that I had to verify were at the correct level) in the right order. Thanks to the power of eye-tracking, the system could assess if I was checking the correct values on the indicators at the right times. With current headsets without eye tracking, it is impossible for an application to know where I am looking
- A demo about avateering: I could see my avatar in front of a mirror. Without eye tracking, its eyes were still, while after I activated eye tracking, I could see my eyes moving and my eyelids blinking. With eye tracking, the avatar of course felt more natural. What I found very interesting is that while in the beginning I found it acceptable to have an avatar with fixed eyes, after I tried the version with moving eyes, I found it weird to go back to the previous one
- A demo about UI: I had a menu (similar to the one of the Oculus store) in front of me, and wandering around with my eyes, I could highlight every element and read more information about the one I was looking at. More interestingly, once an element was highlighted, I could just perform a click with my index trigger to activate that element and launch that game without moving the controller. This way, the interface in VR becomes much more usable: I don’t have to move my hands anymore to select something in a menu… I just look at it, click, and that’s it. It’s super-comfortable. You may wonder why I can’t just click with my eyes… well, eyes are good for exploring, not for interacting. In our life, we don’t use our eyes to actively interact with anything, so it would be weird to do that in VR. Plus, the eyes move around very fast to explore everything around you, and if they had the power of “clicking” on something, they would surely click on something you were just looking at to evaluate it. So it’s better to have an actual deliberate confirmation with the hands, which are the tools you use every day to interact with objects (this pattern is sketched after the list)
- A demo about throwing objects: As a developer, I can tell you that implementing throwable objects in VR is a pain in the**. That’s why most experiences out there have mediocre throwing mechanics. The Tobii people put me in a VR experience and let me throw rocks to hit bottles that were around me. I started throwing the rocks, and I had mixed results: one or two hits, and a lot of misses. Then they made me move inside the experience and grab some golden rocks: I started throwing them and I hit all the targets like a sniper. The secret? Well, you may have guessed it: the golden rocks were powered by eye-tracking. Considering that you look at your target when you want to hit it, the system “helped” me by throwing the rock along the correct parabola toward the bottle I was looking at. It was quite surprising. It had its drawbacks, though: however I threw the rock, it always hit the target, so the game was not challenging anymore. Probably a compromise between the two solutions should be found (the sketch after this list blends the two). Anyway, this demo showed me how eye tracking can improve various interactions in XR, like throwing and also pointing with a finger (when we point at an object with a finger, the finger usually doesn’t point exactly at the object if it is far away, but our gaze does).
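To make the foveated rendering demo more concrete, here is a minimal sketch of the core idea (my own illustration, not Tobii’s implementation): shading quality is kept at full resolution only in a small region around the gaze point and is progressively reduced in the periphery, where the eye can’t perceive detail anyway. The thresholds below are purely illustrative.

```python
import math

def shading_rate(point_deg, gaze_deg):
    """Return the fraction of full shading resolution for a screen point.

    point_deg, gaze_deg: (x, y) angular positions in degrees.
    The angular zones and rates here are illustrative, not Tobii's values.
    """
    eccentricity = math.hypot(point_deg[0] - gaze_deg[0],
                              point_deg[1] - gaze_deg[1])
    if eccentricity < 5.0:     # foveal region: full quality
        return 1.0
    elif eccentricity < 15.0:  # parafoveal region: half resolution
        return 0.5
    else:                      # periphery: quarter resolution
        return 0.25

# A point about 20 degrees away from the gaze is shaded at 1/4 resolution:
# this is where the GPU savings that keep the framerate stable come from.
print(shading_rate((18.0, 8.0), (0.0, 0.0)))  # -> 0.25
```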
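And here is a minimal sketch of the gaze-highlight, trigger-to-confirm pattern from the UI demo (all names are mine, not Tobii’s SDK): the eyes only move the highlight, while activation requires a deliberate hand input.

```python
def update_menu(items, gazed_item, trigger_pressed):
    """items: list of dicts with 'name' and 'highlighted' keys.
    gazed_item: index returned by a gaze raycast against the menu, or None.
    Returns the item to activate, if any."""
    for i, item in enumerate(items):
        item["highlighted"] = (i == gazed_item)  # highlight follows the gaze
    # Activation needs an explicit hand input: if the eyes could "click",
    # every glance used to evaluate an item would accidentally launch it.
    if trigger_pressed and gazed_item is not None:
        return items[gazed_item]
    return None

menu = [{"name": "Game A", "highlighted": False},
        {"name": "Game B", "highlighted": False}]
print(update_menu(menu, gazed_item=1, trigger_pressed=True))  # activates Game B
```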
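Since the throwing demo struck me the most, here is also a minimal sketch of how such gaze-assisted throwing could work (my own reconstruction, not Tobii’s actual algorithm): the velocity needed to hit the gazed target is computed from simple ballistics and blended with the player’s real throw, so the assist strength can be tuned to keep the game challenging.

```python
import numpy as np

GRAVITY = np.array([0.0, -9.81, 0.0])  # m/s^2, y is up

def assisted_throw_velocity(hand_pos, raw_velocity, gaze_target,
                            flight_time=1.0, assist=0.6):
    """Blend the player's raw throw with the exact ballistic velocity that
    would land the rock on the gazed target after `flight_time` seconds.

    Solving p0 + v*T + 0.5*g*T^2 = target for v gives the exact velocity.
    """
    v_exact = (gaze_target - hand_pos) / flight_time - 0.5 * GRAVITY * flight_time
    # assist=1.0 reproduces the "golden rock" behavior (it never misses);
    # lower values keep some challenge, which is the compromise the demo
    # made me wish for.
    return (1.0 - assist) * raw_velocity + assist * v_exact

# Example: a weak throw from shoulder height toward a bottle 4 m away
hand = np.array([0.0, 1.5, 0.0])
bottle = np.array([4.0, 1.0, 0.0])
weak_throw = np.array([2.0, 2.0, 0.0])
print(assisted_throw_velocity(hand, weak_throw, bottle))  # pulled toward the bottle
```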
The first three demos were quite standard to me, while the ones about UI and throwing objects, which showed how eye-tracking can improve interactions in XR, were the most interesting.
Tobii hardware
I was then shown the eye-tracking hardware that Tobii produces. Every eye-tracking device consists of a camera and a series of LEDs that illuminate the eye. The illumination is necessary to make the tracking better and more precise… I’ve been told that eye-tracking without internal lighting cannot work well. The hardware is usually installed in a ring that is set up around the lens of the headset, with both the camera and the LEDs on that ring. Notice that you may see just a circular opaque black plastic ring because everything works with IR light: the little illumination lights are IR LEDs (so you don’t even see their light), the camera is an IR camera, and the plastic of the device is opaque to visible light but transparent to IR light (so you see it as opaque, but for the LEDs and the camera it is transparent).
Another interesting trick used to track the eye is adding a mirror tilted 45° between the lens of the headset and the display. The camera, which is on the ring, is itself rotated 45°, so it frames the mirror, which reflects the image of the eye. This way, it is as if the camera were in front of the eye instead of off to the side on the ring, and it can track the eye much better. At this point you may wonder: if there is a mirror in front of my vision, how can I see the VR display? Well, the answer is quite simple: it’s an IR mirror: it reflects IR light but lets visible light pass through, so you see the display, and the camera sees your eye lit by the IR LEDs. This super-smart trick is what is implemented in the Vive Pro Eye, for instance.
It’s all very cool, but now that all headset manufacturers are migrating to pancake lenses and a slimmer form factor, this approach can’t be used anymore. The Tobii people have anyway already found new ways to perform eye tracking even in this case. They couldn’t share their future solutions with me (they are still secret), but they told me that the plan is to start using very small IR cameras. They then showed me some standard glasses with some very tiny black dots on the lenses, which were actual cameras able to track the eyes. It is quite impressive how a camera can be miniaturized to be so small and still have a decent resolution (like 300×200 or something like that). Wearing the glasses, since the cameras were so small and so close to my eyes, they appeared to me just like specks of dirt on the lens, creating something like a slightly visible black halo when I was looking in their direction and being invisible when I was looking in other directions. This was just a hint at possible tracking solutions for upcoming small AR glasses and pancake VR headsets: I guess the idea is to install these small cameras in the periphery of the vision (so they are not noticed by the eye) and track the eyes with them. I can’t wait to see an actual solution built on top of this idea.
I also tried asking if Tobii is the company manufacturing the eye-tracking for the PSVR2, but I got no answer on the topic. No scoop for me this time. But it’s been a very interesting time with Tobii anyway, and I’m thankful to the Tobii employees at AWE for the time they dedicated to me.
Ultraleap
At the Ultraleap stand, there were some nice demos with the Ultraleap hand-tracking sensor installed on the Pico Neo 3, the Varjo XR-3, and the Lynx R-1. As you may have read in my mega-review of Varjo products, I have already tried Ultraleap tracking integrated into the Varjo XR-3.
You know that I’m a big fan of the Gemini tracking with the latest Ultraleap sensor (read my full review here), and I can confirm that it works very well even in a crowded exhibition. The use with the Varjo XR-3 was almost flawless, with just a few glitches here and there. What also surprised me is the wide tracking FOV of the IR-170 sensor: I could operate my hands even beyond the wide field of view of the headset. I tried putting my right hand on the right side of my head, grabbing an object, and bringing it in front of my eyes, and it worked: my hands could operate even in regions where I couldn’t see them.
It was the first time that I used Ultraleap hand tracking in RGB passthrough AR. The cool thing is that when it works, it feels like black magic: you see a virtual object in front of you, you grab it with your bare hands, and you manage to manipulate it… this is cool. The only drawback is that if you have absolutely no virtual representation of your hands, it may happen that you can’t perform some operations and you don’t understand why. For instance, you try to grab something, and you see the system not reacting to your grab: if you could see your virtual hands, maybe you would notice that the finger tracking is failing and you would retry the gesture; but if you have hidden the virtual hands so that you just see the real ones while you are in passthrough AR, you may not realize when your gesture is correctly recognized and when it is not. A feedback system would be highly useful in this scenario.
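For instance, a sketch of such a feedback system could look like the following (the confidence value is a hypothetical input: real hand-tracking runtimes expose tracking quality in their own ways): when confidence drops, a translucent “ghost” hand fades in over the real one, so the user sees what the system believes the hand is doing and can adjust the gesture.

```python
LOW_CONFIDENCE = 0.5  # illustrative threshold, to be tuned per application

def hand_feedback_state(tracking_confidence):
    """Map a 0..1 hand-tracking confidence value (hypothetical input) to
    visual feedback for passthrough AR, where no virtual hand is drawn."""
    if tracking_confidence < LOW_CONFIDENCE:
        # Fade in a translucent virtual hand over the real one: the worse
        # the tracking, the more visible the ghost hand becomes.
        return {"ghost_hand_opacity": 1.0 - tracking_confidence,
                "show_warning": True}
    return {"ghost_hand_opacity": 0.0, "show_warning": False}

print(hand_feedback_state(0.3))  # tracking degraded -> visible ghost hand
```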
I have also been briefly shown the Ultraleap 3Di, which is a new case for the hand tracking sensor meant to be used not on XR headsets, but in digital signage kiosks and other installations with 2D displays so that people can interact with them without touching anything and have a better experience from a hygiene standpoint.
Anyway, Ultraleap showed once more how its hand tracking is currently the best on the market. Kudos to them. And a big shoutout to Tessa Ulrwin and Faye Lockier for always being so nice to me!
Disclaimer: this blog contains advertisement and affiliate links to sustain itself. If you click on an affiliate link, I'll be very happy because I'll earn a small commission on your purchase. You can find my boring full disclosure here.