I’m back to telling you about some hands-on sessions I had at AWE in Santa Clara, California. And today there are two I’m sure you will find interesting…
XTAL 3
One thing I’m very happy about having tried at AWE is the XTAL 3 headset by Vrgineers. I had been very curious to try this device ever since I read Upload VR’s hands-on, in which David Heaney claimed it was the first wide-FOV headset without peripheral distortions he had tried.
People at Vrgineers made me try their headset with a flight simulator. It was not Microsoft’s one, but a professional simulator for pilots. I can tell you it was a professional one because I died countless times and made my plane explode more times than in a Michael Bay movie.
And by the way, it was not only simulator software, but also hardware: to enjoy the demo I had to sit in a special chair that had some plane control levers around it. It’s always cool to feel like Maverick from Top Gun, even if just for one minute.
I didn’t have much time with the headset, so I couldn’t evaluate all its characteristics, but there are two features that I remember very well: one good and one bad.
Let’s start with the bad thing: the weight. The headset felt very big and clunky, and above all very heavy. Vrgineers claims a weight of around 600 grams (without the head strap, though), which is not that high, considering that the Quest 2 is around 500. But the sensation I had on my head was of something very heavy… as if tilting my neck too much would break it. I remember trying the previous version of the XTAL in Prague and having the same sensation, even though that headset was much heavier than this one. So either the head strap of this model is made of lead (and the weight with the strap is 1.5 kg), or the problem is not the weight per se, but the perceived weight. Probably, with the headset being so big and wide, the weight is distributed so unevenly that it seems much heavier than it actually is. I didn’t find the device very comfortable in this sense, and I was thankful that I was having a seated demo in the simulator, where I didn’t have to move my head and my body that much.
The good thing, instead, was the visuals, and especially the wide field of view. The display was surely good: at 3840×2160 per eye, the screen door effect becomes noticeable only if you focus on it. The more headsets evolve, the more the screen door effect becomes a kind of sensation, a subtle noise over the visuals, rather than the visible grid we had in the past. Even with such a high resolution, there was still a bit of SDE because the FOV is also large, so the pixels have to be distributed across a wider field of view. And the FOV is not just large, it is very large, reaching 180° horizontal × 90° vertical. I had my whole peripheral view inside the headset, and this enhanced my immersion and especially my spatial awareness while I was flying my virtual plane. I have to remark that I had the sensation that the vertical FOV decreased towards the periphery, so that the FOV was not a rectangle, but looked something like this.
I don’t know if this effect was caused by the lenses, the displays, or something else, but I had this perception. If I had to bet, I would say it was caused by the anti-distortion warping. XTAL 3 does its best to reduce the peripheral distortions that are typical of wide-FOV headsets like the Varjo or the XTAL 2, and this probably requires distorting the image on the display so that it appears undistorted through the lenses. And Heaney was right: it does a great job in this sense. While looking at the periphery, apart from this shrinking effect on the FOV, I noticed no other artifacts. The distortions that I had always seen in the other headsets of the same category were gone. I was sincerely impressed. I had my full horizontal view, completely undistorted. It’s a remarkable technical result. Great job, engineers of VRgineers!
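To give you an idea of what this pre-warping means in practice, here is a minimal sketch of a generic radial distortion model of the kind commonly used to counteract lens distortion. The coefficients are made up for illustration; this is not VRgineers’ actual algorithm, which is surely far more sophisticated:

```python
def predistort(u, v, k1=-0.22, k2=-0.05):
    """Pre-warp a normalized display coordinate (u, v) so that the
    lens's pincushion distortion cancels it out.

    k1 and k2 are illustrative radial coefficients; a real headset
    calibrates values like these per lens."""
    r2 = u * u + v * v                      # squared distance from center
    scale = 1.0 + k1 * r2 + k2 * r2 * r2    # radial scaling factor
    return u * scale, v * scale

# A point at the edge of the image gets pulled toward the center
# (barrel warp), so the lens pushes it back out to the right place.
edge_u, edge_v = predistort(1.0, 0.0)
```

Note how the scaling depends on the distance from the center: pixels near the middle barely move, while pixels at the periphery are shifted the most, which is also consistent with the FOV-shrinking effect I perceived happening only at the edges.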
In the end, I found my experience with the XTAL 3 satisfying: I think it’s great for seated use cases where you need a very large FOV, for instance some training scenarios. If they can reduce the size and the weight of the headset a bit, it could become good for other purposes as well.
Manomotion
Manomotion was another technology I had always wanted to try, and luckily they were partnering with VRgineers, so I could kill two birds with one stone.
Manomotion is a startup that offers an SDK to track your hands using only the standard hardware of your device. Long story short, it can detect the pose of your hands from just your smartphone’s camera and nothing more.
Yeray from Manomotion showed me an app built with the company’s SDK running on his smartphone: he could hold the smartphone with one hand and use it to frame his other hand. On the screen, I could see an application showing his hand interacting with virtual elements. When he performed specific gestures (e.g. closing his fist), the app reacted by doing something. It was interesting that the SDK could not only track his hand but also segment it, so it could be shown inside the virtual application. It was a clear example of augmented virtuality, where a real element (the hand) was shown inside a virtual environment (the one of the game). I didn’t have the occasion to thoroughly evaluate the accuracy of the Manomotion SDK; I can just say that, from what I saw, the gesture recognition seemed to work well. The segmentation was also very good: the hand was “cut out” in a believable way.
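For those wondering how a segmented real hand ends up inside a virtual scene, the compositing step can be sketched as a simple per-pixel alpha blend. Everything here is illustrative: I’m assuming the SDK outputs a per-pixel hand mask, and the function name is mine, not Manomotion’s API:

```python
import numpy as np

def composite_hand(camera_frame, virtual_frame, hand_mask):
    """Blend the segmented real hand over the virtual scene.

    hand_mask is a per-pixel alpha in [0, 1] (1.0 = hand pixel),
    standing in for the output of the segmentation step.
    Both frames are H x W x 3 float arrays."""
    mask = hand_mask[..., np.newaxis]  # broadcast the mask over RGB
    return mask * camera_frame + (1.0 - mask) * virtual_frame

# Tiny 1x2 demo frame: pixel 0 is hand, pixel 1 is background
camera = np.full((1, 2, 3), 0.8)   # real camera image
virtual = np.zeros((1, 2, 3))      # virtual scene (black here)
mask = np.array([[1.0, 0.0]])
result = composite_hand(camera, virtual, mask)
```

With a soft (non-binary) mask, the same blend also gives smooth hand edges, which is part of why a good segmentation looks so believable.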
Yeray told me that one of the things that Manomotion is working on is porting their solution to the XTAL headset, in particular the version with passthrough cameras. You can install on the XTAL 3 a module with 2 high-definition cameras that lets you enjoy high-definition mixed-reality experiences. Manomotion is working to bring its augmented virtuality solution for hands to this version of the device. I could try it briefly and see my real hand on top of virtual reality content. The effect was very interesting, but also a bit strange: my hand wasn’t always segmented perfectly, plus it appeared yellow-ish, as if it were a dead hand, and its focus also seemed a bit different from that of the rest of the content. I’ve been told that this is all normal because this is still a work in progress and not the final solution. So I can’t express a real opinion about it yet… I’ll just say that it was an interesting prototype (I’ll have to re-evaluate it later).
I asked what the advantage of such an augmented virtuality solution is, and the answer was that it’s useful, for instance, with simulators, so the user can see his real hand interacting with the buttons he has to operate. Plus, this solution should also work with the hand close to real objects (e.g. a command panel), a scenario where other available hand-tracking solutions may instead fail.
At the end of the day, it was interesting to try this hand-tracking solution, and I hope to see a full integration of it in the next months.
And that’s it! I hope you liked these short reviews, and if that is the case, please share this post on social media with your XR peers. And subscribe to my newsletter so you don’t miss my next article 😉