Varjo combo hands-on: XR-3, Reality Cloud, and a tiny bit of Galea!
At AWE 2022, I was able to try a few products by Varjo, the Finnish company that offers XR solutions mostly tailored to the enterprise market. Let me tell you something about all of them!
Varjo XR-3
Varjo XR-3 is considered the gold standard for passthrough augmented reality. Priced at around $6000, it should provide a fantastic experience both in AR and VR, thanks to its high-resolution displays and passthrough cameras.
I was able to try the XR-3 for a few minutes at both the Varjo and Ultraleap booths.
Design
Varjo bets a lot on the quality of the products it builds, and the XR-3 is no exception: it is crafted very well and looks solid. The design is also polished, with the front plate looking very cool thanks to the carefully studied camera positions and the reflective finish.
Visuals
The resolution and the pixel density of the device (over 70 ppd) are high, and with this headset you forget about the screen-door effect and similar VR artifacts. The colors of the display were also quite bright, so in general the virtual elements I could see both in AR and VR felt very alive. The FOV is also larger than usual: not huge, but large enough that you don't feel like you're looking through small binoculars. When you wear this device, you can feel that you are trying a premium headset with high-end features.
BUT, there is a huge "but": the distortions. Like many other headsets with a larger FOV, the Varjo XR-3 has noticeable distortions at the periphery of your vision. The central part of what you see is perfectly fine, but when you move your head, you can spot that something is "moving" and distorting in your peripheral view, and this is quite distracting. I remember having a similar issue with the Varjo VR-2, so I think that Varjo still has to find a way to fix the issues with its lenses. The good news is that these problems can usually be corrected (or at least improved) via a software update.
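For the technically curious: below is a tiny, purely illustrative Python sketch of what "fixing distortions in software" usually means, that is, pre-warping the image sent to the display with a radial polynomial that should cancel the lens distortion. The coefficients and the crude nearest-neighbor sampling are my own made-up assumptions, not Varjo's calibration pipeline; when the fitted profile doesn't match the lens perfectly, you get the kind of peripheral "swimming" I described above.

```python
# Minimal, hypothetical sketch of software distortion correction (NOT Varjo's
# pipeline): the image sent to the display is pre-warped with a radial
# polynomial so that, after passing through the lens, lines look straight again.
import numpy as np

def pre_warp(image: np.ndarray, k1: float = -0.15, k2: float = 0.02) -> np.ndarray:
    """Apply an inverse radial (barrel/pincushion) warp to an image."""
    h, w, _ = image.shape
    ys, xs = np.meshgrid(np.linspace(-1, 1, h), np.linspace(-1, 1, w), indexing="ij")
    r2 = xs**2 + ys**2
    scale = 1 + k1 * r2 + k2 * r2**2            # made-up radial distortion profile
    # Nearest-neighbor resampling of the source image at the warped coordinates.
    src_x = np.clip(((xs * scale) + 1) / 2 * (w - 1), 0, w - 1).astype(int)
    src_y = np.clip(((ys * scale) + 1) / 2 * (h - 1), 0, h - 1).astype(int)
    return image[src_y, src_x]

# Example: pre-warp a 256x256 test image. Errors in the fitted profile grow with
# the radius, which is why artifacts show up mostly at the edges of the FOV.
warped = pre_warp(np.random.rand(256, 256, 3))
```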
Passthrough augmented reality
Passthrough is the main feature of this headset, which, thanks to its 12-megapixel RGB front cameras, reaches a level of realism no other device can challenge. And I have to agree with Varjo that the passthrough is cool, the best I have tried on any XR headset so far: the resolution of the cameras and of the display ensures that the final result is a high-quality passthrough view.
But even on this top-quality headset, the passthrough is not perfect. First of all, the colors of the world that you see through the cameras look a bit washed out, a bit "sadder" than in reality. Then there is slight noise, so you perceive that you are still looking through some kind of camera. And when you rotate your head fast, there is a bit of blurring of the image. Because of all of this, when you use passthrough AR you can perceive the difference between the bright virtual elements and the colder real-world visuals.
So, the passthrough is amazing (you can even type text on your phone while wearing the headset), but it is still not quite the real thing.
Augmented Virtuality
During the demo of the Reality Cloud, I used no controllers: I interacted with the system just with my hands. And it was cool that I could see my real hands (in passthrough AR) on top of the virtual experience. Even better, the system calculated the depth of my hands, compared it to the depth of the virtual objects, and decided which parts of my hands to show depending on whether they were occluded by a virtual element or not. Since I had a real element on top of virtual ones, this is a clear example of augmented virtuality.
It was one of the first times I had experimented with AV, and I have to say that it is pretty cool: seeing your real hands, and not just a virtual representation of them, gives you a better perception of your body in VR. But at the same time, it requires a pretty complex effort from the system. I guess it has to calculate the depth of the hands via some kind of depth estimation algorithm, and the result is that the segmentation of the hands is a bit "blocky". This means that you usually don't see a perfect cut-out of your real hands, but some jagged lines around them. Also, when you move them close to a virtual object, and you move them up and down to make them appear and disappear, you see that the hands always get occluded and shown as a combination of big jagged blocks, which probably reflect the resolution of the depth estimation algorithm. Anyway, I tried AV only together with cloud rendering, so I don't know if part of the artifacts was due to the remote streaming.
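Varjo hasn't explained how its hand segmentation actually works, so take the following only as a rough sketch of the general idea, not as a description of their pipeline: a minimal numpy example of depth-based compositing, assuming a hypothetical depth map that is much coarser than the display. It shows why the cut-out edges can end up as jagged blocks roughly the size of the depth-estimation cells.

```python
# Rough, hypothetical sketch of depth-based hand occlusion for augmented
# virtuality. It illustrates why a low-resolution depth estimate produces
# "blocky" cut-out edges like the ones described above.
import numpy as np

def composite_av(passthrough_rgb, virtual_rgb, virtual_depth, real_depth_lowres):
    """Show the passthrough pixel wherever the real world (e.g. your hand)
    is estimated to be closer to the eye than the virtual scene."""
    h, w, _ = virtual_rgb.shape

    # Upscale the coarse depth estimate to display resolution by repeating each
    # cell (nearest neighbor): this is what creates jagged, block-shaped edges.
    block_h = h // real_depth_lowres.shape[0]
    block_w = w // real_depth_lowres.shape[1]
    real_depth = np.kron(real_depth_lowres, np.ones((block_h, block_w)))

    # Per-pixel depth test: real-world content wins where it is nearer.
    hand_in_front = real_depth < virtual_depth
    return np.where(hand_in_front[..., None], passthrough_rgb, virtual_rgb)

# Toy example: a 64x64 display fed by an 8x8 depth estimate (made-up numbers).
passthrough = np.full((64, 64, 3), 0.5)                 # camera image of hands/room
virtual = np.zeros((64, 64, 3)); virtual[..., 2] = 1.0  # a blue virtual object
virtual_depth = np.full((64, 64), 1.0)                  # virtual object at 1 m
coarse_depth = np.random.uniform(0.4, 2.0, (8, 8))      # noisy coarse hand depth
frame = composite_av(passthrough, virtual, virtual_depth, coarse_depth)
```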
Eye-tracking
I could see the performance of the eye tracking just for a few seconds after I put the headset on during the Reality Cloud demo. It seemed to follow my eyes correctly, but the time was too short to express a real opinion on it.
Hand tracking
Hand tracking on the Varjo XR-3 is provided by Ultraleap (sensor v2) tracking, which is currently the best in the industry (you can read my full review of it here). It worked very well: my hands were tracked whatever their pose, across all of my field of view and even beyond.
Comfort
The XR-3 was comfortable to wear… at least for the few minutes I had it on my head. It offers various fitting options to make sure that you can find the right fit for your head, and this is very good from an ergonomics standpoint. On the negative side, I found it to be a bit heavy.
My biggest problem with the XR-3 is that after I try an AR demo with it and then remove the headset, my eyes suddenly feel a lot of strain, as if they have to get used to reality again and need a bit of time to recover. I don't know if it is because of the distortions (but I don't think so, because I didn't have this problem with the VR-2), or because the passthrough feels almost real, so my eyes treat it as the real thing; but since it has a fixed focus, the vergence-accommodation conflict makes my eyes a bit dizzy when they get back the variable focus of reality. I spoke about this with Varjo, and they were quite surprised by my feedback, but they will try to investigate.
Final impressions
Varjo XR-3 has a very good reputation in the industry, and for a good reason: it is a high-quality device with the best passthrough on the market. But as with all XR devices available nowadays, it doesn't come without problems, with the lens distortions and the eye fatigue being the most annoying for me. Overall I think it's a good headset, though.
Varjo Reality Cloud
Varjo Reality Cloud is the cloud rendering service offered by Varjo for its headsets, and it can work with both AR and VR experiences.
How does it work
I guess you are already familiar with cloud rendering: it is the solution that lets you render the virtual content of an XR experience in the cloud, so that you consume fewer computational resources locally. Varjo has decided to offer this service for a simple reason: its high-end headsets, like the XR-3, require a very powerful computer to work reliably (at least an RTX 3080, someone told me), but not everyone has one. If many people in a company want to try some XR experience, maybe because they all must review the XR prototype of an object to manufacture, equipping all of them with expensive computers may not be the best choice for the company. With Varjo Reality Cloud, everyone can attach an XR-3 headset even to a laptop with mid or low-mid specifications and use it smoothly. The idea is to "democratize" the use of Varjo headsets.
For all of you who like technical details: I've been told that Varjo Reality Cloud runs on top of AWS, on the new G5 instances that have dual NVIDIA A10 GPUs onboard. There are various server locations, both in Europe and in the US, and Varjo is working to expand this network because, as you know, the closer the rendering server, the lower the latency of the streamed rendering. The whole streaming solution is proprietary to Varjo and is not based on NVIDIA CloudXR. For now it supports only Varjo headsets, but they're also planning support for other vendors' hardware.
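To make the architecture clearer, here is a minimal, purely conceptual Python sketch of the local/remote split that a service like this implies. Varjo's streaming stack is proprietary, so every function and number below is a hypothetical stand-in: the only point is to show that the heavy rendering happens in the cloud, while the laptop just composites the streamed frame with the locally processed passthrough (which is why, as I'll describe below, the passthrough has no streaming lag even when the virtual content wobbles).

```python
# Conceptual, hypothetical sketch of the local/remote split in a cloud rendering
# service. Not Varjo's actual code: it just simulates which work moves to the
# cloud and which stays on the laptop.
import numpy as np

def server_render(head_yaw_deg: float) -> np.ndarray:
    """Stands in for the AWS GPU instance: 'renders' the heavy model for a given
    head yaw and returns the frame that will be streamed to the client."""
    frame = np.zeros((8, 8, 3))
    frame[:, int(head_yaw_deg) % 8, :] = 1.0     # stand-in for the rendered car
    return frame

def client_composite(streamed_frame: np.ndarray, passthrough: np.ndarray) -> np.ndarray:
    """Runs locally on a modest laptop: just blends the streamed virtual layer
    over the passthrough image. The passthrough never crosses the network, so it
    has no streaming lag even when the virtual layer arrives late."""
    mask = streamed_frame.sum(axis=-1, keepdims=True) > 0
    return np.where(mask, streamed_frame, passthrough)

# Simulate one frame where network latency is worth ~2 "degrees" of head motion:
# the virtual layer was rendered for an older pose, so it lags behind the head
# (the "wobble"), while the passthrough stays in sync with the real pose.
current_yaw, network_delay = 5.0, 2.0
stale_frame = server_render(current_yaw - network_delay)
passthrough_image = np.full((8, 8, 3), 0.5)       # local camera image
display = client_composite(stale_frame, passthrough_image)
```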
Hands-on impressions
The demo I was shown was about seeing a beautiful luxury car in front of me in augmented reality. I guess the purpose was to remind me that I’m too poor to buy both the Varjo XR-3 and the luxury car.
The Varjo employee told me that the car was an unoptimized model coming directly from Autodesk VRED, so it was not a VR-ready model at all; it was a model designed for offline rendering. In the remote desktop session with the rendering server on AWS, I could see a VRED window with the car model loaded inside. I was also told that the car demo was a collaborative one, with up to 5 people able to connect to the same virtual room and discuss the car together, but I tried this demo alone.
When I put the headset on, I saw the car in front of me. It was beautiful, and there were no signs at all of the remote rendering. I couldn't see streaming artifacts; the car looked as if it were rendered locally, and it was impressive. I was asked to look at the performance tab on the laptop that was running the demo: the RTX 3080-powered laptop was using just 20% of its GPU to power the XR-3 headset and show that heavy car. This showed me that the server was doing all the heavy lifting of the rendering.
Regarding the lag, unluckily I can't express a reliable opinion. The hotel's Wi-Fi network was not the best ever (I can confirm this), and it added some visible latency to the streaming. This meant that when I moved around the room, I could see the car wobbling slightly instead of staying static. The wobbling was not huge, but it was noticeable… the car was lagging behind the movements of my head. I had no nausea, though, because the passthrough was still processed locally to guarantee the most comfortable experience for the user (so the passthrough had zero lag). So I can't express a complete opinion on the overall experience, because I would like to test it in ideal network conditions. For now, I can say that even with non-ideal Wi-Fi, the cloud rendering was still working and providing good visuals and a decent lag.
Final impressions
I am a big fan of cloud rendering, and the one provided by Varjo seemed to work well in my opinion. I'm just wondering if it is really needed: if someone has $6000 to spend on a headset, they should also have $2500 to spend on a high-end laptop… 😛
OpenBCI Galea
Varjo has recently partnered with OpenBCI for Galea, a headset full of sensors that help in understanding the state of the user's brain. Priced at $22,500, it should offer the best BCI performance currently available on the market. Read more about it in my dedicated post.
A sad story
I love BCI, and so I desperately tried everything to be able to test Galea: I asked Varjo PR for a demo, I stalked OpenBCI PR people for 10 days, I knelt in front of OpenBCI people on the AWE show floor (literally), and I even entered the Varjo demo room when not authorized. I lied, cheated, and stole, but with no results. OpenBCI didn't let me have a demo of it.
What I got out of it
Anyway, I didn't leave AWE completely empty-handed.
First of all, I talked with some people who had a quick demo of the device, and they confirmed to me that it actually works: it can read data from its many sensors and show them in a dedicated viewer on the computer screen.
Then, I was able to see a Galea prototype myself: I caught the OpenBCI CEO wearing it while he was walking around the AWE Playground area and asked to have a look at it. It was exactly like in the pictures, with sensors on the face and around the head. I also tried to wear it, and I noticed that it was not very comfortable. OpenBCI says that it has invented a new material for the EEG sensors that makes them very comfortable, but I kindly disagree. Maybe they are used to sleeping in a bed of nails, and so for them the headset is comfortable… but for me, it was not. The tiny plastic spikes of the EEG sensors that have to touch the scalp all together feel like many tiny nails poking your head. I can't say it was a very pleasant sensation. It is quite common with EEG sensors, though (NextMind gave me similar sensations, for instance), so I don't blame the OpenBCI people for this problem. What surprised me instead was the size of the EEG sensors, which are quite small, smaller than the other ones I have tried until now.
Varjo People
My technical review is over, but I want to finish this article by talking about my experience with the people at Varjo, who have all been very kind and available. I especially want to give a shoutout to two of them.
The first one is Johnathan Sutherlin, who works in cloud services at Varjo. He was the "Varjo employee" who gave me the demo of the Varjo Reality Cloud. He kindly greeted me and then said something about us knowing each other. I had no idea what he was talking about, so I did the first thing I do in these cases: I looked at his badge and read his name. When I read it, I had a hugely emotional moment. Johnathan was basically the first person ever to interview me: he was part of a company called Quad7Computers, and they fell in love with the full-body VR solution we were building at Immotionar. It was something like 2015… I still had a long beard and long hair. They were so passionate about it that I remember Johnathan spamming on Twitter all the companies going to E3, telling them they had to use our full-body solution because it was the best! He was truly a fan! I had an interview with him and his partners, and it was a lot of fun… it was really the first interview of my life, or something like that.
Seeing him, I felt as if everything that has happened in these long years in VR was passing in front of my eyes: we met when we were both so passionate about VR, desperate and poor… and after so many years… we are still passionate, desperate, and poor, but at least now he's at an important company like Varjo. It was a very emotional moment seeing him in person for the first time in my life. I almost wanted to cry, then I remembered that I am a man and that crying is not allowed by our contract, so I avoided doing that. But it was like seeing an old friend after many years, and I was so happy to see him being part of such a good team as the one at Varjo. I'm proud of him and of the career he has had. He was very good at demoing the Reality Cloud to me (even if I had an absurd look with pink hair), answering my questions, and listening to my feedback. I wish from the bottom of my heart that he can keep building his great career in XR.
My reference to pink hair brings me to the second person I want to mention here, that is, Ida-Emilia Kaukonen. She's kind of new to the Varjo PR team and was tasked with managing the relationship with me and other press people on behalf of Varjo. Poor her: she didn't know that I'm slow at answering on whatever channel I'm contacted on, because I'm always too busy. She tried in every way to organize a demo of Varjo Reality Cloud for me, but with me answering once every three days, and with e-mails composed of a maximum of 7 letters, it was getting hard. But being a kind and determined person, she booked the last available slot for trying the cloud under her own name, so that she didn't have to wait for an answer from me, and "forced" me to come. I mean, it was a demo at 8:30 am, a time that is very Finnish but not very Italian, considering that for us the day begins at midday… but at least I had a demo, and if I'm writing this article today, it is because of her determination.
Things became fun when I arrived at the demo room and, before entering, she told me again that the demo was booked under her name, so she was there just in case there were problems. Being an idiot, I suggested that if the demo was under her name, I should become her. So we got an idea: I took her badge and some of her pink hair and entered the room as Tony-Ida, with the Varjo people quite confused about what was happening. Yes, I did all the demos with pink hair on my head. Yes, no one was able to take me seriously in that room. Yes, at one point one Varjo guy looked at my badge and was confused for a second about whether I was Tony or Ida. I took this as a compliment.
When the demo ended, we both had to go to the show floor, so we walked there together, and it was pretty funny seeing the shocked reactions of the people who saw first her, and then me, with the pink hair…
… my friends were screaming "Tony, nice hair!", or "You are beautiful!" and things like that. We should have filmed this and made a candid-camera video. Once we arrived at the show floor, I decided we should take back our original identities before she could become Tony and start complaining about the quality of pasta and pizza in the US. But those were some fun, crazy minutes…
Anyway, many thanks to her for how she managed to get me a demo of the Reality Cloud, for also trying to help me get the demo of Galea, and especially for having helped me with the name-change joke! One of the best PR people ever.
And with this final fun moment, I invite you all to share this post with your peers on social media channels to make me a bit happier! And don’t forget to comment saying how much you like me with pink hair…
Disclaimer: this blog contains advertisement and affiliate links to sustain itself. If you click on an affiliate link, I'll be very happy because I'll earn a small commission on your purchase. You can find my boring full disclosure here.