These days at AWE have been intense… in a good way. Today I have selected two cool things to talk about with you: the new version of the HaptX haptic gloves, and the new “Teleport” service by Varjo. Then I am reporting on the (in)famous announcement by Palmer Luckey about him working on a new headset! And if you are interested in my personal story at the event, you will also discover whether in the end I fulfilled my dream of having a picture with Palmer Luckey or not…
HaptX G1 Gloves
Last year at AWE, HaptX teased that it was working on a new model of its haptic gloves, the G1. This year the gloves were not just on a mannequin: I could finally try them.
The G1 Gloves represent a step forward for the usability of HaptX Gloves: they are smaller, more ergonomic, and cheaper than the previous generation. I was the first one to try them at the HaptX booth and I have to say I was pleasantly surprised by them. Not only did the gloves appear smaller, but the box they had to be connected to was smaller, too: there was no need for two boxes anymore, one box was enough. And this box can stay on a table or can also be worn in a backpack to make the HaptX gloves setup wireless.
If you are new to VR and you see pictures of the HaptX G1 Gloves, you may find them pretty big, and you may be confused by my positive statements. Well, that is because you have to understand what HaptX is trying to do. Every company in immersive realities is following a specific approach, and HaptX has chosen to prioritize very accurate haptic feedback on the hand, at the expense of other factors, like ergonomics. So the first edition of the device already delivered very cool haptic sensations, but it was incredibly big and heavy. Every subsequent version of the device had the purpose of shrinking the setup. Other companies are following a different approach: for instance, bHaptics (which I will talk about in another post) has chosen to start with comfortable and affordable gloves and make the haptic sensation more accurate at each iteration.
So you have to see these gloves in this light, comparing them to the previous version. And since I tried the previous version a couple of times, I can tell you that the step forward is very noticeable. The gloves are still bulky and still make you look like a cyborg ready to kill John Connor, but they are smaller than before and much easier to wear: let’s say you look like a more human cyborg now. The setup at the booth was quite fast: I just had to wear them and close a knob. And in front of me, on the desk, there was only one box pumping the air into the gloves.
I also noticed that they were more comfortable. There was one feature in particular that made them feel much better: the gloves are no longer a single rigid piece of plastic. In the new version there is a joint made only of fabric at the wrist, which means that while wearing these devices you can finally move your wrist freely. This is a huge improvement over the previous version.
While trying the demo, I also found that the haptic sensation on my fingers and my palm was better than before, too. I could feel the tingling touch sensation of objects on my hands and it felt even better than the last time I tried the device. I cannot explain exactly in what sense, but it felt like a more natural touch sensation to me. I could feel that sensation both when I was grabbing virtual objects in my hand and when I was just touching a virtual surface and sliding my hand over it. It was still not a perfectly realistic sensation, and there was a slight lag between when I saw my virtual hand touching the object and when I felt the touch, but it was good.
The highlight of my demo was when I put a little fish on my palm; it was wriggling like any fish out of the water. Thanks to the accurate positioning of the haptic feedback on the various parts of my hand, I was able to feel the spasms of the fish, and I could feel it sliding over my hand until it fell to the ground. That was pretty amazing.
The force feedback sensation, instead, was more or less like before. Force feedback is the sensation that virtual gloves give you about the resistance of an object: for instance, when you grab a virtual ball, the force feedback should apply a pulling force to your fingers to prevent your real fingers from penetrating the virtual ball, simulating the resistance of the object’s surface. HaptX gloves provide this feature, and you feel a force being applied to your fingers when you grab objects, but this force does not feel very natural and it is also not strong enough to really simulate the stiffness of an object. I have to say that no glove in XR is currently able to provide very good force feedback, though.
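To give a more concrete idea of the principle at play, here is a toy sketch of the spring (penalty-based) model that haptics textbooks typically use to compute such a restraining force. I have no insight into HaptX’s actual algorithm, so the function name and all the numbers here are purely illustrative:

```python
def restraining_force(finger_pos, surface_pos, stiffness=300.0):
    """Penalty-based force for a finger pressing against a virtual surface.

    Positions are in meters along the pressing axis; stiffness in N/m.
    These are illustrative values, not HaptX's real parameters.
    """
    # Penetration depth: how far the real finger has pushed past the
    # virtual surface (zero or negative means no contact yet)
    penetration = finger_pos - surface_pos
    if penetration <= 0.0:
        return 0.0
    # Hooke's law: the pulling force grows with penetration depth,
    # resisting the finger and simulating the object's stiffness
    return stiffness * penetration

# 2 mm of penetration at 300 N/m yields a 0.6 N restraining force
print(restraining_force(0.012, 0.010))
```

The weak point the gloves show in practice maps directly onto this model: if the maximum force the hardware can exert is low, a stiff object behaves like one with a small `stiffness` value, and your fingers sink into it.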
In the end, I was satisfied with my test of the HaptX G1. The gloves are smaller and more comfortable than before. They retain their flagship feature of letting you feel haptic sensations all over your hand surface, and this haptic sensation is improved too. I’ve heard that the price is also lower than before, which is good, but since they don’t want to publicly share how much it is, I guess they are still very expensive. The HaptX G1 are of course dedicated to the enterprise, and thanks to their innovative features, they have already generated more than ten million dollars in preorders, and the first units are beginning to ship to North America in the next few days. I mean, they are not perfect: they are still big, they still require a big box, and the force feedback is not perfect. But this is because of the approach they have chosen of prioritizing the haptic sensation and trying to improve the other factors over time. And over these years I’ve seen them constantly improving version after version, so I think they are doing a good job. Congratulations Jake, Joe, and the whole team!
If you want to discover more about HaptX G1, you can head to this website: https://g1.haptx.com/learnabout
Varjo Teleport
Varjo has a private room at AWE, where it is not only demoing its XR-4 mixed reality headset, but also showcasing for the first time its new product, called Teleport. Teleport is a service that brings real locations inside virtual reality so that you can “teleport” into them. This service has just been announced, but I’ve been told the company has been working on it for years.
The way Teleport works is pretty straightforward: you use the Varjo mobile app to scan the environment, you let the Varjo cloud crunch all the data, and then the digitized environment is ready to be downloaded and entered in VR.
The scan with the mobile app is the typical scan of a place you would do for photogrammetry or for a VPS: you move around the place and scan it with your phone, trying to look in all directions and to capture the various parts of the environment from different points of view. Currently, the scanning operation is possible only with an iPhone. I asked the Varjo representative why they are not using an existing service to scan the environment. He told me that while there are already many apps on the App Store for object scanning, there are not so many for environment scanning, so they decided to build their own.
I then tried to enter a few environments, teleporting into them. The environments looked like the real ones but with a rather strong fuzzy effect. I was told that the visualization uses Gaussian Splatting, so it can be very accurate, but if a part of the environment does not have enough data (e.g. because it has not been scanned from enough points of view), the Gaussians try to express at their best the probability that a certain pixel has a certain color, and this results in an effect that looks like the environment has been painted with watercolors. So the environments I was in had elements visualized in three possible ways:
- Some had been poorly scanned, and they looked like clouds painted with watercolors
- Some had a decent scan and looked like the real object but with some fuzzy strokes around it… as if it were visualized in a dream
- Some were scanned well and looked incredibly good
In my tests, most of the visuals were of the second type: they were good enough, but they looked like they came out of a dream. So the reconstruction of the places was working fairly well, but the places did not look realistic at all; they seemed more like an artistic representation of themselves. Personally, I find this acceptable at this stage because the Teleport service is currently flagged as a “technology preview”, meaning that it is at an alpha stage, so it is normal that it is not working perfectly.
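For the curious, here is a toy one-dimensional sketch of why under-scanned regions get that watercolor look in a Gaussian Splatting renderer: each Gaussian contributes its color weighted by an opacity falloff, and the contributions are alpha-composited front to back. Where data is sparse, a few large, diffuse Gaussians smear their color over a wide area. This is a didactic simplification of the general technique, not Varjo’s actual renderer:

```python
import math

def splat_color(x, gaussians):
    """Alpha-composite 1D Gaussians (sorted front to back) at point x.

    Each Gaussian is (mean, sigma, opacity, color). A small sigma gives
    a sharp, confident color; a large sigma spreads its color far from
    its center, producing the blurry 'watercolor' effect.
    """
    color = 0.0
    transmittance = 1.0  # how much light still passes through
    for mean, sigma, opacity, c in gaussians:
        # Opacity falls off with distance from the Gaussian's center
        alpha = opacity * math.exp(-0.5 * ((x - mean) / sigma) ** 2)
        color += transmittance * alpha * c
        transmittance *= 1.0 - alpha
    return color

# A sharp Gaussian reproduces its color exactly at its center...
print(splat_color(0.0, [(0.0, 0.05, 1.0, 0.9)]))
# ...while a wide, diffuse one bleeds a washed-out color far away
print(splat_color(1.0, [(0.0, 2.0, 0.5, 0.9)]))
```

More scan viewpoints give the optimizer enough data to replace those big diffuse blobs with many small, precise ones, which is why the well-scanned ATM I mention below looked so crisp.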
I think Varjo Teleport can be an interesting service: I’m personally a believer that the scanning of environments will be huge in the future. I was also a fan of a “social media” of scanned environments that existed a few years ago: I still see that as a potential long-term use case. But at the same time, I also have to say that Varjo still has a long road of optimization ahead. I think they did a good job in making the scanning part easy to perform with just a phone. And the visualization of the parts that have been scanned thoroughly is impressive. There was an ATM in one of the environments I was in that looked incredible, especially through the lenses of the high-resolution Varjo XR-4. But all the other places looked a bit dreamlike, and if Varjo wants this to truly feel like teleportation, it needs the environments to look like reality. It’s a lot of work, but I think it’s doable if the company keeps investing in the service.
If they manage to polish it, I think the Teleport service may have many use cases, both on the enterprise and the consumer side of things. In the meantime, I’ve been told that even the current fuzzy version can still be useful in those cases where true accuracy is not necessary. For instance, if a company is organizing an event abroad, an employee at the event site could scan the surrounding environment a few days before the event, to give the company stakeholders a quick look at the event location so that they can give feedback. In this case, perfect accuracy is not necessary and the service could already be useful as it is today.
Palmer Luckey’s announcement
If you follow me on social media, you know that my main goal at this AWE was to get a photo with Palmer Luckey, because in my ten years in VR, I have never had the chance to meet him, and he’s my idol because he started this new generation of VR. The plan was to attend his panel with Darshan Shankar (the founder of BigScreen) and then, after the talk, approach him and ask for a picture.
Palmer was very fun during the panel and cracked a lot of jokes. It was really entertaining to hear him. But between one joke and the next, he and Darshan said a couple of interesting things. Palmer remarked that in these 10 years, VR has made a lot of steps forward: the BigScreen Beyond is 3 times lighter and 20 times more powerful than the Oculus Rift DK1. This shows how much progress VR has made, even if we just consider tethered PCVR. He also said that it’s bizarre that VR is still considered an unsuccessful niche: the Nintendo 64 sold around 30M units and is considered one of the most iconic consoles of all time. The Quest has sold between 20M and 30M units and is considered a niche device. The diffusion of the smartphone, which has reached billions of people, has distorted our perception of what “success” means for a tech product.
They also gave a suggestion for young people who want to succeed: ask very specific questions of the people who are experts in that specific field. Do not ask generic questions, and do not ask people who are experts in other fields. And don’t be afraid of contacting important people: show your commitment to what you are doing, show your passion, show the cool things you have built, and ask for specific help, and someone will help you for sure. Palmer can speak from experience on this: he started when he was 19 years old, and by showing his cool prototypes, he was able to get the interest of a divinity like John Carmack.
Everyone was expecting Palmer Luckey’s announcement about him working on a new headset. But that was a big anticlimactic moment: while he was speaking, he just said that he’s now working on a “headset with military requirements for military stuff that can also be useful for non-military stuff” (this is not the exact sentence, but it was more or less like this). His current business is technology for US defense, so it makes sense that he’s working on a headset for defense. More than an “announcement”, it was just a remark made in passing for our interest. It sounded more like “oh guys you know, I’m back to working on headsets, but for my new field, cool story bros”. So we all speculated about things like an “Oculus Rift 2” for nothing.
Then the panel ended, the speakers went backstage, and the presenter of the show announced that the organization had decided that no one could wait for Palmer to get a picture, an autograph, a chat, or anything else. Everyone in the room was very disappointed. We all wanted a picture with Palmer, and some people even wanted him to autograph their headsets…
So my dream of having a picture with him after his speech sadly shattered.
One hour before
Around one hour before the beginning of Palmer’s panel, I finished a very interesting business meeting at the Hyatt hotel close to the convention center. I went to the toilet, then got some water to drink. I had to head back to the convention center to visit the Qualcomm booth, and I decided to take the path upstairs, with some escalators that let me easily go from the hotel to the AWE venue. At a certain point, while I was walking towards the escalators, I looked up and saw two guys walking, one of them with a bit of an eccentric haircut. I looked at him more closely: he was Palmer Luckey. At that point, my brain thought only one thing: THIS IS YOUR FUCKING OPPORTUNITY, TONY.
I started running up the escalators, but then again, I didn’t know what to do: I could not jump on Palmer Luckey from behind while screaming “Surprise selfie!”. So I started walking a bit faster, hoping for an opportunity: luckily he, Darshan Shankar (who was with him), and Sonya from AWE stopped outside for a picture. So I quickly approached them and stood behind Sonya while she was taking the picture. I looked at Palmer, smiling and keeping my index finger up close to my chin like a little student who has to ask his teacher a favor. I was embarrassingly smiling, then Palmer looked at me, smiled back, and said “TONY! I saw your tweet today! I know you were looking for me!”. He was referring to a tweet I had published that day where I said I absolutely wanted a picture with him.
I was kinda excited and confused: Palmer Luckey knew my name! And he read my tweet! That was fucking cool. No wait, it was even more than cool. Why does he know my name and my face? Oh wow
He had to move very quickly, but he was very kind and let me take a selfie with him. I also hugged him and thanked him for everything he has done for VR. He thanked me too, for being a “VR cheerleader”, and then he really had to go. He gave me the impression of being the kind and fun guy that everyone who has met him describes.
And so now, after ten years in VR, I have my selfie with Palmer Luckey. And this makes me feel very happy.