I have many sweet memories of my trip to China. One of the best is of when, in Beijing, I visited the company 7Invensun, a worldwide excellence in eye tracking. It was a great moment both personally and professionally: as a person, I was delighted by their kind attitude; they really made me feel like a friend of theirs. As a VR professional, I was impressed by their eye tracking technology and the various devices they have in the pipeline, which for instance will bring eye tracking to the Vive Focus and HoloLens. Let me tell you the whole story.
You surely remember the name 7Invensun: I already interviewed them when they announced that their eye tracking module aGlass DK II was able to provide eye tracking not only for the Vive, but also for the Vive Pro. So, when I decided to go to Beijing, I absolutely wanted to visit them and try it with my own eyes.
The company is located inside a skyscraper in Beijing and has a very nice office. When I arrived there, I was overwhelmed by their kindness. I had spoken with them a few times on WeChat, but they welcomed me and my Chinese assistant Miss S as if I were their best friend. I started shaking hands with people I had only spoken to virtually, like Lee and Kristina, and it was really nice seeing them in person.
After that, I started trying 7Invensun's devices. We started with the aSee Binocular Eye Tracker, an eye tracking device for desktop PCs. PC was the first target market for this company, which started working on eye tracking in 2009 (so even before Tobii). The first goal was helping disabled people use electronic devices, but then they realized the many other amazing applications of eye tracking and expanded their offering. A 7Invensun employee (I don't remember her name, I'm so sorry) showed me various photos on the PC screen, and the only thing I had to do was look at them. (Of course, knowing that my eyes were tracked, I purposely avoided looking at compromising details of the various photos :D) After this little presentation, she showed me a heat map of where I had looked, revealing which images I had looked at the most and which details of each image I was most interested in. Then she opened an Excel file, where I could clearly see all the data about my eye movements while looking at the various photos.
I was impressed: the program recorded everything I looked at, and this has amazing applications, as she confirmed. For instance, it can be used in psychology (seeing what kind of images arouse your interest can help in discovering psychological issues), and it would be massive in UX design. Imagine if you were a website/app designer and you could see the journey of your users' eyes over your page. You could discover where they instinctively look for information first and which regions they look at the most, and re-organize the page according to this precious data so that it becomes more usable and effective.
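To give you an idea of how a gaze heat map like the one I saw can be built, here is a minimal sketch: it just accumulates raw gaze samples (x, y in screen pixels) into a coarse grid of "attention" per cell. All numbers and names here are my own illustration, not 7Invensun's actual software.

```python
import numpy as np

def gaze_heatmap(samples, width, height, cell=40):
    """Accumulate raw gaze samples (x, y) in pixels into a coarse
    dwell grid. Each sample adds one tick of attention to the cell
    it falls in; the normalized grid can then be color-mapped over
    the image the user was looking at."""
    grid = np.zeros((height // cell + 1, width // cell + 1))
    for x, y in samples:
        if 0 <= x < width and 0 <= y < height:
            grid[int(y) // cell, int(x) // cell] += 1
    return grid / grid.max() if grid.max() > 0 else grid

# Hypothetical samples: 50 ticks clustered around one detail of a
# photo, plus a single stray glance elsewhere on a 1920x1080 screen
samples = [(100 + i % 5, 80 + i % 7) for i in range(50)] + [(900, 500)]
heat = gaze_heatmap(samples, 1920, 1080)
```

A real tracker would also filter out blinks and smooth the samples into fixations, but the heat map idea is exactly this: dwell counts per region.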
She told me that lots of different professionals are already using eye tracking applications: for instance, it is used by architects to analyze which parts of a building attract users' attention, or by trainers to see if the trainees are actually paying attention to what they say. 7Invensun is working with all these professionals to exploit the power of eye tracking, which can disrupt various sectors.
After this little PC demo, we switched immediately to VR. Lee handed me a Vive Pro with an aGlass DK II installed inside, and I was ready for the party. Before actually using it, I had to perform a little calibration stage, which is necessary to adjust the tracking parameters to my particular eye configuration. Basically, I had to adjust the IPD of the headset mechanically to fit my eyes (it would be cool if the Vive Pro could adjust the IPD automatically depending on what the aGlass device detects) and then look at some points popping up in different positions on the screen. It was very fast, took less than a minute, and was necessary only once per session. After that, I was able to try various demos.
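If you wonder what looking at those popping-up points is for: the system records the raw pupil position for each known dot and fits a mapping from pupil coordinates to screen coordinates. Below is a minimal sketch of the idea using a simple affine least-squares fit; real systems (certainly including the aGlass) use richer per-eye models, and all the numbers here are made up.

```python
import numpy as np

def fit_calibration(raw, screen):
    """Fit an affine map from raw pupil-center coordinates to screen
    coordinates, using least squares over the calibration dots."""
    raw = np.asarray(raw, dtype=float)
    A = np.column_stack([raw, np.ones(len(raw))])   # rows [x, y, 1]
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(screen, dtype=float),
                                 rcond=None)
    return coeffs                                   # (3, 2) matrix

def map_gaze(coeffs, raw_point):
    """Map one raw pupil position to a screen position."""
    x, y = raw_point
    return np.array([x, y, 1.0]) @ coeffs

# Hypothetical calibration: dots at the 4 corners + center of a
# 1920x1080 panel, with invented raw pupil readings for each
raw    = [(5, 3), (101, 3), (5, 93), (101, 93), (53, 48)]
screen = [(0, 0), (1920, 0), (0, 1080), (1920, 1080), (960, 540)]
c = fit_calibration(raw, screen)
```

Once the fit is done, every raw pupil sample can be turned into a gaze point on the screen, which is why the procedure only needs to run once per session.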
The first one was about foveated rendering: they activated foveated rendering and showed me that it worked. Evaluating this demo was a bit strange for me, since there was no on/off switch for the foveation, so I couldn't compare the visuals with and without it, and I can't guarantee that there was absolutely no graphical difference compared to standard rendering. For sure, with foveation the visuals were great, and I couldn't spot the rendering changing as it followed my eyes and downgraded the regions I was not looking at. So, it worked well. Then they also showed me foveation in NVIDIA VR Funhouse. With that demo, I noticed that with very aggressive foveation settings (i.e., the areas you are not looking at get downsampled a lot), the difference is noticeable, so I learned that foveated rendering parameters have to be calibrated well, otherwise the trick becomes visible. They also slowed down the eye tracking, so I was able to see what foveated rendering looks like when the software lags behind your eyes… and it is a trippy experience where you see a high-resolution window moving inside your vision 🙂
I think that foveated rendering will be a fundamental evolution for virtual reality, because it will relieve the graphics card of a lot of work. This means that, on one side, VR developers will be able to deliver experiences with better graphics, and on the other side, even people without powerful graphics cards will be able to use virtual reality headsets. I really can't wait for it to become widespread.
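The core trick can be sketched in a few lines: render at full resolution near the gaze point and let the resolution fall off with distance from it. The radii below are made-up tuning knobs, exactly the kind of parameters that, as I noticed in the demo, must be calibrated well so the periphery downgrade stays unnoticeable.

```python
import math

def shading_rate(pixel, gaze, full_res_radius=200, falloff=600):
    """Fraction of full resolution to use at a pixel, given the current
    gaze point (both in screen pixels). Inside the foveal radius we keep
    full resolution; outside, the rate falls off linearly to a floor.
    Numbers are illustrative, not from any real implementation."""
    d = math.dist(pixel, gaze)
    if d <= full_res_radius:
        return 1.0
    return max(0.25, 1.0 - (d - full_res_radius) / falloff)
```

A GPU implementation applies such a function per tile rather than per pixel (e.g. via variable-rate shading), but the saving is the same: most of the frame is far from the fovea and can be shaded much more cheaply.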
Another VR demo let me interact with a fantasy world just by using my eyes. I could move inside VR using only my gaze: I could look at a particular position in the world and then teleport there. Then I found myself in front of a table with three objects on it, and just by looking at them, I was able to see further info about them. I felt a bit like the Terminator, who could see information about the objects and people he looked at. I had super vision. Another demo put me in a plane, shooting at enemy planes that came towards me just by looking at them. It was fun. I think that using the eyes to interact with stuff can not only reduce hand and neck fatigue while using VR apps, but can also help disabled people use virtual reality experiences.
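Gaze teleportation is usually built from two pieces: intersecting the gaze ray with the ground, and a dwell timer so you don't teleport every time you glance somewhere. Here is a minimal sketch of both, with structure and numbers that are my own guesses, not 7Invensun's demo code.

```python
import numpy as np

def gaze_ground_point(head_pos, gaze_dir):
    """Intersect the gaze ray with the ground plane y = 0.
    Returns the hit point, or None if the user is looking up."""
    head_pos = np.asarray(head_pos, dtype=float)
    gaze_dir = np.asarray(gaze_dir, dtype=float)
    if gaze_dir[1] >= 0:
        return None
    t = -head_pos[1] / gaze_dir[1]
    return head_pos + t * gaze_dir

class DwellTeleporter:
    """Trigger a teleport once the gaze stays near one spot for
    dwell_s seconds (hypothetical thresholds)."""
    def __init__(self, dwell_s=1.0, radius=0.5):
        self.dwell_s, self.radius = dwell_s, radius
        self.target, self.timer = None, 0.0

    def update(self, hit, dt):
        """Call every frame with the current ground hit and frame time.
        Returns a destination point when the dwell completes, else None."""
        if hit is None:
            self.target, self.timer = None, 0.0
            return None
        if self.target is not None and \
                np.linalg.norm(hit - self.target) < self.radius:
            self.timer += dt
            if self.timer >= self.dwell_s:
                self.target, self.timer = None, 0.0
                return hit          # teleport destination
        else:
            self.target, self.timer = hit, 0.0
        return None
```

Looking at a spot for one second teleports you there; glancing away resets the timer, which is what makes gaze interaction comfortable instead of trigger-happy.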
The last demo was about analytics: I found myself in a virtual supermarket, and I was able to buy stuff in VR just by picking items naturally. In the end, I could go to the cash desk of the supermarket and pay for what I had bought. After I did my shopping, a 7Invensun employee pressed a key, and in VR I could see the 3D world around me become a heat map of what I had looked at. The world was white where I had never looked, and showed a color ranging from green to red for the points I had looked at, depending on how long I had stared at them.
These analytics data can have two important applications:
- See what kind of products people are most interested in;
- See where people mainly look for products, so as to adjust the supermarket layout.
This is precious data for all retail and e-commerce firms. In fact, this demo was developed together with JD.com, one of the most important Chinese e-commerce websites.
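The white/green/red visualization I described maps dwell time per point to a color. A minimal sketch of such a mapping (my own illustration, with RGB values in 0-255) could look like this:

```python
def dwell_color(dwell, max_dwell):
    """Map dwell time on a point to a heat color: white for spots
    never looked at, then green fading to red as the dwell time
    approaches max_dwell."""
    if dwell <= 0:
        return (255, 255, 255)               # untouched: white
    t = min(dwell / max_dwell, 1.0)          # 0 -> green, 1 -> red
    return (int(255 * t), int(255 * (1 - t)), 0)
```

In the supermarket demo this color would be painted onto the 3D geometry itself (e.g. as vertex colors or a lightmap), turning the whole shop into one walkable analytics report.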
I loved all of their demos. But as always, I also have some concerns I want to share with you:
- First of all, privacy. Eye tracking is awesome, but it gives companies like Facebook the power to discover everything we look at. This is frightening because at the moment companies can mostly track what we voluntarily do (like giving a like or sharing something), while with eye tracking they could also discover what we instinctively look at, what we are unconsciously interested in. I am sincerely afraid of this, so I hope there will be regulation regarding the use of eye tracking data;
- The second one is about tracking accuracy. The tracking technology worked very well while I looked in front of me, but tracking precision degraded as I moved my eyes too far left, right, up or down;
- The third is about using the interfaces. When I had to use my eyes to perform some actions (like looking at a point to teleport there), I found it very comfortable and easy, but at the same time a bit strange, since I don't usually use my eyes to perform actions. In real life, I use my eyes to inspect things, not to operate on them. I found the experiences that let me use my eyes in the normal, passive way more natural. This taught me that at first we should focus on using the eyes in VR in a natural way, and then maybe slowly move toward using them to interact with stuff. I think we will all have to learn to use our eyes to do more things than we are used to doing now;
- The fourth thing is about fatigue. I tried an extreme test: I didn't move my head and just moved my eyes to shoot at the enemies in the action game. The result was that my eyes got really tired from continuously moving to shoot at stuff. So I learned that using eye tracking doesn't mean not moving the head: the most comfortable thing is to move both in a natural way.
Anyway, I was satisfied with the tests, and Miss S, who is not a techie, also liked it and had no issues learning how to use it. That's great, and it means that using the eyes is so natural that even general consumers do not need a tutorial. After the tests, I also realized that eye tracking in VR is not consumer-ready yet, for the above reasons. On the hardware side, we need devices that work every time with great precision and almost no calibration; on the software side, we need programs with a proper UX for using the eyes. That's why the aGlass is still a dev-kit. A very cool dev-kit, IMHO, but still a dev-kit.
After all these demos, we all went to have lunch together, and I ate a lot of delicious Chinese food. After that, I met the company CEO, Mr. Huang Tongbing, and we had a meeting to talk about my experience with the device and the future plans of the company. Miss S helped a lot during this stage by translating from Chinese to Italian, so I have to thank her a lot.
The first question was about their future plans: such a cool device should be available all over the world and shouldn't be relegated to the Chinese market only. Mr. Huang told me that 7Invensun's aGlass devices will go worldwide. At first, they will be launched very soon in countries near China, and then, with time, they will try to conquer the whole planet.
I asked him about the problems I had during the demos, and he said that they are working hard on improving the tracking and on offering a proper UX that exploits eye tracking. They are also working on making the aGlass more comfortable for the user. Regarding my issue with reduced tracking precision when I looked far from the frontal direction, he explained that this is a technical problem. When the eye looks forward, the system can clearly see the iris and the pupil with their circular shape, but when you look too far left or right, for instance, part of the iris becomes occluded by the eyelid, so it has a different shape in the images shot by the internal cameras. This makes tracking more difficult. They have an enormous database of eye images and are training their AI to solve this issue. He also added that since headsets' FOV is currently limited, people tend not to move their eyes that much, so the most important thing is having better tracking in the central region of vision.
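Part of that shape change is simple geometry: a circular pupil seen off-axis projects, to a first approximation, as an ellipse whose minor axis shrinks with the cosine of the gaze angle, so the camera literally has less shape to work with. A tiny sketch of this foreshortening (occlusion by the eyelid, which Mr. Huang mentioned, makes things even harder and is not modeled here):

```python
import math

def pupil_axes(radius_px, gaze_angle_deg):
    """Axes of the ellipse that a circular pupil of the given radius
    (in image pixels) projects to when the eye is rotated away from
    the camera by gaze_angle_deg. First-order foreshortening model:
    the minor axis shrinks with the cosine of the angle."""
    minor = radius_px * math.cos(math.radians(gaze_angle_deg))
    return radius_px, minor   # (major, minor) axes

# Looking straight ahead vs. 60 degrees off-axis (made-up numbers):
# at 60 degrees the minor axis is already halved
```

This is why ellipse-fitting trackers get noisier toward the edges of the eye's range, and why training on a large database of real off-axis eye images, as 7Invensun is doing, is a sensible way to compensate.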
In his opinion, eye tracking in VR will still need 2-3 years to become widespread and to be used properly by all VR experiences and devices. So the technology is on the right track, but it is not ready yet. Regarding the future of VR in general, he said that virtual reality needs lighter headsets with higher resolution and improved comfort to become successful. It also needs a better UX, something that eye tracking can help with.
Regarding the future, I asked about standalone headsets. I love the Vive Focus, and I would love to see eye tracking on it. I thought he would answer that he couldn't tell me anything, but he actually showed me everything in preview. I've seen it: there is an aGlass device for the Vive Focus too. It is not ready yet, and the Focus it was installed on had clearly been hacked to fit it in. But there it was. I can't show you the pictures, but it was basically a Vive Focus with two eye tracking inserts around the lenses, with the USB-C cable of the eye tracking add-on going into the USB-C port of the Focus. Some screws around the power button showed that the device had been modified to work with eye tracking.
I wasn't able to try it, but they let me use the eye tracking module installed on a similar Qualcomm Snapdragon 835 VR dev-kit headset. It was there and it worked. Performance was a little worse than on PC, in the sense that the loss of tracking precision when looking toward the edges of my vision was more noticeable. But I was amazed nonetheless: eye tracking on standalone can be really disruptive. Besides all the reasons above, think about the performance gain that foveated rendering can give on standalone devices: these limited-power headsets could run games with great graphical quality, improving the user experience a lot. My mind started thinking about Robo Recall on a mobile headset, then I remembered I was at a Vive X company and avoided citing Oculus 😀
After this amazing demo, I tried another one: a device called aGlass Holo that can add eye tracking to HoloLens. They made a custom frame that can be installed inside the HoloLens to have eye tracking even on the HoloLens v1! Eye tracking will probably be available only in the next-gen HoloLens (and it is already present in the Magic Leap One), so having a solution that can upgrade the 50,000 existing HoloLens units is interesting. This AR solution is customizable, so it can potentially be adapted to every AR headset on the market. They let me try it in exclusive preview, saying that it is still a work in progress: I was super happy, of course. I have to say that performance was worse than that of the other devices I tried (it's just a prototype, so that's normal), but when it worked, it was great. Wow, eye tracking in AR!
After all these awesome demos, I had to leave them. I was quite sad about leaving, but they gave me one last smile by giving me an aGlass DK II as a gift, to show how grateful they were for my visit in Beijing and to let me start experimenting with eye tracking! I really want to thank Mr. Huang, Kristina, Lee and all the other 7Invensun employees for the wonderful time I had there.
These days I'm actually experimenting with it, and I will keep you updated on this blog about what I'm doing with it. Of course, expect an unboxing and a review of this cool device too! After using it for several days, I have a better picture of its pros and cons, so stay tuned!
I think that the devices offered by this company are really interesting. 7Invensun is looking for companies and developers interested in collaborating with them and using their products. If you are interested in a dev-kit, feel free to email them saying that you got to know them through my website. Or if you need to talk about partnerships, feel free to ask me for an introduction; I will be happy to help.
After my previous post, they even shipped some free aGlass dev-kits to people wanting to do cool experiments with their products, so if you have an HTC Vive or HTC Vive Pro and a great idea in mind that exploits eye tracking, shoot them an e-mail 🙂 (Just to clarify: they don't give free kits to everyone, but if you have an awesome project that gets approved by them, they can send you a free kit.) I'm sure that eye tracking is the future of VR, so being able to experiment with it starting now is very important for us VR developers.
And for me, it is very important that you share this post on your social media channels and that you subscribe to my newsletter… so would you mind doing these two little actions? Thanks 😉
(Header image by 7Invensun)