7Invensun aGlass DK II unboxing, setup and review: a great dev kit for eye tracking in VR

In China I met 7Invensun, a company that is a worldwide leader in eye tracking. They were very kind and gave me an aGlass DK II eye tracking add-on for HTC Vive as a gift, and I have used it to experiment with eye tracking interfaces. After some days of usage, I think I can review this interesting device, so you can discover its pros and cons. Keep reading to learn everything about the aGlass DK II!

Unboxing

The device comes in a black box with some Chinese writing on it and a big “aGlass” caption on top, with a G whose shape reminds a lot of the shape of the eye tracking modules. Opening it, you’ll find everything you need to give your HTC Vive headset eye-tracking capabilities: two eye-tracking modules, some corrective lenses to use the device if you have vision problems (so that you can avoid wearing glasses), a cloth to clean the lenses, the hub that connects the eye tracking modules to the USB port on the Vive, and printed instructions. There is not much in the box, because you don’t need a lot of stuff to add eye tracking features to your VR headset.

The aGlass DK II box on top of the HTC Vive one

I always comment on the beauty of packaging, and this time I won’t make an exception: the items are packaged very well, and all the foam keeps them safe, but I can’t say that it is a delight for the eyes. This is still a dev kit, so that’s OK. But for the consumer version, I hope they will invest a bit more in making the arrangement inside the box more fascinating.

The device

As 7Invensun explained to me, each eye tracking module is composed of a ring of IR emitters that illuminate the eye (in the dark interior of the headset) and an IR camera that sees the eye as lit by the emitters. The camera shoots images at high frequency and sends them to the PC through the USB port on the front panel of the HTC Vive. On the PC, a service processes all these images and returns the position of both eyes, so that you can use it in VR.

The eye tracking module for the left eye installed inside my HTC Vive: all those tiny white points that you see are the IR emitters, while the black circle in the lower part of the image is the IR camera

If you want some specifications, here they are:

  • Accuracy: < 0.5°
  • Tracking frequency: 100 Hz / 120 Hz
  • Latency: < 5 ms
  • Communication port: USB 2.0 / 3.0
  • FOV: > 110° (adapted to the HTC Vive)

When photographed properly, you can see the IR emitters actually casting light

Setup

Inside the box, you can find instructions on how to assemble the device, in both English and Chinese. The hardware setup is really easy. The only difficulty I had was inserting the two modules into the internal space of the Vive, which is very tight, but once I understood that the solution is to bring the two lenses as close as possible using the IPD knob, everything went very smoothly.

The setup basically just consists in attaching the two eye-tracking modules to the lenses and then connecting them to the USB port on the Vive. If you need an assembly tutorial video… well, here I am to help you!

After the hardware assembly, you just connect the Vive headset to your PC as always. Then you go to the 7Invensun website and download the aGlass DK II runtime from the downloads page. If you plan to develop, download the SDK v2 archive (which contains the runtime, too); otherwise, go for the runtime v3. I’ve provided you the direct links to make your life easier, aren’t I the best VR blogger ever? 😀 😀 😀 Unzip the archive, launch the setup file, and follow all the instructions to install the 7Invensun aGlass runtime.

At the end of the installation, the system will suggest that you start the calibration from the aGlass menu. As I learned when I interviewed 7Invensun for the first time, calibration is currently a necessary stage for all eye tracking devices. The system has to understand the peculiarities of your eyes, so before using it, you have to look at some predefined points, so that the system learns how you look up, right, down, left and so on. If you don’t calibrate, you can still use the system, but the accuracy won’t be optimal. Eye tracking companies are working on tricks to reduce calibration time or eliminate it entirely because it introduces friction, but at the moment you have to perform it.

My HTC Vive is now able to track my eyes! Wow!

Calibration is easy and it is divided into three steps:

  1. Adjust the IPD of your headset so that it matches your eyes. Since the eye tracking system can track your eyes, it can also detect your IPD. Unluckily, Vive devices don’t have an automated IPD adjustment system, so you have to change it by hand: follow the instructions on the screen and rotate the IPD knob until the system tells you that it is OK;
  2. In phase two, you have to move your headset so as to physically center your eyes within the lenses. The system shows you how the headset should be moved, so that you can adjust it with your hands. I have to say that this is the trickiest phase, because I never really understood how I should move the headset to fit my eyes, and when the system says that it is OK, the headset is no longer resting on my face, but is a bit distant. The company is working on fixing this little issue… and anyway, by pressing the SPACE key on the keyboard, it is already possible to skip this part;
  3. Look at the points that the system will show you on the screen. The system will show you a short sequence of little points on the HMD screen, and you have to look at them until they vanish. These points are strategically positioned so that the system learns how you look in the various directions (up, down, left, right); the sketch after this list shows the basic idea.
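
If you are curious about what happens under the hood during step 3, here is a toy example. I don’t know 7Invensun’s actual algorithm (it is surely far more sophisticated, and it is not public), but the general principle of point-based calibration is fitting a per-user correction that maps the raw measured eye data to the true gaze position. This hypothetical C# sketch does it for a single axis, with a simple linear model fitted by least squares:

    //toy calibration: NOT 7Invensun's real algorithm, just the general idea.
    //for each axis, fit gaze = a * raw + b by ordinary least squares over the
    //samples recorded while the user stared at the predefined points
    public static class ToyCalibration
    {
        //raw[i] = measured eye coordinate when the user looked at calibration target[i]
        public static void FitAxis(float[] raw, float[] target, out float a, out float b)
        {
            int n = raw.Length;
            float meanRaw = 0f, meanTarget = 0f;
            for (int i = 0; i < n; i++) { meanRaw += raw[i]; meanTarget += target[i]; }
            meanRaw /= n;
            meanTarget /= n;

            //classic least-squares slope (covariance / variance) and intercept
            float covariance = 0f, variance = 0f;
            for (int i = 0; i < n; i++)
            {
                covariance += (raw[i] - meanRaw) * (target[i] - meanTarget);
                variance += (raw[i] - meanRaw) * (raw[i] - meanRaw);
            }
            a = covariance / variance;
            b = meanTarget - a * meanRaw;
        }

        //at runtime, every new raw sample gets corrected with the fitted model
        public static float Correct(float raw, float a, float b)
        {
            return a * raw + b;
        }
    }

This also makes it clear why skipping the calibration degrades the accuracy: without your personal samples, the runtime can only apply a generic, average mapping.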

Apart from step 2, which has some little issues, the calibration is really easy and fast to perform: in 30-60 seconds you’re done with it. At the end, a final screen lets you verify whether the calibration has been successful: there are some points and, when you look at them, they become highlighted. If at this stage you see that the precision is bad, you can re-trigger the calibration. Otherwise, press the Menu button on your Vive and exit. You have successfully set up your eye-tracking device!

After that, you will see the aGlass tray icon in your taskbar. Right-clicking on it presents you with various options. One that I want to highlight is the possibility to re-trigger the calibration or set the current user. The system allows you to calibrate the device for different users on the same PC and then, just by using this options menu, load the calibration parameters of each user, so the device doesn’t have to be calibrated again even if different people use it. Cool, isn’t it?

The menu that you can access through the taskbar: as you can see, it offers various facilities regarding calibration
Comfort

I was afraid that the eye-tracking inserts could make the Vive less comfortable, but this is mostly false. You just feel a little discomfort around the nose and in the zone between the nose and the eyes, because in that area there is now the hard plastic of the eye tracker and not only the rubber of the headset. I have to say that this is not a big issue at all, and I’ve gotten used to it. Of course, in future iterations of the device, this should be improved.

I’m an eye tracking pirate!

The fact that the aGlass DK II features additional lenses for vision problems (presbyopia, myopia) that you can put on the modules, so that you can use eye-tracked Virtual Reality without wearing glasses, is a nice choice by the company that improves the comfort of using the device.

Runtime performance

If you want to test the device, you can get some demos from the aGlass downloads page. Otherwise, you can code something using Unity and do some tests yourself.

I’d define the tracking as pretty accurate but, as I noticed during my visit to the 7Invensun HQ in Beijing, it is still not perfect. Especially when you look far up, down, left or right, the tracking accuracy degrades a bit. When you stay mostly in the central region of your vision, instead, the tracking is very accurate. This can sometimes be a bit frustrating, because when a VR experience uses the eyes to select objects, it may happen that 5-10% of the time you look at an object and it doesn’t get selected, because the eye cursor is in a slightly different position than the one you would expect, and so the selection mechanism doesn’t work. Regarding speed, instead, there is no problem at all, and the system is very reactive to eye movements.
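
A possible workaround on the developer side (my own idea, not an official SDK feature) is making the selection mechanism more forgiving: instead of requiring the gaze ray to exactly hit an object’s collider, you select the object whose direction is closest to the gaze ray, within a small tolerance cone. A minimal Unity sketch, assuming you already have the gaze ray computed with the SDK:

    using UnityEngine;

    public class ForgivingGazeSelector : MonoBehaviour
    {
        public Transform[] selectables;      //the objects that can be selected with the eyes
        public float toleranceDegrees = 3f;  //much wider than the nominal <0.5° accuracy

        //returns the best candidate under the gaze, or null if nothing is close enough
        public Transform PickTarget(Ray gazeRay)
        {
            Transform best = null;
            float bestAngle = toleranceDegrees;

            foreach (Transform candidate in selectables)
            {
                //angle between where the eyes point and where the object actually is
                float angle = Vector3.Angle(gazeRay.direction,
                                            candidate.position - gazeRay.origin);
                if (angle < bestAngle)
                {
                    bestAngle = angle;
                    best = candidate;
                }
            }

            return best;
        }
    }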

I guess that you may be curious about a comparison with Tobii, but unluckily I have never tried Tobii’s tracking technology in VR, so I can’t tell.

I recall what Abrash said years ago: a mouse should work 100% of the time, not 90% of the time, otherwise users get frustrated. I have to say that with eye tracking we are still in the latter situation… and that’s why eye tracking is still not installed on all headsets. But I think that it can already be used to do amazing research on where the user is looking and to experiment with eye-tracked UX, as we are doing right now. So, it is dev-ready but not consumer-ready.

Unity SDK

The aGlass SDK contains plugins for Unity and Unreal. Using the Unity SDK is rather easy: you just have to check if the runtime can track the eyes, ask for the eye position, and then use this position to do something. The SDK also contains a little sample to help you understand how to use it. I learned how to use it pretty fast. To get the eye ray, for instance, you just need a few lines of code like these (here wrapped in a minimal MonoBehaviour so that they compile as they are):


    using UnityEngine;

    public class GazeRayExample : MonoBehaviour
    {
        bool IsEyeTrackingValid;
        Vector2 Gaze2DPoint;
        Ray GazeRay;

        void Update()
        {
            //check if eye tracking is valid at this frame
            IsEyeTrackingValid = aGlass.Instance.GetEyeValid();

            //if it is
            if (IsEyeTrackingValid)
            {
                //get the current gazed 2D point
                Gaze2DPoint = aGlass.Instance.GetGazePoint().ToVector2();

                //transform it into a ray (flipping Y to match Unity's viewport convention)
                GazeRay = Camera.main.ScreenPointToRay(
                    Camera.main.ViewportToScreenPoint(new Vector3(Gaze2DPoint.x, 1 - Gaze2DPoint.y, 0)));

                //do something with the ray of the eyes
            }
        }
    }

In my opinion, the problem with the SDK is not what is there (which works, and works well), but what is not there. There are no high-level facilities (e.g. to select objects just by using your eyes — see the sketch below), there are not many samples, there are no additional services. At the moment, it is just a barebones SDK that gives you the minimum necessary features. I think it could be improved to offer more features and make the lives of us developers easier. Again, maybe this will come with time, as the device approaches the consumer version.
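
To give you an idea of what such a facility could look like, here is a minimal dwell-selection helper, i.e. a component that confirms an object when the user keeps looking at it for a certain time. This is a hypothetical sketch of mine, not 7Invensun code: you feed it every frame with whatever object the gaze ray currently hits (or null):

    using System;
    using UnityEngine;

    public class DwellSelector : MonoBehaviour
    {
        public float dwellSeconds = 1.0f;        //how long the user must stare to confirm

        public event Action<Transform> Selected; //raised when the dwell time is reached

        Transform current;                       //object currently being stared at
        float timer;                             //time spent staring at it

        //call this every frame with the object currently under the gaze (or null)
        public void Tick(Transform gazedObject)
        {
            if (gazedObject != current)
            {
                //the gaze moved to another object (or to nothing): restart the timer
                current = gazedObject;
                timer = 0f;
                return;
            }

            if (current == null)
                return;

            timer += Time.deltaTime;
            if (timer >= dwellSeconds)
            {
                Selected?.Invoke(current);
                timer = 0f;  //reset so that the event doesn't fire again every frame
            }
        }
    }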

A little analytics prototype that I made with the aGlass DK II inside Unity. It was fun, and it took me very little time to code it

While developing, I also noticed that sometimes the runtime crashes and has to be restarted, or that Windows shows some weird pop-ups about a disk it can’t access. Nothing terrible, but something you must be aware of. The company told me that some of these bugs have already been fixed in version 3 of the runtime.
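
Given these occasional hiccups, it is wise to code defensively. A pattern that I find reasonable (a sketch of mine, not an official recommendation from the company) is degrading gracefully to head gaze whenever the runtime stops returning valid eye data, so that the experience keeps working:

    using UnityEngine;

    public static class GazeWithFallback
    {
        //returns the eye gaze ray if the tracker delivers valid data,
        //otherwise falls back to the forward direction of the head
        public static Ray GetGazeRay()
        {
            if (aGlass.Instance.GetEyeValid())
            {
                Vector2 gaze = aGlass.Instance.GetGazePoint().ToVector2();
                return Camera.main.ScreenPointToRay(
                    Camera.main.ViewportToScreenPoint(new Vector3(gaze.x, 1 - gaze.y, 0)));
            }

            //fallback: assume the user is looking where the head points
            return new Ray(Camera.main.transform.position, Camera.main.transform.forward);
        }
    }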

Final judgment

I love the aGlass DK II. I’m doing a lot of experiments with it, and I’m now more convinced than ever that eye tracking will be disruptive for VR. Personally, I’m using it to experiment with eye tracking UX together with my partner Max at New Technology Walkers, and we are doing a lot of interesting things with it, like the one you can see in the video. In a future post, I’ll detail all these experiments for you, so stay tuned to learn more about eye-tracked VR!

But if you get an aGlass, you have to understand what it is: those two letters in the name, D and K, are there for a reason. This is a dev kit and not a finished product. Being a dev kit, it can show some little accuracy issues, it can crash occasionally, and it can offer only basic features in the runtime and in the SDK. As a developer, I have already experienced all of this with other products: the Oculus Rift DK2 headset itself made Unity crash constantly on my laptop and sometimes froze it completely.

I think that if you are a developer, or a customer wanting to use it for an experimental product, you will like it. It is worth investing in, because eye tracking will be disruptive for virtual reality (analytics, foveated rendering, new UX, support for disabled people, etc.), and that’s why I’m hyped about it. If instead you want something that works flawlessly all the time, you had better wait a while before entering the eye tracking realm.

Regarding the price, it is still undisclosed, and at the moment it may also depend on the use you want to make of it (whether you are a dev, an academic institution, etc.). So, if you want to buy it, you have to contact 7Invensun directly. They are kind people, so don’t be afraid of sending them an e-mail 🙂


I hope I have satisfied all your curiosity about this device, but if you have further questions, please ask me here in the comments section or through my social media channels. And please also subscribe to my newsletter! 🙂
