My last demo with Mojo Vision AR contact lenses


This year has begun with a piece of very sad technical news: Mojo Vision, the startup that was working on smart contact lenses, has decided to pivot to manufacturing microLED displays. The reason is that they haven’t been able to find the funding necessary to keep developing such a futuristic project, and had to resort to working on a technology that has an immediate market. Because of this pivot, 75% of the staff has been laid off.

It’s sad because Mojo obtained incredible results, and just a few months ago, they managed to put a smart contact lens in the eye of the CEO.

This is one of the most cyberpunk things I saw in 2022

They were making science fiction real, and I was totally in love with their work. For this reason, I want to celebrate the great results they achieved by talking about the last demo I had with the device at AWE US last year, a hands-on I haven’t described yet on this blog. I’m doing this as a way to remind the team, especially the people who are not at Mojo anymore, that they managed to accomplish something incredible and should be proud of it. Things have not gone as hoped, but what they obtained was still remarkable.

Mojo at AWE US

Me trying the lens at AWE US 2022

I met Mojo Vision at AWE US 2022, and my friend Eloi Gerard and Mister President Alvin Graylin visited their room together with me. I was the only one with an appointment, but as an Italian, I’m not good at respecting rules, so I brought some friends with me. The people at Mojo were happy to see me, and we chatted a bit about what they were working on.

They had two demos to show: one was with a VR headset to showcase the UI available through the lens, and one was with the real lens mounted on a stick. And this time, for the first time ever, they let us take pictures of it!

Mojo Vision lens hands-and-eyes-on

Expanded view of Mojo Vision lenses (Image by Mojo Vision)

The company brought the latest edition of the lens, the one that was “feature complete”, integrating inside it:

  • A microdisplay to let the user see the augmentations;
  • An image sensor for seeing the surroundings and processing them through computer vision (e.g. edge detection);
  • Eye-tracking sensors (accelerometer, gyroscope, magnetometer; see the little fusion sketch after this list);
  • A battery system;
  • A 5GHz radio antenna to make the lens communicate with an external unit;
  • An ARM M0 processor that acts as a “traffic cop” for the data.
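
To give an idea of how inertial sensors like these can drive eye tracking, here is a minimal sketch of a complementary filter that fuses gyroscope and magnetometer readings into a single yaw estimate. This is purely my own generic illustration, with made-up numbers, and not Mojo’s actual algorithm:

```python
# Illustrative only: a generic complementary filter, not Mojo's actual eye-tracking code.
# The gyroscope gives fast but drifting rotation rates; the magnetometer gives a
# slower but drift-free absolute reference. Blending the two gives a stable estimate.

def fuse_yaw(prev_yaw_deg, gyro_rate_dps, mag_yaw_deg, dt_s, alpha=0.98):
    """Blend the integrated gyroscope rate with the magnetometer's absolute yaw."""
    integrated = prev_yaw_deg + gyro_rate_dps * dt_s         # fast, but drifts over time
    return alpha * integrated + (1.0 - alpha) * mag_yaw_deg  # anchored by the magnetometer

# One update step: previous estimate 10 degrees, gyro says +30 deg/s, magnetometer reads 10.4 degrees
yaw = fuse_yaw(prev_yaw_deg=10.0, gyro_rate_dps=30.0, mag_yaw_deg=10.4, dt_s=0.01)
print(round(yaw, 2))  # ~10.3
```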

Holding it in my hand, I was really like “How the hell can this small piece of plastic contain all this electronics?”, and I stopped for a while to look at it, in awe (pun intended).

The Mojo Vision lens, in my hand. Finally, I was able to take pictures of it…

The company had prepared another lens, with a stick attached to it, to let us have a demo. Of course, we couldn’t put the lens in our eyes; we could just hold the stick and keep the lens very close to the eye to simulate its usage. It was not the same, because eyes continuously move fast, but it was good enough to get a taste of the real experience.

This way, I could appreciate the demo applications that were meant to run on the lens: for instance, a sample circular UI, and some sample applications like the teleprompter and the compass.

My friend Eloi Gerard trying the lens on the stick. I swear this is not the face he has all the time.

I started by putting the lens close to my eye, and what appeared at first sight like a green dot turned out to be a full display. Yikes. The display was monochrome green, and on it I could read some lines of text. What really surprised me is that the text was readable: it was not very tiny, but not enormous either, and the fact that it was legible was cool. It was not perfect: since it was green, it reminded me of the old displays of terminals or Commodore clones. I found that the content was trembling a bit, and the color seemed to slightly leak from one pixel to the surrounding ones. But these may have been artifacts due to the fact that the lens was held by my hand, which of course is not perfectly still.

This is how I was carrying out the demo. So close to my eye, but not inside it…

The same caveat held for other issues I found: the FOV was a little smaller than I expected, and there was a slight barrel distortion. The Mojo people told me that those issues were surely caused by the fact that the lens was not on my eye, since its optical properties were tuned for on-eye usage. The last limitation was the number of pixels: the display is made of 14,000 pixels, which means a resolution of around 120×120 pixels, usually not enough to show a single image in high definition. But considering the size of the screen, it was already impressive to be able to see images and read text at all.
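
Just as a quick back-of-the-envelope check of that claim (my own arithmetic, not an official figure):

```python
# ~14,000 total pixels arranged in a roughly square array is about 118 x 118,
# which matches the "around 120x120 pixels" figure mentioned above.
total_pixels = 14_000
side = total_pixels ** 0.5
print(round(side))  # 118
```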

An image visualized by the Mojo Vision monochromatic display (Image by Mojo Vision)

I tried a few demos. In one, I could see a circular interface that let me select elements of a menu: around a circle there were some buttons, and by staring at those buttons, I could trigger some popups. Another demo showed a mockup of an interface for cyclists: in front of me, I could see a small popup with info about the route I was following and data about my physical parameters. Notice that these interfaces were not “attached to the eye”: they had a fixed position in front of me, and by moving my eye I could uncover different parts of them. Imagine something like a popup inside the Oculus Quest, just green and with more “essential” graphics: it was there in front of me, and I could only see the part of it that fell onto my fovea.
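
By the way, that “stare at a button to trigger it” mechanic is usually implemented as dwell-time selection: the button activates only when the gaze stays on it long enough. Here is a minimal sketch of the idea, with made-up names and values, certainly not Mojo’s actual code:

```python
# Generic dwell-time gaze selection, purely illustrative (not Mojo's code).
# A button triggers only when the gaze point stays inside it for `dwell_s` seconds.
from dataclasses import dataclass

@dataclass
class Button:
    name: str
    x: float
    y: float
    radius: float
    gaze_time: float = 0.0   # how long the gaze has been resting on this button

def update_buttons(buttons, gaze_x, gaze_y, dt, dwell_s=0.8):
    triggered = []
    for b in buttons:
        inside = (gaze_x - b.x) ** 2 + (gaze_y - b.y) ** 2 <= b.radius ** 2
        b.gaze_time = b.gaze_time + dt if inside else 0.0   # reset when the gaze leaves
        if b.gaze_time >= dwell_s:
            triggered.append(b.name)   # e.g. open the popup associated with this button
            b.gaze_time = 0.0
    return triggered

# Example: the gaze rests on a hypothetical "route_info" button for one second
buttons = [Button("route_info", x=0.0, y=1.0, radius=0.2)]
fired = []
for _ in range(10):   # ten frames of 0.1 s each
    fired += update_buttons(buttons, gaze_x=0.05, gaze_y=1.0, dt=0.1)
print(fired)   # ['route_info'] after the 0.8 s dwell threshold is passed
```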

https://gfycat.com/limpickyeuropeanfiresalamander
Sample video showing the interface for cyclists on Mojo Vision lenses

I also tried the teleprompter: I could see a window with some text, and I started reading it out loud. When, by moving my eyes (or rather, by simulating that with my head, since the lens was on a stick), I arrived at the last line, the system automatically started scrolling the text box, unveiling new lines of text. This was similar to a demo of the HoloLens 2… just made with a tiny contact lens :O.
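
The logic behind that auto-scrolling is simple enough to sketch in a few lines: when the gaze reaches the last visible line, the window of visible text shifts down by one. This is just my own illustration of the behavior, not the actual demo code:

```python
# Illustrative teleprompter logic: scroll when the reader's gaze reaches the
# last visible line of the text window (not the actual Mojo demo code).

def teleprompter_step(lines, first_visible, window_size, gazed_line):
    """Return the index of the first visible line after this gaze sample."""
    last_visible = first_visible + window_size - 1
    if gazed_line >= last_visible and last_visible < len(lines) - 1:
        return first_visible + 1   # reveal one new line at the bottom
    return first_visible

# Example: a 4-line window over a 10-line script; the reader's gaze hits line 3
script = [f"line {i}" for i in range(10)]
print(teleprompter_step(script, first_visible=0, window_size=4, gazed_line=3))  # 1 -> scrolled
```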

The last demo was my favorite: the compass. In front of my eye, I could see what direction I was looking in: in the beginning I saw a W, then, as I rotated, an N appeared, and so on. I loved it because it showed that the lens was not only acting as a display, but could also integrate sensors that interacted with the real world.
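
Conceptually, that demo just maps the heading coming from the magnetometer to a cardinal letter. A tiny sketch of that mapping (my own illustration, assuming a heading expressed in degrees clockwise from North):

```python
# Map a heading (degrees clockwise from North) to the cardinal letter to display.
# Purely illustrative of what the compass demo shows, not Mojo's actual code.

def heading_to_cardinal(heading_deg):
    labels = ["N", "E", "S", "W"]
    index = int(((heading_deg % 360) + 45) // 90) % 4   # 90-degree sectors centered on N, E, S, W
    return labels[index]

print(heading_to_cardinal(270))  # W
print(heading_to_cardinal(10))   # N
```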

The lens on the stick

The VR Demo

The people at Mojo invited me to also do the VR demo, because it showed how the lens works when it is in real use inside the eye. As I’ve told you, I noticed some problems with FOV and distortion when trying the lens, which were due to the lens not being in my eye. Since they couldn’t make me wear the lens, they showed me in VR how the output of the lens would look when worn. They used a Vive Pro Eye to simulate the view you would have while wearing the contact lens: the Vive Pro Eye features eye tracking, so it can put the content directly in front of your eyes, wherever you look, exactly like a lens would do.
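
The trick of such a simulation is simple to describe: every frame, the gaze direction read from the headset’s eye tracker is used to re-center the virtual “lens content” on the fovea, while the rest of the virtual world stays where it is. A rough sketch of that per-frame placement (a generic illustration with assumed names, not the actual Vive Pro Eye demo code):

```python
# Per-frame placement of a gaze-locked "lens display" quad (generic illustration).
# The quad is re-centered along the current gaze direction so it always sits on
# the fovea, mimicking a display that moves with the eye.

def place_lens_content(eye_pos, gaze_dir, distance=1.0):
    """Return the world-space position where the lens content should be drawn."""
    x, y, z = gaze_dir
    norm = (x * x + y * y + z * z) ** 0.5 or 1.0   # avoid division by zero
    return tuple(p + d / norm * distance for p, d in zip(eye_pos, gaze_dir))

# Example: the user looks slightly up and to the right
print(place_lens_content(eye_pos=(0.0, 1.6, 0.0), gaze_dir=(0.2, 0.1, 1.0)))
```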

Concept of the Mojo Vision demo interface (this is not what I was shown, but it is somewhat similar): what you see here resembles the popups that I could trigger by staring at the icons in the periphery of my vision (Image by Mojo Vision, from Fast Company’s article)

The VR demo showed a circular UI similar to the one I tried in AR: I could point at icons with my gaze and activate some popups. I could indeed verify that the expected FOV was much bigger than the one I experienced with the lens on the stick. Furthermore, the VR demo made me notice a few things I had missed with the lens on the stick.

The problem with demoing a contact lens on a stick is that our eyes are very complex organs: they move fast and they move a lot, so forcing them to look straight forward through a stick all the time is not natural at all. In VR, I could move my eyes around and let them explore the environment through their typical saccadic movements. The result was that the “perceived FOV” was bigger than the actual one: my eyes moved fast while looking at the interface, so within very short instants they were focusing on different parts of the UI, and it was as if my brain could reconstruct the structure of the whole interface from all these quick glimpses, having much more context than I had while looking only straight in one direction. This was a great thing.

https://gfycat.com/dismalalertatlasmoth
Another sample of what Mojo Vision lenses allowed me to see. You can see in this video that even if in every frame you just see a different section of the interface, your brain is able to keep a “map” of it and reconstruct the whole UI

On the cons side, the movements of the eyes may lead to the so-called “Midas touch” problem. If an interface is purely gaze-based, the risk of triggering unwanted things with the eyes is very high. Eyes move fast in every direction and sometimes fixate on an object to analyze it better. So the risk with Mojo was that you could stare at some real object for a few seconds and inadvertently trigger some UI buttons. The Mojo Vision people told me they were working on it.

A new path

This image shows you how tiny the display used by the AR contact lenses is. This is where Mojo will start again from (Image by Mojo Vision)

Everything I tried was mindblowing, and again, all the people working at Mojo Vision at that time should be proud of what they accomplished. Now the company is aiming to enter the microdisplay market, putting on hold the project of developing smart contact lenses. The tiny display of the lens was truly impressive, and it is a little gem that Mojo will try to exploit to keep itself alive.

We are talking about a MicroLED display featuring 14,000 pixels per inch. Measuring less than 0.5mm in diameter, with a pixel pitch of 1.8 microns, it is the world’s smallest and densest display ever created for dynamic content (according to the company). It is a technical wonder on its own, and I’m sure that Mojo will find other companies interested in this technology, maybe as customers or partners. I wish the best of luck to Mojo for its new course, and also to the laid-off employees, hoping that they can find a new job soon.
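
And the two figures quoted by the company are consistent with each other, by the way. A quick check of my own:

```python
# Quick consistency check between the quoted pixel pitch and pixel density.
pixel_pitch_um = 1.8   # microns per pixel
um_per_inch = 25_400
print(round(um_per_inch / pixel_pitch_um))  # ~14,111 pixels per inch, i.e. the ~14,000 ppi figure
```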
