
New Oculus Avatar system hands-on

Some weeks ago, Oculus announced that it had finally released the new avateering system that was unveiled at Oculus Connect 5. Yesterday I finally got to try it, and I want to give you some impressions of mine about it… also because I have found no hands-on articles in the other VR magazines…

The purpose of the new avateering system is to offer more realistic avatars. Facebook is performing various experiments on avateering, following several parallel roads. While it releases incremental updates for the Oculus Avatar platform, which has to work with currently available hardware, it is at the same time working on hyper-realistic avatars in a dedicated FRL (Facebook Reality Labs) center. These hyper-realistic avatars are really mind-blowing, but to work they need expensive and cumbersome hardware that no one has at home, so they are more something that Facebook is developing for the long term, maybe 5 to 10 years from now.

So, these new Oculus Avatars don't have the ambition of resembling real people, but just of being better than the previous iteration. In the interesting technical article released together with the new system, Facebook explains that creating an overly realistic avatar would not only trigger the uncanny valley effect, but would also create the false expectation that it behaves like a real person. This is something I've also read in various articles in the literature: the more something looks realistic, the more the brain expects it to behave like in real life, and it becomes disappointed if this doesn't happen. So, while the brain has no expectations of a cartoonish avatar like the ones in Facebook Spaces, it expects high realism from an avatar that looks like a real person… and if this expectation isn't met, you will feel that the avatar is creepy.

Facebook has tried to introduce innovative new features for eye and lip tracking, but they are very experimental, so they don't work perfectly. For this reason, building a lifelike avatar around them would have been the road to a creepy result. Facebook had to find a compromise between the will to make avatars more realistic and the necessity of making them less realistic to avoid the creepiness factor.

The result is avatars that are more pleasant than the previous ones, that look more real, but that are still far away from photorealism. They look a bit like characters from a good animation movie.

This, together with the addition of new customization options (new face shapes, hair, facial hair, and clothes), makes it possible to create avatars that look more like you. And I have to say that I am satisfied with this. I have created a new avatar of myself, and I like it a lot… I can't say that it looks exactly like me, but it is a good step in that direction. The model feels less fake than the one of the previous iteration; it is really better. The only part that felt worse to me are the hands, which seem to be made of rubber… they feel a bit less realistic.

And I have been able to customize it a bit better… there were really a lot of hair options this time, for instance, and that's great. The greatest addition that created a better connection with my avatar is the fact that now my virtual self finally lets me see his eyes (more on this later on), and so he feels more like “a person”.

I will feel “more myself” inside VR social experiences, and this is important. Unless you want to impersonate someone else in VR (which is great as well), the possibility of truly identifying with your avatar is very important, in my opinion… and Oculus has improved this sense of embodiment a lot in this update.

new oculus avatar review
Look, it’s me! Now I look like a super elegant tenor… like the new Pavarotti! (After taking this screenshot, I also thought that my avatar’s facial expression looks like it was taken from a porn movie as well… a tenor porn movie)

But what is super important to know is that this new update adds eye and lip tracking for your avatar. And what is super impressive is that this is performed without any tracking hardware on my Rift.

Thanks to its enormous AI and ML capabilities, Facebook has been able to create a technology that emulates your eye and lip movements using only the data it gets from the headset. How is that possible?

Well, for the lips, it takes the data from the microphone and tries to create facial expressions based on the sounds you make with your voice. So, for instance, if you are saying “OOOOOOOOOOOOO”, the system understands that it has to move the lips of your avatar to give its mouth a rounded shape. That’s quite natural.
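To give you an intuition of the idea (Facebook hasn’t published its exact algorithm, and all the names and numbers below are hypothetical), audio-driven lip animation boils down to mapping a detected sound class to a mouth shape, often called a “viseme”. A toy sketch:

```python
# Toy audio-to-viseme mapping. The real Oculus system is far more
# sophisticated (it works on raw audio features), but the core idea
# is "sound class in, mouth blend-shape weights out".
# All names and weights here are illustrative assumptions.

# Blend-shape weights (0..1) for a few canonical mouth poses
VISEMES = {
    "aa":  {"jaw_open": 0.8, "lip_round": 0.1, "lip_stretch": 0.3},  # "ahh"
    "oo":  {"jaw_open": 0.4, "lip_round": 0.9, "lip_stretch": 0.0},  # "oooo"
    "ee":  {"jaw_open": 0.2, "lip_round": 0.0, "lip_stretch": 0.8},  # "eee"
    "sil": {"jaw_open": 0.0, "lip_round": 0.0, "lip_stretch": 0.0},  # silence
}

def mouth_pose(phoneme: str, loudness: float) -> dict:
    """Return blend-shape weights for the avatar's mouth.

    loudness (0..1) scales how strongly the mouth articulates;
    unknown sounds fall back to the neutral/silent pose."""
    base = VISEMES.get(phoneme, VISEMES["sil"])
    return {shape: round(w * loudness, 2) for shape, w in base.items()}

print(mouth_pose("oo", 1.0))  # rounded lips at full loudness
print(mouth_pose("aa", 0.5))  # half-strength "ah"
```

In a real pipeline these weights would be smoothed over time and blended per frame, but the mapping itself is the natural part the article describes.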

What sounded like black magic to me is the emulation of eye movements, which is impossible to do exactly without an eye tracking device (like the one by 7Invensun). The eyes move in a very unpredictable way: at any moment, you can choose to look in whatever direction (for instance, if you hear some sound coming from your side), and then there are the saccadic movements, which can make the eyes dart in any direction very fast, multiple times per second. So, how can this be emulated?

Well, recreating eye movements perfectly is indeed impossible. Hoping that the Rift 2 (the real Rift 2, not that Lenovo-made headset dubbed Rift S) will have eye tracking, this problem will be solved in a few years. But for now, Oculus is satisfied with offering eye movements that are not that bad. From the studies of Facebook Reality Labs, we know that people usually look within a cone of 30° in front of them, but while in VR, because of the reduced FOV, they only look within a cone of 10° in front of them. Furthermore, you often have a hint of what the user is looking at: if in a social VR application the user is talking with another person, most probably their eyes will be fixated on him/her. So, eye movements are somewhat predictable.
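The heuristics above can be sketched in a few lines of code. This is my own illustrative reconstruction, not Facebook’s implementation: pick a gaze target (the conversation partner if there is one, otherwise a small random wander), then clamp it inside the 10° cone around head-forward:

```python
import math

def clamp_gaze(yaw_deg: float, pitch_deg: float, cone_half_angle: float = 10.0):
    """Clamp a candidate gaze direction into a cone around the head's
    forward axis. Angles are offsets from head-forward, in degrees."""
    mag = math.hypot(yaw_deg, pitch_deg)
    if mag <= cone_half_angle:
        return yaw_deg, pitch_deg
    scale = cone_half_angle / mag
    return yaw_deg * scale, pitch_deg * scale

def pick_gaze(speaker_offset=None, wander=(0.0, 0.0)):
    """If a conversation partner is visible, fixate on them (pulled
    toward the cone); otherwise let the eyes wander inside the cone.
    Names and the fallback behavior are illustrative assumptions."""
    target = speaker_offset if speaker_offset is not None else wander
    return clamp_gaze(*target)

print(pick_gaze(speaker_offset=(30.0, 0.0)))  # pulled back to (10.0, 0.0)
print(pick_gaze(wander=(3.0, -2.0)))          # inside the cone: unchanged
```

A production system would also schedule saccades and smooth pursuit between targets, but the cone constraint is what makes “blind” gaze prediction plausible at all.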

(Image by Facebook)

To further improve the realism of the avatars, Oculus has also added eye blinking and facial micro-expressions: our faces make a lot of little movements, so an avatar that has not only some high-level animations, but also these subtle movements, feels more realistic.
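A minimal sketch of how such “idle life” can be added to an avatar, assuming nothing about Oculus’s actual code (the intervals and noise amplitudes below are my own guesses): trigger blinks at human-like random intervals and keep the brows drifting by tiny random amounts so the face is never perfectly frozen.

```python
import random

class BlinkScheduler:
    """Trigger blinks at random human-like intervals (roughly every
    2-6 seconds) and add tiny noise to a brow blend shape so the face
    is never perfectly still. All numbers are illustrative."""

    def __init__(self, rng=None):
        self.rng = rng or random.Random()
        self.next_blink = self.rng.uniform(2.0, 6.0)
        self.elapsed = 0.0

    def update(self, dt: float) -> dict:
        """Advance the timer by dt seconds; return this frame's signals."""
        self.elapsed += dt
        blink = False
        if self.elapsed >= self.next_blink:
            blink = True
            self.elapsed = 0.0
            self.next_blink = self.rng.uniform(2.0, 6.0)
        # Micro-expression: a small random offset on the brow shape
        brow = self.rng.uniform(-0.02, 0.02)
        return {"blink": blink, "brow_offset": brow}

sched = BlinkScheduler(random.Random(42))
for _ in range(100):          # simulate ~10 seconds at 10 updates/sec
    frame = sched.update(0.1)
    if frame["blink"]:
        print("blink!")
```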

Thanks to all of this, Facebook has been able to remove the glasses from all avatars and finally let you see your eyes in a mirror in VR. And I have to say that the results are pretty amazing. It was impressive to look at myself in the mirror in VR and actually see a human avatar that looked a bit like me, with eyes and lips moving.

Sometimes, this really looked incredibly realistic. The AI system was able to predict some of my lip movements perfectly while I was talking. But I was even more impressed when some of my eye movements were reconstructed perfectly. I tried to rotate my head while still looking at the mirror and, surprisingly, the avatar was doing the same! I mean, the system understood that I was not looking inside the 10° cone in front of me, but that I was rotating my head while keeping my eyes fixed on the mirror. I was in awe seeing this. Look at this slow-motion GIF:

https://gfycat.com/GlumFrailAsiaticgreaterfreshwaterclam
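This behavior resembles the vestibulo-ocular reflex: when the head rotates but the fixation target stays put, the eyes counter-rotate relative to the head. Here is a toy, one-axis version of that idea (my own sketch, with hypothetical names and an assumed eye-rotation limit):

```python
def stabilized_eye_yaw(head_yaw_deg: float, target_yaw_deg: float,
                       max_eye_yaw: float = 35.0) -> float:
    """Eye yaw (relative to the head) needed to keep looking at a
    world-space target while the head rotates, clamped to a plausible
    range of human eye motion. Angles in degrees; all values are
    illustrative, not from the Oculus SDK."""
    eye_yaw = target_yaw_deg - head_yaw_deg
    return max(-max_eye_yaw, min(max_eye_yaw, eye_yaw))

# Mirror at world yaw 0°, head turning away: the eyes counter-rotate.
print(stabilized_eye_yaw(head_yaw_deg=20.0, target_yaw_deg=0.0))  # -20.0
print(stabilized_eye_yaw(head_yaw_deg=50.0, target_yaw_deg=0.0))  # clamped to -35.0
```

Once the eyes would have to rotate past their limit, a real system would give up the fixation and snap the gaze back toward head-forward, which is probably the kind of transition the predictor has to handle.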

Furthermore, the facial micro-expressions really convey more realism. I can’t explain why, but my brain processes them in a subtle way and tells me that that avatar is more real.

But I also have to say that there is still a long road ahead. The emulation of the movements of the lips and the eyes is far from perfect, and there are times when this really breaks the magic. For instance, when I look at the mirror and see the eyes suddenly change direction while mine are fixated, or when I make an “O” sound with my voice and the mouth of the avatar doesn’t take a round shape. When this happens, I immediately lose my sense of presence, and that’s a pity.

Furthermore, while the eyes just look a bit robotic, there is something about the model of the teeth and the animation of the lips that feels completely unnatural to me. Contrary to what I imagined before trying the system, I found the lip emulation worse than the eye emulation; it felt less correct to me. You can see this in my video where I try the new avateering system.

My final opinion is that this new Oculus Avatar system is a great step towards having avatars that really look like ourselves, that can become our digital identity. The avatars are more realistic and customizable, and they have movements that start resembling those of a real human. But this emulation is far from perfect, and the road to truly being ourselves in VR is still long, probably some years long.

And while we wait for this to happen, why don’t you subscribe to my newsletter? 😉

(Header Image by Facebook)


Disclaimer: this blog contains advertisement and affiliate links to sustain itself. If you click on an affiliate link, I'll be very happy because I'll earn a small commission on your purchase. You can find my boring full disclosure here.
