
The best VR news from SIGGRAPH 2017

Hello everyone and happy Monday! Last week, Los Angeles hosted SIGGRAPH. What is SIGGRAPH? Well, copying and pasting from its website:

SIGGRAPH is the world’s largest, most influential annual conference and exhibition in computer graphics and interactive techniques: Five days of research results, demos, educational sessions, art, screenings, and hands-on interactivity featuring the community’s latest technical achievements, and three days of commercial exhibits displaying the industry’s current hardware, software, and services.

Long story short: it is a cool exhibition of cool technological stuff regarding computer graphics and computer vision. Of course, since VR is nowadays one of the major trends in computer graphics, it was an important theme at SIGGRAPH. In this article I want to give you a short summary of all the VR news from SIGGRAPH that you may have missed (it is fundamental to write its name all capsed, or the world may implode). It will be something like the Week Peek I send you every week through my newsletter (if you don’t know what I’m talking about… REGISTER IMMEDIATELY to my newsletter using the form on the right… or the world may implode again!), but all tailored around our favourite event with the caps-locked name.

There weren’t any breakthrough announcements, so I’ll just present the news in random order.

Real Baby – Real Family

This is an interesting project that comes from Japan. Basically, it is an experience that lets you try what it is like to have your own baby. The system scans your face and the face of your partner (if you have one… if you’re forever alone, you can provide a photo of the wife/husband you want to emulate… I’d go for Scarlett Johansson, for instance) and then reconstructs the possible face of the baby you may have together. This is possible thanks to deep learning and other kinds of AI magic: the system takes some feature points from the faces of the “parents”, makes them look younger, and then applies them to the model of a little baby. This way you can see what your child might look like.
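Just to make the concept clearer, here is a minimal Python sketch of how such a face blending could work. Be warned: this is pure speculation on my side, not Real Baby’s actual algorithm, and the landmark names and coordinates are invented for illustration.

```python
# Pure speculation: blend the facial landmarks of the two parents and use
# the result to deform a generic baby head model. Landmark names and
# coordinates below are invented for illustration.

def blend_landmarks(parent_a, parent_b, weight=0.5):
    """parent_a / parent_b: {landmark_name: (x, y)} dicts from a face detector."""
    return {
        name: tuple(weight * a + (1 - weight) * b
                    for a, b in zip(parent_a[name], parent_b[name]))
        for name in parent_a
    }

mom = {"eye_left": (0.30, 0.40), "eye_right": (0.70, 0.40), "mouth": (0.50, 0.75)}
dad = {"eye_left": (0.28, 0.42), "eye_right": (0.72, 0.42), "mouth": (0.50, 0.78)}

baby = blend_landmarks(mom, dad)
print(baby)  # a morphing step would then warp the baby mesh toward these points
```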

After that, you and your special one wear an HTC Vive headset and hold in your arms a baby doll that has a Vive controller stuck inside its head (pretty creepy: it seems like a gadget from a VR horror movie where the evil character kills people with Vive controllers and then plays with the corpses in VR!). This way you can feel your baby, perceive its weight and touch it thanks to the doll, and see it in VR thanks to the headset. Then you can start caring for your baby and learn how to be a good parent. The game is, in my opinion, a very interesting didactic experience that shows people what it is like to be parents. And from a technical standpoint it is pretty cool.

HP Z VR Backpacks

HP has entered the VR backpacks market with its HP Z VR series.

In the dedicated VR Focus article, you can read that this backpack’s specs are:

  • Intel Core i7
  • Maximum 32 GB RAM
  • NVIDIA Quadro P5200 with 16GB video memory
  • 4.65 kg of weight

The coolest feature of this PC is that HP provides a docking station, so you can also put your backpack steadily on your desk and use it as a desktop PC. I don’t know why this feature should be useful, since backpack PCs are usually targeted at arcades and not at final consumers, but that’s OK. Starting in September, you can spend $3,299 to buy one of them… and then let me know.

So, HP is entering the VR arcade world. This is cool, since it proves once more that this company is dedicated to VR and believes that VR isn’t a fad. But honestly, this is not the most interesting news ever, considering also the specs of the PC.

The HP Z VR PC can be used both in a backpack and in a docking station. This is the greatest innovation that HP has brought to the backpack PC world (Image by VR Focus)
NVIDIA HoloDeck and Isaac

As always, NVIDIA showcased various solutions to make graphical rendering faster and faster, better and better. But here I want to focus on two projects involving VR.

The first is HoloDeck: a collaborative environment that NVIDIA is developing to let people work together in VR, especially dedicated to 3D modeling, 3D world creation and the like. Details on HoloDeck are still not clear; all we know is that it should let different kinds of people (like 3D artists and developers) collaborate in VR. So, a social VR experience dedicated to productivity, a bit like Bigscreen VR. From NVIDIA’s website:

Holodeck is an ideal platform for game developers, content creators and designers wanting to collaborate, visualize and interact with large, complex photoreal models in a collaborative VR space.

Seems pretty cool to me, considering that it should support even models with 50 million polygons (yikes!). If you’re interested, the project should enter its beta stage in September 2017. Register here to get more info about it in the future.

The second project is Isaac, a funny robot that NVIDIA has trained to play dominoes with you. Thanks to computer vision and artificial intelligence magic, Isaac can play dominoes with you in the real world, and this is cool (I love robots, and this one also has a cute face!).

First they beat you at dominoes, then they’ll take control of Skynet (Image by NVIDIA)

But the coolest part is that you can put a headset on and play dominoes with him in VR, too! And all of this can happen inside the HoloDeck environment. Why is this so cool? Well…

  1. This means that HoloDeck can let real people and bots collaborate in the same environment;
  2. This means that a robot can work both in the real and in the virtual world;
  3. Even more… since for a robot it makes no difference what is real and what is not (he doesn’t ask himself if he’s living in a simulation like all my Quora followers do), he can work identically in real and virtual environments. If you teach him a rule in the real world, he can follow it in the real world but also in the virtual one;
  4. Even even more… since for him the virtual and the real world are the same, you can train the robot in the virtual world and then make it operate in the real world only when he has learned everything.

The last point is the disruptive one: making a bot learn by trial and error in the real world can be a problem, since this could mean damaging the robot with every error it makes, plus a lot of money spent during the training. Or, even worse, it could harm people. But if the robot can operate in a virtual world that has realistic physics (NVIDIA owns the famous PhysX physics engine) and realistic graphics (it’s NVIDIA…), it can simulate all its movements inside the virtual environment and learn safely from its errors while in VR. When the robot has learned enough, he can start operating in the real world. Nice move, NVIDIA.
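To make the idea more concrete, here is a toy Python sketch of the “train in simulation, deploy in reality” concept. Everything in it (the environment, the rewards, the simple learning rule) is invented by me for illustration; it has nothing to do with NVIDIA’s actual software.

```python
import random

# Toy illustration of "train in simulation, deploy in reality".
# The environment, rewards and learning rule are all made up.

class SimulatedDominoTable:
    """Stand-in for a physics-accurate simulator (NVIDIA would use PhysX)."""
    def reset(self):
        self.turn = 0
        return self.turn

    def step(self, action):
        # A wrong move here costs nothing: it is only simulated.
        reward = 1.0 if action == self.turn % 2 else -1.0
        self.turn += 1
        return self.turn, reward, self.turn >= 10  # state, reward, done

def train(env, episodes=1000, epsilon=0.1, lr=0.1):
    q = {}  # state -> estimated value of each of the two possible actions
    for _ in range(episodes):
        state, done = env.reset(), False
        while not done:
            values = q.setdefault(state, [0.0, 0.0])
            # Epsilon-greedy: mostly exploit, sometimes explore.
            action = random.randrange(2) if random.random() < epsilon \
                else max((0, 1), key=lambda a: values[a])
            state, reward, done = env.step(action)
            values[action] += lr * (reward - values[action])
    return q

policy = train(SimulatedDominoTable())
# Only now, with the policy learned safely in the virtual world,
# would you let the physical robot move in the real one.
```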


Four-pole GVS

You all know about motion sickness and what causes it: the eyes perceive that you’re moving in VR, while the vestibular system perceives that you’re actually still, and this sensory discrepancy makes your brain think that it’s a good idea to puke everything you’ve eaten (to expel the poison). I’ve already talked about how a developer can design a game to reduce this sickness effect… but what if we could trick the vestibular system into thinking you’re moving, so as to make its info coherent with that of your eyes?

Well, this is actually possible thanks to systems employing Galvanic Vestibular Stimulation (GVS): basically, you put electrodes near your vestibular system and then fire electrical stimuli at it to emulate the sensations you want. So you can trick it into believing that you’re moving forward, backward, towards the left, the right, etc… GVS has been around for a while, but it has always gotten mixed feedback from the community: a lot of people don’t like the idea of messing with such a delicate part of our body using electricity (and I admit I’m pretty scared myself), and other people have reported side effects after using it. The following video is about last year’s GVS solution from Samsung.

At SIGGRAPH, according to Road To VR, professor Aoyama showcased an innovative GVS system employing 6 electrodes, so electricity can be fired at more parts of your head. This allows the simulation of movement and acceleration in more directions: if you use only two electrodes, you can only simulate acceleration towards the left or the right, while if you employ more, you can simulate even upwards and downwards acceleration. Very cool if you have to play a space-themed game.

The system proposed by professor Aoyama (Image from Road To VR)
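If you’re wondering why more electrodes mean more directions, here is a little Python sketch of the idea. The electrode placements and the gain value are completely made up by me for illustration (please don’t wire up your head following code from a blog post!).

```python
# Why more electrodes mean more directions, in toy form.
# Electrode placements and the gain value are invented.

# Each electrode pair can push the perceived acceleration along one axis.
ELECTRODE_PAIRS = {
    "left-right": ("left mastoid", "right mastoid"),  # classic 2-pole GVS
    "front-back": ("forehead", "nape"),               # extra poles...
    "up-down":    ("upper temple", "lower temple"),   # ...extra axes
}

def gvs_currents(accel, gain=0.5):
    """Map a desired perceived acceleration (x, y, z) to one current per pair.
    The current's sign decides the perceived direction along that axis."""
    return {axis: a * gain for axis, a in zip(ELECTRODE_PAIRS, accel)}

# Two electrodes only give you the first axis; six let you fake the
# accelerations of a space dogfight:
print(gvs_currents((0.2, -0.1, 0.8)))
```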

Aoyama claims that this GVS system can be made very small and embedded into a headset. This is really interesting for the future of VR… but honestly, I wouldn’t like to beta-test it. I hope that some braver man (or woman) will sacrifice their ears for the future of VR.

MEETMIKE

MEETMIKE is an experience showcased at SIGGRAPH that has made everyone’s jaw drop. Have a look at it:

This is a realistic model of journalist Mike Seymour, composed of 440,000 triangles, that is animated in real time in virtual reality (so at 90 FPS!). Thanks to a motion capture system, the journalist could move his face and see his virtual counterpart move in VR as well. Look at the video: in some frames it seems so damn real. And it is all rendered in real time at 90 FPS! It’s impressive! This has been possible thanks to the cooperation of different companies (including Epic, since it is rendered through Unreal Engine) and research institutes. If you’re curious, you can read more about it on Road To VR.

Shape-changing VR controllers

When I talked with Alex Colgan of Leap Motion, he pointed out that the haptic feedback of VR controllers is far from optimal: a VR controller has one particular shape, so the only shape you can feel in your hand is the controller’s. For instance, Touch controllers are good at emulating a gun, but bad at emulating a magic wand. Cornell University is trying to mitigate this issue with its Vive controller add-on.

As you can see from the video, this is a handle for VR controllers that, thanks to air that gets pumped inside it, can change shape so that it matches the object you’re holding in VR. The cool part is that since the shape change is very fast, you can also emulate moving objects: imagine holding a squishy eel in VR with this controller! The system is still very rough and still requires wires (and this is an adoption killer), but I see a lot of uses for it, especially in arcades, so it’s a project that I find very interesting.

Furthermore, the project is evolving fast: now it can work not only with a Vive controller, but also with just a Vive tracker, so the haptic controller has more freedom to provide the shape it wants. Very cool.

Weather in VR

Researchers from Taiwan’s Tamkang University have created a device to simulate different weather conditions inside arcades. Have a look at this video, where an UploadVR journalist tries what it’s like to be immersed in a hot VR environment.

The system reportedly still has a long way to go, but it is interesting nonetheless as a way to offer users a deeper sense of immersion inside arcade experiences, especially ones set in particularly hot or cold environments (e.g. a desert, the North Pole, etc…).

Atom View and Nu Design

Nurulize has showcased two new products: Atom View and Nu Design.

Atom View is “a cutting edge tool for raw volumetric data processing, color management, and delivery“. When you scan an object or an environment with a 3D scanner, you obtain a so-called “point cloud” of the points collected by the scanner. With this software you can take this complicated cloud and manage it easily: you can modify it, correct its glitches, change its colors and then use it very easily inside your standard game or movie creation workflow. It has been optimized with VR in mind, which surely means that Atom View is able to take this complicated point cloud and simplify it, while maintaining astonishing visuals, to allow its use inside virtual reality. It already works with the UE4 game engine, and Unity support is coming soon. So, it takes files from a lot of different sources and then lets you use them in an optimized way inside your standard VR creation tools… cool!
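We don’t know what Atom View actually does internally, but to give you an idea of what “simplifying a point cloud” can mean, here is a minimal Python sketch of voxel-grid downsampling, a classic technique (not necessarily Nurulize’s) where every cubic cell of space keeps just one averaged point.

```python
# Minimal voxel-grid downsampling: replace all the points falling inside
# each cubic cell with a single point averaging their positions and colors.

from collections import defaultdict

def voxel_downsample(points, colors, voxel_size=0.05):
    """points / colors: lists of (x, y, z) and (r, g, b) tuples."""
    buckets = defaultdict(list)
    for p, c in zip(points, colors):
        key = tuple(int(v // voxel_size) for v in p)  # which cell the point is in
        buckets[key].append((p, c))
    out_points, out_colors = [], []
    for cell in buckets.values():
        n = len(cell)
        out_points.append(tuple(sum(p[i] for p, _ in cell) / n for i in range(3)))
        out_colors.append(tuple(sum(c[i] for _, c in cell) / n for i in range(3)))
    return out_points, out_colors

# A scan with millions of points collapses to one averaged point per cell,
# which is the kind of simplification that lets it render at VR frame rates.
```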

Nu Design, instead, is a collaborative VR environment where up to 8 people can work together inside VR and use volumetric data to create something cool together. It features the possibility of moving inside the virtual world, taking measurements, pointing at stuff, talking and sketching things inside VR. I guess its purpose is to enable people to use volumetric data to create a virtual experience together. And yes, to me it seems a competitor of the above-described HoloDeck.

New York Times Speech

VR is the ultimate empathy machine and the New York Times truly believes in it. You can have a look here at the super-long session they held about VR storytelling and possible uses of VR in journalism.

Neurable Brain-Computer Interface

This is absolutely my favorite piece of news from SIGGRAPH. Neurable, a virtual reality startup focused on brain-computer interfaces, showcased its brain-control device. I know, I know… it sounds like a science fiction statement, but it’s all real.

Basically, they’ve put brainwave-reading sensors onto an HTC Vive; this device continuously monitors the user’s brainwaves and fires the data to the PC. There, software analyzes these waves and tries to infer which object you’re thinking about. UploadVR journalist Ian Hamilton describes a demo where there are some objects on the screen and you have to think about one of them. After some seconds, the system tells you the one you thought about, and it makes very, very few errors. Then there is another demo where the brain-computer interface is used in conjunction with eye tracking: you can look at various objects in the room and then think “grab” when you want to take one and… magic! It is now in your hands! This is fantastic since it solves one of the problems of eye tracking, that is, selecting objects: with your eyes you can point at objects like with a mouse, but our eyes lack the “click” command to trigger the selection. Using a BCI can be the solution.
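To make the “eyes point, brain clicks” idea more concrete, here is a hypothetical Python sketch of such a selection loop. Both sensor functions are placeholders invented by me: the real ones would come from the eye tracking SDK and from Neurable’s proprietary classifier.

```python
import time

# Hypothetical sketch of the "eyes point, brain clicks" loop.
# Both sensor functions below are invented placeholders.

def read_gaze_target():
    """Placeholder: id of the object the user is currently looking at."""
    return "blue_cube"  # dummy value; the real one comes from the eye tracker

def classify_brainwaves():
    """Placeholder: (label, confidence) guessed from the EEG stream."""
    return ("grab", 0.95)  # dummy value; the real one comes from the EEG headset

def selection_loop(grab_object, threshold=0.9):
    while True:
        target = read_gaze_target()        # the eyes are the mouse pointer...
        label, confidence = classify_brainwaves()
        if target and label == "grab" and confidence >= threshold:
            grab_object(target)            # ...and the thought is the click
            return target
        time.sleep(1 / 90)                 # poll once per VR frame

print(selection_loop(lambda obj: print("grabbed", obj)))
```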

BCIs have always fascinated me, because I think that in the far future they will be the way we interact with electronic devices. Why should I waste my time typing on a keyboard when I could just think about what I want to write and have it magically appear on my screen? And of course, I envision bi-directional BCIs. There are already experiments on injecting information into the brain… wow, the Matrix is coming!

Unluckily, we’re still very far from that. My best guess about Neurable is that they’ve trained some AI to detect the brainwaves associated with certain words (like “grab” or “potato”): I already saw experiments about this, aimed at helping disabled people, years ago. Detecting a lot of words or a complex thought is another kettle of fish. But it’s cool anyway.

Ultrahaptics demo

Ultrahaptics is an interesting company developing a device that is able to offer haptic sensations at a distance using ultrasound waves. So you can have haptics without wearing anything… isn’t that incredible? At SIGGRAPH they showcased a demo using HoloLens that lets you touch “holograms” (i.e. augmented reality elements) with your hands. Have a look at it here:

This video shows how it works

I found it just awesome!
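For the curious, the core trick of an ultrasound phased array can be summarized in a few lines of Python: give each transducer a phase advance proportional to its distance from the desired focal point, so that all the waves arrive there in phase and the acoustic pressure peaks right on your bare skin. The array layout and frequency below are illustrative values of mine, not Ultrahaptics’ specs.

```python
import math

# Phased-array focusing: a phase advance proportional to each transducer's
# distance from the focal point makes all waves arrive there in phase.
# Array layout and frequency are illustrative values.

SPEED_OF_SOUND = 343.0  # m/s in air
FREQ = 40_000           # Hz; 40 kHz is a typical ultrasonic transducer frequency

def phase_advances(transducers, focus):
    """Per-transducer phase (radians) that makes all waves align at `focus`."""
    wavelength = SPEED_OF_SOUND / FREQ  # about 8.6 mm at 40 kHz
    return [(2 * math.pi * math.dist(t, focus) / wavelength) % (2 * math.pi)
            for t in transducers]

# A 2x2 patch of an emitter array (1 cm spacing), focusing 15 cm above it:
array = [(x * 0.01, y * 0.01, 0.0) for x in range(2) for y in range(2)]
print(phase_advances(array, (0.005, 0.005, 0.15)))
```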

Optitrack’s improved tracking

VR arcade parks like The Void offer multiplayer full-body VR inside large spaces. This is possible thanks to motion capture solutions developed by companies like Optitrack. Optitrack is a leading company in tracking people inside VR: we’re talking about high-quality, high-cost solutions dedicated, for instance, to arcade owners, so we’re not talking about consumer products. Consumers can use Vive Trackers or Kinects to achieve full-body virtual reality.

Optitrack has announced some cool improvements to its technology:

  • Now it offers a faceplate for the Oculus Rift CV1 that makes the Rift an Optitrack-tracked (Opti-tracked?) device. When you use Optitrack, you can’t use Oculus positional tracking through Constellation, so you have to use Optitrack to perform the positional tracking of the Rift. This faceplate allows arcade owners to track the Rift in a very simple and effective way, for “only” $749;
  • It has developed a little device that users can wear on their feet to have full-body VR. Currently, most VR experiences employing Optitrack track only the headset, the user’s backpack and the objects he/she is holding in his/her own hands, like rifles. This means that the feet are not tracked and their position is inferred through inverse kinematics (more on this in the sketch after this list), and that’s bad, since it sometimes results in wrong and/or unnatural leg poses. Thanks to this new little device developed by Optitrack, users can have their full body tracked: they just have to put one device on each shoe, and then the system can reconstruct the full-body pose of every user inside the room! If you’re thinking “this is exactly the same thing that we can do with Vive Trackers”, yes, you’re right: the only differences are that the Optitrack solution is more precise, can handle multiple people and can operate in larger spaces. And it costs a lot more, of course.

    Optitrack active marker. It is 9.5 cm × 9.5 cm… so not that small. It runs on a battery (Image by UploadVR)
  • It has developed a new calibration algorithm that makes the system continuously auto-calibrate itself. This means that after the initial Optitrack setup and configuration, it doesn’t need to be calibrated anymore. This is a great time saver, considering that older versions required recurrent re-calibrations performed by waving hands.
Render of the Optitrack full-body solution (Image from UploadVR)
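As promised above, here is a minimal 2D sketch of two-bone inverse kinematics (hip → knee → foot) via the law of cosines, to show why untracked feet produce weird legs. Real mocap systems solve this in 3D with many more constraints, and the bone lengths here are illustrative; the core issue stays the same: IK returns a plausible pose, not necessarily the pose your real leg is in.

```python
import math

# Minimal 2D two-bone IK (hip -> knee -> foot) via the law of cosines.
# Bone lengths are illustrative; real systems solve this in 3D.

def two_bone_ik(target_x, target_y, thigh=0.45, shin=0.45):
    """Return (hip_angle, knee_bend) in radians placing the foot at the target."""
    d = math.hypot(target_x, target_y)
    d = max(min(d, thigh + shin - 1e-9), 1e-9)  # clamp unreachable targets
    # Law of cosines: interior knee angle needed to span distance d...
    interior = math.acos((thigh**2 + shin**2 - d**2) / (2 * thigh * shin))
    knee_bend = math.pi - interior              # ...0 means a straight leg
    hip = math.atan2(target_y, target_x) - math.asin(shin * math.sin(interior) / d)
    return hip, knee_bend

# Foot slightly forward of and below the hip:
print(two_bone_ik(0.2, -0.7))
# The answer is geometrically valid, but nothing guarantees it matches where
# your real knee is: a tracker on each foot removes the guesswork.
```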

You know that I’m a fan of full-body VR without worn sensors, but I know that at the moment the solutions where the user is required to wear sensors are the most stable ones… so this news is more than welcome.

VNect full-body tracking

Speaking of full-body tracking… what if it were possible using only a standard webcam? We could have full-body VR without additional hardware like a Kinect: an affordable and easy solution for complete VR immersion. A paper presented at SIGGRAPH (thanks, Gianni, for the tip) by Dushyant Mehta et al. shows a framework that could make this possible:

Of course the system is far from perfect… but something like this could change everything in the full-body VR realm… I’m astonished.
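I obviously can’t reproduce the paper’s neural network here, but the pipeline it enables is simple to sketch in Python: grab RGB frames from an ordinary webcam, run a pose-regression model on each one and drive the avatar with the resulting 3D joints. estimate_3d_pose() is just a placeholder of mine for the trained CNN described by Mehta et al.

```python
import cv2  # pip install opencv-python

# Skeleton of the pipeline the paper enables: ordinary RGB webcam in,
# 3D joint positions out, no depth sensor needed.

def estimate_3d_pose(frame):
    """Placeholder: {joint_name: (x, y, z)} regressed from one RGB frame."""
    return {}

def track_from_webcam(apply_to_avatar):
    cap = cv2.VideoCapture(0)  # any standard webcam
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            joints = estimate_3d_pose(frame)  # the paper does this in real time
            apply_to_avatar(joints)           # drive the VR avatar's skeleton
    finally:
        cap.release()

# track_from_webcam(print)  # uncomment to run with a connected webcam
```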

Video summary

UploadVR senior editor Ian Hamilton has made his own summary of his SIGGRAPH experience in one short video. It’s cool, have a look at it. It also features Google’s algorithm to show your face inside the headset in mixed reality, and Flock, an interesting musical experience to be performed in groups, where all the participants see themselves as birds and have to eat bugs… and in the end, seen from the outside, they all look like idiots!


And that’s it! I know the article is so long that reading it takes more time than attending the whole SIGGRAPH, but I hope you’ve appreciated it nonetheless. Please subscribe to my newsletter and have a nice day!


