All you need to know about SteamVR Tracking 2.0… will it be the foundation of Vive 2?
Yesterday everyone was excited talking about Apple's entrance into the AR/VR world, while I was actually excited by another piece of news: the upgrade of SteamVR's Lighthouse tracking. After various previews leaked in recent months, Valve has officially announced the upgrade with a community post. Let's see together what this announcement is about and why it is so important.
SteamVR tracking v1.0
Before explaining the announcement, it is better to take a step back and explain how Lighthouse tracking works. I admit I had never understood it completely until today, when I did a lot of research to write this article, so I'm going to make a little recap for those of you who, like me, didn't fully get it. If you're already a Lighthouse master, skip directly to the next paragraph.
Basically the tracking is composed of two agents: the Lighthouse stations and the various sensors on the headset and VR controllers. Each Lighthouse station is composed of IR LEDs flashing at regular intervals and two little motors sweeping laser beams across the room, one spinning horizontally and the other vertically. Since we're talking about IR light, we can't see it, but sixty times a second these little stations are actually irradiating our rooms with light.
The big flash you're seeing is produced by the IR LEDs. The two rotating cylinders each contain a laser emitter (the bright spot that you see on them) that sweeps the room with laser light. What is all of this useful for? We'll see in a while.
Let's take a single sensor on a Vive headset: currently it is a TS3633 circuit by Triad Semiconductor. When it sees the bright flash of the IR LEDs (basically it detects a big burst of light), it "resets" itself, so to speak, and starts counting. It keeps counting until it gets hit by the moving laser rays of the two cylinders. When it gets hit, it communicates to the "brain" of the Vive tracking system the time at which it was hit. So, this circuit will communicate something like "I've been hit 0.001 seconds after the reset light", "I've been hit after 0.002 seconds", "I've not been hit" and so on, forever. Knowing the frequency at which the reset light flashes, the speed at which the cylinders spin and all this stuff, the "brain" is able to extract some information about the position of each sensor.
Let's look again at the above GIF: there is a reset flash circa every 16.6 ms (60 Hz) and, as you can see, in this time the horizontal cylinder sweeps its full 180 degrees; then there is another reset and the vertical cylinder sweeps its 180 degrees; then the loop begins again. Suppose that after the first reset, our Triad sensor gets hit after exactly 8.3 ms. This means that our sensor was hit when the laser ray attached to the horizontal motor was at 90° (8.3 ms is half of 16.6 ms, so the cylinder was halfway through its 180° sweep, i.e. at 90°), so completely frontal to the station. We now know that this sensor is somewhere horizontally in front of the station. At the next iteration, the vertical sweep starts and we get hit after 5.5 ms. This means that the vertical beam was at circa 60° of its sweep (5.5 ms = 16.6 ms / 3, so the angle must be 60° = 180° / 3). We now have some solid data about the position of the sensor: it lies horizontally in front of the station, at 30° from the frontal direction vertically (60° of sweep is 30° away from the 90° mid-sweep position). We don't know the exact position, but we have a line of possible positions for the sensor (we can't reconstruct the distance, so the possible positions lie along a ray).
(As a note for purists: I know that between the horizontal and the vertical sweep the headset has moved, so in reality the overall calculation is a bit different, but I think the simplification I made conveys the idea better.)
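To make the timing math concrete, here is a minimal sketch in Python of the time-to-angle conversion described above. The numbers are the illustrative ones from my example, not the real Lighthouse constants:

```python
# Minimal sketch of the v1 timing math described above.
# Numbers are the illustrative ones from the example, not real Lighthouse constants.

SYNC_PERIOD_MS = 16.6   # time between two reset flashes (60 Hz)
SWEEP_DEGREES = 180.0   # each rotor sweeps half a circle per period

def hit_time_to_angle(hit_time_ms):
    """Convert 'hit X ms after the reset flash' into a beam sweep angle."""
    return (hit_time_ms / SYNC_PERIOD_MS) * SWEEP_DEGREES

print(hit_time_to_angle(8.3))  # -> 90.0 deg: sensor straight in front of the station
print(hit_time_to_angle(5.5))  # -> ~59.6 deg, circa the 60 deg of the example
```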
This video surely helps you visualize the process better: in it you can see the flashing reset lights and the sweeping lasers going through the room.
Since the base stations are synchronized with each other, we have calibrated the SteamVR system and we know exactly the relative position of each sensor with respect to the others (we manufactured the headset, so we know exactly where every sensor sits on it), we can take all this data, feed it to the tracking brain and, with some math magic, reconstruct the position of the headset with great accuracy. The Vive also mounts IMU sensors, so their data too is taken into account in the reconstruction of the position and rotation of the device.
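I don't know Valve's actual solver, but conceptually the "math magic" is an optimization like the toy sketch below: given the sensor positions in the headset frame (known from manufacturing) and the ray each sensor must lie on (derived from the sweep angles), find the rigid pose that puts every sensor onto its own ray. Every number and name in it is invented for illustration:

```python
# Toy pose solver: NOT Valve's actual algorithm, just the general idea.
# Rays start at the base station, which we take as the world origin.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

# Four sensors of an imaginary headset, in the headset's own frame (meters).
sensors_local = np.array([
    [-0.08,  0.03, 0.0],
    [ 0.08,  0.03, 0.0],
    [-0.08, -0.03, 0.0],
    [ 0.08, -0.03, 0.0],
])

# Fabricate a "true" pose and the ray directions a station would measure.
true_rot = Rotation.from_euler("xyz", [5, 20, 0], degrees=True)
true_pos = np.array([0.3, -0.1, 2.0])
true_world = true_rot.apply(sensors_local) + true_pos
ray_dirs = true_world / np.linalg.norm(true_world, axis=1, keepdims=True)

def residuals(params):
    """Perpendicular distance of each transformed sensor from its measured ray."""
    rot, pos = Rotation.from_rotvec(params[:3]), params[3:]
    world = rot.apply(sensors_local) + pos
    along = np.sum(world * ray_dirs, axis=1, keepdims=True)  # projection onto ray
    return (world - along * ray_dirs).ravel()

# Start from a rough guess and let least squares do the "math magic".
solution = least_squares(residuals, x0=np.array([0, 0, 0, 0, 0, 1.5]))
print("estimated position:", solution.x[3:])  # ~ [0.3, -0.1, 2.0]
```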
For many people Lighthouse is the best tracking technology available at the moment, since it is very precise, cheap and highly customizable (it allows tracking of many objects, including the versatile Vive Trackers).
Tracking myths
I just want to clarify two misunderstandings that I myself had:
- Lighthouse doesn't necessarily need two stations. Once you've configured everything, you can just use one, as Reddit users confirm. From the above explanation it is in fact clear that you just need one light source to do everything. The second station is fundamental to increase tracking precision and to guarantee robustness in case of occlusion: putting one station in front of the other makes it really probable that at least one of the two manages to hit the sensors;
- Lighthouse is all about IR (infrared) light. Until today I lived convinced that Lighthouse was using lasers and not infrared; then, digging into the technical specs, I understood that I was wrong. But also right. In fact, as Wikipedia says:
A laser is a device that emits light through a process of optical amplification based on the stimulated emission of electromagnetic radiation. The term “laser” originated as an acronym for “light amplification by stimulated emission of radiation”.
So, laser is just a way of emitting light, and Lighthouse stations use IR lasers. That's why the system suffers so much interference from IR devices like the Kinect.
SteamVR tracking v2.0
Valve has decided that its Lighthouse tracking wasn't cool enough, so they announced Lighthouse tracking v2.0. In the past we had already had some news from Triad Semiconductor announcing a new type of sensor that reduces the overall cost of the Vive, but the news is even bigger.
The improvements regard both the sensors and the stations; let's see how.
First of all, let's talk about the new sensors, the amazing TS4231. These are the ones I was talking about in my previous article: since they have only 5 components instead of 11, they make a reduction of costs possible. Furthermore, this little device can also enter a power-saving mode, enabling devices to have lower power consumption. But the big news is the addition of the DATA output pin. What does this mean? It means that while the previous TS3633 model could only communicate simple data to the brain on the ENVELOPE pin (basically the "I've been hit after 5.5 ms" information), the new one can also communicate more complex data (like "the answer to life, the universe and everything is 42"). Where is this data taken from? Certainly not from "The Hitchhiker's Guide to the Galaxy", but from the wave of the IR light intercepted by the sensor. The sensor mounts a light-to-digital converter, able to extract the information modulated inside the wave and translate it into digital data output on the DATA pin. We'll see why this is a game changer in a while.
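To picture the difference between the two sensor generations, here is a hedged sketch of what the host could read from each of them; the function and field names are mine, not Triad's, and the payload encoding is invented:

```python
# Hedged sketch of the difference between the two sensor generations, as I
# understand it from the datasheets. Names and structures are mine, not Triad's.

def read_ts3633(envelope_events):
    """v1: the only output is WHEN the envelope pin pulsed."""
    # e.g. envelope_events = [0.0055] -> "hit 5.5 ms after the sync flash"
    return [("hit_at_seconds", t) for t in envelope_events]

def read_ts4231(envelope_events, data_bits):
    """v2: same envelope timing, PLUS a digital payload demodulated
    from the laser light itself and clocked out on the DATA pin."""
    hits = [("hit_at_seconds", t) for t in envelope_events]
    payload = int("".join(str(b) for b in data_bits), 2)  # whatever the beam encoded
    return hits, ("decoded_payload", payload)

print(read_ts3633([0.0055]))
print(read_ts4231([0.0055], [1, 0, 1, 0, 1, 0]))  # 0b101010 -> 42, of course
```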
The base stations have been optimized so that they no longer need two different motors to sweep the light. Thanks to engineering magic, a single rotating cylinder is able to sweep rays both horizontally and vertically. This is an incredible reduction of costs (you cut the price of the motors in half!). Furthermore, the "reset" LEDs have been removed, since they're now useless. Wait… what?? Useless?
Yes, useless. Since we're now able to encode/decode data directly into the IR laser beams, we don't need a reset signal anymore. Let me explain with a simple example. Suppose that the horizontal beam now carries, encoded within it, information about its current angle: you don't need timing anymore. I mean, if at 30° the ray can contain information like "I'm a ray at 30°" or "I'm a ray emitted 5.5 ms after the start of the sweep", you no longer need a stupid timing reset. You just take that number and give it to the tracking "brain". How to encode this information inside the light beams is some signal-processing magic: basically I think that, like with radio, you have a carrier wave at a certain frequency and you modulate the actual data onto it (music in the case of radio, numbers in the case of this tracking system)… but I'm not an expert in this field at all. The only thing clear to me from the sensor's official datasheet is that the sensor is able to translate light waves into digital data. So, the important thing is that by modifying the waves of the laser beams, we are able to communicate information to the sensors.
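The actual modulation scheme is not public as far as I know, so here is just a toy comparison of the two ways of obtaining the sweep angle; the v2 payload format below is completely invented by me for illustration:

```python
# Toy comparison of the two ways to obtain the sweep angle. The v2 encoding
# (angle packed directly into the beam's payload) is my own invented
# illustration; the real modulation scheme is Valve/Triad signal magic.

SYNC_PERIOD_MS = 16.6
SWEEP_DEGREES = 180.0

def angle_v1(hit_time_ms):
    """v1: angle is inferred from the time elapsed since the sync flash."""
    return (hit_time_ms / SYNC_PERIOD_MS) * SWEEP_DEGREES

def angle_v2(beam_payload):
    """v2: no sync flash; the beam itself tells you where it is.
    Here I pretend the payload is simply the angle in hundredths of a degree."""
    return beam_payload / 100.0

print(angle_v1(5.5))    # ~60 degrees, but needs the reset flash as a reference
print(angle_v2(6000))   # 60 degrees, read straight from the modulated light
```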
Thanks to these engineering efforts, the Lighthouse stations have been deeply redesigned and their cost is incredibly reduced. Look how empty a Lighthouse station box becomes in its second version:
In a period when everyone is trying to reduce VR costs, this update means an incredible cost reduction, both on headsets and on tracking stations. The Vive 2's price will surely be lower than the Vive 1's.
But there's even more: since we're removing the IR synchronization signal, which was the one causing most of the interference, Lighthouse tracking will become more stable, and I guess it could also become usable alongside external sensors like the Kinect.
Another great piece of news is that we'll be able to use more than two Lighthouses: since we can encode info in each laser beam, the beam can also contain the ID of the Lighthouse station casting it. This reduces tracking ambiguity (the sensor knows exactly which station hit it at any given moment) and makes it possible to use any number of tracking stations (see the sketch after the quote). So we could use a bazillion Lighthouse stations and track entire warehouses, something that is awesome for VR arcades. As Triad says:
This higher speed digital link opens up the capability of using more than two base stations. Multiple base stations installations are useful for digital out of home experiences, arcades, “house-scale” tracking, and non-VR tracking applications such as warehouse-scale or grocery-store-scale tracking.
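Here is the sketch I mentioned: with an ID encoded in every beam, each hit becomes self-describing, so the measurements can simply be grouped by the station that produced them, however many stations there are. All names and values are invented:

```python
# Sketch of why per-beam IDs remove the two-station limit. With v1, a hit was
# anonymous and stations had to share time slots; with v2, each hit carries
# the ID of the station that produced it. Field names are invented.
from collections import defaultdict

# Calibrated poses of N stations, keyed by the ID they broadcast in the beam.
station_poses = {0: "pose_A", 1: "pose_B", 2: "pose_C", 3: "pose_D"}

hits = [
    {"sensor": 12, "station_id": 2, "angle_deg": 57.1},
    {"sensor": 12, "station_id": 0, "angle_deg": 101.4},
    {"sensor": 31, "station_id": 3, "angle_deg": 88.0},
]

# Group the measurements per station: each group yields rays in a known frame,
# so the pose solver can fuse as many stations as you can install.
rays_by_station = defaultdict(list)
for hit in hits:
    rays_by_station[station_poses[hit["station_id"]]].append(hit)
print({station: len(rays) for station, rays in rays_by_station.items()})
```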
The Vive is becoming the best product for VR arcades: it has an incredible tracking system, a clear commercial license, and a completely open architecture that makes all kinds of customizations possible (for example, it is possible to create custom controllers with Vive Trackers or by using Triad sensors directly). And the Vive is Taiwanese (so Chinese), and arcades are enormous in China, so I expect lots of money for them.
UPDATE: according to Road to VR, when this new tracking technology comes out, it will allow out of the box the use of 4 tracking stations to track an area of 10 m x 10 m (33 x 33 feet). This is incredible.
But…
But there is a problem: since we've removed the synchronization blink, a Vive v1 surely can't work with this new kind of base station. Its TS3633 sensors would just count forever, waiting for an IR blink that never arrives. This means that there is no backward compatibility on the stations' side: no Vive v1 hardware can work with these evolved base stations. The contrary instead holds: since the TS4231 has been made with backward compatibility in mind, it can be used with Lighthouse v1.
This means that we’re talking about a disruptive innovation of this VR tracking technology. Once we go 2.0, we can’t go back. This has huge implications.
The timing
Valve writes in its statement that
Valve will have base stations available in production quantities starting in November 2017. If you would like engineering samples of those base stations, let us know. Those will be available in June.
This means that from now on, everyone wanting to create hardware for SteamVR has to design it around the new TS4231 sensor. This includes HTC itself, which leads me to my last point…
The speculation
Until now, HTC has worked on a sort of Vive 1.5, creating add-ons compatible with the Vive 1.0 (like the Audio Strap, for instance). Valve is instead now proposing a non-backward-compatible upgrade, a tracking 2.0 incompatible with the Vive 1.0. This immediately makes me think about a Vive 2.0, which will surely use the new Triad sensors.
Considering that mass production of this new technology will begin at the end of this year, I can envision that this mass production is happening in preparation for the release of a new device using this kind of stations, i.e. the Vive 2. I think this could be the time Vive announces its new device, which will presumably ship in 2018, two years after the first one (and that seems a reasonable product cycle to me). The new Vive will be far cheaper, thanks to these upgrades, and will have new features, like house-scale tracking and maybe eye tracking and foveated rendering (maybe… it would be great). Surely it will add Oculus Touch-like controllers, with full hand tracking.
People with a Vive 1.0 will be able to just buy the new 2.0 headset, without buying the Lighthouse stations again, thanks to the backward compatibility of the sensors, while new users will be able to buy a complete system cheaper than the first one. In both cases, people won't spend the full price of a Vive 1.0 again.
The arrival of the new, cheaper Vive, maybe released in mid-2018, will contribute to the widespread adoption of VR, which according to John Riccitiello is expected between the second half of 2018 and the beginning of 2019. Everything makes sense.
Of course this last paragraph is all speculation, because I love speculating about the future of VR :). Don't take it for granted!
Let me know your impressions on this new version of the Lighthouse tracking… and don't forget to share this article and to subscribe to my newsletter. Thanks 😉
(Header image by Road To VR)
Disclaimer: this blog contains advertisement and affiliate links to sustain itself. If you click on an affiliate link, I'll be very happy because I'll earn a small commission on your purchase. You can find my boring full disclosure here.
Wow, really enjoyed this article Tony, thanks! “I admit I had never understood it completely until today” …same here, until today :).
Anyway, I'm still wondering how the system knows the base stations' positions, needed (I guess) as input for all the math that comes later in order to get the sensor's exact spatial position. Maybe some kind of time-of-flight algorithm for each sensor, to get the distance from the IR emitter to the sensor and so pick a single position on that ray you mention, on which all the possible positions lie? Or maybe the base stations include an IMU which enables them to report their position at startup. Dunno…
Btw, have you had the chance to look at a Kinect depth map while the Lighthouse base stations were powered on? I'm just wondering how many of those "I have no information to tell" black holes you get in the Kinect depth map. I used two Kinects facing each other for my CS degree thesis, and the interference was really intense. We managed to solve it using two little shaking motors on top of each Kinect (so the IR patterns get less interlaced and don't get messed up). And believe it or not, it really does the trick! Check "A. Maimone and H. Fuchs, «Reducing interference between multiple structured light depth sensors using motion»" if you have the chance.
There's no time-of-flight in Lighthouse, since no light bounces back to the stations… I guess you reconstruct the pose using information from multiple points. Let's make an example: if you have 2 sensors on the Vive whose distance is 2 cm, once you've estimated the rays onto which those points hypothetically lie, you can select the positions that guarantee a distance of 2 cm. On the Vive you have lots of sensors, so with some least-squares approach, some optimization algorithms or something like that, you can reconstruct the pose of the device using all the rays estimated for all the sensors. I admit I don't know the exact math, even if, looking around, I think I could obtain something usable.
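To make the 2 cm example concrete, here is a tiny toy sketch (my own illustration, surely not the real solver): two rays leaving the station and, for a few trial depths along the first ray, the depth along the second ray that keeps the two points exactly 2 cm apart. Notice that more than one trial depth admits a solution: two sensors alone leave a family of possible positions, which is exactly why the Vive carries lots of sensors (plus the IMU):

```python
# Toy version of the 2 cm example above: two rays from the station, and we
# look for depths along them where the reconstructed points are 2 cm apart.
import numpy as np

d1 = np.array([0.0, 0.0, 1.0])      # unit ray toward sensor 1
d2 = np.array([0.01, 0.0, 1.0])
d2 = d2 / np.linalg.norm(d2)        # unit ray toward sensor 2

for t1 in [1.0, 2.0, 3.0]:          # try a few depths along ray 1
    p1 = t1 * d1
    # solve ||t2*d2 - p1|| = 0.02 for t2 (a quadratic in t2)
    b = -2 * np.dot(d2, p1)
    c = np.dot(p1, p1) - 0.02**2
    disc = b * b - 4 * c
    if disc >= 0:
        t2 = (-b + np.sqrt(disc)) / 2
        print(f"t1={t1:.2f} m -> t2={t2:.3f} m keeps the sensors 2 cm apart")
    else:
        print(f"t1={t1:.2f} m -> no valid t2: the rays are already too far apart")
```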
About Kinect+Vive: the Kinect v2 has been designed to live in people's kitchens, so it has some filters that help it function well. In fact it is not the Kinect that suffers the interference, it is the headset… it greys out completely, since it loses tracking. This happens only with the Kinect v2; the Kinect v1 is safe with the Vive.
About the interference between 2 Kinects, I knew about that issue: the interesting part is that it happens only with the Kinect v1, while the Kinect v2 can be used freely in arrays. That's why at Immotionar we managed to use even six of them together to perform full-body VR.
Thanks for the insightful comment! 🙂
Dunno the rest of the math either, but what you mentioned at least makes sense.
My bad, I forgot to mention that I was using a Kinect v1 (even the Xbox 360 one, not the Windows version o.O), based on the old structured-light algorithm. It is true that the v2 has no interference at all between sensors; thanks for the reminder.
It's no myth that the Vive base station uses lasers; even the IR light is a laser. You can read the following warning in the manual:
“This product contains Class 3B laser, which can produce hazardous level of Class 3B and 4 laser radiations. However, the design of this product incorporates optics, a protective housing and scanning safeguard such that there is no access to level of laser radiation above Class 1. Only trained factory service personnel should open the protective housing.
CAUTION – Class 3B LASER RADIATION WHEN OPEN. AVOID EXPOSURE TO THE BEAM.”
Thanks for the quote!
I wrote that because I had great confusion about the meaning of "laser": I always thought that lasers were only colored rays of visible light, so in my mind infrared and laser were incompatible technologies. In reality I was wrong, and I talked about my misconception in the article to help other people who had learnt it wrong like me.
Wow thanks a lot for this article!
Next to the speculation about the Vive 2, I'm wondering what LG will do, since they also showed their SteamVR headset at several events at the beginning of 2017.
I'm pretty sure they will not release it on the old, incompatible SteamVR 1.x tracking.
Probably LG is working on this, since I couldn't get any updates about the LG SteamVR headset in the last months.
Anyone have any news/rumors about LG?
You’re welcome! 🙂
Unluckily, I have no more info about LG than what you already have. The only thing I remember is that it should come out at the end of this year. Coming out with a v1.0 headset would be a stupid idea, since it wouldn't work with the new stations, so it would be born already old. IMHO it will be a 2.0 headset, maybe even the first one, since Valve is interested in starting an ecosystem, not in manufacturing hardware.
Thanks for the informative article. Hopefully we can collaborate and speculate about the type of functions the FPGA performs!
Yeah, it would be great!
I can help speculate on the magic after Watchman v2 is publicly available.
Sorry if I missed this in the article, but do we know if the current-gen Trackers that have been available to developers will be compatible with the next-gen Lighthouses? We've ordered 6 of them and I would be very sad if I couldn't use them with the new Lighthouses when they become available.
I can't guarantee it 100%, but as far as I know, they're SteamVR Tracking v1.0. In fact UploadVR's Ian Hamilton, in one of his editorials, said that Trackers will be an issue when v2.0 is released. Somewhere I read that HTC declared it is thinking about a solution for Trackers for when the new technology comes out (maybe a cheap replacement service? I don't know). So I guess you won't be able to use them 🙁
So when exactly are the new lighthouses coming?
They’re already coming! The new Pimax 8K headset uses SteamVR tracking and will start shipping in December…
How many headsets do you think it will be able to track inside those 100 m²?
Honestly, I don't know. Currently I've seen experiments with 2 Vives using the same Lighthouse stations on 2 different PCs to perform the tracking. With 3 there are currently complications. Theoretically, since the Lighthouses are only light emitters, you should be able to track any number of headsets you wish. Practically, there are occlusion and interference problems, plus the players hitting one another 🙂
There is no limit on the number of objects that can be tracked, because the sensors are in the headsets and Trackers rather than in the base stations.
If it's crowded though, people might block each other.
I have two of the new Vives and two sets of Base Station 2.0 (4 base stations). It doesn't work in SteamVR. The controllers can see all 4 base stations, but the headset can only see 2 at a time, and it screws the tracking up when it switches base stations. Maybe I am missing something, but this currently doesn't work in my testing.
Oh, thanks for your feedback. I don't have 4 base stations, so I don't know… 🙁
Just to be sure… have you done all the setup correctly?
Hey Tony, I actually got it working! The station doesn't have a visible channel button on the back, but there is a small paper-clip-sized hole that allows you to change the base station's channel. They don't have a digital readout anymore, so you just have to hover over them in your SteamVR GUI. They actually have 16 channels and take about 5-10 seconds to register their channel. For 4 stations you currently want S0, S1, S2, S3 (S4-S15 are ghost channels for the firmware expansion to 16 base stations). But it does work! It's just not documented very well currently. Thanks for the reply!
You also need the new version of SteamVR for the Vive Pro!
Wow, I'm very happy for you! Fantastic job! And thanks for writing the solution here… maybe it can help other people 🙂
If you want to write a tutorial, I can publish it on my blog.