At GDC 2018, Magic Leap announced the release of the Lumin SDK, that is, the tools we should use to develop for its AR glasses. I started toying around with it, because I want to review it on this blog next week (stay tuned for updates… maybe by subscribing to my newsletter using the hello bar above!) and I discovered that the SDK contains a fancy emulator, so you can start exploring Magic Leap development even if you don't have a device.
Inside the emulator, you see your device as if it were inside a room and you can move it (rotate the glasses, move the glasses, etc…) and simulate a lot of inputs. You can then develop in Unity and connect your Unity editor to the emulator to see a live view of how the app would work on the device: the interesting thing is that you can hit Play in Unity and see the program running live in the emulator… so if you, for instance, change a color inside Unity, it changes immediately inside the emulator. I followed the tutorial to create a simple cube application and then I started moving the cube to see if this mechanism worked… and actually, it worked very well
https://gfycat.com/FearfulBossyAustraliankestrel
…even too well: as you can see, when you move and rotate the glasses inside the emulator, the fixed cube gets clipped when it goes out of the Magic Leap field of view, exactly as in real life. This is very powerful because it lets you see whether your application really fits well into the small field of view of an AR headset or whether it turns out to be unusable.
But this also triggered a big idea in me: if the device inside the emulator is as similar as possible to the real device, as it should be… THE FOV OF THE EMULATOR IS THE SAME AS THE FOV OF THE DEVICE… so we can estimate the field of view of the Magic Leap One using its emulator.
I first tried looking at the FOV of the Unity Camera: when you run the program in the emulator, it immediately becomes 80, so I thought that the FOV of Magic Leap would be 80 degrees. It seemed too much, so I looked at the docs and discovered that that value of the Unity camera actually refers to the vertical FOV… while the horizontal FOV depends on the aspect ratio of the viewport. Unity had to be discarded.
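For those curious about how the two values are related: the horizontal FOV can be derived from the vertical one through the aspect ratio of the viewport. This is just a little sketch of the math in Unity terms (the helper class is mine, it is not part of Unity's API or of the Lumin SDK):

using UnityEngine;

// Little helper (mine, purely illustrative): Unity's Camera.fieldOfView is the
// VERTICAL field of view in degrees; the horizontal one follows from the
// aspect ratio (width / height) of the viewport
public static class FovMath
{
    public static float VerticalToHorizontal(float verticalFovDeg, float aspect)
    {
        float halfVerticalRad = verticalFovDeg * 0.5f * Mathf.Deg2Rad;
        float halfHorizontalRad = Mathf.Atan(Mathf.Tan(halfVerticalRad) * aspect);
        return 2f * halfHorizontalRad * Mathf.Rad2Deg;
    }
}

Just as a hypothetical example, an 80° vertical FOV rendered into a 16:9 viewport would correspond to roughly 112° of horizontal FOV, which shows why the value read in the editor alone can't be taken as the device FOV.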
So I started looking at the emulator's "Mini Map" window, which shows the viewing frustum of the device… but unluckily it offers only a perspective top-down view and not an orthographic one, so it is not possible to measure the FOV there reliably and easily.
Then I got another idea: why couldn't I move a very small object along the X axis and see when it disappears from view? So I created a little sphere and put it 2 meters from the camera along the Z axis. Inside the emulator, I reset the device position and orientation, then I hit Play and started moving the sphere along the X axis. As soon as it disappeared, I stopped it and took the measurement of that point. I took the last position where it was still visible, took a screenshot and superimposed a protractor onto the picture to read the angle… can you see what I saw there?
Exactly, a little less than 30 degrees on one side!
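By the way, a protractor is not even strictly needed: since the sphere sits at a known distance, the half angle on that side can be computed directly from its lateral offset. Here is a minimal sketch of the math (the class name and parameters are mine, just for illustration):

using UnityEngine;

// Little helper (mine, purely illustrative): given the lateral offset at which the
// sphere disappears and its distance from the camera, it returns the half FOV on that side
public static class FovFromOffset
{
    public static float HalfFovDegrees(float lateralOffsetMeters, float distanceMeters)
    {
        // half angle = atan(offset / distance), converted to degrees
        return Mathf.Atan2(lateralOffsetMeters, distanceMeters) * Mathf.Rad2Deg;
    }
}

Purely as an example, a sphere disappearing at 1 m of lateral offset at 2 m of distance would give atan(0.5) ≈ 26.6°.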
So, I decided to make another experiment, a bit better than the previous one: I attached a script to the camera that spawns, for each angle, a text label with that angle's number at the exact corresponding position, and that stays attached to the main camera. The script was exactly this one:
using UnityEngine;

public class NumberAngle : MonoBehaviour {

    public int step = 10;

    void Start ()
    {
        // Spawn a text label every "step" degrees along a 1 m radius arc in front of the
        // camera, on the horizontal (XZ) plane, so that every visible direction shows its
        // angular distance from the view center
        for (int i = 180; i >= 0; i -= step)
        {
            // place the label on the arc, in the camera's local space
            GameObject go = new GameObject();
            go.transform.SetParent(transform, false);
            go.transform.localPosition = new Vector3(Mathf.Cos(i * Mathf.Deg2Rad), 0, Mathf.Sin(i * Mathf.Deg2Rad));
            go.transform.localScale = 0.01f * Vector3.one;

            // write the angle, remapped so that 0° is straight ahead
            TextMesh tm = go.AddComponent<TextMesh>();
            tm.text = (-(i - 90)).ToString();
            tm.alignment = TextAlignment.Center;
            tm.anchor = TextAnchor.MiddleCenter;
            tm.fontSize = 17;
            tm.color = Color.blue;
        }
    }
}
I launched it and these were the results…
As you can see, we have approximately a 57° horizontal angle interval. Strangely, moving the headset in the emulator makes these results change slightly, but they usually stay in the 57-60° range. Let's take 60° as the actual FOV because it is a round number 🙂
As a final experiment, I extended the above script to also get the vertical FOV…
using UnityEngine;

public class NumberAngle : MonoBehaviour {

    public int step = 10;

    void Start ()
    {
        // horizontal arc: a label every "step" degrees on the XZ plane, 1 m from the camera,
        // to measure the horizontal FOV
        for (int i = 180; i >= 0; i -= step)
        {
            GameObject go = new GameObject();
            go.transform.SetParent(transform, false);
            go.transform.localPosition = new Vector3(Mathf.Cos(i * Mathf.Deg2Rad), 0, Mathf.Sin(i * Mathf.Deg2Rad));
            go.transform.localScale = 0.01f * Vector3.one;

            // write the angle, remapped so that 0° is straight ahead
            TextMesh tm = go.AddComponent<TextMesh>();
            tm.text = (-(i - 90)).ToString();
            tm.alignment = TextAlignment.Center;
            tm.anchor = TextAnchor.MiddleCenter;
            tm.fontSize = 17;
            tm.color = Color.blue;
        }

        // vertical arc: the same labels, but on the YZ plane, to measure the vertical FOV
        for (int i = 180; i >= 0; i -= step)
        {
            GameObject go = new GameObject();
            go.transform.SetParent(transform, false);
            go.transform.localPosition = new Vector3(0, Mathf.Cos(i * Mathf.Deg2Rad), Mathf.Sin(i * Mathf.Deg2Rad));
            go.transform.localScale = 0.01f * Vector3.one;

            TextMesh tm = go.AddComponent<TextMesh>();
            tm.text = (-(i - 90)).ToString();
            tm.alignment = TextAlignment.Center;
            tm.anchor = TextAnchor.MiddleCenter;
            tm.fontSize = 17;
            tm.color = Color.blue;
        }
    }
}
…and from the tests, it seems that it always sits at 45°.
So the Magic Leap One may have a horizontal FOV of around 60° and a vertical FOV of around 45°.
But… there is a big problem: I noticed that the two representations of the device FOV inside the emulator (the one in the "Mini Map" and the one in the "Eye View") are actually incoherent and show two different frustums.
I decided to calculate the frustum of the emulation window, too. So I used the angle grid I had just created in Unity… and with a bit of creativity I started moving an object around the room inside the emulator to find the point where it touched the red lines of the frustum. At that point I could read the angle thanks to the blue grid. With this process I got two values different from the previous ones: roughly a 45° horizontal FOV and a 34° vertical FOV. The only thing that the pairs 45-60 and 34-45 have in common is a roughly 3:4 aspect ratio… for the rest, they represent a huge difference.
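(Just to show where that aspect-ratio remark comes from, with simple arithmetic and no new measurements: 34/45 ≈ 0.76 and 45/60 = 0.75, both very close to 3:4 = 0.75.)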
Of course, this is all speculation: if the emulator's FOV is actually different from the real one, these calculations are completely useless. But I really hope that a company doesn't ship an emulator with wrong rendering features, so I trust the 60° option more. In my opinion, 60° is a value that makes sense: it is consistent with the reviews saying that the FOV is noticeably bigger than that of HoloLens (which is around 30-35°) and with the next evolution of HoloLens, which should be around 70°. An AR device released this year may absolutely have a 60° FOV. So this speculation, in my opinion, shouldn't be that far from reality. But, at the same time, I have to say that using the info from the Rolling Stone article, Oliver Kreylos estimated a FOV of around 40°, so 45° could also be a good guess. As always with Magic Leap, there is a lot of confusion.
What do you think about my assumptions? Do you think that the FOV can actually be 60°? Or 45°? Let me know in the comments!
UPDATE: I asked the awesome Oliver Kreylos for an opinion about this article on Reddit, and he answered that what I did was good and that the fact that I obtained an asymmetric FOV with the first method is actually good news supporting the 60° result. Here is his exact quote:
In general, your approach is a good one in lieu of having actual hardware in hand. The fact that you got a skewed field of view is a positive indicator, in that sense. If the emulator is based on measured FoV, then it’s likely either the left- or right-eye FoV, and those are expected to be skewed. If the FoV values in the emulator were made up, they’d probably have made them symmetric.
UPDATE 2: Oliver Kreylos later published this other comment:
I’m not an expert on waveguides; the one thing I know is that there is a specific type of waveguide that has a hard ~40° limit, that from the Microsoft patent prior to HoloLens. I don’t actually know if that limit applies to all possible types of waveguide.
I also don’t know what Magic Leap are doing. Maybe they have tiled displays with multiple waveguides at angles to each other (imagine like a shallow pyramid), or picked up some other way of projecting the image.
In general, I’m thinking of /u/SkarredGhost‘s estimate as an upper limit, and expect the actual (horizontal) FoV to be somewhere between 40° and 60°.
(Header image by Magic Leap)