Magic Leap Livestream 02 Gives Up Some Clues

Introduction

In a live video event billed as “Meet Magic Leap One” (MLO), Magic Leap promised, “Tune in to Twitch at 2 pm ET today as we walk you through the Magic Leap One hardware and platform.” Those who tuned in, either live or to the recorded program (LINK), mostly got 50 minutes of the hosts saying how wonderful the MLO would be, a “demonstration” of how to wear the Magic Leap One, followed by non-answers and deflections of questions from the chat stream. There was maybe 5 minutes of real content padded out with 45 minutes of filler.

When the chat stream asked them to turn on the Magic Leap One, they powered up the unit and only showed that the LED indicator on the front of the headset would first turn blue and then green. This seemingly cynical action made the chat stream rather angry. The host, Alan, responded to repeated questions for “specs” such as field of view (FOV), price, and availability with “You’re not going to get me with those trick questions,” which further riled the already hostile chat comments. For more on the reaction in the chat stream, I would suggest reading The Register article.

While most of the viewers were getting frustrated by the “tap dancing” hosts, I was looking for what the video might reveal about the optics. The video had the best set of views yet of the MLO and showed what I think is a significant new detail in the optics (more on this later).

Some Mechanical Details Confirmed

The Livestream is the first time that I am aware of that Magic Leap has shown the two cords that connect the Lightpack™ (computer and battery) to the Lightwear™ (headset); they were hidden in all their earlier pictures. As can be seen in the picture on the left, the cords are substantial and have strain reliefs built in at each end.

These cables are not like the thin set of wires from a headset jack on a phone that will pull out or, at worst, break the cord if you snag it. These cords are likely to yank the Lightpack from your pocket and/or rip the headset off your head and send it crashing to the ground; that is, if you are lucky and it does not injure your head in the process. The design looks like both a safety and a reliability concern. I think a good question for Magic Leap is, “What happens if you snag the cord when you are moving about?”

Another thing the videos confirmed is the downward-looking bias built into the headset optics (see above right). You can see when it is worn that the headset aims downward, and if anything, the way the unit is meant to be worn on the head increases this angle.

I also redid my earlier calculation from the Shaq video and found the light transmission of the real world to still be in the 15% range. So nothing new here: the lenses block a lot of light, roughly equivalent to very dark sunglasses, even though this unit is meant for indoor use only according to the presenters.

For the first time, Magic Leap provided a view inside the headset, which gave an idea of how much padding there is between the person’s forehead and the headset. I roughly estimate about 15 to 20mm (see picture on the right). The pad is thick compared to other headsets, which means the eyes sit a bit further back, and thus the view out through the goggles is a bit more limited than I had earlier estimated.

On the video, they said that they were planning on prescription lens inserts. The thickness of the forehead padding suggests that there will be enough distance between the eyes and the lenses to allow for inserts. Typically, prescription lenses sit about 13mm from the eyes.

Clues to Optics Design; I Think Used for Vergence Accommodation Conflict (VAC)

I have looked through around a hundred Magic Leap patents over the last 1.5 years, and when going through the still frames of the video, I saw something I had not noticed before in any of the patents. At first, it had the “glint” reflection of the waveguide exit grating, but on closer inspection, it revealed itself to be something different. The first thing to notice in the top left picture is a black rectangular outline, which you normally would not see with a diffraction grating embedded in glass. But more telling are the connection tabs with what look like printed circuit wires on a transparent flex cable (see arrows on the top image and ovals on the second image on the left).

I looked at the various screen captures from the front and the back of the headset and determined that the connections and the rectangular structure are on the “eye side” of the waveguide. This I think is key to understanding its function. If it were on the “real world side” of the waveguide, then it could only affect real-world light and might have been, for example, a way to darken just the real world. But being on the “eye side” means that it could affect both the display and the real world since light from both would pass through this device.

I went back to the Magic Leap patent applications to see if this device was described and did not find it explained, but I did find the flex lines illustrated in Fig. 110C of application 2018/0052277 (‘277) (see lower left). The figure also confirms, based on the way it is drawn, that the flex is on the eye side of the waveguide (see in particular the flex in the lower right red oval).

Unfortunately, while the flex connections are shown in Figure 110C, they are not numbered and I could find no description of their function. So, I am left to speculate as to their function.

Undescribed Device’s Function I Think Is a Polarization Dependent Variable Focus Element (VFE)

Having a device on the eye side would typically affect both the displayed image and the real world, but this would not make much sense: you would not want to darken both the real world and the display’s image; you would generally only want to darken the real world. Likewise, if the device were changing the focus, you might want to change the focus of the virtual image without changing the focus of the real world. But then I remembered my interview with DeepOptics at Lumus’s booth at CES 2018.

The DeepOptics device uses a phase-modulated liquid crystal that only acts on one polarization of light. The Lumus optical engine uses an LCOS microdisplay that outputs polarized light, and DeepOptics added a polarizer on the real-world side of the Lumus waveguide to polarize the real-world light in the direction opposite that of the LCOS display’s output. Thus, the DeepOptics device changes the focus only for the display’s polarization of light and not for the real world. The DeepOptics device is a “polarization-dependent, variable focus element (VFE).”
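The polarization-dependent behavior described above can be sketched with Jones calculus. This is a minimal illustrative model, not Magic Leap’s or DeepOptics’ actual implementation: it assumes display light is x-polarized and real-world light (after a front polarizer) is y-polarized, and models the element as a phase applied to the x component only.

```python
import cmath

def pol_dependent_element(jones_vec, phase_x):
    """Apply a phase to the x component only, i.e. the Jones matrix
    diag(e^{i*phase_x}, 1). The phase stands in for the element's
    spatially varying focusing phase on one polarization."""
    ex, ey = jones_vec
    return (ex * cmath.exp(1j * phase_x), ey)

display_light = (1 + 0j, 0 + 0j)  # x-polarized (e.g., LCOS output)
world_light   = (0 + 0j, 1 + 0j)  # y-polarized (after the front polarizer)

focus_phase = 0.8  # radians; illustrative value only

out_display = pol_dependent_element(display_light, focus_phase)
out_world   = pol_dependent_element(world_light, focus_phase)

# The display light picks up the focusing phase; the real-world light
# passes through unchanged.
print(cmath.phase(out_display[0]))  # the applied phase
print(out_world == world_light)
```

The point of the sketch is simply that a single eye-side element can act on the virtual image while leaving the (oppositely polarized) real world untouched.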

The DeepOptics VFE requires many horizontal and vertical control lines to modulate a 2D array of electrodes, giving fine control of the device. The Magic Leap device appears to have only a few control lines, but they could be using a simpler drive method or a different electrode patterning.

Magic Leap, based on the patent applications that closely match the Magic Leap One (MLO), also uses LCOS microdisplays. I suspect that the dark front lens of the MLO polarizes the real-world light opposite to the LCOS display’s output. The ‘277 patent application states that “In some embodiments, cover lens 1809 may include light modifiers, such as polarized lens to reflect or absorb certain light.” This line out of the patent is not a clear indication that the front lenses are polarized, but it does indicate that they were thinking about it.

From all the available evidence, the MLO supports only two “focus planes” with two sets of three (RGB) waveguides, as I discussed back in January 2018 (also see this article for a discussion of Vergence Accommodation Conflict, VAC, if you are not familiar with the concept). It’s not clear that even the six focus planes Magic Leap originally hoped for would address VAC, and just two focus planes would result in large discontinuities, with objects appearing to “hop” between being “near” and “far.”

What I think Magic Leap may be doing (note: this is an educated guess) is combining the polarization-dependent variable focus element with their two planes to try to reduce VAC. Based on eye tracking to figure out where the eyes “verge,” they would not only select one of the two planes but also drive the “variable focus element” accordingly.
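To make the educated guess above concrete, here is a sketch of how eye tracking might drive plane selection plus a VFE correction. Everything numeric here is an illustrative assumption (the interpupillary distance, the two plane distances, and the selection rule), not a published Magic Leap value.

```python
import math

IPD_M = 0.063  # assumed typical interpupillary distance, meters
PLANES_DIOPTERS = {"near": 3.0, "far": 0.5}  # assumed two focus planes

def vergence_distance(convergence_angle_rad):
    """Distance at which the two gaze rays cross, from the full
    convergence angle between them."""
    return (IPD_M / 2) / math.tan(convergence_angle_rad / 2)

def select_plane_and_vfe(convergence_angle_rad):
    """Pick the closer of the two planes in diopter space, then ask the
    VFE to make up the residual focus difference."""
    target_d = 1.0 / vergence_distance(convergence_angle_rad)  # diopters
    plane = min(PLANES_DIOPTERS,
                key=lambda p: abs(PLANES_DIOPTERS[p] - target_d))
    vfe_correction = target_d - PLANES_DIOPTERS[plane]
    return plane, vfe_correction

# Eyes converged on a point 0.5 m away (2 diopters)
angle = 2 * math.atan((IPD_M / 2) / 0.5)
plane, vfe = select_plane_and_vfe(angle)
print(plane, round(vfe, 2))  # near plane selected, VFE supplies -1 diopter
```

The key idea is that the discrete plane gets the focus roughly right and the VFE fills in the gap, which is what would be needed to smooth out the “hop” between only two planes.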

Even if this is a variable focus element, I doubt it will solve the VAC issues. But without something like this, I don’t see how it would work for VAC at all.

A related issue is that the eye-tracking cameras are at the bottom. This is another confirmation of the down-looking bias of this headset. With the cameras in this location, they could not track eye movement well except when the user is looking down.

Polarizing Front Lenses?

The first question in confirming my theory is whether the MLO has polarizing front lenses, which I can’t tell from the video. It would be easy to determine in person: if you happen to be wearing polarized sunglasses when looking at the unit and you tilt your head, the darkening of the lenses would change (if anyone is so inclined, please report back).

Having polarizing front lenses would also cause problems when viewing the many LCD devices in the real world, such as most cell phone screens, computer monitors, and televisions. Even many OLED displays have a polarizer added to reduce reflections. This is on top of the polarizer blocking more than 50% of the real-world light.

A Lot to Look Through

Regardless of the function of this newly identified device, there are a lot of layers to look through in the MLO. From the eye out, you have this new structure, followed by six layers of waveguides (two focus planes of red, green, and blue each) encased in glass, and then the outer lens (perhaps with polarization).
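A quick back-of-the-envelope calculation shows how a stack like this could end up near 15% transmission. The layer count comes from the article; the per-layer transmittance is solved for, not measured, and the simple multiplicative loss model is my assumption.

```python
# If the losses simply multiply, an ideal polarizer passes at most 50% of
# unpolarized real-world light, and the remaining layers must account for
# the rest of the measured ~15% overall transmission.
POLARIZER_T = 0.5   # ideal polarizer on unpolarized light
TOTAL_T = 0.15      # overall real-world transmission estimated in the article
N_LAYERS = 7        # six waveguide layers plus the new eye-side element

# Solve POLARIZER_T * t**N_LAYERS == TOTAL_T for the per-layer transmittance t.
t_per_layer = (TOTAL_T / POLARIZER_T) ** (1 / N_LAYERS)
print(round(t_per_layer, 3))  # ≈ 0.842, i.e., about 16% loss per layer
```

In other words, even fairly modest per-layer losses compound quickly through a seven-element stack, which is consistent with the very dark view through the goggles.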

Then there are the issues of how much the goggles block a person’s peripheral vision, how securely they will stay in place, how comfortable they will be with prolonged use, how hot they make one’s head, and so forth.

Next Time AWE 2018 (I Hope)

I was traveling overseas most of May for both personal and business reasons, and then when I got back, I went to AWE, which is why there have not been any posts on this blog for a while.

I took a lot of photos of various headsets at AWE last week. I was in the middle of organizing them when Magic Leap did their Livestream on Twitch, and I wanted to get my observations out. So next time, if there are no other breaking events, I hope to post my pictures from AWE.

8 comments

  1. I think you are overly critical of the cord from the headset to the computer. If there is anything to snag onto, there is also a risk of tripping over that same object.

    You need a fairly large clear area regardless. Any glitch that freezes the video could send someone tumbling. I would even think that a well-padded carpet is advisable, and no tables to fall onto. I know there are ways to provide virtual fences to keep the person in their walking area. But that free area needs a no-man’s-land around the walking zone that is ideally the width of a person’s height.

    So, there should be nothing to snag onto.

    I guess that once a company has started down the road of offering hyped technology, they tend to attract a frenzy of criticism. But, any unwarranted criticism tends to reduce the value of the valid criticism.

    1. The main common snag hazards are door and cabinet knobs/handles and the arms of chairs (particularly the modern oval-shaped office chair arms). I have a model that includes the Lightpack with a cord that I have worn around, and it snags all the time. The way the cord dangles around the back and into one’s pocket perfectly sets it up to snag on chair arms and door handles.

      If the Magic Leap is going to have to be used in specially prepared, pristine rooms with “handlers” to keep cords from getting snagged, it will at best have an extremely limited potential market.

      As I have pointed out, the whole optical design is heavily biased toward having you look down onto something like a tabletop. So removing tables is not an option. Frankly, I don’t get the use case at all for the Magic Leap beyond being “demoware.”

  2. Karl
    What is your take on Vuzix partnering with QCOM’s XR1 chip and also the Plessey MicroLED displays for their NextGen AR smartglass to come in 2019-2020?

    1. I don’t know about the XR1 chip, but I found the Vuzix Plessey Micro-LED announcement interesting. The Vuzix use of the Plessey Micro-LEDs is what I call a “hybrid” approach, where the micro-LEDs are much bigger than the pixels and act as illumination for the primary pixel modulator, which in the case of Vuzix I suspect will be a DLP. This will hopefully give Plessey a “bridge” to their “Ray” technology with pixel-sized Micro-LEDs. There is more on the Plessey website at http://www.plesseysemiconductors.com/products/microleds/

  3. Hi Karl,

    Thanks for your insightful article, as always! I just have a couple of questions regarding the printed circuits around the output grating and would like to hear your opinion:

    – Would it be possible that they are the IR light (VCSEL) sources for eye tracking? They have only mentioned the IR camera at the bottom but haven’t disclosed where they place the light source. If not, where do you think they will place the IR light source?

    – FOR DeepOptics:
    I tried the DeepOptics demo during SID 2018, and it didn’t work very well from my perspective. In addition, if I’m understanding its working principle correctly, it moves a small LC lens around the whole aperture for different field angles, so I’m not sure it can handle the MLO’s large output-grating dimensions and relatively large eyebox area at once. If not, then the LC lens needs to move very fast to cover the whole grating area without being noticed by the user’s eye; otherwise, only part of the eyebox will have the variable focus effect, which could be really weird.

    However, these are only my personal thoughts based on my limited knowledge, and I would really appreciate if you can share your ideas on these.

    JL

    1. I doubt the circuit is used for eye tracking. I think the eye tracking is the square below the optics aimed at the eye.

      My understanding is that the DeepOptics device drives the LCD to change the index of refraction (more or less the path length) for one polarization of light. It is not moving the lens around physically or changing its physical width.

  4. Sad.

    It reminds me of a professional wrestling match where the wrestlers were definitely pulling their punches; it was an athletic cooperative gymnastics performance but nothing near a fight and it was clear the video was edited to hide the punch-pulling.

    What did it to me was realizing that the audience didn’t care, nobody was cheering or booing, instead people were looking at their phones, walking up the stairs or arguing with their kids.

    I look at what people do and say in these videos as being as revealing as the technical details that Karl gleans from them. For instance, Shaq is interviewed while wearing a headset that is projecting some kind of light in front of his eye. Instead of talking to us while he watches basketball on virtual screens, he tells us about the time he watched virtual screens. If they can’t manage a filmed demo of that, what is the light patch that the goggles display, and what is Shaq seeing during the demo? Did they just make the goggles light up so it looked like they worked?

    If Magic Leap was anywhere near where they say it is, they could dispel this kind of argument by just showing somebody use the thing on video. They won’t.

    This lady says she has been in charge of doing demos of this device for a long time, but she and that guy seemed pretty tentative as to how to turn the device on and how exactly it was supposed to light up when it did.

    Supposedly they have show business people involved with this; that’s a good thing, because whether or not this or some similar device succeeds comes down to connecting with an audience. The demo that they need to show to prove critics wrong is somebody using the device and interacting with something that isn’t there. They don’t need to hit particular metrics in terms of resolution, field of view, transparency, etc. This is on video, so they can mess with the lighting and camera angles, do a few takes to avoid goofs, etc. If you know a little about acting, directing, and that sort of stuff, you ought to be able to take a half-working device and make it look like the bee’s knees.

    The fact that they can’t do it just spreads the suspicion that Magic Leap is just “Theranos with Alligators”.
