Intel AR Glasses Follow Up

Introduction

The day after I posted an article about news of Intel trying to sell their AR Glasses Unit, The Verge came out with an article and video about Intel’s AR glasses. The AR glasses are called Vaunt, and they come from Intel’s New Device Group (NDG). The Verge article fills in some gaps, namely what Intel has done with the concept after acquiring the Swiss startups Lemoptix and Composyt (both spun out of EPFL).

The Verge article pretty much presents, without judgment, Intel’s use-case rationale for the glasses. On the surface, the progress is not that far from where EPFL was back in 2012, other than packaging it up into a smaller form factor. I’m assuming the MEMS scanner, which back then was a cell-phone-sized Microvision ShowWx (see the Appendix at the end), has been replaced with the Lemoptix beam scanner that Intel acquired.

BTW, I’m not trying to pick on Intel or The Verge here. Intel is doing what companies do all the time, presenting their product in the most favorable light to reporters that don’t understand the underlying technology. So it is very easy for the reporter to accept and go along with the company story. It is also self-fulfilling as companies will place exclusives with media sources that tend to report technology favorably. In the extreme, cough . . . cough, a company like Magic Leap will go to Rolling Stone.

Hard Problems Unsolved – Limitations Become “Features”

Key issues relative to Vaunt discussed in my last article included the eye-box/pupil size, the low resolution of laser beam scanning (LBS) displays, and how the holographic mirror/lens would work with color images. I was curious to see how Vaunt would address these limitations, and the simple answer is they didn’t. Vaunt has a very small eye box, very low resolution (400 by 150 pixels according to The Verge), and still only the single color of red. All the hard problems are yet to be addressed. What you are left with is a very expensive toy. What’s more, I don’t think that, given 10X more time and money, they will solve the key problems while still giving a good view of the real world.

The largest part of an LBS projector is the optics that combine the red, green, and blue lasers into a single beam, so going red-only has a major impact on size and cost. Using a single color also means that the holographic film that bends the light toward the eye can be tuned to the one wavelength and does not have to deal with the optical issues that arise when combining three different wavelengths. A single color also reduces the problems the holographic film causes by diffracting real-world light, since it only has to deal with the one wavelength. If anyone else thought they could get away with a single color, their headsets would be a lot smaller too.
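To make the wavelength issue concrete, here is a minimal sketch using the standard grating equation (d·sin θ = m·λ) with an assumed 1 µm grating pitch; Intel has not published the actual pitch of the holographic film, so the numbers are purely illustrative of how far apart a grating sends red, green, and blue:

```python
import math

# Illustrative only: first-order diffraction angles from the grating equation
# d * sin(theta) = m * lambda, using a hypothetical 1.0 um grating pitch
# (the actual pitch of Vaunt's holographic film is not public).
GRATING_PITCH_NM = 1000.0   # assumed pitch d, in nanometers
ORDER = 1                   # first diffraction order

for name, wavelength_nm in [("red", 640.0), ("green", 520.0), ("blue", 450.0)]:
    sin_theta = ORDER * wavelength_nm / GRATING_PITCH_NM
    theta_deg = math.degrees(math.asin(sin_theta))
    print(f"{name:5s} {wavelength_nm:4.0f} nm -> {theta_deg:4.1f} degrees")

# The roughly 13 degree spread between blue and red shows why a film tuned
# for one wavelength sends the other two to the wrong angles; a red-only
# design sidesteps that problem rather than solving it.
```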

The application and patent I discussed last time were both attempts to address the inherently small eye box of near-eye laser scanning optics. Application 2015/0362734 (below left) proposed diffusing the image at the holographic film and then using a contact lens (developed by Innovega) to focus it. This obviously is not user-friendly (you have to be fitted with contacts just to try it). Alternatively, Patent 9,846,307 (‘307, below right) shows using multiple projectors, but this quickly becomes expensive and extremely difficult to implement. The ‘307 patent also shows how, with a single projector, the light rays miss the eye when the eye moves.

Neither of these approaches to increasing the eye box is even remotely practical. I was particularly curious how Intel would address the eye-box issue and, “surprise,” they didn’t. Intel has instead turned the small eye box into a “feature.” The Verge quotes Mark Eastwood, the industrial design director of Intel’s NDG:

”We didn’t want the notification to appear directly in your line of sight,” says Eastwood. “We have it about 15 degrees below your relaxed line of sight. … An LED display that’s always in your peripheral vision is too invasive. … this little flickering light. The beauty of this system is that if you choose not to look at it, it disappears. It is truly gone.”


You have to admire some good spin. The reason the image disappears is that the eye box is so small that the image can only be seen when looking directly at it from a precise location (per the diagram in ‘307 above). Quoting the author in The Verge, “you’ll need to have Vaunt glasses adjusted to your pupillary distance,” and this process is shown in the video (left) as part of getting the prescription glasses.
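To put rough numbers on why the image vanishes the moment you glance away, here is a back-of-envelope sketch. The eye-box width and eye-rotation radius below are my assumptions (an unexpanded laser scanner typically has an exit pupil on the order of a millimeter or two, and the eye’s center of rotation sits roughly 13 mm behind the pupil), not figures from Intel or The Verge:

```python
import math

# Rough geometry sketch (assumed numbers, not Intel's): how far the eye's
# pupil translates when you glance away, versus a small unexpanded-LBS eye box.
EYE_ROTATION_RADIUS_MM = 13.5   # approx. distance from eye rotation center to pupil
ASSUMED_EYE_BOX_MM = 1.5        # assumed eye-box width for an unexpanded scanner

def pupil_shift_mm(gaze_angle_deg: float) -> float:
    """Lateral pupil translation produced by rotating the eye by gaze_angle_deg."""
    return EYE_ROTATION_RADIUS_MM * math.sin(math.radians(gaze_angle_deg))

for angle_deg in (5, 10, 15):
    shift = pupil_shift_mm(angle_deg)
    status = "inside" if shift <= ASSUMED_EYE_BOX_MM / 2 else "outside"
    print(f"{angle_deg:2d} deg glance -> pupil moves {shift:4.1f} mm ({status} the eye box)")

# Even a 5 degree change of gaze moves the pupil farther than the half-width
# of a ~1.5 mm eye box, which is why the image is only visible when you look
# straight at it from the fitted position.
```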

In The Verge video, the author says that it would be so natural to glance over and down to see a message (still from the video shown on the right). But first, the user would have to be alerted by some other means because the image is invisible unless you are looking precisely at it. Also, people are instinctively extremely sensitive to where other people’s eyes are looking, so glancing to look at messages is going to make the user look “shifty-eyed” as the author did in the video (right — hopefully, he was exaggerating).

Most companies go to the effort to make a larger eye box so the image will disappear only when you want it to, not just because you have to look in a certain direction. If they wanted your glance to control the display, then they would track your eye.

The Verge makes the point that everything is small and low powered. But then, a display supporting only a single color and a low-resolution image with a small eye box would be small and low powered with just about any other display technology too.

The one good technical point is that because Vaunt uses laser light, the image is “in focus” regardless of your prescription, and this is true. It would also be true of a panel-type display (LCOS or DLP) with laser illumination. What The Verge did not say (and may not have known) is that the same laser-light physics can also be a problem for people with “floaters,” which most people develop between the ages of 40 and 65: the narrow laser beam causes floaters to cast sharp shadows on the retina (I learned this first-hand when wearing an LBS headset).
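For readers who want a feel for the “always in focus” claim, below is a minimal sketch using the standard visual-optics approximation that angular blur from defocus scales with the aperture (here, beam) diameter times the refractive error in diopters; the 0.5 mm beam width and 2-diopter error are assumed example numbers, not Vaunt specifications:

```python
import math

# Standard visual-optics approximation (not Vaunt data): angular blur from
# defocus is roughly aperture_diameter (m) * defocus (diopters), in radians.
# A narrow scanned laser beam acts like a tiny aperture, so an uncorrected
# prescription barely blurs the image, which is the "always in focus" effect.

def blur_arcmin(aperture_mm: float, defocus_diopters: float) -> float:
    blur_rad = (aperture_mm / 1000.0) * defocus_diopters
    return math.degrees(blur_rad) * 60.0

EXAMPLE_ERROR_D = 2.0   # assumed 2-diopter refractive error
print(f"3.0 mm eye pupil    : {blur_arcmin(3.0, EXAMPLE_ERROR_D):5.1f} arcmin of blur")
print(f"0.5 mm scanned beam : {blur_arcmin(0.5, EXAMPLE_ERROR_D):5.1f} arcmin of blur")

# The flip side of the same narrow, coherent beam: it cannot "fill in" around
# debris floating in the vitreous, so floaters cast unusually sharp shadows
# on the retina instead of the soft ones seen under normal illumination.
```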

So Much Was Not Shown

There are no through-the-lens videos and no idea as to the eventual price and availability, but the video and article made it sound expensive. The author was only shown short demo loops of video and apparently did not see it fully working. A big issue for me is that they did not let the author take it outside, which is a big red flag for “demoware.” I’m particularly concerned about the hologram’s diffraction grating effects on the real world when looking through the glasses; going outside would expose the glasses to very bright light coming from all sorts of angles that could reveal problems, specifically diffraction grating color separation.

Conclusion – I Just Don’t See a Big Market For Vaunt

The eye box is one of those things most people would not think to ask about. Once you understand it, it becomes a problem that may never be solved cost-effectively. Laser beam scanning is chock full of these types of problems that only become apparent when you dig down into the details. The simple drawings and presentations by LBS companies skip over these issues. There are reasons why, even though the concept has been around for over 27 years since the HIT Lab at the University of Washington proposed it, you have not seen successful single- or full-color LBS near-eye displays.

The question becomes how big a market there could be for a display with low resolution (fewer pixels than an Apple Watch), red only, a small eye box, a requirement for custom prescription glasses, and problems for users over 40 (when people typically start getting floaters in their eyes). In other words, it is a “take what the technology gives you” approach. How much could you sell them for versus how much would they cost to make?
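As a quick sanity check on the Apple Watch comparison (the watch resolution below is the commonly quoted figure for the smaller models and should be treated as approximate):

```python
# Quick sanity check on the "fewer pixels than an Apple Watch" comparison.
# The Vaunt figure is from The Verge; the watch figure is the commonly quoted
# 272 x 340 panel of the smaller Apple Watch models (treat it as approximate).
vaunt_pixels = 400 * 150
watch_pixels = 272 * 340

print(f"Vaunt       : {vaunt_pixels:7,d} pixels (red only)")
print(f"Apple Watch : {watch_pixels:7,d} pixels (full color)")
print(f"Ratio       : {vaunt_pixels / watch_pixels:.2f}x")
# About 60,000 versus 92,000 pixels: the display on the wrist has roughly 50%
# more pixels than the one proposed for your face.
```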

It is not just what the Vaunt glasses currently do; it is that after over five years of development and investment, the last two years or so with Intel’s backing, they have not been able to demonstrate solving any of the key problems with the technology. How can Intel think, at least according to Bloomberg’s article “Intel Is Said to Plan Sale of Majority Stake in AR Glasses Unit” (February 1, 2018), that it is worth $350M?

My reading of the tea leaves is that Intel has figured it has pumped enough money into it and wants someone else to take it off their hands with all the seriously hard problems yet to be solved. It looks like Intel was giving The Verge access to its developments to put “lipstick on a pig” as part of the effort to sell the group. Sorry for being so blunt, but that is honestly the way I see it.

Appendix: Some EPFL and Innovega Background to the Vaunt

After writing the first article, a user on Reddit alerted me to an old video by EPFL. The video gives a snapshot of where the technology was over five years ago and what has been accomplished since. Pretty much, the Vaunt is an integration of the EPFL technology into the form factor of the mockup. In 2012, they used a Microvision LBS projector (a product that lost tens of millions of dollars for Microvision, BTW), shown in the still captured on the left.

The video also showed a very short “through the optics” clip taken with an IR-sensitive camera, with the name “Innovega” on it. This explains patent application 2015/0362734 with the special contact lens, which I commented last time looked like what Innovega has been talking about for years. Innovega (now branding the product as Emacular) has been trying to develop these special contacts since as far back as 2008.

Also shown in the video is a red laser shooting at the lens and the diffraction pattern created (left).  What is not shown is what happens to real-world images and light sources when viewing through the glasses. One would expect to see the diffraction effects.

6 comments

    1. Hard to know the power consumption. It is only supporting low-resolution and sparse content. I would guess they only turn the display on for short time periods. There is not a lot of room for the battery.

  1. Karl, good article. You dispel lots of myths in the display industry.
    What I don’t agree with is the following statement:
    “Also shown in the video is a red laser shooting at the lens and the diffraction pattern created. What is not shown is what happens to real-world images and light sources when viewing through the glasses. One would expect to see the diffraction effects.”
    It’s actually possible to add the optical effect only to the reflective path and not the transparent path, especially when a laser is used. You have seen our NanoAR demo at CES, which strongly affects the reflective beam; the same effect doesn’t happen to the transparent light path.
    Yong-jing
