Archive for Pico Projection

Speaking at The Display Summit Oct 4-5th

I just wanted to let my readers know I am going to be speaking at The Display Summit on Oct. 5th in Sterling, Virginia (near Washington Dulles Airport), giving a 20-minute presentation titled “AR and VR Display Technologies for Wide FOV and High Angular Resolution.” Later on Oct. 5th I will be participating in a panel discussion on AR/VR.

Info on the overall conference is available here.

The agenda is given here.

I hope to see some of you there,


Near Eye Displays (NEDs): Gaps In Pixel Sizes

I get a lot of questions to the effect of “what is the best technology for a near eye display (NED)?” There really is no “best,” as every technology has its strengths and weaknesses. I plan to write a few articles on this subject as it is way too big for a single article.

Update 2017-06-09: I added the Sony Z5 Premium 4K cell-phone-size LCD to the table. Its “pixel” is about 71% the linear dimension of the Samsung S8’s, or about half the area, but still much larger than any of the microdisplay pixels. One thing I should add is that most cell phone makers are “cheating” on what they call a pixel. The Sony Z5 Premium’s “pixel” really has only 2/3rds of an R, G, and B per pixel it counts. It also arranges them in a strange 4-pixel zigzag that causes beat-frequency artifacts when displaying full-resolution 4K content (GSMARENA’s close-up pictures of the Z5 Premium fail to show the full resolution in both directions). Note that Samsung similarly goes with RGBG-type patterns that have only 2/3rds the full pixels in the way they count resolution. These “tricks” in counting are OK when viewed with the naked eye at beyond 300 “pixels” per inch, but become more problematical/dubious when used with optics to support VR.

Today I want to start with the issue of pixel size as shown in the table at the top (you may want to pop the table out into a separate window as you follow this article). To give some context, I have also included a few major direct view categories of displays. I have grouped the technologies into the colored bands in the table. I have given the pixel pitch (distance between pixel centers) as well as the pixel area (the square of the pixel pitch, assuming square pixels). To give some context for comparison, I have compared the pitch and area relative to a 4.27-micron (µm) pixel pitch, which is about the smallest being made in large volume. There are also columns showing how big the pixel would be in arcminutes when viewed from 25cm (250mm = ~9.84 inches), the commonly accepted near focus point. Finally, there is a column showing how much the pixel would have to be magnified to equal 1 arcminute at 25cm, which gives some idea of the optics required.
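For anyone who wants to check the table’s math, the arcminute and magnification columns follow from simple geometry. Here is a quick Python sketch; the 4.27µm baseline pitch is from the table, and the rest is standard trigonometry:

```python
import math

NEAR_POINT_MM = 250.0  # commonly accepted near focus distance (25cm)

def arcmin_at_near_point(pitch_um):
    """Angle subtended by one pixel viewed from 25cm, in arcminutes."""
    pitch_mm = pitch_um / 1000.0
    return math.degrees(math.atan(pitch_mm / NEAR_POINT_MM)) * 60.0

def magnification_to_1_arcmin(pitch_um):
    """How much the pixel must be magnified to appear as 1 arcminute at 25cm."""
    return 1.0 / arcmin_at_near_point(pitch_um)

# The 4.27µm LCOS baseline subtends only ~0.059 arcminutes at 25cm,
# so it needs roughly 17X magnification to reach 1 arcminute.
print(round(arcmin_at_near_point(4.27), 3))
print(round(magnification_to_1_arcmin(4.27), 1))
```

This is why the optics matter so much for microdisplays: the raw pixel is far below what the eye can resolve and everything depends on the magnification.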

In the table, I tried to use the smallest available pixel in a given technology that was being produced, with the exception of “micro-iLED,” for which I could not get solid information (thus the “?”). In the case of LCOS, the smallest field sequential color (FSC) pixel I know of is the 4.27µm one by my old company Syndiant, used in their new 1080p device. For the OLED, I used eMagin’s 9.3µm pixel, and for the DLP, their 5.4-micron pico pixel. I used the LCOS/smallest pixel as the baseline to give some relative comparisons.

One thing that jumps out in the table is the fairly large gap in pixel sizes between the microdisplays and the other technologies. For example, you can fit over 100 4.27µm LCOS pixels in the area of a single Samsung S8 OLED pixel, or 170 LCOS pixels in the area of the pixel used in the Oculus CV1. Or, to be more extreme, you can fit over 5,500 LCOS pixels in one pixel of a 55-inch TV.
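The “how many pixels fit” comparisons are just the square of the pitch ratio. A quick sketch (the ~44.6µm, ~55.7µm, and ~317µm pitches are my approximations back-computed from the table’s ratios, not manufacturer figures):

```python
def pixels_that_fit(big_pitch_um, small_pitch_um):
    """Number of small pixels that fit in the area of one big pixel."""
    return (big_pitch_um / small_pitch_um) ** 2

LCOS = 4.27  # smallest volume-production LCOS pitch in µm

print(round(pixels_that_fit(44.6, LCOS)))   # Samsung S8-class pixel: ~109
print(round(pixels_that_fit(55.7, LCOS)))   # Oculus CV1-class pixel: ~170
print(round(pixels_that_fit(317.0, LCOS)))  # 55-inch 4K TV pixel: ~5,500
```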

Big Gap In Near Eye Displays (NEDs)

The main comparison for today is between the microdisplay pixels, which range from about 4.27µm to about 9.6µm in pitch, and the direct view OLED and LCD displays in the 40µm to 60µm range that have been adapted with optics to be used in VR headsets (NEDs). Roughly we are looking at one order of magnitude in pixel pitch and two orders of magnitude in area. Perhaps the most direct comparison is the microdisplay OLED pixel at 9.3 microns versus the Samsung S8’s: a 4.8X linear and a 23X area difference.

So why is there this huge gap? It comes down to making the active matrix array circuitry to drive the technology. Microdisplays are made on semiconductor integrated circuits, while direct view displays are made on glass and plastic substrates using comparatively huge and not very good transistors. The table below is based on one in a 2006 article by Mingxia Gu, then at Kent State University (it is a little out of date, but it lists the various transistors used in display devices).

The difference in transistors largely explains the gap: microdisplays use transistors made in I.C. fabs, whereas direct view displays fabricate their larger and less conductive transistors on top of glass or plastic substrates at much lower temperatures.


Within the world of I.C.’s, microdisplays use very old/large transistors, often made on nearly obsolete semiconductor processes. This is both an effort to keep the cost down and a consequence of the fact that most display technologies need higher voltages than smaller transistor sizes would support.

There are both display physics and optical diffraction reasons which limit making microdisplay pixels much smaller than 4µm. Additionally, as the pixel size gets below about 6 microns, the optical cost of enlarging the pixel to be seen by the human eye starts to escalate, so headset optics makers want 6+ micron pixels, which are much more expensive to make. To a first order, microdisplay costs in volume are a function of the area of the display, so smaller pixels mean less expensive devices for the same resolution.

The problem for microdisplays is that even using old I.C. fabs, the cost per square millimeter is extremely high compared to TFT on glass/plastic, and yields drop as the size of the device grows, so doubling the pixel pitch could result in an 8X or more increase in cost. While it sounds good to be using old/depreciated I.C. fabs, it may also mean they don’t have the best/newest/highest yielding equipment, or worse yet, the facilities get closed down as obsolete.

The net result is that microdisplays are nowhere near cost competitive with “re-purposed” cell phone technology for VR if you don’t care about size and weight. But they are the only way to make a small, lightweight headset, and really the only way to do AR/see-through displays (save the huge Meta 2 bug-eye bubble).

I hope to pick up this subject in future articles (each display type could be a long article in and of itself). But for now, I want to move on to the VR systems with larger flat panels.

Direct View Displays Adapted for VR

Direct view VR headsets (ex. Oculus, HTC Vive, and Google Cardboard) have leveraged direct view display technologies developed for cell phones. They then put simple optics in front of the display so that people can focus the image when the display is put so near the eye.

The accepted standard for human “near vision” is 25cm/250mm/9.84 inches. This is about as close as a person can focus and is used for comparing effective magnification. With simple (single/few lens) optics you are not so much making the image bigger per se, but rather moving the display closer to the eye and then using the optics to enable the eye to focus. A typical headset uses a roughly 40mm focal length lens and then puts the display at the focal length or less (e.g. 40mm or less) from the lens. Putting the display at the focal length of the lens makes the image focus at infinity/far away.
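The focus behavior falls out of the thin lens equation, 1/f = 1/do + 1/di. A small sketch, assuming the roughly 40mm focal length mentioned above:

```python
def virtual_image_mm(f_mm, display_mm):
    """Thin-lens image distance; returns infinity when the display
    sits exactly at the focal length."""
    if display_mm == f_mm:
        return float('inf')
    # 1/di = 1/f - 1/do; with the display inside the focal length,
    # di comes out negative, i.e. a virtual image on the display side
    # that the eye can comfortably focus on.
    return 1.0 / (1.0 / f_mm - 1.0 / display_mm)

print(virtual_image_mm(40.0, 40.0))  # display at focal length: image at infinity
print(virtual_image_mm(40.0, 38.0))  # display 2mm closer: virtual image 760mm away
```

Moving the display slightly inside the focal length is how headsets place the virtual image at a comfortable finite distance rather than at infinity.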

Without getting into all the math (which can be found on the web), a 40mm focal length nets an angular magnification (relative to viewing at 25cm) of about 6X. So, for example, looking back at the table at the top, the Oculus pixel (similar in size to the HTC Vive’s), which would be about 0.77 arcminutes at 25cm, ends up appearing to cover about 4.7 arcminutes (which are VERY large/chunky pixels) and about a 95 degree FOV (it depends on how close the eye gets to the lens — for a great explanation of this subject and other optical issues with the Oculus CV1 and HTC Vive, see this article).
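The ~6X number is just the ratio of the 250mm near-vision reference distance to the lens focal length; roughly:

```python
def angular_magnification(f_mm, reference_mm=250.0):
    """Angular magnification of a simple magnifier (image at infinity)
    relative to viewing at the 25cm near point."""
    return reference_mm / f_mm

m = angular_magnification(40.0)  # 6.25X for a 40mm lens

# The CV1-class pixel: ~0.77 arcminutes at 25cm becomes roughly 4.8
# arcminutes through the lens, in the ballpark of the ~4.7 quoted
# (the exact value depends on eye relief and display placement).
print(round(0.77 * m, 1))
```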

Improving VR Resolution  – Series of Roadblocks

For reference, 1 arcminute per pixel is considered near the limit of human vision, and most “good resolution” devices try to be under 2 arcminutes per pixel, preferably under 1.5. So let’s say we want to keep the ~95 degree FOV but improve the angular resolution by 3X linearly, to about 1.5 arcminutes. We have several (bad) options:

  1. Get someone to make a pixel that is 3X smaller linearly or 9X smaller in area. But nobody makes a pixel this size that can support about 3,000 pixels on a side. A microdisplay (I.C. based) will cost a fortune (like over $10,000/eye, if it could be made at all), and nobody makes transistors that are cheap and compatible with displays that are small enough. But let’s for a second assume someone figures out a cost-effective display; then you have the problem that you need optics that can support this resolution, and not the cheap low resolution optics with terrible chroma aberrations, god rays, and astigmatism that you can get away with at 4.7 arcminute pixels.
  2. Use say the Samsung S8 pixel size (or a little smaller) and make two 3K by 3K displays (one for each eye). Each display will be about 134mm, or about 5.26 inches, on a side, and the width of the two displays plus the gap between them will end up at about 12 inches. So think in terms of strapping a large iPad Pro in front of your face, only now it has to be about 100mm (~4 inches) in front of the optics (or about 2.5X as far away as on the current headsets). Hopefully you are starting to get the picture: this thing is going to be huge and unwieldy, and you will probably need shoulder bracing in addition to head straps. Not to mention that the displays will cost a small fortune, along with the optics to go with them.
  3. Some combination of 1 and 2 above.
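Option 2’s numbers can be sanity-checked with a little arithmetic (the ~44.7µm pitch is back-computed from the 134mm figure, and the gap between the displays is my rough assumption):

```python
PITCH_MM = 0.0447       # Samsung S8-class pixel pitch (~44.7µm)
PIXELS_PER_SIDE = 3000  # "3K by 3K" per eye

display_mm = PITCH_MM * PIXELS_PER_SIDE  # size of one display side
gap_mm = 36.0                            # assumed gap between the two displays

total_mm = 2 * display_mm + gap_mm
print(round(display_mm))                 # ~134mm (~5.3 inches) per display
print(round(total_mm / 25.4, 1))         # ~12 inches across the pair
```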
The Future Does Not Follow a Straight Path

I’m trying to outline the top level issues above (there are many more). Even if/when you solve the display cost/resolution problem, lurking behind it is a massive optical problem to sustain that resolution. These are the problems “straight line futurists” just don’t get; they assume everything will just keep improving at the same rate it has in the past, not realizing they are starting to bump up against some very non-linear problems.

When I hear about “Moore’s Law” being applied to displays, I just roll my eyes and say that they obviously don’t understand Moore’s Law and the issues behind it (and why it kept slowing down over time). Back in November 2016, Oculus Chief Scientist Michael Abrash made some “bold predictions” that by 2021 we would have 4K (by 4K) per eye and a 140 degree FOV with 2 arcminutes per pixel. He upped my example above by 1.33X more pixels and upped the FOV by almost 1.5X, which introduces some serious optical challenges.
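Abrash’s prediction checks out against the same arcminute math (treating “4K” as 4,000 pixels across):

```python
def arcmin_per_pixel(fov_degrees, pixels):
    """Average angular pixel size across the field of view."""
    return fov_degrees * 60.0 / pixels

print(round(arcmin_per_pixel(140, 4000), 2))  # ~2.1 arcminutes per pixel
```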

At times like this I like to point out the Super Sonic Transport, or SST, of the 1960’s. The SST seemed inevitable for passenger travel; after all, in less than 50 years passenger aircraft went from nothing to the jet age. Yet today, over 50 years later, passenger aircraft still fly at about the same speed. Oh, by the way, in the 1960’s they were predicting that we would be vacationing on the moon by now and having regular flights to Mars (heck, we made it to the moon in less than 10 years). We certainly could have 4K by 4K displays per eye and a 140 degree FOV by 2021 in a head mounted display (it could be done today if you don’t care how big it is), but expect it to be more like the cost of flying supersonic and not a consumer product.

It is easy to play armchair futurist and assume “things will just happen because I want them to happen.” The vastly harder part is to figure out how it can happen. I lived through I.C. development from the late 1970’s through the mid 1990’s, so I “get” learning curves and rates of progress.

One More Thing – Micro-iLED

I included in the table at the top Micro Inorganic LEDs, also known as just Micro-LEDs (I’m using iLED to make it clear these are not OLEDs). They are getting a lot of attention lately, particularly after Apple bought LuxVue and Oculus bought InfiniLED. These essentially use very small “normal/conventional” LEDs that are mounted (essentially printed) on a substrate. The fundamental issue is that red requires a very different crystal from blue and green (and even those use different levels of impurities). So they have to make individual LEDs and then combine them (or maybe someday grow the dissimilar crystals on a common substrate).

The allure is that iLEDs have some optical properties that are superior to OLEDs’. They have a tighter color spectrum, are more power efficient, can be driven much brighter, have fewer issues with burn-in, and in some cases emit less diffuse (better collimated) light.

These Micro-iLEDs are being used in two ways: to make very large displays by companies such as Sony, Samsung, and NanoLumens, or supposedly very small displays (LuxVue and InfiniLED). I understand how the big display approach works: there is lots of room for the LEDs, and these displays are very expensive per pixel.

With the small display approach, they seem to have the double issue of being able to cut very small LEDs and effectively “print” the LEDs on a TFT substrate, similar to say OLEDs. What I don’t understand is how these are supposed to be smaller than say OLEDs, which would seem to be at least as easy to make on similar TFT or similar transistor substrates. They don’t seem to “fit” in near eye, but maybe there is something I am missing at this point in time.

CES 2017 AR, What Problem Are They Trying To Solve?


First off, this post is a few weeks late. I got sick on returning from CES and then got busy with some other pressing activities.

At left is a picture that caught me next to the Lumus Maximus demo at CES, from Imagineality’s “CES 2017: Top 6 AR Tech Innovations.” Unfortunately they missed that in the Lumus booth at about the same time were a person from Magic Leap and one from Microsoft’s Hololens (it turned out we all knew each other from prior associations).

Among Imagineality’s top 6 “AR Innovations” were ODG’s R-8/R-9 Glasses (#1) and Lumus’s Maximus 55 degree FOV waveguide (#3). From what I heard at CES and saw in the writeups, ODG and Lumus did garner a lot of attention. But by necessity, these types of lists are pretty shallow in their evaluations, and what I try to do on this blog is go a bit deeper into the technology and how it applies to the market.

The near eye display companies I looked at during CES included Lumus, ODG, Vuzix, Real Wear, Kopin, Wave Optics, Syndiant, Cremotech, QD Laser, and Blaze (a division of eMagin), plus several companies I met with privately. As interesting to me as their technologies were their different takes on the market.

For this article, I am mostly going to focus on the Industrial / Enterprise market. This is where most of the AR products are shipping today. In future articles, I plan to go into other markets and do more of a deep dive on the technology.

What Is the Problem They Are Trying to Solve?

I have had a number of people ask me what was the best or most interesting AR thing I saw at CES 2017, and I realized that this is at best an incomplete question. You first need to ask, “What problem are they trying to solve?” which leads to “How well does it solve that problem?” and “How big is that market?”

One big takeaway I had at CES, having talked to a number of different companies, is that the various headset designs were, intentionally or not, often aimed at very different applications and use cases. It’s pretty hard to compare a headset that almost totally blocks a user’s forward view but has a high resolution display to a lightweight information device that is highly see-through but has a low resolution image.

Key Characteristics

AR means a lot of different things to different people. In talking to a number of companies, I found they were worried about different issues. Broadly, you can separate them into two classes:

  1. Mixed Reality – ex. Hololens
  2. Informational / “Data Snacking” – ex. Google Glass

Most of the companies were focused on industrial / enterprise / business uses, at least for the near future, and in this market the issues include:

  1. Cost
  2. Resolution/Contrast/Image Quality
  3. Weight/Comfort
  4. See-through and/or look over
  5. Peripheral vision blocking
  6. Field of view (small)
  7. Battery life per charge

For all the talk about mixed reality (ala Hololens and Magic Leap), most of the companies selling product today are focused on helping people “do a job.” This is where they see the biggest market for AR today. It will be “boring” to the people wanting the “world of the future” mixed reality promised by Hololens and Magic Leap.

You have to step back and look at the market these companies are trying to serve. There are people working on a factory floor or maybe driving a truck, where it would be dangerous to obscure a person’s vision of the real world. They want 85% or more transparency, something very lightweight and comfortable enough to be worn for 8 hours straight, and almost no blocking of peripheral vision. If these products are to fan out to a large market, they have to be cost effective, which generally means they have to cost less than $1,000.

To meet the market requirements, they sacrifice field of view and image quality. In fact, they often want a narrow FOV so it does not interfere with the user’s normal vision. They are not trying to watch movies or play video games; they are trying to give the necessary information to a person doing a job and then get out of the way.

Looking In Different Places For the Information

I am often a hard audience. I’m not interested in the marketing spiel; I’m looking for the target market/application, the facts and figures, and how it is being done. I want to measure things, while the demos in the booths are all about trying to dazzle the audience.

As a case in point, let’s take ODG’s R-9 headset. Most people were impressed with the image quality from ODG’s optics with a 1080p OLED display, which was reasonably good (though they still had some serious image problems caused by their optics that I will get into in future articles).

But what struck me was how dark the see-through/real world view was in the demos. From what I could calculate, they are blocking about 95% of the real world light. They are also too heavy and block too much of a person’s vision compared to other products; in short, they are at best going after a totally different market.

Industrial Market

Vuzix is representative of the companies focused on industrial / enterprise applications. They are using waveguides with about 87% transparency (although they often tint them or use photochromic light-sensitive tinting). They also locate the image toward the outside of the user’s view so that it is less obstructive even when an image is displayed (note in the image below-right that the exit port of the waveguide is on the outside and not in the center as it would be on, say, a Hololens).

The images at right were captured from a Robert Scoble interview with Paul Travers, CEO of Vuzix. BTW, the first ten minutes of the video are relatively interesting on how Vuzix waveguides work, but after that there is a bunch of what I consider silly future talk and flights of fancy that I would take issue with. This video shows the “raw waveguides” and how they work.

Another approach to this category is Realwear. They have a “look-over” display that is not see-through, but their whole design is made to not block the rest of the user’s forward vision. The display is on a hinge so it can be totally swung out of the way when not in use.


What drew most of the media coverage of AR at CES was how “sexy” the technology was, which usually meant FOV, resolution, and image quality. But the companies that were actually selling products were more focused on their users’ needs, which often don’t line up with what gets the most press and awards.


ODG R-8 and R-9 Optics with OLED Microdisplays (Likely Sony’s)

ODG Announces R-8 and R-9 OLED Microdisplay Headsets at CES

It was not exactly a secret, but Osterhout Design Group (ODG) formally announced their new R-8 headset with dual 720p displays (one per eye) and R-9 headset with dual 1080p displays. According to their news release, “R-9 will be priced around $1,799 with initial shipping targeted 2Q17, while R-8 will be less than $1,000 with developer units shipping 2H17.”

Both devices use OLED microdisplays but with different resolutions (the R-9 has twice the pixels). The R-8 has a 40 degree field of view (FOV), which is similar to Microsoft’s Hololens, and the R-9 has about a 50 degree FOV.

The R-8 appears to be marketed more toward “consumer” uses with its lower price point and lack of an expansion port, while ODG is targeting the R-9 at more industrial uses with modular expansion. Among the expansions ODG has discussed are various cameras and better real world tracking modules.

ODG R-7 Beam Splitter Kicks Image Toward Eye

With the announcement came much better pictures of the headsets, and I immediately noticed that their optics were significantly different than I previously thought. Most importantly, I noticed in an ODG R-8 picture that the beam splitter is angled to kick the light away from the eye, whereas the prior ODG R-7 had a simple beam splitter that kicks the image toward the eye (see below).

ODG R-8 and R-9 Beam Splitter Kicks Image Away From Eye and Into a Curved Mirror

The ODG R-8 (and the R-9, though it is harder to see in the available R-9 pictures) does not have a simple beam splitter but rather a beam splitter and curved mirror combination. The side view below (with my overlays of the outline of the optics, including some that are not visible) shows that the beam splitter kicks the light away from the eye and toward a partial curved mirror that acts as a “combiner.” This curved mirror magnifies and moves the virtual focus point and then reflects the light back through the beam splitter to the eye.

On the left I have taken Figure 169 from ODG’s US Patent 9,494,800. Light from the “emissive display” (ala OLED) passes through two lenses before being reflected into the partial mirror. The combination of the lenses and the mirror acts to adjust the size and virtual focus point of the displayed image. In the picture of the ODG R-8 above, I have overlaid the optics from Figure 169 (in red).

According to the patent specification, this configuration “form(s) at wide field of view” while “The optics are folded to make the optics assembly more compact.”

At left I have cropped the image and removed the overlay so you can see the details of the beam splitter and curved mirror joint. You hopefully can see the seam where the beam splitter appears to be glued to the curved mirror, suggesting the interior between the curved mirror and beam splitter is hollow. Additionally, there is a protective cover/light shade over the outside of the curved mirror with a small gap between them.

The combined splitter/mirror is hollow to save weight and cost. It is glued together to keep dust out.

ODG R-6 Used A Similar Splitter/Mirror

I could not find a picture of the R-8 or R-9 from the inside, but I did find a picture on the “hey Holo” blog that shows the inside of the R-6, which appears to use the same optical configuration as the R-8/R-9. The R-6, introduced in 2014, had dual 720p displays (one per eye) and was priced at $4,946, or about 5X the price of the R-8 with the same resolution and a similar optical design. Quite a price drop in just over 2 years.

ODG R-6, R-8, and R-9 Likely Use Sony OLED Microdisplays

Interestingly, I could not find anywhere where ODG says what display technology they used in the 2014 R-6, but the most likely device is the Sony ECX332A 720p OLED microdisplay that Sony introduced in 2011. Following this trend, it is likely that the ODG R-9 uses the newer Sony ECX335 1080p OLED microdisplay and the R-8 uses the ECX332A or a follow-on version. I don’t know of any other company that has both 720p and 1080p OLED microdisplays, and the timing of the Sony and ODG products seems to fit. It is also very convenient for ODG that both panels are the same size and could use the same or very similar optics.

Sony had a 9.6 micron pixel pitch on a 1024 by 768 OLED microdisplay back in 2011, so for Sony the pixel pitch has gone from 9.6 microns in 2011 to 8.2 microns on the 1080p device. This is among the smallest OLED microdisplay pixel pitches I have seen, but it is still roughly 2X linearly and 4X in area bigger than the smallest LCOS (several companies have LCOS pixel pitches in the 4 micron or less range).
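Putting the Sony-versus-LCOS comparison in numbers (using the 4µm LCOS pitch from the “4 micron or less” range above):

```python
sony_1080p = 8.2  # Sony 1080p OLED microdisplay pitch in µm
lcos_small = 4.0  # several LCOS devices are at or below this pitch

linear = sony_1080p / lcos_small
print(round(linear, 2))       # ~2.05X linearly
print(round(linear ** 2, 1))  # ~4.2X in area
```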

It appears that ODG used an OLED microdisplay for the R-6, then switched (likely for cost reasons) to LCOS and a simple beam splitter for the R-7, and then back to OLEDs and the splitter/mirror optics for the R-8 and R-9.

Splitter/Combiner Is an Old Optic Trick

This “trick” of mixing lenses with a spherical combiner partial mirror is an old idea. It often turns out that mixing refractive (lens) optics with mirror optics can lead to a more compact and less expensive design.

I have seen a beam splitter/mirror combination used many times. The ODG design is a little different in that the beam splitter is sealed/mated to the curved mirror, which made it hard to see in the pictures available earlier. Likely as not, this has been done before too.

This configuration of beam splitter and curved mirror even shows up in Magic Leap applications, such as Fig. 9 from 2015/0346495, shown at right. I think this is the optical configuration that Magic Leap used in some of their prototypes, including the one seen by “The Information.”

Conclusion/Trends – Turning the Crank

The ODG optical design, while it may seem a bit more complex than a simple beam splitter, is actually probably simpler/easier to make than doing everything with lenses before the beam splitter. Likely they went to this technique to support a wider FOV.

Based on my experience, I would expect that the ODG optical design will be cleaner/better than the waveguide designs of Microsoft’s Hololens. The use of OLED microdisplays should give ODG superior contrast, which will further improve the perceived sharpness of the image. While not as apparent to the casual observer, as I have discussed previously, OLEDs won’t work with diffractive/holographic waveguides such as those Hololens and Magic Leap are using.

What is also interesting is that in terms of resolution and basic optics, the 720p R-8 is about 1/5th the price of the military/industrial grade 720p R-6 of about 2 years ago. The R-9, in addition to having a 1080p display, has some modular expansion capability. One would expect a follow-on product with 1080p, a larger FOV, and more sensors in the price range of the R-8 in the not too distant future, perhaps with integration of the features from one or more of the R-9’s add-on modules. This, as we say in the electronics industry, “is just a matter of turning the crank.”

Microvision Laser Beam Scanning: Everything Old Is New Again

Reintroducing a 5 Year Old Design?

Microvision, the 23 year old “startup” in Laser Beam Scanning (LBS), has been a fun topic on this blog since 2011. They are a classic example of a company that tries to make big news out of what other companies would not consider newsworthy.

Microvision has been through a lot of “business models” in their 23 years. They have been through selling “engines,” building whole products (the ShowWX), and a licensing model with Sony selling engines, and now with their latest announcement, “MicroVision Begins Shipping Samples to Customers of Its Small Form Factor Display Engine,” they are back to selling “engines.”

The funny thing is this “new” engine doesn’t look very different from the “old” engine it was peddling about 5 years ago. Below I have shown three Microvision-type laser engines from 2017, 2012, and 2013, roughly to the same scale, and they all look remarkably similar. The 2012 and 2017 engines are from Microvision, and the 2013 engine was inside the 2013 Pioneer aftermarket HUD. The Pioneer HUD appears to use a nearly identical engine, within 3mm of the length of the “new” engine.

The “new” engine is smaller than the 2014 Sony engine, shown at left, that used 5 lasers (two red, two green, and one blue) to support higher brightness and higher power with lower laser speckle. It appears that the “new” Microvision engine is really at best a slightly modified 2012 model, with maybe some minor modifications and newer laser diodes.

What is missing from Microvision’s announcement is any measurable/quantifiable performance information, such as brightness (lumens) and power consumption (Watts). In my past studies of Microvision engines, they have proven to have much worse lumens per Watt than other (DLP and LCOS) technologies. I have also found their measurable resolution to be considerably less (about half, both horizontally and vertically) than their claimed resolution.

While Microvision says, “The sleek form factor and thinness of the engine make it an ideal choice for products such as smartphones,” one needs to understand that the size of the optical engine with its drive electronics is about equal to the entire contents of a typical smartphone. And the projector generally consumes more power than the rest of the phone, which makes it both a battery size and a heat issue.

Everything VR & AR Podcast Interview with Karl Guttag About Magic Leap

With all the buzz surrounding Magic Leap and this blog’s technical findings about Magic Leap, I was asked to do an interview by the “Everything VR & AR Podcast” hosted by Kevin Harvell. The podcast is available on iTunes and by direct link to the interview here.

The interview starts with about 25 minutes of my background, beginning with my early days at Texas Instruments. So if you just want to hear about Magic Leap and AR, you might want to skip ahead a bit. In the second part of the interview (about 40 minutes) we get into discussing how I went about figuring out what Magic Leap was doing. This includes discussing how the changes in the U.S. patent system signed into law in 2011 with the America Invents Act helped make the information available for me to study.

There should be no great surprises for anyone that has followed this blog. It puts in words and summarizes a lot that I have written about in the last 2 months.

Update: I listened to the podcast and noticed that I misspoke a few times; it happens in live interviews. An unfathomable mistake is that I talked about graduating college in 1972, but that was high school; I graduated from Bradley University with a B.S. in Electrical Engineering in 1976 and then received an MSEE from The University of Michigan in 1977 (and joined TI in 1977).

I also think I greatly oversimplified the contribution of Mark Harward as a co-founder at Syndiant. Mark did much more than just hire designers; he was the CEO, an investor, and ran the company while I “played” with the technology, but I think Mark’s best skill was in hiring great people. Also, Josh Lund, Tupper Patnode, and Craig Waller were co-founders.


ODG R-9 (Horizon): 1080p Per Eye, Yes Really

Lazy Reporting – The Marketing Hyperbole’s Friend

While I have not seen ODG’s R-9 in person yet, I fully expect that it will look a lot better than Microsoft’s Hololens. I even think it will look better in terms of image quality than what I think Magic Leap is working on. But that is not the key point of this article.

But there is also a layer of marketing hyperbole and misreporting going on that I wanted to clear up. I’m just playing referee here and calling them like I see them.

ODG 4K “Experience” with 2K (1080p) Per Eye

2016-12-28 Update – It appears I was a bit behind on the marketing hype vernacular being used in VR. Most VR headsets today, such as Oculus, take a single flat panel and split it between two eyes, so each eye sees less than half of the pixels (some pixels are cut off). Since bigger is better in marketing, VR makers like to quote the whole flat panel resolution and not the resolution per eye.

ODG’s “marketing problem” is that historically a person working with near eye displays would talk in terms of “resolution per eye,” but that number is 2X smaller than what the flat panel based VR companies market. Rather than being at a marketing hype disadvantage, ODG apparently has adopted the VR flat panel vernacular, however misleading it might be.

I have not met Jame Mackie nor have I watched a lot of his videos, but he obviously does not understand display technology well, and I would take anything he says about video quality with a grain of salt. He should have understood that ODG’s R-9 is not “4K,” as in the title of his YouTube video: ODG 4K Augmented Reality Review, better than HoloLens?. And specifically, he should have asked questions when the ODG employee stated at about 2:22, “it’s two 1080p displays to each eye, so it is offering a 4K experience.”

What the ODG marketing person was, I think, trying to say was that somehow having 1080p (also known as 2K) for each eye was like having 2 times 2K, or a “4K equivalent”; it is not. In stumbling to try and make the “4K equivalent” statement, the ODG person simply tripped over his own tongue and said that there were two 1080p devices per eye, when he meant to say there were two 1080p devices in the glasses (one per eye). Unfortunately, Jame Mackie didn’t know the difference, did not realize that this would have been impossible in the R-9’s form factor, and didn’t follow up with a question. So the false information got copied into the title of the video and was left as if it were true.

VRMA’s Micah Blumberg Asks The Right Questions and Gets The Right Answer – 1080p Per Eye

This is cleared up in the following video interview with Nima Shams, ODG’s VP of Headworn: “Project Horizon” AR VR Headset by VRMA Virtual Reality Media. When asked by Micah Blumberg starting at about 3:50 into the video, “So this is the 4K headset,” Nima Shams responds, “so it is 1080p to each eye,” to which Blumberg astutely makes sure to clarify with, “so we’re seeing 1080p right now and not 4K,” to which Shams responds, “okay, yeah, you are seeing 2K to each eye independently.” They even added an overlay in the video, “confirmed 2K per eye” (see inside the red circle I added).

A Single 1080p OLED Microdisplay Per Eye

Even with “only” a 1080p OLED microdisplay per eye and a simple optical path, the ODG R-9 should have superior image quality compared to Hololens:

  1. OLEDs should give better contrast than Hololens’ Himax LCOS device
  2. There will be no field sequential color breakup with head or image movement as there can be with Hololens
  3. They have about the same pixels per arc-minute as Hololens, but with more pixels they increase the FOV from about 37 degrees to about 50 degrees.
  4. Using a simple plate combiner rather than the tortuous path of Hololens’ waveguide, I would expect the pixels to be sharper, with little visible chromatic aberration and no “waveguide glow” (out of focus light around bright objects). So even though the angular resolution of the two is roughly the same, I would expect the R-9 to look sharper/higher resolution.
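To give a rough sense of point 3 above, angular resolution can be approximated by dividing the field of view (converted to arcminutes) by the pixel count along the same axis. Below is a quick back-of-envelope sketch using the diagonal; the FOV numbers are the approximate figures above, and treating them as diagonal FOVs is my assumption:

```python
import math

def arcmin_per_pixel(h_pixels, v_pixels, diag_fov_deg):
    """Approximate angular pixel size: diagonal FOV (in arcminutes)
    divided by the diagonal pixel count."""
    diag_pixels = math.hypot(h_pixels, v_pixels)
    return diag_fov_deg * 60 / diag_pixels

# Hololens: 720p panel over roughly 37 degrees (approximate figures)
hololens = arcmin_per_pixel(1280, 720, 37)
# ODG R-9: 1080p panel over roughly 50 degrees (approximate figures)
r9 = arcmin_per_pixel(1920, 1080, 50)

print(f"Hololens: {hololens:.2f} arcmin/pixel")  # ~1.5
print(f"ODG R-9:  {r9:.2f} arcmin/pixel")        # ~1.4
```

Both work out to roughly 1.4 to 1.5 arcminutes per pixel, which is why the two headsets have similar angular resolution even though the R-9 has more total pixels and a wider FOV.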

The known downsides compared to Hololens:

  1. The ODG R-9 does not appear to have enough “eye relief” to support wearing glasses.
  2. The device puts a lot of weight on the nose and ears of the user.

I’m not clear about the level of tracking, but ODG’s R-9 does not appear to have the number of cameras and sensors that Hololens has for mapping/locking to the real world. We will have to wait for more testing on this issue. I also don’t have information on how comparable the level of image and other processing done by the ODG R-9 is relative to Hololens.


Micah Blumberg showed the difference between just repeating what he is told and knowing enough to ask the right followup question. He knew that ODG’s 4K marketing message was confusing and that what he was being told did not add up, so he made sure to clarify it. Unfortunately, while Jame Mackie got the “scoop” on the R-9 being the product name for Horizon, he totally misreported the resolution and other things in his report (more on that later).

Lazy and ill informed reporters are the friend and amplifier of marketing hyperbole. It appears that ODG is trying to equate dual 1080p displays (one per eye) with something like “4K,” which it really is not. You need 1080p (also known as 2K) per eye to do stereo 1080p, but that is not the same as “4K,” which is defined as 3840×2160, or 4 times the spatial resolution of 1080p. Beyond this, qualifiers like “4K Experience,” which have no real meaning, are easily dropped, and ill informed reporters will report it as “4K,” which does have a real meaning.
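The pixel arithmetic makes the distinction easy to check (these are the standard resolution definitions, nothing ODG-specific):

```python
p1080 = 1920 * 1080  # one 1080p ("2K") display: ~2.07 megapixels
p4k = 3840 * 2160    # true 4K UHD: ~8.29 megapixels

# Two 1080p displays (one per eye) only double the pixel count...
two_eyes = 2 * p1080
# ...while true 4K has 4 times the pixels of 1080p.
assert p4k == 4 * p1080
# So dual 1080p delivers half the pixels of a single true 4K display.
assert two_eyes * 2 == p4k
```

In other words, "1080p per eye" is exactly that: stereo 1080p, not anything "4K."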

Also, my point is not meant to pick on ODG; they just happen to be the case at hand. Unfortunately, most of the display market is “liars poker.” Companies are fudging display specs all the time. I rarely see a projector that meets or exceeds its “spec” lumens. Resolutions are often spec’ed in misleading ways (such as specifying the input rather than the “native” resolution). Contrast is another place where “creative marketing” is heavily used. The problem is that because “everyone is doing it,” people feel they have to do it just to keep up.

The problem for me comes when I have to deal with people that have read false or misleading information. It gets hard to separate truth from marketing exaggeration.

This also goes back to why I didn’t put much stock in the magazine reports about how Magic Leap looked. These reports were made by people who were easy to impress and likely not knowledgeable about display devices. They probably could not tell if the display resolution was off by 2X in each direction, nor would they notice even moderately severe image problems. If they were shown a flashy looking demo, they would assume it was high resolution.

One More Thing – Misleading/Fake “True Video”

While it will take a while to explain (maybe next time), I believe the Jame Mackie video also falsely indicates at 2:29 (the part with the cars and the metal balls on the table) that what is being shown is how the ODG R-9 works.

In fact, while the images of the cars and balls are generated by the R-9, the tracking of the real world and the reflections off the surfaces are a well orchestrated FAKE. Basically, they were playing a pre-rendered video through the glasses (so that part is likely real), but the clear and black boxes on the table were props there to “sell the viewer” that this was being rendered on the fly. There also appears to be some post-processing in the video; most notably, it looks like the black box was modified in post production. There are several clues in the video that will take a while to explain.

To be fair to ODG, the video does not claim to be genuine/unprocessed, but the way it is presented within Jame Mackie’s video is extremely misleading, to say the least. It could be that the video was taken out of context.

For the record, I do believe the video starting at 4:02, which I have analyzed before, is a genuine through-the-optics video and is correctly identified as such on the video. I’m not sure about the “tape replacement” video at 3:23; it may be genuine, or it could be some clever orchestrating.

Kopin Entering OLED Microdisplay Market

Kopin Making OLED Microdisplays

Kopin announced today that they are getting into the OLED microdisplay business. This is particularly notable because Kopin has been a long time (since 1999) manufacturer of transmissive LCD microdisplays used in camera viewfinders and near eye display devices. They also bought Forth Dimension Displays back in 2011, a maker of high resolution ferroelectric reflective LCOS used in higher end near eye products.

OLED Microdisplays Trending in AR/VR Market

With the rare exception of the large and bulky Meta 2, microdisplays (LCOS, DLP, OLED, and transmissive LCD) dominate the AR/MR see-through market. They are also a significant factor in VR and other non-see-through near eye displays.

Kopin’s entry seems to be part of a trend toward OLED microdisplays in near eye products. ODG’s next generation “Horizon” AR glasses are switching from LCOS (used in the current R7) to OLED microdisplays. Epson, which was a direct competitor to Kopin in transmissive LCD, switched to OLED microdisplays in their new Moverio BT-300 AR glasses announced back in February.

OLED Microdisplays Could Make VR and Non-See-Through Headsets Smaller/Lighter

Today most of the VR headsets are following Oculus’s use of large flat panels with simple optics. This leads to large bulky headsets, but the cost of OLED and LCD flat panels is so low compared to other microdisplays with their optics that they win out. OLED microdisplays have been far too expensive to compete on price with the larger flat panels, but this could change as there are more entrants into the OLED microdisplay market.

OLEDs Don’t Work With Waveguides As Used By Hololens and Magic Leap

It should be noted that the broad spectrum and diffuse light emitted by OLED is generally incompatible with the flat waveguide optics such as used by Hololens and is expected from Magic Leap (ML). So don’t expect to see these being used by Hololens and ML anytime soon unless they radically redesign their optics. Illuminated microdisplays like DLP and LCOS can be illuminated by narrower spectrum light sources such as LED and even lasers and the light can be highly collimated by the illumination optics.

Transmissive LCD Microdisplays Can’t Compete As Resolution Increases

If anything, this announcement from Kopin is the last nail in the coffin of the transmissive LCD microdisplay. OLED microdisplays have the advantage over transmissive micro-LCD of supporting higher resolutions with smaller pixels, which keeps the overall display size down for a given resolution. OLEDs also consume less power for the same brightness than transmissive LCD and have much better contrast. As resolution increases, transmissive LCDs simply cannot compete.

OLED Microdisplays: A More Mixed Set of Pros and Cons Compared to LCOS and DLP

There is a mix of pros and cons when comparing OLED microdisplays with LCOS and DLP. The pros for OLED over LCOS and DLP include:

  1. Significantly simpler optical path (no illumination path in the way), enabling optical solutions not possible with reflective microdisplays
  2. Lower power for a given brightness
  3. Separate RGB subpixels so there is no field sequential color breakup
  4. Higher contrast.

The advantages for LCOS and DLP reflective technologies over OLED microdisplays include:

  1. Smaller pixels equal a smaller display for a given resolution. DLP and LCOS pixels are typically from 2 to 10 times smaller in area per pixel.
  2. Ability to use narrow band light sources which enable the use of waveguides (flat optical combiners).
  3. Higher brightness
  4. Longer lifetime
  5. Lower cost even including the extra optics and illumination
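To illustrate point 1 on the list above: a display’s active area is just the pixel pitch times the pixel count, so pitch differences multiply directly into panel size, and area scales with the square of the pitch ratio. A small sketch, using the ~4.27 µm LCOS pitch mentioned in the table at the top of this article and an assumed ~9.3 µm OLED microdisplay pitch (an illustrative number, not a quoted spec):

```python
def active_area_mm(h_pixels, v_pixels, pitch_um):
    """Active area (width, height) in mm, assuming square pixels."""
    return (h_pixels * pitch_um / 1000.0, v_pixels * pitch_um / 1000.0)

# 1080p panels at the two example pitches
lcos_1080p = active_area_mm(1920, 1080, 4.27)  # ~8.2 x 4.6 mm
oled_1080p = active_area_mm(1920, 1080, 9.3)   # ~17.9 x 10.0 mm

# Area scales with the square of the pitch ratio
area_ratio = (9.3 / 4.27) ** 2
print(f"OLED panel area is ~{area_ratio:.1f}x the LCOS panel area")
```

With these example pitches the OLED panel comes out at roughly 4.7 times the area, squarely within the 2-to-10-times range cited above, and a larger panel generally means larger, heavier optics.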

Up until recently, the cost of OLED microdisplays was so high that only defense contractors and other applications that could afford the high cost could consider them. But that seems to be changing. Also, historically the brightness and lifetimes of OLED microdisplays were limited, but companies are making progress.

OLED Microdisplay Competition

Kopin is far from being the first and certainly is not the biggest entrant in the OLED microdisplay market. But Kopin does have a history of selling volume into the microdisplay market. The list of known competitors includes:

  1. Sony appears to be the biggest player. They have been building OLED microdisplays for many years for use in camera viewfinders. They are starting to bring higher resolution products to the market and bring the costs down.
  2. eMagin is a 23-year-old “startup.” They have a lot of base technology and are a “pure play” stock-wise, but they have failed to break through and are in danger of being outrun by bigger companies.
  3. MicroOLED – a small French startup – not sure where they really stand.
  4. Samsung – nothing announced, but they have all the technology necessary to make them. Update: Ron Mertens of OLED-Info informed me that it was rumored that the second generation of Google Glass was considering a Samsung OLED microdisplay and that Samsung had presented a paper going back to 2011.
  5.  LG – nothing announced but they have all the technology necessary to make them.

I included Samsung and LG above not because I have seen or heard of them working on OLED microdisplays, but because I would be amazed if they didn’t at least have a significant R&D effort given their sets of expertise and their extreme interest in this market.

For More Information:

For more complete information on the OLED microdisplay market, you might want to go to OLED-Info, which has been following both large flat panel OLEDs and small OLED microdisplay devices for many years. They also have two reports available, OLED Microdisplays Market Report and OLED for VR and AR Market Report.

For those who want to know more about Kopin’s manufacturing plan, Chris Chinnock of Insight Media has an interesting article outlining Kopin’s fabless development strategy.

Magic Leap and Hololens and LCOS

LCOS Used In Hololens and Likely Magic Leap

It is well known that Microsoft’s Hololens uses two Himax manufactured Field Sequential Color (FSC) LCOS microdisplays. Additionally, there are reports, particularly from KGI Securities analyst Ming-Chi Kuo as reported in Business Insider, that Magic Leap (ML) is also using Himax’s LCOS. Further supporting this, of all the ML patent applications, US 2016/0327789, which uses LCOS, best fits the available evidence.

I now have some additional evidence that ML is likely using LCOS. After discussing this new ML evidence, I will relay some rumors about a 2nd generation Hololens (or the lack thereof).

Patent Application Tends To Confirm Magic Leap’s Use of Field Sequential LCOS

I came across a bit of a strange patent application that seems to confirm that ML is using field sequential color (FSC) LCOS. The application, US 2016/0241827, was filed in January 2015, just 3 months before the lead inventor, Michael Kass, then an ML Fellow, left ML. From what I can tell from their public LinkedIn profiles, Mr. Kass and his fellow inventor both worked on software at ML, not hardware, and neither one has any background in hardware.

The patent application is directed towards reducing “color breakup” for color sequential displays and shows an LCOS implementation. The concept they are proposing is at least 15 years old that I know of; it is well known to people in the projection industry from DLP projectors with “white segment” color wheels and later with LED illumination. Additionally, the way they arranged the LEDs in their diagram above, with 3 separate LEDs going to a dichroic mirror, is how it is done for front projectors, not for a near eye display. The question I had on finding this application was:

Why are two ML people working on software, with only a rudimentary knowledge of display design and located in California, filing for a patent on an “improvement” for field sequential color?

The only logical answer I could come up with is that they had looked through ML prototypes that used an LCOS system and were bothered by seeing color breakup. I’m guessing they were told it was LCOS but did not know how it was designed, so they grabbed an LCOS design off the internet (only one for a front projector, not for near eye). They didn’t know the history of FSC projectors using white segments, so they re-invented the 15+ year old concept of adding a “white” period, where all the RGB colors are on, in order to help reduce color breakup.

For bonus speculation: why did the lead inventor, Mr. Kass, who only a month before filing this patent had been promoted to “Distinguished Fellow,” leave just 3 months after filing the provisional patent? Perhaps, just perhaps, he did not like the color breakup he was seeing (just a guess)?

It should be noted that it has been nearly two years since the provisional application was filed, which would give ML time to change. But I doubt they could totally change direction, as they would be too far down the road with the rest of the design, at least if, as they claim, they will have a product out “soon.” They might change the type of LCOS device, either in resolution or manufacturer, but it would seem unlikely that they could totally change the technology.

Hololens Rumored 2nd Generation Delayed?

There was a lot of talk that Microsoft, after announcing that Hololens would be focusing first on business applications, would be coming out with a 2nd generation Hololens next year. This sometimes gets conflated with the 2nd generation being a “consumer version.” But apparently the costs to make Hololens are high, particularly with the custom waveguides having very low yield.

The recent scuttlebutt is that the expected 2nd generation is on hold while Microsoft management figures out what they want to do with Hololens. For those who were hoping for a consumer edition, the idea of focusing on “enterprise/business” sounds scarily similar to what Google did with Google Glass when it realized it did not have a high volume market. While Microsoft is continuing to expand sales of Hololens for businesses worldwide, one gets the feeling that Microsoft is trying to figure out whether Hololens will have a market anytime soon of a size worthy of a company Microsoft’s size.

Update Dec 20, 2017 – I posed a question on the Reddit Hololens subgroup about finding a public source for issues with Himax and Hololens, and they pointed to “A component maker suffers as Microsoft develops next-gen HoloLens” by Kevin Parrish on Dec. 14, 2017 in Digital Trends. The article cited Himax CEO Jordan Wu stating there were “near-term headwinds” due to a “major AR customer’s shift in focus to the development of future-generation devices.” This would seem to imply that the “AR customer,” of which Hololens is the most notable/likely candidate, is switching from the 720p device to Himax’s new 1080p device for a Gen. 2 Hololens.

Mixed Bag for Himax’s LCOS

So there is mounting evidence that ML is using LCOS and the most likely manufacturer is Himax. I have had some people write me that ML switched from Himax but I don’t know how credible their sources may be, so this I would categorize as rumor right now.

Either way, Himax can’t be shipping a lot of LCOS to ML right now. The lack of volume coming out of Hololens also means that there are no big new orders from Microsoft for Himax panels.

Meeting At CES 2017 January 5th to 8th

I have had a number of people ask if I was going to CES 2017 in Las Vegas and whether we could meet. I’m going to be at the show from January 5th through the 8th.

If you would like to meet, please email me at

If possible, please include your contact information, the reason you or your company wants to meet, the best dates and times, and, if you have a preference, a place where you would like to meet.