ODG R-8 and R-9 Optics with OLED Microdisplays (Likely Sony’s)

ODG Announces R-8 and R-9 OLED Microdisplay Headsets at CES

It was not exactly a secret, but Osterhout Design Group (ODG) formally announced their new R-8 headset with dual 720p displays (one per eye) and R-9 headset with dual 1080p displays. According to their news release, “R-9 will be priced around $1,799 with initial shipping targeted 2Q17, while R-8 will be less than $1,000 with developer units shipping 2H17.”

Both devices use OLED microdisplays but with different resolutions (the R-9 has 2.25 times the pixels). The R-8 has a 40 degree field of view (FOV), which is similar to Microsoft’s Hololens, and the R-9 has about a 50 degree FOV.

The R-8 appears to be marketed more toward “consumer” uses with its lower price point and lack of an expansion port, while ODG is targeting the R-9 at more industrial uses with modular expansion. Among the expansion modules ODG has discussed are various cameras and better real-world tracking modules.

ODG R-7 Beam Splitter Kicks Image Toward Eye

With the announcement come much better pictures of the headsets, and I immediately noticed that their optics are significantly different than I previously thought. Most importantly, I noticed in an ODG R-8 picture that the beam splitter is angled to kick the light away from the eye, whereas the prior ODG R-7 had a simple beam splitter that kicks the image toward the eye (see below).

ODG R-8 and R-9 Beam Splitter Kicks Image Away From Eye and Into a Curved Mirror

The ODG R-8 (and R-9, though it is harder to see in the available R-9 pictures) does not have a simple beam splitter, but rather a beam splitter and curved mirror combination. The side view below (with my overlays of the outline of the optics, including some that are not visible) shows that the beam splitter kicks the light away from the eye and toward a partial curved mirror that acts as a “combiner.” This curved mirror magnifies and moves the virtual focus point, then reflects the light back through the beam splitter to the eye.

On the left I have taken Figure 169 from ODG’s US Patent 9,494,800. Light from the “emissive display” (e.g., an OLED) passes through two lenses before being reflected into the partial mirror. The combination of the lenses and the mirror acts to adjust the size and virtual focus point of the displayed image. In the picture of the ODG R-8 above I have taken the optics from Figure 169 and overlaid them (in red).

According to the patent specification, this configuration “form(s) a wide field of view” while “The optics are folded to make the optics assembly more compact.”

At left I have cropped the image and removed the overlay so you can see the details of the beam splitter and curved mirror joint. Hopefully you can see the seam where the beam splitter appears to be glued to the curved mirror, suggesting the interior between the curved mirror and beam splitter is hollow. Additionally, there is a protective cover/light shade over the outside of the curved mirror with a small gap between them.

The combined splitter/mirror is hollow to save weight and cost. It is glued together to keep dust out.

ODG R-6 Used A Similar Splitter/Mirror

I could not find a picture of the R-8 or R-9 from the inside, but I did find a picture on the “hey Holo” blog that shows the inside of the R-6, which appears to use the same optical configuration as the R-8/R-9. The R-6, introduced in 2014, had dual 720p displays (one per eye) and was priced at $4,946, or about 5X the price of the R-8 with the same resolution and similar optical design. Quite a price drop in just a couple of years.

ODG R-6, R-8, and R-9 Likely Use Sony OLED Microdisplays

Interestingly, I could not find anywhere where ODG says what display technology they used in the 2014 R-6, but the most likely device is the Sony ECX332A 720p OLED microdisplay that Sony introduced in 2011. Following this trend, it is likely that the ODG R-9 uses the newer Sony ECX335 1080p OLED microdisplay and the R-8 uses the ECX332 or a follow-on version. I don’t know of any other company that has both 720p and 1080p OLED microdisplays, and the timing of the Sony and ODG products seems to fit. It is also very convenient for ODG that both panels are the same size and could use the same or very similar optics.

Sony had a 9.6 micron pixel pitch on a 1024 by 768 OLED microdisplay back in 2011, so for Sony the pixel pitch has gone from 9.6 microns in 2011 to 8.2 microns on the 1080p device. This is among the smallest OLED microdisplay pixel pitches I have seen, but it is still more than 2x bigger linearly, and more than 4x bigger in area, than the smallest LCOS pixels (several companies have LCOS pixel pitches in the 4 micron or less range).
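As a quick check of the scaling above (the 4.0 micron LCOS pitch is my assumed representative value for the smallest pitches mentioned):

```python
# Pixel-pitch comparison: Sony OLED microdisplays vs. a small-pitch LCOS
# device (the 4.0 micron LCOS pitch is an assumed representative value).
oled_2011_um = 9.6   # Sony 1024x768 OLED, 2011
oled_1080p_um = 8.2  # Sony 1080p OLED
lcos_um = 4.0        # small LCOS pixel pitch

linear_ratio = oled_1080p_um / lcos_um  # ~2.05x bigger linearly
area_ratio = linear_ratio ** 2          # ~4.2x bigger in area

print(f"linear: {linear_ratio:.2f}x, area: {area_ratio:.2f}x")
```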

It appears that ODG used an OLED microdisplay for the R-6, then switched (likely for cost reasons) to LCOS and a simple beam splitter for the R-7, and then went back to OLEDs and the splitter/mirror optics for the R-8 and R-9.

Splitter/Combiner Is an Old Optic Trick

This “trick” of mixing lenses with a spherical partial-mirror combiner is an old idea. It often turns out that mixing refractive optics (lenses) with mirror optics can lead to a more compact and less expensive design.

I have seen a beam splitter/mirror combination used many times. The ODG design is a little different in that the beam splitter is sealed/mated to the curved mirror, which, with the pictures available earlier, made it hard to see. Likely as not this has been done before too.

This configuration of beam splitter and curved mirror even showed up in Magic Leap applications, such as Fig. 9 from 2015/0346495 shown at right. I think this is the optical configuration that Magic Leap used with some of their prototypes, including the one seen by “The Information.”

Conclusion/Trends – Turning the Crank

The ODG optical design, while it may seem a bit more complex than a simple beam splitter, is actually probably simpler/easier to make than doing everything with lenses before the beam splitter. Likely they went to this technique to support a wider FOV.

Based on my experience, I would expect the ODG optical design to be cleaner/better than the waveguide design of Microsoft’s Hololens. The use of OLED microdisplays should give ODG superior contrast, which will further improve the perceived sharpness of the image. While not as apparent to the casual observer, as I have discussed previously, OLEDs won’t work with the diffractive/holographic waveguides that Hololens and Magic Leap are using.

What is also interesting is that in terms of resolution and basic optics, the 720p R-8 is about 1/5th the price of the military/industrial grade 720p R-6 of about two years ago. While the R-9, in addition to having a 1080p display, has some modular expansion capability, one would expect a follow-on product with 1080p, a larger FOV, and more sensors in the price range of the R-8 in the not too distant future, perhaps with integration of the features from one or more of the R-9’s add-on modules. This, as we say in the electronics industry, “is just a matter of turning the crank.”

30 comments

    1. The serious challenge for OLEDs in HUDs is brightness (candelas per meter squared = nits). Most OLED microdisplays only have about 200 nits, and that is BEFORE you put them into a semi-transparent combiner, which should let about 80% of the light through; that means you would be down to 20% x 200 = 40 nits. For an automotive HUD they want 15,000 nits (I was getting over 20,000 on the design I did for Navdy), and for a near-eye outdoor display or “HUD” for, say, a motorcycle you want 3,000 to 4,000 nits.
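      A minimal sketch of that arithmetic (the 20% display-light reflectivity is implied by the 80% see-through figure above):

```python
# See-through combiner brightness: a combiner that passes ~80% of the
# real-world light typically reflects only ~20% of the display light.
panel_nits = 200          # typical OLED microdisplay luminance
combiner_reflect = 0.20   # fraction of display light reflected to the eye

nits_to_eye = panel_nits * combiner_reflect
print(nits_to_eye)  # 40.0 nits, vs. 3,000-15,000 nit outdoor/HUD targets
```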

      With transmissive panels, DLP, and LCOS you can crank the LED (or laser) up and get lots of nits.

      1. Karl, but if we are speaking about AR glasses like ODG’s, which OLED display would be best for this design? Kopin? eMagin? Sony? Or something else?

        Did you see ODG at CES? In my opinion the safety visor over the combiners is very dark; you can’t see the real world clearly, it has low transmittance.

        I think ODG’s external visor is very dark because the Sony OLED display has low brightness, and ODG made a dark coating for the external visor to solve the brightness problem.

        Did you see the news about eMagin?

        “I am pleased to report we achieved what we set out to do at the beginning of 2016,” commented Andrew Sculley, President and CEO of eMagin. “We are shipping the 2K x 2K display as part of an agreement with a Tier One company that we entered into last year. We previously stated that we expected to sign at least one additional agreement with a Tier One partner in 2016.”

        If ODG didn’t use eMagin and built their solution with Sony displays, what is the name of the Tier One company? Magic Leap or HoloLens (but both companies like waveguides, and you can’t pair a waveguide with OLED)? What’s your opinion about it?

        1. First, note the R-8 is 1280×720 and the R-9 is 1920×1080. Neither is 2K by 2K. eMagin does publicly make 1292 (x3) x 1036 pixel and 1944 (x3) x 1224 pixel displays, but they are not known for making any OLED in volume or at competitive prices. If you are ODG, you need a very credible supplier of volume product.

          None of the OLEDs I know of are really right for general AR because they are not bright enough for truly see-through operation. Ideally with AR you can support 3,000+ nits with 80+% see-through, which means you need to start with a 15,000+ nit display. The available OLEDs are about two orders of magnitude (100X) away from being bright enough.

          Actually, the optical path for ODG also loses a lot of light, both from the display and from the real world, even ignoring the outer shade. The display light has to first reflect off the beam splitter, then off the curved combiner, and then go back through the beam splitter. The best case for the beam splitter is that it is 50/50 (50% reflective, 50% transmissive; it could be other ratios, but this is the best case), so after reflecting once and passing through once, the light is cut down to at best 0.5 x 0.5 = 25%. The curved combiner might also be 50/50, so then you are down to 12.5% of the display light, less other losses. With everything being 50/50, the real-world light passes through two of these, so it is cut down to 25%, AND this is before you put on an outer shield! If you bias in favor of the “real world” light, you would make the beam splitter better in transmission and worse in reflection (say, 20% reflective and 80% transmissive, but then you get 0.8 x 0.2 = 16% of the display light through) and then make the curved combiner more see-through, but then you are throwing away even more display light. The point is that no matter how you play with the ratios, the display light to the eye is cut way down, which is why you have to darken the real world so much.
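          The throughput cases above can be sketched as follows (illustrative ratios only; real coatings have additional losses):

```python
# Display vs. real-world light throughput for a beam-splitter plus curved
# partial-mirror combiner (losses other than the splitting ratios ignored).
def display_path(splitter_reflect, combiner_reflect):
    # Display light: reflect off the splitter, reflect off the combiner,
    # then transmit back through the splitter.
    return splitter_reflect * combiner_reflect * (1 - splitter_reflect)

def world_path(splitter_reflect, combiner_reflect):
    # Real-world light: transmit through the combiner, then the splitter.
    return (1 - combiner_reflect) * (1 - splitter_reflect)

# 50/50 splitter and 50/50 combiner:
print(display_path(0.5, 0.5))  # 0.125 -> 12.5% of display light
print(world_path(0.5, 0.5))    # 0.25  -> 25% of real-world light
```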

          I wouldn’t get too excited about “tier one” comments. This could be a defense contractor running experiments or an R&D group at a big company. ODG is not a tier one company by any stretch of the imagination.

          1. Hi Karl
            Thanks for your insights. The Sony ECX335A is listed at 200 nits. Do you think 200 nits matches the brightness in the ODG video you discussed earlier? https://www.youtube.com/watch?v=7Upy4nSxzbs
            Can you estimate what brightness would be needed to produce this video?

            eMagin has a WUXGA XLS at 1,000 nits, and although it is 1944 x 1244, it can be run at 1920 x 1080 (but like you say, they might be too expensive, and the downgrade in resolution wouldn’t make too much sense).

          2. What I saw from ODG’s products was consistent with roughly a 200 nit display.

            You can’t really estimate brightness from the video unless you know the camera settings. I took some pictures, and by working back from the camera settings I may be able to get a very rough approximation (within 2x, because of the logarithmic nature of the calculations and the fact that I didn’t have the best setup).

            From what I have heard, the eMagin “1000 nits” display is still not in production.

  1. Did you get to try the R9 at CES? Any thoughts?

    Also did you see the wide FoV Lumus optical system? Would really love to hear your thoughts on that vs this…

        1. This is an apples-to-oranges comparison. You have to know what you are going to be doing with the glasses. ODG is better for watching a movie (but not as good as watching a TV), but it has VERY little light throughput for the real world, and I think it will be very uncomfortable if worn for long periods. Lumus is just a component supplier.

          Hololens is OK but does not thrill me. It has pretty good tracking but the image quality is the poorest of the ones you mentioned. Vuzix is more aimed at doing everyday work and so on.

          I don’t think people “get” that there are different market requirements, including price; no one is “best.” If you are walking around a factory, you might require 90% transparency, in which case most of the above would fail. One of the articles I am planning will discuss that there is no one “AR” market, only a series of different “AR” applications.

          1. Karl,

            Here is a paper with some interesting optics.

            Only a 30 degree FOV, but otherwise claiming good optical qualities:

            Optical design for a see-through head-mounted display with high visibility

            Abstract
            An optical design for a see-through head-mounted display with high visibility is described. This design, which is based on a light guide plate with triangular microstructures, can overcome the limitations of the balance in optical efficiency. The optical efficiency of the virtual image can be larger than the optical efficiency of the real image, which ensures high visibility. The visibility can be increased to 22 with this design, much higher than the visibility possible with previous designs for a see-through head-mounted display, making this suitable for daily application.

            Table 1 shows them using an eMagin display – eMagin SVGA + (Rev3) OLED-XL

            https://www.osapublishing.org/oe/fulltext.cfm?uri=oe-24-5-4749&id=336720

          2. Thanks, I took a quick look (I’m busy working on my CES articles) and it appears to be a bit of a different approach. What I don’t understand is what the units are for “visibility can be increased to 22.” I would assume this is 22%, which is not that impressive. I would also wonder what distortion it causes and how much the FOV can be grown.

          3. I went back and took another look at the article. Figure 6 clears up my issue with how see-through it is, based on the pitch and size of the microstructure. If we take, say, a 200um pitch and a 50um microstructure, it gives that the “real world” will be about 44% see-through and about 21% of the OLED light will make it toward the eye. This is interesting, but nothing out of the ordinary. An interesting metric is to sum the real-world % plus the image % (in this case about 44+21=65%). Most combiners are well below 100%, and often, as this one does, you reduce the sum if you try to improve one (real or image) at the expense of the other.

            Simple beam splitters can get in the 95+% range and you can dial in the amount for each by how you coat them, but they are not thin.

    1. I meant nits (and have just corrected it). How did you make this conversion from 200 lumens to 2150 nits?

      Assuming they are using a Sony 1080p microdisplay with a 1.8cm diagonal and assuming that the OLED emits uniformly over a hemisphere (it is not perfect, but OK for this approximation) I get:
      200 nits * PI = 628 LUX (lumens per square meter). The area of the panel is about 0.00014 square meters. So the panel is only putting out about 628 x 0.00014 = 0.088 lumens, or a little less than 1/10th of a lumen.

      Or there are roughly 2,300 nits per lumen in this example (factoring in the size of the display and assuming a uniform hemispheric light distribution).
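      A quick sanity check of this conversion in code (panel dimensions are approximated from the ~1.8cm diagonal and a 16:9 aspect ratio; uniform hemispheric emission assumed):

```python
import math

# Nits-to-lumens estimate for a small OLED microdisplay, assuming a
# uniform hemispheric (Lambertian) emission pattern.
nits = 200.0
diag_m = 0.018                       # ~1.8 cm diagonal, 16:9 panel
w = diag_m * 16 / math.hypot(16, 9)  # panel width in meters
h = diag_m * 9 / math.hypot(16, 9)   # panel height in meters
area_m2 = w * h                      # ~0.00014 square meters

lux = nits * math.pi                 # ~628 lumens per square meter
lumens = lux * area_m2               # ~0.087 lumens total
print(round(lumens, 3), round(nits / lumens))  # ~0.087 lm, ~2,300 nits/lumen
```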

      This also shows why all the waveguide people and those with other see-through displays generally use LCOS or DLP projector engines capable of many lumens. They want to have, say, 80-90% real-world light transmission (so 20-10% or less of the display light gets out) and 3,000 nits for outdoor use. Another big advantage of LCOS and DLP is that the illumination light is highly collimated, so they can trade eye-box for nits to the eye (they can do some of this with OLEDs as well, with optics and enhancement films).

  2. I don’t quite follow the various sources citing 22:9 aspect ratio on the R9, considering it’s two 1920×1080 displays. I’m not an optics expert, is there something fundamental I’m missing? Are they talking about the stereo overlap aspect ratio, or do they have rectangular pixels, or can there be a difference between the display aspect ratio and the curved combiner aspect ratio?

    1. You are not confused, the sources are confused by ODG’s marketing. There is no (significant) pixel aspect ratio distortion. They have two 1080p 16:9 displays that are visually overlapped to give 3-D stereo so the aspect ratio is still 16:9.

      The way they get to 22:9 is to CROP/CUT-OFF/letterbox. The display did not get wider; it got shorter. But 22:9 sounds so much better (and bigger) than 16:6.55 (approx.), which is the same ratio.

      Some articles like techcrunch quote both “Cinema wide (22:9) or 16:9 aspect ratios” https://techcrunch.com/2017/01/03/odg-unveils-its-first-consumer-ar-glasses-built-on-qualcomms-snapdragon-835-chip/

      There is a LOT of marketing misdirection going on. Prior to CES, most companies quoted their horizontal FOV, but at CES everyone was quoting their diagonal FOV, which is about 1.15X bigger with a 16:9 display. You need to count your fingers after some of the marketing spiels you get. Unfortunately, most articles just repeat whatever is told to them without applying any critical thinking.
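      Both bits of marketing math can be checked numerically (a sketch; the “crop” assumes the full 1920-pixel width is kept):

```python
import math

# 22:9 "cinema wide" from a 16:9 panel: keep the width, crop the height.
width_px = 1920
cropped_height = width_px * 9 / 22       # only ~785 of the 1080 rows used
print(round(cropped_height))             # ~785 pixels tall

# Equivalent "16:x" ratio -- same shape, a less impressive-sounding number.
print(round(16 * 9 / 22, 2))             # ~6.55, i.e. 22:9 == 16:6.55

# Diagonal vs. horizontal FOV for a 16:9 display (small-angle approximation):
print(round(math.hypot(16, 9) / 16, 3))  # ~1.147x bigger quoted FOV number
```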

  3. Karl, take a look at the Zeiss patent if you can find time. Does this design lend itself more toward reflective DLP or LCoS? You’ve mentioned before that OLED isn’t compatible with waveguides; would it be compatible with Zeiss’s design?

      1. On the surface it looks like they have a big mirror matrix and then something like Lumus’s prisms to make the light exit.

        They seem to be suggesting OLED microdisplays or LCOS (such as Himax front-lit), which are both specifically mentioned. I don’t think OLEDs are going to be bright enough for see-through AR use unless they block a lot of light, as ODG does. With a prism exit display, image brightness is an even bigger issue.

        I have written that OLED is not compatible with diffractive waveguides (although some people claim theirs will work). OLEDs will work with “prism” waveguides like Lumus’s, but the issue is that a lot of light is lost, so they need a very bright display device, which is where OLED microdisplays are limited.

      2. Zeiss is using an 800 x 600 OLED with their Smart Optics optical system.

        “The 800×600 pixel image of an OLED display is guided through a free-form prism into the right spectacle lens, where it is reflected in total reflection in a zigzag between the front and the back of the glass, and finally directed to a Fresnel structure. ”

        https://www.heise.de/newsticker/meldung/MWC-2016-Prototyp-einer-Smartbrille-von-Zeiss-3114053.html

        Currently Sony and MicroOLED do not make this size.

        Olightek makes an 804 x 604 but has the lowest brightness.

        eMagin makes an 800 x 600

        http://1103zzeykfq2w1en1xnkh94yk.wpengine.netdna-cdn.com/wp-content/uploads/2016/05/DSVGA-Color-XL-Datasheet-PU-D11-501447-02.pdf

        Furthermore, this article about Zeiss Smart Optics from Jan 2016 had this to say:

        “The display’s shortcomings in bright light should be helped, if not entirely remedied, by much brighter OLED displays coming later this year.”

        And then this high-brightness PR from eMagin in June 2016:

        “Immediately following the presentation of the paper, eMagin demonstrated for the first time in public a direct patterned OLED microdisplay that can reach a maximum luminance of 4,500 nits with vivid colors and in full video mode.”

        http://www.businesswire.com/news/home/20160602005730/en/eMagin-Announces-Public-Demonstration-Ultra-High-Brightness-Direct

        As we know, it has been rumored that Zeiss is working with Apple on an AR device.

        https://www.cnet.com/news/apple-said-to-be-working-on-ar-glasses-with-carl-zeiss-augmented-reality/

        Just sayin’.

        1. Interesting information. It is very different making a prototype than a production product.

          At CES I was asking around about OLEDs and nobody seemed to take eMagin very seriously. Sony was the “go to” OLED display due to cost and manufacturing capability. They seem to treat the high brightness OLEDs from eMagin as being more lab prototypes than production products; I don’t know if this is cost, availability, or lifetime issues.

          I’m trying to understand what is going on. Production OLEDs are in the 200 nits range or less than 1/20th the brightness of what eMagin has claimed. Having 4,000+ nits dramatically changes where OLEDs can be used.

          Note that even 4,000 nits at the display is not enough for highly transparent displays used outdoors; in this case they want to be 90% transparent, which usually means losing 90% of the display light, and they want 3,000+ nits, so they need to start with over 30,000 nits. This is why the highly transparent displays use LCOS and DLP, where they can crank up the illumination.

          A word of caution on the Apple/Zeiss rumor. I don’t know anything good or bad, but just remember that Apple is looking at a lot of things, and the vast majority of them will never see the light of day.

          1. Thanks for the reply; however, it’s likely that what is demonstrated is only a glimpse of what goes on behind closed doors.

            Elbit is currently working with the FAA on new rules for commercial aviation HMDs. Once completed, their SkyLens will be the first certified. Although it’s only monochrome green, only fighter pilots would have a more demanding need for brightness.

            http://elbitsystems.com/media/Skylens_Aviation_Award_2016.pdf

            Here is the consumer version, currently in pilot testing.

            https://everysight.com

            Speaking of military avionics, here is a test flight with OLED microdisplays, likely eMagin, working their way up the food chain. Again, probably monochrome green.

            https://www.youtube.com/watch?v=sBqAbbFJph0

            Furthermore, eMagin is under a 30-month contract to develop a 10,000 nit full-color display. I suspect 4,500 nits is just a benchmark.

            http://ir.emagin.com/mobile.view?c=96135&v=203&d=1&id=1983021

  4. Hi Karl, thank you for the comprehensive review, very impressive!
    I couldn’t really understand whether ODG’s R-7 uses LCOS or OLED as a display.
    Their website indicates all their products are based on OLED, but you suggested that the R-7 used LCOS.
    Did they declare that, or did you conclude it, and if so, can you explain what you are basing it on?

    1. The new R-8 and R-9 are OLED based, but the older R-7 by all reports used LCOS. Take for example https://www.anandtech.com/show/8545/hands-on-with-odgs-r7-augmented-reality-glasses and quoting: “This wearable has a Qualcomm Snapdragon 805 SoC running at 2.7 GHz, with anywhere between one to four gigabytes of RAM and 16 to 128 gigabytes of storage. There are two 720p LCoS displays that run at 100 Hz refresh rate, which means that the display is see-through.” There are many other references to the R-7 being LCOS.
