Navdy Launches Pre-Sale Campaign Today

Bring Jet-Fighter Tech to Your Car with Navdy

It's LAUNCH day for Navdy as our pre-sale campaign starts today. You can go to the Navdy site to see the video. It was a little over a year ago that Doug Simpson contacted me via this blog asking how to make an aftermarket heads-up display (HUD) for automobiles. We went through an incubator program called Highway1, sponsored by PCH International, that I discussed in my last blog entry.

The picture above is a "fancy marketing image" that tries to simulate what the eye sees (which, as it turns out, is impossible to do with a camera). We figured out how to do some pretty interesting stuff, and the optics work better than I thought was possible when we started. The image focuses beyond the "combiner/lens" to help the driver see it in their far vision, and it is about 40 times brighter than an iPhone (for use in bright sunlight) while being very efficient.

Navdy Office

Being CTO at a new start-up has kept me away from this blog (a start-up is very time consuming). We have raised some significant initial venture capital to get the program off the ground, and the pre-sale campaign takes it to the next level to get products to market. In the early days it was just me and Doug, but now we have about a dozen people and growing.

Karl

Highway1 Incubator

Highway1

Those who follow my blog are probably wondering what has happened to me these past months. I have been away from home for most of the last 4 months at an "incubator" program for start-ups called Highway1. Navdy, for which I recently became CTO, was selected as one of 11 companies from over 100 applicants for the very first class of the Highway1 program sponsored by PCH International.

What makes Highway1 different from almost all other incubator programs these days is that it is totally focused on helping hardware start-ups. Highway1 recognizes that hardware start-ups have special needs, are more difficult to get started, and have to deliver a physical product, unlike software companies.

The Highway1 office, where most of the time is spent, is in the Mission District of San Francisco, but the program also includes spending two weeks in Shenzhen, China, where many of the electronic products used around the world are made. During the program, companies are introduced to mentors from other companies and experts in the field, as well as helped with introductions to angel and venture investment firms.

While in Shenzhen, the companies were introduced to manufacturers who could eventually be making their products. Additionally, our company received some very crucial support from PCH in Shenzhen in locating a company that could manufacture a critical component of our system.

Along the way, the people at the 11 companies became friends and helped each other out. Respecting each other was particularly important as the companies cranked out prototypes while sharing first one and later two 3-D printers (as demo day neared, those 3-D printers were running pretty much non-stop). There was some incredible technical, marketing, and business talent at these companies.

At the end of the program was "Demo Day," where more than 200 venture capitalists, investors, press, and technologists packed a large room at PCH's U.S. headquarters in San Francisco. It was a chance for investors and the press to see what the companies had developed. While Navdy presented, details of our product and plans were not released to the press because we are planning on launching our product later this year. Navdy did receive serious interest from a number of VCs with our demo after the formal presentations.

The whole Highway1 program was the dream of Liam Casey, the founder and CEO of PCH, a company with over $700M in revenue. You may not know the PCH name, but it is very likely that you have brand-name products that they helped get to your home or office (be it anywhere in the world). Liam was personally there to greet us at the beginning of the program and at key points along the way, and he told some great business stories. The whole of the PCH team, be it the people from San Francisco, China, or Ireland, were always awesome to work with and incredibly nice, reflecting PCH's founder.

Comment: I don’t usually use the word “awesome” but the word was ubiquitous in San Francisco and it seemed to fit the people at PCH.

“If you haven’t tested it, it doesn’t work”

Derek Roskell (circa 1994) of TI MOS Design Bedford, UK (Formal Photo – not how I remember him)

When I started this blog, I intended to write about more than displays and include some of my personal IC history. Today's story is about Derek Roskell of Texas Instruments, who led the UK-based design teams I worked with between 1979 and 1997 on a number of the most complex ICs done up to that point, including the 9995 16-bit CPU, the 34010 and 34020 graphics CPUs, and the extremely complex 320C80 and 320C82 image processors, each with a 32-bit RISC CPU and 4 (C80) or 2 (C82) advanced DSP processors on one chip. Every one of these designs quickly went from first silicon to product.

Having one successful design after another may not seem so special in today's era of logic synthesis and all the other computer tools, but back in 1979 we drew logic on paper and transistors on sheets of frosted Mylar plastic with colored pencils, which were then digitized by hand. We then printed out large "composite" plots on giant flat-bed pen plotters (with each layer of the IC in a different color) and verified all the circuitry by hand and eye (thank goodness by the mid-1980s we got computer schematic verification).

In those days it all could go very wrong, and it did for a 16-bit CPU called the 9940 and a spinoff version, the 9985, that were designed in Houston, Texas in 1977-1978. It went so badly that both the 9940 and 9985 were never fully functional, causing the designer to be discredited (whether at fault or not) and many people to leave.

In the wake of the 9940/9985 disaster, in 1979 management picked me, the young hotshot only 1.5 years out of college, to lead the architecture and logic design of a new CPU, the TMS9995, to replace the failed TMS9985. There was one hitch: they wanted to use a TI design group in Bedford, England. So after some preliminary work, I packed up for a 6-month assignment in Bedford, where I first met Derek Roskell.

Derek more “In Character” but taken years later

To say Derek is self-deprecating is a gross understatement. The U.S. managers at TI at the time were more the self-assertive, aggressive, "shoot from the hip," corner-cutting type (the approach that resulted in the 9940/9985 debacle) and generally didn't take well to Derek's "English working class" (said with great affection) style with the all too frequent laugh at the "wrong" time.

When I first met Derek he was this "funny old guy" who had worked on "ancient" TTL technology. He was around 40 and seemed like an old man in a world of engineers in their 20s and early 30s whom he led. As it turned out, Derek was the steady hand that guided a number of brilliant people who worked under him. He made sure my "brilliant" architecture and logic design actually worked. You don't have one successful design after another, particularly back then, by accident.

Upper management was always pressuring us to get things done faster, which could only be accomplished by cutting corners. They called Bedford a "country club" for resisting the pressure. Derek was willing to take the heat and do things the "right way" because he understood the consequences of cutting corners.

For most engineers the fun part of engineering is doing the original design work. That is the "creative stuff" and the stuff that gets you noticed. Also, most engineers have big egos and think, "of course what I designed works." But when you are designing these massive ICs with hundreds of thousands and later millions of transistors, even if 99.99% of the design is correct, there will be a hopeless number of errors to debug and correct. Most of what it takes to make sure a design works is the tedious process of "verification."

A couple of months back I had a small reunion in Bedford with some friends from the old days, including Derek. Everyone remembered Derek for one thing he constantly chided the designers with: "If you haven't tested it, it doesn't work." Pretty good advice.

Epilog

TI, like most companies today in their search for "shareholder value," closed the large Bedford UK site around 1995 but kept the Bedford MOS designers, who had so many proven successes, and moved them to a rented building in Northampton. Through the years TI kept "consolidating/downsizing," and finally in 2011 it shut down the last vestiges of its design operation in England, losing a number of extremely talented (and by then) senior people.

Below is a picture taken of the design team in Bedford that worked with me on the 320C80.

320C80 Bedford Design Team (1994)

Whatever happened to pico projectors embedded in phones?

Back around 2007, when I was at Syndiant, we started looking at the pico projector market. We talked to many of the major cell phone companies as well as a number of PC companies, and almost everyone had at least an R&D program working on pico projectors. Additionally, there were market forecasts for rapid growth of embedded pico projectors in 2009 and beyond. This convinced us to develop small liquid crystal on silicon (LCOS) microdisplays for embedded pico projectors. With so many companies saying they needed pico projectors, it seemed like a good idea at the time. How could so many people be wrong?

Here we are 6 years later, and there are almost no pico projectors embedded in cell phones, or much else for that matter. So what happened? Well, just about the same time we started working on pico projectors, Apple introduced their first iPhone. The iPhone overnight roughly tripled the screen size of a smartphone such as a Blackberry. Furthermore, Apple introduced ways to control the screen (pinch/zoom, double-clicking to zoom in on a column, etc.) to make better use of what was still a pretty small display. Then, to make matters much worse, Apple introduced the iPad, and the tablet market took off almost instantaneously. Today we have larger phones, so-called "phablets," and small tablets filling in just about every size in between.

Additionally, as I have written about before, the use model for a cell phone pico projector shooting onto a wall doesn't work. There is rarely, if ever, a dark enough place with a convenient surface that will work well as a screen.

I found that to use a pico projector I had to carry a screen with me (at least a white piece of paper mounted on a stiff board in a plastic sleeve to keep it clean and flat). Then you have the issue of holding the screen up so you can project on it, and then finding a dark enough place that the image looks good. By the time you carry a pico projector and a screen with you, a thin iPad/tablet works better: you can carry it around the room with ease, and you don't need a very dark environment.

The above is the subjective analysis; the rest of this article will give some more quantitative numbers.

The fundamental problem with a front projector is that it has to compete with ambient light, whereas flat panels have screens that generally absorb 91% to 96% of the ambient light (thus they look dark when off). While display makers market contrast numbers, those very high contrast numbers assume a totally dark environment; in the real world what counts is the net contrast, that is, the contrast factoring in ambient light.

Displaymate has an excellent set of articles (including SmartPhone Brightness Shootout, Mobile Brightness Shootout 2, and Smartphone Shootout 2) on the subject of what they call "Contrast Rating for High Ambient Light" (CRHAL), which they define as the display brightness per unit area (in candelas per square meter, also known as "nits") divided by the percentage of ambient light the display reflects.

Displaymate's CRHAL is not a "contrast ratio," but it gives a good way to compare displays in reasonable ambient light. Also important is that for a front projector it does not take much ambient light to dominate the contrast. For a front projector, even dim room light is "high ambient light."

The total light projected out of a projector is given in lumens, so to compare it to a cell phone or tablet we have to know how big the projected image will be and the type of screen. We can then compute the reflected light in "nits" using the formula: nits (candelas/m²) = G × (lumens/m²)/π, where G is the gain of the screen and π ≈ 3.1416. If we assume a piece of white paper with a gain of 1 (about right for good printer paper), then all we have to do is calculate the screen area in square meters, divide the lumens by the area, and divide by π.

A pico projector projecting a 16:9 (HDTV aspect ratio) image onto a white sheet of notebook paper (with a gain of, say, 1) results in an 8.8-inch by 5-inch image with an area of 0.028 m² (about the same area as an iPad2, which I will use for comparison). Plugging a 20-lumen projector into the equation above with a screen of 0.028 m² and a gain of 1.0, we get 227 nits. The problem is that the same screen/paper will also reflect (diffusely) about 100% of the ambient light. Using Displaymate's CRHAL we get 227/100 = 2.27.

Now compare the pico projector numbers to an iPad2 of the same display area, which according to Displaymate has 410 nits and reflects only 8.7% of the ambient light. The CRHAL for the iPad2 is 410/8.7 = 47. What really crushes the pico projector, by about 20 to 1 on the CRHAL metric, is that the flat panel display reflects less than a tenth of the ambient light, whereas the pico projector's image has to fight with 100% of the ambient light.
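
For those who want to check the arithmetic, here is a minimal sketch of the calculation above in Python. The 0.028 m² area, gain of 1, 20 lumens, and Displaymate's iPad2 numbers are all taken from the text; the function names are just mine for illustration.

```python
from math import pi

def projected_nits(lumens, area_m2, gain=1.0):
    # Reflected luminance of a front-projected image:
    # nits (cd/m^2) = gain * (lumens per square meter) / pi
    return gain * (lumens / area_m2) / pi

def crhal(nits, reflectance_pct):
    # Displaymate's "Contrast Rating for High Ambient Light":
    # display luminance divided by the percentage of ambient light reflected
    return nits / reflectance_pct

area = 0.028  # m^2: a 16:9, 8.8 x 5 inch image, about the area of an iPad2 screen

pico = projected_nits(20, area)  # 20-lumen pico projector on gain-1 paper
print(f"pico projector: {pico:.0f} nits, CRHAL = {crhal(pico, 100):.2f}")  # ~227 nits, ~2.27
print(f"iPad2:          410 nits, CRHAL = {crhal(410, 8.7):.0f}")          # ~47
```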

In terms of contrast, to get a barely "readable" B&W image you need at least 1.5:1 contrast (the "white" needs to be 1.5 times brighter than the black) and preferably more than 2:1. To have moderately good (but not great) colors you need 10:1 contrast.

A well-lit room has about 100 to 500 lux (see Table 1 at the bottom of this article) and a bright "task area" up to 1,500 lux. If we take 350 lux as a "typical" room, then there are about 10 lumens of ambient light falling on the 0.028 m² sheet-of-paper image used above. Thus our 20-lumen projector on top of the 10 lumens of ambient light has a contrast ratio of 30/10, or about 3 to 1, which means the colors will be pretty washed out but black-on-white text will be readable. To get reasonably good (but not great) colors with a contrast ratio of 10:1 we would need about 80 lumens. By the same measure, the iPad2 in the same lighting would have a contrast ratio of about 40:1, or over 10x the contrast of a 20-lumen pico projector. And the brighter the lighting environment, the worse the pico projector compares. Even if we double or triple the lumens, the pico projector can't compete.

With the information above you can plug in whatever numbers you want for brightness and screen size, and no matter what reasonable numbers you use, you will find that a pico projector can't compete with a tablet even in moderate lighting conditions.
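
Here is a small sketch you can use to plug in your own numbers, under the same assumptions as above (a gain-1 screen, 0.028 m² image, 350-lux room); the lumen values in the loop are just examples.

```python
def net_contrast(projector_lumens, ambient_lux, area_m2):
    # White = projector light plus reflected ambient; black = reflected ambient alone.
    # On a gain-1 screen the pi and area terms cancel out of the ratio,
    # so we can compare lumens directly: ambient lumens on the image = lux * area.
    ambient_lumens = ambient_lux * area_m2
    return (projector_lumens + ambient_lumens) / ambient_lumens

area = 0.028  # m^2 image, same as above
for lumens in (20, 40, 80):
    print(f"{lumens} lumens in a 350-lux room -> {net_contrast(lumens, 350, area):.1f}:1")
# 20 lumens -> ~3:1 (washed-out color); ~80-90 lumens -> roughly 10:1 (decent color)
```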

And all this is before considering the power consumption and space a pico projector would take. After working on the problem for a number of years, it became clear that rather than adding a pico projector with its added battery, phone makers would be better off just making the display bigger (a la the Galaxy S3 and S4, or even the Note). The microdisplay devices created would have to look to other markets, such as near-eye (for example, Google Glass) and automotive heads-up displays (HUD).

Table 1.  Typical Ambient Lighting Levels (from Displaymate)

Brightness Range        Description

0 – 100 lux             Pitch black to dim interior lighting
100 – 500 lux           Residential indoor lighting
500 – 1,500 lux         Bright indoor lighting: kitchens, offices, stores
1,000 – 5,000 lux       Outdoor lighting in shade or an overcast sky
3,000 – 10,000 lux      Shadow cast by a person in direct sunlight
10,000 – 25,000 lux     Full daylight not in direct sunlight
20,000 – 50,000 lux     Indoor sunlight falling on a desk near a window
50,000 – 75,000 lux     Indoor direct sunlight through a window
100,000 – 120,000 lux   Outdoor direct sunlight

Himax FSC LCOS in Google Glass — Seeking Alpha Article

Catwig to Himax Comparison

This blog was the first to identify that there was a Himax panel in an early Google Glass prototype and the first to identify that there was a field sequential color LCOS panel inside Google Glass. Given that connection, it was reasonable to speculate that Himax was inside the newer Google Glass, but there was no proof.

Then, when Catwig published a teardown of Google Glass last week (and my inbox lit up with people telling me about the article), there were no Himax logos to be seen, which started people wondering if there was indeed a Himax display inside. As a result of my prior exclusive finds on Himax, LCOS, and Google Glass, I was asked to contribute to Seeking Alpha, and I just published an article that details my proof that there is a Himax LCOS display inside the current Google Glass. In that article, I also discounted some recent speculation that Google Glass is going to use a Samsung OLED microdisplay anytime soon.

Extended Temperature Range with LC Based Microdisplays

Extreme Car Temperatures

A reader, Doug Atkinson, asked a question about meeting extended temperature ranges with LC-based microdisplays, particularly with respect to Kopin. He asked the classic "car dash in the desert and the trunk in Alaska" question. I thought the answer would have broader interest, so I decided to answer it here.

Kopin wrote a good paper on the subject in 2006, titled "A Normally Black, High Contrast, Wide Symmetrical Viewing Angle AMLCD for Military Head Mounted Displays (HMDs) and Other Viewer Applications." This paper is the most detailed one readily available describing how Kopin's transmissive panels meet military temperature and shock requirements. It is not clear that Kopin uses this same technology for their consumer products, as the paper specifically addresses what Kopin did for military products.

With respect to LC microdisplays in general, it should be realized that in most cases there is not a huge difference between the technical specs of the liquid crystals used in most small-panel microdisplays and those used in large flat panels. They often just use different "blends" of very similar materials. There are some major LC type differences, including TN (twisted nematic), VAN (vertically aligned nematic), and others. Field sequential color designs are biased toward faster-switching LC blends.

In general, anywhere a large flat-panel LC can go, a microdisplay LC can go. The issue is designing the seals and other materials/structures to withstand the temperature cycling and mechanical shock, which requires testing, experimentation, and development.

The liquid crystals themselves generally go through different phases, from freezing (which is generally fatal) to heating up to the "clearing point" where the display stops working (but is generally recoverable). There is also a different spec for "storage temperature range" versus "operating temperature range." Generally it is assumed the device only has to work in a temperature range in which a human could survive.

At low temperature the LC gets "sluggish" and does not operate well, but this can be cured by various "heater mechanisms," including heating mechanisms designed into the panel itself. The liquid crystal blends are often designed/picked to work best at a higher temperature range because it is easier to heat than to cool.

Field sequential color LCOS is more affected by temperature change because temperature affects not only the LC characteristics but also the switching speed. Once again, this can be dealt with by designing for the higher temperature range and then heating if necessary.

As far as Kopin's "brightness" goes (another of Doug's questions), a big factor is how powerful/bright the backlight has to be. The Kopin panel blocks something like 98.5% of the light by their own specs. What you can get away with in a military headset is different from what you may accept in a consumer product in terms of size, weight, and power consumption. Brightness in daylight is a well-known (inside the industry) issue for Kopin's transmissive panels and one reason that near-eye display makers have sought out LCOS.

[As an aside for completeness about FLC] Displaytech (which was sold to Micron and then to Citizen Finetech Miyota) and Forth Dimension Displays (FDD, which Kopin bought) both use ferroelectric LC (FLC/FLCOS), which does have a dramatically different temperature profile: it comes very near "freezing" (going into a solid state) a little below 0°C, which would destroy the device. Displaytech claimed (I don't know about FDD) that they had extended the low temperature range, but I don't know by how much. The point is that the temperature range of FLC is so different that meeting military specs is much more difficult.

AR Display Device of the Future: Color Filter, Field Sequential, OLED, LBS, or Other?

I'm curious what people think will be the near-eye microdisplay of the future. Each technology has its own drawbacks and advantages that are well known. I thought I would start by summarizing the various options:

Color filter transmissive LCD – large pixels with 3 sub-pixels that let through only 1% to 1.5% of the light (depending on pixel size and other factors). Scaling down is limited by the colors bleeding together (LC effects) and by light throughput. Low power to the panel but very inefficient use of the illumination light.

Color filter reflective (LCOS) – same as CF-transmissive, but the sub-pixels (color dots) can be smaller; scaling is still limited by the need for 3 sub-pixels and by color bleeding. Light throughput on the order of 10%. More complicated optics than transmissive (requires a beam splitter), but shares the low power to the panel.

Field Sequential Color (LCOS) – color breakup from sequential fields (the "rainbow effect"), but the pixels can be very small (less than 1/3rd the size of color filter pixels). Light throughput on the order of 40% (assuming a 45% loss in polarization). Higher power to the panel due to changing fields. Optical path similar to CF-LCOS, but taking advantage of the smaller size requires smaller but higher-quality (higher MTF) optics. Potentially mates well with lasers for a very large depth of focus, so that the AR image is in focus regardless of where the user's eyes are focused.

Field Sequential Color (DLP) – color breakup from FSC, but it can go to higher field rates than LCOS to reduce the effects. The device and control are comparatively high powered, with a larger optical path. The pixel size is bigger than FSC LCOS due to the physical movement of the DLP mirrors. Light throughput on the order of 80% (no polarization losses), but it falls as the pixel gets smaller (the gap between mirrors is bigger than in LCOS). I am not sure this is a serious contender due to cost, power of the panel/controller, and optical path size, and nobody I know of has used it for near eye, but I listed it for completeness.

OLED – larger pixels due to the 3 color sub-pixels. It is not clear how small this technology will scale in the foreseeable future. While OLED is improving, progress has been slow; it has been the "next great near-eye technology" for 10 years. It has a very simple optical path and potentially high light efficiency, which has made it seem to many like the technology with the best future, but it is not clear how it scales to very small sizes and higher resolution (the smallest OLED pixel I have found is still about 8 times bigger than the smallest FSC LCOS pixel). Also, its light is very diffuse, and therefore the depth of focus will be low.

Laser Beam Steering – while this one sounds good to the ill-informed, the need to precisely combine 3 separate laser beams tends to make it not very compact, and it is ridiculously expensive today due to the special (particularly green) lasers required. Similar to field sequential color, there are breakup effects from having a raster scan (particularly with no persistence, unlike a CRT) on a moving platform (as in a head-mounted display). While there are still optics involved to produce an image on the eye, it could have a large depth of focus. There are a lot of technical and cost issues that keep this from being a serious alternative any time soon, but it is in this list for completeness.

I found it particularly interesting that Google's early prototype used a color filter LCOS and then they switched to field sequential LCOS. This seems to suggest that they chose size over the issues with field sequential color breakup. With the technologies I know of today, this is the trade-off for any given resolution: field sequential LCOS pixels are less than 1/3rd the size (typically closer to 1/9th the size) of any of the existing 3-color devices (color filter LCD/LCOS or OLED).

Olympus MEG4.0 – Display Device Over Ear

It should also be noted that in an HMD, an extreme "premium" is put on size and weight in front of the eye (weight in front of the eye creates a series of ergonomic and design issues). This can be mitigated by using light guides to bring the image to the eye and locating a larger/heavier display device and its associated optics in a less critical location (such as near the ear), as Olympus has done with their MEG4.0 prototype (note, Olympus has been working at this for many years). But doing this has trade-offs in optics and cost.

Most of this comparison boils down to size versus field sequential color versus color sub-pixels.    I would be curious what you think.

Kopin Displays and Near Eye (Followup to Seeking Alpha Article)

Kopin’s smallest transmissive color filter pixel is bigger than nine of the smallest field sequential color LCOS pixels

After posting my discovery of a Himax LCOS panel on a Google Glass prototype, I received a number of inquiries about Kopin, including a request from Mark Gomes of Seeking Alpha to give my thoughts about Kopin, which were published in "Will Kopin Benefit From the Glass Wars?" In this post I am adding more information to supplement what I wrote for the Seeking Alpha article.

First, a little background on their CyberDisplay® technology would be helpful. Back in the 1990s Kopin developed a unique "lift-off" process to transfer transistors and other circuitry from a semiconductor IC onto a glass plate to make a transmissive panel, which they call the CyberDisplay®. Kopin's "lift-off" technology was amazing for that era. It allowed Kopin to put very small (for its day) transistors on glass to enable small transmissive devices that were used predominantly in video and still camera viewfinders. The transmissive panel has 3 color dots (red, green, blue) that produce a single color pixel, similar to a large LCD screen only much smaller. In the late 1990s Kopin could offer a simple optical design with the transmissive color panel that was smaller than existing black-and-white displays using small CRTs. This product was very successful for them, but it has become a commoditized (cheap) device these many years later.

CyberDisplay pixel is large and blocks 98.5% of the light

While the CyberDisplay let Kopin address the market for what are now considered low-resolution displays cost effectively, the Achilles' heel of the technology is that it does not scale well to higher resolution because the pixels are so large relative to other microdisplay technologies. For example, Kopin's typical transmissive panel pixel is 15 by 15 microns and is made up of three 5 by 15 micron color "dots" (as Kopin calls them). What makes matters worse is that even these very large pixel devices have an extremely poor light throughput of 1.5% (they block 98.5% of the light), and scaling the pixel down will block even more light!

While not listed on the website (but included in a news release), Kopin has an 8.7 by 8.7 micron color filter pixel (which I suspect is used in their Golden-i head-mounted display), but it blocks even more light than the 15×15 pixel, as throughput falls as the pixel gets smaller. Also, to be fair, there are CyberDisplay pixels that block "only" 93.5% of the light, but they give up contrast and color purity in exchange for light throughput, which is not usually desirable.

There are many reasons why the transmissive color filter panel's light throughput is so poor. To begin with, the color filters themselves block more than 2/3rds of the light (each filter blocks the other primary colors, plus there are other losses). Because the panel is transmissive, the circuitry and the transistor that control each pixel also block light, which becomes significant as the pixel gets small.

But perhaps the biggest factor (and the most complex to understand, so I will only touch on it here) is that the electric field controlling the liquid crystal for a given color dot extends into the neighboring color dots, causing the colors to bleed together and lose all color saturation/control. To reduce this problem they can use liquid crystal materials with lower light throughput that are less susceptible to the neighboring electric fields, and use black masks (which block light) surrounding each color dot to hide the area where the colors bleed together.

Field Sequential Color – Small Pixels and 80+% light throughput

With reflective LCOS, all the wires and circuitry are hidden behind the pixel mirror so that none of the transistors and other circuitry block the light. Furthermore, the liquid crystal layer is usually less than half as thick, which limits the electric field spreading and allows pixels to be closer together without significantly affecting each other. And of course there are no color filters wasting more than 2/3rds of the light. The downside to field sequential color is color field breakup: when the display moves quickly relative to the eye, the colors may not line up for a split second. The color breakup effects can be reduced by going to higher field rates.

Kopin's pixels are huge when compared to those of field sequential LCOS devices (from companies such as Himax, Syndiant, Compound Photonics, and Citizen Finetech Miyota), which today can easily have pixels 5 by 5 microns, with some smaller than 3 by 3 microns. Therefore FSC LCOS can have about 9 times the pixel resolution in roughly the same size device! And the light throughput of the LCOS devices is typically more than 80%, which becomes particularly important for outdoor use.
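
The 9x figure is just the ratio of pixel areas; a quick sketch of that arithmetic, using the pixel pitches and throughput numbers quoted above:

```python
# Pixel pitches quoted above, in microns
cf_pixel = 15.0   # Kopin CyberDisplay color filter pixel (three 5x15 color dots)
fsc_pixel = 5.0   # typical small FSC LCOS pixel; some are under 3x3 microns

pixels_per_cf_area = (cf_pixel / fsc_pixel) ** 2
print(pixels_per_cf_area)   # 9.0 -> ~9x the pixels (resolution) in the same panel area
print((15.0 / 3.0) ** 2)    # 25.0 -> vs. the smallest ~3x3 micron FSC pixels

# Light throughput quoted above: ~1.5% (CF transmissive) vs. ~80% (FSC LCOS)
print(0.80 / 0.015)         # ~53x more of the illumination makes it through
```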

So while a low-resolution Kopin CyberDisplay might be able to produce a low-resolution image in a headset as small as Google Glass, they would be limiting future devices to low resolution, which is not a good long-term plan. I'm guessing that the ability to scale to higher resolutions was at least one reason why Google went with a field sequential color device rather than starting with a transmissive panel, which would have at least initially been easier to design with. Another important factor weighing in favor of LCOS over a transmissive panel is the light throughput, so that the display is bright enough for outdoor use.

I don't want to be accused of ignoring Kopin's 2011 acquisition of Forth Dimension Displays (FDD), which makes a form of LCOS. This is clearly a move by Kopin into reflective FSC LCOS. It so happens that back in 1998 and 1999 I did some cooperative work with CRL Opto (which later became FDD), and they even used a design I worked on for the silicon backplane in their first product. The FSC LCOS that FDD makes is considerably different in both the design of the device and the manufacturing process required for a high-volume product.

Through FDD's many years of history (and several name changes), FDD has drifted to a high-end specialized display technology with large 8+ micron pixels. For the low-volume niche applications FDD is servicing, there was no need to develop more advanced silicon to support a very small device and drive electronics. Other companies aiming more at consumer products (such as Syndiant, where I was CTO) have put years of effort into building "smarter" silicon that minimized not only the size of the display but also the number of connection wires between the display and the controller, and reduced the controller to one small ASIC.

Manufacturing Challenge for Kopin

Cost-effectively assembling small-pixel LCOS devices requires manufacturing equipment and methods that are almost totally different from what Kopin does with their CyberDisplay, or FDD with their large-pixel LCOS. Almost every step in the process is done with an eye to high-volume manufacturing cost. And it is not as if they can just buy the equipment and be up and running; it usually takes over a year to get the yields up to an acceptable level from the time the equipment is installed. Companies such as Himax have reportedly spent around $300M developing their LCOS devices, and I know of multiple other companies having spent over $100M and many years of effort in the past.

Conclusion

For at least the reasons given above, I don't see Kopin as currently positioned to build competitive high-volume head-mounted displays that meet the future needs of the market, as I think all roads lead to higher-resolution yet small devices. It would seem to me that they would need a lot of time, effort, and money to field a long-term competitive product.

Laser Illumination Could Cause LCOS to Win Out Over OLED in Near Eye AR

The conventional wisdom is that OLEDs will eventually become inexpensive and will push out all other near-eye technologies because they will be smaller and lighter with a simple optical path. But in reading "Steve Mann: My 'Augmediated' Life" in IEEE Spectrum, I was struck by his comment, "It requires a laser light source and a spatial light modulator" (spatial light modulators are devices like LCOS, transmissive panels, and DLP). The reason he gives for needing a laser light source is to support a very high depth of focus. For those who don't believe LCOS and lasers give a high depth of focus, you might want to look at my blog from last year (and the included link to a video demonstration).

Steve Mann has "lived the dream" of Augmented Reality for 35 years and (with due affection) is a geek's geek when it comes to wearing AR technology. He makes what I think are valid points about what he finds wrong with Google Glass, including the need to have the camera's view concentric with the eye's view, and the eye muscle strain caused by the Google Glass image sitting in the upper corner of your field of view.

But the part of Steve Mann's article that really caught my attention is the need for laser illumination to give a high depth of focus to reduce eye strain: what you see in the displayed images needs to be in focus at the same depth as what you see in the real world. Google Glass and other LED-illuminated AR devices generally set the focus so that the display appears in what would be a person's far vision. Steve Mann is saying that the focus of the display has to match that of the real world or there will be problems, and the only known way to do this is to use laser illumination.

This issue of laser light having a large depth of focus when used with a panel is an important "gem" that could have a big impact on the technology used in near-eye AR in the future. LEDs, and that includes OLEDs, produce light with rays that are scattered and hard to focus, whereas lasers produce high f-number light that is easy to focus (and requires smaller optics as well). As I said at the top of this post, the conventional wisdom is that cost is the only factor keeping OLEDs out of near-eye AR, but if Steve Mann is correct, they are also prevented from being good for AR by the physics of light. And the best technology I know of for near-eye AR to mate up with laser light is LCOS.

Google Glass Is Using Field Sequential Color (FSC) LCOS (Likely Himax)

Sequential Red, Green, and Blue Fields Captured From Google YouTube Video DVF [through Glass]

I'm going to have to eat some crow, because up until Saturday night I honestly thought Google was using a transmissive panel, based on the shape of the newer Google Glass headset. I had looked in several videos for evidence of Field Sequential Color (FSC) and hadn't seen anything that showed it. With FSC, the various color fields (red, green, blue, and perhaps others) are presented to the eye in sequence rather than all at the same time, and this can show up in videos (usually) and sometimes in still pictures.

But on Saturday (March 9th) I watched the Google-produced video DVF [through Glass] from way back in September 2012. A careful frame-by-frame analysis (see above for the images from 3 frames) of the video proves that the newer Google Glass design uses a Field Sequential Color (FSC) display. Note in the picture above, captured at 3 separate times, there are red, green, and blue images in the Google Glass, which is indicative of FSC. Based on the size and shape and some other technical factors (too much to go into here), it has to be a reflective Liquid Crystal on Silicon (LCOS) device, most likely made by Himax.

BTW, further visual evidence of it being an FSC device (there are a couple more examples in the video, but this one is to me the clearest) is given later in the video at 3:30, when Google co-founder (and part-time actor?) Sergey Brin, wearing Google Glass, stands up to applaud and there is a classic FSC color breakup, as captured in the picture below, recognizable to anyone who has looked into an FSC projector. Seeing separate color fields when the projector moves is a classic FSC effect.

Sergey Brin Stands Up Rapidly and Reveals Color Sequential Breakup

This (new) evidence largely confirms Seeking Alpha blogger Mark Gomes' conclusion that Himax is in both the old and the newer Google Glass design (see also his instablog response to my comments). Last week I was not convinced and commented that I still thought it was a transmissive panel, and Mr. Gomes and I had some cordial back-and-forth public discussion about it on Seeking Alpha and this blog. But with the proof that it is using field sequential color, there is only one conclusion: it is a reflective field sequential color LCOS device. This also explains why the earlier prototype was using a Himax color filter LCOS device when it would have been simpler and smaller to have used a transmissive panel at the time. Apparently the color filter LCOS was a "stand-in" waiting for the smaller field sequential color device and/or optics.

Additionally, I had dismissed the Digitimes Himax and Google Glass article as confirming it was Himax because it appeared a couple of days after Mark Gomes' article, and I thought it was just an "echo" of what he and I had written. But in public comments Mr. Gomes pointed out that it added some more details.

So why do I now agree with Mr. Gomes that Google Glass most likely uses a Himax panel? The evidence is overwhelming that it is field sequential color, and Himax is the obvious candidate, since my first blog on the subject, which appeared Feb 28, 2012, clearly identified Himax as supplying the earlier Google Glass prototype, and they have had FSC LCOS devices for about 6 years. This is further reinforced by what Mark Gomes has posted as well as by the Digitimes article. Both the technical and the financial/business analyses agree.

There are a few other, but in my opinion much less likely, candidates. My old company Syndiant has digital-field FSC LCOS technology that, last I knew, was technically superior to Himax's analog LCOS technology, but I don't think Syndiant would be ready for a Google-sized order yet (and the announced JVC-Kenwood deal happened too recently). Citizen Finetech Miyota (CFM) recently bought FSC LCOS technology from Micron, but I can't see why Micron would have sold the technology to CFM if a deal with Google was in the works. Omnivision bought the FSC LCOS technology of Aurora Systems, but it was not very good technology in my opinion, and so far I only know of them continuing to make the old Aurora devices, which are aimed at front projectors. Then there is Compound Photonics, which bought the FSC assets of the now-defunct Brillian, but they have stated that they are working on laser pico projectors.

Also, please don't give me the conspiracy and collusion theories. The video I watched on March 9th was the first one I had seen that proved Google Glass was field sequential color. Additionally, I never corresponded with or even knew of Mark Gomes before the Seeking Alpha article came out mentioning my blog, and I was legitimately concerned that he may have ignored some of my original article and only considered the parts that supported his position, so I wanted to correct the record. Mark Gomes, for his part, was very respectful yet emphatic in his position based on his research, which now appears to me to have been largely correct (although I still say the Himax web site looks abandoned, and Himax did give the appearance of having given up on FSC LCOS back around 2010). Frankly, I was as surprised as anyone at the wild swings in Himax stock and didn't buy any before my first article.

Full Disclosure: I never traded in Himax stock before today (or any other stock discussed on this blog, other than being a well-known holder of stock in the private company Syndiant as a former Founder, CTO, and investor). But seeing how the Google Glass news last week affected the stock, and based on Mr. Gomes' articles combined with this new evidence, I decided to put some money where my mouth is and just bought some Himax (HIMX) to see what happens.

Appendix (For Those Who Want to Duplicate My Findings)

That Google Glass uses FSC would have been instantly recognizable to anyone who got to use the newer device, but I didn't have one to play with and was working from the available online videos and pictures. The crafted Google videos that give the appearance of looking through Google Glass don't show it because they simulate the display. And in most of the other videos, the image in the Google Glass was not visible and/or the camera exposure and other settings didn't pick up the FSC effects. Perhaps ironically, the camera in Google Glass itself tends to pick up the FSC effect more than other cameras used to shoot pictures of people wearing it.

Some video cameras, more so than others, will tend to pick up the signature color breakup of FSC. The camera angle also has to be right so you can see the image when videoing someone wearing Google Glass. And perhaps most importantly, the exposure of the camera, which is usually based on the overall scene, has to be such that the sequential colors from the small spot of light in the viewfinder (I haven't ever seen a close-up of the viewfinder) do not over-expose and wash out the colors (in that case you may notice a more white flicker).

All I did was play the video DVF [through Glass] on my PC and keep pausing and un-pausing it. It is tricky to catch the frames that show FSC. One reason is that the video has many frames per second and the YouTube player does not support frame-by-frame "shuttle/jog." One could download the video and step through it frame by frame, but it is not necessary. I just kept going over the time around 0:38 to 0:44 a few times to capture the images. Similarly, I went through the video at about 3:30 to get the FSC breakup with Sergey Brin.
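
If you do download the video, the pause-and-capture process can be automated. Below is a minimal sketch using OpenCV; this is my own illustration, not what I actually did. The filename "glass.mp4" and the 0:38–0:44 window are assumptions, and in practice you would want to crop to the region around the Glass eyepiece before averaging.

```python
import cv2  # OpenCV; assumes the video has been downloaded locally as "glass.mp4"

cap = cv2.VideoCapture("glass.mp4")
cap.set(cv2.CAP_PROP_POS_MSEC, 38_000)  # jump to ~0:38, where the color fields appear

while cap.get(cv2.CAP_PROP_POS_MSEC) < 44_000:
    ok, frame = cap.read()
    if not ok:
        break
    # Averaging the whole frame dilutes the effect; cropping to the eyepiece
    # region first would make the single-color fields stand out much more.
    b, g, r = frame.mean(axis=(0, 1))  # mean intensity of each BGR channel
    # A strongly dominant channel suggests the camera caught one color field
    # (a washed-out near-white frame just means the sensor saturated).
    dominant = max((r, "red"), (g, "green"), (b, "blue"))[1]
    t = cap.get(cv2.CAP_PROP_POS_MSEC) / 1000
    print(f"{t:6.2f}s  R={r:.0f} G={g:.0f} B={b:.0f}  -> {dominant}")
cap.release()
```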

Note that you will not always see a red, green, or blue color when you capture a frame. When the colors get too bright in the image, they saturate the camera sensor and result in white. I don't believe there is a "white field" in the Google Glass; rather, the camera is not picking up the colors due to over-saturation.

I should also add that FSC effects show up differently on different cameras and in different lighting and camera exposures. I have looked previously at other Google Glass stills and videos trying to find the FSC effect and did not find it. Unless the camera angle and the exposure are right, you just aren't going to see the colors. Even in this whole video, I only found a few seconds that demonstrated FSC.