I don’t care how good you think your eyes are, you can’t tell the resolution of a display without a test pattern. On the show floor and in their demo rooms, companies are going to pick videos and images that make their product look good, and that often means avoiding test patterns like the plague. In particular, those who are “fudging” on their resolution will avoid any test patterns.
More than once, a company has made a claim for “resolution” or “pixels” that won’t stand up to being measured. For this blog entry, I’m going to demonstrate with pictures I took of test patterns on the ShowWX how Microvision’s Laser Beam Scanning falls seriously short of their claimed resolution of “WVGA” (848×480 pixels) and in addition has some pretty severe imaging artifacts caused by their non-linear, bi-directional, scanning process.
I would be particularly curious to see how Microvision’s demo of their 720P device holds up to being measured. My guess is that they won’t let you test it, but it is worth asking (I don’t think they will let me try for some reason). I know they have people reading this blog, so if they really think it will stand up to being measured, they could use my test patterns or similar ones.
At the end of this article I am going to give you a series of simple, royalty-free test patterns that you can download (while not required, it would be nice to include attribution to this blog if you use the patterns). You can verify my results or use them to measure the real resolution of Microvision’s WVGA projector, their claimed-to-be-720P demo projector, or any other pico projector.
Laser beam scanning (LBS) has a multitude of problems with the way the mirror scans; it is far from the simple process they want you to think it is. Those familiar with the ShowWX know that it has “bowtie” distortion of the overall image, but what really hurts the effective resolution is that the scanning process matches very poorly with that of a normal computer or camera image.
Fig. 4 from Microvision’s patent application 20110249020 gives some idea as to the problem, as it diagrams the basics of the Microvision bi-directional scanning process. The key thing you should notice is that it doesn’t look anything like a simple raster scan through a square grid of pixels. The Microvision scanning process follows two crisscrossing sine-wave-like patterns, and the pixels of the original image have to be scaled/resampled onto this non-regular beam scanning pattern. The beam doesn’t go everywhere the pixels need to be, and in scaling the image to match the scanning process, significant resolution is lost.
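To get a feel for just how non-uniform the sweep is, here is a small sketch of my own (an illustration, not Microvision’s actual scan parameters) that samples a sinusoidal horizontal mirror sweep at a uniform pixel clock and compares how far the beam moves per clock at the center of the sweep versus the edges:

```python
import numpy as np

# Illustrative only: a resonant (sinusoidal) horizontal mirror sweep,
# sampled at a uniform pixel clock. Not Microvision's real numbers.
n = 848                       # claimed horizontal pixels (WVGA)
t = np.linspace(0, np.pi, n)  # uniform time steps across one sweep
x = (1 - np.cos(t)) / 2       # normalized beam position, 0..1

dx = np.diff(x)               # distance the beam covers per pixel clock
ratio = dx.max() / dx.min()   # center-of-sweep step vs. edge-of-sweep step
print(round(ratio))           # hundreds-to-one: wildly non-uniform spacing
```

Mapping a uniform grid of source pixels onto a sweep like this forces resampling everywhere, and that resampling is where resolution gets lost.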
The mirror does not sweep the laser beam in straight lines at a uniform speed. It follows more of a curved path (thus the curves in the patent application’s Fig. 4 and the bow-tie effect). As the mirror scans the laser beam, the beam is constantly speeding up or slowing down, which, if left uncompensated, would change the width and brightness of a pixel. But this is just the start of the problems with the scanning process.
A lot of people think that Microvision’s scanning process works sort of like an old raster-scanned TV CRT, but it doesn’t. On a CRT, the magnetic deflection of the beam’s horizontal retrace is very fast, so the beam is on in only one direction and it “retraces” with the electron beam off.
But with Microvision’s MEMS mirror, the horizontal “retrace” is the same speed as the forward sweep. If they turned off the laser beam during retrace, the laser would have to be off over half the time and they would need 2X more powerful lasers. So Microvision uses a “bi-directional scan” where the lasers are turned on in both directions. In their patent application Fig. 4 above, I have colored the two scans, one in blue and the other in red, to make them easier to follow. A single scan takes about 1/60th of a second, and Microvision makes two sweeps, offset from each other by half a line to create the crisscross effect seen in Fig. 4. It takes 1/30th of a second for both sweeps to complete (which also causes some very undesirable flicker on the outsides of the image where the blue and red scans don’t overlap).
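The power argument and the frame timing are simple arithmetic; a tiny sketch using the numbers described above (taken from this article’s description, not from a Microvision spec sheet):

```python
# Timing and duty-cycle arithmetic for the bi-directional scan.
# Numbers come from the article's description, not a spec sheet.
sweep_time = 1 / 60           # one full sweep of the mirror, in seconds
frame_time = 2 * sweep_time   # two half-line-offset sweeps -> 1/30 s/frame

# A unidirectional scan would blank the laser on the (equally slow)
# return pass, so the laser could be on at most half the time:
uni_duty, bi_duty = 0.5, 1.0
print(frame_time, bi_duty / uni_duty)  # 1/30 s per frame, 2x the light
```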
A key thing to notice is that in the middle of the two horizontal sweeps the blue and red sweeps cross, but on the outsides they don’t. Also notice that the spacing between lines of a given scan varies: the lines are pinched together on the left and right sides (right after the retrace starts), while in the middle they are very far apart. This makes for a tough mapping/scaling of the pixels, and the net effect, as the pictures will show, is to make the left and right sides of the image blurry.
Below is a picture of a test pattern generated by the ShowWX. When you click on the thumbnail you will get a very large image to see all the detail. The test pattern has a series of 4 pairs of black and white horizontal or vertical lines. If the ShowWX met its claimed resolution, you would see these 4 line-pairs distinctly everywhere in the image. But what the picture shows is that even in the center of the screen there are problems, which get worse at the left and right sides of the image. Quite literally, in some spots the resolution is about 1/4th (half vertical and half horizontal) of what is claimed (the 4 line-pairs blur into a mass). You will notice that the vertical line pairs are blurry in most places. One special feature I added to this pattern is some groups of 2 horizontal line pairs where the second set of line pairs is on the odd lines relative to the first; interestingly, one set is blurrier than the other.
The picture was shot at 1/30th of a second and has a “roll bar” where only one of the two scans is present.
Another major problem is that all the vertical lines have to be scaled to fit the laser scanning, and this process tends to blur the vertical lines (some more than others). Yet another problem is that the red, green, and blue lasers are not perfectly aligned with respect to each other, which means that the red, green, and blue images are each scaled independently. This in turn causes color “aliasing,” or a “twisted rope” effect, on vertical lines (see the red arrows in the picture below, from the center of the projected image).
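To see why independently scaling the three color channels produces colored fringes, here is a toy example (the sub-pixel offsets are made up for illustration): a single white 1-pixel line, resampled with a slightly different fractional-pixel shift for each color:

```python
import numpy as np

# One ideal white 1-pixel vertical line, as a 1-D cross-section.
width = 16
line = np.zeros(width)
line[8] = 1.0

def shift_linear(sig, frac):
    """Resample with a fractional-pixel shift (linear interpolation)."""
    return (1 - frac) * sig + frac * np.roll(sig, 1)

r = shift_linear(line, 0.0)   # red happens to land on the pixel grid
g = shift_linear(line, 0.3)   # green misaligned by 0.3 px (made up)
b = shift_linear(line, 0.6)   # blue misaligned by 0.6 px (made up)

# The "white" line is no longer neutral at any pixel:
print(r[8], g[8], b[8])       # unequal R/G/B -> a colored fringe
```

With the offsets varying along the scan, the fringe color changes from place to place, which is the “twisted rope” look on vertical lines.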
To top it all off, there seems to be some quantizing effect in the Microvision scaling process that causes vertical lines to jump sideways by about 1/2 a pixel every so often.
The problems with the various colors aliasing in the “white” test pattern make it hard to see the scanning process. Below I have included a “green only” pattern so you can more clearly see the effects of the scanning process in a single color.
Below, I have included crops with arrows pointing to some of the problems on the right side and center of the projected image. One thing to notice is that some line-pairs are blurrier than others.
Below is the whole image (click to see the higher-resolution version).
Finally, I have included a number of test patterns, including the ones I used with the Microvision ShowWX in the examples above. The ShowWX ones are a bit special because of the unusual odd/even issues I found with the scanning process. I have created test patterns aimed at the common resolutions used by pico projectors today, including WVGA (848×480), SVGA (800×600), WSVGA (1024×600), 720P (1280×720), and WXGA (1280×800). They should be used at 100% = the native resolution of the projector.
Appendix – How the test pattern images were shot
It is kind of tricky to shoot a sharp image of a laser projector. When shooting an LED-illuminated projector, if you want a sharp picture you can stop down the camera’s lens (use a higher f-number) and use a slow shutter speed (say, 1/6th of a second) to take out any “roll-bars” from a scanning projector or color-field effects from a field sequential color projector. But with a laser projector, if you stop down the lens you make the laser speckle worse and obscure the resolution effects.
The slowest ISO speed on the DSLR I used was ISO100. I didn’t have a set of neutral density filters available, so this limited my ability to control the shutter speed while getting the proper exposure. Through some trial and error I settled on f/2.8 for the aperture and a shutter speed of 1/30th of a second (to capture both scans), and then used ISO200 to get the proper exposure. Since the camera was not synchronized to the ShowWX, this meant there would be exactly one roll-bar somewhere in the image, so I took a number of pictures to get the roll-bar in the least objectionable position. I could have shot at ISO100 and 1/15th of a second, but then I would get two faint roll-bars in two places; I decided that one roll-bar was better than two faint ones.
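The trade-off between those two settings is just stop arithmetic; a quick check using standard exposure-value math (nothing specific to this shoot):

```python
import math

def ev100(f_number, shutter_s, iso):
    """Exposure value normalized to ISO 100.
    Equal EV100 means the two settings capture the same exposure."""
    return math.log2(f_number**2 / shutter_s) - math.log2(iso / 100)

a = ev100(2.8, 1 / 30, 200)  # the settings actually used
b = ev100(2.8, 1 / 15, 100)  # the slower alternative rejected above
print(abs(a - b))            # ~0: identical exposure either way
```

Halving the ISO and doubling the shutter time cancel exactly, so the only real difference is one roll-bar versus two fainter ones.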
The images above were taken with a Canon 50mm f/1.8 prime (non-zoom) lens, with the camera mounted on a tripod and triggered with an infrared remote. A prime lens was used to give a sharp image with little distortion and chromatic aberration. All the images of the test pattern were shot at f/2.8 to give some “sharpness gain” over the lens’s wide-open aperture while still keeping a low enough f-number to limit speckle.
From my observations, shooting at f/2.8 resulted in less speckle than I saw with my naked eye. The speckle you see is a function of the structure of the human eye, including the f-number of your iris, the size of your retina, the size of the rods and cones, and the surface of the retina. When you take a picture of a laser-projected image with a camera, all these factors are different. About the best you can do is adjust the f-number of the camera to try to approximate what you see. In a future article, I plan on talking about the physics of laser speckle.
A 3-stop or greater neutral density filter, combined with shooting at ISO100 (or less if the camera supported it), would have allowed me to shoot at a slower shutter speed and remove (average out) the roll-bar. If the projector were much brighter, a neutral density filter would have been absolutely required.
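The arithmetic behind that: each stop of neutral density halves the light, so a 3-stop filter stretches the same exposure to an 8x longer shutter time:

```python
# Each stop of neutral density halves the light reaching the sensor,
# so the shutter time must double per stop for the same exposure.
nd_stops = 3
base_shutter = 1 / 30                    # the 1/30 s used in this shoot
filtered = base_shutter * 2**nd_stops    # ~0.27 s with a 3-stop ND
print(filtered)                          # spans ~8 full scan frames,
                                         # averaging out the roll-bar
```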