Augmented Reality and Google Project Glass Part 2

Google Glasses

Since my last post on Augmented Reality (AR) and near eye (head mounted) displays, Google has put out some publicity on its Project Glass concept. Google made it abundantly clear that this was only for concept testing and not a real product, but it also said there would likely be some test products by the end of 2012.

Jason, “The Frugal Dad,” wrote saying he had seen my first article on AR and that he has a new “infographic” that includes Google Glasses as a future “disruptive technology.” Unfortunately, most predictions about the future turn out to be wrong, and I don’t believe anything I have seen so far in near eye AR, including Google Glasses, will meet consumer expectations and become pervasive. I’m not saying it won’t ever happen, but rather that there are still many major problems to be solved.

As I wrote before, I think there are many practical issues with near eye displays and augmented reality. The Google video “Project Glass: One day…” was obviously a “concept video,” and all the images in the display were “fake” in the sense that they are not what the actual display will look like.

Along these lines, the April 5, 2012 issue of Wired had an article called “Google Glasses Face Serious Hurdles, Augmented-Reality Experts Say,” which raises concerns that Google is over-touting the concept. The Wired article quotes Pranav Mistry, from the MIT Media Lab and one of the inventors of the SixthSense wearable computing system: “The small screen seen in the photos cannot give the experience the video is showing.” Also in the Wired article, Blair MacIntyre, director of the Augmented Environments Lab at Georgia Tech, raised concerns that Google is raising expectations too much. Both Dr. Mistry and Dr. MacIntyre are certainly proponents of AR. Their concerns, and mine as well, are that raising expectations too high could backfire on the AR concept in the long run.

Dr. Thrun on Charlie Rose Looking Up

Sebastian Thrun, Google Fellow and Stanford professor, appeared on Charlie Rose on April 25, 2012 wearing a working Google Glasses prototype. The first four and a half minutes of the Charlie Rose video discuss Google Glasses and give some insight into the issues, not the least of which is whether people are really going to wear something like this.

To see the images in the “glasses” he has to look up, whereas the Google concept video suggests the images are right in front of you all the time. So he can’t see the person he is talking to and the computer image at the same time. Imagine talking to somebody wearing these when they are clearly looking up while talking to you (particularly notice Dr. Thrun’s eyes in the picture above at 1:11 into the Charlie Rose video). By instinct, humans are very sensitive to eye behavior, and someone constantly looking away (up) is a distracting behavior. Now imagine you are walking down the street searching for something on your glasses and a truck comes by: big oops.

The most insightful comment by Dr. Thrun was that “we haven’t yet found this [augmented reality] to be the compelling use case,” but he didn’t elaborate as to why. This does indicate that Google is still trying to figure out whether AR is really compelling. Dr. Thrun did say that “the compelling use case is the sharing experience” and commented on sharing pictures as being something they enjoyed. I guess this is tweeting on steroids, where all your friends can see what you are doing as you do it. In this case the glasses become a hands-free video camera.

The Google video has inspired some funny spoofs that, in their own way, make some of the points above:

ADmented Reality – Google Glasses Parody extrapolates on what could happen with advertising gone wild.

Google Glasses: A New Way to Hurt Yourself and a video shown on Jimmy Kimmel Live demonstrate the dangers of “distracted walking.”

Next time on the subject of AR, I plan to talk about more of the technical issues.

5 Comments

  1. I presume that you have not used augmented reality in the form of an RSD. If you had, you would realize that the need to “glance” up is indeed eliminated by proper positioning of the reflected image source in front of the wearer, similar to wearing small eyeglasses, i.e. you can either look through them or around them. One could “paint” a big nose and moustache on a person’s face in front of them, for example.

    • Actually, I have looked in some depth at various near eye display technologies. I have seen some RSDs (raster scanned displays) as well, and there is nothing magically better about them. There are different design choices as to whether the display is monocular (one eye) or biocular (two eyes), whether it is see-through or not, and how much it obscures your normal vision. It can fill the whole field of view or sit above or below it. Each of these choices has its pros and cons. Then you have issues of fit, placement, and how the display is mounted to the head (supported by the ears, nose support, a band over the head, etc.), where there are further trade-offs in comfort and looks. These issues are further complicated by the optics involved (yes, even RSDs have optics), the size of the “pupil,” which dictates how accurately the display has to be placed for you to see an image without it being cut off, and whether the whole thing works for people with glasses (about 60% of the US population uses glasses at some time during the day).

      The Google glasses that Dr. Thrun wore were clearly of the “view the display above and see under it” type. This is particularly interesting since the Google demo videos clearly showed images on top of the normal field of view. If the display is viewable in the normal, looking-forward field of view, it by necessity has some effect on your normal vision, partially or totally blocking, darkening, or distorting it. If you eliminate the “need to glance up,” then there is something in your normal field of view, which has a different set of drawbacks.

  2. I see your point about the position of the exit pupil, but with a smaller optic and selective positioning, it is no more of a hassle to glance into it and keep it in front of your view than wearing bifocals would be. You look down and you see, and keep in view, what you are reading. My Nomad system is very good this way. The only problem is that the corners of the virtual screen are not always in view, at least with my eyes. Daylight readability is fantastic, and I don’t see many other drawbacks with the concept, though I can’t stand the user interface. (C’mon Google)

  3. […] Technology has the potential to disrupt a market when necessity and scientific achievement match with an accepting public. This requires incredible timing. There are many historical cases in which scientific achievement exceeded the willingness for society to accept growth. It is not just the development of new technology that is critical, it is cultural acceptance and the transition to real-world application where time, money, and even lives are lost. Investors and companies need to thread that needle delicately otherwise they become a cautionary tale: I’m looking at you, Google Glass. […]
