The vision of wearable displays has been around since 1965, when Ivan Sutherland first suggested the idea. Groups across the world have been pursuing it with devices large and small, which started out looking bulky and geeky but are approaching ever more acceptable weights and sizes. Epson's Moverio BT-100, the Vuzix 1200 XLD and new M100, products from Sony, Lumus, SBG Labs' DigiLens, and Optinvent, and finally Google's Glass have all attracted a lot of attention in recent times. As wearable displays become more and more practical, applications of all sorts appear more feasible.
*Freeform lens from Hong's lab*
Hong Hua from the University of Arizona gave an exceptional invited talk on light-weight, low-cost wearable displays for augmented-reality applications at the Applied Industrial Optics meeting. Hong's talk approached the subject from the perspective of practical optical design for such light-weight wearables. A fundamental limitation in wearables is fixed etendue: because etendue is conserved through the optical system, the field of view trades off against the size of the eyebox that feeds the pupil of the human eye, and since the eye's aperture is fixed, a large field of view is difficult to achieve. One way to work within this constraint is off-axis optics, as used by groups like Spitzer et al. at Micro-Optical Corp. and Rolland et al. Glasses by Epson and Google use reflective waveguides. Nokia uses holographic diffractive waveguides. Lumus, Optinvent, and BAE's Q-Sight use pupil-expansion techniques. Hua's team uses a wedge prism with freeform waveguides. Sensics achieves a wide field of view through optical tiling.

A common problem with wearables for AR is a ghosting effect when computer-generated objects overlie real-world objects. Hua suggested handling it with a spatial light modulator that modulates the direct-view path, combining the light from the display via a beamsplitter, as done by Kiyokawa (2003), Rolland (2004), and Hua (2012). She proposed managing the vergence-accommodation conflict by using a deformable mirror to change the plane of virtual focus (presumably with eye tracking).
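To get a feel for the etendue constraint, here is a back-of-envelope sketch (my own illustration, not from Hua's talk): if we treat etendue as the product of eyebox area and the solid angle subtended by the field of view, then for a fixed etendue, enlarging the eyebox must shrink the achievable field of view. The function name and numbers below are purely illustrative.

```python
import math

def max_fov_deg(etendue_mm2sr, eyebox_diameter_mm):
    """For a fixed etendue (eyebox area x solid angle, in mm^2*sr),
    estimate the largest full field of view (in degrees) that can fill
    a circular eyebox of the given diameter. Illustrative only."""
    area = math.pi * (eyebox_diameter_mm / 2) ** 2   # eyebox area, mm^2
    solid_angle = etendue_mm2sr / area               # available solid angle, sr
    # Solid angle of a cone with half-angle theta: 2*pi*(1 - cos(theta))
    cos_theta = 1 - solid_angle / (2 * math.pi)
    cos_theta = max(-1.0, min(1.0, cos_theta))       # clamp for safety
    theta = math.acos(cos_theta)                     # half-angle, radians
    return math.degrees(2 * theta)                   # full FOV, degrees
```

With an illustrative etendue of 10 mm²·sr, widening the eyebox from 4 mm to 8 mm roughly halves the achievable full field of view, which is exactly the tradeoff Hua described.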
Freeform optics design and manufacturing have been a recent and welcome revolution in optics, enabling many unusual designs that depart from purely symmetric, spherical, or aspheric surfaces. Hua's faith in them seems well founded. There is still some time before we see glasses that cover our entire field of view, deliver a perfect augmented-reality experience, and exhibit a stylish design, but her results and optimism portend a bright outlook.
I haven't written here about Kevin Thompson's and Jannick Rolland's talks (parallel sessions!), but do check out their websites too: some of the best work is going on in these areas! Jannick has just started a Center for Freeform Optics at Rochester!
Along similar lines, Ram Narayanswamy from the Intel User Experience group spoke about applications of AR and related technology. He stressed the importance of combining optics, image processing, and user experience to deliver unique value to mass markets. He highlighted the role that multi-aperture cameras, multiple cameras, and the additional sensors and increased computational abilities of today's cell phones play in the concept of a social camera. He showed how 3D visualization and location-assisted image capture can be used, say, in a tourist application on a mobile device to take pictures in a park and augment your image so you appear in the company of a dragon, or something equally fantastic that says "Happy Chinese New Year!" He also outlined some of the challenges and desirables in sensors, optics, and modules.
Teams like Narayanswamy's that combine user needs with computer vision, imaging, software, and hardware are crucial to delivering technologies like Hua's to the consumers of the future!