AI and holography bring 3D augmented reality to regular glasses
Through holography and AI, these glasses can display full-color, 3D moving images over an otherwise direct view of the real world. Credit: Andrew Brodhead

Researchers in the emerging field of spatial computing have developed a prototype augmented reality headset that uses holographic imaging to overlay full-color, 3D moving images on the lenses of what would appear to be an ordinary pair of glasses. Unlike the bulky headsets of present-day augmented reality systems, the new approach delivers a visually satisfying 3D viewing experience in a compact, comfortable, and attractive form factor suitable for all-day wear.

"Our headset appears to the outside world just like an everyday pair of glasses, but what the wearer sees through the lenses is an enriched world overlaid with vibrant, full-color 3D computed imagery," said Gordon Wetzstein, an associate professor of electrical engineering and an expert in the fast-emerging field of spatial computing.

Wetzstein and a team of engineers introduce their device in a paper in the journal Nature.

Though only a prototype now, such a technology, they say, could transform fields stretching from gaming and entertainment to training and education—anywhere computed imagery might enhance or inform the wearer's understanding of the world around them.

"One could imagine a surgeon wearing such glasses to plan a delicate or complex surgery or airplane mechanic using them to learn to work on the latest jet engine," Manu Gopakumar, a doctoral student in the Wetzstein-led Stanford Computational Imaging lab and co-first author of the paper said.

Barriers overcome

The new approach is the first to thread a complex maze of engineering requirements that has so far yielded either ungainly headsets or less-than-satisfying 3D visual experiences, which can leave the wearer visually fatigued or even a bit nauseous at times.

"There is no other augmented reality system out there now with comparable compact form factor or that matches our 3D image quality," said Gun-Yeal Lee, a postdoctoral researcher in the Stanford Computational Imaging lab and co-first author of the paper.

To succeed, the researchers overcame a series of technical barriers through a combination of AI-enhanced holographic imaging and new nanophotonic device designs. The first hurdle was that existing techniques for displaying augmented reality imagery typically require complex optical systems.

In these systems, the user does not actually see the real world through the lenses of the headset. Instead, cameras mounted on the exterior of the headset capture the world in real time and combine that imagery with computed imagery. The resulting blended image is then projected to the user's eye stereoscopically.

"The user sees a digitized approximation of the real world with computed imagery overlaid. It's sort of augmented virtual reality, not true augmented reality," explained Lee.

These systems, Wetzstein explains, are necessarily bulky: the magnifying lenses placed between the wearer's eyes and the projection screens require a minimum distance between eye, lens, and screen, which adds to the headset's size.

"Beyond bulkiness, these limitations can also lead to unsatisfactory perceptual realism and, often, visual discomfort," said Suyeon Choi, a doctoral student in the Stanford Computational Imaging lab and co-author of the paper.

Killer app

To produce more visually satisfying 3D images, Wetzstein leapfrogged traditional stereoscopic approaches in favor of holography, a Nobel Prize-winning visual technique developed in the late 1940s. Despite its great promise for 3D imaging, holography's wider adoption has been limited by an inability to portray accurate 3D depth cues, leading to an underwhelming, sometimes nausea-inducing, visual experience.

The Wetzstein team used AI to improve the depth cues in the holograms. Then, using advances in nanophotonics and waveguide display technologies, the researchers were able to project computed holograms onto the lenses of the glasses without relying on bulky additional optics.
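The paper's actual pipeline pairs learned, AI-calibrated models with hologram optimization; as a rough illustration of what "computing a hologram" means, the sketch below uses the classical Gerchberg-Saxton phase-retrieval loop under an idealized far-field propagation model. It is a minimal stand-in for, not a reproduction of, the team's method, and the target image and iteration count are illustrative assumptions.

```python
import numpy as np

def gerchberg_saxton(target_amplitude, num_iters=100, seed=0):
    """Classical phase retrieval: find a phase-only pattern whose
    far-field diffraction matches a target image. A simple stand-in
    for the learned hologram optimization described in the article."""
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0.0, 2.0 * np.pi, target_amplitude.shape)
    for _ in range(num_iters):
        slm_field = np.exp(1j * phase)        # phase-only modulation of the light
        image_field = np.fft.fft2(slm_field)  # idealized propagation to the image plane
        # Impose the target amplitude while keeping the propagated phase.
        image_field = target_amplitude * np.exp(1j * np.angle(image_field))
        slm_field = np.fft.ifft2(image_field) # propagate back to the display plane
        phase = np.angle(slm_field)           # keep only the phase for the next pass
    return phase

# Toy usage: a 128x128 target with a bright square on a dark background.
target_intensity = np.zeros((128, 128))
target_intensity[48:80, 48:80] = 1.0
hologram_phase = gerchberg_saxton(np.sqrt(target_intensity))
```

A loop like this optimizes a single 2D image plane; convincing 3D depth cues require the phase pattern to reproduce targets at multiple focal depths and to match the behavior of the real optics, which is where the AI-based improvement mentioned above comes in.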

A waveguide is constructed by etching nanometer-scale patterns onto the lens surface. Small holographic displays mounted at each temple project the computed imagery through these etched patterns, which bounce the light within the waveguide before it is delivered directly to the viewer's eye. Looking through the glasses' lenses, the user sees both the real world and the full-color, 3D computed images displayed on top.
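To make that "bouncing" concrete: light that the etched pattern diffracts steeply enough becomes trapped in the lens by total internal reflection and travels across it toward the eye. The sketch below applies the standard diffraction-grating equation and the total-internal-reflection condition to check whether a given grating pitch traps the light; the wavelength, pitch, and refractive index are illustrative assumptions, not values from the paper.

```python
import numpy as np

def coupled_angle_deg(wavelength_nm, pitch_nm, n_glass, order=1):
    """First-order diffraction angle inside the waveguide for light hitting
    the input grating at normal incidence (grating equation:
    n * sin(theta) = m * lambda / pitch)."""
    s = order * wavelength_nm / (n_glass * pitch_nm)
    if abs(s) >= 1.0:
        return None  # evanescent: this diffraction order does not propagate
    return np.degrees(np.arcsin(s))

def is_guided(wavelength_nm, pitch_nm, n_glass, order=1):
    """Light stays trapped by total internal reflection only if the
    diffracted angle exceeds the critical angle arcsin(1/n)."""
    theta = coupled_angle_deg(wavelength_nm, pitch_nm, n_glass, order)
    if theta is None:
        return False
    return theta > np.degrees(np.arcsin(1.0 / n_glass))

# Illustrative numbers (not from the paper): 532 nm green light,
# a 380 nm grating pitch, and high-index glass with n = 1.8.
print(coupled_angle_deg(532, 380, 1.8))  # ~51 degrees inside the glass
print(is_guided(532, 380, 1.8))          # True: beyond the ~34 degree critical angle
```

A full-color display has to couple red, green, and blue light this way at once, which hints at why the nanophotonic metasurface couplers named in the paper's title matter.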

Life-like quality

The 3D effect is enhanced because it is created both stereoscopically, in the sense that each eye sees a slightly different image as it would in traditional 3D imaging, and holographically.

"With holography, you also get the full 3D volume in front of each eye increasing the life-like 3D image quality," said Brian Chao, a doctoral student in the Stanford Computational Imaging lab and also co-author of the paper.

The ultimate outcome of the new waveguide display techniques and the improvements in holographic imaging is a true-to-life 3D visual experience that is visually satisfying without the fatigue that has challenged earlier approaches.

"Holographic displays have long been considered the ultimate 3D technique, but it's never quite achieved that big commercial breakthrough," Wetzstein said. "Maybe now they have the killer app they've been waiting for all these years."

More information: Gordon Wetzstein, Full-colour 3D holographic augmented-reality displays with metasurface waveguides, Nature (2024). DOI: 10.1038/s41586-024-07386-0. www.nature.com/articles/s41586-024-07386-0

Citation: AI and holography bring 3D augmented reality to regular glasses (2024, May 8) retrieved 8 May 2024 from https://techxplore.com/news/2024-05-ai-holography-3d-augmented-reality.html
