Affiliation: College of Arts and Sciences, Department of Computer Science
Augmented and virtual reality (AR/VR) are emerging as next-generation personal and social computing platforms. For AR and VR displays to be practical and widely adopted, however, they must be compact and lightweight, and should support focal accommodation like everyday prescription eyeglasses. Existing commercial displays are bulky and do not provide focus cues for virtual imagery. Furthermore, users without normal vision must wear an additional pair of prescription glasses to see real imagery in focus, an unreasonable and awkward requirement. My dissertation focuses on three key areas for augmented and virtual reality (AR/VR): 1) high-quality 2D and 3D holographic near-eye display approaches that compensate for severe real-world aberrations and support high resolution, 2) auto-focus eyeglasses for viewing both real and virtual imagery, and 3) a wearable eye tracker capable of sensing pupil position, gaze direction, and optical accommodative state.

The only demonstrated compact, wide-angle, eyeglasses-style near-eye display (NED) for augmented/virtual imagery uses holographic techniques. However, holographic NEDs suffer from a small eyebox and poor image quality with debilitating artifacts. To overcome these limitations, I developed a new holographic rendering framework that directly optimizes 2D and 3D phase-only holograms, compensating for hardware and optical aberrations in the display and producing state-of-the-art, high-quality, true holographic imagery. The auto-focus capability is achieved with focus-tunable lenses whose focal power is dynamically adjusted to account for the user's prescription, allowing well-focused viewing of both real and synthetic imagery. This requires knowing not only the vergence but also the accommodation state of the user's eyes at any instant.
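To make the phase-only hologram problem concrete, the sketch below shows the classic Gerchberg–Saxton iteration, a standard baseline for computing a phase-only hologram under a simple Fourier (far-field) propagation model. This is an illustrative baseline only, not the dissertation's framework, which instead optimizes the phase pattern directly and folds measured hardware and optical aberrations into the propagation model; the function name and toy target are hypothetical.

```python
import numpy as np

def gerchberg_saxton(target_amp, iters=50, seed=0):
    """Classic Gerchberg-Saxton phase retrieval: find a phase-only hologram
    whose far-field amplitude approximates target_amp (FFT propagation model)."""
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0.0, 2.0 * np.pi, target_amp.shape)
    for _ in range(iters):
        # Propagate the unit-amplitude, phase-only hologram to the image plane.
        field = np.fft.fft2(np.exp(1j * phase), norm="ortho")
        # Keep the propagated phase but impose the target amplitude.
        field = target_amp * np.exp(1j * np.angle(field))
        # Back-propagate and keep only the phase (phase-only constraint).
        phase = np.angle(np.fft.ifft2(field, norm="ortho"))
    return phase

# Toy target: a bright square on a dark background.
target = np.zeros((64, 64))
target[24:40, 24:40] = 1.0
hologram_phase = gerchberg_saxton(target, iters=50)
```

Direct optimization methods replace the alternating projections above with gradient descent on an image-quality loss, which is what makes it possible to absorb arbitrary (measured) aberrations into the forward model.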
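The prescription-aware focal adjustment can be illustrated with a deliberately simplified diopter calculation. The sketch below assumes a fully presbyopic eye (no residual accommodation), so the tunable lens must supply both the wearer's static prescription and the full vergence demand of the current fixation distance; the function name and this simplification are assumptions for illustration, not the dissertation's control law.

```python
def tunable_lens_power(fixation_distance_m, prescription_diopters):
    """Required focus-tunable lens power in diopters.

    Simplified model (assumes no residual accommodation): the lens supplies
    the wearer's static prescription plus the vergence demand 1/d of the
    current fixation distance d in meters.
    """
    return prescription_diopters + 1.0 / fixation_distance_m

# Example: a -2 D myope fixating at 0.5 m needs 0 D of added lens power,
# since the 2 D vergence demand exactly cancels the prescription.
power = tunable_lens_power(0.5, -2.0)
```

In practice the vergence demand comes from the eye tracker, which is why the display needs both gaze direction and accommodative state in real time.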
To achieve this, I developed comprehensive 3D tracking of both eyes: by analyzing the Purkinje reflections from the refractive surfaces of the eye's cornea and crystalline lens, captured with an array of cameras and infrared LEDs, the system directly estimates the dynamically changing shape and focus of the crystalline lens.
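One way to see how Purkinje reflections reveal accommodation: the fourth Purkinje image (P4, from the back surface of the crystalline lens) shifts relative to the first (P1, from the front of the cornea) as the lens changes shape, so a per-user calibration can map the P1–P4 separation to diopters. The linear mapping, function name, and parameters below are hypothetical simplifications for illustration, not the dissertation's estimation pipeline.

```python
import numpy as np

def accommodation_from_purkinje(p1_px, p4_px, calib_slope, calib_offset):
    """Estimate accommodative state (diopters) from Purkinje image positions.

    Hypothetical linear calibration: the image-plane separation between the
    first (P1, corneal front surface) and fourth (P4, lens back surface)
    Purkinje reflections changes as the crystalline lens bulges during
    accommodation; calib_slope/calib_offset come from a per-user fit.
    """
    sep = np.linalg.norm(np.asarray(p4_px, float) - np.asarray(p1_px, float))
    return calib_slope * sep + calib_offset

# Example with made-up calibration constants and pixel coordinates.
diopters = accommodation_from_purkinje((0.0, 0.0), (3.0, 4.0),
                                       calib_slope=-0.5, calib_offset=4.5)
```

A real system must also disentangle reflection motion caused by gaze rotation and pupil translation from motion caused by lens deformation, which is why multiple cameras and LEDs are used.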