The human visual system focuses at a single depth at a time, and objects lying at other depths appear blurred; as the user's focus depth changes, different objects come in and out of focus. This phenomenon is known as Depth of Field (DoF).
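Under a standard thin-lens approximation, the strength of this blur can be quantified: an object at distance $d_o$, seen by an eye with pupil diameter $A$ accommodated to distance $d_f$, subtends a blur circle of angular diameter approximately

\[
\beta \;\approx\; A \left| \frac{1}{d_o} - \frac{1}{d_f} \right|,
\]

where the bracketed term is the defocus expressed in diopters. Larger pupils and larger depth differences thus produce stronger blur.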
Augmented Reality (AR) is a technology that superimposes computer graphics (CG) images onto a user's view of the real world. A commonly used AR display device is the Optical See-Through Head-Mounted Display (OST-HMD), a transparent HMD that enables users to observe the real world directly with CG added to it. A common problem in such systems is the mismatch between the DoF properties of the user's eyes and those of the virtual camera used to generate the CG.
The goal of this thesis is the creation of a system that accurately reflects the state of the user's eyes in our renderings. Using an autorefractometer, we measure the user's pupil size and accommodative state and feed these values into a real-time distributed raytracer. The resulting renderings accurately reflect the DoF blur the user perceives in their view of the real world.
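A minimal sketch of how such per-user measurements could parameterize thin-lens DoF sampling in a distributed raytracer is given below; the function name, coordinate conventions (camera at the origin, looking down +z), and units are illustrative assumptions rather than the thesis implementation:

```python
import math
import random

def sample_dof_ray(pupil_diameter_mm, accommodation_diopters, pixel_dir):
    """Generate one thin-lens depth-of-field sample ray.

    Assumed conventions (illustrative, not the thesis code): the camera
    sits at the origin looking down +z, pixel_dir is the central ray
    direction for the pixel with pixel_dir[2] > 0, and the eye
    measurements arrive as pupil diameter in mm and accommodation in
    diopters (1/metres).
    """
    focus_dist_m = 1.0 / accommodation_diopters   # distance to focus plane
    lens_radius_m = 0.5 * pupil_diameter_mm * 1e-3

    # Rejection-sample a uniform point on the pupil disk.
    while True:
        u = random.uniform(-1.0, 1.0)
        v = random.uniform(-1.0, 1.0)
        if u * u + v * v <= 1.0:
            break
    origin = (u * lens_radius_m, v * lens_radius_m, 0.0)

    # Point where the unperturbed central ray crosses the focus plane;
    # all lens samples converge there, so objects at that depth stay sharp.
    t = focus_dist_m / pixel_dir[2]
    focus_pt = tuple(t * c for c in pixel_dir)

    direction = tuple(f - o for f, o in zip(focus_pt, origin))
    norm = math.sqrt(sum(d * d for d in direction))
    return origin, tuple(d / norm for d in direction)
```

Averaging many such samples per pixel blurs objects off the measured focus plane in proportion to the measured pupil size, matching the real-world blur predicted by the thin-lens relation above.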
We believe that the technique described and implemented in this thesis is the most promising way to achieve augmentations that are indistinguishable from real objects using commodity OST-HMDs. Integrating it into consumer OST-HMDs will be a crucial step toward bringing photorealistic AR to the masses.