Researcher:
Başak, Uğur Yekta


Job Title: Master Student
First Name: Uğur Yekta
Last Name: Başak
Name Variants: Başak, Uğur Yekta
Search Results

Now showing 1 - 2 of 2
  • Publication
    Wide-field-of-view dual-focal-plane augmented reality display
    (SPIE-Int Soc Optical Engineering, 2019) Department of Electrical and Electronics Engineering; Başak, Uğur Yekta; Kazempourradi, Seyedmahdi; Ulusoy, Erdem; Ürey, Hakan; Master Student; PhD Student; Researcher; Faculty Member; Graduate School of Sciences and Engineering; College of Engineering; 8579
    Stereoscopic augmented reality (AR) displays with a fixed focus plane suffer from visual discomfort due to the vergence-accommodation conflict (VAC). In this study, we propose a biocular dual-focal-plane AR system. Two separate liquid crystal displays (LCDs) are placed at slightly different distances from a Fresnel relay lens so that the virtual images of the LCDs appear at 25 cm and 50 cm from the user. Both LCDs are viewed in full by both eyes, so the rendered images are not per-eye parallax images. While the system is limited to two depths, it provides correct focus cues and a natural blur effect at two distinct depths. This allows the user to distinguish virtual information even when the virtual objects overlap and partially occlude one another in the axial direction. The displays are driven by a single computation unit, and the objects in the virtual scene are distributed over the LCDs according to their depths. The field of view is 60 x 36 degrees and the eye box is larger than 100 mm, which is comfortable enough for two-eye viewing.
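    The LCD placement described in this abstract follows from the thin-lens relation: an object just inside the focal length of a relay lens produces a magnified virtual image, so two LCDs at slightly different distances yield virtual planes at 25 cm and 50 cm. A minimal sketch, assuming a hypothetical 10 cm focal length (the abstract does not state the actual lens parameters):

    ```python
    def lcd_distance(f_cm, virtual_cm):
        """Object (LCD) distance that places a virtual image at virtual_cm,
        from the thin-lens equation 1/d_o + 1/d_i = 1/f with d_i = -virtual_cm
        (virtual image on the same side as the object)."""
        return 1.0 / (1.0 / f_cm + 1.0 / virtual_cm)

    f = 10.0  # hypothetical Fresnel relay lens focal length in cm (assumption)
    near = lcd_distance(f, 25.0)  # LCD position for the 25 cm virtual plane
    far = lcd_distance(f, 50.0)   # LCD position for the 50 cm virtual plane
    print(round(near, 2), round(far, 2))  # both slightly inside the focal length
    ```

    As the abstract notes, the two object distances differ only slightly, which is why both LCDs can share the same relay optics.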
  • Publication (Open Access)
    Dual focal plane augmented reality interactive display with gaze-tracker
    (Optical Society of America (OSA), 2019) Department of Electrical and Electronics Engineering; Başak, Uğur Yekta; Kazempourradi, Seyedmahdi; Yılmaz, Cemalettin; Ulusoy, Erdem; Ürey, Hakan; Master Student; PhD Student; Faculty Member; Graduate School of Sciences and Engineering; 111927; 8579
    Stereoscopic augmented reality (AR) displays have a fixed focus plane and suffer from visual discomfort due to the vergence-accommodation conflict (VAC). In this study, we demonstrate a biocular (i.e., common optics for the two eyes, with the same images shown to both) dual-focal-plane AR system with a real-time gaze tracker, which provides a novel interactive experience. To mitigate VAC, we propose a see-through near-eye display mechanism that generates two separate virtual image planes at arm's-length depths (25 cm and 50 cm). Our optical system generates the virtual images by relaying two liquid crystal displays (LCDs) through a beam splitter and a Fresnel lens. While the system is limited to two depths and discontinuity occurs in the virtual scene, it provides correct focus cues and a natural blur effect at the corresponding depths. This allows the user to distinguish virtual information through the accommodative response of the eye, even when the virtual objects overlap and partially occlude one another in the axial direction. The system also provides correct motion-parallax cues within the user's movement range without the need for sophisticated head trackers. A road-scene simulation is realized as a convenient use case of the proposed display: a large monitor creates a background scene, and the content rendered on the LCDs is augmented onto it. The field of view (FOV) is 60 x 36 degrees and the eye box is larger than 100 mm, which is comfortable enough for two-eye viewing. The system includes a single-camera pupil and gaze tracker that selects the correct depth plane based on the shift in the measured interpupillary distance with the user's convergence angle. The rendered content can be distributed across both depth planes and the background scene simultaneously, so the user can select and interact with content at the correct depth in a natural and comfortable way. The prototype system can be used in tasks that demand a wide FOV and multiple focal planes, and as an AR and vision-research tool.
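    The depth-plane selection in the second abstract rests on vergence geometry: as the eyes converge on a nearer point, the pupils rotate inward and the interpupillary distance seen by the camera shrinks, which reveals the vergence angle and hence the fixation distance. A minimal sketch of that geometry, assuming symmetric convergence and a known anatomical IPD (the function names and the 6.3 cm IPD are illustrative, not taken from the paper):

    ```python
    import math

    def fixation_distance_cm(ipd_cm, vergence_deg):
        """Distance to the fixation point from the vergence angle,
        assuming symmetric convergence: tan(theta/2) = (ipd/2) / d."""
        return (ipd_cm / 2.0) / math.tan(math.radians(vergence_deg) / 2.0)

    def select_depth_plane(distance_cm, planes_cm=(25.0, 50.0)):
        """Pick the virtual image plane closest to the fixation distance."""
        return min(planes_cm, key=lambda p: abs(p - distance_cm))

    ipd = 6.3  # illustrative interpupillary distance in cm (assumption)
    # Vergence angle when fixating the near plane: theta = 2 * atan((ipd/2) / 25)
    theta_near = 2.0 * math.degrees(math.atan(ipd / 2.0 / 25.0))
    plane = select_depth_plane(fixation_distance_cm(ipd, theta_near))
    print(plane)  # the near (25 cm) plane is selected
    ```

    A nearest-plane rule like `select_depth_plane` is one plausible way to map a continuous fixation distance onto the system's two discrete image planes.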