Researcher:
Kazempourradi, Seyedmahdi

Job Title

PhD Student

First Name

Seyedmahdi

Last Name

Kazempourradi

Name Variants

Kazempourradi, Seyedmahdi

Search Results

Now showing 1 - 9 of 9
  • Publication
    Wireless controller for interactive virtual reality games
    (IEEE, 2017) Yazıcı, Mahmut Sami; Ozmen, Laurence; Ulusoy, Erdem; Kazempourradi, Seyedmahdi; Öztürk, Seyfettin Onurhan; Erdemli, Murat Berke; Tuncer, Sidem Işıl; Dağıdır, Can Hakan; Ürey, Hakan
    An array of tiny, low-cost, stand-alone, wireless inertial motion sensor units is designed and fabricated. These sensor units recognize a user's gestures, enabling comfortable control in game applications. Although motion sensor units are widely used in many applications, we are the first to combine an array of detached, low-cost controller units with an Oculus Rift DK2 to develop two VR games. In one application, the user controls a spaceship's movements with hand gestures. The other is a first-person shooting game, in which the sensor array is used for aiming and shooting. This type of control provides the feeling of full immersion in a VR environment. The developed sensor unit is a promising controller for a broad range of applications in the virtual and augmented reality industry.
  • Publication
    Wide-field-of-view dual-focal-plane augmented reality display
    (SPIE - International Society for Optical Engineering, 2019) Başak, Uğur Yekta; Kazempourradi, Seyedmahdi; Ulusoy, Erdem; Ürey, Hakan
    Stereoscopic augmented reality (AR) displays with a fixed focus plane suffer from visual discomfort due to vergence-accommodation conflict (VAC). In this study, we propose a biocular dual-focal-plane AR system. Two separate liquid crystal displays (LCDs) are placed at slightly different distances from a Fresnel relay lens so that virtual images of the LCDs appear at 25 cm and 50 cm from the user. Both LCDs are viewed in full by both eyes, so the rendered images are not per-eye parallax images. While the system is limited to two depths, it provides correct focus cues and a natural blur effect at two distinct depths. This allows the user to distinguish virtual information even when virtual objects overlap and partially occlude each other in the axial direction. The displays are driven by a single computation unit, and the objects in the virtual scene are distributed over the LCDs according to their depths. The field of view is 60 x 36 degrees and the eye box is larger than 100 mm, which is comfortable enough for two-eye viewing.
  • Publication
    Next generation augmented reality displays
    (IEEE, 2018) Hedili, M. Kıvanç; Ulusoy, Erdem; Kazempourradi, Seyedmahdi; Soomro, Shoaib Rehman; Ürey, Hakan
    Wearable AR/VR displays have a long history, and earlier efforts failed due to various limitations. Advances in sensors, optical technologies, and computing have renewed interest in this area. AR glasses could become the next computing platform and potentially replace smartphones, but some challenges remain. We have been working on various wearable display architectures, and here we summarize our efforts on laser MEMS scanning displays, head-mounted projectors, and holographic near-eye displays.
  • Publication (Open Access)
    Full-color computational holographic near-eye display
    (Taylor &amp; Francis, 2019) Kazempourradi, Seyedmahdi; Ulusoy, Erdem; Ürey, Hakan
    Near-eye displays (NEDs) are an excellent candidate for the future of augmented reality. Conventional micro-display-based NED designs mostly provide a stereoscopic 3D experience, which leads to visual discomfort due to vergence-accommodation conflict. Computational holographic near-eye displays (HNEDs) can simultaneously provide wide FOV, retinal resolution, an attractive form factor, and natural depth cues including accommodation. In HNEDs, computer-generated holograms (CGHs) are displayed on spatial light modulators (SLMs). We propose a CGH computation algorithm that applies to arbitrary paraxial optical architectures, where the SLM illumination beam can be collimated, converging, or diverging, and the SLM image as seen from the eye-box plane may form at an arbitrary location and can be virtual or real. Our CGH computation procedure eliminates speckle noise, which is observed in all other laser-based displays, as well as chromatic aberrations resulting from the light sources and the optics. Our proof-of-concept experiments demonstrate that HNEDs with simple optical architectures can deliver natural 3D images within a wide FOV (70 degrees) at retinal resolution (30 cycles per degree), exceeding 4000 resolvable pixels on a line using a printed binary mask. With advances in SLM technology, HNEDs can realize the ultimate personalized display, meeting the demands of emerging augmented and virtual reality applications.
  • Publication (Open Access)
    Wireless controller for interactive virtual reality games
    (Institute of Electrical and Electronics Engineers (IEEE), 2017) Ulusoy, Erdem; Kazempourradi, Seyedmahdi; Ürey, Hakan; Öztürk, Seyfettin Onurhan; Erdemli, Murat Berke; Özmen, Levent; Dağıdır, Can Hakan; Tuncer, Sidem Işıl
    An array of tiny, low-cost, stand-alone, wireless inertial motion sensor units is designed and fabricated. These sensor units recognize a user's gestures, enabling comfortable control in game applications. Although motion sensor units are widely used in many applications, we are the first to combine an array of detached, low-cost controller units with an Oculus Rift DK2 to develop two VR games. In one application, the user controls a spaceship's movements with hand gestures. The other is a first-person shooting game, in which the sensor array is used for aiming and shooting. This type of control provides the feeling of full immersion in a VR environment. The developed sensor unit is a promising controller for a broad range of applications in the virtual and augmented reality industry.
  • Publication (Open Access)
    Fast computer-generated hologram computation using rendered depth map image
    (Society of Photo-Optical Instrumentation Engineers (SPIE), 2017) Bjelkhagen, Hans I.; Bove, V. Michael; Ürey, Hakan; Ulusoy, Erdem; Kazempourradi, Seyedmahdi
    We propose a method for computing realistic computer-generated holograms (CGHs) of three-dimensional (3D) objects, in which we leverage well-established graphics processing units (GPUs) and computer graphics techniques to handle occlusion, shading, and parallax effects. The graphics renderer provides a 2D perspective image including occlusion and shading effects. We also extract the depth map data of the scene. The intensity values and 3D positions of object points are obtained by combining the rendered intensity image and the depth map (Z-buffer) image. We divide the depth range into several planes and quantize the depth value of each 3D image point to the nearest plane. In the CGH computation step, we perform Fresnel transformations of these planar objects and sum them to create the hologram corresponding to the particular viewpoint. We then repeat the entire procedure for all viewpoints to cover the hologram area. Experimental results show that the technique is capable of high-quality reconstructions at high speed.
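    The layer-based procedure described in this abstract (quantize the depth map to a few planes, Fresnel-propagate each layer, sum the fields) can be sketched in a few lines of NumPy. This is a minimal illustration under our own assumptions, not the authors' GPU implementation; all function and parameter names (`layered_cgh`, `fresnel_transfer`, `z_planes`, `pitch`) are ours.

    ```python
    import numpy as np

    def fresnel_transfer(shape, pitch, wavelength, z):
        """Paraxial (Fresnel) transfer function for free-space propagation over z."""
        ny, nx = shape
        fx = np.fft.fftfreq(nx, d=pitch)
        fy = np.fft.fftfreq(ny, d=pitch)
        FX, FY = np.meshgrid(fx, fy)
        # Fresnel approximation of the angular-spectrum propagation kernel
        return np.exp(1j * 2 * np.pi * z / wavelength) * \
               np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))

    def layered_cgh(intensity, depth, z_planes, pitch, wavelength):
        """Quantize a rendered depth map to discrete planes, Fresnel-propagate
        each layer to the hologram plane, and sum the complex fields."""
        hologram = np.zeros(intensity.shape, dtype=complex)
        # Assign every pixel to its nearest depth plane (depth quantization)
        plane_idx = np.argmin(np.abs(depth[..., None] - np.asarray(z_planes)), axis=-1)
        for k, z in enumerate(z_planes):
            layer = np.sqrt(intensity) * (plane_idx == k)  # field amplitude of this layer
            if not layer.any():
                continue
            # Propagate the planar layer to the hologram plane and accumulate
            H = fresnel_transfer(layer.shape, pitch, wavelength, z)
            hologram += np.fft.ifft2(np.fft.fft2(layer) * H)
        return hologram
    ```

    A full pipeline would repeat this per viewpoint and encode the resulting complex field for the SLM; here the sketch stops at the summed field for one viewpoint.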
  • Publication (Open Access)
    Dual focal plane augmented reality interactive display with gaze-tracker
    (Optical Society of America (OSA), 2019) Başak, Uğur Yekta; Kazempourradi, Seyedmahdi; Yılmaz, Cemalettin; Ulusoy, Erdem; Ürey, Hakan
    Stereoscopic augmented reality (AR) displays have a fixed focus plane and suffer from visual discomfort due to vergence-accommodation conflict (VAC). In this study, we demonstrate a biocular (i.e., common optics for both eyes, with the same images shown to each eye) two-focal-plane AR system with a real-time gaze tracker, which provides a novel interactive experience. To mitigate VAC, we propose a see-through near-eye display mechanism that generates two separate virtual image planes at arm's-length depths (25 cm and 50 cm). Our optical system generates virtual images by relaying two liquid crystal displays (LCDs) through a beam splitter and a Fresnel lens. While the system is limited to two depths and discontinuity occurs in the virtual scene, it provides correct focus cues and a natural blur effect at the corresponding depths. This allows the user to distinguish virtual information through the accommodative response of the eye, even when virtual objects overlap and partially occlude each other in the axial direction. The system also provides correct motion parallax cues within the user's movement range without the need for sophisticated head trackers. A road scene simulation is realized as a convenient use case of the proposed display, in which a large monitor creates a background scene and the content rendered on the LCDs is augmented onto it. The field of view (FOV) is 60 x 36 degrees and the eye box is larger than 100 mm, which is comfortable enough for two-eye viewing. The system includes a single-camera pupil and gaze tracker, which selects the correct depth plane based on the shift in interpupillary distance with the user's convergence angle. The rendered content can be distributed across both depth planes and the background scene simultaneously, so the user can select and interact with content at the correct depth in a natural and comfortable way. The prototype system can be used in tasks that demand a wide FOV and multiple focal planes, and as an AR and vision research tool.
  • Publication (Open Access)
    Micro-mirror-array based off-axis flat lens for near-eye displays
    (Optical Society of America (OSA), 2019) Kazempourradi, Seyedmahdi; Yaras, Yusuf Samet; Ulusoy, Erdem; Ürey, Hakan
    We developed an off-axis diffractive lens using a micro-mirror array (MMA) on a flat substrate. The MMA creates an on-axis converging beam from a 45-degree off-axis diverging illumination beam and functions similarly to a large, bulky elliptical mirror. The array consists of individual micro-mirrors whose normal directions vary across the component. The size, normal direction, and center height of each micro-mirror are optimized to achieve a phase-matching condition so that the smallest focal spot size is achieved at the design wavelength. The design can also be optimized for full-color applications using a synthetic design wavelength. A sample MMA of size 3 mm by 5 mm was fabricated using grayscale lithography. The designed MMA is used to illuminate a computer-generated hologram in a near-eye display system. Experimental results verify the predictions of the design.
  • Publication (Open Access)
    Next generation augmented reality displays
    (Institute of Electrical and Electronics Engineers (IEEE), 2018) Ürey, Hakan; Soomro, Shoaib Rehman; Hedili, M. Kıvanç; Ulusoy, Erdem; Kazempourradi, Seyedmahdi
    Wearable AR/VR displays have a long history, and earlier efforts failed due to various limitations. Advances in sensors, optical technologies, and computing have renewed interest in this area. AR glasses could become the next computing platform and potentially replace smartphones, but some challenges remain. We have been working on various wearable display architectures, and here we summarize our efforts on laser MEMS scanning displays, head-mounted projectors, and holographic near-eye displays.