Publication:
Metameric varifocal holograms

dc.contributor.coauthor: Walton, D.R.
dc.contributor.coauthor: Dos Anjos, R.K.
dc.contributor.coauthor: Swapp, D.
dc.contributor.coauthor: Weyrich, T.
dc.contributor.coauthor: Steed, A.
dc.contributor.coauthor: Ritschel, T.
dc.contributor.coauthor: Aksit, K.
dc.contributor.department: Department of Electrical and Electronics Engineering
dc.contributor.kuauthor: Ürey, Hakan
dc.contributor.kuauthor: Kavaklı, Koray
dc.contributor.kuprofile: Faculty Member
dc.contributor.other: Department of Electrical and Electronics Engineering
dc.contributor.schoolcollegeinstitute: College of Engineering
dc.contributor.yokid: 8579
dc.contributor.yokid: N/A
dc.date.accessioned: 2024-11-09T13:18:47Z
dc.date.issued: 2022
dc.description.abstract: Computer-Generated Holography (CGH) offers the potential for genuine, high-quality three-dimensional visuals. However, fulfilling this potential remains a practical challenge due to computational complexity and visual quality issues. We propose a new CGH method that exploits gaze-contingency and perceptual graphics to accelerate the development of practical holographic display systems. First, our method infers the user's focal depth and generates images only at their focus plane, without using any moving parts. Second, the displayed images are metamers: in the user's peripheral vision they need only be statistically correct and blend seamlessly with the fovea. Unlike previous methods, ours prioritises and improves foveal visual quality without causing perceptually visible distortions at the periphery. To enable our method, we introduce a novel metameric loss function that robustly compares the statistics of two given images for a known gaze location. In parallel, we implement a model representing the relation between holograms and their image reconstructions. We couple our differentiable loss function and model to generate metameric varifocal holograms using a stochastic gradient descent solver. We evaluate our method with a proof-of-concept holographic display and show that it leads to practical, perceptually three-dimensional image reconstructions.
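The abstract describes coupling a differentiable hologram-to-image model with a loss function and a stochastic gradient descent solver. As a rough illustration of that hologram-optimisation idea only (not the paper's method — no metameric loss or gaze-contingency here), the sketch below uses the classical Gerchberg–Saxton alternating-projection loop to compute a phase-only hologram whose far-field (single-FFT, Fraunhofer) reconstruction approximates a target image; the function name, target pattern, and iteration count are all illustrative assumptions.

```python
import numpy as np

def gerchberg_saxton(target_amp, iterations=100, seed=0):
    """Find a phase-only hologram whose far-field (Fourier-plane)
    amplitude approximates `target_amp`.

    Illustrative stand-in for the paper's differentiable-model + SGD
    pipeline: same hologram-to-image constraint, but solved with
    classical alternating projections rather than a metameric loss.
    """
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0.0, 2.0 * np.pi, target_amp.shape)
    for _ in range(iterations):
        field = np.exp(1j * phase)                          # phase-only hologram plane
        image = np.fft.fft2(field)                          # Fraunhofer propagation
        image = target_amp * np.exp(1j * np.angle(image))   # enforce target amplitude
        back = np.fft.ifft2(image)                          # propagate back
        phase = np.angle(back)                              # enforce phase-only constraint
    return phase

# Hypothetical target: a bright square on a dark background.
target = np.zeros((32, 32))
target[8:24, 8:24] = 1.0
hologram_phase = gerchberg_saxton(target)
recon = np.abs(np.fft.fft2(np.exp(1j * hologram_phase)))    # reconstructed amplitude
```

The paper's SGD approach generalises this loop: instead of hard amplitude projections, a differentiable propagation model and a perceptual (metameric) loss let a gradient-based solver shape where in the visual field the reconstruction must be accurate.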
dc.description.fulltext: YES
dc.description.indexedby: WoS
dc.description.indexedby: Scopus
dc.description.openaccess: YES
dc.description.publisherscope: International
dc.description.sponsoredbyTubitakEu: N/A
dc.description.sponsorship: UK Research & Innovation (UKRI)
dc.description.sponsorship: Engineering & Physical Sciences Research Council (EPSRC)
dc.description.sponsorship: Royal Society of London
dc.description.version: Author's final manuscript
dc.format: pdf
dc.identifier.doi: 10.1109/VR51125.2022.00096
dc.identifier.embargo: NO
dc.identifier.filenameinventoryno: IR03708
dc.identifier.isbn: 9781665496179
dc.identifier.link: https://doi.org/10.1109/VR51125.2022.00096
dc.identifier.quartile: N/A
dc.identifier.scopus: 2-s2.0-85129446481
dc.identifier.uri: https://hdl.handle.net/20.500.14288/3038
dc.identifier.wos: 828657500080
dc.keywords: Augmented reality
dc.keywords: Computer-generated holography
dc.keywords: Foveated rendering
dc.keywords: Metamerisation
dc.keywords: Varifocal near-eye displays
dc.keywords: Virtual reality
dc.language: English
dc.publisher: Institute of Electrical and Electronics Engineers (IEEE)
dc.relation.grantno: EP/T01346X/1
dc.relation.grantno: RGS\R2\212229
dc.relation.uri: http://cdm21054.contentdm.oclc.org/cdm/ref/collection/IR/id/10570
dc.source: 2022 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)
dc.subject: Computer science
dc.title: Metameric varifocal holograms
dc.type: Conference proceeding
dspace.entity.type: Publication
local.contributor.authorid: 0000-0002-2031-7967
local.contributor.authorid: N/A
local.contributor.kuauthor: Ürey, Hakan
local.contributor.kuauthor: Kavaklı, Koray
relation.isOrgUnitOfPublication: 21598063-a7c5-420d-91ba-0cc9b2db0ea0
relation.isOrgUnitOfPublication.latestForDiscovery: 21598063-a7c5-420d-91ba-0cc9b2db0ea0

Files

Original bundle

Name: 10570.pdf
Size: 17.83 MB
Format: Adobe Portable Document Format