Publication: LabDesignAR: configuring multi-camera motion capture systems in augmented reality
dc.contributor.coauthor | Fjeld, Morten | |
dc.contributor.department | N/A | |
dc.contributor.department | Department of Media and Visual Arts | |
dc.contributor.kuauthor | Baytaş, Mehmet Aydın | |
dc.contributor.kuauthor | Yantaç, Asım Evren | |
dc.contributor.kuprofile | PhD Student | |
dc.contributor.kuprofile | Faculty Member | |
dc.contributor.other | Department of Media and Visual Arts | |
dc.contributor.researchcenter | KU Arçelik Research Center for Creative Industries (KUAR) / KU Arçelik Yaratıcı Endüstriler Uygulama ve Araştırma Merkezi (KUAR) | |
dc.contributor.schoolcollegeinstitute | Graduate School of Social Sciences and Humanities | |
dc.contributor.schoolcollegeinstitute | College of Social Sciences and Humanities | |
dc.contributor.yokid | N/A | |
dc.contributor.yokid | 52621 | |
dc.date.accessioned | 2024-11-09T23:49:58Z | |
dc.date.issued | 2017 | |
dc.description.abstract | We present LabDesignAR, an augmented reality application to support the planning, setup, and reconfiguration of marker-based motion capture systems with multiple cameras. LabDesignAR runs on the Microsoft HoloLens and allows the user to place an arbitrary number of virtual "holographic" motion capture cameras into any space, in situ. The holographic cameras can be freely positioned, and different lens configurations can be selected to visualize the resulting fields of view and their intersections. LabDesignAR also demonstrates a hybrid natural gestural interaction technique, implemented through a fusion of the vision-based hand tracking capabilities of an augmented reality headset and instrumented gesture recognition with an electromyography armband. The source code for LabDesignAR and its supporting components can be found online. | |
dc.description.indexedby | WoS | |
dc.description.indexedby | Scopus | |
dc.description.openaccess | NO | |
dc.description.publisherscope | International | |
dc.identifier.doi | 10.1145/3139131.3141778 | |
dc.identifier.eissn | N/A | |
dc.identifier.isbn | 978-1-4503-5548-3 | |
dc.identifier.issn | N/A | |
dc.identifier.quartile | N/A | |
dc.identifier.scopus | 2-s2.0-85038571541 | |
dc.identifier.uri | http://dx.doi.org/10.1145/3139131.3141778 | |
dc.identifier.uri | https://hdl.handle.net/20.500.14288/14466 | |
dc.identifier.wos | 455354500086 | |
dc.keywords | LabDesignAR | |
dc.keywords | Motion capture | |
dc.keywords | Augmented reality | |
dc.keywords | HoloLens | |
dc.keywords | Gestural interaction | |
dc.keywords | Natural interaction | |
dc.language | English | |
dc.publisher | Association for Computing Machinery (ACM) | |
dc.source | VRST '17: Proceedings of the 23rd ACM Symposium on Virtual Reality Software and Technology | |
dc.subject | Computer science, software engineering | |
dc.subject | Computer science, theory and methods | |
dc.title | LabDesignAR: configuring multi-camera motion capture systems in augmented reality | |
dc.type | Conference proceeding | |
dspace.entity.type | Publication | |
local.contributor.authorid | N/A | |
local.contributor.authorid | 0000-0002-3610-4712 | |
local.contributor.kuauthor | Baytaş, Mehmet Aydın | |
local.contributor.kuauthor | Yantaç, Asım Evren | |
relation.isOrgUnitOfPublication | 483fa792-2b89-4020-9073-eb4f497ee3fd | |
relation.isOrgUnitOfPublication.latestForDiscovery | 483fa792-2b89-4020-9073-eb4f497ee3fd |