Publication: How would you gesture navigate a drone? A user-centered approach to control a drone
dc.contributor.coauthor | Obaid, Mohammad | |
dc.contributor.coauthor | Kistler, Felix | |
dc.contributor.coauthor | Kasparavičiute, Gabriele | |
dc.contributor.coauthor | Fjeld, Morten | |
dc.contributor.department | Department of Media and Visual Arts | |
dc.contributor.department | KUAR (KU Arçelik Research Center for Creative Industries) | |
dc.contributor.kuauthor | Yantaç, Asım Evren | |
dc.contributor.schoolcollegeinstitute | College of Social Sciences and Humanities | |
dc.contributor.schoolcollegeinstitute | Research Center | |
dc.date.accessioned | 2024-11-09T23:34:13Z | |
dc.date.issued | 2016 | |
dc.description.abstract | Gestural interaction with flying drones is on the rise; however, little work has been done to elicit gestural preferences directly from users. In this paper, we present an elicitation study to help realize user-defined gestures for drone navigation. We apply a user-centered approach in which we collected data from 25 participants performing gestural interactions for twelve drone actions, ten of which are navigational. The analysis of 300 gestures collected from our participants reveals a user-defined set of suitable gestures for controlling a drone. We report results that can be used by software developers, engineers, or designers, including a taxonomy for the set of user-defined gestures, gestural agreement scores, time performances, and subjective ratings for each action. Finally, we discuss the gesture set with implementation insights and conclude with future directions. | |
dc.description.indexedby | Scopus | |
dc.description.openaccess | YES | |
dc.description.publisherscope | International | |
dc.description.sponsoredbyTubitakEu | N/A | |
dc.identifier.doi | 10.1145/2994310.2994348 | |
dc.identifier.isbn | 978-1-4503-4367-1 | |
dc.identifier.scopus | 2-s2.0-84994905094 | |
dc.identifier.uri | https://doi.org/10.1145/2994310.2994348 | |
dc.identifier.uri | https://hdl.handle.net/20.500.14288/12312 | |
dc.keywords | Drone | |
dc.keywords | Gesture | |
dc.keywords | Interaction | |
dc.keywords | Quadcopter | |
dc.keywords | Study | |
dc.keywords | User-defined | |
dc.language.iso | eng | |
dc.publisher | Association for Computing Machinery (ACM) | |
dc.relation.ispartof | AcademicMindtrek 2016 - Proceedings of the 20th International Academic Mindtrek Conference | |
dc.subject | Media | |
dc.subject | Visual arts | |
dc.title | How would you gesture navigate a drone? A user-centered approach to control a drone | |
dc.type | Conference Proceeding | |
dspace.entity.type | Publication | |
local.contributor.kuauthor | Yantaç, Asım Evren | |
local.publication.orgunit1 | College of Social Sciences and Humanities | |
local.publication.orgunit1 | Research Center | |
local.publication.orgunit2 | Department of Media and Visual Arts | |
local.publication.orgunit2 | KUAR (KU Arçelik Research Center for Creative Industries) | |
relation.isOrgUnitOfPublication | 483fa792-2b89-4020-9073-eb4f497ee3fd | |
relation.isOrgUnitOfPublication | 738de008-9021-4b5c-a60b-378fded7ef70 | |
relation.isOrgUnitOfPublication.latestForDiscovery | 483fa792-2b89-4020-9073-eb4f497ee3fd | |
relation.isParentOrgUnitOfPublication | 3f7621e3-0d26-42c2-af64-58a329522794 | |
relation.isParentOrgUnitOfPublication | d437580f-9309-4ecb-864a-4af58309d287 | |
relation.isParentOrgUnitOfPublication.latestForDiscovery | 3f7621e3-0d26-42c2-af64-58a329522794 |