Publication:
How would you gesture navigate a drone? a user-centered approach to control a drone

dc.contributor.coauthor: Obaid, Mohammad
dc.contributor.coauthor: Kistler, Felix
dc.contributor.coauthor: Kasparavičiute, Gabriele
dc.contributor.coauthor: Fjeld, Morten
dc.contributor.department: Department of Media and Visual Arts
dc.contributor.department: KUAR (KU Arçelik Research Center for Creative Industries)
dc.contributor.kuauthor: Yantaç, Asım Evren
dc.contributor.schoolcollegeinstitute: College of Social Sciences and Humanities
dc.contributor.schoolcollegeinstitute: Research Center
dc.date.accessioned: 2024-11-09T23:34:13Z
dc.date.issued: 2016
dc.description.abstract: Gestural interaction with flying drones is on the rise; however, little work has been done to elicit gestural preferences directly from users. In this paper, we present an elicitation study to help realize user-defined gestures for drone navigation. We apply a user-centered approach in which we collected data from 25 participants performing gestural interactions for twelve drone actions, of which ten are navigational. The analysis of the 300 gestures collected from our participants reveals a user-defined set of gestures suitable for controlling a drone. We report results that can be used by software developers, engineers, and designers, including a taxonomy for the set of user-defined gestures, gestural agreement scores, time performances, and subjective ratings for each action. Finally, we discuss the gestural set with implementation insights and conclude with future directions.
dc.description.indexedby: Scopus
dc.description.openaccess: YES
dc.description.publisherscope: International
dc.description.sponsoredbyTubitakEu: N/A
dc.identifier.doi: 10.1145/2994310.2994348
dc.identifier.isbn: 978-1-4503-4367-1
dc.identifier.scopus: 2-s2.0-84994905094
dc.identifier.uri: https://doi.org/10.1145/2994310.2994348
dc.identifier.uri: https://hdl.handle.net/20.500.14288/12312
dc.keywords: Drone
dc.keywords: Gesture
dc.keywords: Interaction
dc.keywords: Quadcopter
dc.keywords: Study
dc.keywords: User-defined
dc.language.iso: eng
dc.publisher: Association for Computing Machinery (ACM)
dc.relation.ispartof: AcademicMindtrek 2016 - Proceedings of the 20th International Academic Mindtrek Conference
dc.subject: Media
dc.subject: Visual arts
dc.title: How would you gesture navigate a drone? a user-centered approach to control a drone
dc.type: Conference Proceeding
dspace.entity.type: Publication
local.contributor.kuauthor: Yantaç, Asım Evren
local.publication.orgunit1: College of Social Sciences and Humanities
local.publication.orgunit1: Research Center
local.publication.orgunit2: Department of Media and Visual Arts
local.publication.orgunit2: KUAR (KU Arçelik Research Center for Creative Industries)
relation.isOrgUnitOfPublication: 483fa792-2b89-4020-9073-eb4f497ee3fd
relation.isOrgUnitOfPublication: 738de008-9021-4b5c-a60b-378fded7ef70
relation.isOrgUnitOfPublication.latestForDiscovery: 483fa792-2b89-4020-9073-eb4f497ee3fd
relation.isParentOrgUnitOfPublication: 3f7621e3-0d26-42c2-af64-58a329522794
relation.isParentOrgUnitOfPublication: d437580f-9309-4ecb-864a-4af58309d287
relation.isParentOrgUnitOfPublication.latestForDiscovery: 3f7621e3-0d26-42c2-af64-58a329522794