Publication: GestAnalytics: experiment and analysis tool for gesture-elicitation studies
dc.contributor.department | N/A | |
dc.contributor.department | Department of Media and Visual Arts | |
dc.contributor.kuauthor | Buruk, Oğuz Turan | |
dc.contributor.kuauthor | Özcan, Oğuzhan | |
dc.contributor.kuprofile | PhD Student | |
dc.contributor.kuprofile | Faculty Member | |
dc.contributor.other | Department of Media and Visual Arts | |
dc.contributor.researchcenter | KU Arçelik Research Center for Creative Industries (KUAR) / KU Arçelik Yaratıcı Endüstriler Uygulama ve Araştırma Merkezi (KUAR) | |
dc.contributor.schoolcollegeinstitute | Graduate School of Social Sciences and Humanities | |
dc.contributor.schoolcollegeinstitute | College of Social Sciences and Humanities | |
dc.contributor.yokid | N/A | |
dc.contributor.yokid | 12532 | |
dc.date.accessioned | 2024-11-10T00:11:43Z | |
dc.date.issued | 2017 | |
dc.description.abstract | Gesture-elicitation studies are common and important for understanding user preferences. In these studies, researchers aim to extract the gestures that users find desirable for different kinds of interfaces. During this process, researchers must manually analyze many videos, which is a tiring and time-consuming task. Although current tools for video analysis provide annotation capabilities and features like automatic gesture analysis, researchers still need to (1) divide videos into meaningful pieces, (2) manually examine each piece, (3) match collected user data with these pieces, (4) code each video and (5) verify their coding. These processes are burdensome, and current tools do not aim to make them easier or faster. To fill this gap, we developed "GestAnalytics", which features simultaneous video monitoring, video tagging and filtering. Our internal pilot tests show that GestAnalytics can be a beneficial tool for researchers who practice video analysis for gestural interfaces. | |
dc.description.indexedby | WoS | |
dc.description.indexedby | Scopus | |
dc.description.openaccess | NO | |
dc.identifier.doi | 10.1145/3064857.3079114 | |
dc.identifier.isbn | 978-1-4503-4991-8 | |
dc.identifier.scopus | 2-s2.0-85022211932 | |
dc.identifier.uri | http://dx.doi.org/10.1145/3064857.3079114 | |
dc.identifier.uri | https://hdl.handle.net/20.500.14288/17532 | |
dc.identifier.wos | 482931500007 | |
dc.keywords | User-elicitation | |
dc.keywords | Gestures | |
dc.keywords | Video Annotation | |
dc.keywords | Video analysis | |
dc.keywords | Gesture elicitation | |
dc.keywords | User centered design | |
dc.language | English | |
dc.publisher | Association for Computing Machinery (ACM) | |
dc.source | DIS'17 Companion: Proceedings of the 2017 ACM Conference on Designing Interactive Systems | |
dc.subject | Computer science | |
dc.subject | Theory and methods | |
dc.title | GestAnalytics: experiment and analysis tool for gesture-elicitation studies | |
dc.type | Conference proceeding | |
dspace.entity.type | Publication | |
local.contributor.authorid | 0000-0002-8655-5327 | |
local.contributor.authorid | 0000-0002-4410-3955 | |
local.contributor.kuauthor | Buruk, Oğuz Turan | |
local.contributor.kuauthor | Özcan, Oğuzhan | |
relation.isOrgUnitOfPublication | 483fa792-2b89-4020-9073-eb4f497ee3fd | |
relation.isOrgUnitOfPublication.latestForDiscovery | 483fa792-2b89-4020-9073-eb4f497ee3fd |