Publication: Two-hand on-skin gesture recognition: a dataset and classification network for enhanced human-computer interaction
| dc.contributor.coauthor | Keskin, Ege | |
| dc.contributor.department | Department of Media and Visual Arts | |
| dc.contributor.department | Department of Computer Engineering | |
| dc.contributor.kuauthor | Faculty Member, Özcan, Oğuzhan | |
| dc.contributor.kuauthor | Faculty Member, Yemez, Yücel | |
| dc.contributor.schoolcollegeinstitute | College of Social Sciences and Humanities | |
| dc.contributor.schoolcollegeinstitute | College of Engineering | |
| dc.date.accessioned | 2025-09-10T04:55:16Z | |
| dc.date.available | 2025-09-09 | |
| dc.date.issued | 2025 | |
| dc.description.abstract | Gestural interaction is an increasingly utilized method for controlling devices and environments. Despite the growing research on gesture recognition, datasets tailored specifically for two-hand on-skin interaction remain scarce. This paper presents the two-hand on-skin (THOS) dataset, comprising 3,096 labeled samples and 92,880 frames from three subjects across nine gesture classes. The dataset is based on hand-specific on-skin (HSoS) gestures, which involve direct contact between both hands. We also introduce THOSnet, a hybrid model leveraging transformer decoders and bi-directional long short-term memory (BiLSTM) for gesture classification. Evaluations show that THOSnet outperforms standalone transformer encoders and BiLSTMs, achieving an average test accuracy of 79.31% on the THOS dataset. Our contributions aim to bridge the gap between dynamic gesture recognition and on-skin interaction research, offering valuable resources for developing and testing advanced gesture recognition models. By open-sourcing the dataset and code through https://github.com/ege621/thos-dataset, we facilitate further research and reproducibility in this area. | |
| dc.description.fulltext | No | |
| dc.description.harvestedfrom | Manual | |
| dc.description.indexedby | WOS | |
| dc.description.publisherscope | International | |
| dc.description.readpublish | N/A | |
| dc.description.sponsoredbyTubitakEu | N/A | |
| dc.identifier.doi | 10.1007/s00371-025-04125-y | |
| dc.identifier.eissn | 1432-2315 | |
| dc.identifier.embargo | No | |
| dc.identifier.issn | 0178-2789 | |
| dc.identifier.quartile | N/A | |
| dc.identifier.uri | https://doi.org/10.1007/s00371-025-04125-y | |
| dc.identifier.uri | https://hdl.handle.net/20.500.14288/30052 | |
| dc.identifier.wos | 001541832500001 | |
| dc.keywords | On-skin gestures | |
| dc.keywords | Dataset | |
| dc.keywords | Dynamic gesture | |
| dc.keywords | Two-hand gestures | |
| dc.keywords | Deep learning | |
| dc.language.iso | eng | |
| dc.publisher | Springer | |
| dc.relation.affiliation | Koç University | |
| dc.relation.collection | Koç University Institutional Repository | |
| dc.relation.ispartof | The Visual Computer | |
| dc.subject | Computer Science, Software Engineering | |
| dc.title | Two-hand on-skin gesture recognition: a dataset and classification network for enhanced human-computer interaction | |
| dc.type | Journal Article | |
| dspace.entity.type | Publication | |
| relation.isOrgUnitOfPublication | 483fa792-2b89-4020-9073-eb4f497ee3fd | |
| relation.isOrgUnitOfPublication | 89352e43-bf09-4ef4-82f6-6f9d0174ebae | |
| relation.isOrgUnitOfPublication.latestForDiscovery | 483fa792-2b89-4020-9073-eb4f497ee3fd | |
| relation.isParentOrgUnitOfPublication | 3f7621e3-0d26-42c2-af64-58a329522794 | |
| relation.isParentOrgUnitOfPublication | 8e756b23-2d4a-4ce8-b1b3-62c794a8c164 | |
| relation.isParentOrgUnitOfPublication.latestForDiscovery | 3f7621e3-0d26-42c2-af64-58a329522794 |
