Publication:
Two-hand on-skin gesture recognition: a dataset and classification network for enhanced human-computer interaction

dc.contributor.coauthor: Keskin, Ege
dc.contributor.department: Department of Media and Visual Arts
dc.contributor.department: Department of Computer Engineering
dc.contributor.kuauthor: Faculty Member, Özcan, Oğuzhan
dc.contributor.kuauthor: Faculty Member, Yemez, Yücel
dc.contributor.schoolcollegeinstitute: College of Social Sciences and Humanities
dc.contributor.schoolcollegeinstitute: College of Engineering
dc.date.accessioned: 2025-09-10T04:55:16Z
dc.date.available: 2025-09-09
dc.date.issued: 2025
dc.description.abstract: Gestural interaction is an increasingly utilized method for controlling devices and environments. Despite the growing research on gesture recognition, datasets tailored specifically for two-hand on-skin interaction remain scarce. This paper presents the two-hand on-skin (THOS) dataset, comprising 3096 labeled samples and 92,880 frames from three subjects across nine gesture classes. The dataset is based on hand-specific on-skin (HSoS) gestures, which involve direct contact between both hands. We also introduce THOSnet, a hybrid model leveraging transformer decoders and bi-directional long short-term memory (BiLSTM) for gesture classification. Evaluations show that THOSnet outperforms standalone transformer encoders and BiLSTMs, achieving an average test accuracy of 79.31% on the THOS dataset. Our contributions aim to bridge the gap between dynamic gesture recognition and on-skin interaction research, offering valuable resources for developing and testing advanced gesture recognition models. By open-sourcing the dataset and code through https://github.com/ege621/thos-dataset, we facilitate further research and reproducibility in this area.
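A minimal, hypothetical sketch of the kind of hybrid BiLSTM + transformer-decoder classifier the abstract describes is given below. It is not the authors' THOSnet implementation: the per-frame feature dimension (42 hand-keypoint values), sequence length, layer sizes, and the use of a learned query token attending over BiLSTM outputs are illustrative assumptions; only the nine-class output follows from the abstract.

# Hypothetical sketch only; exact THOSnet architecture and input format are assumed, not taken from the paper.
import torch
import torch.nn as nn

class HybridGestureClassifier(nn.Module):
    def __init__(self, feat_dim=42, hidden=128, num_classes=9, num_layers=2):
        super().__init__()
        # BiLSTM encodes the per-frame feature sequence in both directions.
        self.bilstm = nn.LSTM(feat_dim, hidden, num_layers=1,
                              batch_first=True, bidirectional=True)
        d_model = 2 * hidden  # forward + backward hidden states
        # A single learned query token attends over the BiLSTM outputs
        # through standard transformer decoder layers (assumed fusion scheme).
        self.query = nn.Parameter(torch.randn(1, 1, d_model))
        decoder_layer = nn.TransformerDecoderLayer(
            d_model=d_model, nhead=4, dim_feedforward=4 * d_model,
            batch_first=True)
        self.decoder = nn.TransformerDecoder(decoder_layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, num_classes)

    def forward(self, x):
        # x: (batch, frames, feat_dim), one gesture sample per sequence.
        memory, _ = self.bilstm(x)                    # (batch, frames, 2*hidden)
        query = self.query.expand(x.size(0), -1, -1)  # (batch, 1, 2*hidden)
        fused = self.decoder(tgt=query, memory=memory)
        return self.head(fused.squeeze(1))            # (batch, num_classes) logits

if __name__ == "__main__":
    model = HybridGestureClassifier()
    dummy = torch.randn(4, 30, 42)  # 4 samples, 30 frames, 42 features each
    print(model(dummy).shape)       # torch.Size([4, 9])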
dc.description.fulltext: No
dc.description.harvestedfrom: Manual
dc.description.indexedby: WOS
dc.description.publisherscope: International
dc.description.readpublish: N/A
dc.description.sponsoredbyTubitakEu: N/A
dc.identifier.doi: 10.1007/s00371-025-04125-y
dc.identifier.eissn: 1432-2315
dc.identifier.embargo: No
dc.identifier.issn: 0178-2789
dc.identifier.quartile: N/A
dc.identifier.uri: https://doi.org/10.1007/s00371-025-04125-y
dc.identifier.uri: https://hdl.handle.net/20.500.14288/30052
dc.identifier.wos: 001541832500001
dc.keywords: On-skin gestures
dc.keywords: Dataset
dc.keywords: Dynamic gesture
dc.keywords: Two-hand gestures
dc.keywords: Deep learning
dc.language.iso: eng
dc.publisher: Springer
dc.relation.affiliation: Koç University
dc.relation.collection: Koç University Institutional Repository
dc.relation.ispartof: The Visual Computer
dc.subject: Computer Science, Software Engineering
dc.title: Two-hand on-skin gesture recognition: a dataset and classification network for enhanced human-computer interaction
dc.type: Journal Article
dspace.entity.type: Publication
relation.isOrgUnitOfPublication: 483fa792-2b89-4020-9073-eb4f497ee3fd
relation.isOrgUnitOfPublication: 89352e43-bf09-4ef4-82f6-6f9d0174ebae
relation.isOrgUnitOfPublication.latestForDiscovery: 483fa792-2b89-4020-9073-eb4f497ee3fd
relation.isParentOrgUnitOfPublication: 3f7621e3-0d26-42c2-af64-58a329522794
relation.isParentOrgUnitOfPublication: 8e756b23-2d4a-4ce8-b1b3-62c794a8c164
relation.isParentOrgUnitOfPublication.latestForDiscovery: 3f7621e3-0d26-42c2-af64-58a329522794
