Publication:
The eHRI database: a multimodal database of engagement in human-robot interactions

dc.contributor.department: Department of Computer Engineering
dc.contributor.department: Graduate School of Sciences and Engineering
dc.contributor.department: KUIS AI (Koç University & İş Bank Artificial Intelligence Center)
dc.contributor.kuauthor: Bayramoğlu, Öykü Zeynep
dc.contributor.kuauthor: Erzin, Engin
dc.contributor.kuauthor: Hussain, Nusrah
dc.contributor.kuauthor: Kesim, Ege
dc.contributor.kuauthor: Numanoğlu, Tuğçe
dc.contributor.kuauthor: Sezgin, Tevfik Metin
dc.contributor.kuauthor: Türker, Bekir Berker
dc.contributor.kuauthor: Yemez, Yücel
dc.contributor.schoolcollegeinstitute: College of Engineering
dc.contributor.schoolcollegeinstitute: GRADUATE SCHOOL OF SCIENCES AND ENGINEERING
dc.contributor.schoolcollegeinstitute: Research Center
dc.date.accessioned: 2024-11-09T23:36:42Z
dc.date.issued: 2023
dc.description.abstract: We present the engagement in human-robot interaction (eHRI) database containing natural interactions between two human participants and a robot under a story-shaping game scenario. The audio-visual recordings provided with the database are fully annotated at a 5-intensity scale for head nods and smiles, as well as with speech transcription and continuous engagement values. In addition, we present baseline results for the smile and head nod detection along with a real-time multimodal engagement monitoring system. We believe that the eHRI database will serve as a novel asset for research in affective human-robot interaction by providing raw data, annotations, and baseline results.
dc.description.indexedby: WOS
dc.description.indexedby: Scopus
dc.description.openaccess: NO
dc.description.publisherscope: International
dc.description.sponsoredbyTubitakEu: TÜBİTAK
dc.description.sponsorship: Scientific and Technological Research Council of Turkey (TÜBİTAK)
dc.identifier.doi: 10.1007/s10579-022-09632-1
dc.identifier.eissn: 1574-0218
dc.identifier.issn: 1574-020X
dc.identifier.quartile: Q3
dc.identifier.scopus: 2-s2.0-85147748798
dc.identifier.uri: https://doi.org/10.1007/s10579-022-09632-1
dc.identifier.uri: https://hdl.handle.net/20.500.14288/12695
dc.identifier.wos: 931760100001
dc.keywords: Engagement
dc.keywords: Gesture
dc.keywords: Multimodal data
dc.keywords: Human-robot interaction
dc.language.iso: eng
dc.publisher: Springer
dc.relation.grantno: Grant Number 217E04
dc.relation.ispartof: Language Resources and Evaluation
dc.subject: Computer science
dc.title: The eHRI database: a multimodal database of engagement in human-robot interactions
dc.type: Journal Article
dspace.entity.type: Publication
local.contributor.kuauthor: Kesim, Ege
local.contributor.kuauthor: Numanoğlu, Tuğçe
local.contributor.kuauthor: Bayramoğlu, Öykü Zeynep
local.contributor.kuauthor: Türker, Bekir Berker
local.contributor.kuauthor: Hussain, Nusrah
local.contributor.kuauthor: Sezgin, Tevfik Metin
local.contributor.kuauthor: Yemez, Yücel
local.contributor.kuauthor: Erzin, Engin
local.publication.orgunit1: GRADUATE SCHOOL OF SCIENCES AND ENGINEERING
local.publication.orgunit1: College of Engineering
local.publication.orgunit1: Research Center
local.publication.orgunit2: Department of Computer Engineering
local.publication.orgunit2: KUIS AI (Koç University & İş Bank Artificial Intelligence Center)
local.publication.orgunit2: Graduate School of Sciences and Engineering
relation.isOrgUnitOfPublication: 89352e43-bf09-4ef4-82f6-6f9d0174ebae
relation.isOrgUnitOfPublication: 3fc31c89-e803-4eb1-af6b-6258bc42c3d8
relation.isOrgUnitOfPublication: 77d67233-829b-4c3a-a28f-bd97ab5c12c7
relation.isOrgUnitOfPublication.latestForDiscovery: 89352e43-bf09-4ef4-82f6-6f9d0174ebae
relation.isParentOrgUnitOfPublication: 8e756b23-2d4a-4ce8-b1b3-62c794a8c164
relation.isParentOrgUnitOfPublication: 434c9663-2b11-4e66-9399-c863e2ebae43
relation.isParentOrgUnitOfPublication: d437580f-9309-4ecb-864a-4af58309d287
relation.isParentOrgUnitOfPublication.latestForDiscovery: 8e756b23-2d4a-4ce8-b1b3-62c794a8c164
