Publication:
Monitoring infant's emotional cry in domestic environments using the capsule network architecture

dc.contributor.department: N/A
dc.contributor.department: Department of Computer Engineering
dc.contributor.kuauthor: Turan, Mehmet Ali Tuğtekin
dc.contributor.kuauthor: Erzin, Engin
dc.contributor.kuprofile: PhD Student
dc.contributor.kuprofile: Faculty Member
dc.contributor.other: Department of Computer Engineering
dc.contributor.schoolcollegeinstitute: Graduate School of Sciences and Engineering
dc.contributor.schoolcollegeinstitute: College of Engineering
dc.contributor.yokid: N/A
dc.contributor.yokid: 34503
dc.date.accessioned: 2024-11-09T23:51:35Z
dc.date.issued: 2018
dc.description.abstract: Automated recognition of an infant's cry from audio can be considered a preliminary step for applications such as remote baby monitoring. In this paper, we implement a recently introduced deep learning topology called the capsule network (CapsNet) for the cry recognition problem. A capsule in the CapsNet is a group of neurons whose activity vector represents the probability that a particular entity exists. Active capsules at one level make predictions, via transformation matrices, for the parameters of higher-level capsules; when multiple predictions agree, a higher-level capsule becomes active. We employ spectrogram representations of short segments of the audio signal as the input to the CapsNet. For experimental evaluation, we apply the proposed method to the INTERSPEECH 2018 Computational Paralinguistics Challenge (ComParE) Crying Sub-Challenge, a three-class classification task on an annotated database (CRIED). The provided audio samples contain recordings from 20 healthy infants and are categorized into three classes, namely neutral, fussing, and crying. We show that the multi-layer CapsNet is competitive with the baseline performance on the CRIED corpus and considerably better than a conventional convolutional network.
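The abstract's core mechanism (activity vectors whose length encodes existence probability, and routing by agreement between capsule layers) can be sketched in a few lines. This is a minimal, generic illustration of dynamic routing as introduced by Sabour et al. (2017), not the authors' actual ComParE system; the shapes and iteration count are illustrative assumptions.

```python
import numpy as np

def squash(v, axis=-1, eps=1e-8):
    """Squash nonlinearity: scales a capsule's activity vector so its
    length lies in [0, 1) while preserving its orientation."""
    sq_norm = np.sum(v ** 2, axis=axis, keepdims=True)
    return (sq_norm / (1.0 + sq_norm)) * v / np.sqrt(sq_norm + eps)

def routing_by_agreement(u_hat, n_iters=3):
    """Dynamic routing between two capsule layers.
    u_hat: predictions from lower-level capsules for each higher-level
           capsule, shape (n_lower, n_higher, dim_higher). In a full
           CapsNet these come from learned transformation matrices."""
    n_lower, n_higher, _ = u_hat.shape
    b = np.zeros((n_lower, n_higher))  # routing logits
    for _ in range(n_iters):
        # coupling coefficients: softmax over higher-level capsules
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)
        s = (c[..., None] * u_hat).sum(axis=0)   # weighted sum of predictions
        v = squash(s)                            # higher-capsule outputs
        b += (u_hat * v[None]).sum(axis=-1)      # reward agreeing predictions
    return v

# Toy example: 6 lower-level capsules predict for 3 higher-level
# capsules of dimension 4 (e.g. one per cry class in a 3-class task).
rng = np.random.default_rng(0)
u_hat = rng.normal(size=(6, 3, 4))
v = routing_by_agreement(u_hat)
print(np.linalg.norm(v, axis=-1))  # each vector length < 1: an existence probability
```

In the paper's setting, the input to the first capsule layer would be convolutional features computed from the spectrogram segments, and the lengths of the final capsules' activity vectors would score the neutral/fussing/crying classes.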
dc.description.indexedby: WoS
dc.description.indexedby: Scopus
dc.description.openaccess: YES
dc.description.publisherscope: International
dc.description.sponsorship: Adobe
dc.description.sponsorship: et al.
dc.description.sponsorship: JD.Com
dc.description.sponsorship: MI
dc.description.sponsorship: Samsung
dc.description.sponsorship: Uniphore
dc.description.volume: 2018-September
dc.identifier.doi: 10.21437/Interspeech.2018-2187
dc.identifier.issn: 2308-457X
dc.identifier.link: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85054988099&doi=10.21437%2fInterspeech.2018-2187&partnerID=40&md5=6b3c8e096381fcce670331be661e961f
dc.identifier.scopus: 2-s2.0-85054988099
dc.identifier.uri: http://dx.doi.org/10.21437/Interspeech.2018-2187
dc.identifier.uri: https://hdl.handle.net/20.500.14288/14738
dc.identifier.wos: 465363900027
dc.keywords: Baby cry detection
dc.keywords: Capsule network
dc.keywords: ComParE
dc.keywords: Computational paralinguistic
dc.keywords: Classification (of information)
dc.keywords: Deep learning
dc.keywords: Linear transformations
dc.keywords: Linguistics
dc.keywords: Network architecture
dc.keywords: Base-line performance
dc.keywords: Domestic environments
dc.keywords: Emotion recognition
dc.keywords: Experimental evaluation
dc.keywords: Paralinguistic
dc.keywords: Three-class classification
dc.keywords: Transformation matrices
dc.keywords: Speech communication
dc.language: English
dc.publisher: International Speech Communication Association
dc.source: Proceedings of the Annual Conference of the International Speech Communication Association, INTERSPEECH
dc.subject: Computer Science
dc.subject: Artificial intelligence
dc.subject: Electrical electronics engineering
dc.title: Monitoring infant's emotional cry in domestic environments using the capsule network architecture
dc.type: Conference proceeding
dspace.entity.type: Publication
local.contributor.authorid: 0000-0002-3822-235X
local.contributor.authorid: 0000-0002-2715-2368
local.contributor.kuauthor: Turan, Mehmet Ali Tuğtekin
local.contributor.kuauthor: Erzin, Engin
relation.isOrgUnitOfPublication: 89352e43-bf09-4ef4-82f6-6f9d0174ebae
relation.isOrgUnitOfPublication.latestForDiscovery: 89352e43-bf09-4ef4-82f6-6f9d0174ebae