Publication:
Food intake detection using autoencoder-based deep neural networks

dc.contributor.departmentN/A
dc.contributor.departmentDepartment of Computer Engineering
dc.contributor.kuauthorTuran, Mehmet Ali Tuğtekin
dc.contributor.kuauthorErzin, Engin
dc.contributor.kuprofilePhD Student
dc.contributor.kuprofileFaculty Member
dc.contributor.otherDepartment of Computer Engineering
dc.contributor.schoolcollegeinstituteGraduate School of Sciences and Engineering
dc.contributor.schoolcollegeinstituteCollege of Engineering
dc.contributor.yokidN/A
dc.contributor.yokid34503
dc.date.accessioned2024-11-09T23:22:52Z
dc.date.issued2018
dc.description.abstractWearable systems have the potential to reduce bias and inaccuracy in current dietary monitoring methods. The analysis of food intake sounds provides important guidance for developing an automated diet monitoring system. Most attempts in recent years can be regarded as impractical due to the need for multiple sensors that specialize in swallowing or chewing detection separately. In this study, we provide a unified system for detecting swallowing and chewing activities with a laryngeal microphone placed on the neck, as well as some daily activities such as speech, coughing, or throat clearing. Our proposed system is trained on a dataset containing 10 different food items collected from 8 subjects. The spectrograms, extracted from 276 minutes of recordings in total, are fed into a deep autoencoder architecture. In the three-class evaluation (chewing, swallowing, and rest), we achieve an F-score of 71.7% and an accuracy of 76.3%. These results provide a promising contribution to an automated food monitoring system to be developed under everyday conditions.
dc.description.indexedbyWoS
dc.description.indexedbyScopus
dc.description.openaccessYES
dc.description.publisherscopeInternational
dc.description.sponsorshipAselsan
dc.description.sponsorshipet al.
dc.description.sponsorshipHuawei
dc.description.sponsorshipIEEE Signal Processing Society
dc.description.sponsorshipIEEE Turkey Section
dc.description.sponsorshipNetas
dc.identifier.doi10.1109/SIU.2018.8404522
dc.identifier.isbn978-1-5386-1501-0
dc.identifier.linkhttps://www.scopus.com/inward/record.uri?eid=2-s2.0-85050809300&doi=10.1109%2fSIU.2018.8404522&partnerID=40&md5=93897485c25f3ed4a321be91554018e6
dc.identifier.scopus2-s2.0-85050809300
dc.identifier.urihttp://dx.doi.org/10.1109/SIU.2018.8404522
dc.identifier.urihttps://hdl.handle.net/20.500.14288/11148
dc.identifier.wos511448500375
dc.keywordsAutomated dietary monitoring
dc.keywordsEating activity detection
dc.keywordsSparse autoencoders
dc.keywordsThroat microphone
dc.keywordsWearable sensors
dc.languageTurkish
dc.publisherInstitute of Electrical and Electronics Engineers (IEEE)
dc.source26th IEEE Signal Processing and Communications Applications Conference, SIU 2018
dc.subjectCivil engineering
dc.subjectElectrical electronics engineering
dc.subjectTelecommunication
dc.titleFood intake detection using autoencoder-based deep neural networks
dc.title.alternativeOtokodlayıcı tabanlı derin sinir ağları kullanarak gıda tüketiminin tespit edilmesi
dc.typeConference proceeding
dspace.entity.typePublication
local.contributor.authorid0000-0002-3822-235X
local.contributor.authorid0000-0002-2715-2368
local.contributor.kuauthorTuran, Mehmet Ali Tuğtekin
local.contributor.kuauthorErzin, Engin
relation.isOrgUnitOfPublication89352e43-bf09-4ef4-82f6-6f9d0174ebae
relation.isOrgUnitOfPublication.latestForDiscovery89352e43-bf09-4ef4-82f6-6f9d0174ebae

Files