Publication:
Food intake detection using autoencoder-based deep neural networks

Organizational Units

Program

KU Authors

Co-Authors

Advisor

Publication Date

2018

Language

Turkish

Type

Conference proceeding

Journal Title

Journal ISSN

Volume Title

Abstract

Wearable systems have the potential to reduce bias and inaccuracy in current dietary monitoring methods. The analysis of food intake sounds provides important guidance for developing an automated diet monitoring system. Most recent attempts can be regarded as impractical because they require multiple sensors, each specialized in detecting either swallowing or chewing. In this study, we present a unified system that uses a single laryngeal microphone placed on the neck to detect swallowing and chewing activities, as well as everyday activities such as speech, coughing, and throat clearing. Our proposed system is trained on a dataset of 10 different food items collected from 8 subjects. Spectrograms extracted from a total of 276 minutes of recordings are fed into a deep autoencoder architecture. In the three-class evaluation (chewing, swallowing, and rest), we achieve an F-score of 71.7% and an accuracy of 76.3%. These results are a promising contribution toward an automated dietary monitoring system that can operate under everyday conditions.
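The abstract describes the general pipeline (spectrogram frames fed into a deep autoencoder, followed by three-class classification) without specifying the architecture or training details. The following is a minimal, illustrative Python/PyTorch sketch of that kind of pipeline, not the authors' actual model: all layer sizes, epoch counts, learning rates, and the stand-in data below are hypothetical placeholders.

import torch
import torch.nn as nn
import torch.nn.functional as F

N_FREQ_BINS = 128   # hypothetical number of frequency bins per spectrogram frame
LATENT_DIM = 32     # hypothetical autoencoder bottleneck size
N_CLASSES = 3       # chewing, swallowing, rest

encoder = nn.Sequential(
    nn.Linear(N_FREQ_BINS, 64), nn.ReLU(),
    nn.Linear(64, LATENT_DIM), nn.ReLU(),
)
decoder = nn.Sequential(
    nn.Linear(LATENT_DIM, 64), nn.ReLU(),
    nn.Linear(64, N_FREQ_BINS),
)
classifier = nn.Linear(LATENT_DIM, N_CLASSES)

def pretrain_autoencoder(frames, epochs=20, lr=1e-3):
    # Unsupervised reconstruction of spectrogram frames, shape [n_frames, N_FREQ_BINS].
    opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = F.mse_loss(decoder(encoder(frames)), frames)
        loss.backward()
        opt.step()

def train_classifier(frames, labels, epochs=20, lr=1e-3):
    # Supervised fine-tuning: encoder features feed a 3-way softmax classifier.
    opt = torch.optim.Adam(list(encoder.parameters()) + list(classifier.parameters()), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = F.cross_entropy(classifier(encoder(frames)), labels)
        loss.backward()
        opt.step()

# Random stand-in data only; real inputs would be spectrogram frames computed from
# the laryngeal microphone recordings, with frame-level activity labels.
frames = torch.rand(256, N_FREQ_BINS)
labels = torch.randint(0, N_CLASSES, (256,))
pretrain_autoencoder(frames)
train_classifier(frames, labels)
predictions = classifier(encoder(frames)).argmax(dim=1)

The two-stage pattern (reconstruction pretraining, then supervised fine-tuning on the encoder's latent features) is one common way autoencoder-based classifiers are built; the paper itself should be consulted for the actual architecture and training procedure.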

Description

Source:

26th IEEE Signal Processing and Communications Applications Conference, SIU 2018

Publisher:

Institute of Electrical and Electronics Engineers (IEEE)

Keywords:

Subject

Civil engineering, Electrical and electronics engineering, Telecommunication

Citation

Endorsement

Review

Supplemented By

Referenced By

Copy Rights Note
