Researcher: Keçeci, Sinan

Job Title: Master Student
First Name: Sinan
Last Name: Keçeci
Name Variants: Keçeci, Sinan

Search Results

Now showing 1 - 3 of 3
  • Publication
    The JESTKOD database: an affective multimodal database of dyadic interactions
    (Springer, 2017) Bozkurt, Elif; Khaki, Hossein; Keçeci, Sinan; Türker, Bekir Berker; Yemez, Yücel; Erzin, Engin. Department of Computer Engineering; Graduate School of Sciences and Engineering; College of Engineering.
    In human-to-human communication, gesture and speech co-exist in time with a tight synchrony, and gestures are often utilized to complement or to emphasize speech. In human-computer interaction systems, natural, affective, and believable use of gestures would be a valuable key component in adopting and emphasizing human-centered aspects. However, natural and affective multimodal data for studying computational models of gesture and speech are limited. In this study, we introduce the JESTKOD database, which consists of speech and full-body motion capture data recordings in a dyadic interaction setting under agreement and disagreement scenarios. Participants of the dyadic interactions are native Turkish speakers, and the recordings of each participant are rated in dimensional affect space. We present our multimodal data collection and annotation process, as well as our preliminary experimental studies on agreement/disagreement classification of dyadic interactions using body gesture and speech data. The JESTKOD database provides a valuable asset for investigating gesture and speech toward designing more natural and affective human-computer interaction systems.
  • Publication
    Analysis of JestKOD database using affective state annotations
    (Institute of Electrical and Electronics Engineers (IEEE), 2016) Keçeci, Sinan; Erzin, Engin; Yemez, Yücel. Department of Computer Engineering; Graduate School of Sciences and Engineering; College of Engineering.
    Gesticulation, together with speech, is an important part of natural and affective human-human interaction. Analysis of gesticulation and speech is expected to help in designing more natural human-computer interaction (HCI) systems. We built the JestKOD database, which consists of speech and motion capture recordings of dyadic interactions. In this paper, we describe our annotation efforts and present the evaluations that we performed on the annotations of the JestKOD database. These evaluations suggest important findings for the usability of the JestKOD database in the analysis and modeling of HCI systems.
  • Publication
    JESTKOD database: dyadic interaction analysis
    (IEEE, 2015) Erzin, Engin; Bozkurt, Elif; Yemez, Yücel; Türker, Bekir Berker; Keçeci, Sinan; Khaki, Hossein. Department of Computer Engineering; College of Engineering; Graduate School of Sciences and Engineering.
    In the nature of human-to-human communication, gesture and speech co-exist in time with a tight synchrony. We tend to use gestures to complement or to emphasize speech. In this study we present the JESTKOD database, which will be a valuable asset for examining gesture and speech in defining more natural human-computer interaction systems. The JESTKOD database consists of speech and motion capture data recordings of dyadic interactions under friendly and unfriendly interaction scenarios. In this paper we present our multimodal data collection process, as well as early experimental studies on friendly/unfriendly classification of dyadic interactions using body gesture and speech data. © 2015 IEEE. / Abstract (translated from Turkish): Body gestures, together with speech, form an important part of human-human communication, serving to emphasize and complement it. With the multimodal JESTKOD database presented in this study, we aim to make human-computer interaction more natural by examining speech and body gestures, an important part of human-human communication. The JESTKOD database consists of recordings of speech and motion-capture-based body movements, grounded in the participants' emotional changes, under positive and negative interaction scenarios. In this paper we present the methods and stages of preparing this multimodal database. We also present our positive/negative interaction classification results, obtained from the speech and motion capture recordings, for evaluating the dyadic interaction scenarios of our database.