Publication: Speech Disfluencies and Hand Gestures as Metacognitive Cues
KU Authors
Co-Authors
Yilmaz, Begum
Furman, Reyhan
Goksun, Tilbe
Eskenazi, Terry
Embargo Status
No
Abstract
How language interacts with metacognitive processes is an understudied area. Earlier research shows that people produce disfluencies (i.e., uhs or ums) in their speech when they are unsure of their answers, indicating metacognitive monitoring. Gestures have monitoring and predictive roles in language, also implicating metacognitive processes. Further, the rates of speech disfluencies and gestures change as a function of the communicative setting: people produce fewer disfluencies and more gestures when they can see the listener than when the listener is not visible. In the current study, 50 participants (32 women, Mage = 21.16, SD = 1.46) were asked 40 general knowledge questions with either a visible (n = 25) or a nonvisible (n = 25) listener. They provided a feeling-of-knowing (FOK) judgment immediately after seeing each question and were asked to think aloud while pondering their answers. They then provided retrospective confidence judgments (RCJs). Results showed that gestures and speech disfluencies were related neither to answer accuracy nor to FOK judgments. However, both gestures and speech disfluencies predicted RCJs uniquely and interactively: speech disfluencies negatively predicted RCJs, whereas hand gestures were positively related to RCJs. Importantly, gesture use was more strongly related to RCJs when disfluencies were also higher. No effect of communicative setting on the rate of gestures or speech disfluencies was found. These results highlight the importance of multimodal language cues in the elaboration of metacognitive judgments.
Publisher
Wiley
Subject
Psychology, Experimental
Source
Cognitive Science
DOI
10.1111/cogs.70093
