Publication: Visual-spatial and verbal abilities differentially affect processing of gestural vs. spoken expressions
dc.contributor.coauthor | N/A | |
dc.contributor.department | N/A | |
dc.contributor.department | Department of Psychology | |
dc.contributor.kuauthor | Özer, Demet | |
dc.contributor.kuauthor | Göksun, Tilbe | |
dc.contributor.kuprofile | PhD Student | |
dc.contributor.kuprofile | Faculty Member | |
dc.contributor.other | Department of Psychology | |
dc.contributor.schoolcollegeinstitute | Graduate School of Social Sciences and Humanities | |
dc.contributor.schoolcollegeinstitute | College of Social Sciences and Humanities | |
dc.contributor.yokid | N/A | |
dc.contributor.yokid | 47278 | |
dc.date.accessioned | 2024-11-09T23:26:31Z | |
dc.date.issued | 2020 | |
dc.description.abstract | Listeners are sensitive to speakers' co-speech iconic gestures. Concurrent visual and verbal information compete for attentional resources during multimodal comprehension. The current study examined the role of individual differences in visual-spatial vs. verbal abilities on individuals' differential sensitivity to gestural vs. spoken expressions. Turkish-speaking adults (N = 83) were tested on their sensitivity to concurrent gesture vs. speech in an online task (Kelly, S. D., Özyürek, A., & Maris, E. (2010a). Two sides of the same coin. Psychological Science, 21(2), 260-267) and were administered spatial and verbal working memory measures. Participants were slower and less accurate when gesture and speech were incongruent with one another compared to the baseline condition, in which they expressed congruent information. People with higher spatial working memory capacity were more efficient in processing gestures, whereas people with higher verbal working memory capacity were more sensitive to spoken expressions. These results suggest that not all people are equally sensitive to co-speech gestures and that some people may benefit more from gestures during comprehension. | |
dc.description.indexedby | WoS | |
dc.description.indexedby | Scopus | |
dc.description.issue | 7 | |
dc.description.openaccess | NO | |
dc.description.sponsorship | This work was supported by the Türkiye Bilimler Akademisi (Turkish Academy of Sciences) Outstanding Young Scientist Award 2018 given to Tilbe Göksun. | |
dc.description.volume | 35 | |
dc.identifier.doi | 10.1080/23273798.2019.1703016 | |
dc.identifier.eissn | 2327-3801 | |
dc.identifier.issn | 2327-3798 | |
dc.identifier.scopus | 2-s2.0-85076900211 | |
dc.identifier.uri | http://dx.doi.org/10.1080/23273798.2019.1703016 | |
dc.identifier.uri | https://hdl.handle.net/20.500.14288/11565 | |
dc.identifier.wos | 503305200001 | |
dc.keywords | Gesture-speech incongruency | |
dc.keywords | Gesture perception | |
dc.keywords | Individual differences | |
dc.keywords | Visual-spatial resources | |
dc.keywords | Verbal resources | |
dc.keywords | Cognitive load theory | |
dc.keywords | Working-memory capacity | |
dc.keywords | Iconic gestures | |
dc.keywords | Individual-differences | |
dc.keywords | Hand gestures | |
dc.keywords | Visible speech | |
dc.keywords | Semantic information | |
dc.keywords | Integration | |
dc.keywords | Language | |
dc.keywords | Communication | |
dc.language | English | |
dc.publisher | Routledge Journals, Taylor and Francis Ltd | |
dc.source | Language, Cognition and Neuroscience | |
dc.subject | Audiology | |
dc.subject | Speech-language pathology | |
dc.subject | Behavioral sciences | |
dc.subject | Linguistics | |
dc.subject | Psychology | |
dc.subject | Experimental psychology | |
dc.title | Visual-spatial and verbal abilities differentially affect processing of gestural vs. spoken expressions | |
dc.type | Journal Article | |
dspace.entity.type | Publication | |
local.contributor.authorid | 0000-0003-3230-2874 | |
local.contributor.authorid | 0000-0002-0190-7988 | |
local.contributor.kuauthor | Özer, Demet | |
local.contributor.kuauthor | Göksun, Tilbe | |
relation.isOrgUnitOfPublication | d5fc0361-3a0a-4b96-bf2e-5cd6b2b0b08c | |
relation.isOrgUnitOfPublication.latestForDiscovery | d5fc0361-3a0a-4b96-bf2e-5cd6b2b0b08c |