Publication: PROTEST-ER: retraining BERT for protest event extraction
KU Authors
Co-Authors
Caselli, Tommaso
Basile, Angelo
Publication Date
2021
Language
English
Type
Conference proceeding
Abstract
We analyze the effect of further pre-training BERT with different domain-specific data as an unsupervised domain adaptation strategy for event extraction. Portability of event extraction models is particularly challenging, with large performance drops affecting data from the same text genres (e.g., news). We present PROTEST-ER, a retrained BERT model for protest event extraction. PROTEST-ER outperforms a corresponding generic BERT on out-of-domain data by 8.1 points. Our best-performing models achieve 51.91 and 46.39 F1 across the two domains.
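
The further pre-training the abstract describes corresponds to continued masked-language-model training on in-domain text before any task-specific fine-tuning. Below is a minimal illustrative sketch, assuming the Hugging Face Transformers and Datasets libraries; the corpus file name and all hyperparameters are hypothetical, not the authors' published setup.

# Illustrative sketch of domain-adaptive further pre-training of BERT via
# masked language modeling (MLM). Corpus path and hyperparameters are
# assumptions for illustration, not taken from the paper.
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# Hypothetical plain-text corpus of protest-related news, one document per line.
dataset = load_dataset("text", data_files={"train": "protest_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# Randomly mask 15% of tokens: the standard BERT MLM objective.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="protest-er",
    num_train_epochs=3,               # assumption; tune on held-out perplexity
    per_device_train_batch_size=16,
    learning_rate=5e-5,
)

Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=collator,
).train()

# The resulting encoder can then be fine-tuned for the event extraction task.
model.save_pretrained("protest-er")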
Description
Source:
Proceedings of the 4th Workshop on Challenges and Applications of Automated Extraction of Socio-political Events from Text (CASE 2021)
Publisher:
Association for Computational Linguistics (ACL)
Subject
Computer science, artificial intelligence; Computer science, interdisciplinary applications; Linguistics