Title: PROTEST-ER: Retraining BERT for Protest Event Extraction
Type: Conference proceeding
Departments: Department of Sociology; Department of Computer Engineering
Record date: 2024-11-09
Publication year: 2021
ISBN: 978-1-954085-79-4
DOI: 10.18653/v1/2021.case-1.4
DOI URL: https://doi.org/10.18653/v1/2021.case-1.4
Scopus ID: 2-s2.0-85119310309
Handle: https://hdl.handle.net/20.500.14288/3375
Format: pdf
Subjects: Computer science; Artificial intelligence; Computer science, interdisciplinary applications; Linguistics
Other identifiers: 694853100004; N/A; NOIR03287

Abstract: We analyze the effect of further pre-training BERT with different domain-specific data as an unsupervised domain adaptation strategy for event extraction. Portability of event extraction models is particularly challenging, with large performance drops affecting even data of the same text genre (e.g., news). We present PROTEST-ER, a retrained BERT model for protest event extraction. PROTEST-ER outperforms a corresponding generic BERT on out-of-domain data by 8.1 points. Our best performing models reach 51.91-46.39 F1 across both domains.
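The further pre-training described in the abstract is continued masked-language-model (MLM) training of BERT on domain text. As a minimal, illustrative sketch of the BERT-style MLM corruption step only (the 15% selection with an 80/10/10 split among [MASK], random token, and unchanged), using a tiny hypothetical vocabulary rather than a real tokenizer:

```python
import random

MASK = "[MASK]"
# Toy vocabulary for illustration; a real setup would use BERT's WordPiece vocab.
VOCAB = ["the", "protest", "police", "march", "event", "news"]

def mlm_mask(tokens, mask_prob=0.15, seed=0):
    """BERT-style MLM corruption: select positions with prob `mask_prob`;
    of those, 80% become [MASK], 10% a random vocab token, 10% unchanged.
    Returns (corrupted, labels) where labels[i] is the original token at
    selected positions and None elsewhere (no loss is computed on None)."""
    rng = random.Random(seed)
    corrupted, labels = list(tokens), [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok          # model must predict the original token
            r = rng.random()
            if r < 0.8:
                corrupted[i] = MASK  # 80%: replace with [MASK]
            elif r < 0.9:
                corrupted[i] = rng.choice(VOCAB)  # 10%: random token
            # else 10%: keep token unchanged but still predict it
    return corrupted, labels

sentence = ["the", "protest", "march", "turned", "into", "the", "news"]
corrupted, labels = mlm_mask(sentence, seed=42)
```

In practice this corruption feeds a cross-entropy loss over the selected positions during continued pre-training on the domain corpus, before the usual supervised fine-tuning for event extraction.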