Publication:
PROTEST-ER: Retraining BERT for Protest Event Extraction

KU Authors

Co-Authors

Caselli, Tommaso
Basile, Angelo

Abstract

We analyze the effect of further pre-training BERT with different domain-specific data as an unsupervised domain adaptation strategy for event extraction. Portability of event extraction models is particularly challenging, with large performance drops occurring even on data from the same text genre (e.g., news). We present PROTEST-ER, a retrained BERT model for protest event extraction. PROTEST-ER outperforms a corresponding generic BERT on out-of-domain data by 8.1 points. Our best-performing models reach 51.91 and 46.39 F1 across the two domains.
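
As a concrete illustration of the strategy the abstract describes, the sketch below continues pre-training a generic BERT model on unlabeled domain text with the standard masked language modeling objective, using the Hugging Face transformers and datasets libraries. This is a minimal sketch of domain-adaptive pre-training, not the authors' exact setup: the corpus file protest_corpus.txt and all hyperparameters are illustrative placeholders.

# Hypothetical sketch: continued (domain-adaptive) pre-training of BERT
# with masked language modeling on unlabeled, domain-specific text.
from datasets import load_dataset
from transformers import (
    BertForMaskedLM,
    BertTokenizerFast,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")

# Unlabeled protest-news corpus, one document per line (placeholder file name).
dataset = load_dataset("text", data_files={"train": "protest_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

# Randomly mask 15% of input tokens, the standard BERT MLM objective.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="protest-er",          # adapted checkpoint lands here
    num_train_epochs=3,               # illustrative value, not the paper's
    per_device_train_batch_size=16,
    learning_rate=5e-5,
)

Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    data_collator=collator,
).train()

The adapted checkpoint written to protest-er/ would then be fine-tuned on labeled protest event extraction data in the usual supervised fashion.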

Publisher

Association for Computational Linguistics (ACL)

Subject

Computer science, Artificial intelligence, Linguistics

Source

CASE 2021: The 4th Workshop on Challenges and Applications of Automated Extraction of Socio-Political Events from Text (CASE)
