Publication: PROTEST-ER: retraining BERT for protest event extraction
dc.contributor.coauthor | Caselli, Tommaso | |
dc.contributor.coauthor | Basile, Angelo | |
dc.contributor.department | Department of Sociology | |
dc.contributor.department | Department of Computer Engineering | |
dc.contributor.kuauthor | Hürriyetoğlu, Ali | |
dc.contributor.kuauthor | Mutlu, Osman | |
dc.contributor.kuprofile | Teaching Faculty | |
dc.contributor.kuprofile | Researcher | |
dc.contributor.other | Department of Sociology | |
dc.contributor.other | Department of Computer Engineering | |
dc.contributor.schoolcollegeinstitute | College of Social Sciences and Humanities | |
dc.contributor.schoolcollegeinstitute | College of Engineering | |
dc.date.accessioned | 2024-11-09T13:23:33Z | |
dc.date.issued | 2021 | |
dc.description.abstract | We analyze the effect of further pre-training BERT with different domain-specific data as an unsupervised domain adaptation strategy for event extraction. Portability of event extraction models is particularly challenging, with large performance drops affecting data on the same text genres (e.g., news). We present PROTEST-ER, a retrained BERT model for protest event extraction. PROTEST-ER outperforms a corresponding generic BERT on out-of-domain data by 8.1 points. Our best-performing models reach 51.91-46.39 F1 across both domains. | |
dc.description.fulltext | YES | |
dc.description.indexedby | WoS | |
dc.description.indexedby | Scopus | |
dc.description.openaccess | YES | |
dc.description.publisherscope | International | |
dc.description.sponsoredbyTubitakEu | EU | |
dc.description.sponsorship | European Union (EU) | |
dc.description.sponsorship | Horizon 2020 | |
dc.description.sponsorship | European Research Council (ERC) | |
dc.description.version | Publisher version | |
dc.identifier.doi | 10.18653/v1/2021.case-1.4 | |
dc.identifier.embargo | NO | |
dc.identifier.filenameinventoryno | IR03287 | |
dc.identifier.isbn | 978-1-954085-79-4 | |
dc.identifier.link | https://doi.org/10.18653/v1/2021.case-1.4 | |
dc.identifier.quartile | N/A | |
dc.identifier.scopus | 2-s2.0-85119310309 | |
dc.identifier.uri | https://hdl.handle.net/20.500.14288/3375 | |
dc.identifier.wos | 694853100004 | |
dc.keywords | Adaptation strategies | |
dc.keywords | Different domains | |
dc.keywords | Domain adaptation | |
dc.keywords | Domain specific | |
dc.keywords | Events extractions | |
dc.keywords | Extraction modeling | |
dc.keywords | Performance | |
dc.keywords | Pre-training | |
dc.keywords | Text genre | |
dc.language | English | |
dc.publisher | Association for Computational Linguistics (ACL) | |
dc.relation.grantno | 714868 | |
dc.relation.uri | http://cdm21054.contentdm.oclc.org/cdm/ref/collection/IR/id/10071 | |
dc.source | Proceedings of the 4th Workshop on Challenges and Applications of Automated Extraction of Socio-political Events from Text (CASE 2021) | |
dc.subject | Computer science | |
dc.subject | Artificial intelligence | |
dc.subject | Computer science, interdisciplinary applications | |
dc.subject | Linguistics | |
dc.title | PROTEST-ER: retraining BERT for protest event extraction | |
dc.type | Conference proceeding | |
dspace.entity.type | Publication | |
local.contributor.kuauthor | Hürriyetoğlu, Ali | |
local.contributor.kuauthor | Mutlu, Osman | |
relation.isOrgUnitOfPublication | 10f5be47-fab1-42a1-af66-1642ba4aff8e | |
relation.isOrgUnitOfPublication | 89352e43-bf09-4ef4-82f6-6f9d0174ebae | |
relation.isOrgUnitOfPublication.latestForDiscovery | 10f5be47-fab1-42a1-af66-1642ba4aff8e |
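The abstract above describes further (continued) pre-training of BERT on domain-specific text as an unsupervised domain adaptation step before event extraction. The sketch below illustrates that general technique with the Hugging Face transformers library; it is not the authors' code, and the base checkpoint, the file protest_news.txt, and all hyperparameters are illustrative assumptions.

```python
# Minimal sketch (assumed setup, not the authors' implementation):
# continued masked-language-model pre-training of BERT on in-domain
# protest news text, prior to fine-tuning for event extraction.
from datasets import load_dataset
from transformers import (AutoModelForMaskedLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_name = "bert-base-uncased"  # assumed base checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# "protest_news.txt" is a hypothetical file of raw in-domain text,
# one document or sentence per line.
raw = load_dataset("text", data_files={"train": "protest_news.txt"})
tokenized = raw.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True,
    remove_columns=["text"],
)

# Standard MLM objective: 15% of tokens are masked at random.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="protest-er",            # illustrative output directory
    num_train_epochs=3,                 # illustrative hyperparameters
    per_device_train_batch_size=16,
)

Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    data_collator=collator,
).train()
```

The resulting domain-adapted checkpoint would then be fine-tuned on labeled protest event extraction data in the usual token-classification fashion; the specific corpora and fine-tuning setup are described in the publication itself.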