Publication:
KU_ai at MEDIQA 2019: domain-specific pre-training and transfer learning for medical NLI

dc.contributor.departmentN/A
dc.contributor.departmentN/A
dc.contributor.departmentDepartment of Computer Engineering
dc.contributor.kuauthorCengiz, Cemil
dc.contributor.kuauthorSert, Ulaş
dc.contributor.kuauthorYüret, Deniz
dc.contributor.kuprofileMaster Student
dc.contributor.kuprofileMaster Student
dc.contributor.kuprofileFaculty Member
dc.contributor.otherDepartment of Computer Engineering
dc.contributor.schoolcollegeinstituteGraduate School of Sciences and Engineering
dc.contributor.schoolcollegeinstituteGraduate School of Sciences and Engineering
dc.contributor.schoolcollegeinstituteCollege of Engineering
dc.contributor.yokidN/A
dc.contributor.yokidN/A
dc.contributor.yokid179996
dc.date.accessioned2024-11-09T23:26:19Z
dc.date.issued2019
dc.description.abstractIn this paper, we describe our system and results submitted for the Natural Language Inference (NLI) track of the MEDIQA 2019 Shared Task (Ben Abacha et al., 2019). As the KU_ai team, we used BERT (Devlin et al., 2018) as our baseline model and pre-processed the MedNLI dataset to mitigate the negative impact of de-identification artifacts. Moreover, we investigated different pre-training and transfer learning approaches to improve the performance. We show that pre-training the language model on rich biomedical corpora has a significant effect in teaching the model domain-specific language. In addition, training the model on large NLI datasets such as MultiNLI and SNLI helps in learning task-specific reasoning. Finally, we ensembled our highest-performing models, and achieved 84.7% accuracy on the unseen test dataset and ranked 10th out of 17 teams in the official results.
dc.description.indexedbyWoS
dc.description.openaccessNO
dc.description.publisherscopeInternational
dc.description.sponsorshipCemil Cengiz is supported by Huawei Turkey R&D Center through the Huawei Graduate Research Support Scholarship.
dc.identifier.doiN/A
dc.identifier.isbn978-1-950737-28-4
dc.identifier.quartileN/A
dc.identifier.urihttps://hdl.handle.net/20.500.14288/11535
dc.identifier.wos521946800045
dc.languageEnglish
dc.publisherAssociation for Computational Linguistics (ACL)
dc.sourceSIGBioMed Workshop on Biomedical Natural Language Processing (BioNLP 2019)
dc.subjectComputer science
dc.subjectArtificial intelligence
dc.subjectMedical informatics
dc.titleKU_ai at MEDIQA 2019: domain-specific pre-training and transfer learning for medical NLI
dc.typeConference proceeding
dspace.entity.typePublication
local.contributor.authorid0000-0003-2681-5059
local.contributor.authoridN/A
local.contributor.authorid0000-0002-7039-0046
local.contributor.kuauthorCengiz, Cemil
local.contributor.kuauthorSert, Ulaş
local.contributor.kuauthorYüret, Deniz
relation.isOrgUnitOfPublication89352e43-bf09-4ef4-82f6-6f9d0174ebae
relation.isOrgUnitOfPublication.latestForDiscovery89352e43-bf09-4ef4-82f6-6f9d0174ebae

Files