Publication:
Tree-stack LSTM in transition based dependency parsing

dc.contributor.department: Department of Computer Engineering
dc.contributor.department: N/A
dc.contributor.department: Department of Computer Engineering
dc.contributor.kuauthor: Yüret, Deniz
dc.contributor.kuprofile: Faculty Member
dc.contributor.schoolcollegeinstitute: College of Engineering
dc.contributor.schoolcollegeinstitute: Graduate School of Sciences and Engineering
dc.contributor.yokid: 179996
dc.contributor.yokid: N/A
dc.date.accessioned: 2024-11-09T13:50:59Z
dc.date.issued: 2018
dc.description.abstract: We introduce the tree-stack LSTM to model the state of a transition-based parser with recurrent neural networks. The tree-stack LSTM uses no parse-tree-based or hand-crafted features, yet performs better than models that do. We also develop a new set of embeddings from raw features to enhance performance. The model has four main components: the stack's σ-LSTM, the buffer's β-LSTM, the actions' LSTM, and a tree-RNN. All LSTMs take continuous dense feature vectors (embeddings) as input, and the tree-RNN updates these embeddings based on transitions. We show that our model improves performance on low-resource languages compared with its predecessors. We participated in the CoNLL 2018 UD Shared Task as the "KParse" team and ranked 16th in LAS and 15th in the MLAS and BLEX metrics among 27 participants parsing 82 test sets from 57 languages.
dc.description.fulltext: YES
dc.description.indexedby: Scopus
dc.description.openaccess: YES
dc.description.publisherscope: International
dc.description.sponsoredbyTubitakEu: TÜBİTAK
dc.description.sponsorship: Scientific and Technological Research Council of Turkey (TÜBİTAK)
dc.description.version: Publisher version
dc.format: pdf
dc.identifier.doi: 10.18653/v1/K18-2012
dc.identifier.embargo: NO
dc.identifier.filenameinventoryno: IR01960
dc.identifier.isbn: 9781948087827
dc.identifier.link: https://doi.org/10.18653/v1/K18-2012
dc.identifier.quartile: N/A
dc.identifier.scopus: 2-s2.0-85072878637
dc.identifier.uri: https://hdl.handle.net/20.500.14288/3931
dc.keywords: Embeddings
dc.keywords: Computational linguistics
dc.keywords: Syntactics
dc.keywords: Dependency parser
dc.language: English
dc.publisher: Association for Computational Linguistics (ACL)
dc.relation.grantno: 114E628 and 215E201
dc.relation.uri: http://cdm21054.contentdm.oclc.org/cdm/ref/collection/IR/id/8490
dc.source: Proceedings of the CoNLL 2018 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies
dc.subject: Computer engineering
dc.subject: Long short-term memory
dc.title: Tree-stack LSTM in transition based dependency parsing
dc.type: Conference proceeding
dspace.entity.type: Publication
local.contributor.authorid: 0000-0002-7039-0046
local.contributor.authorid: N/A
local.contributor.kuauthor: Yüret, Deniz
local.contributor.kuauthor: Kırnap, Ömer; Dayanık, Erenay
relation.isOrgUnitOfPublication: 89352e43-bf09-4ef4-82f6-6f9d0174ebae
relation.isOrgUnitOfPublication.latestForDiscovery: 89352e43-bf09-4ef4-82f6-6f9d0174ebae
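The abstract describes a parser state built from four recurrent components (the stack's σ-LSTM, the buffer's β-LSTM, an action LSTM, and a tree-RNN) whose outputs drive transition decisions, with the tree-RNN refreshing word embeddings as arcs are built. As a rough illustration only — not the authors' implementation — the sketch below uses placeholder random vectors, an assumed state size `D`, an assumed three-action transition set, and a tanh composition for the tree-RNN step; every dimension and weight here is a stand-in.

```python
import math
import random

random.seed(0)

D = 8  # per-component state size (assumed; not from the paper)
ACTIONS = ["shift", "left-arc", "right-arc"]  # minimal transition set (assumed)

# Hypothetical final hidden states of the four components named in the
# abstract; here they are just random placeholder vectors.
h_stack = [random.gauss(0, 1) for _ in range(D)]   # stack sigma-LSTM
h_buffer = [random.gauss(0, 1) for _ in range(D)]  # buffer beta-LSTM
h_action = [random.gauss(0, 1) for _ in range(D)]  # actions' LSTM
h_tree = [random.gauss(0, 1) for _ in range(D)]    # tree-RNN summary

# Parser state: concatenation of the four component states.
state = h_stack + h_buffer + h_action + h_tree  # length 4 * D

# Score each transition with a linear layer, then normalize with softmax.
W = [[random.gauss(0, 0.1) for _ in range(4 * D)] for _ in ACTIONS]
logits = [sum(w * s for w, s in zip(row, state)) for row in W]
m = max(logits)
exps = [math.exp(l - m) for l in logits]
probs = [e / sum(exps) for e in exps]
best = ACTIONS[probs.index(max(probs))]

# When an arc is created, a tree-RNN-style composition can update the
# head word's embedding from the head and dependent embeddings (sketch).
U = [[random.gauss(0, 0.1) for _ in range(2 * D)] for _ in range(D)]
head, dep = h_stack, h_buffer  # stand-ins for two word embeddings
combined = head + dep
new_head = [math.tanh(sum(u * c for u, c in zip(row, combined))) for row in U]
```

The composition step is what lets later transition decisions see subtree structure without hand-crafted tree features: the updated head embedding re-enters the stack LSTM in place of the original word vector.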

Files

Original bundle

Name: 8490.pdf
Size: 421.14 KB
Format: Adobe Portable Document Format