Publication:
Tree-stack LSTM in transition based dependency parsing

Embargo Status

NO

Abstract

We introduce the tree-stack LSTM to model the state of a transition-based parser with recurrent neural networks. The tree-stack LSTM does not use any parse-tree-based or hand-crafted features, yet it performs better than models that do. We also develop a new set of embeddings from raw features to enhance performance. The model has four main components: the stack's σ-LSTM, the buffer's β-LSTM, the actions' LSTM, and the tree-RNN. All LSTMs take continuous dense feature vectors (embeddings) as input, and the tree-RNN updates these embeddings based on transitions. We show that our model improves performance on low-resource languages compared with its predecessors. We participated in the CoNLL 2018 UD Shared Task as the "KParse" team and ranked 16th in LAS and 15th in the MLAS and BLEX metrics among 27 participants parsing 82 test sets from 57 languages.
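
The abstract describes the parser state as a combination of four recurrent components. Below is a minimal sketch of how such a state encoder could be wired together, assuming a PyTorch-style implementation; the class, method, and parameter names (TreeStackState, update_head, emb_dim, and so on) are hypothetical illustrations, not the authors' code.

    import torch
    import torch.nn as nn

    class TreeStackState(nn.Module):
        """Parser-state encoder with four recurrent components: the stack's
        sigma-LSTM, the buffer's beta-LSTM, the actions' LSTM, and a tree-RNN
        that rewrites embeddings as transitions build the tree."""

        def __init__(self, emb_dim=64, hidden_dim=128, n_actions=3):
            super().__init__()
            self.sigma_lstm = nn.LSTM(emb_dim, hidden_dim)   # encodes the stack
            self.beta_lstm = nn.LSTM(emb_dim, hidden_dim)    # encodes the buffer
            self.action_lstm = nn.LSTM(emb_dim, hidden_dim)  # encodes past actions
            self.action_emb = nn.Embedding(n_actions, emb_dim)
            # Tree-RNN composition: merge a head and a dependent embedding
            # into an updated head embedding after a reduce transition.
            self.compose = nn.Sequential(nn.Linear(2 * emb_dim, emb_dim), nn.Tanh())
            self.scorer = nn.Linear(3 * hidden_dim, n_actions)

        def update_head(self, head, dependent):
            # Tree-RNN step: the head's embedding absorbs its new dependent.
            return self.compose(torch.cat([head, dependent], dim=-1))

        def forward(self, stack_embs, buffer_embs, action_ids):
            # Inputs are (seq_len, batch, emb_dim) for the two embedding
            # sequences and (seq_len, batch) for action ids; the final hidden
            # state of each LSTM summarizes that part of the parser state.
            _, (s, _) = self.sigma_lstm(stack_embs)
            _, (b, _) = self.beta_lstm(buffer_embs)
            _, (a, _) = self.action_lstm(self.action_emb(action_ids))
            state = torch.cat([s[-1], b[-1], a[-1]], dim=-1)
            return self.scorer(state)  # scores for the next transition

    # Example call with dummy data: 5 stack items, 7 buffer items, 2 past actions.
    model = TreeStackState()
    scores = model(torch.randn(5, 1, 64), torch.randn(7, 1, 64),
                   torch.randint(0, 3, (2, 1)))

Under this reading, the update_head step is what distinguishes the tree-stack LSTM from a plain stack LSTM: after a reduce transition the head word's embedding is rewritten by the tree-RNN, so subsequent σ-LSTM states see tree-informed inputs rather than static word vectors.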

Publisher

Association for Computational Linguistics (ACL)

Subject

Computer engineering, Long short-term memory

Source

Proceedings of the CoNLL 2018 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies

DOI

10.18653/v1/K18-2012
