Publication:
Learning syntactic categories using paradigmatic representations of word context


Language:

English

Abstract:

We investigate paradigmatic representations of word context in the domain of unsupervised syntactic category acquisition. Paradigmatic representations of word context are based on potential substitutes of a word, in contrast to syntagmatic representations based on properties of neighboring words. We compare a bigram-based baseline model with several paradigmatic models and demonstrate significant gains in accuracy. Our best model, based on Euclidean co-occurrence embedding, combines the paradigmatic context representation with morphological and orthographic features and achieves 80% many-to-one accuracy on a 45-tag 1M-word corpus.
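The many-to-one accuracy reported in the abstract is a standard clustering metric for unsupervised tag induction: each induced cluster is mapped to the gold tag it most frequently co-occurs with, and accuracy is the fraction of tokens whose mapped tag matches the gold tag. The sketch below is illustrative only (the function name and toy data are not from the paper):

```python
from collections import Counter

def many_to_one_accuracy(induced, gold):
    """Score induced clusters against gold tags: map each cluster to the
    gold tag it most often co-occurs with, then count matching tokens."""
    # Count gold tags inside each induced cluster.
    counts = {}
    for c, g in zip(induced, gold):
        counts.setdefault(c, Counter())[g] += 1
    # Many-to-one mapping: several clusters may share one gold tag.
    mapping = {c: ctr.most_common(1)[0][0] for c, ctr in counts.items()}
    correct = sum(mapping[c] == g for c, g in zip(induced, gold))
    return correct / len(gold)

# Toy run: cluster 0 maps to "N", cluster 1 to "V"; one token disagrees.
induced = [0, 0, 0, 1, 1, 1]
gold = ["N", "N", "V", "V", "V", "V"]
print(many_to_one_accuracy(induced, gold))  # 5/6 ≈ 0.833
```

Because the mapping is many-to-one rather than one-to-one, a model can inflate the score by inducing many small clusters, which is why the paper's 45-tag setting fixes the number of induced categories to the gold tagset size.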

Source:

EMNLP-CoNLL 2012 - 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning, Proceedings of the Conference

Publisher:

Association for Computational Linguistics

Subject:

Computer engineering
