Publication:
Learning syntactic categories using paradigmatic representations of word context

dc.contributor.department: Department of Computer Engineering
dc.contributor.department: Department of Computer Engineering
dc.contributor.kuauthor: Yüret, Deniz
dc.contributor.kuauthor: Yatbaz, Mehmet Ali
dc.contributor.kuauthor: Sert, Enis Rıfat
dc.contributor.kuprofile: Faculty Member
dc.contributor.kuprofile: PhD Student
dc.contributor.kuprofile: Master Student
dc.contributor.schoolcollegeinstitute: College of Engineering
dc.contributor.schoolcollegeinstitute: Graduate School of Sciences and Engineering
dc.contributor.schoolcollegeinstitute: Graduate School of Sciences and Engineering
dc.contributor.yokid: 179996
dc.contributor.yokid: N/A
dc.contributor.yokid: N/A
dc.date.accessioned: 2024-11-09T23:50:08Z
dc.date.issued: 2012
dc.description.abstract: We investigate paradigmatic representations of word context in the domain of unsupervised syntactic category acquisition. Paradigmatic representations of word context are based on potential substitutes of a word, in contrast to syntagmatic representations based on properties of neighboring words. We compare a bigram based baseline model with several paradigmatic models and demonstrate significant gains in accuracy. Our best model based on Euclidean co-occurrence embedding combines the paradigmatic context representation with morphological and orthographic features and achieves 80% many-to-one accuracy on a 45-tag 1M word corpus.
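
Note on the evaluation mentioned in the abstract: many-to-one accuracy is the standard metric for unsupervised tag induction in which each induced cluster is mapped to the gold tag it co-occurs with most often, and tokens are then scored under that mapping. The Python sketch below illustrates the metric only; it is not code from the paper, and the function name and toy data are hypothetical.

    from collections import Counter, defaultdict

    def many_to_one_accuracy(induced, gold):
        """Map each induced cluster to its most frequent gold tag,
        then score token-level accuracy under that mapping.
        `induced` and `gold` are equal-length sequences of labels."""
        assert len(induced) == len(gold)
        # Count how often each induced cluster co-occurs with each gold tag.
        counts = defaultdict(Counter)
        for c, t in zip(induced, gold):
            counts[c][t] += 1
        # Many-to-one mapping: every cluster picks its majority gold tag
        # (several clusters may map to the same tag).
        mapping = {c: tags.most_common(1)[0][0] for c, tags in counts.items()}
        correct = sum(1 for c, t in zip(induced, gold) if mapping[c] == t)
        return correct / len(gold)

    # Toy usage with three induced clusters over gold part-of-speech tags.
    induced = [0, 0, 1, 1, 2, 2, 2]
    gold = ["DT", "DT", "NN", "VB", "NN", "NN", "NN"]
    print(many_to_one_accuracy(induced, gold))  # 6/7 ≈ 0.857
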
dc.description.indexedby: Scopus
dc.description.openaccess: YES
dc.description.publisherscope: International
dc.description.sponsorship: Baidu
dc.description.sponsorship: Google
dc.description.sponsorship: Microsoft Research
dc.identifier.doi: N/A
dc.identifier.isbn: 978-1-937284-43-5
dc.identifier.link: https://www.scopus.com/inward/record.uri?eid=2-s2.0-84883382321&partnerID=40&md5=dbc08bb03806209d5a4a0e3cbd219a0d
dc.identifier.quartile: N/A
dc.identifier.scopus: 2-s2.0-84883382321
dc.identifier.uri: N/A
dc.identifier.uri: https://hdl.handle.net/20.500.14288/14482
dc.keywords: Baseline models
dc.keywords: Best model
dc.keywords: Co-occurrence
dc.keywords: Context representation
dc.keywords: Many-to-one
dc.keywords: On potentials
dc.keywords: Paradigmatic models
dc.keywords: Word contexts
dc.keywords: Syntactics
dc.keywords: Natural language processing systems
dc.language: English
dc.publisher: Association for Computational Linguistics
dc.source: EMNLP-CoNLL 2012 - 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning, Proceedings of the Conference
dc.subject: Computer engineering
dc.title: Learning syntactic categories using paradigmatic representations of word context
dc.type: Conference proceeding
dspace.entity.type: Publication
local.contributor.authorid: 0000-0002-7039-0046
local.contributor.authorid: N/A
local.contributor.authorid: N/A
local.contributor.kuauthor: Yüret, Deniz
local.contributor.kuauthor: Yatbaz, Mehmet Ali
local.contributor.kuauthor: Sert, Enis Rıfat
relation.isOrgUnitOfPublication: 89352e43-bf09-4ef4-82f6-6f9d0174ebae
relation.isOrgUnitOfPublication.latestForDiscovery: 89352e43-bf09-4ef4-82f6-6f9d0174ebae
