Researcher:
Han, Aydın


Job Title

Master Student

First Name

Aydın

Last Name

Han

Name Variants

Han, Aydın

Search Results (2 of 2)
  • Publication
    SemEval-2010 Task 12: Parser evaluation using textual entailments
    (Association for Computational Linguistics (ACL), 2010) Yüret, Deniz (Faculty Member); Turgut, Zehra (Master Student); Han, Aydın (Master Student); Department of Computer Engineering; College of Engineering; Graduate School of Sciences and Engineering
    Parser Evaluation using Textual Entailments (PETE) is a shared task in the SemEval-2010 Evaluation Exercises on Semantic Evaluation. The task involves recognizing textual entailments based on syntactic information alone. PETE introduces a new parser evaluation scheme that is formalism independent, less prone to annotation error, and focused on semantically relevant distinctions.
  • Publication (Open Access)
    Parser evaluation using textual entailments
    (Springer, 2013) Rimell, Laura; Yüret, Deniz (Faculty Member); Han, Aydın (Master Student); Department of Computer Engineering; College of Engineering; Graduate School of Sciences and Engineering
    Parser Evaluation using Textual Entailments (PETE) is a shared task in the SemEval-2010 Evaluation Exercises on Semantic Evaluation. The task involves recognizing textual entailments based on syntactic information alone. PETE introduces a new parser evaluation scheme that is formalism independent, less prone to annotation error, and focused on semantically relevant distinctions. This paper describes the PETE task, gives an error analysis of the top-performing Cambridge system, and introduces a standard entailment module that can be used with any parser that outputs Stanford typed dependencies.