Publication:
Sentiment and context-refined word embeddings for sentiment analysis

Co-Authors

Deniz, Ayca
Angin, Pelin

Abstract

Word embeddings have become the de facto tool for representing text in natural language processing (NLP) tasks, as they can capture semantic and syntactic relations, unlike predecessors such as Bag-of-Words. Although word embeddings have been employed in various studies in recent years and proven effective in many NLP tasks, they remain immature for sentiment analysis because they encode insufficient sentiment information. General word embedding models pre-trained on large corpora with methods such as Word2Vec or GloVe achieve limited success in domain-specific NLP tasks. On the other hand, training domain-specific word embeddings from scratch requires a large amount of data and computation power. In this work, we target both shortcomings of pre-trained word embeddings to boost the performance of domain-specific sentiment analysis tasks. We propose a model that refines pre-trained word embeddings with context information and leverages the sentiment scores of sentences obtained from a lexicon-based method to further improve performance. Experimental results on two benchmark datasets show that the proposed method significantly increases the accuracy of sentiment classification.
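The refinement idea the abstract describes — keeping pre-trained vectors close to their originals while injecting lexicon-derived sentiment information — can be illustrated with a toy sketch. This is not the paper's actual procedure: the embeddings, the lexicon, and the update rule below (pulling each word toward the mean of same-polarity words, anchored to its original vector by a weight `alpha`) are all illustrative assumptions.

```python
import numpy as np

# Toy pre-trained embeddings (hypothetical; real models would use
# Word2Vec or GloVe vectors with hundreds of dimensions).
embeddings = {
    "good":  np.array([0.9, 0.1]),
    "great": np.array([0.8, 0.2]),
    "bad":   np.array([0.1, 0.9]),
    "awful": np.array([0.2, 0.8]),
}

# Hypothetical sentiment lexicon: +1 positive, -1 negative.
lexicon = {"good": 1, "great": 1, "bad": -1, "awful": -1}

def refine(embeddings, lexicon, alpha=0.5, iterations=10):
    """Pull each vector toward the mean of same-polarity words.

    One common refinement scheme (not necessarily the paper's):
    alpha trades off closeness to the original pre-trained vector
    against closeness to the word's sentiment group.
    """
    refined = {w: v.copy() for w, v in embeddings.items()}
    for _ in range(iterations):
        for word in refined:
            peers = [refined[w] for w in refined
                     if w != word and lexicon[w] == lexicon[word]]
            if peers:
                target = np.mean(peers, axis=0)
                refined[word] = alpha * embeddings[word] + (1 - alpha) * target
    return refined

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

refined = refine(embeddings, lexicon)

# After refinement, same-polarity words are closer in cosine
# similarity than opposite-polarity words.
print(cos(refined["good"], refined["great"]) > cos(refined["good"], refined["bad"]))
```

The anchoring term (`alpha * embeddings[word]`) is what lets the refined vectors retain the semantic and syntactic relations of the pre-trained space while the group-mean term adds the sentiment signal.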

Publisher

IEEE

Subject

Computer science, Cybernetics; Computer science, Information systems

Source

2021 IEEE International Conference on Systems, Man, and Cybernetics (SMC)

DOI

10.1109/SMC52423.2021.9659189
