Publication:
UNSEE: Unsupervised Non-contrastive Sentence Embeddings


Abstract

We present UNSEE (Unsupervised Non-Contrastive Sentence Embeddings), a novel approach that outperforms SimCSE on the Massive Text Embedding Benchmark (MTEB). Our exploration begins by addressing representation collapse, a phenomenon observed when the contrastive objective in SimCSE is replaced with non-contrastive objectives. To counter this issue, we propose a straightforward solution, a target network, which effectively mitigates representation collapse. The target network allows us to leverage non-contrastive objectives while maintaining training stability, achieving performance improvements comparable to those of contrastive objectives. Through careful fine-tuning and optimization, our method attains peak performance among non-contrastive sentence embedding approaches, yielding superior sentence representation models and showcasing the effectiveness of our approach. © 2024 Association for Computational Linguistics.
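The target network mentioned in the abstract can be illustrated with a minimal sketch, assuming UNSEE follows the common BYOL-style recipe in which a slowly updated copy of the online encoder provides stable regression targets for a non-contrastive (negative-free) loss. All function names and the choice of momentum value here are illustrative, not taken from the paper's released code.

```python
import numpy as np

def ema_update(target_params, online_params, tau=0.99):
    """Exponential moving average update of the target network:
    target <- tau * target + (1 - tau) * online.
    The slow drift keeps targets stable and helps avoid collapse."""
    return [tau * t + (1.0 - tau) * o
            for t, o in zip(target_params, online_params)]

def non_contrastive_loss(online_out, target_out):
    """Negative cosine similarity between L2-normalized embeddings of the
    online and target encoders. No negative pairs are used, which is what
    makes the objective non-contrastive."""
    z = online_out / np.linalg.norm(online_out, axis=-1, keepdims=True)
    p = target_out / np.linalg.norm(target_out, axis=-1, keepdims=True)
    return -np.mean(np.sum(z * p, axis=-1))
```

In a training loop, only the online encoder receives gradients; after each optimizer step, `ema_update` refreshes the target encoder's parameters.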


Publisher

Association for Computational Linguistics (ACL)

Subject

Computational linguistics


Source

EACL 2024 - 18th Conference of the European Chapter of the Association for Computational Linguistics, Proceedings of the Conference

