Publication: UNSEE: Unsupervised Non-contrastive Sentence Embeddings
KU-Authors
Language
en
Abstract
We present UNSEE: Unsupervised Non-Contrastive Sentence Embeddings, a novel approach that outperforms SimCSE on the Massive Text Embedding Benchmark (MTEB). Our exploration begins by addressing the challenge of representation collapse, a phenomenon observed when the contrastive objectives in SimCSE are replaced with non-contrastive objectives. To counter this issue, we propose a straightforward solution, a target network, which effectively mitigates representation collapse. The target network allows us to leverage non-contrastive objectives while maintaining training stability and achieving performance improvements comparable to those of contrastive objectives. Through careful fine-tuning and optimization, our method achieves state-of-the-art performance among non-contrastive sentence embedding approaches, yielding superior sentence representation models and showcasing the effectiveness of our approach. © 2024 Association for Computational Linguistics.
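The abstract does not spell out how the target network is maintained, but a common realization of this idea in non-contrastive learning (e.g. BYOL-style setups) is an exponential-moving-average (EMA) copy of the online encoder that receives no gradients. The sketch below is a minimal, hypothetical illustration of such an EMA update; the function name, the decay value `tau=0.99`, and the flat parameter lists are illustrative assumptions, not details from the paper.

```python
# Hedged sketch of one plausible "target network" mechanism: the target
# parameters slowly track the online encoder's parameters via an EMA.
# Because the target moves slowly and gets no gradients, it provides a
# stable prediction target, which is what discourages the collapsed
# solution where every sentence maps to the same embedding.

def ema_update(online_params, target_params, tau=0.99):
    """Return updated target parameters: tau * target + (1 - tau) * online.

    online_params, target_params: flat lists of floats standing in for
    the two networks' weights (illustrative; real encoders hold tensors).
    tau: EMA decay; values near 1.0 make the target change slowly.
    """
    return [tau * t + (1.0 - tau) * o
            for o, t in zip(online_params, target_params)]


# Illustrative usage: after one update with tau=0.9, each target weight
# moves 10% of the way toward the corresponding online weight.
online = [1.0, -2.0]
target = [0.0, 0.0]
target = ema_update(online, target, tau=0.9)
```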
Source:
EACL 2024 - 18th Conference of the European Chapter of the Association for Computational Linguistics, Proceedings of the Conference
Publisher:
Association for Computational Linguistics (ACL)
Subject
Computational linguistics