Publication:
Transfer learning for low-resource neural machine translation


Co-Authors

Zoph, Barret
May, Jonathan
Knight, Kevin

Publication Date

2016

Language

English

Type

Conference proceeding

Abstract

The encoder-decoder framework for neural machine translation (NMT) has proven effective in large-data scenarios, but is much less effective for low-resource languages. We present a transfer learning method that significantly improves BLEU scores across a range of low-resource languages. Our key idea is to first train a high-resource language pair (the parent model), then transfer some of the learned parameters to the low-resource pair (the child model) to initialize and constrain training. Using this transfer learning method, we improve baseline NMT models by an average of 5.6 BLEU on four low-resource language pairs. Ensembling and unknown-word replacement add another 2 BLEU, which brings NMT performance on low-resource machine translation close to that of a strong syntax-based machine translation (SBMT) system, exceeding its performance on one language pair. Additionally, using the transfer learning model for re-scoring, we improve the SBMT system by an average of 1.3 BLEU, improving the state of the art in low-resource machine translation.
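The parent-to-child transfer described in the abstract can be sketched in a few lines. This is a minimal toy illustration under stated assumptions, not the authors' implementation: the parameter names (`src_embed`, `decoder`), the tiny model shape, and the choice of which parameters to copy are all invented here to show the general pattern of initializing a child model from a trained parent while re-initializing the source-language-specific parameters.

```python
import random

random.seed(0)

def init_params(src_vocab_size, hidden=4):
    """Randomly initialize a toy set of NMT parameters (hypothetical names)."""
    return {
        # source-side embeddings: one row per source vocabulary item
        "src_embed": [[random.gauss(0, 0.1) for _ in range(hidden)]
                      for _ in range(src_vocab_size)],
        # stand-in for the target-side / decoder parameters
        "decoder": [[random.gauss(0, 0.1) for _ in range(hidden)]
                    for _ in range(hidden)],
    }

def transfer(parent, child_src_vocab_size, hidden=4):
    """Initialize a child model from a trained parent.

    Shared parameters (here, the decoder) are copied from the parent to
    initialize child training; source embeddings are freshly initialized
    because the child's source language has a different vocabulary.
    """
    child = init_params(child_src_vocab_size, hidden)
    child["decoder"] = [row[:] for row in parent["decoder"]]  # transferred
    return child

parent = init_params(src_vocab_size=10)          # "trained" high-resource parent
child = transfer(parent, child_src_vocab_size=6)  # low-resource child init
assert child["decoder"] == parent["decoder"]      # decoder carried over
assert len(child["src_embed"]) == 6               # fresh source side
```

In a real NMT toolkit the same idea amounts to loading a subset of the parent checkpoint's weights into the child model before training, and optionally freezing some of the transferred parameters to constrain child training.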

Source:

Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing

Publisher:

Association for Computational Linguistics (ACL)

Subject

Computer engineering, Learning systems
