Joint training on both translation directions for leveraging monolingual data for Low-Resource NMT
Since both the source and target languages may have monolingual data carrying valuable information, one can leverage both at once by jointly training the two translation directions. Dual learning improves the two models simultaneously by translating monolingual sentences forward with one model and back with the other, then aligning the reconstruction with the original sentence; it can be further improved by introducing multiple agents for each translation direction. Iterative back-translation trains the NMT models on both directions at once and iteratively regenerates the back-translated corpus with the improved models. Bi-directional NMT trains both directions in a single model, with a tag at the beginning of each source sentence indicating the direction, and then leverages both source-side and target-side monolingual data via back-translation and forward translation. Mirror-generative NMT jointly trains the translation models in both directions together with language models for the source and target languages, tied through a shared latent variable.
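The iterative back-translation loop described above can be sketched schematically. In this toy sketch, `train` is a hypothetical stand-in for NMT training (a real system would train neural models on the real-plus-synthetic corpus each round); the point is the alternation: each direction's model is retrained on synthetic pairs produced by the latest model of the opposite direction.

```python
# Schematic sketch of iterative back-translation with toy "models".
# train() is a placeholder for real NMT training, not an actual NMT system.

def train(parallel_pairs):
    """Stand-in for NMT training: memorizes pairs, identity for unseen input."""
    table = dict(parallel_pairs)
    return lambda sentence: table.get(sentence, sentence)

def iterative_back_translation(parallel, mono_src, mono_tgt, rounds=2):
    """Jointly improve src->tgt and tgt->src models, regenerating the
    synthetic parallel corpus with the latest models each round."""
    fwd = train(parallel)                        # src -> tgt
    bwd = train([(t, s) for s, t in parallel])   # tgt -> src
    for _ in range(rounds):
        # Back-translate target-side monolingual data into synthetic sources,
        # then retrain the forward model on real + synthetic pairs.
        synth_fwd = [(bwd(t), t) for t in mono_tgt]
        fwd = train(parallel + synth_fwd)
        # Symmetrically, forward-translate source-side monolingual data
        # to refresh the backward model's synthetic corpus.
        synth_bwd = [(fwd(s), s) for s in mono_src]
        bwd = train([(t, s) for s, t in parallel] + synth_bwd)
    return fwd, bwd
```

Because the synthetic corpus is rebuilt from the current (better) models at every round rather than fixed once, translation quality in each direction can keep improving the other, which is the key difference from one-shot back-translation.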
Tags
Deep Learning (in Machine learning)
Data Science
Related
Back translation and forward translation for leveraging monolingual data for Low-Resource NMT
Joint training on both translation directions for leveraging monolingual data for Low-Resource NMT
Unsupervised NMT for leveraging monolingual data for Low-Resource NMT
Language model pre-training for leveraging monolingual data for Low-Resource NMT
Exploiting comparable corpus for leveraging monolingual data for Low-Resource NMT
Enhancing with bilingual dictionary for leveraging monolingual data for Low-Resource NMT