Concept

Joint training on both translation directions to leverage monolingual data for low-resource NMT

Since both the source side and the target side may have monolingual data carrying valuable information, one can leverage both at once by jointly training the two translation directions.

This type of dual learning improves the two models simultaneously by aligning each original monolingual sentence x with its reconstruction x′ obtained by translating forward and then backward (x → y′ → x′) through the two models. Dual learning can be further improved by introducing multiple agents for each translation direction (multi-agent dual learning).

Iterative back-translation trains the NMT models on both translation directions at the same time and iteratively regenerates the back-translated corpus with the improved models (see the sketch below).

Bi-directional NMT trains both translation directions in a single model, with a tag at the beginning of each source sentence indicating the direction, and then leverages both source-side and target-side monolingual data through back and forward translation (see the data-preparation sketch below).

Mirror-generative NMT jointly trains the translation models in both directions together with language models for the source and target languages, tied through a shared latent variable.
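Below is a minimal, runnable sketch of the iterative back-translation loop. The helpers train_model and translate_corpus are hypothetical stand-ins (not from the source) for a real NMT toolkit such as fairseq; only the loop structure is the point.

```python
def train_model(parallel_pairs):
    # Stand-in for real NMT training; just records the data so the loop runs.
    return {"data": list(parallel_pairs)}

def translate_corpus(model, sentences):
    # Stand-in for real decoding with the trained model.
    return [f"<hyp:{s}>" for s in sentences]

def iterative_back_translation(parallel, mono_src, mono_tgt, rounds=3):
    """Jointly improve the src->tgt and tgt->src models.

    parallel: list of (src, tgt) sentence pairs
    mono_src: source-side monolingual sentences
    mono_tgt: target-side monolingual sentences
    """
    fwd = train_model(parallel)                       # src -> tgt
    bwd = train_model([(t, s) for s, t in parallel])  # tgt -> src
    for _ in range(rounds):
        # Back-translate target monolingual data with the current tgt->src
        # model, yielding synthetic (src', tgt) pairs for the forward model.
        synth_src = translate_corpus(bwd, mono_tgt)
        fwd = train_model(parallel + list(zip(synth_src, mono_tgt)))
        # Symmetrically, forward-translate source monolingual data to get
        # synthetic (tgt', src) pairs for the backward model.
        synth_tgt = translate_corpus(fwd, mono_src)
        bwd = train_model([(t, s) for s, t in parallel]
                          + list(zip(synth_tgt, mono_src)))
        # Each round regenerates the synthetic corpora with better models.
    return fwd, bwd
```

In a real system the two stand-in helpers would be replaced by actual training and decoding calls; the essential design choice is that each round re-translates the monolingual data with the newest models rather than reusing stale synthetic pairs.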
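For bi-directional NMT, the key data-preparation step is prepending a direction tag to each source sentence so one shared model can serve both directions. A small sketch follows, assuming illustrative tag names <2tgt> and <2src> (the actual tag strings are implementation-specific and not given in the source):

```python
def make_bidirectional_corpus(pairs, tag_fwd="<2tgt>", tag_bwd="<2src>"):
    """Duplicate a parallel corpus so one model sees both directions,
    each source sentence prefixed with a tag naming its direction."""
    examples = []
    for s, t in pairs:
        examples.append((f"{tag_fwd} {s}", t))  # src -> tgt example
        examples.append((f"{tag_bwd} {t}", s))  # tgt -> src example
    return examples

# Usage: one shared model is trained on the combined examples; monolingual
# data on either side can then be turned into synthetic training pairs by
# translating it with the appropriate direction tag.
print(make_bidirectional_corpus([("ein Haus", "a house")]))
```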

Updated 2022-05-29

Tags

Deep Learning (in Machine Learning)

Data Science