Early Stopping in Multilingual Pre-training
To counteract the phenomenon of interference, where a multilingual model's performance on some languages (often lower-resource ones) begins to decline after an extended period of training, practical systems often implement early stopping. This technique halts pre-training before the degradation sets in, typically by monitoring validation performance and stopping once it stops improving.
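A minimal sketch of how such an early-stopping check might look in practice (class name, patience values, and the example loss sequence are illustrative assumptions, not from the text): training is halted once the monitored validation loss fails to improve for a set number of consecutive checks.

```python
class EarlyStopping:
    """Stop training when the monitored validation loss stops improving.

    patience:  number of consecutive non-improving checks tolerated
    min_delta: minimum decrease in loss that counts as an improvement
    """

    def __init__(self, patience=3, min_delta=0.0):
        self.patience = patience
        self.min_delta = min_delta
        self.best = float("inf")
        self.bad_checks = 0

    def should_stop(self, val_loss):
        if val_loss < self.best - self.min_delta:
            # Improvement: record new best and reset the counter.
            self.best = val_loss
            self.bad_checks = 0
        else:
            # No improvement at this check.
            self.bad_checks += 1
        return self.bad_checks >= self.patience


# For a multilingual model, the metric worth monitoring is often the one
# most at risk of interference (e.g. a low-resource language's validation
# loss), not just the overall training loss, which may keep decreasing.
stopper = EarlyStopping(patience=2)
swahili_val_losses = [2.1, 1.9, 1.8, 1.85, 1.9]  # hypothetical: degrades late
stopped_at = None
for step, loss in enumerate(swahili_val_losses):
    if stopper.should_stop(loss):
        stopped_at = step
        break
```

Here the loop halts at the fifth check, after two consecutive checks without improvement over the best loss of 1.8, mirroring the scenario in the related question where overall training loss keeps falling while one language's task performance degrades.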
Tags
Foundations of Large Language Models
Ch.1 Pre-training - Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Related
A research team is training a large multilingual language model on a dataset containing English, Spanish, and Swahili. They observe that after an extensive number of training steps, the model's performance on a Swahili translation task begins to degrade, even though its performance on English remains strong and the overall training loss continues to decrease. Which of the following concepts best explains this specific outcome?
Diagnosing Performance Degradation in a Multilingual Model
Addressing Performance Imbalance in a Multi-Language Model
Early Stopping in Multilingual Pre-training