Learn Before
A pre-training strategy for a multilingual model involves taking an aligned sentence pair (e.g., an English sentence and its German translation) and concatenating them to form a single input sequence for one training step. What is the primary advantage of this method compared to training the model on the English and German sentences in separate, independent training steps?
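To make the packing step concrete, here is a minimal sketch. It is an illustration under assumptions, not any particular library's API: the tokenizer is a stand-in whitespace splitter (real systems use a trained subword tokenizer), and the [CLS]/[SEP] special tokens and the function name pack_bilingual_pair are hypothetical, following a BERT/XLM-style convention.

```python
# Sketch: pack an aligned English-German pair into one training sequence.

def tokenize(sentence: str) -> list[str]:
    """Stand-in tokenizer: whitespace split (a real model uses subwords)."""
    return sentence.split()

def pack_bilingual_pair(src: str, tgt: str) -> list[str]:
    """Concatenate an aligned sentence pair into a single input sequence,
    so one training step sees both languages at once."""
    return ["[CLS]"] + tokenize(src) + ["[SEP]"] + tokenize(tgt) + ["[SEP]"]

en = "The weather is nice today"
de = "Das Wetter ist heute schön"
print(pack_bilingual_pair(en, de))
# ['[CLS]', 'The', 'weather', 'is', 'nice', 'today', '[SEP]',
#  'Das', 'Wetter', 'ist', 'heute', 'schön', '[SEP]']
```

Because both sentences share one sequence, attention can flow across the language boundary, which is the advantage the question asks about: the model can align words and phrases between the two languages within a single training step.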
Tags
Ch.1 Pre-training - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Analysis in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
Example of an Aligned Bilingual Sentence Pair
Constructing a Packed Bilingual Input
A researcher is pre-training a cross-lingual language model using a technique that combines sentences from two different languages into a single training input. Arrange the following steps to accurately describe this process.
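For the step-ordering exercise above, the pipeline can be sketched end to end. This sketch is in the spirit of the translation language modeling (TLM) objective (Lample & Conneau, 2019), but the details are assumptions: the 15% masking rate, the [CLS]/[SEP]/[MASK] token names, and the function name tlm_training_example are illustrative, not a reference implementation.

```python
import random

def tlm_training_example(src: str, tgt: str, mask_prob: float = 0.15):
    # 1. Start from an aligned sentence pair (src, tgt).
    # 2. Tokenize each sentence (whitespace split stands in for subwords).
    src_toks, tgt_toks = src.split(), tgt.split()
    # 3. Concatenate into one sequence with separator tokens.
    tokens = ["[CLS]"] + src_toks + ["[SEP]"] + tgt_toks + ["[SEP]"]
    # 4. Mask random positions across BOTH languages, so recovering a
    #    masked word can draw on context from the other language.
    inputs, labels = [], []
    for tok in tokens:
        if tok not in ("[CLS]", "[SEP]") and random.random() < mask_prob:
            inputs.append("[MASK]")
            labels.append(tok)        # the model must predict this token
        else:
            inputs.append(tok)
            labels.append(None)       # no loss at unmasked positions
    return inputs, labels

inp, lab = tlm_training_example("The weather is nice today",
                                "Das Wetter ist heute schön")
print(inp)
```

The numbered comments mirror the steps the exercise asks to arrange: select an aligned pair, tokenize, concatenate into one input, then apply the training objective over the combined sequence.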