Learn Before
A model is being prepared to understand relationships between aligned sentences in different languages. An input sequence is created by joining a Spanish sentence and its English translation. To train the model to predict missing words, some original words are replaced with a special [MASK] symbol. Given the original packed sequence below, which option correctly demonstrates this replacement process?
Original Sequence: [CLS] El gato se sentó en la alfombra . [SEP] The cat sat on the mat . [SEP]
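The replacement process described above can be sketched in a few lines of Python. This is a minimal illustration, not the exact procedure the question tests: it assumes a BERT-style random masking rate of 15% (the question does not specify a rate) and that special tokens ([CLS], [SEP]) are never masked. The function name `mask_sequence` is hypothetical.

```python
import random

def mask_sequence(tokens, mask_prob=0.15, seed=0):
    """Replace a fraction of ordinary tokens with [MASK].

    Special tokens ([CLS], [SEP]) are never masked. The 15% rate
    mirrors BERT-style masked language modeling and is an assumption
    here, not something stated in the question.
    """
    rng = random.Random(seed)
    special = {"[CLS]", "[SEP]"}
    masked, labels = [], []
    for tok in tokens:
        if tok not in special and rng.random() < mask_prob:
            masked.append("[MASK]")
            labels.append(tok)   # the model is trained to predict this token
        else:
            masked.append(tok)
            labels.append(None)  # not a prediction target
    return masked, labels

# The packed bilingual sequence from the question, whitespace-tokenized.
sequence = ("[CLS] El gato se sentó en la alfombra . [SEP] "
            "The cat sat on the mat . [SEP]").split()
masked, labels = mask_sequence(sequence)
print(" ".join(masked))
```

Because the masked positions are chosen at random, any option that only replaces ordinary words with [MASK], leaves [CLS] and both [SEP] markers intact, and keeps the sequence length unchanged is a valid instance of this process.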
Tags
Ch.1 Pre-training - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Application in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
Transformer Encoding of a Masked Bilingual Sentence Pair
Optimizing a Model's Training Strategy
Evaluating a Masking Strategy for Specialized Translation