Learn Before
A language model is given the input sentence: 'The quick brown [MASK] jumps over the lazy dog.' The model's objective is to predict the masked word by considering the full context of the unmasked words around it, both to the left and to the right. Which set of words provides the necessary context for the model to make this prediction?
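The bidirectional context the question describes can be illustrated without any ML machinery. A minimal sketch in plain Python (the function name `mask_context` is illustrative, not from any library): for a masked-language-model objective, the context available to the model is every unmasked word on both sides of the `[MASK]` token, not just the words to its left.

```python
def mask_context(sentence, mask_token="[MASK]"):
    """Split a sentence into the context words to the left and right of the mask."""
    tokens = sentence.split()
    i = tokens.index(mask_token)     # position of the masked token
    left = tokens[:i]                # left context: words before the mask
    right = tokens[i + 1:]           # right context: words after the mask
    return left, right

left, right = mask_context("The quick brown [MASK] jumps over the lazy dog.")
print(left)   # ['The', 'quick', 'brown']
print(right)  # ['jumps', 'over', 'the', 'lazy', 'dog.']
```

Both lists together form the full context a bidirectional model conditions on when predicting the masked word.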
Tags
Ch.1 Pre-training - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Application in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
Masked Language Model Prediction Task
Consider a language model being trained with the input sequence: 'The quick brown [MASK] jumps over the [MASK] dog.' During the training process, the model's objective is to correctly predict the words for the two [MASK] tokens, and also to confirm the identities of the unmasked words ('The', 'quick', 'brown', etc.).
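Which positions actually contribute to the training objective can be sketched directly. In standard BERT-style masked-language-model pretraining, the loss is computed only at the masked positions; the unmasked words serve as input context rather than prediction targets. A minimal illustration (the helper `loss_positions` is a hypothetical name, not a library function):

```python
def loss_positions(tokens, mask_token="[MASK]"):
    """Return the token positions where an MLM loss would be computed."""
    return [i for i, t in enumerate(tokens) if t == mask_token]

tokens = "The quick brown [MASK] jumps over the [MASK] dog.".split()
print(loss_positions(tokens))  # [3, 7]
```

Here only positions 3 and 7 (the two masks) would be scored; the surrounding words supply bidirectional context for both predictions.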