Examples of Masked Prediction Tasks
Masked prediction is a language modeling task in which a model must predict masked (hidden) words within a sentence from a given context, denoted [C]. The placeholders for these masked words can be written in various ways, such as [M], [X], or [Y]. For instance, with the context [C] being 'The kitten', the model's objective could be to fill in the blanks in phrases like '[M] chasing the [M] .' or '[X] the [Y]', or within a partial sentence like 'The kitten [M] is [M] .' to reconstruct a complete sentence.
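To make the task concrete, here is a minimal sketch (in plain Python, with hypothetical helper names) of how a masked-prediction training example might be constructed: a few token positions are replaced with a placeholder, and the original tokens at those positions become the prediction targets.

```python
import random

def mask_tokens(tokens, mask_rate=0.3, placeholder="[M]", seed=0):
    """Replace a random subset of tokens with a placeholder.

    Returns the masked token list and a dict mapping each masked
    position to the original token the model must predict.
    """
    rng = random.Random(seed)
    n_to_mask = max(1, int(len(tokens) * mask_rate))
    positions = rng.sample(range(len(tokens)), n_to_mask)
    masked = list(tokens)
    targets = {}
    for pos in positions:
        targets[pos] = masked[pos]   # the hidden word is the label
        masked[pos] = placeholder    # the model sees only [M] here
    return masked, targets

tokens = "The kitten is chasing the ball .".split()
masked, targets = mask_tokens(tokens)
print(" ".join(masked))   # input the model sees, with [M] placeholders
print(targets)            # positions and words the model must recover
```

During training, the model is shown the masked sequence and scored on how well it recovers the words in `targets`; using distinct placeholders such as [X] and [Y] instead of a single [M] simply gives each blank its own identity.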
Ch.1 Pre-training - Foundations of Large Language Models