Consequences of Independent Predictions in Language Models
Consider a language model trained with a self-supervised objective in which it learns to fill in randomly blanked-out words in a text. If the model encounters the sentence 'The artist picked up a [MASK] and a [MASK] to begin the painting,' it will try to predict the two missing words. However, it predicts each masked word independently, without conditioning on what the other masked word might be. Analyze the potential problem this creates for generating a coherent and logical sentence completion.
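The failure mode can be sketched numerically. The snippet below uses a made-up three-word vocabulary and hypothetical per-position probabilities (not a real trained model) to show that independent argmax decoding can select an incoherent pair, while jointly scoring the two positions avoids it:

```python
# Toy illustration with a hypothetical vocabulary and made-up probabilities:
# a masked LM scores each [MASK] position independently.
vocab = ["brush", "canvas", "palette"]

# Assumed per-position distributions for
# "The artist picked up a [MASK] and a [MASK] to begin the painting."
p_mask1 = {"brush": 0.6, "canvas": 0.3, "palette": 0.1}
p_mask2 = {"brush": 0.5, "canvas": 0.4, "palette": 0.1}

# Independent decoding: take the argmax at each position separately.
fill1 = max(p_mask1, key=p_mask1.get)
fill2 = max(p_mask2, key=p_mask2.get)
print(fill1, fill2)  # brush brush -> "a brush and a brush" (incoherent)

# A joint scorer could evaluate pairs together; here we crudely penalize
# repetition by zeroing the probability of identical fills.
joint = {(w1, w2): p_mask1[w1] * p_mask2[w2] * (0.0 if w1 == w2 else 1.0)
         for w1 in vocab for w2 in vocab}
best_pair = max(joint, key=joint.get)
print(best_pair)  # ('brush', 'canvas') -- a coherent completion
```

Because the model factorizes the joint distribution as P(w1, w2) = P(w1)·P(w2), the most likely word at each position can be the same word at both, even though the pair is jointly implausible.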
Tags
Ch.1 Pre-training - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Analysis in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
Diagnosing a Language Model's Predictive Behavior
A language model pre-trained with a standard masked language modeling objective is given the input sentence: 'The capital of the United Kingdom is [MASK] [MASK].' Which statement best describes how the model will predict the two masked tokens?
Consequences of Independent Predictions in Language Models
Permuted Language Modeling (PLM)