Learn Before
A large language model is trained on a massive, diverse corpus of text from the internet. The training process involves repeatedly predicting missing words in sentences, with no human-provided labels or fact-checking. After training, the model can correctly state that 'The Eiffel Tower is in Paris.' Which statement best analyzes how the model likely acquired this specific piece of factual knowledge?
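The self-supervised objective described above can be illustrated with a toy sketch: the training signal is just the text itself, since the "label" for each position is whatever word actually appears there. This is not the transformer objective real LLMs use; it is a hypothetical bigram-context counter over a made-up mini-corpus, shown only to make the idea of label-free statistical learning concrete.

```python
from collections import Counter, defaultdict

# Toy self-supervised "training": the only supervision is the text itself.
# Hypothetical mini-corpus; the fact "Eiffel Tower is in Paris" is simply
# a frequent pattern in the data, never an annotated label.
corpus = [
    "the eiffel tower is in paris",
    "the eiffel tower is in paris",
    "the louvre is in paris",
    "big ben is in london",
]

# Count which word follows each two-word context.
counts = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for i in range(2, len(words)):
        counts[(words[i - 2], words[i - 1])][words[i]] += 1

def predict(w1: str, w2: str) -> str:
    """Fill in the missing word after a two-word context."""
    return counts[(w1, w2)].most_common(1)[0][0]

print(predict("is", "in"))  # → paris
```

Because "paris" follows "is in" more often than any other word in the corpus, the model reproduces the fact purely from co-occurrence statistics, which is the analysis the question is probing.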
Tags
Ch.2 Generative Models - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Analysis in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
AI Training Strategy for Specialized Knowledge
Evaluating Knowledge from Time-Limited Data