Learn Before
Training Objective of an Early Transformer Model
A pioneering generative language model from 2018, among the first generative language models built on the transformer architecture, was pre-trained with a specific objective on a large corpus of unlabeled text. In your own words, describe the primary goal of this model's pre-training process.
Tags
Data Science
Ch.2 Generative Models - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Comprehension in Revised Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
A foundational generative language model introduced in 2018 significantly improved the ability to capture relationships between words far apart in a text, a major challenge for previous sequential models. Which of the following best analyzes the core architectural innovation responsible for this leap in performance?
Critique of an Early Transformer-Based Language Model