Learn Before
A researcher is developing a model to understand patterns in unlabeled time-series data from weather sensors. The data for each day is a sequence of 24 hourly temperature readings. The researcher's training strategy involves taking a sequence, randomly hiding the temperature reading for a single hour, and then training the model to estimate the hidden temperature value by looking at the readings from the other 23 hours. Which fundamental training strategy does this approach best exemplify?
Tags
Ch.1 Pre-training - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Application in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
Masked Language Modeling (MLM)
Dual Role of Data in a Self-Supervised Task
Analyzing Self-Supervised Training Procedures
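The masking procedure described in the question can be sketched in a few lines of Python. This is a minimal illustration, not the researcher's actual pipeline: the synthetic diurnal data, the one-hot position encoding, and the linear least-squares regressor are all assumptions chosen to keep the example self-contained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical unlabeled dataset: 500 days of 24 hourly temperature
# readings, following a rough diurnal sine cycle plus noise.
hours = np.arange(24)
days = 15 + 10 * np.sin(2 * np.pi * (hours - 8) / 24) \
       + rng.normal(0.0, 1.0, size=(500, 24))

def make_example(day):
    """Self-supervised example: hide one hour; the hidden reading
    becomes the label, the visible 23 readings become the input."""
    h = rng.integers(24)                     # randomly chosen hour to mask
    x = np.delete(day, h)                    # the other 23 readings (input)
    x = np.concatenate([x, np.eye(24)[h]])   # one-hot of the masked position
    y = day[h]                               # the hidden reading (target)
    return x, y

X, Y = zip(*(make_example(d) for d in days))
X, Y = np.array(X), np.array(Y)

# Minimal stand-in for "the model": a linear regressor fit by least
# squares to predict the masked temperature from the visible readings.
Xb = np.hstack([X, np.ones((len(X), 1))])    # add a bias column
w, *_ = np.linalg.lstsq(Xb, Y, rcond=None)

rmse = np.sqrt(np.mean((Xb @ w - Y) ** 2))
print(f"training RMSE: {rmse:.2f} °C")
```

Note that no human-provided labels appear anywhere: each target is a reading taken from the data itself, which is the defining feature of the self-supervised (masked-prediction) strategy the question asks about.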