When adapting a large, pre-trained model by introducing and training only a small set of new parameters, the original weights of the base model are kept frozen; only the newly added parameters are updated, which keeps the computational and memory cost of adaptation low.
Tags
Ch.3 Prompting - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Comprehension in Revised Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
A research team wants to adapt a very large, pre-trained language model (with billions of parameters) to perform a new, specialized task, such as classifying medical reports. The team's primary constraint is a very limited computational budget, which makes it infeasible to update all of the model's original parameters. Which of the following training strategies best resolves this constraint while still effectively adapting the model to the new task?
Evaluating a Model Adaptation Strategy
When adapting a large, pre-trained model by introducing and training only a small set of new parameters, the original weights of the base model are kept frozen; only the newly added parameters are updated, which keeps the computational and memory cost of adaptation low.
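The idea above can be sketched numerically. The following is a minimal NumPy toy, assuming a LoRA-style low-rank adapter on a single linear layer; the dimensions and learning rate are hypothetical, chosen only for illustration. The base weight matrix is never touched by the update step, while the small adapter matrices are.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions for illustration; r << d_in is the small
# set of new parameters.
d_in, d_out, r = 8, 4, 2

# Pre-trained base weight: kept frozen (never updated below).
W = rng.normal(size=(d_in, d_out))
W_before = W.copy()

# Newly introduced low-rank adapter parameters (LoRA-style).
# B starts at zero so the adapted model initially equals the base model.
A = rng.normal(scale=0.01, size=(d_in, r))
B = np.zeros((r, d_out))

def forward(x):
    # Frozen base path plus trainable low-rank path.
    return x @ W + x @ A @ B

# One toy gradient step on a squared-error loss, updating ONLY A and B.
x = rng.normal(size=(16, d_in))
y_target = rng.normal(size=(16, d_out))
lr = 0.1

err = forward(x) - y_target        # dL/dy up to a constant factor
grad_B = (x @ A).T @ err           # gradient w.r.t. B
grad_A = x.T @ (err @ B.T)         # gradient w.r.t. A
A -= lr * grad_A
B -= lr * grad_B

# The base weights are untouched: only the adapter was trained.
assert np.array_equal(W, W_before)
```

Because only `A` and `B` (a few dozen values here, versus `d_in * d_out` for the base layer) receive gradients, the optimizer state and update cost scale with the adapter size rather than with the full model, which is what makes this strategy viable under a tight computational budget.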