Learn Before
Parameter Optimization in Model Adaptation
When adapting a large, pre-trained model for a new classification task, a common efficiency-focused technique involves keeping the core feature-extracting part of the model fixed. Describe which specific set of parameters is optimized during this adaptation process and explain the main rationale behind this strategy.
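The strategy described above can be sketched in plain Python (no real ML framework; the parameter names and the `adapt` helper are hypothetical, for illustration only): the encoder's parameters are frozen, so a gradient step changes only the new classifier head's parameters.

```python
# Minimal sketch with hypothetical names -- not a real framework.
# The "model" is a dict of parameters; only head.* entries are trainable.

def adapt(model, grads, lr=0.1):
    """Apply one gradient step to trainable (head) parameters only."""
    for name, value in model.items():
        if name.startswith("head."):            # classifier head: update
            model[name] = value - lr * grads.get(name, 0.0)
        # encoder.* parameters are deliberately skipped (frozen)
    return model

model = {"encoder.w": 1.0, "head.w": 0.5}
grads = {"encoder.w": 2.0, "head.w": 2.0}       # gradients exist for all
adapt(model, grads)
# encoder.w is unchanged (1.0); head.w becomes 0.5 - 0.1*2.0 = 0.3
```

Because the frozen encoder parameters never receive updates, no optimizer state or gradient computation is needed for them, which is the source of the memory and compute savings.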
Tags
Ch.1 Pre-training - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Analysis in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
Evaluating Model Adaptation Strategies
A machine learning engineer is adapting a large pre-trained language model for a new text classification task. Due to limited computational resources, they decide to freeze the encoder's parameters and only train the new classifier head. What is the primary trade-off associated with this decision?