Freezing Encoder Parameters During Fine-Tuning

As an alternative to full fine-tuning, a classifier can be efficiently adapted to work in tandem with a pre-trained encoder by freezing the encoder's parameters, $\hat{\theta}$. This keeps them in their pre-trained state, allowing the optimization process to focus solely on updating the classifier's parameters, $\omega$, for the specific task.
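
A minimal sketch of this setup in PyTorch follows; the encoder architecture, pooling step, and dimensions are illustrative assumptions, not taken from the source. The key pattern is disabling gradients for $\hat{\theta}$ and handing only $\omega$ to the optimizer.

```python
import torch
import torch.nn as nn

# Stand-in for a pre-trained encoder with frozen parameters theta-hat;
# any nn.Module producing a [batch, seq_len, hidden] representation works.
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=256, nhead=4, batch_first=True),
    num_layers=2,
)

# Freeze the encoder: its parameters keep their pre-trained values
# and receive no gradient updates during fine-tuning.
for param in encoder.parameters():
    param.requires_grad = False

# Task-specific classifier head with trainable parameters omega
# (here a single linear layer over 3 illustrative classes).
classifier = nn.Linear(256, 3)

# Only omega is given to the optimizer.
optimizer = torch.optim.Adam(classifier.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One illustrative training step on dummy data.
x = torch.randn(8, 16, 256)      # [batch, seq_len, d_model]
y = torch.randint(0, 3, (8,))    # class labels

with torch.no_grad():            # encoder runs without building a graph
    h = encoder(x).mean(dim=1)   # mean-pool token states -> [batch, 256]

logits = classifier(h)
loss = loss_fn(logits, y)
loss.backward()                  # gradients flow only into the classifier
optimizer.step()
optimizer.zero_grad()
```

Because the frozen encoder needs no gradients, its forward pass can run under `torch.no_grad()`, which saves memory and makes each fine-tuning step considerably cheaper than updating $\hat{\theta}$ and $\omega$ jointly.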

