Concept
Teacher Model in Context Distillation
In context distillation, the teacher model is built through a standard fine-tuning process: a dataset of instructions, user inputs, and correct responses is collected, and a pre-trained language model is further trained on it until it becomes proficient at following detailed instructions.
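As a rough sketch of this fine-tuning step, the example below uses the Hugging Face transformers library. The base model (gpt2), the prompt template, and the toy dataset are illustrative assumptions, not details from the concept itself.

```python
# A minimal sketch of building a teacher model via standard fine-tuning,
# assuming Hugging Face transformers. Model choice, prompt template, and
# data are placeholders for illustration.
import torch
from torch.utils.data import Dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)


class InstructionDataset(Dataset):
    """Wraps (instruction, input, response) triples as causal-LM examples."""

    def __init__(self, examples, tokenizer, max_length=512):
        self.examples = examples
        self.tokenizer = tokenizer
        self.max_length = max_length

    def __len__(self):
        return len(self.examples)

    def __getitem__(self, idx):
        ex = self.examples[idx]
        # Concatenate instruction, user input, and the correct response into
        # one training sequence (a simple, commonly used template).
        text = (
            f"Instruction: {ex['instruction']}\n"
            f"Input: {ex['input']}\n"
            f"Response: {ex['response']}{self.tokenizer.eos_token}"
        )
        enc = self.tokenizer(
            text,
            truncation=True,
            max_length=self.max_length,
            padding="max_length",
            return_tensors="pt",
        )
        input_ids = enc["input_ids"].squeeze(0)
        attention_mask = enc["attention_mask"].squeeze(0)
        # Standard causal-LM objective: labels mirror the input ids,
        # with padding positions masked out of the loss.
        labels = input_ids.clone()
        labels[attention_mask == 0] = -100
        return {
            "input_ids": input_ids,
            "attention_mask": attention_mask,
            "labels": labels,
        }


# Start from a pre-trained base model (GPT-2 purely for illustration).
tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained("gpt2")

# A toy instruction-following dataset; a real one has many such triples.
data = [
    {
        "instruction": "Translate the sentence into French.",
        "input": "Good morning.",
        "response": "Bonjour.",
    },
]

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="teacher-model", num_train_epochs=3),
    train_dataset=InstructionDataset(data, tokenizer),
)
trainer.train()  # Continue training the pre-trained model on instruction data.
```

In practice one would typically start from a much larger base model, use thousands of examples, and mask the instruction and input tokens from the loss so the model is trained only to produce the response; the sketch keeps the full-sequence objective for brevity.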
Updated 2026-04-30
Tags
Foundations of Large Language Models
Ch.3 Prompting - Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences