
Context Distillation

Context distillation is a knowledge distillation method for adapting large language models (LLMs) so that they follow simplified instructions as if they had been given the original, detailed ones. A student model is trained to make predictions from user inputs paired with simplified contexts (such as condensed instructions), while a well-trained, instruction-following teacher model processes the same inputs together with the original, detailed instructions. The student learns by minimizing a loss between its predictions and the teacher's, which transfers the effect of the detailed context into the student's parameters.
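
To make one training step concrete, below is a minimal sketch in PyTorch, assuming the Hugging Face transformers library. The model name gpt2, the prompts, and the single (input, response) pair are hypothetical placeholders: gpt2 stands in for both the instruction-following teacher and the student, and the KL divergence between output distributions over the response tokens is one common choice of distillation loss, not necessarily the one any particular paper uses.

```python
# Minimal, illustrative context-distillation training step.
# gpt2, the prompts, and the (input, response) pair are placeholders.
import torch
import torch.nn.functional as F
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
teacher = AutoModelForCausalLM.from_pretrained("gpt2").eval()   # frozen teacher
student = AutoModelForCausalLM.from_pretrained("gpt2").train()  # trainable student

detailed_ctx = "You are a helpful assistant. Answer politely and concisely."
simple_ctx = "Be helpful."
user_input = " User: What is distillation? Assistant:"
response = " Transferring knowledge from a large model to a smaller one."

def response_logits(model, context, grad):
    """Run the model on context + user_input + response and return the
    logits that predict each response token (logits at step t predict
    token t + 1, hence the shift by one)."""
    prefix_ids = tokenizer(context + user_input, return_tensors="pt").input_ids
    response_ids = tokenizer(response, return_tensors="pt").input_ids
    ids = torch.cat([prefix_ids, response_ids], dim=1)
    with torch.set_grad_enabled(grad):
        logits = model(ids).logits
    start = prefix_ids.size(1) - 1  # logit predicting the first response token
    return logits[:, start : start + response_ids.size(1), :]

# Teacher conditions on the detailed instructions; student on the simplified ones.
t_logits = response_logits(teacher, detailed_ctx, grad=False)
s_logits = response_logits(student, simple_ctx, grad=True)

# Distillation loss: KL divergence between teacher and student distributions
# over the response tokens (F.kl_div expects log-probs as its first argument).
loss = F.kl_div(
    F.log_softmax(s_logits, dim=-1),
    F.softmax(t_logits, dim=-1),
    reduction="batchmean",
)

optimizer = torch.optim.AdamW(student.parameters(), lr=1e-5)
optimizer.zero_grad()
loss.backward()
optimizer.step()
print(f"distillation loss: {loss.item():.4f}")
```

In practice this step would be averaged over a batch of (input, response) pairs and repeated for many updates, with the teacher's weights kept frozen throughout.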

Updated 2026-04-30

Tags

Foundations of Large Language Models

Ch.3 Prompting - Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences