Distilling Prompting Knowledge into Soft Prompts

The principles of knowledge distillation can be applied to transfer prompting knowledge into a student model's parameters. The teacher model is conditioned on an explicit prompt, and the student model is trained to replicate the teacher's outputs without being given that prompt; in this way, the knowledge embedded in the prompt is effectively distilled into whatever parameters the student is allowed to update. When those trainable parameters are a small set of continuous prompt embeddings, the student can be viewed as having encoded the distilled knowledge in the form of a soft prompt.
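A minimal sketch of this idea follows, assuming PyTorch and the Hugging Face transformers library. The model name "gpt2", the hard prompt, the number of soft tokens, and the toy inputs are all illustrative choices, not part of the original text. The same frozen model serves both roles: conditioned on the hard prompt it acts as the teacher, and with trainable soft-prompt embeddings prepended to the input it acts as the student. The KL divergence between their next-token distributions is minimized with respect to the soft prompt alone (a full setup would typically match distributions over whole output sequences rather than a single next token).

```python
# Sketch: distilling a hard prompt into a soft prompt via knowledge distillation.
# Assumptions: "gpt2" backbone, next-token distillation only, toy inputs.
import torch
import torch.nn.functional as F
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # illustrative; any causal LM with inputs_embeds support works
tok = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()
for p in model.parameters():
    p.requires_grad_(False)  # freeze the base model; only the soft prompt is trained

hard_prompt = "Translate English to French: "  # the prompting knowledge to distill
num_soft_tokens = 8
emb_dim = model.get_input_embeddings().embedding_dim
soft_prompt = torch.nn.Parameter(torch.randn(num_soft_tokens, emb_dim) * 0.02)
optimizer = torch.optim.Adam([soft_prompt], lr=1e-3)

def next_token_logits(embeds: torch.Tensor) -> torch.Tensor:
    """Logits for the token following the given embedded sequence."""
    out = model(inputs_embeds=embeds.unsqueeze(0))
    return out.logits[0, -1]

inputs = ["cheese", "good morning", "thank you"]  # toy training inputs (assumed)

for step in range(100):
    total_loss = torch.zeros(())
    for text in inputs:
        x_ids = tok(text, return_tensors="pt").input_ids[0]
        x_emb = model.get_input_embeddings()(x_ids)

        # Teacher: hard prompt + input, no gradients.
        with torch.no_grad():
            p_ids = tok(hard_prompt, return_tensors="pt").input_ids[0]
            p_emb = model.get_input_embeddings()(p_ids)
            teacher_logits = next_token_logits(torch.cat([p_emb, x_emb], dim=0))

        # Student: soft prompt + input; gradients flow only into soft_prompt.
        student_logits = next_token_logits(torch.cat([soft_prompt, x_emb], dim=0))

        # KL(teacher || student) over the vocabulary for this position.
        loss = F.kl_div(
            F.log_softmax(student_logits, dim=-1),
            F.softmax(teacher_logits, dim=-1),
            reduction="sum",
        )
        total_loss = total_loss + loss

    optimizer.zero_grad()
    total_loss.backward()
    optimizer.step()
```

After training, prepending the learned soft_prompt embeddings to new inputs should elicit behavior similar to supplying the original hard prompt, which is what it means for the prompt's knowledge to have been distilled into the soft prompt.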
