Short Answer

Analyzing the Prompt Distillation Process

In the process of distilling prompting knowledge into a soft prompt, a 'teacher' model is given a detailed prompt and generates outputs from it, while a 'student' model is trained on those outputs. Explain the primary role of the original detailed prompt in this process, and describe what happens to the knowledge it contains by the end of training.
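For concreteness, here is a minimal PyTorch sketch of the training loop the question describes, assuming a Hugging Face causal LM; the model name, prompt text, training inputs, and hyperparameters are illustrative placeholders, not details taken from the question:

```python
import torch
import torch.nn.functional as F
from transformers import AutoModelForCausalLM, AutoTokenizer

# Illustrative setup: "gpt2", the prompt text, and all hyperparameters
# below are assumptions for this sketch.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()
for p in model.parameters():          # the LM itself stays frozen
    p.requires_grad_(False)

detailed_prompt = "You are a careful assistant. Answer step by step ..."
soft_prompt_len = 8

embed = model.get_input_embeddings()
# The student's soft prompt: a small trainable matrix of continuous
# embeddings that will absorb the behavior the detailed prompt induces.
soft_prompt = torch.nn.Parameter(
    0.02 * torch.randn(soft_prompt_len, embed.embedding_dim)
)
optimizer = torch.optim.Adam([soft_prompt], lr=1e-3)

def logits_with_prefix(prefix_embeds, input_ids):
    """Run the LM with a prefix of embeddings prepended to the input."""
    tok_embeds = embed(input_ids)                           # (1, T, d)
    inputs = torch.cat([prefix_embeds.unsqueeze(0), tok_embeds], dim=1)
    logits = model(inputs_embeds=inputs).logits
    return logits[:, prefix_embeds.size(0):, :]             # drop prefix slots

prompt_ids = tokenizer(detailed_prompt, return_tensors="pt").input_ids[0]

for text in ["an example training input"]:    # stand-in for a real dataset
    input_ids = tokenizer(text, return_tensors="pt").input_ids

    # Teacher pass: the frozen LM conditioned on the full detailed prompt.
    with torch.no_grad():
        teacher_logits = logits_with_prefix(embed(prompt_ids), input_ids)

    # Student pass: the same LM conditioned only on the soft prompt.
    student_logits = logits_with_prefix(soft_prompt, input_ids)

    # Update the soft prompt so the student's next-token distribution
    # matches the teacher's at every position.
    loss = F.kl_div(
        F.log_softmax(student_logits, dim=-1),
        F.softmax(teacher_logits, dim=-1),
        reduction="batchmean",
    )
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Note that in this sketch the detailed prompt appears only in the teacher pass, and `soft_prompt` is the only tensor being updated; that asymmetry is the mechanism the question asks you to analyze.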


Updated 2025-10-06


Tags: Ch.3 Prompting - Foundations of Large Language Models, Foundations of Large Language Models, Foundations of Large Language Models Course, Computing Sciences, Analysis in Bloom's Taxonomy, Cognitive Psychology, Psychology, Social Science, Empirical Science, Science