Learn Before
Analyzing the Prompt Distillation Process
When prompting knowledge is distilled into a soft prompt, a 'teacher' model is given a detailed prompt and generates outputs, and a 'student' model is trained to reproduce those outputs without ever seeing the detailed prompt. Explain the primary role of the original detailed prompt in this process, and describe what happens to the knowledge it contains by the end of training.
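The following is a minimal sketch of this setup, assuming a Hugging Face causal language model ("gpt2" is only a stand-in, and the grading prompt and task input are hypothetical). One frozen model plays both roles: the teacher pass sees the full detailed prompt, the student pass replaces it with a short trainable soft prompt, and training minimizes the KL divergence between the two output distributions over the task input.

```python
import torch
import torch.nn.functional as F
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("gpt2")   # stand-in backbone
tokenizer = AutoTokenizer.from_pretrained("gpt2")
for p in model.parameters():                           # freeze the backbone;
    p.requires_grad_(False)                            # only the soft prompt trains

# Hypothetical prompt and input, chosen for illustration only.
detailed_prompt = "You are a careful grader. Rate the answer from 1-5 and justify: "
task_input = "Q: What causes tides? A: The moon's gravity."

embed = model.get_input_embeddings()
# Trainable soft prompt: 8 "virtual token" embeddings replacing the long prompt.
soft_prompt = torch.nn.Parameter(0.02 * torch.randn(8, embed.embedding_dim))
opt = torch.optim.Adam([soft_prompt], lr=1e-3)

prompt_ids = tokenizer(detailed_prompt, return_tensors="pt").input_ids
input_ids = tokenizer(task_input, return_tensors="pt").input_ids
n = input_ids.size(1)

for step in range(200):
    with torch.no_grad():  # teacher pass: full detailed prompt + task input
        teacher_logits = model(torch.cat([prompt_ids, input_ids], dim=1)).logits
    # Student pass: soft prompt + task input, fed as embeddings.
    student_embeds = torch.cat([soft_prompt.unsqueeze(0), embed(input_ids)], dim=1)
    student_logits = model(inputs_embeds=student_embeds).logits
    # Match the two next-token distributions over the task-input positions.
    loss = F.kl_div(
        F.log_softmax(student_logits[:, -n:], dim=-1),
        F.softmax(teacher_logits[:, -n:], dim=-1),
        reduction="batchmean",
    )
    opt.zero_grad()
    loss.backward()
    opt.step()
```

By the end of training, the detailed prompt is no longer needed at inference time: its effect has been compressed into the handful of learned soft-prompt vectors.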
Tags
Ch.3 Prompting - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Analysis in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
Applications of Prompt Distillation
Optimizing a Language Model for Mobile Deployment
A team aims to create a smaller, more efficient language model that can perform a specific, complex task without requiring the original, lengthy instruction prompt. They decide to transfer the knowledge from the prompt into the model's parameters. Arrange the steps of this process in the correct logical order.
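As a rough illustration of the pipeline this exercise asks about, here is a sketch of the weight-level variant, again assuming Hugging Face models; the model names, the instruction prompt, and the two toy inputs are placeholders, not a prescribed recipe. The teacher generates completions with the detailed prompt attached; the student is then fine-tuned on bare (input, completion) pairs, so the prompt's behavior ends up in the student's parameters.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

teacher = AutoModelForCausalLM.from_pretrained("gpt2-medium")  # larger teacher (placeholder)
student = AutoModelForCausalLM.from_pretrained("gpt2")         # smaller deployment model
tok = AutoTokenizer.from_pretrained("gpt2")

detailed_prompt = "Answer in one short, factual sentence:\n"   # hypothetical instruction
task_inputs = ["What causes tides?", "Why is the sky blue?"]   # toy data

# Step 1: the teacher answers each input WITH the detailed prompt.
pairs = []
for x in task_inputs:
    ids = tok(detailed_prompt + x, return_tensors="pt").input_ids
    out = teacher.generate(ids, max_new_tokens=30, do_sample=False,
                           pad_token_id=tok.eos_token_id)
    completion = tok.decode(out[0, ids.size(1):], skip_special_tokens=True)
    pairs.append((x, completion))  # Step 2: the prompt itself is dropped here

# Step 3: fine-tune the student on prompt-free (input, completion) pairs.
opt = torch.optim.AdamW(student.parameters(), lr=5e-5)
student.train()
for epoch in range(3):
    for x, y in pairs:
        ids = tok(x + " " + y, return_tensors="pt").input_ids
        loss = student(ids, labels=ids).loss  # standard causal-LM loss
        opt.zero_grad()
        loss.backward()
        opt.step()

# Step 4: deploy the student; the instruction now lives in its weights.
```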