
Illustration of Prompt Tuning

Prompt tuning can be illustrated with a translation task, such as converting English to Chinese. Instead of relying on fixed textual instructions, soft prompts—learnable embedding vectors—are prepended to the standard input embedding sequence. During fine-tuning, only these prompt embeddings are optimized, which adapts the Large Language Model to the target task efficiently while the model's own parameters stay frozen. Once trained, the prompt embeddings are reused at inference time to condition the model on new inputs.
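The mechanics above can be sketched numerically. The following is a minimal toy illustration, not a real LLM: a frozen "model" (fixed input embeddings plus a mean-pool and linear readout) stands in for the pretrained network, and gradient descent updates only the prepended soft-prompt rows. All names, dimensions, and the squared-error objective are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8                      # embedding dimension (toy size)
n_prompt, n_tokens = 4, 6  # soft-prompt length, input sequence length

# Frozen stand-ins for the pretrained model: fixed input embeddings
# and a fixed linear readout. These are never updated.
token_embeds = rng.normal(size=(n_tokens, d))
w = rng.normal(size=d)
target = 1.0               # toy scalar training target

# The soft prompt: the ONLY trainable parameters in prompt tuning.
prompt = rng.normal(size=(n_prompt, d)) * 0.1

def forward(prompt):
    # Prepend the soft-prompt embeddings to the input embedding
    # sequence, then apply the frozen model (mean-pool + readout).
    seq = np.concatenate([prompt, token_embeds], axis=0)
    return seq.mean(axis=0) @ w

lr = 0.05
losses = []
for _ in range(100):
    y = forward(prompt)
    losses.append((y - target) ** 2)
    # Analytic gradient of the squared loss w.r.t. each prompt row;
    # token_embeds and w receive no update.
    grad_row = 2.0 * (y - target) * w / (n_prompt + n_tokens)
    prompt -= lr * grad_row  # broadcast the same row gradient

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

The loss falls as training proceeds even though the "model" itself never changes, which is the core idea: task adaptation lives entirely in the learned prompt embeddings.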

Updated 2026-04-30

Tags

Foundations of Large Language Models

Ch.3 Prompting - Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences
