Example

Visual Representation of Hard vs. Soft Prompts

A hard prompt, such as the instruction "Translate the sentence into Chinese", can be written as a discrete token sequence $c_1, \dots, c_5$. When fed into a large language model, these tokens are mapped to a sequence of real-valued vectors $\mathbf{h}_1, \dots, \mathbf{h}_5$ in the model's embedding space. These continuous hidden states can themselves be viewed as a soft prompt, illustrating the fundamental difference between human-readable text and the continuous representations the model uses internally.
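The mapping from a hard prompt to its soft counterpart can be sketched in a few lines. This is a toy illustration, not a real LLM: the vocabulary, embedding width `d_model`, and random embedding table are all assumptions made for the example, but the structure — a discrete token sequence looked up in an embedding table to yield continuous vectors — mirrors what the model does internally.

```python
import numpy as np

# Toy vocabulary and embedding table (assumptions for illustration only).
rng = np.random.default_rng(0)
vocab = {"Translate": 0, "the": 1, "sentence": 2, "into": 3, "Chinese": 4}
d_model = 8                                 # assumed embedding width
embedding = rng.normal(size=(len(vocab), d_model))

# The hard prompt: a human-readable, discrete token sequence c_1 ... c_5.
hard_prompt = ["Translate", "the", "sentence", "into", "Chinese"]
token_ids = [vocab[tok] for tok in hard_prompt]

# The embedding lookup yields one continuous vector per token,
# h_1 ... h_5 -- the form a soft prompt takes directly.
soft_prompt = embedding[token_ids]

print(soft_prompt.shape)   # one d_model-dimensional vector per token
```

A soft prompt skips the discrete stage entirely: instead of being looked up from text, the vectors are free parameters optimized directly in this continuous space.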


Updated 2026-04-30


Ch.3 Prompting - Foundations of Large Language Models
