Concept

Encoding Soft Prompts with Sequence Models

A technique to better represent a soft prompt by treating its sequence of vectors $(p_0, p_1, \dots, p_n)$ as the input to a dedicated sequence model, such as a Transformer. This secondary model encodes the entire soft prompt sequence, and its output representation is then used as the actual prompt input for the main Large Language Model. In effect, this amounts to training an additional model specifically for encoding soft prompts.
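
To make the idea concrete, here is a minimal sketch in PyTorch, assuming a learnable prompt of 20 vectors of dimension 768 re-encoded by a 2-layer Transformer encoder before being prepended to the LLM's token embeddings. The class name, sizes, and the random embedding stand-ins are illustrative assumptions, not details from the source.

```python
import torch
import torch.nn as nn

class SoftPromptEncoder(nn.Module):
    """Sketch: re-encode learnable soft prompt vectors p_0..p_n with a
    small Transformer before handing them to the main LLM.
    All hyperparameters here are illustrative assumptions."""

    def __init__(self, prompt_len: int = 20, dim: int = 768,
                 n_heads: int = 8, n_layers: int = 2):
        super().__init__()
        # Learnable soft prompt: one row per prompt vector p_i.
        self.prompt = nn.Parameter(torch.randn(prompt_len, dim) * 0.02)
        # Secondary sequence model that encodes the whole prompt sequence.
        layer = nn.TransformerEncoderLayer(
            d_model=dim, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)

    def forward(self, batch_size: int) -> torch.Tensor:
        # Share one prompt across the batch, then re-encode it so the
        # prompt vectors can interact through self-attention.
        p = self.prompt.unsqueeze(0).expand(batch_size, -1, -1)
        return self.encoder(p)  # shape: (batch, prompt_len, dim)

# Usage: prepend the encoded prompt to the LLM's input embeddings.
prompt_encoder = SoftPromptEncoder(prompt_len=20, dim=768)
token_embeds = torch.randn(4, 32, 768)        # stand-in for embed(tokens)
encoded_prompt = prompt_encoder(batch_size=4)  # (4, 20, 768)
llm_inputs = torch.cat([encoded_prompt, token_embeds], dim=1)
print(llm_inputs.shape)  # torch.Size([4, 52, 768])
```

The point of re-encoding is that the prompt vectors can attend to one another before reaching the LLM, rather than being optimized as independent embeddings.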

Updated 2026-04-30

Tags

Ch.3 Prompting - Foundations of Large Language Models

Foundations of Large Language Models

Computing Sciences

Foundations of Large Language Models Course
