Classification

Alternative Methods for Soft Prompt Optimization

Soft prompt optimization can be framed in at least two ways. The first treats it as a maximum likelihood problem: maximize the log-probability of the desired output given the soft prompt (equivalently, minimize the cross-entropy loss on the target). The second is distillation-style: minimize the Kullback-Leibler (KL) divergence between the model's output distribution when conditioned on the full context and its output distribution when conditioned on the compressed soft prompt, so that the soft prompt learns to mimic the full context's behavior.
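The two objectives can be contrasted numerically. Below is a minimal sketch with NumPy; the vocabulary size, logit values, and target token index are all hypothetical, standing in for the model's next-token logits under the full context versus the compressed soft prompt.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical next-token logits over a toy 4-token vocabulary.
logits_full = np.array([2.0, 0.5, -1.0, 0.0])  # conditioned on the full context
logits_soft = np.array([1.5, 0.7, -0.5, 0.1])  # conditioned on the soft prompt

p_full = softmax(logits_full)
p_soft = softmax(logits_soft)

# Objective 1 (maximum likelihood): minimize the negative log-probability
# of the desired output token y under the soft-prompted model.
y = 0  # hypothetical target token index
mle_loss = -np.log(p_soft[y])

# Objective 2 (KL divergence): minimize KL(p_full || p_soft), pushing the
# soft-prompted distribution toward the full-context distribution.
kl_loss = np.sum(p_full * (np.log(p_full) - np.log(p_soft)))
```

Note the difference in supervision: the MLE objective only needs the target token, while the KL objective needs the full-context model's entire output distribution, which makes it a form of knowledge distillation.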


Updated 2025-10-10

Tags

Ch.4 Alignment - Foundations of Large Language Models

Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences