Exploring and Learning Non-String Prompt Representations

A key research question in automated prompting is whether prompts admit more compact and efficient representations than discrete text strings. The question is motivated by the fact that LLMs internally encode a discrete prompt (a "hard prompt") into low-dimensional continuous vectors, which suggests that a more direct, non-string representation, commonly called a "soft prompt", could be both possible and more efficient.
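
As a minimal sketch of the idea, assuming a PyTorch-style model that consumes precomputed token embeddings (the class and parameter names below are illustrative, not from any particular library): a soft prompt replaces the discrete string with a small set of trainable vectors that are prepended to the input embeddings and optimized by gradient descent while the LLM's own weights stay frozen.

```python
import torch
import torch.nn as nn

class SoftPrompt(nn.Module):
    """Learnable 'soft prompt': continuous vectors standing in for a
    discrete text prompt, prepended to the model's input embeddings."""

    def __init__(self, num_virtual_tokens: int, embed_dim: int):
        super().__init__()
        # The entire prompt "representation" is just these parameters,
        # trained directly instead of searched over as a string.
        self.prompt = nn.Parameter(torch.randn(num_virtual_tokens, embed_dim) * 0.02)

    def forward(self, input_embeds: torch.Tensor) -> torch.Tensor:
        # input_embeds: (batch, seq_len, embed_dim) embeddings of the task input.
        batch = input_embeds.size(0)
        prompt = self.prompt.unsqueeze(0).expand(batch, -1, -1)
        # The frozen LLM conditions on these vectors exactly as it would
        # on the embeddings of real prompt tokens.
        return torch.cat([prompt, input_embeds], dim=1)

# Hypothetical usage: 20 virtual tokens for a model with 768-dim embeddings.
sp = SoftPrompt(num_virtual_tokens=20, embed_dim=768)
x = torch.randn(2, 16, 768)   # embedded task input (batch=2, 16 tokens)
print(sp(x).shape)            # torch.Size([2, 36, 768])
```

Because only these few vectors are trained, the prompt occupies a compact continuous space rather than the combinatorial space of token strings, which is the efficiency argument sketched above.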
