Learn Before
Prompt as a Form of Context
In the framework of context compression, prompts and contexts are treated as conceptually similar, though not identical: a prompt can be viewed as a specific type of context. Adopting this viewpoint lets the same compression methods generalize across both prompts and longer contexts.
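The idea can be made concrete with a toy sketch, assuming a stand-in "model" and illustrative names (none of this comes from the text above): a long instruction context is replaced by a short set of learnable vectors (a soft prompt), trained so the model's output given the soft prompt approximates its output given the full context.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 8          # toy embedding dimension
ctx_len = 64   # length of the full instruction context
soft_len = 4   # length of the compressed soft prompt

# Stand-in "model": mean-pool the input embeddings, apply a fixed projection.
W = rng.normal(size=(d, d)) / np.sqrt(d)

def model(embeddings):
    return np.tanh(embeddings.mean(axis=0) @ W)

full_context = rng.normal(size=(ctx_len, d))  # stands in for the long instructions
user_input = rng.normal(size=(5, d))          # stands in for a user query

# Behaviour to replicate: the model's output when it sees the full context.
target = model(np.vstack([full_context, user_input]))

# Learn the soft prompt by gradient descent on the output mismatch.
soft_prompt = rng.normal(size=(soft_len, d)) * 0.1
lr = 5.0
n = soft_len + user_input.shape[0]

def error(prompt):
    return float(np.mean((model(np.vstack([prompt, user_input])) - target) ** 2))

initial_error = error(soft_prompt)
for _ in range(3000):
    pred = model(np.vstack([soft_prompt, user_input]))
    # Backprop through tanh, the projection W, and the mean-pool.
    grad_pooled = (2.0 / d) * ((pred - target) * (1.0 - pred ** 2)) @ W.T
    soft_prompt -= lr * grad_pooled / n  # same gradient for every prompt row

final_error = error(soft_prompt)
```

After training, the 4-vector soft prompt steers the toy model toward the behaviour it would have shown with the full 64-vector context, which is the sense in which a learned prompt acts as compressed context.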
Tags
Ch.4 Alignment - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Ch.3 Prompting - Foundations of Large Language Models
Related
Optimization Goal for Soft Prompt Learning via Context Compression
Challenge of Context Compression for Long Sequences
A research team is developing a system where a very long, detailed set of instructions is 'compressed' into a compact, learnable set of numerical values. This compact representation is then used to guide a language model in performing a specific task, aiming to replicate the performance that would be achieved if the model had processed the full set of instructions. What is the most significant practical challenge the team will face when implementing this 'compression' process?
Applying Context Compression for a Specialized Task
Sequential Context Compression with an RNN-like Mechanism
The Goal of Context Compression for Soft Prompts
Learn After
Applicability of Context Compression to General Text Compression
A developer is trying to make a language model consistently adhere to a long, complex style guide. Instead of providing the entire guide with every request, they use a technique to distill the guide's key principles into a compact, learned set of instructions that is automatically prefixed to each user's input. Which statement best analyzes the relationship between the full style guide and the compact, learned instructions in this scenario?
Unifying Principle in Model Guidance
The Relationship Between Prompts and Contexts