Learn Before
Example of Prefix Language Modeling
In prefix language modeling, an initial text sequence is provided as context, and the model is tasked with generating the remaining text. For instance, given the input prefix [C] The kitten is (where [C] represents a specific context token), the model autoregressively generates the continuation: chasing⁰ the¹ ball² .³ The superscripts indicate the order in which the target tokens are generated.
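The autoregressive continuation described above can be sketched in a few lines of Python. This is a toy illustration only: the hand-written lookup table stands in for a trained model's next-token distribution, and the `next_token` and `generate` names are hypothetical, not from any library.

```python
# Toy stand-in for a trained decoder: maps the last token to the next one.
# A real model would compute P(token | full context) with a neural network.
NEXT = {"is": "chasing", "chasing": "the", "the": "ball", "ball": "."}

def next_token(tokens):
    # Look up the most recent token; fall back to end-of-sentence.
    return NEXT.get(tokens[-1], ".")

def generate(prefix, max_new_tokens=4):
    tokens = prefix.split()
    for _ in range(max_new_tokens):  # each iteration = one generation step
        tok = next_token(tokens)     # condition on everything generated so far
        tokens.append(tok)
        if tok == ".":               # stop once the sentence is complete
            break
    return " ".join(tokens)

print(generate("[C] The kitten is"))
# → [C] The kitten is chasing the ball .
```

Each loop iteration corresponds to one numbered generation step: the model sees the prefix plus all previously generated tokens and emits exactly one new token.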
Tags
Foundations of Large Language Models
Ch.1 Pre-training - Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Related
Comparison of Prefix and Causal Language Modeling
Example of Prefix Language Modeling Input Format
Training Encoder-Decoder Models with Prefix Language Modeling
Consider a model architecture composed of an encoder and a decoder, trained with a self-supervised objective to complete a text sequence given an initial prefix. Which statement best analyzes the distinct processing methods of the encoder and decoder for this task?
Processing a Text Sequence
In a self-supervised text generation task, a model is given an initial sequence of words (a prefix) and trained to produce the words that follow. For an architecture that uses two distinct components to accomplish this, match each component or data piece with its primary role or characteristic.