
Example of Permuted Language Modeling

In permuted language modeling, the input sequence is provided in its natural order, but the model learns to predict its tokens in an arbitrarily permuted generation order. For example, given the full input sentence [C] The kitten is chasing the ball ., the model might be tasked with generating the output tokens in a non-sequential order such as: The^5 kitten^7 is^6 chasing^1 the^4 ball^2 .^3. Here, each superscript indicates the step at which that token is generated: chasing is generated first, ball second, and so on, simulating an autoregressive process over a shuffled target ordering.
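The factorization above can be sketched in a few lines of Python. This is a minimal illustration, not a training implementation: it takes the example sentence and the generation order implied by the superscripts (chasing first, ball second, and so on), and shows the autoregressive context available at each prediction step. The variable names (`order`, `steps`) are illustrative choices, not part of any standard API.

```python
# Tokens keep their natural positions; only the *prediction order* is permuted.
tokens = ["The", "kitten", "is", "chasing", "the", "ball", "."]

# Generation order from the example, as 0-based positions:
# chasing (1st), ball (2nd), . (3rd), the (4th), The (5th), is (6th), kitten (7th).
order = [3, 5, 6, 4, 0, 2, 1]

# At step t the model predicts the token at position order[t],
# conditioned only on the tokens predicted at earlier steps.
steps = []
for t, pos in enumerate(order):
    context = [tokens[p] for p in order[:t]]
    steps.append((tokens[pos], context))
    print(f"step {t + 1}: predict {tokens[pos]!r} given {context}")
```

Note that a full model (e.g. an XLNet-style objective) would also condition on the *positions* of the context tokens via attention masking, so the predictor knows where each observed token sits in the original sequence; this sketch only tracks which tokens are visible at each step.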

Updated 2026-04-17
