Fill in the Blank

An autoregressive language model generates a sequence of tokens one at a time. To predict the fifth token in the sequence, denoted y_5, the model uses all the previously generated tokens as context. The standard notation for this preceding subsequence of tokens is ____.
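The conditioning described above can be sketched in a toy loop: at each step t the model's only input is the prefix of tokens generated so far, y_{<t} = (y_1, ..., y_{t-1}). The `next_token` function below is a hypothetical stand-in for p(y_t | y_{<t}), not a real model.

```python
# Minimal sketch of autoregressive conditioning (toy stand-in, not a real LM).
def next_token(prefix):
    # Hypothetical "model": a deterministic function of the prefix y_{<t}.
    return sum(prefix) % 10 if prefix else 1

tokens = []
for t in range(1, 6):       # generate y_1 .. y_5
    context = tokens[:]     # y_{<t}: all previously generated tokens
    tokens.append(next_token(context))

# When y_5 was predicted, its context was y_{<5} = tokens[:4],
# i.e. the first four generated tokens.
```

The key point the blank is testing: the context for y_5 is the whole prefix, written compactly rather than listing y_1, y_2, y_3, y_4 out in full.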

Updated 2025-10-07

Tags

Ch.5 Inference - Foundations of Large Language Models

Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences

Comprehension in Revised Bloom's Taxonomy

Cognitive Psychology

Psychology

Social Science

Empirical Science

Science