Learn Before
Multiple Choice

A language model is being trained for text generation. During training, it learns from examples where each target sentence is represented as a sequence of tokens. When tested, the model successfully begins generating text but then fails to stop, producing an endless stream of words. Based on this specific failure, which essential structural token was most likely omitted from the end of each target sentence in the training data?
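The failure described is the classic symptom of a missing end-of-sequence (EOS) token: if no target sentence ends with EOS during training, the model never learns to predict it, so the decoding loop's stopping condition is never triggered. A minimal sketch (using a hypothetical toy model and assumed token ids, not any specific library) illustrates the mechanism:

```python
# Minimal sketch (hypothetical toy model) of why the EOS token matters.
# If training targets never end with EOS, the model never learns to emit
# it, and the stopping check in the generation loop below never fires.

EOS_ID = 2          # assumed vocabulary id for the EOS token
MAX_STEPS = 20      # safety cap so a runaway model still terminates

def toy_next_token(context):
    """Stand-in for a trained model: returns the next token id.
    A model trained on EOS-terminated targets learns to emit EOS_ID
    once the sentence is complete; one trained without EOS does not."""
    # Pretend the model finishes a 5-token sentence, then emits EOS.
    return EOS_ID if len(context) >= 5 else len(context) + 10

def generate(prompt_ids):
    out = list(prompt_ids)
    for _ in range(MAX_STEPS):
        nxt = toy_next_token(out)
        if nxt == EOS_ID:   # stopping condition: only useful if the
            break           # model was trained to produce EOS
        out.append(nxt)
    return out

print(generate([10, 11]))   # stops once the toy model emits EOS
```

If `toy_next_token` never returned `EOS_ID` (the situation in the question), generation would only halt at the `MAX_STEPS` cap, mirroring the "endless stream of words" behavior.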


Updated 2025-10-03


Tags

Ch.5 Inference - Foundations of Large Language Models; Foundations of Large Language Models; Foundations of Large Language Models Course; Computing Sciences; Analysis in Bloom's Taxonomy; Cognitive Psychology; Psychology; Social Science; Empirical Science; Science