Learn Before
Multiple Choice

A language model is given the complete input token sequence $\mathbf{x} = [\langle\text{SOS}\rangle, \text{'What'}, \text{'is'}, \text{'the'}, \text{'capital'}, \text{'of'}, \text{'France'}, \text{'?'}]$. By analyzing the components of this sequence, identify which token's primary role is to signal the beginning of the input context for the model.

0

1
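The answer choices refer to positions in the sequence. A minimal sketch (the list below mirrors the question's sequence; `"<SOS>"` stands in for the special start-of-sequence marker, which is a single token rather than natural-language text) shows that the start-of-sequence token sits at index 0:

```python
# Token sequence from the question; "<SOS>" is a special marker
# that signals the beginning of the input context.
x = ["<SOS>", "What", "is", "the", "capital", "of", "France", "?"]

# The start-of-sequence token occupies the first position, index 0.
sos_index = x.index("<SOS>")
print(sos_index)  # 0
```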

Updated 2025-10-08


Tags

Ch.5 Inference - Foundations of Large Language Models

Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences

Ch.2 Generative Models - Foundations of Large Language Models

Analysis in Bloom's Taxonomy

Cognitive Psychology

Psychology

Social Science

Empirical Science

Science