Learn Before
  • Diagram of the N-th Step in Transformer Decoding

Case Study

Contextual Attention in Sentence Completion

Analyze the following scenario based on the mechanics of a single decoding step in a Transformer model.


Updated 2025-10-09

Contributors:

Gemini AI

From:

Google

Tags

Ch.5 Inference - Foundations of Large Language Models

Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences

Analysis in Bloom's Taxonomy

Cognitive Psychology

Psychology

Social Science

Empirical Science

Science

Related
  • A Transformer-based language model is given the prompt 'The quick brown fox' and begins generating a continuation. It has already produced the tokens 'jumps', 'over'. The model is now at the step of generating the next token after 'over'. During the self-attention calculation at this specific step, which set of tokens provides the source for the keys and values that the current token's query will attend to?

  • A Transformer decoder is at the N-th step of generating an output sequence, having already processed an initial prompt and the first N-1 output tokens. Arrange the following key operations that occur during this specific N-th step in the correct chronological order.

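The decoding step described in these questions can be sketched in code: at step N, the current token's query attends to keys and values derived from every token processed so far (the prompt plus the previously generated tokens plus the current one), typically held in a key/value cache. The following is a minimal illustrative sketch with a single attention head, toy random weights, and assumed dimensions; it is not a faithful Transformer implementation, and all names (`decode_step`, `W_q`, etc.) are invented for illustration.

```python
# Minimal sketch of the N-th decoding step with a key/value cache.
# Assumptions (not from the source): single head, toy dimension d=8,
# random weights, pre-filled caches for the six tokens already seen.
import numpy as np

rng = np.random.default_rng(0)
d = 8                                        # model/head dimension (assumed)
W_q, W_k, W_v = (rng.standard_normal((d, d)) for _ in range(3))

def decode_step(x_new, k_cache, v_cache):
    """One decoding step: the new token's query attends to the keys and
    values of ALL tokens processed so far (prompt + generated + itself)."""
    # 1. Project the current token's hidden state into query/key/value.
    q = x_new @ W_q
    k = x_new @ W_k
    v = x_new @ W_v
    # 2. Append this step's key and value to the cache.
    k_cache = np.vstack([k_cache, k])
    v_cache = np.vstack([v_cache, v])
    # 3. Score the query against every cached key (causal by construction:
    #    the cache only ever contains past and current tokens).
    scores = (k_cache @ q) / np.sqrt(d)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # 4. Weighted sum of cached values -> context vector for this position,
    #    which then feeds the rest of the layer and, finally, the logits.
    out = weights @ v_cache
    return out, k_cache, v_cache

# Prompt of 4 tokens ('The quick brown fox') plus two generated tokens
# ('jumps', 'over') already processed: 6 entries in each cache.
k_cache = rng.standard_normal((6, d)) @ W_k
v_cache = rng.standard_normal((6, d)) @ W_v

# N-th step: generating the token that follows 'over'.
x_new = rng.standard_normal(d)
out, k_cache, v_cache = decode_step(x_new, k_cache, v_cache)
print(k_cache.shape)   # → (7, 8): 6 cached tokens + the current one
```

Note the order of operations inside `decode_step` mirrors the chronology asked about in the second question: project the new token, extend the cache, compute attention over the full cache, then produce the output that leads to the next-token distribution.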

© 1Cademy 2026
