Learn Before
  • Comparison of Context Usage in Causal vs. Masked Language Modeling

Matching

Match each description of a language model's prediction task or characteristic to the type of contextual information it utilizes.
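The key distinction being tested is what context each objective can see. As a minimal sketch (variable names are illustrative, not from any library), the tokens visible to a causal language model versus a masked language model when predicting 'jumps' can be written as:

```python
# Toy illustration of the context available under each training objective.
sentence = "The quick brown fox jumps over the lazy dog".split()
target = sentence.index("jumps")  # position of the word to predict

# Causal LM: only tokens *before* the target are visible.
causal_context = sentence[:target]

# Masked LM: every token is visible except the target itself,
# which is replaced by a mask placeholder in the input.
masked_context = sentence[:target] + ["[MASK]"] + sentence[target + 1:]

print(causal_context)  # ['The', 'quick', 'brown', 'fox']
print(masked_context)
```

The causal model sees four tokens of left-hand context, while the masked model sees all eight other tokens, both before and after the hidden word.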

Updated 2025-10-10

Contributors are:

Gemini AI

Who are from:

Google

Tags
  • Ch.1 Pre-training - Foundations of Large Language Models
  • Foundations of Large Language Models
  • Foundations of Large Language Models Course
  • Computing Sciences
  • Analysis in Bloom's Taxonomy
  • Cognitive Psychology
  • Psychology
  • Social Science
  • Empirical Science
  • Science

Related
  • Causal Language Modeling as a Special Case of Masked Language Modeling

  • Example of Masked Language Modeling Prediction

  • Consider two different approaches for training a language model to predict a specific word within a sentence.

    Approach 1: The model is trained to predict the next word in a sequence, using only the words that have appeared before it.

    Approach 2: The model is trained to predict a word that has been intentionally hidden, using all the other visible words in the sentence, both those that come before and after the hidden word.

    If both models are tasked with predicting the word 'jumps' in the sentence 'The quick brown fox jumps over the lazy dog', which statement correctly analyzes the contextual information available to each model for this specific task?

  • Choosing the Right Contextual Approach for Language Tasks

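The related item on causal language modeling as a special case of masked language modeling can also be pictured through attention masks. In a toy sketch (assuming NumPy; the variable names are illustrative), a causal model restricts each position to earlier positions via a lower-triangular mask, whereas a masked model lets every position attend everywhere and hides the target in the input instead:

```python
import numpy as np

n = 5  # toy sequence length

# Causal LM: position i may attend only to positions <= i,
# giving a lower-triangular attention mask.
causal_mask = np.tril(np.ones((n, n), dtype=int))

# Masked LM: every position attends to every position; the
# prediction target is hidden in the input, not via attention.
mlm_mask = np.ones((n, n), dtype=int)

print(causal_mask)
print(mlm_mask)
```

Row i of `causal_mask` has exactly i + 1 ones, so the first token sees only itself while the last sees the full prefix; `mlm_mask` is all ones.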
