Short Answer

Bidirectional Context in Language Modeling

Consider the sentence: 'The driver turned the steering ___ to avoid the obstacle.' A language model is tasked with predicting the missing word. Explain how a model trained using a masked language modeling objective would leverage the available information differently than a model that can only process the sentence from left to right. Specifically, what part of the sentence provides crucial context that the left-to-right model would miss?
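As an illustrative sketch (this code is not part of the original question), the difference in available context can be made concrete by splitting the sentence around the blank: a left-to-right model sees only the tokens before the blank, while a masked language model also sees the tokens after it.

```python
# Tokens of the example sentence, with the blank marked as [MASK].
tokens = ["The", "driver", "turned", "the", "steering", "[MASK]",
          "to", "avoid", "the", "obstacle", "."]

i = tokens.index("[MASK]")

left_context = tokens[:i]       # visible to BOTH model types
right_context = tokens[i + 1:]  # visible ONLY to the masked LM

print("left-to-right model sees:", left_context)
print("masked LM additionally sees:", right_context)
```

Here `right_context` ("to avoid the obstacle .") is exactly the information a purely left-to-right model cannot use when predicting the missing word.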

Updated 2025-10-02

Tags

Ch.1 Pre-training - Foundations of Large Language Models

Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences

Analysis in Bloom's Taxonomy

Cognitive Psychology

Psychology

Social Science

Empirical Science

Science