Case Study

Contextual Representation Analysis

A team is using a pre-trained sequence encoder and feeds it two different sentences, each containing the word 'close'. They observe that the vector produced for 'close' differs significantly between the two outputs. Based on the principles of how these models generate representations, explain this phenomenon.
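The phenomenon in question can be demonstrated with a minimal sketch of a single self-attention layer (toy dimensions, random weights, and a made-up vocabulary; not the team's actual model). The static input embedding for 'close' is identical in both sentences, but because attention mixes in information from the surrounding tokens, the contextual output for 'close' differs:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # toy embedding size

# Static (context-independent) embeddings: one fixed vector per word.
vocab = ["please", "close", "the", "door", "we", "sat", "to", "stage"]
emb = {w: rng.normal(size=d) for w in vocab}

# One self-attention layer: each output token is a weighted mix of
# every token in the sentence, so the same input vector can map to
# different outputs in different contexts.
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))

def encode(tokens):
    X = np.stack([emb[t] for t in tokens])   # (n, d) static inputs
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(d)
    attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
    attn /= attn.sum(axis=-1, keepdims=True)
    return attn @ V                           # (n, d) contextual outputs

s1 = ["please", "close", "the", "door"]
s2 = ["we", "sat", "close", "to", "the", "stage"]

v1 = encode(s1)[s1.index("close")]
v2 = encode(s2)[s2.index("close")]

# Same static input, different contextual outputs.
print(np.allclose(v1, v2))  # False
```

A static embedding table would return the same vector for 'close' in both sentences; it is the attention over the sentence-specific neighbors that makes the two outputs diverge.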


Updated 2025-10-05


Tags

Ch.1 Pre-training - Foundations of Large Language Models

Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences

Analysis in Bloom's Taxonomy

Cognitive Psychology

Psychology

Social Science

Empirical Science

Science