Concept

Shift from Word to Sequence Representations

Building on the success of representing individual words as vectors, research in NLP expanded to learning representations for entire sequences of text. This shift was enabled by increasingly powerful language models, such as those built on LSTM architectures. The subsequent introduction of the Transformer dramatically accelerated the trend, spurring a surge of work on sequence representation techniques.
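
To make the idea of a sequence representation concrete, here is a minimal sketch. The token vectors are hypothetical (random here; in practice they would come from a trained encoder such as an LSTM or Transformer), and mean pooling is just one simple way, among many, to collapse per-token vectors into a single fixed-size vector for the whole sequence.

```python
import numpy as np

# Hypothetical token vectors for a three-word sentence.
# In a real system these would be produced by a trained model;
# here they are random, purely for illustration.
rng = np.random.default_rng(0)
token_vectors = rng.normal(size=(3, 8))  # 3 tokens, 8-dimensional embeddings

# Mean pooling: average the token vectors to obtain one
# fixed-size vector representing the entire sequence.
sequence_vector = token_vectors.mean(axis=0)

print(sequence_vector.shape)  # (8,) -- one vector per sequence
```

The key point is that, regardless of sequence length, the result is a single fixed-size vector, which can then be compared, clustered, or fed to downstream tasks just like a word vector.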

