
Sequence Representation via Language Models

Following the success of word embeddings learned through simple prediction tasks, researchers began exploring representations of entire sequences using more powerful language models, such as LSTM-based models. Progress accelerated and interest in sequence representation surged after the Transformer architecture was proposed.
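As a concrete illustration, the sketch below shows one common way to obtain a fixed-size representation of a sequence from an LSTM-based model: run the tokens through the network and take the final hidden state as the sequence vector. This is a minimal sketch assuming PyTorch; the model, its dimensions, and all names (`ToyLSTMEncoder`, `vocab_size`, etc.) are illustrative, not taken from a specific paper or library.

```python
# Minimal sketch: using an LSTM's final hidden state as a sequence
# representation. All names and sizes here are illustrative assumptions.
import torch
import torch.nn as nn

class ToyLSTMEncoder(nn.Module):
    def __init__(self, vocab_size: int = 1000, embed_dim: int = 64, hidden_dim: int = 128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len) integer tensor of token indices
        embedded = self.embed(token_ids)      # (batch, seq_len, embed_dim)
        _, (h_n, _) = self.lstm(embedded)     # h_n: (1, batch, hidden_dim)
        return h_n.squeeze(0)                 # (batch, hidden_dim) sequence vectors

encoder = ToyLSTMEncoder()
tokens = torch.randint(0, 1000, (2, 10))      # two dummy sequences of 10 tokens
reps = encoder(tokens)
print(reps.shape)                             # torch.Size([2, 128])
```

Transformer-based encoders follow the same idea but typically derive the sequence vector by pooling over all token states, for example via a special [CLS] token or mean pooling, rather than from a single recurrent hidden state.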
