Learn Before
Concept

BERT Input Representation: Single and Paired Sentences

BERT models are built to represent either individual sentences or pairs of sentences, equipping them to tackle a variety of downstream language understanding problems. When a task requires handling two sentences simultaneously, the input is arranged as a single combined sequence containing both sentences, typically denoted as $\mathrm{Sent}_{A}$ and $\mathrm{Sent}_{B}$.
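
As an illustration, here is a minimal sketch of how this packing looks in practice, assuming the Hugging Face transformers tokenizer for BERT (the library and example sentences are assumptions, not part of the original text). Standard BERT preprocessing prepends a [CLS] token, separates the two sentences with [SEP] tokens, and uses segment IDs to mark which sentence each token belongs to:

```python
# A minimal sketch assuming the Hugging Face `transformers` library;
# the example sentences are illustrative only.
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

# Single-sentence input: [CLS] Sent_A [SEP]
single = tokenizer("The cat sat on the mat.")
print(tokenizer.convert_ids_to_tokens(single["input_ids"]))
# ['[CLS]', 'the', 'cat', 'sat', 'on', 'the', 'mat', '.', '[SEP]']

# Sentence-pair input packed into one combined sequence:
# [CLS] Sent_A [SEP] Sent_B [SEP]
pair = tokenizer("The cat sat on the mat.", "The dog is big.")
print(tokenizer.convert_ids_to_tokens(pair["input_ids"]))
# ['[CLS]', 'the', 'cat', 'sat', 'on', 'the', 'mat', '.', '[SEP]',
#  'the', 'dog', 'is', 'big', '.', '[SEP]']

# Segment (token type) IDs distinguish the two sentences:
# 0 for Sent_A (including [CLS] and the first [SEP]), 1 for Sent_B.
print(pair["token_type_ids"])
# [0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1]
```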


