Example of Token Masking in a BERT Input Sequence

An example of token masking can be seen with a two-sentence input prepared for BERT. The original sequence, [CLS] It is raining . [SEP] I need an umbrella . [SEP], is modified by replacing selected tokens with the [MASK] symbol. This results in the masked sequence: [CLS] It is [MASK] . [SEP] I need [MASK] umbrella . [SEP], where the model's task is to predict the original words 'raining' and 'an'.
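The masking step described above can be sketched in plain Python. This is a minimal illustration, assuming whitespace-pretokenized input and hand-picked mask positions; a real BERT pipeline would use a WordPiece tokenizer and sample positions randomly (masking about 15% of tokens). The `mask_tokens` helper is hypothetical, written only for this example.

```python
def mask_tokens(tokens, positions, mask_token="[MASK]"):
    """Replace tokens at the given positions with the mask symbol.

    Returns the masked sequence and a map from position to original
    token -- the prediction targets for masked language modeling.
    """
    masked = list(tokens)
    targets = {}
    for pos in positions:
        targets[pos] = masked[pos]
        masked[pos] = mask_token
    return masked, targets


# The two-sentence input from the example, pretokenized on whitespace.
tokens = "[CLS] It is raining . [SEP] I need an umbrella . [SEP]".split()

# Mask 'raining' (index 3) and 'an' (index 8).
masked, targets = mask_tokens(tokens, positions=[3, 8])

print(" ".join(masked))
# → [CLS] It is [MASK] . [SEP] I need [MASK] umbrella . [SEP]
print(targets)
# → {3: 'raining', 8: 'an'}
```

During pre-training, the model sees only the masked sequence and is trained to recover the entries in `targets` at the masked positions.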

Updated 2026-05-02

Tags

Ch.1 Pre-training - Foundations of Large Language Models

Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences
