Example of a Denoising Autoencoder Task for Encoder-Decoder Models

This example illustrates the input and output of an encoder-decoder model trained with a denoising objective. The encoder receives a corrupted sequence in which some tokens have been replaced by a mask token: [CLS] The puppies are [MASK] outside [MASK] house . Conditioned on the encoder's hidden representations, the decoder then reconstructs the original, uncorrupted sequence, starting from the start-of-sequence token: ⟨s⟩ The puppies are frolicking outside the house .
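The construction of such a training pair can be sketched in a few lines. This is a minimal illustration assuming a simple token-level masking scheme (the `corrupt` helper and the choice of masked positions are this sketch's own, not a prescribed procedure from the text):

```python
MASK = "[MASK]"

def corrupt(tokens, mask_positions):
    """Replace the tokens at the given positions with the mask token."""
    return [MASK if i in mask_positions else t for i, t in enumerate(tokens)]

original = ["The", "puppies", "are", "frolicking", "outside", "the", "house", "."]

# Mask positions 3 ("frolicking") and 5 ("the") to reproduce the example above.
encoder_input = ["[CLS]"] + corrupt(original, {3, 5})

# The decoder's target is the uncorrupted sequence, prefixed with a
# start-of-sequence token (written ⟨s⟩ above).
decoder_target = ["<s>"] + original

print(" ".join(encoder_input))
# [CLS] The puppies are [MASK] outside [MASK] house .
print(" ".join(decoder_target))
# <s> The puppies are frolicking outside the house .
```

The model is trained so that, given `encoder_input`, the decoder generates `decoder_target` token by token; the loss is computed against the original tokens, including those that were never masked.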

Updated 2026-04-16

Tags

Ch.1 Pre-training - Foundations of Large Language Models