Learn Before
Example of Span Masking in Denoising Autoencoding
In denoising autoencoding for encoder-decoder models, span masking replaces one or more spans of tokens in an input sequence with mask symbols. For example, the clean sequence The kitten is chasing the ball . can be corrupted into [C] The kitten [M] chasing [M] ., where the first [M] stands in for the span is, the second [M] stands in for the span the ball, and [C] is a special token prepended to the corrupted input. Given this corrupted input, the model is trained to reconstruct the full, clean target sequence, generating The kitten is chasing the ball .
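As a concrete illustration, here is a minimal Python sketch of the corruption step, not taken from the course: the function name corrupt_with_span_masks, the use of plain strings for [C] and [M], and whitespace tokenization are all illustrative assumptions.

def corrupt_with_span_masks(tokens, spans, mask_symbol="[M]", start_symbol="[C]"):
    """Replace each (start, end) token span with a single mask symbol.

    tokens: list of tokens for the clean sequence.
    spans:  list of (start, end) index pairs (end exclusive), non-overlapping.
    Returns the corrupted token list fed to the encoder; the clean tokens
    serve as the reconstruction target for the decoder.
    """
    corrupted = [start_symbol]
    i = 0
    for start, end in sorted(spans):
        corrupted.extend(tokens[i:start])  # keep tokens before the span
        corrupted.append(mask_symbol)      # replace the whole span with one mask
        i = end
    corrupted.extend(tokens[i:])           # keep the remainder
    return corrupted

clean = "The kitten is chasing the ball .".split()
# Mask the span "is" (indices 2:3) and the span "the ball" (indices 4:6).
corrupted = corrupt_with_span_masks(clean, [(2, 3), (4, 6)])
print(" ".join(corrupted))  # -> [C] The kitten [M] chasing [M] .
print(" ".join(clean))      # -> The kitten is chasing the ball .  (training target)

Each mask symbol hides the entire span, so the model cannot recover the span length from the input alone; it must use the surrounding context to decide how many tokens to generate in place of each [M].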
Tags
Foundations of Large Language Models
Ch.1 Pre-training - Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Related
Training Encoder-Decoder Models with a Denoising Autoencoding Objective
A research team is pre-training a language model with the specific goal of making it highly proficient at understanding long-range contextual relationships and the logical flow of arguments within a paragraph. They use a method where the model learns to restore an original, clean text from a deliberately corrupted version. Which of the following corruption strategies applied to the input text would be most effective for achieving the team's specific goal?
Designing a Robust Text Correction Model
Analyzing the Impact of Input Corruption
Example of Sentinel Masking in Denoising Autoencoding