Learn Before
A language model is being trained using a technique where an input document is 'rotated'. For example, an original document is transformed into the following sequence: 'leads to success . Success brings happiness . Hard work'. What is the primary objective for the model when presented with this transformed input?
Tags
Ch.1 Pre-training - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Analysis in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
Example of Document Rotation
A self-supervised learning task involves modifying an input sequence by selecting a token and rearranging the sequence so that the selected token becomes the new starting point. The part of the sequence that originally came before the selected token is moved to the end. Given the original sequence 'Hard work leads to success .', if the token 'leads' is chosen as the new starting point, what is the resulting modified sequence?
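The rotation described above can be sketched in a few lines of Python. This is a minimal illustration, assuming whitespace tokenization; the function name `rotate_document` is chosen here for clarity and is not from the original card.

```python
def rotate_document(tokens, start_index):
    # Move the chosen token to the front; everything that originally
    # preceded it is appended to the end of the sequence.
    return tokens[start_index:] + tokens[:start_index]

tokens = "Hard work leads to success .".split()
rotated = rotate_document(tokens, tokens.index("leads"))
print(" ".join(rotated))  # leads to success . Hard work
```

During pre-training, the model receives the rotated sequence as input and is trained to recover the original ordering, so it must learn where the true document boundary lies.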
Reconstructing Original Sequence from Rotated Input
Example of Document Rotation in Denoising Autoencoding