Comparison of Self-Supervised Pre-training and Self-Training

The key distinction between self-supervised pre-training in NLP and traditional self-training lies in whether an initial model is required. Self-training needs an initial model trained on seed data to generate pseudo labels for unlabeled data. In contrast, self-supervised pre-training needs no initial model: it derives all supervision signals directly from the raw text and trains the entire model from scratch.
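To make the contrast concrete, here is a minimal, illustrative sketch in Python (not drawn from the course; the function names and the toy majority-label "model" are hypothetical, and masked language modeling stands in for self-supervised pre-training generally):

```python
import random

# --- Self-training: requires an initial model trained on seed data ---

def train_initial_model(seed_texts, seed_labels):
    # Toy "model": always predicts the majority label from the seed data.
    majority = max(set(seed_labels), key=seed_labels.count)
    return lambda text: majority

def self_training_round(model, unlabeled_texts):
    # The initial model supplies pseudo labels for the unlabeled pool;
    # without that model, there is no supervision signal at all.
    return [(text, model(text)) for text in unlabeled_texts]

# --- Self-supervised pre-training: supervision comes from the text itself ---

def make_mlm_examples(text, mask_prob=0.15, mask_token="[MASK]"):
    # Masked language modeling: hide some tokens and use the originals
    # as prediction targets. No initial model or seed labels are involved.
    tokens = text.split()
    inputs, targets = [], []
    for tok in tokens:
        if random.random() < mask_prob:
            inputs.append(mask_token)
            targets.append(tok)    # target recovered from the raw text
        else:
            inputs.append(tok)
            targets.append(None)   # position not predicted
    return inputs, targets

if __name__ == "__main__":
    random.seed(0)
    # Self-training must first bootstrap a model from labeled seed data.
    model = train_initial_model(["good film", "bad plot"], ["pos", "neg"])
    print(self_training_round(model, ["great acting", "dull pacing"]))
    # Self-supervised pre-training derives targets directly from raw text.
    print(make_mlm_examples("language models learn from raw unlabeled text"))
```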
