
Unsupervised Pre-training

Unsupervised pre-training, a key focus during the early resurgence of deep learning, optimizes a neural network's parameters using a task-agnostic criterion. Instead of relying on task-specific labels, it uses objectives such as minimizing the cross-entropy between an input and its reconstruction, as in an autoencoder. It typically serves as a preparatory phase before the model undergoes supervised training on a downstream task.
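As a concrete illustration, the sketch below pre-trains a small autoencoder on unlabeled data with a cross-entropy reconstruction loss, then reuses the encoder to initialize a supervised model. It is a minimal sketch assuming PyTorch; the layer sizes, synthetic data, and epoch count are hypothetical choices for illustration, not a prescribed recipe.

```python
# A minimal sketch of unsupervised pre-training with a reconstruction
# objective, assuming PyTorch. Shapes and data are illustrative only.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Synthetic unlabeled data: 256 samples of 784-dim inputs in [0, 1]
# (e.g., flattened images). No labels are used during pre-training.
unlabeled_x = torch.rand(256, 784)

encoder = nn.Sequential(nn.Linear(784, 128), nn.ReLU())
decoder = nn.Sequential(nn.Linear(128, 784), nn.Sigmoid())

# Task-agnostic criterion: reconstruction cross-entropy between the
# input and its reconstruction (binary cross-entropy per dimension).
criterion = nn.BCELoss()
optimizer = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3
)

for epoch in range(5):  # a few epochs, purely illustrative
    optimizer.zero_grad()
    reconstruction = decoder(encoder(unlabeled_x))
    loss = criterion(reconstruction, unlabeled_x)  # no task labels involved
    loss.backward()
    optimizer.step()

# Preparatory phase done: the pre-trained encoder initializes a supervised
# model, which is then fine-tuned on a (hypothetical) labeled task.
classifier = nn.Sequential(encoder, nn.Linear(128, 10))
labeled_x, labels = torch.rand(64, 784), torch.randint(0, 10, (64,))
sup_loss = nn.CrossEntropyLoss()(classifier(labeled_x), labels)
sup_loss.backward()  # supervised training proceeds from here as usual
```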

