Definition

Foundation Models

Foundation models are large-scale, pre-trained neural network models developed through self-supervised learning on massive, unlabeled datasets. They are designed to be general-purpose and serve as a base that can be efficiently adapted to a wide variety of specific downstream tasks through methods like fine-tuning or prompting, often eliminating the need to train a new model from scratch.
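The adaptation pattern described above can be sketched in miniature: a large pre-trained model is kept frozen and only a small task-specific component is trained on top of it. The sketch below is purely illustrative and makes heavy assumptions; the `FrozenEncoder` class, its random-projection "pre-trained" weights, and the toy task are all hypothetical stand-ins, not any real foundation-model API.

```python
import numpy as np

rng = np.random.default_rng(0)

class FrozenEncoder:
    """Stand-in for a pre-trained foundation model.

    In practice this would be a large transformer trained by
    self-supervised learning; here a fixed, scaled random projection
    plays the role of frozen pre-trained weights.
    """

    def __init__(self, d_in=16, d_model=8):
        # Frozen "pre-trained" weights (never updated downstream).
        self.W = rng.normal(size=(d_in, d_model)) / np.sqrt(d_in)

    def encode(self, X):
        return np.tanh(X @ self.W)


def fit_head(encoder, X, y, lr=0.5, steps=300):
    """Adapt to a downstream task by training only a linear head.

    This is a lightweight alternative to full fine-tuning: the encoder
    stays frozen, and just the head weights are learned.
    """
    H = encoder.encode(X)
    w = np.zeros(H.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(H @ w)))      # sigmoid predictions
        w -= lr * H.T @ (p - y) / len(y)        # logistic-loss gradient
    return w


# Toy downstream task (hypothetical): label is whether the inputs sum
# to a positive value.
X = rng.normal(size=(200, 16))
y = (X.sum(axis=1) > 0).astype(float)

enc = FrozenEncoder()
w = fit_head(enc, X, y)
pred = 1.0 / (1.0 + np.exp(-(enc.encode(X) @ w))) > 0.5
acc = (pred == y).mean()
```

The design point is the division of labor: the expensive, general-purpose representation is reused as-is, and only the cheap task head is trained, which is why adaptation avoids training a new model from scratch.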

Updated 2026-04-18


Tags

Ch.1 Pre-training - Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences

Foundations of Large Language Models

Ch.2 Generative Models - Foundations of Large Language Models