Concept

Role of Pre-training in Developing Latent Abilities

During the pre-training phase, Large Language Models acquire the foundational knowledge needed to understand instructions and generate appropriate responses. However, these capabilities exist only in a latent state: they are not reliably expressed until they are activated by a subsequent supervised process, such as instruction fine-tuning.
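The idea can be illustrated with a toy sketch. Everything below is a deliberately simplified stand-in, not an actual LLM: a word-bigram counter plays the role of the model, raw sentences play the role of the pre-training corpus, and a single Q/A pair plays the role of the instruction-tuning set. The point is only that the factual association ("paris" follows "capital of france" contexts) is already present after pre-training, but the model cannot surface it in an instruction format until fine-tuning supplies that format.

```python
from collections import Counter, defaultdict

def train_bigram(corpus, model=None):
    """Count word bigrams; creates or updates a next-word frequency table."""
    model = model if model is not None else defaultdict(Counter)
    for text in corpus:
        words = text.split()
        for a, b in zip(words, words[1:]):
            model[a][b] += 1
    return model

def generate(model, prompt, steps=1):
    """Greedily append the most frequent continuation of the last word."""
    words = prompt.split()
    for _ in range(steps):
        nxt = model.get(words[-1])
        if not nxt:
            break  # no continuation known for this token
        words.append(nxt.most_common(1)[0][0])
    return " ".join(words)

# "Pre-training": raw text builds latent knowledge of word co-occurrence.
pretrain_corpus = [
    "paris is the capital of france",
    "berlin is the capital of germany",
]
model = train_bigram(pretrain_corpus)

# The knowledge is latent: the model has never seen the "Q: ... A:" format,
# so the prompt comes back unchanged even though the fact is in its counts.
print(generate(model, "Q: capital of france ? A:"))

# "Fine-tuning": a small supervised set in the instruction format
# activates the latent knowledge.
model = train_bigram(["Q: capital of france ? A: paris"], model)
print(generate(model, "Q: capital of france ? A:"))  # now ends with "paris"
```

Nothing about the fact itself changed during fine-tuning; the supervised pass only taught the model how to route its pre-trained statistics into the expected response format, which is the intuition behind "activating" latent abilities.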

Updated 2025-10-06

Tags

Ch.2 Generative Models - Foundations of Large Language Models

Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences
