Catastrophic Forgetting in Fine-Tuning

Catastrophic forgetting is the phenomenon in which a neural network loses previously learned knowledge after being trained on new data. In fine-tuning, it arises when a model is adapted to a new task: the weight updates that fit the new task can overwrite what the model learned before, causing a significant drop in performance on the original task the model was proficient at.
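The effect can be reproduced on a deliberately tiny scale. The sketch below (a hypothetical toy setup, not from the original text) trains a logistic-regression classifier on "task A", then fine-tunes it on a "task B" whose labels conflict with task A, and measures task-A accuracy before and after. Because nothing constrains the weights to stay near their task-A solution, fine-tuning erases it:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(w, b, X, y, lr=0.5, steps=300):
    # Plain gradient descent on the logistic loss -- no regularization
    # toward the old weights, so nothing prevents forgetting.
    for _ in range(steps):
        p = sigmoid(X @ w + b)
        w -= lr * X.T @ (p - y) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

def accuracy(w, b, X, y):
    return np.mean((sigmoid(X @ w + b) > 0.5) == y)

# Two toy tasks over the same inputs with conflicting labeling rules.
X = rng.uniform(-1, 1, size=(200, 2))
y_a = (X[:, 0] > 0).astype(float)   # task A: positive first feature
y_b = 1.0 - y_a                     # task B: the opposite rule

w, b = np.zeros(2), 0.0
w, b = train(w, b, X, y_a)
acc_a_before = accuracy(w, b, X, y_a)   # high: the model has learned task A

w, b = train(w, b, X, y_b)              # fine-tune on task B only
acc_a_after = accuracy(w, b, X, y_a)    # low: task A has been forgotten

print(f"task-A accuracy before fine-tuning: {acc_a_before:.2f}")
print(f"task-A accuracy after fine-tuning:  {acc_a_after:.2f}")
```

The tasks here are adversarially opposed to make the effect stark; in practice forgetting is usually partial, but the mechanism is the same: gradient updates for the new objective are free to move the weights away from the old solution.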

Updated 2026-04-18

Tags: Ch.1 Pre-training - Foundations of Large Language Models, Ch.2 Generative Models - Foundations of Large Language Models, Foundations of Large Language Models, Foundations of Large Language Models Course, Computing Sciences