Concept

Knowledge Distillation

Knowledge distillation is a model compression and acceleration technique. It addresses the challenge of deploying deep learning models on resource-constrained devices (e.g. mobile devices and embedded systems), where computational complexity and storage requirements make large models impractical. In its standard form, a compact student model is trained to mimic the output distribution of a larger teacher model, transferring the teacher's "knowledge" into a cheaper network.
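
As an illustration, here is a minimal sketch of the classic soft-target distillation loss (Hinton et al.), assuming PyTorch; the function name and hyperparameter values are illustrative, not from the original note:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Blend hard-label cross-entropy with a softened teacher/student KL term."""
    # Soften both output distributions with the temperature T.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)

    # KL divergence between the softened distributions, scaled by T^2
    # so its gradient magnitude stays comparable to the hard-label term.
    kd_loss = F.kl_div(log_soft_student, soft_teacher,
                       reduction="batchmean") * temperature ** 2

    # Standard cross-entropy against the ground-truth labels.
    ce_loss = F.cross_entropy(student_logits, labels)

    # alpha balances imitation of the teacher vs. fitting the labels.
    return alpha * kd_loss + (1.0 - alpha) * ce_loss
```

A temperature above 1 flattens the teacher's distribution, exposing the relative probabilities it assigns to wrong classes; this "dark knowledge" is what the student learns beyond the hard labels alone.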


Updated 2026-04-30

Tags

Deep Learning (in Machine learning)

Data Science

Foundations of Large Language Models Course

Computing Sciences

Foundations of Large Language Models

Ch.3 Prompting - Foundations of Large Language Models