Learn Before
Concept
Model Compression
Model compression was initially proposed as a way to transfer knowledge from a large or ensemble "teacher" model into a small "student" model trained to reach similar accuracy. This technique later became known as knowledge distillation.
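The core of knowledge distillation can be sketched as training the student on the teacher's temperature-softened output distribution. Below is a minimal sketch of that soft-target loss, assuming the Hinton-style temperature-scaled KL divergence; the function names and NumPy implementation are illustrative, not from the original source.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T produces a softer distribution,
    # exposing the teacher's relative confidence across wrong classes
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL divergence between teacher soft targets and student predictions,
    # scaled by T^2 so gradients keep a comparable magnitude across temperatures
    p = softmax(teacher_logits, T)  # teacher soft targets
    q = softmax(student_logits, T)  # student predictions
    return float(T**2 * np.sum(p * (np.log(p) - np.log(q))))
```

When the student's logits match the teacher's exactly, the loss is zero; any mismatch yields a positive penalty, which is what the student minimizes during training (often combined with the ordinary cross-entropy on true labels).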
Updated 2022-10-21
Tags
Deep Learning (in Machine Learning)
Data Science