Concept

Compression of Pre-trained Models

Since pre-trained models (PTMs) usually contain hundreds of millions of parameters, they are difficult to deploy in online services for real-world applications or on resource-constrained devices. Model compression is an approach that reduces model size and increases computational efficiency.
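As a concrete illustration, one common compression technique is magnitude pruning: zeroing out the weights with the smallest absolute values so the model can be stored and served more cheaply. The sketch below (a minimal, illustrative implementation on a NumPy array, not any specific library's API) prunes a given fraction of a weight matrix:

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float = 0.5) -> np.ndarray:
    """Return a copy of `weights` with the smallest-magnitude
    fraction (`sparsity`) of entries set to zero.

    This is a toy sketch of unstructured magnitude pruning,
    one of several model-compression techniques (alongside
    quantization and knowledge distillation)."""
    flat = np.abs(weights).ravel()
    k = int(len(flat) * sparsity)  # number of entries to zero out
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    return np.where(np.abs(weights) <= threshold, 0.0, weights)

# Example: prune half the entries of a random 4x4 weight matrix
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
pruned = magnitude_prune(w, sparsity=0.5)
```

In practice, pruned models are combined with sparse storage formats or hardware support to realize the size and speed savings; zeroed weights alone do not shrink a dense array.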

Updated 2026-05-03

Tags

Data Science

Foundations of Large Language Models Course

Computing Sciences