
Limit of Data Compression in Information Theory

A fundamental theorem of information theory (the source coding theorem) establishes a strict lower bound on lossless data compression. It states that to encode symbols drawn randomly from a given probability distribution P, an observer needs at least H[P] nats (or bits, if using base 2) per symbol on average, where H[P] is the entropy of the distribution.
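The bound above can be checked numerically. The sketch below (an illustration, not from the source) computes the entropy of a dyadic distribution, whose probabilities are powers of 1/2, and compares it with the expected length of the obvious prefix code: for such distributions an optimal code meets the entropy bound exactly.

```python
import math

def entropy(p, base=2):
    """Entropy of distribution p; base=2 gives bits, base=math.e gives nats."""
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

# Dyadic distribution: each probability is a power of 1/2, so a prefix code
# with codeword lengths [1, 2, 2] (e.g. 0, 10, 11) is optimal.
p = [0.5, 0.25, 0.25]
lengths = [1, 2, 2]

H_bits = entropy(p, base=2)                       # 1.5 bits per symbol
avg_len = sum(pi * li for pi, li in zip(p, lengths))  # 1.5 — meets the bound
H_nats = entropy(p, base=math.e)                  # same quantity in nats

print(H_bits, avg_len, H_nats)
```

For non-dyadic distributions the expected code length of any uniquely decodable code strictly exceeds H[P], but can be brought within one bit of it (e.g. by Huffman coding).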

Updated 2026-05-03

Tags

D2L

Dive into Deep Learning @ D2L