Learn Before
Definition
Cross-Entropy in Information Theory
In information theory, the cross-entropy from a true probability distribution $P$ to a subjective probability distribution $Q$, denoted $H(P, Q)$, is defined as the expected surprisal of an observer who uses the subjective probabilities $Q$ when observing data that was actually generated according to the true probabilities $P$. Mathematically, this is expressed as:

$$H(P, Q) = E_{x \sim P}[-\log Q(x)] = -\sum_{x} P(x) \log Q(x).$$
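The definition above can be sketched numerically. The following is a minimal illustration (not from the source), assuming discrete distributions given as probability lists over the same outcomes; it computes $H(P, Q)$ in nats and checks that an observer using the true distribution incurs no extra surprisal:

```python
import math

def cross_entropy(p, q):
    """H(P, Q) = -sum_x P(x) * log Q(x), in nats.

    Terms with P(x) = 0 contribute nothing and are skipped.
    """
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]   # true distribution P
q = [0.25, 0.5, 0.25]   # subjective distribution Q

h_pq = cross_entropy(p, q)   # expected surprisal under mistaken beliefs Q
h_pp = cross_entropy(p, p)   # equals the entropy H(P)

# Gibbs' inequality: H(P, Q) >= H(P), with equality iff Q = P.
print(h_pq, h_pp)
```

The gap $H(P, Q) - H(P)$ is the KL divergence, the extra surprisal paid for believing $Q$ instead of $P$.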
Updated 2026-05-03
Tags
D2L
Dive into Deep Learning @ D2L