Definition

Cross-Entropy in Information Theory

In information theory, the cross-entropy from a true probability distribution $P$ to a subjective probability distribution $Q$, denoted $H(P, Q)$, is defined as the expected surprisal of an observer who uses the subjective probabilities $Q$ when observing data that was actually generated according to the true probabilities $P$. Mathematically, this is expressed as:

$$H(P, Q) \stackrel{\textrm{def}}{=} \sum_j - P(j) \log Q(j)$$
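
As a quick illustration, here is a minimal NumPy sketch that evaluates this sum directly; the distributions `p` and `q` below are hypothetical examples, not taken from D2L:

```python
import numpy as np

def cross_entropy(p, q):
    """Cross-entropy H(P, Q) = -sum_j P(j) * log Q(j).

    p: true distribution P (probabilities summing to 1)
    q: subjective distribution Q (same shape, strictly positive entries)
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return -np.sum(p * np.log(q))

# Hypothetical example: a fair three-outcome source (P)
# modeled by a skewed subjective guess (Q).
p = [1/3, 1/3, 1/3]
q = [0.5, 0.25, 0.25]

print(cross_entropy(p, q))  # expected surprisal under Q, in nats (~1.155)
print(cross_entropy(p, p))  # equals the entropy H(P) = log 3 (~1.099)
```

When $Q = P$, the cross-entropy reduces to the entropy $H(P)$; by Gibbs' inequality, any mismatch between $Q$ and $P$ can only increase the expected surprisal, so $H(P, Q) \geq H(P)$, as the two printed values show.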



Tags

D2L

Dive into Deep Learning @ D2L