Conditional Entropy

It is useful and necessary to talk about conditional entropy, rather than leaving the subject where the textbook drops off. Let $Y$ be the set of labels, and let $X$ be the set of features.

Conditional Entropy. We define conditional entropy as $H(Y \mid X = x) = -\sum_{y \in Y} \Pr(Y = y \mid X = x) \log \Pr(Y = y \mid X = x)$.

We interpret the conditional entropy as the uncertainty remaining in the label once we know the feature takes the value $X = x$. In context: how uncertain are we about a person's health, given that we know how many times a year they go to the doctor's office?
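As a minimal sketch (not part of the original card), this quantity can be estimated from co-occurrence counts in a sample. The function name, the toy health/doctor-visit data, and the choice of base-2 logarithm below are illustrative assumptions:

```python
import math
from collections import Counter

def conditional_entropy(labels, features, x):
    """Estimate H(Y | X = x) in bits from paired (label, feature) samples."""
    # Keep only the labels observed alongside the feature value X = x.
    matched = [y for y, f in zip(labels, features) if f == x]
    counts = Counter(matched)
    n = len(matched)
    # H(Y | X = x) = -sum_y Pr(Y = y | X = x) * log2 Pr(Y = y | X = x),
    # with each conditional probability estimated as a relative frequency.
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

# Toy data (hypothetical): health labels paired with yearly doctor visits.
health = ["sick", "healthy", "sick", "healthy", "sick", "healthy"]
visits = [5, 1, 5, 1, 5, 5]

# One visit per year: every matching label is "healthy", so no uncertainty remains.
print(conditional_entropy(health, visits, 1))  # 0.0
# Five visits per year: labels split 3 "sick" / 1 "healthy", ~0.811 bits remain.
print(conditional_entropy(health, visits, 5))  # ~0.811
```

The second call illustrates the interpretation above: knowing $X = x$ shrinks, but need not eliminate, our uncertainty about the label.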

