Concept

Entropy in Information Theory

Numerically quite similar to the Gini index, entropy also reflects node purity. A small value implies that one class dominates the region.

D=-\sum_{k=1}^{K} \hat{p}_{m k} \log \hat{p}_{m k}

where \hat{p}_{mk} denotes the proportion of training observations in the mth region that are from the kth class. Like the Gini index, entropy measures the variance of the class labels within a node.
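A minimal sketch of the formula above, assuming the natural logarithm (the base only rescales the value); the function name `entropy` is illustrative:

```python
import math

def entropy(proportions):
    """Node entropy D = -sum_k p_mk * log(p_mk), skipping zero terms
    (by convention 0 * log 0 = 0)."""
    return -sum(p * math.log(p) for p in proportions if p > 0)

# A pure node (one class dominates) has entropy 0;
# a uniform split over K classes maximizes it at log(K).
print(entropy([1.0, 0.0]))   # -> 0.0
print(entropy([0.5, 0.5]))   # -> log(2) ~ 0.693
```

The `if p > 0` guard handles empty classes, whose contribution to the sum is zero in the limit.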


Updated 2020-03-01

Tags

Data Science

Learn After