Concept

Mutual Information

Mutual information is an information-theoretic measure of dependence: it quantifies how much knowing one variable reduces the uncertainty about the other. For two labelings (partitions) U and V of the same N samples, it can be expressed as: I(U,V) = \sum_{i=1}^{|U|} \sum_{j=1}^{|V|} \frac{|U_i \cap V_j|}{N} \log \frac{N\,|U_i \cap V_j|}{|U_i||V_j|} where U and V are the input variables and U_i, V_j are their categories, with |U_i \cap V_j| counting the samples that fall in category U_i and category V_j simultaneously.
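As a sketch of how this formula can be evaluated directly from two label sequences, here is a minimal plain-Python implementation (the function name `mutual_information` is illustrative, not from the original; natural log is assumed, so the result is in nats):

```python
from collections import Counter
from math import log

def mutual_information(u, v):
    """Mutual information between two labelings u and v of the same N samples.

    u, v: sequences of category labels, one label per sample.
    """
    n = len(u)
    u_counts = Counter(u)       # |U_i| for each category of U
    v_counts = Counter(v)       # |V_j| for each category of V
    joint = Counter(zip(u, v))  # |U_i ∩ V_j| for each observed pair

    mi = 0.0
    for (ui, vj), nij in joint.items():
        # Each term: (|Ui ∩ Vj| / N) * log(N |Ui ∩ Vj| / (|Ui| |Vj|))
        mi += (nij / n) * log(n * nij / (u_counts[ui] * v_counts[vj]))
    return mi
```

For identical labelings of two balanced classes the value is log 2 ≈ 0.693 nats, and for independent labelings it is 0. In practice, scikit-learn's `sklearn.metrics.mutual_info_score` computes this same quantity.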


Updated 2020-07-28

Tags

Data Science