Learn Before
Concept

Kolmogorov Complexity

In this approach, the basic postulate that "the factorization of the joint density $P_{cause,effect}$ into $P_{cause} P_{effect|cause}$ should lead to a simpler model than $P_{effect} P_{cause|effect}$" can be expressed in the Kolmogorov complexity framework:

$$K(P_{cause}) + K(P_{effect|cause}) \overset{+}{\leq} K(P_{effect}) + K(P_{cause|effect})$$

This inequality comes from the postulate of algorithmic independence between the distribution of the cause $P_{cause}$ and the distribution of the causal mechanism $P_{effect|cause}$, stated by Janzing and Schölkopf as

$$I(P_{cause} : P_{effect|cause}) \overset{+}{=} 0$$

where $I(P_{cause} : P_{effect|cause})$ denotes the algorithmic mutual information.

Kolmogorov complexity and algorithmic mutual information are uncomputable, but they have inspired two distinct practical implementations in the literature:

  • Model Selection with the Minimum Message Length (MML) Principle
  • Independence Between Cause and Mechanism
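Although $K(\cdot)$ itself cannot be computed, compressed length is a standard computable upper-bound proxy for it (the idea underlying compression-based methods such as normalized compression distance). The sketch below is only an illustration of that proxy, not either of the implementations listed above: it uses `zlib` to assign a "complexity" score to byte-string descriptions, with the example strings being purely hypothetical stand-ins for descriptions of a distribution and a mechanism.

```python
import zlib

def approx_K(description: bytes) -> int:
    """Computable proxy for Kolmogorov complexity K:
    the length of the zlib-compressed description.
    This upper-bounds K(description) up to an additive constant."""
    return len(zlib.compress(description, 9))

# Hypothetical byte-string "descriptions" (illustration only):
# a highly regular cause distribution and a short causal mechanism.
cause = b"0123456789" * 100            # very regular -> compresses well
mechanism = b"effect = 2 * cause + e"  # short mechanism description

# Proxy for K(P_cause) + K(P_effect|cause) in the inequality above.
causal_score = approx_K(cause) + approx_K(mechanism)
print("proxy for K(P_cause) + K(P_effect|cause):", causal_score)

# Regularity lowers the score, mirroring "simpler model = smaller K".
assert approx_K(cause) < len(cause)
```

In a compression-based causal discovery method, one would compute such a score for both factorization directions and prefer the direction with the smaller total, mirroring the inequality above.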


Updated 2020-07-21

Tags

Data Science