Abstract:
It is often stated that “correlation does not imply causation”: a dependency between the observed values of two random variables need not imply a causal connection between the corresponding processes (and even if such a connection exists, its direction may be unknown). In Shannon information theory, this is reflected by the law of “symmetry of information”, $I(X;Y) = I(Y;X)$: the information that a random variable $X$ has about $Y$ equals the information that $Y$ has about $X$.
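To see why this law holds (a standard identity, spelled out here for convenience rather than quoted from the talk), mutual information can be written in a form that is manifestly symmetric in $X$ and $Y$:
$$ I(X;Y) = H(X) - H(X \mid Y) = H(X) + H(Y) - H(X,Y) = H(Y) - H(Y \mid X) = I(Y;X). $$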
This law remains valid if Shannon entropy is replaced by Kolmogorov complexity. However, there is a subtler setting in which the law is violated, and one might speculate that this asymmetry can be used to reconstruct causality.
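For reference, the Kolmogorov-complexity analogue is the Kolmogorov–Levin symmetry of information; the precise variant used in the talk is not stated in the abstract, but a standard formulation for strings $x$ and $y$ is
$$ K(x,y) = K(x) + K(y \mid x) + O(\log K(x,y)), $$
so that the algorithmic mutual information $I(x:y) = K(x) + K(y) - K(x,y)$ coincides with $K(y) - K(y \mid x)$ and with $K(x) - K(x \mid y)$ up to logarithmic additive terms.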
In the second part of the talk, we discuss the postulate of independence of conditionals for inferring causal relations from observed data. This postulate was introduced by D. Janzing and B. Schölkopf in 2010; in the two-variable case it states that if $X$ causes $Y$, then the marginal distribution $P_X$ has low information about the conditional distribution $P_{Y|X}$ (with information measured via Kolmogorov complexity). We show how the most popular methods can be explained by this postulate and survey new methods that were inspired by it.
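In symbols (a sketch of the two-variable postulate; the exact level of precision, e.g. equality up to additive constants versus logarithmic terms, depends on the formulation and is an assumption here): if $X$ causes $Y$, then
$$ I(P_X : P_{Y|X}) \approx 0, \qquad \text{equivalently} \qquad K(P_{X,Y}) \approx K(P_X) + K(P_{Y|X}), $$
where $I(\cdot : \cdot)$ denotes algorithmic mutual information and $K$ is Kolmogorov complexity applied to (suitable descriptions of) the distributions. One natural way to exploit the postulate is to compare the two factorizations $K(P_X) + K(P_{Y|X})$ and $K(P_Y) + K(P_{X|Y})$ and prefer the causal direction with the shorter joint description.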