Colloquium of the Faculty of Computer Science
May 10, 2016 18:10–19:30, Moscow
 


Causal inference and Kolmogorov complexity

Bruno Bauwens

Faculty of Computer Science, National Research University "Higher School of Economics"

Abstract: It is often said that “correlation does not imply causation”: a statistical dependency between the observed values of two random variables need not mean that a causal connection exists between the underlying processes, and even if one does exist, its direction may be unknown. In Shannon information theory, this is reflected by the law of “symmetry of information”: $I(X;Y) = I(Y;X)$. The information that a random variable $X$ carries about $Y$ equals the information that $Y$ carries about $X$.
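The symmetry of Shannon mutual information follows directly from its definition $I(X;Y) = H(X) + H(Y) - H(X,Y)$, which is symmetric in $X$ and $Y$. A minimal Python check on a toy joint distribution (the numbers are purely illustrative):

```python
import math

# Toy joint distribution P(X, Y) over a 2x2 alphabet (rows: x, columns: y).
P = [[0.4, 0.1],
     [0.1, 0.4]]

def entropy(dist):
    """Shannon entropy in bits of a probability vector."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

px = [sum(row) for row in P]                              # marginal P(X)
py = [sum(P[i][j] for i in range(2)) for j in range(2)]   # marginal P(Y)
pxy = [p for row in P for p in row]                       # joint, flattened

# I(X;Y) = H(X) + H(Y) - H(X,Y): the expression is symmetric in X and Y.
mi_xy = entropy(px) + entropy(py) - entropy(pxy)
mi_yx = entropy(py) + entropy(px) - entropy(pxy)
assert abs(mi_xy - mi_yx) < 1e-12
print(f"I(X;Y) = {mi_xy:.4f} bits")
```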
This law remains valid when Shannon entropy is replaced by Kolmogorov complexity. However, there is a subtler setting in which the law is violated, and one may speculate that the resulting asymmetry can be used to reconstruct causal direction.
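For binary strings $x, y$, the complexity version of the law is the Kolmogorov–Levin symmetry-of-information theorem: up to additive logarithmic terms,
$$K(x,y) = K(x) + K(y \mid x) + O(\log K(x,y)) = K(y) + K(x \mid y) + O(\log K(x,y)),$$
so the algorithmic mutual information $I(x:y) := K(y) - K(y \mid x)$ satisfies
$$I(x:y) = K(x) - K(x \mid y) + O(\log K(x,y)).$$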
In the second part of the talk, we discuss the postulate of independence of conditionals for the inference of causal relations from observed data. This postulate was introduced by D. Janzing and B. Schölkopf in 2010; in the two-variable case, it states that if $X$ causes $Y$, then the marginal distribution $P_X$ has low information about the conditional distribution $P_{Y|X}$ (with information measured via Kolmogorov complexity). We show how the most popular methods can be justified by this postulate and give an overview of new methods inspired by it.
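To make the objects of the postulate concrete: it compares the two factorizations $P_X \cdot P_{Y|X}$ and $P_Y \cdot P_{X|Y}$ of the same joint distribution, asserting that in the true causal direction $X \to Y$ the two factors are algorithmically independent. A minimal sketch building both factorizations (toy numbers, not from the talk):

```python
# Toy joint distribution P(X, Y); both factorizations below describe
# exactly the same joint, and the postulate concerns which factor pair
# shares less algorithmic information.
P = [[0.4, 0.1],
     [0.1, 0.4]]

px = [sum(row) for row in P]                              # P(X)
py = [sum(P[i][j] for i in range(2)) for j in range(2)]   # P(Y)

# Conditional tables: P(y|x) = P(x,y)/P(x) and P(x|y) = P(x,y)/P(y).
p_y_given_x = [[P[i][j] / px[i] for j in range(2)] for i in range(2)]
p_x_given_y = [[P[i][j] / py[j] for i in range(2)] for j in range(2)]

# Sanity check: both factorizations reconstruct the joint distribution.
for i in range(2):
    for j in range(2):
        assert abs(px[i] * p_y_given_x[i][j] - P[i][j]) < 1e-12
        assert abs(py[j] * p_x_given_y[j][i] - P[i][j]) < 1e-12
```

Deciding which factor pair is "simpler together" requires an approximation to Kolmogorov complexity (e.g., via compression or model selection), which is where the concrete inference methods discussed in the talk differ.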

Language: English
 