Colloquium of the Faculty of Computer Science, National Research University "Higher School of Economics"
May 10, 2016, 18:10–19:30, Moscow, Pokrovsky Boulevard 11
 


Causal inference and Kolmogorov complexity

Bruno Bauwens

Faculty of Computer Science, National Research University "Higher School of Economics"



Abstract: It is often stated that “correlation does not imply causation”: a dependence between the observed values of two random variables need not imply a causal connection between the underlying processes (and even if such a connection exists, one might not know its direction). In Shannon information theory, this is reflected by the law of “symmetry of information”: $I(X;Y) = I(Y;X)$. The information that a random variable $X$ has about $Y$ equals the information that $Y$ has about $X$.
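Symmetry is immediate once mutual information is written out in terms of entropies (a standard identity, stated here for background):
$$ I(X;Y) \;=\; H(X) + H(Y) - H(X,Y) \;=\; H(X) - H(X\mid Y) \;=\; H(Y) - H(Y\mid X), $$
where $H$ denotes Shannon entropy; the first expression is visibly symmetric in $X$ and $Y$.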
This law remains valid when Shannon entropy is replaced by Kolmogorov complexity. However, there is a subtler setting in which the law is violated, and one may speculate that this asymmetry can be used to reconstruct causality.
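As background (this formulation is standard and not specific to the talk): for strings $x$ and $y$, algorithmic mutual information may be defined as $I(x:y) = K(y) - K(y\mid x)$, and the Kolmogorov–Levin theorem gives symmetry up to logarithmic precision,
$$ K(x) - K(x\mid y) \;=\; K(y) - K(y\mid x) \;+\; O(\log K(x,y)), $$
a consequence of $K(x,y) = K(x) + K(y\mid x) + O(\log K(x,y))$.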
In the second part of the talk, we discuss the postulate of independence of conditionals for inferring causal relations from observed data. This postulate was introduced by D. Janzing and B. Schölkopf in 2010; in the two-variable case it states that if $X$ causes $Y$, then the marginal distribution $P_X$ contains little information about the conditional distribution $P_{Y|X}$ (with information measured via Kolmogorov complexity). We explain how the most popular inference methods can be justified by this postulate and survey new methods that were inspired by it.
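In algorithmic terms, the postulate is often written (a sketch of the 2010 formulation, with $K$ applied to computable representations of the distributions) as the vanishing, up to additive constants, of the mutual information between the two mechanisms:
$$ I(P_X : P_{Y\mid X}) \;=\; K(P_X) + K(P_{Y\mid X}) - K(P_X, P_{Y\mid X}) \;\approx\; 0. $$
A consequence exploited by inference methods is that the causal factorization then tends to have the smaller total description length, $K(P_X) + K(P_{Y\mid X}) \le K(P_Y) + K(P_{X\mid Y})$ up to the same precision, so comparing the two factorizations suggests the causal direction.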

Language of the talk: English
 