Mathematical Foundations of Artificial Intelligence
June 4, 2024 16:00, S.I. Adian Seminar jointly with the seminar Mathematical Foundations of Artificial Intelligence, Moscow, Room 110 MIAN (Gubkina str. 8) + Zoom (in English)
 


An introduction to Kolmogorov complexity with applications to reinforcement learning

Bruno Bauwens

HSE University, Moscow

Abstract: Ray Solomonoff considered a general version of the problem of sequence prediction: how should one predict the next bit of a sequence when provided with unlimited computational power? He proposed applying Bayesian reasoning to a prior distribution defined by the output of a randomized universal Turing machine. In this talk, a gentle introduction to Kolmogorov complexity is given. We prove its incomputability and present elegant examples of Gödel and Rosser sentences. We also prove that Kolmogorov complexity is approximately equal to the negative logarithm of Solomonoff's prior distribution, which implies that Solomonoff induction is biased towards ‘simple’ explanations. Afterwards, Hutter's solution for reinforcement learning is discussed, which generalizes Solomonoff induction. He also provides a time-bounded version (which still does not seem practical) that relies on a proof system to optimize computational resources. Finally, I briefly make some philosophical comments on Vitányi's work on the information distance and on work that aims to understand the implicit Bayesian bias of stochastic gradient descent in neural networks. The talk requires only basic skills in discrete mathematics and is intended for people interested in computability theory or the foundations of machine learning.

Language: English

References
  1. A. Shen, V. A. Uspensky, N. Vereshchagin, Kolmogorov Complexity and Algorithmic Randomness, Math. Surveys Monogr., 220, American Mathematical Society, Providence, RI, 2017, xviii+511 pp.
  2. M. Li, P. Vitányi, An Introduction to Kolmogorov Complexity and Its Applications, Texts Comput. Sci., Springer, Cham, 2019, xxiii+834 pp.
  3. M. Hutter, Universal Artificial Intelligence: Sequential Decisions Based on Algorithmic Probability, Springer, Berlin, 2005, xx+278 pp.
  4. M. Hutter, D. Quarel, E. Catt, An Introduction to Universal Artificial Intelligence, CRC Press, 2024 https://www.routledge.com/.../book/9781032607023
  5. R. Cilibrasi, P. M. B. Vitányi, “Clustering by compression”, IEEE Trans. Inform. Theory, 51:4 (2005), 1523–1545
  6. D. A. Roberts, S. Yaida, The Principles of Deep Learning Theory, Cambridge University Press, 2022
 