Seminar of the Dobrushin Laboratory, Higher School of Modern Mathematics, MIPT
June 1, 2021, 16:00, online (Zoom), Moscow


Replica-mean-field limits for intensity-based neural networks

Francois Baccelli

UT Austin and INRIA

Abstract: Due to the inherent complexity of neural models, relating the spiking activity of a network to its structure requires simplifying assumptions, such as considering models in the thermodynamic mean-field limit. In this limit, an infinite number of neurons interact via vanishingly small interactions, thereby erasing the finite-size geometry of interactions. To better capture the geometry in question, we analyze the activity of neural networks in the replica-mean-field limit. Such models are made of infinitely many replicas which interact according to the same basic structure as that of the finite network of interest. Our main contribution is an analytical characterization of the stationary dynamics of intensity-based neural networks with spiking reset and heterogeneous excitatory synapses in this replica-mean-field limit. Specifically, we functionally characterize the stationary dynamics of these limit networks via ordinary or partial differential equations derived from the Poisson Hypothesis of queueing theory. We then reduce this functional characterization to a system of self-consistency equations specifying the stationary neuronal firing rates. Joint work with T. Taillefumier.
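The final reduction described in the abstract, from a functional characterization to self-consistency equations for the stationary firing rates, can be illustrated by a toy fixed-point computation. The sketch below is not the paper's actual system: the weight matrix `W`, baseline drives `b`, and the saturating gain function are hypothetical stand-ins, assuming only the generic form r_i = f(b_i + Σ_j W_ij r_j) that a Poisson-Hypothesis-style approximation typically yields.

```python
import numpy as np

def gain(x):
    # Hypothetical saturating rate function; in the replica-mean-field
    # analysis the true input-rate relation is derived from the limit
    # ODEs/PDEs, not assumed.
    return x / (1.0 + x)

def solve_rates(W, b, tol=1e-10, max_iter=10_000):
    """Fixed-point iteration for r = gain(b + W @ r)."""
    r = np.zeros_like(b)
    for _ in range(max_iter):
        r_new = gain(b + W @ r)
        if np.max(np.abs(r_new - r)) < tol:
            return r_new
        r = r_new
    return r

# Three neurons with heterogeneous excitatory synapses (W >= 0),
# illustrative values only.
W = np.array([[0.0, 0.2, 0.1],
              [0.3, 0.0, 0.2],
              [0.1, 0.1, 0.0]])
b = np.array([0.5, 0.8, 0.3])  # baseline drives

rates = solve_rates(W, b)
```

Because the weights are small and the gain is bounded by 1, the iteration is a contraction here and converges to the unique nonnegative solution of the self-consistency system.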