Seminars
Colloquium of the Faculty of Computer Science, HSE University
April 11, 2023, 16:20–17:40, Moscow, Pokrovsky Boulevard 11
 


Federated Uncertainty Quantification: a Survey

Eric Moulines

École Polytechnique




Abstract: Many machine learning applications require training a centralized model on decentralized, heterogeneous, and potentially private data sets. Federated learning (FL) has emerged as a privacy-friendly training paradigm that does not require clients’ private data to leave their local devices. FL brings new challenges in addition to those of “traditional” distributed learning: expensive communication, statistical heterogeneity, partial participation, and privacy.
The “classical” formulation of FL treats it as a distributed optimization problem. Yet standard distributed optimization algorithms (e.g., data-parallel SGD) are too communication-intensive to be practical in FL. An alternative approach is to consider a Bayesian formulation of the FL problem. Within this approach, exact posterior inference is typically intractable even for models and data sets of modest size, so approximate inference methods must be considered. Among the many proposed approaches, we will discuss an MCMC solution, Federated Averaging Langevin Dynamics. We will also cover an approach based on variational inference, in which fewer lockstep synchronization and communication steps may be required between clients and the server.
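To make the MCMC approach mentioned in the abstract more concrete, below is a minimal sketch of Federated Averaging Langevin Dynamics under simplifying assumptions: a toy Gaussian model, synchronous rounds, full client participation, and no treatment of the noise rescaling that the full algorithm requires. The function names, hyperparameters, and toy data are illustrative assumptions, not taken from the talk.

import numpy as np

def local_langevin_steps(theta, grad_U, n_steps, step_size, rng):
    """Run n_steps of Langevin dynamics on one client's local potential U."""
    for _ in range(n_steps):
        noise = rng.normal(size=theta.shape)
        theta = theta - step_size * grad_U(theta) + np.sqrt(2.0 * step_size) * noise
    return theta

def fald(client_grads, dim, n_rounds=200, local_steps=5, step_size=1e-2, seed=0):
    """Server loop: broadcast the global parameter, let each client run local
    Langevin updates on its own data, then average (one communication round)."""
    rng = np.random.default_rng(seed)
    theta = np.zeros(dim)
    samples = []
    for _ in range(n_rounds):
        # Each client starts from the current global parameter and samples locally.
        client_states = [
            local_langevin_steps(theta.copy(), g, local_steps, step_size, rng)
            for g in client_grads
        ]
        # Communication step: the server averages the client parameters.
        theta = np.mean(client_states, axis=0)
        samples.append(theta.copy())
    return np.array(samples)

if __name__ == "__main__":
    # Toy example: each client holds a Gaussian potential centred at its own mean,
    # so gradients are th - m and the global posterior mean is the average of the means.
    client_means = [np.array([-1.0, 0.5]), np.array([2.0, -0.5]), np.array([0.0, 1.0])]
    client_grads = [lambda th, m=m: th - m for m in client_means]
    samples = fald(client_grads, dim=2)
    print("posterior mean estimate:", samples[len(samples) // 2:].mean(axis=0))

The sketch illustrates the two ingredients the abstract contrasts with data-parallel SGD: clients produce (approximate) posterior samples locally, and communication happens only once per round when the server averages the local states.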

Language of the talk: English

Website: https://cs.hse.ru/announcements/823486128.html
 