Moscow Conference on Combinatorics and Applications - week 1
June 2, 2021, 13:00–13:30, Moscow, online
 


Aleksandr Beznosikov (MIPT, HSE) - On Distributed Saddle Point Problems


Abstract: GANs are among the most popular and widely used neural network models. When the model is large and there is a lot of data, training can take a long time. The standard remedy is to use multiple devices, so distributed and federated training methods for GANs are an important question. From an optimization point of view, GANs are saddle-point problems: $\min_x \max_y f(x,y)$. This paper therefore focuses on the distributed optimization of smooth stochastic saddle-point problems. The first part of the paper is devoted to lower bounds for distributed methods for saddle-point problems, as well as the optimal algorithms that achieve these bounds. Next, we present a new algorithm for distributed saddle-point problems, Extra Step Local SGD. In the experimental part of the paper, we apply the Local SGD technique in practice; in particular, we train GANs in a distributed manner.
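To give intuition for the "extra step" idea the talk builds on, here is a minimal single-machine sketch of the classical extragradient method on a toy bilinear saddle-point problem. This is only an illustration, not the paper's distributed Extra Step Local SGD algorithm; the step size, step count, and toy objective $f(x,y)=xy$ are assumptions chosen for the example.

```python
# Extragradient ("extra step") method on the toy saddle-point problem
#   min_x max_y f(x, y) = x * y,
# whose unique saddle point is (0, 0). Plain simultaneous gradient
# descent-ascent spirals away on this problem; the extra look-ahead
# step restores convergence.

def grad(x, y):
    # For f(x, y) = x * y: df/dx = y, df/dy = x.
    return y, x

def extragradient(x, y, lr=0.1, steps=2000):
    for _ in range(steps):
        # Extra (look-ahead) step: tentative descent in x, ascent in y.
        gx, gy = grad(x, y)
        x_half = x - lr * gx
        y_half = y + lr * gy
        # Main step: update from the ORIGINAL point, but using the
        # gradient evaluated at the look-ahead point.
        gx, gy = grad(x_half, y_half)
        x = x - lr * gx
        y = y + lr * gy
    return x, y

x, y = extragradient(1.0, 1.0)
print(x, y)  # both drawn close to the saddle point (0, 0)
```

In the distributed setting of the talk, each device would run such steps on its local stochastic gradients, with periodic communication as in Local SGD; the sketch above only shows the sequential extra-step core.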

Supplementary materials: безносиков.pdf (1.4 Mb)