Advances in Optimization and Statistics
May 16, 2014, 11:40–12:15, Moscow, ИППИ РАН
 




Intermediate gradient method for convex problems with stochastic inexact oracle

P. E. Dvurechensky

Laboratory of Structural Methods of Data Analysis in Predictive Modeling (PreMoLab), MIPT, Moscow

Abstract: In this talk we consider another step toward the universalisation of methods for convex optimization problems. In a number of recent articles by O. Devolder, F. Glineur and Yu. Nesterov, and in O. Devolder's PhD thesis, the authors considered convex optimization problems with an inexact oracle, in the sense of both deterministic and stochastic errors. It is shown in these works that the dual gradient method has a slow convergence rate but does not accumulate the deterministic error of the oracle. In contrast, the fast gradient method has a faster convergence rate but accumulates the deterministic oracle error linearly. They proposed an intermediate gradient method (IGM) for the inexact-oracle setting; this method allows one to trade off the rate of convergence against the error accumulation, depending on the problem parameters. In our work we introduce a new IGM that can be applied to problems with composite structure, a stochastic inexact oracle and a non-Euclidean setup. We provide an estimate of the mean convergence rate and bound the large deviations from this rate. Joint work with Alexander Gasnikov.
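For orientation, here is a schematic form of the tradeoff discussed in the abstract. This is a sketch based on the line of work by Devolder, Glineur and Nesterov cited above, with constants and exact assumptions omitted; it is not the precise statement of the results of the talk. For a convex objective $f$ queried through an oracle with accuracy $\delta$ and "Lipschitz constant" $L$, after $k$ iterations one has roughly
\[
\text{dual gradient method: } f(x_k)-f^* = O\!\left(\frac{LR^2}{k}\right) + O(\delta),
\qquad
\text{fast gradient method: } f(x_k)-f^* = O\!\left(\frac{LR^2}{k^2}\right) + O(k\,\delta),
\]
while an intermediate gradient method interpolates between the two:
\[
f(x_k)-f^* = O\!\left(\frac{LR^2}{k^{p}}\right) + O\!\left(k^{\,p-1}\delta\right), \qquad p\in[1,2],
\]
where $R$ bounds the distance from the starting point to an optimum. Choosing $p=1$ recovers the slow but error-robust regime, and $p=2$ the fast but error-accumulating one.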

Language of the talk: English
 