Abstract:
In this talk we consider another step towards the universalization of methods for convex optimization problems. In a number of recent articles by O. Devolder, F. Glineur and Yu. Nesterov, and in O. Devolder's PhD thesis, the authors consider convex optimization problems with an inexact oracle, in the sense of both deterministic and stochastic errors. These works show that the dual gradient method has a slow convergence rate but does not accumulate the deterministic error of the oracle. In contrast, the fast gradient method converges faster but accumulates the deterministic oracle error linearly. The authors proposed an intermediate gradient method (IGM) for the inexact oracle; this method allows one to trade off the rate of convergence against error accumulation, depending on the problem parameters. In our work we introduce a new IGM that can be applied to problems with composite structure, a stochastic inexact oracle, and a non-Euclidean setup. We provide an estimate of the mean convergence rate and a bound on large deviations from this rate.
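As a rough sketch of the tradeoff described above, assuming the standard (δ, L)-oracle model from the cited line of work (here R denotes the distance from the starting point to a minimizer, and the exponent p parametrizes the intermediate method; the exact constants are simplifications):

% (δ,L)-oracle: at each x it returns a pair (f_{\delta,L}(x), g_{\delta,L}(x)) with
0 \le f(y) - f_{\delta,L}(x) - \langle g_{\delta,L}(x),\, y - x \rangle
  \le \tfrac{L}{2}\|y - x\|^2 + \delta \qquad \forall y.

% After k iterations:
f(x_k) - f^* = O\!\left(\tfrac{L R^2}{k}\right) + O(\delta)
  \quad \text{(dual gradient method: slow, no error accumulation)}
f(x_k) - f^* = O\!\left(\tfrac{L R^2}{k^2}\right) + O(k\,\delta)
  \quad \text{(fast gradient method: fast, linear error accumulation)}
f(x_k) - f^* = O\!\left(\tfrac{L R^2}{k^p}\right) + O(k^{p-1}\delta),
  \quad p \in [1, 2] \quad \text{(intermediate gradient method)}

Choosing p closer to 2 recovers the fast rate at the cost of stronger error accumulation, while p = 1 recovers the robust behavior of the dual gradient method.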
Joint work with Alexander Gasnikov.