SPECIAL ISSUE: ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING TECHNOLOGIES
Optimal analysis of method with batching for monotone stochastic finite-sum variational inequalities
A. Pichugin^a, M. Pechin^a, A. Beznosikov^a, A. Savchenko^b, A. Gasnikov^a
^a Moscow Institute of Physics and Technology, Dolgoprudny, Russian Federation
^b Sber AI Lab, Moscow, Russian Federation
Abstract:
Variational inequalities are a universal optimization paradigm that is interesting in its own right and also encompasses classical minimization and saddle point problems. Modern applications motivate the study of stochastic formulations of optimization problems. In this paper, we present an analysis of a method that gives optimal convergence estimates for monotone stochastic finite-sum variational inequalities. In contrast to previous works, our method supports batching without losing optimality of the oracle complexity. The effectiveness of the algorithm, especially for batch sizes that are small but greater than one, is confirmed experimentally.
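For readers unfamiliar with the setting, the following minimal sketch illustrates a generic mini-batched stochastic extragradient step for a finite-sum monotone variational inequality. It is not the authors' algorithm from the paper; the bilinear test problem, the operator, and the step size are illustrative assumptions only.

```python
# Illustrative sketch only: mini-batched stochastic extragradient for a
# finite-sum monotone VI. NOT the method analyzed in the paper; the problem
# instance and hyperparameters are assumptions for demonstration purposes.
import numpy as np

rng = np.random.default_rng(0)

# Finite-sum operator F(z) = (1/n) * sum_i F_i(z) coming from the bilinear
# saddle point problem min_x max_y (1/n) * sum_i x^T A_i y,
# so F_i(z) = (A_i y, -A_i^T x) with z = (x, y).
n, d = 50, 10
A = rng.standard_normal((n, d, d)) / np.sqrt(d)

def operator_batch(z, idx):
    """Average of F_i over a mini-batch of component indices idx."""
    x, y = z[:d], z[d:]
    A_batch = A[idx].mean(axis=0)
    return np.concatenate([A_batch @ y, -A_batch.T @ x])

def batched_stochastic_extragradient(z0, gamma=0.1, batch_size=5, iters=2000):
    """Extragradient with the same mini-batch reused in both steps."""
    z = z0.copy()
    for _ in range(iters):
        idx = rng.choice(n, size=batch_size, replace=False)
        z_half = z - gamma * operator_batch(z, idx)       # extrapolation step
        z = z - gamma * operator_batch(z_half, idx)       # update step
    return z

z_out = batched_stochastic_extragradient(np.ones(2 * d))
print("||F(z)|| at the output:",
      np.linalg.norm(operator_batch(z_out, np.arange(n))))
```

The sketch only conveys how batching enters a stochastic method for variational inequalities: each iteration queries a mini-batch of component operators rather than a single one or the full sum.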
Keywords:
stochastic optimization, variational inequalities, finite-sum problems, batching.
Citation:
A. Pichugin, M. Pechin, A. Beznosikov, A. Savchenko, A. Gasnikov, “Optimal analysis of method with batching for monotone stochastic finite-sum variational inequalities”, Dokl. RAN. Math. Inf. Proc. Upr., 514:2 (2023), 212–224; Dokl. Math., 108:suppl. 2 (2023), S348–S359
Linking options:
https://www.mathnet.ru/eng/danma466
https://www.mathnet.ru/eng/danma/v514/i2/p212