Computer Research and Modeling
Computer Research and Modeling, 2019, Volume 11, Issue 2, Pages 205–217
DOI: https://doi.org/10.20537/2076-7633-2019-11-2-205-217
(Mi crm706)
 

This article is cited in 3 scientific papers.

MATHEMATICAL MODELING AND NUMERICAL SIMULATION

On some stochastic mirror descent methods for constrained online optimization problems

M. S. Alkousa

Moscow Institute of Physics and Technology, 9 Institutskiy per., Dolgoprudny, Moscow Region, 141701, Russia
Abstract: The problem of online convex optimization naturally arises in cases where statistical information is updated over time. The mirror descent method is well known for non-smooth optimization problems; it extends the subgradient method for solving non-smooth convex optimization problems to the case of a non-Euclidean distance. This paper is devoted to a stochastic variant of recently proposed mirror descent methods for convex online optimization problems with convex Lipschitz (generally, non-smooth) functional constraints. This means that we can still use the value of the functional constraint, but instead of the (sub)gradients of the objective functional and the functional constraint we use their stochastic (sub)gradients. More precisely, assume that $N$ convex Lipschitz non-smooth functionals are given on a closed subset of an $n$-dimensional vector space. The problem is to minimize the arithmetic mean of these functionals subject to a convex Lipschitz constraint. Two methods using stochastic (sub)gradients are proposed for solving this problem: an adaptive method (which does not require knowledge of the Lipschitz constants of either the objective functional or the constraint functional) and a non-adaptive method (which requires knowledge of the Lipschitz constants of both the objective functional and the constraint functional). Note that the stochastic (sub)gradient of each functional is allowed to be computed only once. In the case of non-negative regret, we find that the number of non-productive steps is $O(N)$, which indicates the optimality of the proposed methods. We consider an arbitrary proximal structure, which is essential for decision-making problems. The results of numerical experiments are presented, allowing the adaptive and non-adaptive methods to be compared on some examples. It is shown that the adaptive method can significantly increase the number of solutions found.
Keywords: online convex optimization problem, non-smooth constrained optimization problem, adaptive mirror descent, Lipschitz functional, stochastic (sub)gradient.
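For readers who want a concrete picture of the adaptive scheme described in the abstract, a minimal sketch in Python is given below. It assumes the standard Euclidean proximal setup, so the mirror step reduces to a projected stochastic subgradient step, and the names stoch_grad_f, stoch_grad_g, g and project are illustrative placeholders rather than the paper's notation; the methods in the article work with an arbitrary proximal structure and come with precise step-size and stopping rules.

    import numpy as np

    def adaptive_stochastic_md(stoch_grad_f, stoch_grad_g, g, x0, eps, n_iter,
                               project=lambda x: x):
        """Sketch of adaptive stochastic mirror descent with functional constraints.

        stoch_grad_f(x) -- stochastic (sub)gradient of the objective at x
        stoch_grad_g(x) -- stochastic (sub)gradient of the constraint at x
        g(x)            -- exact value of the constraint functional
        eps             -- target accuracy, also used in the adaptive step size
        project(x)      -- Euclidean projection onto the closed feasible set
        """
        x = np.asarray(x0, dtype=float)
        productive = []                    # iterates where the constraint is nearly satisfied
        for _ in range(n_iter):
            if g(x) <= eps:                # productive step: move along the objective
                grad = stoch_grad_f(x)
                productive.append(x.copy())
            else:                          # non-productive step: move along the constraint
                grad = stoch_grad_g(x)
            norm2 = float(np.dot(grad, grad)) + 1e-16
            h = eps / norm2                # adaptive step size: no Lipschitz constants needed
            x = project(x - h * grad)
        # the result is averaged over the productive iterates (if any were made)
        return np.mean(productive, axis=0) if productive else x

In this sketch the adaptivity consists in choosing the step size from the observed stochastic (sub)gradient norm, which is why no Lipschitz constants are required; the non-adaptive variant in the paper instead fixes the step sizes through the known Lipschitz constants of the objective and constraint functionals.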
Received: 18.11.2018
Revised: 05.03.2019
Accepted: 06.03.2019
Document Type: Article
UDC: 519.85
Language: English
Citation: M. S. Alkousa, “On some stochastic mirror descent methods for constrained online optimization problems”, Computer Research and Modeling, 11:2 (2019), 205–217
Citation in format AMSBIB
\Bibitem{Alk19}
\by M.~S.~Alkousa
\paper On some stochastic mirror descent methods for constrained online optimization problems
\jour Computer Research and Modeling
\yr 2019
\vol 11
\issue 2
\pages 205--217
\mathnet{http://mi.mathnet.ru/crm706}
\crossref{https://doi.org/10.20537/2076-7633-2019-11-2-205-217}
Linking options:
  • https://www.mathnet.ru/eng/crm706
  • https://www.mathnet.ru/eng/crm/v11/i2/p205