Abstract:
This paper is a follow-up to the author's previous paper on convex optimization. In that paper we began adapting greedy-type algorithms from nonlinear approximation to the problem of finding sparse solutions of convex optimization problems. There we modified the three most popular greedy algorithms in nonlinear approximation in Banach spaces – the Weak Chebyshev Greedy Algorithm, the Weak Greedy Algorithm with Free Relaxation, and the Weak Relaxed Greedy Algorithm – for solving convex optimization problems. In the present paper we continue the study of sparse approximate solutions to convex optimization problems. In many engineering applications researchers seek an approximate solution of an optimization problem in the form of a linear combination of elements from a given system, and there is growing interest in constructing such sparse approximate solutions by greedy-type algorithms. Here we concentrate on greedy algorithms that provide expansions: the approximant at the mth iteration equals the approximant from the previous, (m−1)th, iteration plus one element from the dictionary with an appropriate coefficient. The problem of greedy expansions of elements of a Banach space is well studied in nonlinear approximation theory. At first glance the problem of expanding a given element and the problem of constructing an expansion in an optimization problem look very different; it turns out, however, that the same technique can be used to solve both. We show how the machinery developed in nonlinear approximation theory, in particular the greedy expansions technique, can be adjusted for finding a sparse solution of an optimization problem given by an expansion with respect to a given dictionary.
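To illustrate the expansion structure the abstract describes (each approximant is the previous one plus a single dictionary element times a coefficient), here is a minimal sketch, not one of the paper's specific algorithms: it assumes a finite-dimensional quadratic objective f(x) = ½‖x − b‖², a gradient-based greedy selection rule, and exact line search for the coefficient; the function name and the choice of coefficient rule are illustrative assumptions only.

```python
import numpy as np

def greedy_expansion(b, dictionary, n_iter=100):
    """Illustrative greedy expansion for f(x) = 0.5 * ||x - b||^2.

    At step m the approximant x_m equals x_{m-1} plus one dictionary
    element times a coefficient, so after m steps x_m is an m-sparse
    combination of dictionary rows (repetitions allowed). This is a
    toy sketch, not the WCGA or the algorithms analyzed in the paper.
    """
    x = np.zeros_like(b)
    coeffs = []  # (index, coefficient) history: the expansion itself
    for _ in range(n_iter):
        grad = x - b                       # gradient of f at x
        scores = dictionary @ (-grad)      # correlation with -gradient
        j = int(np.argmax(np.abs(scores))) # greedy choice of element
        g = dictionary[j]
        c = scores[j] / (g @ g)            # exact line search along g
        x = x + c * g                      # one-term update of approximant
        coeffs.append((j, c))
    return x, coeffs
```

For instance, with the canonical basis as dictionary, each iteration corrects the coordinate with the largest residual exactly, so the target is reproduced in as many steps as it has nonzero coordinates.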
Citation:
V. N. Temlyakov, “Greedy expansions in convex optimization”, Function spaces and related problems of analysis, Collected papers. Dedicated to Oleg Vladimirovich Besov, corresponding member of the Russian Academy of Sciences, on the occasion of his 80th birthday, Trudy Mat. Inst. Steklova, 284, MAIK Nauka/Interperiodica, Moscow, 2014, 252–270; Proc. Steklov Inst. Math., 284 (2014), 244–262
\Bibitem{Tem14}
\by V.~N.~Temlyakov
\paper Greedy expansions in convex optimization
\inbook Function spaces and related problems of analysis
\bookinfo Collected papers. Dedicated to Oleg Vladimirovich Besov, corresponding member of the Russian Academy of Sciences, on the occasion of his 80th birthday
\serial Trudy Mat. Inst. Steklova
\yr 2014
\vol 284
\pages 252--270
\publ MAIK Nauka/Interperiodica
\publaddr Moscow
\mathnet{http://mi.mathnet.ru/tm3531}
\crossref{https://doi.org/10.1134/S037196851401018X}
\elib{https://elibrary.ru/item.asp?id=21249118}
\transl
\jour Proc. Steklov Inst. Math.
\yr 2014
\vol 284
\pages 244--262
\crossref{https://doi.org/10.1134/S0081543814010180}
\isi{https://gateway.webofknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcApp=Publons&SrcAuth=Publons_CEL&DestLinkType=FullRecord&DestApp=WOS_CPL&KeyUT=000335559000017}
\elib{https://elibrary.ru/item.asp?id=21876757}
\scopus{https://www.scopus.com/record/display.url?origin=inward&eid=2-s2.0-84899857743}
Linking options:
https://www.mathnet.ru/eng/tm3531
https://doi.org/10.1134/S037196851401018X
https://www.mathnet.ru/eng/tm/v284/p252
This publication is cited in the following 10 articles:
Erwan Fouillen, Claire Boyer, Maxime Sangnier, “Proximal boosting: Aggregating weak learners to minimize non-differentiable losses”, Neurocomputing, 520 (2023), 301
A. V. Dereventsov, V. N. Temlyakov, “Biorthogonal Greedy Algorithms in convex optimization”, Appl. Comput. Harmon. Anal., 60 (2022), 489–511
Wenhui Zhang, Peixin Ye, Shuo Xing, Xu Xu, “Optimality of the Approximation and Learning by the Rescaled Pure Super Greedy Algorithms”, Axioms, 11:9 (2022), 437
Zh. Gao, G. Petrova, “Rescaled pure greedy algorithm for convex optimization”, Calcolo, 56:2 (2019), UNSP 15
H. Nguyen, G. Petrova, “Greedy strategies for convex optimization”, Calcolo, 54:1 (2017), 207–224
R. A. DeVore, V. N. Temlyakov, “Convex optimization on Banach spaces”, Found. Comput. Math., 16:2 (2016), 369–394
V. N. Temlyakov, “Convergence and rate of convergence of some greedy algorithms in convex optimization”, Proc. Steklov Inst. Math., 293 (2016), 325–337
V. N. Temlyakov, “Dictionary descent in optimization”, Anal. Math., 42:1 (2016), 69–89
V. N. Temlyakov, “Greedy approximation in convex optimization”, Constr. Approx., 41:2 (2015), 269–296
Vladimir Temlyakov, 2014 48th Asilomar Conference on Signals, Systems and Computers, 2014, 1331