Abstract:
This paper addresses the problem of constructing and analyzing estimators for the regression problem in supervised learning. Recently, there has been great interest in studying universal estimators. The term ``universal'' means that, on the one hand, the estimator does not depend on the a priori assumption that the regression function $f_\rho$ belongs to some class $F$ from a collection of classes $\mathcal{F}$ and, on the other hand, the estimation error for $f_\rho$ is close to the optimal error for the class $F$. This paper illustrates how the general technique of constructing universal estimators, developed in the author's previous paper, can be applied in concrete situations. The setting of the problem studied here was motivated by a recent paper by Smale and Zhou. Our starting point is a kernel $K(x,u)$ defined on $X\times\Omega$. Based on this kernel, we build an estimator that is universal for classes defined in terms of nonlinear approximation with respect to the system $\{K(\cdot,u)\}_{u\in\Omega}$. To construct an easily implementable estimator, we apply the relaxed greedy algorithm.
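The relaxed greedy algorithm mentioned in the abstract can be illustrated by a minimal numerical sketch. The code below is not the paper's estimator: it is a generic relaxed greedy approximation of a target vector from a finite dictionary (the classical scheme of Jones and Barron), with the relaxation weights $\lambda_m = 2/(m+1)$ and the `scale` parameter chosen here for illustration.

```python
import numpy as np

def relaxed_greedy(target, dictionary, steps, scale=1.0):
    """Sketch of the relaxed greedy algorithm (RGA).

    dictionary : array of shape (N, d); rows are unit-norm elements.
    Builds approximations f_m = (1 - lam) f_{m-1} + lam * scale * g_m,
    where g_m is the +/- dictionary element most correlated with the
    current residual, and lam = 2 / (m + 1).
    """
    approx = np.zeros_like(target, dtype=float)
    for m in range(1, steps + 1):
        residual = target - approx
        inner = dictionary @ residual
        k = np.argmax(np.abs(inner))          # best-matching element
        g = np.sign(inner[k]) * dictionary[k]  # symmetrized dictionary
        lam = 2.0 / (m + 1)                    # relaxation weight
        approx = (1 - lam) * approx + lam * scale * g
    return approx

# Toy example: the target lies in the convex hull of the basis vectors,
# so the RGA iterates converge to it.
dictionary = np.eye(3)
target = np.array([0.5, 0.3, 0.2])
approx = relaxed_greedy(target, dictionary, steps=300)
```

For targets in the (scaled, symmetrized) convex hull of the dictionary, the approximation error of the RGA decays at the rate $O(m^{-1/2})$ in the number of steps $m$, which is what makes the algorithm attractive for building easily implementable estimators.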
\Bibitem{Tem06}
\by V.~N.~Temlyakov
\paper On Universal Estimators in Learning Theory
\inbook Function spaces, approximation theory, and nonlinear analysis
\bookinfo Collected papers
\serial Trudy Mat. Inst. Steklova
\yr 2006
\vol 255
\pages 256--272
\publ Nauka, MAIK «Nauka/Interperiodica»
\publaddr Moscow
\mathnet{http://mi.mathnet.ru/tm268}
\mathscinet{http://mathscinet.ams.org/mathscinet-getitem?mr=2302836}
\transl
\jour Proc. Steklov Inst. Math.
\yr 2006
\vol 255
\pages 244--259
\crossref{https://doi.org/10.1134/S0081543806040201}
\scopus{https://www.scopus.com/record/display.url?origin=inward&eid=2-s2.0-33846857278}
Linking options:
https://www.mathnet.ru/eng/tm268
https://www.mathnet.ru/eng/tm/v255/p256
This publication is cited in the following 4 articles:
V. N. Temlyakov, “On Universal Sampling Recovery in the Uniform Norm”, Proc. Steklov Inst. Math., 323 (2023), 206–216
V. N. Temlyakov, “Smooth Fixed Volume Discrepancy, Dispersion, and Related Problems”, J. Approx. Theory, 237 (2019), 113–134
V. N. Temlyakov, “Connections between numerical integration, discrepancy, dispersion, and universal discretization”, The SMAI Journal of Computational Mathematics, S5 (2019), 185
V. N. Temlyakov, “Universal Discretization”, J. Complexity, 47 (2018), 97–109