Trudy Matematicheskogo Instituta imeni V.A. Steklova, 2006, Volume 255, Pages 256–272
(Mi tm268)
This article is cited in 4 scientific papers.
On Universal Estimators in Learning Theory
V. N. Temlyakov (University of South Carolina)
Abstract:
This paper addresses the problem of constructing and analyzing estimators for the regression problem in supervised learning. Recently, there has been great interest in studying universal estimators. The term “universal” means that, on the one hand, the estimator does not depend on the a priori assumption that the regression function $f_\rho$ belongs to some class $F$ from a collection of classes $\mathcal F$ and, on the other hand, the estimation error for $f_\rho$ is close to the optimal error for the class $F$. This paper illustrates how the general technique for constructing universal estimators, developed in the author's previous paper, can be applied in concrete situations. The setting of the problem studied here was motivated by a recent paper by Smale and Zhou. Our starting point is a kernel $K(x,u)$ defined on $X\times \Omega$. On the basis of this kernel, we build an estimator that is universal for classes defined in terms of nonlinear approximation with respect to the system $\{K(\cdot ,u)\}_{u\in \Omega }$. To construct an easily implementable estimator, we apply the relaxed greedy algorithm.
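The relaxed greedy algorithm mentioned in the abstract admits a short sketch. The following is a hypothetical, simplified illustration, not the paper's exact construction: the dictionary is a finite set of normalized vectors (think of the kernel sections $K(\cdot,u)$ evaluated at the data points), and at step $m$ the current approximant is mixed with the best-aligned signed dictionary element via the relaxed update $f_m = (1 - 1/m)\,f_{m-1} + (1/m)\,g_m$.

```python
import math


def relaxed_greedy(y, dictionary, m_steps):
    """Sketch of the relaxed greedy algorithm (simplified, hypothetical setup).

    y          : list of n target sample values
    dictionary : list of N vectors of length n, assumed normalized
    m_steps    : number of greedy steps
    """
    n = len(y)
    f = [0.0] * n  # f_0 = 0
    for m in range(1, m_steps + 1):
        residual = [y[i] - f[i] for i in range(n)]
        # Choose the signed dictionary element best aligned with the residual.
        best, best_val = None, -1.0
        for g in dictionary:
            c = sum(g[i] * residual[i] for i in range(n))
            if abs(c) > best_val:
                best_val = abs(c)
                best = [math.copysign(1.0, c) * g[i] for i in range(n)]
        # Relaxed update: f_m = (1 - 1/m) f_{m-1} + (1/m) g_m.
        f = [(1 - 1 / m) * f[i] + (1 / m) * best[i] for i in range(n)]
    return f
```

For a target in the (symmetric) convex hull of the dictionary, the error of $f_m$ decreases at the classical rate $O(m^{-1/2})$; the point of the relaxation is that each step solves only a one-dimensional selection problem, which makes the estimator easy to implement.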
Received in January 2006
Citation:
V. N. Temlyakov, “On Universal Estimators in Learning Theory”, Function spaces, approximation theory, and nonlinear analysis, Collected papers, Trudy Mat. Inst. Steklova, 255, Nauka, MAIK «Nauka/Interperiodica», Moscow, 2006, 256–272; Proc. Steklov Inst. Math., 255 (2006), 244–259
Linking options:
https://www.mathnet.ru/eng/tm268
https://www.mathnet.ru/eng/tm/v255/p256