A note on the rate of convergence of local polynomial estimators in regression models
Kybernetika, Volume 37 (2001) no. 5, pp. 585-603. This article was harvested from the source Czech Digital Mathematics Library.

Local polynomials are used to construct estimators for the value $m(x_{0})$ of the regression function $m$ and the values of the derivatives $D_{\gamma }m(x_{0})$ in a general class of nonparametric regression models. The covariables are allowed to be random or non-random. Only asymptotic conditions on the average distribution of the covariables are used as a smoothness condition on the experimental design. This smoothness condition is discussed in detail. The optimal stochastic rate of convergence of the estimators is established. The results cover the special cases of regression models with i.i.d. errors and the case of observations at an equidistant lattice.
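For orientation, here is a minimal sketch of the local polynomial idea in the simplest one-dimensional setting: fit a polynomial in $(x - x_{0})$ by kernel-weighted least squares and read the estimate of the $j$th derivative off the $j$th fitted coefficient, scaled by $j!$ in line with the Taylor expansion of $m$ around $x_{0}$. This is an illustrative reconstruction, not the paper's general multivariate construction; the function name, the Epanechnikov kernel, and the bandwidth value are assumptions made for the example.

import math
import numpy as np

def local_polynomial_estimate(x, y, x0, bandwidth, degree=1):
    """Estimate m(x0) and D^j m(x0), j = 0..degree, by weighted least squares."""
    u = (x - x0) / bandwidth
    # Epanechnikov kernel weights; points outside the window get weight 0
    w = np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u**2), 0.0)
    # Design matrix with columns (x - x0)^j, j = 0..degree
    X = np.vander(x - x0, N=degree + 1, increasing=True)
    # Solve min_beta sum_i w_i (y_i - X_i beta)^2 via square-root-weighted lstsq
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    # The j-th coefficient estimates D^j m(x0) / j!
    return beta * np.array([math.factorial(j) for j in range(degree + 1)])

# Toy usage: m(x) = sin(3x), so m(0) = 0 and m'(0) = 3
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 500)
y = np.sin(3.0 * x) + 0.1 * rng.standard_normal(500)
m0, dm0 = local_polynomial_estimate(x, y, x0=0.0, bandwidth=0.25, degree=1)
print(m0, dm0)  # roughly 0 and 3

The random uniform design above is only one instance of the covariable distributions the paper allows; non-random designs such as an equidistant lattice fit the same weighted least-squares scheme.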
Classification: 62G08, 62G20, 62J02
Keywords: nonparametric regression models; smoothness condition
@article{KYB_2001_37_5_a4,
     author = {Liese, Friedrich and Steinke, Ingo},
     title = {A note on the rate of convergence of local polynomial estimators in regression models},
     journal = {Kybernetika},
     pages = {585--603},
     year = {2001},
     volume = {37},
     number = {5},
     mrnumber = {1877076},
     zbl = {1264.62032},
     language = {en},
     url = {http://geodesic.mathdoc.fr/item/KYB_2001_37_5_a4/}
}
TY  - JOUR
AU  - Liese, Friedrich
AU  - Steinke, Ingo
TI  - A note on the rate of convergence of local polynomial estimators in regression models
JO  - Kybernetika
PY  - 2001
SP  - 585
EP  - 603
VL  - 37
IS  - 5
UR  - http://geodesic.mathdoc.fr/item/KYB_2001_37_5_a4/
LA  - en
ID  - KYB_2001_37_5_a4
ER  - 
%0 Journal Article
%A Liese, Friedrich
%A Steinke, Ingo
%T A note on the rate of convergence of local polynomial estimators in regression models
%J Kybernetika
%D 2001
%P 585-603
%V 37
%N 5
%U http://geodesic.mathdoc.fr/item/KYB_2001_37_5_a4/
%G en
%F KYB_2001_37_5_a4
Liese, Friedrich; Steinke, Ingo. A note on the rate of convergence of local polynomial estimators in regression models. Kybernetika, Volume 37 (2001) no. 5, pp. 585-603. http://geodesic.mathdoc.fr/item/KYB_2001_37_5_a4/
