The James–Stein procedure for a conditionally Gaussian regression
Vestnik Tomskogo gosudarstvennogo universiteta. Matematika i mehanika, no. 4 (2011), pp. 6-17
This article was harvested from the Math-Net.Ru source

The paper considers the problem of estimating a $p$-dimensional ($p\ge2$) mean vector of a multivariate conditionally normal distribution under quadratic loss. A problem of this type arises when estimating the parameters of a continuous-time regression model driven by a non-Gaussian Ornstein–Uhlenbeck process. We propose a modification of the James–Stein procedure of the form $\theta^*(Y)=(1-c/\|Y\|)Y$, where $Y$ is an observation and $c>0$ is a suitably chosen constant. This estimator admits an explicit upper bound on its quadratic risk and has a significantly smaller risk than the usual maximum likelihood estimator for all dimensions $p\ge2$. The procedure is applied to parametric estimation in a continuous-time conditionally Gaussian regression model and to estimating the mean vector of a multivariate normal distribution when the covariance matrix is unknown and depends on nuisance parameters.
Keywords: conditionally Gaussian regression model, improved estimation, James–Stein procedure, non-Gaussian Ornstein–Uhlenbeck process.
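
As a purely illustrative sketch of the shrinkage rule $\theta^*(Y)=(1-c/\|Y\|)Y$ described in the abstract, the following Python snippet compares its Monte Carlo quadratic risk with that of the maximum likelihood estimator $\hat\theta(Y)=Y$ when $Y\sim\mathcal N(\theta,I_p)$. The identity covariance, the choice $c=p-1$, and all names in the code are assumptions made for this example only; the paper works in a conditionally Gaussian setting and derives its own specific constant $c$, which the abstract does not state.

import numpy as np

def shrinkage_estimate(y, c):
    # Shrinkage rule theta*(y) = (1 - c/||y||) * y from the abstract
    return (1.0 - c / np.linalg.norm(y)) * y

def empirical_risk(estimator, theta, n_rep=50_000, seed=0):
    # Monte Carlo estimate of the quadratic risk E||estimator(Y) - theta||^2
    # under the illustrative assumption Y ~ N(theta, I_p)
    rng = np.random.default_rng(seed)
    ys = theta + rng.standard_normal((n_rep, theta.size))
    errors = np.array([np.sum((estimator(y) - theta) ** 2) for y in ys])
    return errors.mean()

p = 5
theta = np.zeros(p)   # true mean; the shrinkage gain is largest near the origin
c = p - 1             # illustrative constant, not the specific constant of the paper
print("MLE risk      :", empirical_risk(lambda y: y, theta))
print("shrinkage risk:", empirical_risk(lambda y: shrinkage_estimate(y, c), theta))

For $p=5$ and $\theta=0$ the shrinkage risk comes out noticeably below the MLE risk $p$; this only illustrates the qualitative effect and not the explicit risk bound established in the paper.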
@article{VTGU_2011_4_a1,
     author = {E. A. Pchelintsev},
     title = {The {James{\textendash}Stein} procedure for a~conditionally {Gaussian} regression},
     journal = {Vestnik Tomskogo gosudarstvennogo universiteta. Matematika i mehanika},
     pages = {6--17},
     year = {2011},
     number = {4},
     language = {ru},
     url = {http://geodesic.mathdoc.fr/item/VTGU_2011_4_a1/}
}
TY  - JOUR
AU  - E. A. Pchelintsev
TI  - The James–Stein procedure for a conditionally Gaussian regression
JO  - Vestnik Tomskogo gosudarstvennogo universiteta. Matematika i mehanika
PY  - 2011
SP  - 6
EP  - 17
IS  - 4
UR  - http://geodesic.mathdoc.fr/item/VTGU_2011_4_a1/
LA  - ru
ID  - VTGU_2011_4_a1
ER  - 
%0 Journal Article
%A E. A. Pchelintsev
%T The James–Stein procedure for a conditionally Gaussian regression
%J Vestnik Tomskogo gosudarstvennogo universiteta. Matematika i mehanika
%D 2011
%P 6-17
%N 4
%U http://geodesic.mathdoc.fr/item/VTGU_2011_4_a1/
%G ru
%F VTGU_2011_4_a1
E. A. Pchelintsev. The James–Stein procedure for a conditionally Gaussian regression. Vestnik Tomskogo gosudarstvennogo universiteta. Matematika i mehanika, no. 4 (2011), pp. 6-17. http://geodesic.mathdoc.fr/item/VTGU_2011_4_a1/
