Minimax estimation of the Gaussian parametric regression
Vestnik Tomskogo gosudarstvennogo universiteta. Matematika i mehanika, no. 5 (2014), pp. 40-47. This article was harvested from the source Math-Net.Ru.


The paper considers the problem of estimating the $d$-dimensional ($d\ge2$) mean vector of a multivariate normal distribution under quadratic loss. Let the observations be described by the equation $$ Y=\theta+\sigma\xi, $$ where $\theta$ is a $d$-dimensional vector of unknown parameters from some bounded set $\Theta\subset\mathbb R^d$, $\xi$ is a Gaussian random vector with zero mean and identity covariance matrix $I_d$, i.e. $\mathrm{Law}(\xi)=\mathrm N_d(0,I_d)$, and $\sigma$ is a known positive number. The problem is to construct a minimax estimator of the vector $\theta$ from the observations $Y$. As a measure of the accuracy of an estimator $\hat\theta$ we take the quadratic risk defined as $$ R(\theta,\hat\theta):=\boldsymbol E_\theta|\theta-\hat\theta|^2,\qquad|x|^2=\sum^d_{j=1}x^2_j, $$ where $\boldsymbol E_\theta$ is the expectation with respect to the measure $\boldsymbol P_\theta$. We propose a modification of the James–Stein procedure of the form $$ \theta^*_+=\left(1-\frac c{|Y|}\right)_+Y, $$ where $c>0$ is a special constant and $a_+=\max(a,0)$ denotes the positive part of $a$. This estimator admits an explicit upper bound for the quadratic risk and has a significantly smaller risk than the usual maximum likelihood estimator and than the estimator $$ \theta^*=\left(1-\frac c{|Y|}\right)Y $$ for dimensions $d\ge2$. We establish that the proposed procedure $\theta^*_+$ is a minimax estimator of the vector $\theta$. A numerical comparison of the quadratic risks of the considered procedures is given. In conclusion, it is shown that the proposed minimax estimator $\theta^*_+$ is the best estimator in the mean square sense.
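As a rough illustration of the comparison described in the abstract (not taken from the paper), the following Python sketch estimates the quadratic risks of the maximum likelihood estimator $Y$, the shrinkage estimator $\theta^*$, and its positive-part modification $\theta^*_+$ by Monte Carlo. The dimension, the point $\theta$, and the shrinkage constant c below are arbitrary placeholder choices, not the special constant derived in the article.

```python
import numpy as np

rng = np.random.default_rng(0)

d, sigma, n_rep = 10, 1.0, 100_000            # dimension d >= 2, known noise level, Monte Carlo size
theta = np.full(d, 0.5)                       # a fixed point of the bounded set Theta (illustrative choice)
# Placeholder shrinkage constant c > 0; the paper derives a specific constant that is not reproduced here.
c = (d - 1) * sigma**2 / (np.linalg.norm(theta) + sigma * np.sqrt(d))

Y = theta + sigma * rng.standard_normal((n_rep, d))     # observations Y = theta + sigma * xi
norm_Y = np.linalg.norm(Y, axis=1, keepdims=True)
shrink = 1.0 - c / norm_Y                               # shrinkage factor (1 - c/|Y|)

estimators = {
    "MLE Y      ": Y,                                   # maximum likelihood estimator
    "theta*     ": shrink * Y,                          # shrinkage estimator (1 - c/|Y|) Y
    "theta*_plus": np.maximum(shrink, 0.0) * Y,         # positive-part modification
}
for name, est in estimators.items():
    risk = np.mean(np.sum((est - theta) ** 2, axis=1))  # empirical quadratic risk E|est - theta|^2
    print(f"{name}: empirical quadratic risk = {risk:.4f}")
```

Under this setup the empirical risk of the maximum likelihood estimator is close to $\sigma^2 d$, while both shrinkage procedures give smaller values, with the positive-part version at least as good as the plain one.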
Keywords: parametric regression, improved estimation, James–Stein procedure, mean squared risk, minimax estimator.
@article{VTGU_2014_5_a3,
     author = {V. A. Pchelintsev and E. A. Pchelintsev},
     title = {Minimax estimation of the {Gaussian} parametric regression},
     journal = {Vestnik Tomskogo gosudarstvennogo universiteta. Matematika i mehanika},
     pages = {40--47},
     year = {2014},
     number = {5},
     language = {ru},
     url = {http://geodesic.mathdoc.fr/item/VTGU_2014_5_a3/}
}
