Contribution to the theory of Pitman estimators
Zapiski Nauchnykh Seminarov POMI, Probability and Statistics. Part 18, Volume 408 (2012), pp. 245-267. This article was harvested from the source Math-Net.Ru.


New inequalities are proved for the variance of the Pitman estimators (minimum variance equivariant estimators) of $\theta$ constructed from samples of fixed size from populations $F(x-\theta)$. The inequalities are closely related to the classical Stam inequality for the Fisher information, its analog for small samples, and a powerful variance drop inequality. The only condition required is that $F$ have finite variance; not even absolute continuity of $F$ is assumed. As corollaries of the main inequalities for small samples, one obtains alternative proofs of known properties of the Fisher information, as well as interesting new observations, such as the fact that the variance of the Pitman estimator based on a sample of size $n$, scaled by $n$, decreases monotonically in $n$. Extensions of the results to polynomial versions of the Pitman estimators and to a multivariate location parameter are given. Finally, the search for a characterization of the equality conditions in one of the inequalities leads to a Cauchy-type functional equation for independent random variables, and an interesting new feature of its solutions is described.
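For orientation, here is a minimal LaTeX sketch of the objects the abstract refers to, using the standard representation of the Pitman estimator of a location parameter; the paper's own formulations may differ in detail. For a sample $X_1,\ldots,X_n$ from $F(x-\theta)$ with finite variance, the Pitman estimator is
\[
\hat\theta_n \;=\; X_1 - \mathbb{E}_{\theta=0}\!\left[\, X_1 \mid X_1-X_2,\ \ldots,\ X_1-X_n \,\right],
\]
the minimum variance equivariant estimator of $\theta$. The classical Stam inequality for the Fisher information $I(\cdot)$ of independent random variables $X_1, X_2$ reads
\[
\frac{1}{I(X_1+X_2)} \;\geq\; \frac{1}{I(X_1)} + \frac{1}{I(X_2)},
\]
and the monotonicity property stated above amounts to
\[
(n+1)\,\mathrm{Var}(\hat\theta_{n+1}) \;\leq\; n\,\mathrm{Var}(\hat\theta_n), \qquad n = 1, 2, \ldots
\]
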
@article{ZNSL_2012_408_a14,
     author = {A. M. Kagan and Tinghui Yu and A. Barron and M. Madiman},
     title = {Contribution to the theory of {Pitman} estimators},
     journal = {Zapiski Nauchnykh Seminarov POMI},
     pages = {245--267},
     year = {2012},
     volume = {408},
     language = {en},
     url = {http://geodesic.mathdoc.fr/item/ZNSL_2012_408_a14/}
}
TY  - JOUR
AU  - A. M. Kagan
AU  - Tinghui Yu
AU  - A. Barron
AU  - M. Madiman
TI  - Contribution to the theory of Pitman estimators
JO  - Zapiski Nauchnykh Seminarov POMI
PY  - 2012
SP  - 245
EP  - 267
VL  - 408
UR  - http://geodesic.mathdoc.fr/item/ZNSL_2012_408_a14/
LA  - en
ID  - ZNSL_2012_408_a14
ER  - 
%0 Journal Article
%A A. M. Kagan
%A Tinghui Yu
%A A. Barron
%A M. Madiman
%T Contribution to the theory of Pitman estimators
%J Zapiski Nauchnykh Seminarov POMI
%D 2012
%P 245-267
%V 408
%U http://geodesic.mathdoc.fr/item/ZNSL_2012_408_a14/
%G en
%F ZNSL_2012_408_a14
A. M. Kagan; Tinghui Yu; A. Barron; M. Madiman. Contribution to the theory of Pitman estimators. Zapiski Nauchnykh Seminarov POMI, Probability and Statistics. Part 18, Volume 408 (2012), pp. 245-267. http://geodesic.mathdoc.fr/item/ZNSL_2012_408_a14/
