Modified power divergence estimators in normal models – simulation and comparative study
Kybernetika, Volume 48 (2012) no. 4, pp. 795-808. This article was harvested from the Czech Digital Mathematics Library.

Point estimators based on minimization of information-theoretic divergences between the empirical and the hypothetical distribution induce a problem when working with continuous families which are measure-theoretically orthogonal to the family of empirical distributions. In this case, the $\phi$-divergence is always equal to its upper bound, and the minimum $\phi$-divergence estimates are trivial. Broniatowski and Vajda [3] proposed several modifications of the minimum divergence rule to provide a solution to the above-mentioned problem. We examine these new estimation methods with respect to consistency, robustness and efficiency through an extended simulation study. We focus on the well-known family of power divergences parametrized by $\alpha \in \mathbb{R}$ in the Gaussian model, and we perform a comparative computer simulation for several randomly selected contaminated and uncontaminated data sets, different sample sizes and different $\phi$-divergence parameters.
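For orientation, a common parametrization of the power divergence family referred to above is the Cressie-Read form (the exact normalization used in [3] may differ):

$$D_{\alpha}(P,Q) = \frac{1}{\alpha(\alpha-1)} \left( \int p^{\alpha} q^{1-\alpha}\, \mathrm{d}\mu - 1 \right), \qquad \alpha \in \mathbb{R} \setminus \{0,1\},$$

with the limits $\alpha \to 1$ and $\alpha \to 0$ giving the Kullback-Leibler divergence and its reversal. The classical (unmodified) minimum $\phi$-divergence estimator is $\hat{\theta}_n = \arg\min_{\theta \in \Theta} D_{\phi}(P_n, P_{\theta})$, where $P_n$ denotes the empirical distribution of the sample; it is this minimization that degenerates when the hypothetical family is continuous, since $D_{\phi}(P_n, P_{\theta})$ then attains its upper bound for every $\theta$.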
Classification: 62B05, 62H30
Keywords: minimum $\phi$-divergence estimation; subdivergence; superdivergence; PC simulation; relative efficiency; robustness
@article{KYB_2012_48_4_a8,
     author = {Fr\'ydlov\'a, Iva and Vajda, Igor and K\r{u}s, V\'aclav},
     title = {Modified power divergence estimators in normal models {\textendash} simulation and comparative study},
     journal = {Kybernetika},
     pages = {795--808},
     year = {2012},
     volume = {48},
     number = {4},
     mrnumber = {3013399},
     language = {en},
     url = {http://geodesic.mathdoc.fr/item/KYB_2012_48_4_a8/}
}
TY  - JOUR
AU  - Frýdlová, Iva
AU  - Vajda, Igor
AU  - Kůs, Václav
TI  - Modified power divergence estimators in normal models – simulation and comparative study
JO  - Kybernetika
PY  - 2012
SP  - 795
EP  - 808
VL  - 48
IS  - 4
UR  - http://geodesic.mathdoc.fr/item/KYB_2012_48_4_a8/
LA  - en
ID  - KYB_2012_48_4_a8
ER  - 
%0 Journal Article
%A Frýdlová, Iva
%A Vajda, Igor
%A Kůs, Václav
%T Modified power divergence estimators in normal models – simulation and comparative study
%J Kybernetika
%D 2012
%P 795-808
%V 48
%N 4
%U http://geodesic.mathdoc.fr/item/KYB_2012_48_4_a8/
%G en
%F KYB_2012_48_4_a8
Frýdlová, Iva; Vajda, Igor; Kůs, Václav. Modified power divergence estimators in normal models – simulation and comparative study. Kybernetika, Volume 48 (2012) no. 4, pp. 795-808. http://geodesic.mathdoc.fr/item/KYB_2012_48_4_a8/

[1] M. Broniatowski, A. Keziou: Minimization of $\phi$-divergences on sets of signed measures. Studia Sci. Math. Hungar. 43 (2006), 403-442.

[2] M. Broniatowski, A. Keziou: Parametric estimation and tests through divergences and the duality technique. J. Multivariate Anal. 100 (2009), 16-36.

[3] M. Broniatowski, I. Vajda: Several Applications of Divergence Criteria in Continuous Families. Research Report No. 2257. Institute of Information Theory and Automation, Prague 2009.

[4] I. Frýdlová: Minimum Kolmogorov Distance Estimators. Diploma Thesis. Czech Technical University, Prague 2004.

[5] I. Frýdlová: Modified Power Divergence Estimators and Their Performances in Normal Models. In: Proc. FernStat2010, Faculty of Social and Economic Studies UJEP, Ústí n. L. 2010, 28-33.

[6] F. Liese, I. Vajda: On divergences and informations in statistics and information theory. IEEE Trans. Inform. Theory 52 (2006), 4394-4412.

[7] A. Toma, S. Leoni-Aubin: Robust tests based on dual divergence estimators and saddlepoint approximations. J. Multivariate Anal. 101 (2010), 1143-1155.

[8] A. Toma, M. Broniatowski: Dual divergence estimators and tests: Robustness results. J. Multivariate Anal. 102 (2011), 20-36.

[9] I. Vajda: Theory of Statistical Inference and Information. Kluwer, Boston 1989.