Extensions of the parametric families of divergences used in statistical inference
Kybernetika, Volume 44 (2008), no. 1, pp. 95-112. This article was harvested from the Czech Digital Mathematics Library.


We propose a simple method for constructing new families of $\phi$-divergences. This method, called convex standardization, is applicable to convex and concave functions $\psi(t)$ that are twice continuously differentiable in a neighborhood of $t=1$ with nonzero second derivative at the point $t=1$. Using this method we introduce several extensions of the Le Cam, power, $\chi^a$ and Matusita divergences. The extended families are shown to connect these divergences smoothly with the Kullback divergence, or to connect various pairs of these particular divergences with each other. We also investigate the metric properties of divergences from these extended families.
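A minimal sketch of the idea, assuming the standardization takes its most natural form (the paper's exact construction may differ): given a convex or concave $\psi(t)$ with $\psi^{\prime\prime}(1)\ne 0$, subtract the tangent at $t=1$ and rescale by the second derivative, $$\phi_{\psi}(t)=\frac{\psi(t)-\psi(1)-\psi^{\prime}(1)\,(t-1)}{\psi^{\prime\prime}(1)},$$ which yields a convex function with $\phi_{\psi}(1)=\phi_{\psi}^{\prime}(1)=0$ and $\phi_{\psi}^{\prime\prime}(1)=1$ in both the convex and the concave case, and hence a standardized generator of a $\phi$-divergence.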
Classification: 62B05, 62B10, 62H30
Keywords: divergences; metric divergences; families of $f$-divergences
@article{KYB_2008_44_1_a7,
     author = {K\r{u}s, V\'aclav and Morales, Domingo and Vajda, Igor},
     title = {Extensions of the parametric families of divergences used in statistical inference},
     journal = {Kybernetika},
     pages = {95--112},
     year = {2008},
     volume = {44},
     number = {1},
     mrnumber = {2405058},
     zbl = {1142.62002},
     language = {en},
     url = {http://geodesic.mathdoc.fr/item/KYB_2008_44_1_a7/}
}
Kůs, Václav; Morales, Domingo; Vajda, Igor. Extensions of the parametric families of divergences used in statistical inference. Kybernetika, Volume 44 (2008), no. 1, pp. 95-112. http://geodesic.mathdoc.fr/item/KYB_2008_44_1_a7/

[1] Beirlant J., Devroye L., Győrfi L., Vajda I.: Large deviations of divergence measures of partitions. J. Statist. Plann. Inference 93 (2001), 1–16 | MR

[2] Csiszár I., Fischer J.: Informationsentfernungen im Raum der Wahrscheinlichkeitsverteilungen. Publ. Math. Inst. Hungar. Acad. Sci. 7 (1962), 159–180 | MR

[3] Győrfi L., Vajda I.: Asymptotic distributions for goodness-of-fit statistics in a sequence of multinomial models. Statist. Probab. Lett. 56 (2002), 57–67 | MR

[4] Hobza T., Molina I., Vajda I.: On convergence of Fisher’s information in continuous models with quantized observations. Test 4 (2005), 151–179

[5] Kafka P., Österreicher F., Vincze I.: On powers of Csiszár $f$-divergences defining a distance. Stud. Sci. Math. Hungar. 26 (1991), 415–422 | MR

[6] Kullback S., Leibler R.: On information and sufficiency. Ann. Math. Statist. 22 (1951), 79–86 | MR | Zbl

[7] Kullback S.: Statistics and Information Theory. Wiley, New York 1957

[8] Kůs V.: Blended $\phi $-divergences with examples. Kybernetika 39 (2003), 43–54 | MR

[9] Le Cam L.: Asymptotic Methods in Statistical Decision Theory. Springer, New York 1986 | MR | Zbl

[10] Liese F., Vajda I.: Convex Statistical Distances. Teubner, Leipzig 1987 | MR | Zbl

[11] Liese F., Vajda I.: On divergences and informations in statistics and information theory. IEEE Trans. Inform. Theory 52 (2006), 4394–4412 | MR

[12] Lindsay B. G.: Efficiency versus robustness: The case of minimum Hellinger distance and other methods. Ann. Statist. 22 (1994), 1081–1114 | MR

[13] Morales D., Pardo L., Vajda I.: Some new statistics for testing hypotheses in parametric models. J. Multivariate Anal. 62 (1997), 137–168 | MR | Zbl

[14] Morales D., Pardo L., Vajda I.: Limit laws for disparities of spacings. Nonparametric Statistics 15 (2003), 325–342 | MR | Zbl

[15] Morales D., Pardo L., Vajda I.: On the optimal number of classes in the Pearson goodness-of-fit tests. Kybernetika 41 (2005), 677–698 | MR

[16] Österreicher F.: On a class of perimeter-type distances of probability distributions. Kybernetika 32 (1996), 389–393 | MR | Zbl

[17] Österreicher F., Vajda I.: A new class of metric divergences on probability spaces and its applicability in statistics. Ann. Inst. Statist. Math. 55 (2003), 639–653 | MR | Zbl

[18] Pardo L.: Statistical Inference Based on Divergence Measures. Chapman & Hall, London 2006 | MR | Zbl

[19] Read T. C. R., Cressie N. A.: Goodness-of-fit Statistics for Discrete Multivariate Data. Springer, Berlin 1988 | MR | Zbl

[20] Vajda I.: $\chi ^{a}$-divergence and generalized Fisher information. In: Trans. 6th Prague Conference on Information Theory, Statistical Decision Functions and Random Processes, Academia, Prague 1973, pp. 872–886 | MR | Zbl

[21] Vajda I.: Theory of Statistical Inference and Information. Kluwer, Boston 1989 | Zbl

[22] Vajda I., Kůs V.: Relations Between Divergences, Total Variations and Euclidean Distances. Research Report No. 1853, Institute of Information Theory and Automation, Prague 1995

[23] Vajda I., van der Meulen E. C.: Optimization of Barron density estimates. IEEE Trans. Inform. Theory 47 (2001), 1867–1883 | MR