On metric divergences of probability measures
Kybernetika, Volume 45 (2009) no. 6, pp. 885-900
This article was harvested from the Czech Digital Mathematics Library.


Standard properties of $\phi$-divergences of probability measures are widely applied in various areas of information processing. Among the desirable supplementary properties facilitating the employment of mathematical methods is the metricity of $\phi$-divergences, or the metricity of their powers. This paper extends the previously known family of $\phi$-divergences with these properties. The extension consists of a continuum of $\phi$-divergences which are squared metric distances; most of them are new, but the family also includes some classical cases such as the Le Cam squared distance. The paper also establishes basic properties of the $\phi$-divergences from the extended class, including the range of values and the upper and lower bounds attained under fixed total variation.
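To make the abstract's central notion concrete, here is a minimal numerical sketch, not taken from the paper, assuming the common normalization of the Le Cam divergence $LC(P,Q)=\frac{1}{2}\sum_i (p_i-q_i)^2/(p_i+q_i)$ for discrete distributions $P$ and $Q$. The abstract states that the Le Cam divergence is a squared metric distance, i.e. $\sqrt{LC}$ is a metric; the sketch spot-checks the triangle inequality on random distributions. The function names and parameters below are illustrative, not from the paper.

import numpy as np

def le_cam(p, q):
    # Le Cam divergence under the (assumed) normalization
    # LC(P, Q) = (1/2) * sum_i (p_i - q_i)^2 / (p_i + q_i),
    # with the usual convention that terms with p_i + q_i = 0 contribute 0.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    s = p + q
    mask = s > 0
    return 0.5 * np.sum((p[mask] - q[mask]) ** 2 / s[mask])

def d(p, q):
    # Square root of the divergence; metricity of this quantity is the
    # classical property the abstract refers to.
    return np.sqrt(le_cam(p, q))

# Numerical spot-check of the triangle inequality on random distributions.
rng = np.random.default_rng(0)
for _ in range(1000):
    P, Q, R = (rng.dirichlet(np.ones(4)) for _ in range(3))
    assert d(P, R) <= d(P, Q) + d(Q, R) + 1e-12
print("triangle inequality held in 1000 random trials")

A failed assertion would falsify the triangle inequality for $\sqrt{LC}$; none is expected, in line with the metricity results the paper establishes. Note also that with this normalization $0 \le LC(P,Q) \le 1$, consistent with the paper's interest in ranges of values.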
Classification: 62B10, 62H30, 68T10, 94A17
Keywords: total variation; Hellinger divergence; Le Cam divergence; information divergence; Jensen-Shannon divergence; metric divergences
@article{KYB_2009_45_6_a0,
     author = {Vajda, Igor},
     title = {On metric divergences of probability measures},
     journal = {Kybernetika},
     pages = {885--900},
     year = {2009},
     volume = {45},
     number = {6},
     mrnumber = {2650071},
     zbl = {1186.94421},
     language = {en},
     url = {http://geodesic.mathdoc.fr/item/KYB_2009_45_6_a0/}
}
TY  - JOUR
AU  - Vajda, Igor
TI  - On metric divergences of probability measures
JO  - Kybernetika
PY  - 2009
SP  - 885
EP  - 900
VL  - 45
IS  - 6
UR  - http://geodesic.mathdoc.fr/item/KYB_2009_45_6_a0/
LA  - en
ID  - KYB_2009_45_6_a0
ER  - 
%0 Journal Article
%A Vajda, Igor
%T On metric divergences of probability measures
%J Kybernetika
%D 2009
%P 885-900
%V 45
%N 6
%U http://geodesic.mathdoc.fr/item/KYB_2009_45_6_a0/
%G en
%F KYB_2009_45_6_a0
Vajda, Igor. On metric divergences of probability measures. Kybernetika, Volume 45 (2009) no. 6, pp. 885-900. http://geodesic.mathdoc.fr/item/KYB_2009_45_6_a0/

[1] I. Csiszár: Information-type measures of difference of probability distributions and indirect observations. Studia Sci. Math. Hungar. 2 (1967), 299–318.

[2] I. Csiszár: On topological properties of $f$-divergences. Studia Sci. Math. Hungar. 2 (1967), 329–339.

[3] B. Fuglede, F. Topsøe: Jensen–Shannon divergence and Hilbert space embedding. In: Proc. IEEE Internat. Symposium on Inform. Theory, IEEE Publications, New York 2004, p. 31.

[4] P. Kafka, F. Österreicher, I. Vincze: On powers of $f$-divergences defining a distance. Studia Sci. Math. Hungar. 26 (1991), 329–339.

[5] M. Khosravifard, D. Fooladivanda, T. A. Gulliver: Confliction of the convexity and metric properties in $f$-divergences. IEICE Trans. on Fundamentals E90-A (2007), 1848–1853.

[6] V. Kůs, D. Morales, I. Vajda: Extensions of the parametric families of divergences used in statistical inference. Kybernetika 44 (2008), 95–112.

[7] L. Le Cam: Asymptotic Methods in Statistical Decision Theory. Springer, New York 1986.

[8] F. Liese, I. Vajda: Convex Statistical Distances. Teubner, Leipzig 1987.

[9] F. Liese, I. Vajda: On divergences and informations in statistics and information theory. IEEE Trans. Inform. Theory 52 (2006), 4394–4412.

[10] K. Matusita: Decision rules based on the distance for problems of fit, two samples and estimation. Ann. Math. Statist. 26 (1955), 631–640.

[11] F. Österreicher: On a class of perimeter-type distances of probability distributions. Kybernetika 32 (1996), 389–393.

[12] F. Österreicher, I. Vajda: A new class of metric divergences on probability spaces and its statistical applications. Ann. Inst. Statist. Math. 55 (2003), 639–653.

[13] I. Vajda: On the $f$-divergence and singularity of probability measures. Period. Math. Hungar. 2 (1972), 223–234.

[14] I. Vincze: On the concept and measure of information contained in an observation. In: Contributions to Probability (J. Gani and V. F. Rohatgi, eds.), Academic Press, New York 1981, pp. 207–214.