Information-type divergence when the likelihood ratios are bounded
Applicationes Mathematicae, Tome 24 (1997) no. 4, pp. 415-423
This article was harvested from the source Institute of Mathematics Polish Academy of Sciences
The so-called ϕ-divergence is an important characteristic describing "dissimilarity" of two probability distributions. Many traditional measures of separation used in mathematical statistics and information theory, some of which are mentioned in the note, correspond to particular choices of this divergence. An upper bound on a ϕ-divergence between two probability distributions is derived when the likelihood ratio is bounded. The usefulness of this sharp bound is illustrated by several examples of familiar ϕ-divergences. An extension of this inequality to ϕ-divergences between a finite number of probability distributions with pairwise bounded likelihood ratios is also given.
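The ϕ-divergence described in the abstract can be illustrated with a minimal sketch for discrete distributions, where D_ϕ(P‖Q) = Σᵢ qᵢ ϕ(pᵢ/qᵢ) for a convex ϕ with ϕ(1) = 0. The distributions P and Q below are illustrative only, and the paper's sharp upper bound itself is not reproduced here; the sketch merely shows how familiar divergences arise from particular choices of ϕ, assuming the likelihood ratio pᵢ/qᵢ is bounded (here, qᵢ > 0 throughout).

```python
import math

def phi_divergence(p, q, phi):
    """Discrete phi-divergence D_phi(P||Q) = sum_i q_i * phi(p_i / q_i).

    Assumes q_i > 0 wherever p_i > 0, so the likelihood ratio is bounded.
    """
    return sum(qi * phi(pi / qi) for pi, qi in zip(p, q))

# Classical choices of phi (each convex with phi(1) = 0):
kl   = lambda t: t * math.log(t) if t > 0 else 0.0  # Kullback-Leibler
tv   = lambda t: 0.5 * abs(t - 1)                   # total variation
chi2 = lambda t: (t - 1) ** 2                       # Pearson chi-square

# Illustrative distributions (not from the paper):
P = [0.2, 0.3, 0.5]
Q = [0.25, 0.25, 0.5]

print(phi_divergence(P, Q, kl))
print(phi_divergence(P, Q, tv))
print(phi_divergence(P, Q, chi2))
```

Each choice of ϕ recovers one of the traditional separation measures the note mentions; all three are zero exactly when P = Q.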
DOI:
10.4064/am-24-4-415-423
Keywords:
information measures, multiple decisions, convexity, likelihood ratio
Author affiliations:
Andrew Rukhin 1
@article{10_4064_am_24_4_415_423,
author = {Andrew Rukhin},
title = {Information-type divergence when the likelihood ratios are bounded},
journal = {Applicationes Mathematicae},
pages = {415--423},
year = {1997},
volume = {24},
number = {4},
doi = {10.4064/am-24-4-415-423},
zbl = {0893.60008},
language = {en},
url = {http://geodesic.mathdoc.fr/articles/10.4064/am-24-4-415-423/}
}
TY - JOUR
AU - Andrew Rukhin
TI - Information-type divergence when the likelihood ratios are bounded
JO - Applicationes Mathematicae
PY - 1997
SP - 415
EP - 423
VL - 24
IS - 4
UR - http://geodesic.mathdoc.fr/articles/10.4064/am-24-4-415-423/
DO - 10.4064/am-24-4-415-423
LA - en
ID - 10_4064_am_24_4_415_423
ER -
Andrew Rukhin. Information-type divergence when the likelihood ratios are bounded. Applicationes Mathematicae, Tome 24 (1997) no. 4, pp. 415-423. doi: 10.4064/am-24-4-415-423