On the Jensen-Shannon divergence and the variation distance for categorical probability distributions
Kybernetika, Volume 57 (2021) no. 6, pp. 879-907
This article was harvested from the Czech Digital Mathematics Library.
We establish a decomposition of the Jensen-Shannon divergence into a linear combination of a scaled Jeffreys' divergence and a reversed Jensen-Shannon divergence. Upper and lower bounds for the Jensen-Shannon divergence are then found in terms of the squared (total) variation distance. The derivations rely upon the Pinsker inequality and the reverse Pinsker inequality. We use these bounds to prove the asymptotic equivalence of the maximum likelihood estimate and minimum Jensen-Shannon divergence estimate as well as the asymptotic consistency of the minimum Jensen-Shannon divergence estimate. These are key properties for likelihood-free simulator-based inference.
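As a numerical illustration of the quantities in the abstract (not code from the paper itself), the following sketch computes the Jensen-Shannon divergence and the (total) variation distance for two categorical distributions under the standard definitions with natural logarithms, and checks the Pinsker-type lower bound JSD ≥ V²/2, which follows from applying Pinsker's inequality to each KL term since the variation distance from either distribution to the midpoint is half the variation distance between them. The helper names (`kl`, `jsd`, `tv`) are hypothetical, chosen for this sketch.

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in nats; assumes q > 0 wherever p > 0."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def jsd(p, q):
    """Jensen-Shannon divergence: average KL divergence to the midpoint m = (p+q)/2."""
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def tv(p, q):
    """(Total) variation distance: half the L1 distance between the distributions."""
    return 0.5 * float(np.sum(np.abs(p - q)))

# Two categorical distributions on four categories.
p = np.array([0.4, 0.3, 0.2, 0.1])
q = np.array([0.1, 0.2, 0.3, 0.4])

d = tv(p, q)
# Pinsker gives D(p||m) >= 2*tv(p,m)^2, and tv(p,m) = tv(p,q)/2,
# so JSD(p,q) >= tv(p,q)^2 / 2.
assert jsd(p, q) >= 0.5 * d**2
print(f"JSD = {jsd(p, q):.4f}, TV = {d:.4f}, lower bound TV^2/2 = {0.5 * d**2:.4f}")
```

The matching upper bound discussed in the abstract comes from a reverse Pinsker inequality and, unlike the lower bound above, depends on the smallest probabilities of the distributions involved.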
DOI: 10.14736/kyb-2021-6-0879
Classification: 62B10, 62H05, 94A17
Keywords: blended divergences; Chan-Darwiche metric; likelihood-free inference; implicit maximum likelihood; reverse Pinsker inequality; simulator-based inference
@article{10_14736_kyb_2021_6_0879,
author = {Corander, Jukka and Remes, Ulpu and Koski, Timo},
title = {On the {Jensen-Shannon} divergence and the variation distance for categorical probability distributions},
journal = {Kybernetika},
pages = {879--907},
year = {2021},
volume = {57},
number = {6},
doi = {10.14736/kyb-2021-6-0879},
mrnumber = {4376866},
zbl = {07478645},
language = {en},
url = {http://geodesic.mathdoc.fr/articles/10.14736/kyb-2021-6-0879/}
}
TY - JOUR
AU - Corander, Jukka
AU - Remes, Ulpu
AU - Koski, Timo
TI - On the Jensen-Shannon divergence and the variation distance for categorical probability distributions
JO - Kybernetika
PY - 2021
SP - 879
EP - 907
VL - 57
IS - 6
UR - http://geodesic.mathdoc.fr/articles/10.14736/kyb-2021-6-0879/
DO - 10.14736/kyb-2021-6-0879
LA - en
ID - 10_14736_kyb_2021_6_0879
ER -
%0 Journal Article
%A Corander, Jukka
%A Remes, Ulpu
%A Koski, Timo
%T On the Jensen-Shannon divergence and the variation distance for categorical probability distributions
%J Kybernetika
%D 2021
%P 879-907
%V 57
%N 6
%U http://geodesic.mathdoc.fr/articles/10.14736/kyb-2021-6-0879/
%R 10.14736/kyb-2021-6-0879
%G en
%F 10_14736_kyb_2021_6_0879
Corander, Jukka; Remes, Ulpu; Koski, Timo. On the Jensen-Shannon divergence and the variation distance for categorical probability distributions. Kybernetika, Volume 57 (2021) no. 6, pp. 879-907. doi: 10.14736/kyb-2021-6-0879