Optimality conditions for maximizers of the information divergence from an exponential family
Kybernetika, Volume 43 (2007), no. 5, pp. 731-746
See the article record at its source, the Czech Digital Mathematics Library
The information divergence of a probability measure $P$ from an exponential family $\mathcal{E}$ over a finite set is defined as the infimum of the divergences of $P$ from $Q$ subject to $Q\in \mathcal{E}$. All directional derivatives of the divergence from $\mathcal{E}$ are found explicitly. To this end, the behaviour of the conjugate of a log-Laplace transform on the boundary of its domain is analysed. First-order conditions for $P$ to be a maximizer of the divergence from $\mathcal{E}$ are presented, including new ones for the case when $P$ is not projectable to $\mathcal{E}$.
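To make the definition concrete, here is a minimal numeric sketch (not from the paper) for a one-parameter exponential family $Q_\theta(x)\propto e^{\theta x}$ on the finite set $\{0,1,2,3\}$; the function names and the trisection search are illustrative choices. Since $D(P\Vert Q_\theta)=-\theta\,\mathbb{E}_P[x]+\log Z(\theta)+\mathrm{const}$ is convex in $\theta$, a simple bracketing search finds the infimum:

```python
import math

def kl(p, q):
    # Kullback-Leibler divergence D(P||Q) over a finite set
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def q_theta(theta, xs):
    # Member of the exponential family with sufficient statistic f(x) = x
    w = [math.exp(theta * x) for x in xs]
    z = sum(w)  # partition function Z(theta)
    return [wi / z for wi in w]

def divergence_from_family(p, xs, lo=-10.0, hi=10.0, iters=200):
    # D(P||E) = inf over theta of D(P||Q_theta); the objective is convex
    # in theta, so a trisection search on [lo, hi] converges to the infimum.
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if kl(p, q_theta(m1, xs)) < kl(p, q_theta(m2, xs)):
            hi = m2
        else:
            lo = m1
    theta = (lo + hi) / 2
    return theta, kl(p, q_theta(theta, xs))
```

At the minimizer, the stationarity condition $\mathbb{E}_{Q_\theta}[x]=\mathbb{E}_P[x]$ holds (the classical mean-matching property of the reverse information projection), which is the simplest instance of the first-order conditions the paper generalizes.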
Classification:
52A20, 60A10, 62B10, 90C90, 94A17
Keywords: Kullback–Leibler divergence; relative entropy; exponential family; information projection; log-Laplace transform; cumulant generating function; directional derivatives; first order optimality conditions; convex functions; polytopes
@article{KYB_2007__43_5_a9,
author = {Mat\'u\v{s}, Franti\v{s}ek},
title = {Optimality conditions for maximizers of the information divergence from an exponential family},
journal = {Kybernetika},
pages = {731--746},
publisher = {mathdoc},
volume = {43},
number = {5},
year = {2007},
mrnumber = {2376334},
zbl = {1149.94007},
language = {en},
url = {http://geodesic.mathdoc.fr/item/KYB_2007__43_5_a9/}
}
Matúš, František. Optimality conditions for maximizers of the information divergence from an exponential family. Kybernetika, Tome 43 (2007) no. 5, pp. 731-746. http://geodesic.mathdoc.fr/item/KYB_2007__43_5_a9/