Maximizing multi–information
Kybernetika, Volume 42 (2006), no. 5, pp. 517–538. This article was harvested from the Czech Digital Mathematics Library.


Stochastic interdependence of a probability distribution on a product space is measured by its Kullback–Leibler distance from the exponential family of product distributions (called multi-information). Here we investigate low-dimensional exponential families that contain the maximizers of stochastic interdependence in their closure. Based on a detailed description of the structure of probability distributions with globally maximal multi-information, we obtain our main result: the exponential family of pure pair-interactions contains all global maximizers of the multi-information in its closure.
Classification: 60B10, 82C32, 92B20, 94A15
Keywords: multi-information; exponential family; relative entropy; pair-interaction; infomax principle; Boltzmann machine; neural networks
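The abstract measures stochastic interdependence as the Kullback–Leibler distance of a joint distribution from the exponential family of product distributions; this divergence is minimized at the product of the marginals, so multi-information equals the sum of the marginal entropies minus the joint entropy. A minimal Python sketch of that computation (illustrative only, not code from the paper; the example distribution is the uniform measure on the two constant configurations of three binary units, which is a global maximizer with multi-information 2 log 2):

```python
import math

def multi_information(p):
    """Multi-information of a joint distribution p on a finite product space.

    p: dict mapping outcome tuples (x_1, ..., x_n) to probabilities.
    Returns sum_i H(X_i) - H(X_1, ..., X_n) in nats, which equals the
    Kullback-Leibler divergence of p from the product of its marginals.
    """
    n = len(next(iter(p)))
    # Accumulate the n one-dimensional marginal distributions.
    marginals = [{} for _ in range(n)]
    for x, px in p.items():
        for i in range(n):
            marginals[i][x[i]] = marginals[i].get(x[i], 0.0) + px

    def entropy(q):
        # Shannon entropy in nats, ignoring zero-probability outcomes.
        return -sum(v * math.log(v) for v in q.values() if v > 0)

    return sum(entropy(m) for m in marginals) - entropy(p)

# Uniform distribution on {(0,0,0), (1,1,1)}: each marginal is uniform on
# {0,1}, so I = 3*log(2) - log(2) = 2*log(2).
p = {(0, 0, 0): 0.5, (1, 1, 1): 0.5}
print(multi_information(p))  # 2*log(2) ~ 1.3863
```

For an independent (product) distribution the value is zero, consistent with multi-information vanishing exactly on the closure of the product family.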
@article{KYB_2006_42_5_a0,
     author = {Ay, Nihat and Knauf, Andreas},
     title = {Maximizing multi{\textendash}information},
     journal = {Kybernetika},
     pages = {517--538},
     year = {2006},
     volume = {42},
     number = {5},
     mrnumber = {2283503},
     zbl = {1249.82011},
     language = {en},
     url = {http://geodesic.mathdoc.fr/item/KYB_2006_42_5_a0/}
}
TY  - JOUR
AU  - Ay, Nihat
AU  - Knauf, Andreas
TI  - Maximizing multi–information
JO  - Kybernetika
PY  - 2006
SP  - 517
EP  - 538
VL  - 42
IS  - 5
UR  - http://geodesic.mathdoc.fr/item/KYB_2006_42_5_a0/
LA  - en
ID  - KYB_2006_42_5_a0
ER  - 
%0 Journal Article
%A Ay, Nihat
%A Knauf, Andreas
%T Maximizing multi–information
%J Kybernetika
%D 2006
%P 517-538
%V 42
%N 5
%U http://geodesic.mathdoc.fr/item/KYB_2006_42_5_a0/
%G en
%F KYB_2006_42_5_a0
Ay, Nihat; Knauf, Andreas. Maximizing multi–information. Kybernetika, Tome 42 (2006) no. 5, pp. 517-538. http://geodesic.mathdoc.fr/item/KYB_2006_42_5_a0/

[1] Aarts E., Korst J.: Simulated Annealing and Boltzmann Machines. Wiley, New York 1989

[2] Ackley D. H., Hinton G. E., Sejnowski T. J.: A learning algorithm for Boltzmann machines. Cognitive Science 9 (1985), 147–169

[3] Aigner M.: Combinatorial Theory. Classics in Mathematics, Springer–Verlag, Berlin 1997

[4] Amari S.: Information geometry on hierarchy of probability distributions. IEEE Trans. Inform. Theory 47 (2001), 1701–1711

[5] Amari S., Kurata K., Nagaoka H.: Information geometry of Boltzmann machines. IEEE Trans. Neural Networks 3 (1992), 2, 260–271

[6] Ay N.: An information-geometric approach to a theory of pragmatic structuring. Ann. Probab. 30 (2002), 416–436

[7] Ay N.: Locality of global stochastic interaction in directed acyclic networks. Neural Computation 14 (2002), 2959–2980

[8] Linsker R.: Self-organization in a perceptual network. IEEE Computer 21 (1988), 105–117

[9] Matúš F., Ay N.: On maximization of the information divergence from an exponential family. In: Proc. WUPES’03 (J. Vejnarová, ed.), University of Economics, Prague 2003, pp. 199–204

[10] Shannon C. E.: A mathematical theory of communication. Bell System Tech. J. 27 (1948), 379–423, 623–656

[11] Tononi G., Sporns O., Edelman G. M.: A measure for brain complexity: Relating functional segregation and integration in the nervous system. Proc. Nat. Acad. Sci. U. S. A. 91 (1994), 5033–5037