Tropical probability theory and an application to the entropic cone
Kybernetika, Volume 56 (2020) no. 6, pp. 1133-1153
This article was harvested from the Czech Digital Mathematics Library.


In a series of articles, we have been developing a theory of tropical diagrams of probability spaces, expecting it to be useful for information optimization problems in information theory and artificial intelligence. In this article, we give a summary of our work so far and apply the theory to derive a dimension-reduction statement about the shape of the entropic cone.
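The entropic cone studied in the article is the closure of the set of entropy vectors: for n random variables, the vector listing the Shannon entropy of every nonempty subset. As a minimal illustration (not taken from the paper), the sketch below computes such an entropy vector for three binary variables and checks one Shannon inequality, submodularity; the joint distribution and function names are chosen for this example only.

```python
import itertools
import math


def entropy(joint, subset):
    """Shannon entropy (in bits) of the marginal of `joint` on the
    coordinates in `subset`. `joint` maps outcome tuples to probabilities."""
    marginal = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in subset)
        marginal[key] = marginal.get(key, 0.0) + p
    return -sum(p * math.log2(p) for p in marginal.values() if p > 0)


def entropic_vector(joint, n):
    """Entropy of every nonempty subset of the n variables,
    i.e. a point of the entropic region for n variables."""
    return {S: entropy(joint, S)
            for k in range(1, n + 1)
            for S in itertools.combinations(range(n), k)}


# Two fair bits X, Y and their XOR Z: pairwise independent, jointly dependent.
joint = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}
h = entropic_vector(joint, 3)

# Submodularity, a Shannon-type inequality: H(XY) + H(YZ) >= H(XYZ) + H(Y).
assert h[(0, 1)] + h[(1, 2)] >= h[(0, 1, 2)] + h[(1,)] - 1e-9
```

Non-Shannon inequalities, such as the Zhang-Yeung inequality cited in the references, are constraints on these vectors for n >= 4 that do not follow from submodularity and its consequences.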
DOI : 10.14736/kyb-2020-6-1133
Classification : 94A17, 94A24
Keywords: tropical probability; entropic cone; non-Shannon inequality
@article{10_14736_kyb_2020_6_1133,
     author = {Matveev, Rostislav and Portegies, Jacobus W.},
     title = {Tropical probability theory and an application to the entropic cone},
     journal = {Kybernetika},
     pages = {1133--1153},
     year = {2020},
     volume = {56},
     number = {6},
     doi = {10.14736/kyb-2020-6-1133},
     mrnumber = {4199907},
     language = {en},
     url = {http://geodesic.mathdoc.fr/articles/10.14736/kyb-2020-6-1133/}
}
TY  - JOUR
AU  - Matveev, Rostislav
AU  - Portegies, Jacobus W.
TI  - Tropical probability theory and an application to the entropic cone
JO  - Kybernetika
PY  - 2020
SP  - 1133
EP  - 1153
VL  - 56
IS  - 6
UR  - http://geodesic.mathdoc.fr/articles/10.14736/kyb-2020-6-1133/
DO  - 10.14736/kyb-2020-6-1133
LA  - en
ID  - 10_14736_kyb_2020_6_1133
ER  - 
%0 Journal Article
%A Matveev, Rostislav
%A Portegies, Jacobus W.
%T Tropical probability theory and an application to the entropic cone
%J Kybernetika
%D 2020
%P 1133-1153
%V 56
%N 6
%U http://geodesic.mathdoc.fr/articles/10.14736/kyb-2020-6-1133/
%R 10.14736/kyb-2020-6-1133
%G en
%F 10_14736_kyb_2020_6_1133
Matveev, Rostislav; Portegies, Jacobus W. Tropical probability theory and an application to the entropic cone. Kybernetika, Volume 56 (2020) no. 6, pp. 1133-1153. doi: 10.14736/kyb-2020-6-1133

[1] Ahlswede, R., Körner, J.: On common information and related characteristics of correlated information sources. Preprint, 7th Prague Conference on Information Theory, 1974. | MR

[2] Ahlswede, R., Körner, J.: On common information and related characteristics of correlated information sources. In: General Theory of Information Transfer and Combinatorics (R. Ahlswede et al., eds.), Lecture Notes in Computer Science 4123, Springer, Berlin, Heidelberg, 2006. | MR

[3] Bertschinger, N., Rauh, J., Olbrich, E., Jost, J., Ay, N.: Quantifying unique information. Entropy 16 (2014), 4, 2161-2183. | DOI | MR

[4] Chan, T. H., Yeung, R. W.: On a relation between information inequalities and group theory. IEEE Trans. Inform. Theory 48 (2002), 7, 1992-1995. | DOI | MR

[5] Dougherty, R., Freiling, Ch., Zeger, K.: Six new non-Shannon information inequalities. In: 2006 IEEE International Symposium on Information Theory, IEEE, 2006, pp. 233-236. | DOI | MR

[6] Dougherty, R., Freiling, Ch., Zeger, K.: Non-Shannon information inequalities in four random variables. arXiv preprint arXiv:1104.3602, 2011. | MR

[7] Gromov, M.: In a search for a structure, part 1: On entropy.

[8] Kovačević, M., Stanojević, I., Šenk, V.: On the hardness of entropy minimization and related problems. In: 2012 IEEE Information Theory Workshop, IEEE, 2012, pp. 512-516. | DOI

[9] Leinster, T.: Basic Category Theory. Cambridge Studies in Advanced Mathematics 143, Cambridge University Press, Cambridge 2014. | MR

[10] Matúš, F.: Probabilistic conditional independence structures and matroid theory: background. Int. J. General Systems 22 (1993), 2, 185-196. | DOI

[11] Matúš, F.: Two constructions on limits of entropy functions. IEEE Trans. Inform. Theory 53 (2006), 1, 320-330. | DOI | MR

[12] Matúš, F.: Infinitely many information inequalities. In: IEEE International Symposium on Information Theory, ISIT 2007, IEEE, pp. 41-44. | DOI

[13] Matúš, F., Csirmaz, L.: Entropy region and convolution. IEEE Trans. Inform. Theory 62 (2016), 11, 6007-6018. | DOI | MR

[14] Matúš, F., Studený, M.: Conditional independences among four random variables I. Combinat. Probab. Comput. 4 (1995), 3, 269-278. | DOI | MR

[15] Makarychev, K., Makarychev, Y., Romashchenko, A., Vereshchagin, N.: A new class of non-Shannon-type inequalities for entropies. Comm. Inform. Syst. 2 (2002), 2, 147-166. | DOI | MR

[16] Matveev, R., Portegies, J. W.: Asymptotic dependency structure of multiple signals. Inform. Geometry 1 (2018), 2, 237-285. | DOI | MR

[17] Matveev, R., Portegies, J. W.: Arrow Contraction and Expansion in Tropical Diagrams. arXiv e-prints, page arXiv:1905.05597, 2019.

[18] Matveev, R., Portegies, J. W.: Conditioning in tropical probability theory. arXiv e-prints, page arXiv:1905.05596, 2019.

[19] Matveev, R., Portegies, J. W.: Tropical diagrams of probability spaces. arXiv e-prints, page arXiv:1905.04375, 2019. | MR

[20] Slepian, D., Wolf, J.: Noiseless coding of correlated information sources. IEEE Trans. Inform. Theory 19 (1973), 4, 471-480. | DOI | MR

[21] Vidyasagar, M.: A metric between probability distributions on finite sets of different cardinalities and applications to order reduction. IEEE Trans. Automat. Control 57 (2012), 10, 2464-2477. | DOI | MR

[22] Wyner, A.: The common information of two dependent random variables. IEEE Trans. Inform. Theory 21 (1975), 2, 163-179. | DOI | MR

[23] Yeung, R. W.: Information Theory and Network Coding. Springer Science and Business Media, 2008. | DOI

[24] Zhang, Z., Yeung, R. W.: A non-Shannon-type conditional inequality of information quantities. IEEE Trans. Inform. Theory 43 (1997), 6, 1982-1986. | DOI | MR

[25] Zhang, Z., Yeung, R. W.: On characterization of entropy function via information inequalities. IEEE Trans. Inform. Theory 44 (1998), 4, 1440-1452. | DOI | MR
