On typical encodings of multivariate ergodic sources
Kybernetika, Volume 56 (2020) no. 6, pp. 1090-1110
This article was harvested from the Czech Digital Mathematics Library.


We show that the typical coordinate-wise encoding of a multivariate ergodic source into prescribed alphabets has an entropy profile close to the convolution of the entropy profile of the source with the modular polymatroid determined by the cardinalities of the output alphabets. The proportion of exceptional encodings, those not close to the convolution, goes to zero doubly exponentially. The result holds for a class of multivariate sources that satisfy an asymptotic equipartition property described via the mean fluctuation of the information functions. This class covers asymptotically mean stationary processes with ergodic mean, ergodic processes, and irreducible Markov chains with an arbitrary initial distribution. We also prove that typical encodings yield the asymptotic equipartition property for the output variables. These asymptotic results rest on an explicit lower bound on the proportion of encodings that transform a multivariate random variable into a variable with an entropy profile close to the suitable convolution.
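The abstract's central objects, the entropy profile of a multivariate variable and its convolution with a modular polymatroid, can be illustrated on a small finite example. The sketch below is illustrative only and not taken from the paper: it assumes the standard polymatroid convolution (f * g)(A) = min over B ⊆ A of f(B) + g(A \ B), logarithms to base 2, and function names of my own choosing.

```python
import math
from itertools import combinations

def entropy(p):
    """Shannon entropy (bits) of a distribution given as a dict value -> prob."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def entropy_profile(joint, n):
    """Entropy profile h(A) = H(X_A) over all subsets A of coordinates {0..n-1}.
    `joint` maps n-tuples of outcomes to probabilities."""
    profile = {}
    for r in range(n + 1):
        for A in combinations(range(n), r):
            marginal = {}
            for outcome, prob in joint.items():
                key = tuple(outcome[i] for i in A)
                marginal[key] = marginal.get(key, 0.0) + prob
            profile[A] = entropy(marginal)
    return profile

def modular_profile(cards):
    """Modular polymatroid m(A) = sum over i in A of log2 |B_i|,
    where cards[i] is the cardinality of the i-th output alphabet."""
    n = len(cards)
    return {A: sum(math.log2(cards[i]) for i in A)
            for r in range(n + 1) for A in combinations(range(n), r)}

def convolution(f, g):
    """Polymatroid convolution (f * g)(A) = min_{B subset of A} f(B) + g(A \\ B)."""
    return {A: min(f[B] + g[tuple(i for i in A if i not in B)]
                   for r in range(len(A) + 1) for B in combinations(A, r))
            for A in f}
```

For two independent fair bits encoded into binary alphabets, the entropy profile and the modular polymatroid coincide, and their convolution returns the same profile; the interesting discrepancies appear once the output alphabets are smaller than the effective support of the source.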
DOI : 10.14736/kyb-2020-6-1090
Classification : 94A24, 94A29
Keywords: entropy; entropy rate; multivariate source; ergodic source; a.e.p. property
@article{10_14736_kyb_2020_6_1090,
     author = {Kupsa, Michal},
     title = {On typical encodings of multivariate ergodic sources},
     journal = {Kybernetika},
     pages = {1090--1110},
     year = {2020},
     volume = {56},
     number = {6},
     doi = {10.14736/kyb-2020-6-1090},
     mrnumber = {4199905},
     language = {en},
     url = {http://geodesic.mathdoc.fr/articles/10.14736/kyb-2020-6-1090/}
}
TY  - JOUR
AU  - Kupsa, Michal
TI  - On typical encodings of multivariate ergodic sources
JO  - Kybernetika
PY  - 2020
SP  - 1090
EP  - 1110
VL  - 56
IS  - 6
UR  - http://geodesic.mathdoc.fr/articles/10.14736/kyb-2020-6-1090/
DO  - 10.14736/kyb-2020-6-1090
LA  - en
ID  - 10_14736_kyb_2020_6_1090
ER  - 
%0 Journal Article
%A Kupsa, Michal
%T On typical encodings of multivariate ergodic sources
%J Kybernetika
%D 2020
%P 1090-1110
%V 56
%N 6
%U http://geodesic.mathdoc.fr/articles/10.14736/kyb-2020-6-1090/
%R 10.14736/kyb-2020-6-1090
%G en
%F 10_14736_kyb_2020_6_1090
Kupsa, Michal. On typical encodings of multivariate ergodic sources. Kybernetika, Volume 56 (2020) no. 6, pp. 1090-1110. doi: 10.14736/kyb-2020-6-1090

