[1] M. Belis, S. Guiasu, “A quantitative-qualitative measure of information in cybernetic systems”, IEEE Trans. Inf. Theory, 14 (1968), 593–594 | DOI
[2] A. Clim, “Weighted entropy with application”, Analele Universitatii Bucuresti Matematica, LVII (2008), 223–231 | MR
[3] T. M. Cover, J. A. Thomas, Elements of information theory, Wiley, Hoboken, NJ, 2006
[4] R. L. Dobrushin, “Passage to the limit under the information and entropy signs”, Theory Prob. Appl., 1960, 29–37 | Zbl
[5] I. S. Gradshteyn, I. M. Ryzhik, Table of integrals, series, and products, Elsevier, 2007 | MR
[6] M. Kelbert, Yu. Suhov, “Continuity of mutual entropy in the large signal-to-noise ratio limit”, Stochastic Analysis, Springer, Berlin, 2010, 281–299 | MR
[7] M. Kelbert, Yu. Suhov, Information theory and coding by example, Cambridge University Press, Cambridge, 2013 | MR | Zbl
[8] M. Kelbert, P. Mozgunov, “Shannon's differential entropy asymptotic analysis in a Bayesian problem”, Mathematical Communications, 20:2 (2015), 219–228 | MR
[9] M. Kelbert, P. Mozgunov, Asymptotic behaviour of weighted differential entropies in a Bayesian problem, 2015, arXiv: 1504.01612
[10] M. Kelbert, Yu. Suhov, S. Yasaei Sekeh, Entropy-power inequality for weighted entropy, 2015, arXiv: 1502.02188
[11] M. Kelbert, Yu. Suhov, I. Stuhl, S. Yasaei Sekeh, Basic inequalities for weighted entropies, 2015, arXiv: 1510.02184