An information-theoretic central limit theorem for finitely susceptible FKG systems
Teoriâ veroâtnostej i ee primeneniâ, Volume 50 (2005) no. 2, pp. 331–343. This article was harvested from the source Math-Net.Ru.


We adapt arguments concerning entropy-theoretic convergence from the independent case to the case of Fortuin–Kasteleyn–Ginibre (FKG) random variables. FKG systems are chosen since their dependence structure is controlled through covariance alone, though many of the same arguments apply to weakly dependent random variables. As in previous work of Barron and Johnson, we consider random variables perturbed by small normals, since the FKG property gives control of the resulting densities. We need to impose a finite susceptibility condition; that is, the covariance between one random variable and the sum of all the random variables should remain finite.
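The two hypotheses in the abstract can be written down explicitly. The following is an illustrative sketch, not the paper's own notation: we assume a stationary FKG sequence $(X_i)$, independent standard normals $(Z_i)$, and a smoothing parameter $\tau > 0$.

```latex
% Finite susceptibility: the covariance of one variable with the
% whole system stays bounded (illustrative notation).
\[
  \chi \;=\; \sum_{j=1}^{\infty} \operatorname{Cov}(X_1, X_j) \;<\; \infty .
\]
% Perturbation by small normals: the smoothed variables
\[
  Y_i^{(\tau)} \;=\; X_i + \tau Z_i
\]
% have densities controlled via the FKG property, which allows
% the entropy-theoretic arguments of the independent case to be
% applied to the normalized partial sums S_n / \sqrt{\operatorname{Var} S_n}.
```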
Keywords: entropy, Fisher information, normal convergence, FKG variables.
@article{TVP_2005_50_2_a6,
     author = {O. Johnson},
     title = {An information-theoretic central limit theorem for finitely susceptible {FKG} systems},
     journal = {Teori\^a vero\^atnostej i ee primeneni\^a},
     pages = {331--343},
     year = {2005},
     volume = {50},
     number = {2},
     language = {en},
     url = {http://geodesic.mathdoc.fr/item/TVP_2005_50_2_a6/}
}

[1] Barron A. R., “Entropy and the central limit theorem”, Ann. Probab., 14:1 (1986), 336–342 | DOI | MR | Zbl

[2] Brown L. D., “A proof of the central limit theorem motivated by the Cramér–Rao inequality”, Statistics and Probability, Essays in Honour of C. R. Rao, eds. G. Kallianpur, P. R. Krishnaiah, J. K. Ghosh, North-Holland, Amsterdam, New York, 1982, 141–148 | MR

[3] Carlen E. A., Soffer A., “Entropy production by block variable summation and central limit theorems”, Commun. Math. Phys., 140:2 (1991), 339–371 | DOI | MR | Zbl

[4] Gnedenko B. V., Korolev V. Y., Random Summation. Limit Theorems and Applications, CRC Press, Boca Raton, 1996, 267 pp. | MR | Zbl

[5] Grimmett G. R., Percolation, Springer-Verlag, Berlin, 1999, 444 pp. | MR | Zbl

[6] Johnson O. T., “Entropy inequalities and the central limit theorem”, Stochastic Process. Appl., 88:2 (2000), 291–304 | DOI | MR | Zbl

[7] Johnson O. T., Barron A. R., “Fisher information inequalities and the central limit theorem”, Probab. Theory Related Fields, 129:3 (2004), 391–409 | DOI | MR | Zbl

[8] Johnson O. T., Suhov Y. M., “Entropy and random vectors”, J. Statist. Phys., 104:1–2 (2001), 145–165 | DOI | MR

[9] Lebowitz J. L., “GHS and other inequalities”, Comm. Math. Phys., 35 (1974), 87–92 | DOI | MR

[10] Lehmann E. L., “Some concepts of dependence”, Ann. Math. Statist., 37 (1966), 1137–1153 | DOI | MR | Zbl

[11] Linnik Yu. V., “An information-theoretic proof of the central limit theorem with the Lindeberg condition”, Teoriya veroyatn. i ee primen., 4:3 (1959), 311–321 | MR

[12] Newman C. M., “Moment inequalities for ferromagnetic Gibbs distributions”, J. Math. Phys., 16:9 (1975), 1956–1959 | DOI | MR

[13] Newman C. M., “Normal fluctuations and the FKG inequalities”, Comm. Math. Phys., 74:2 (1980), 119–128 | DOI | MR | Zbl

[14] Shimizu R., “On Fisher's amount of information for location family”, A Modern Course on Statistical Distributions in Scientific Work, v. 3, Characterizations and Applications, eds. G. P. Patil, S. Kotz, and J. K. Ord, Reidel, Dordrecht, 1975, 305–312

[15] Takano S., “The inequalities of Fisher information and entropy power for dependent variables”, Proceedings of the 7th Japan–Russia Symposium on Probability Theory and Mathematical Statistics (Tokyo, 1995), eds. S. Watanabe et al., World Scientific, River Edge, 1996, 460–470 | MR | Zbl

[16] Takano S., “Entropy and a limit theorem for some dependent variables”, Prague Stochastics'98, v. 2, Union of Czech Mathematicians and Physicists, Prague, 1998, 549–552