Divergence between various estimates of quantized information sources
Kybernetika, Volume 32 (1996) no. 4, pp. 395-407. This article was harvested from the Czech Digital Mathematics Library.

Classification: 62B10, 81P68, 94A29
@article{KYB_1996_32_4_a5,
     author = {Morales, Domingo and Pardo, Leandro and Vajda, Igor},
     title = {Divergence between various estimates of quantized information sources},
     journal = {Kybernetika},
     pages = {395--407},
     year = {1996},
     volume = {32},
     number = {4},
     mrnumber = {1420131},
     zbl = {0930.94014},
     language = {en},
     url = {http://geodesic.mathdoc.fr/item/KYB_1996_32_4_a5/}
}
TY  - JOUR
AU  - Morales, Domingo
AU  - Pardo, Leandro
AU  - Vajda, Igor
TI  - Divergence between various estimates of quantized information sources
JO  - Kybernetika
PY  - 1996
SP  - 395
EP  - 407
VL  - 32
IS  - 4
UR  - http://geodesic.mathdoc.fr/item/KYB_1996_32_4_a5/
LA  - en
ID  - KYB_1996_32_4_a5
ER  - 
%0 Journal Article
%A Morales, Domingo
%A Pardo, Leandro
%A Vajda, Igor
%T Divergence between various estimates of quantized information sources
%J Kybernetika
%D 1996
%P 395-407
%V 32
%N 4
%U http://geodesic.mathdoc.fr/item/KYB_1996_32_4_a5/
%G en
%F KYB_1996_32_4_a5
Morales, Domingo; Pardo, Leandro; Vajda, Igor. Divergence between various estimates of quantized information sources. Kybernetika, Volume 32 (1996) no. 4, pp. 395-407. http://geodesic.mathdoc.fr/item/KYB_1996_32_4_a5/

[1] A. Barron, L. Györfi, E. van der Meulen: Distribution estimation consistent in total variation and in two types of information divergence. IEEE Trans. Inform. Theory 38 (1992), 1437-1454. | MR

[2] T. Berger: Rate Distortion Theory: A Mathematical Basis for Data Compression. Prentice-Hall, Englewood Cliffs, NJ 1971. | MR

[3] H. Chernoff, E. L. Lehmann: The use of maximum likelihood estimates in $\chi^2$ tests of goodness of fit. Ann. Math. Statist. 25 (1954), 579-586. | MR | Zbl

[4] B. S. Clarke, A. R. Barron: Information-theoretic asymptotics of Bayes methods. IEEE Trans. Inform. Theory 36 (1990), 453-471. | MR

[5] T. M. Cover, J. A. Thomas: Elements of Information Theory. Wiley, New York 1991. | MR | Zbl

[6] N. Cressie, T. R. C. Read: Multinomial goodness-of-fit tests. J. Roy. Statist. Soc. Ser. B 46 (1984), 440-464. | MR | Zbl

[7] I. Csiszár: Information-type measures of difference of probability distributions and indirect observations. Studia Sci. Math. Hungar. 2 (1967), 299-318. | MR

[8] I. Csiszár: Generalized cutoff rates and Rényi's information measures. IEEE Trans. Inform. Theory 41 (1995), 26-34. | MR | Zbl

[9] R. C. Dahiya, J. Gurland: Pearson chi-squared test of fit with random intervals. Biometrika 59 (1972), 147-153. | MR | Zbl

[10] A. Gersho, R. M. Gray: Vector Quantization and Signal Compression. Kluwer, Boston 1991.

[11] L. Györfi, I. Vajda, E. van der Meulen: Minimum Hellinger distance point estimates consistent under weak family regularity. Math. Methods Statist. 3 (1994), 25-45. | MR

[12] L. Györfi, I. Vajda, E. van der Meulen: Parameter estimation by projecting on structural families. In: Proc. 5th Prague Symp. on Asympt. Statistics (P. Mandl and M. Hušková, eds.), Physica Verlag, Wien 1994, pp. 261-272. | MR

[13] W. C. M. Kallenberg, J. Oosterhoff, B. F. Schriever: The number of classes in chi-squared goodness-of-fit tests. J. Amer. Statist. Assoc. 80 (1985), 959-968. | MR

[14] M. Menéndez, D. Morales, L. Pardo, I. Vajda: Divergence-based estimation and testing of statistical models of classification. J. Multivariate Anal. 54 (1995), 329-354. | MR

[15] F. Liese, I. Vajda: Convex Statistical Distances. Teubner, Leipzig 1987. | MR | Zbl

[16] D. S. Moore: A chi-squared statistic with random cell boundaries. Ann. Math. Statist. 42 (1971), 147-156. | MR

[17] D. Morales, L. Pardo, I. Vajda: Asymptotic divergence of estimates of discrete distributions. J. Statist. Plann. Inference 49 (1995). | MR

[18] F. Österreicher, I. Vajda: Statistical information and discrimination. IEEE Trans. Inform. Theory 39 (1993), 1036-1039. | MR

[19] F. H. Ruymgaart: A note on chi-square statistics with random cell boundaries. Ann. Statist. 3 (1975), 965-968. | MR | Zbl

[20] M. Teboulle, I. Vajda: Convergence of best $\phi$-entropy estimates. IEEE Trans. Inform. Theory 39 (1993), 297-301. | MR | Zbl

[21] I. Vajda: From perceptron to Boltzmann machine: Information processing by cognitive networks. In: Proc. of the Third European School of System Sciences (I. Figuearas, A. Moncho and R. Torres, eds.), Univ. of Valencia, Valencia 1994, pp. 65-68.

[22] A. Veselý, I. Vajda: Classification of random signals by neural networks. In: Proc. of 14th Internat. Congress of Cybernetics, University of Namur, Namur 1996, in press.