Keywords: divergence measures; information radius; statistical experiment; sufficiency of experiments; Shannon's entropy; comparison of experiments; stochastic transformations; unified scalar parametric generalizations of Jensen difference divergence measure
@article{10_21136_AM_1991_104481,
author = {Taneja, Inder Jeet and Pardo, L. and Morales, D.},
title = {$(R,S)$-information radius of type $t$ and comparison of experiments},
journal = {Applications of Mathematics},
pages = {440--455},
year = {1991},
volume = {36},
number = {6},
doi = {10.21136/AM.1991.104481},
mrnumber = {1134921},
zbl = {0748.62003},
language = {en},
url = {http://geodesic.mathdoc.fr/articles/10.21136/AM.1991.104481/}
}
TY - JOUR
AU - Taneja, Inder Jeet
AU - Pardo, L.
AU - Morales, D.
TI - $(R,S)$-information radius of type $t$ and comparison of experiments
JO - Applications of Mathematics
PY - 1991
SP - 440
EP - 455
VL - 36
IS - 6
UR - http://geodesic.mathdoc.fr/articles/10.21136/AM.1991.104481/
DO - 10.21136/AM.1991.104481
LA - en
ID - 10_21136_AM_1991_104481
ER -
%0 Journal Article
%A Taneja, Inder Jeet
%A Pardo, L.
%A Morales, D.
%T $(R,S)$-information radius of type $t$ and comparison of experiments
%J Applications of Mathematics
%D 1991
%P 440-455
%V 36
%N 6
%U http://geodesic.mathdoc.fr/articles/10.21136/AM.1991.104481/
%R 10.21136/AM.1991.104481
%G en
%F 10_21136_AM_1991_104481
Taneja, Inder Jeet; Pardo, L.; Morales, D. $(R,S)$-information radius of type $t$ and comparison of experiments. Applications of Mathematics, Volume 36 (1991) no. 6, pp. 440-455. doi: 10.21136/AM.1991.104481
[1] Blackwell D. (1951): Comparison of experiments. Proc. 2nd Berkeley Symp. Berkeley: University of California Press, 93-102. | MR
[2] Burbea J. (1984): The Bose-Einstein Entropy of degree $\alpha$ and its Jensen Difference. Utilitas Math. 25, 225-240. | MR
[3] Burbea J., Rao C. R. (1982): Entropy Differential Metric, Distance and Divergence Measures in Probability Spaces: A Unified Approach. J. Multivariate Anal. 12, 575-596. | DOI | MR
[4] Burbea J., Rao C. R. (1982): On the Convexity of some Divergence Measures based on Entropy Functions. IEEE Trans. on Inform. Theory IT-28, 489-495. | MR
[5] Capocelli R. M., Taneja I. J. (1984): Generalized Divergence Measures and Error Bounds. Proc. IEEE Internat. Conf. on Systems, Man and Cybernetics, Oct. 9-12, Halifax, Canada, pp. 43-47.
[6] Campbell L. L. (1986): An extended Čencov characterization of the Information Metric. Proc. Amer. Math. Soc., 98, 135-141. | MR
[7] Čencov N. N. (1982): Statistical Decision Rules and Optimal Inference. Trans. of Math. Monographs, 53, Am. Math. Soc., Providence, R. I. | MR
[8] De Groot M. H. (1970): Optimal Statistical Decisions. McGraw-Hill. New York. | MR
[9] Ferentinos K., Papaioannou T. (1982): Information in experiments and sufficiency. J. Statist. Plann. Inference 6, 309-317. | DOI | MR
[10] Goel P. K., De Groot M. H. (1979): Comparison of experiments and information measures. Ann. Statist. 7, 1066-1077. | DOI | MR
[11] Kullback S., Leibler R. A. (1951): On information and sufficiency. Ann. Math. Statist. 22, 79-86.
[12] Lindley D. V. (1956): On a measure of information provided by an experiment. Ann. Math. Statist. 27, 986-1005. | DOI | MR
[13] Marshall A. W., Olkin I. (1979): Inequalities: Theory of Majorization and its Applications. Academic Press. New York. | MR
[14] Morales D., Taneja I. J., Pardo L.: Comparison of Experiments based on $\phi$-Measures of Jensen Difference. Communicated.
[15] Pardo L., Morales D., Taneja I. J.: $\lambda$-measures of hypoentropy and comparison of experiments: Bayesian approach. To appear in Statistica. | MR | Zbl
[16] Rao C. R. (1982): Diversity and Dissimilarity Coefficients: A Unified Approach. J. Theoret. Pop. Biology, 21, 24-43. | DOI | MR
[17] Rao C. R., Nayak T. K. (1985): Cross Entropy, Dissimilarity Measures and Characterization of Quadratic Entropy. IEEE Trans. on Inform. Theory, IT-31(5), 589-593. | DOI | MR
[18] Sakaguchi M. (1964): Information Theory and Decision Making. Unpublished Lecture Notes, Statist. Dept., George Washington Univ., Washington DC.
[19] Sant'Anna A. P., Taneja I. J.: Trigonometric Entropies, Jensen Difference Divergence Measures and Error Bounds. Information Sciences 25, 145-156. | MR
[20] Shannon C. E. (1948): A Mathematical Theory of Communication. Bell Syst. Tech. J. 27, 379-423. | DOI | MR
[21] Sibson R. (1969): Information Radius. Z. Wahrs. und verw. Geb. 14, 149-160. | DOI | MR
[22] Taneja I. J. (1983): On characterization of J-divergence and its generalizations. J. Combin. Inform. System Sci. 8, 206-212. | MR
[23] Taneja I. J. (1986): $\lambda$-measures of hypoentropy and their applications. Statistica, anno XLVI, n. 4, 465-478. | MR
[24] Taneja I. J. (1986): Unified Measure of Information applied to Markov Chains and Sufficiency. J. Comb. Inform. & Syst. Sci., 11, 99-109. | MR
[25] Taneja I. J. (1987): Statistical aspects of Divergence Measures. J. Statist. Plann. & Inferen., 16, 137-145. | DOI | MR
[26] Taneja I. J. (1989): On Generalized Information Measures and their Applications. Adv. Elect. Phys. 76, 327-413. Academic Press.
[27] Taneja I. J. (1990): Bounds on the Probability of Error in Terms of Generalized Information Radius. Information Sciences 46.
[28] Taneja I. J., Morales D., Pardo L. (1991): $\lambda$-measures of hypoentropy and comparison of experiments: Blackwell and Lehmann approach. Kybernetika, 27, 413-420. | MR
[29] Vajda I. (1968): Bounds on the Minimal Error Probability and Checking a Finite or Countable Number of Hypotheses. Inform. Trans. Problems 4, 9-17. | MR
[30] Vajda I. (1989): Theory of Statistical Inference and Information. Kluwer Academic Publishers, Dordrecht/Boston/London.