Choosing the best $\phi$-divergence goodness-of-fit statistic in multinomial sampling with linear constraints
Kybernetika, Volume 42 (2006) no. 6, pp. 711-722. This article was harvested from the Czech Digital Mathematics Library.


In this paper we present a simulation study to analyze the behavior of the $\phi$-divergence test statistics in the problem of goodness-of-fit for loglinear models with linear constraints and multinomial sampling. We pay special attention to the Rényi and $I_{r}$-divergence measures.
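The two families of divergence measures highlighted in the abstract can be sketched numerically. The snippet below is a minimal illustration, not the authors' implementation: it assumes the $I_r$-divergence is the Cressie–Read power divergence $I_r(p,q) = \frac{1}{r(r-1)}\big(\sum_i p_i^r q_i^{1-r} - 1\big)$ and the Rényi divergence the companion form $D_r(p,q) = \frac{1}{r(r-1)}\log\sum_i p_i^r q_i^{1-r}$ (both for $r \neq 0, 1$), and that the goodness-of-fit statistic is $T = 2n\,D(\hat{p}, p_0)$, which is asymptotically chi-square under the null.

```python
import numpy as np

def power_divergence(p, q, r):
    """I_r (Cressie-Read power) divergence between distributions p and q, r != 0, 1.
    (Assumed normalization; the limits r -> 1 and r -> 0 recover the
    Kullback-Leibler divergences, not handled here.)"""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return (np.sum(p**r * q**(1.0 - r)) - 1.0) / (r * (r - 1.0))

def renyi_divergence(p, q, r):
    """Rényi divergence of order r in the 1/(r(r-1)) normalization, r != 0, 1."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.log(np.sum(p**r * q**(1.0 - r))) / (r * (r - 1.0))

# Hypothetical multinomial sample tested against a uniform null hypothesis.
counts = np.array([18, 22, 31, 29])
n = counts.sum()
p_hat = counts / n                     # observed relative frequencies
p0 = np.full(4, 0.25)                  # null probabilities

# Goodness-of-fit statistic T = 2 n I_r(p_hat, p0); r = 2/3 is the
# classical Cressie-Read choice. Compare T against a chi-square quantile
# with (number of cells - 1) degrees of freedom.
T = 2 * n * power_divergence(p_hat, p0, r=2.0 / 3.0)
```

Different choices of $r$ give different members of the family (e.g. $r = 2$ yields the Pearson chi-square statistic from $I_r$); the paper's simulation study compares such choices under linear constraints on the expected frequencies.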
Classification : 62B10, 62F03, 62F30, 62G10, 62H15, 62H17, 65C60
Keywords: multinomial sampling; restricted maximum likelihood estimator; goodness-of-fit; $I_r$-divergence measure; Rényi’s divergence measure
@article{KYB_2006_42_6_a5,
     author = {Martin, Nirian and Pardo, Leandro},
     title = {Choosing the best $\phi$-divergence goodness-of-fit statistic in multinomial sampling with linear constraints},
     journal = {Kybernetika},
     pages = {711--722},
     year = {2006},
     volume = {42},
     number = {6},
     mrnumber = {2296510},
     zbl = {1245.62011},
     language = {en},
     url = {http://geodesic.mathdoc.fr/item/KYB_2006_42_6_a5/}
}
TY  - JOUR
AU  - Martin, Nirian
AU  - Pardo, Leandro
TI  - Choosing the best $\phi$-divergence goodness-of-fit statistic in multinomial sampling with linear constraints
JO  - Kybernetika
PY  - 2006
SP  - 711
EP  - 722
VL  - 42
IS  - 6
UR  - http://geodesic.mathdoc.fr/item/KYB_2006_42_6_a5/
LA  - en
ID  - KYB_2006_42_6_a5
ER  - 
%0 Journal Article
%A Martin, Nirian
%A Pardo, Leandro
%T Choosing the best $\phi$-divergence goodness-of-fit statistic in multinomial sampling with linear constraints
%J Kybernetika
%D 2006
%P 711-722
%V 42
%N 6
%U http://geodesic.mathdoc.fr/item/KYB_2006_42_6_a5/
%G en
%F KYB_2006_42_6_a5
Martin, Nirian; Pardo, Leandro. Choosing the best $\phi$-divergence goodness-of-fit statistic in multinomial sampling with linear constraints. Kybernetika, Tome 42 (2006) no. 6, pp. 711-722. http://geodesic.mathdoc.fr/item/KYB_2006_42_6_a5/

[1] Agresti A.: Categorical Data Analysis. Wiley, New York 2002 | MR

[2] Andersen E. B.: The Statistical Analysis of Categorical Data. Springer, New York 1990 | Zbl

[3] Ali S. M., Silvey S. D.: A general class of coefficients of divergence of one distribution from another. J. Roy. Statist. Soc. Ser. B 28 (1966), 131–142 | MR

[4] Csiszár I.: Eine informationstheoretische Ungleichung und ihre Anwendung auf den Beweis der Ergodizität von Markoffschen Ketten. Publ. Math. Inst. Hungar. Acad. Sci. 8 (1963), 84–108

[5] Dale J. R.: Asymptotic normality of goodness-of-fit statistics for sparse product multinomials. J. Roy. Statist. Soc. Ser. B 48 (1986), 48–59 | MR | Zbl

[6] Haber M., Brown M. B.: Maximum likelihood methods for log-linear models when expected frequencies are subject to linear constraints. J. Amer. Statist. Assoc. 81 (1986), 477–482 | MR | Zbl

[7] Kullback S.: Kullback information. In: Encyclopedia of Statistical Sciences (S. Kotz and N. L. Johnson, eds.), Wiley, New York 1985, Volume 4, pp. 421–425 | MR

[8] Liese F., Vajda I.: Convex Statistical Distances. Teubner, Leipzig 1987 | MR | Zbl

[9] Pardo L., Menéndez M. L.: Analysis of divergence in loglinear models when expected frequencies are subject to linear constraints. Metrika 64 (2006), 63–76 | DOI | MR | Zbl

[10] Powers D. A., Xie Y.: Statistical Methods for Categorical Data Analysis. Academic Press, San Diego 2000 | MR | Zbl

[11] Rényi A.: On measures of entropy and information. Proc. Fourth Berkeley Symposium on Mathematical Statistics and Probability 1 (1961), pp. 547–561

[12] Vajda I.: Theory of Statistical Inference and Information. Kluwer Academic Publishers, Dordrecht 1989 | Zbl