Maximum likelihood principle and $I$-divergence: discrete time observations
Kybernetika, Volume 34 (1998) no. 3, pp. 265-288. This article was harvested from the Czech Digital Mathematics Library.


The paper investigates the relation between maximum likelihood and minimum $I$-divergence estimates of unknown parameters and studies the asymptotic behaviour of the likelihood ratio maximum. Observations are assumed to be made in discrete time.
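For orientation, a minimal sketch of the link between the two estimates in the classical i.i.d. finite-alphabet case (an illustrative assumption, not the paper's discrete-time process setting): if $\hat{P}_n$ denotes the empirical distribution of the sample $x_1,\dots,x_n$ and $\{P_\theta\}$ the model family, then
$$
\frac{1}{n}\sum_{i=1}^{n}\log P_\theta(x_i)=\sum_{x}\hat{P}_n(x)\log P_\theta(x)=-H(\hat{P}_n)-I(\hat{P}_n\,\|\,P_\theta),
\qquad
I(P\,\|\,Q)=\sum_{x}P(x)\log\frac{P(x)}{Q(x)},
$$
so maximizing the likelihood over $\theta$ amounts to minimizing the $I$-divergence $I(\hat{P}_n\,\|\,P_\theta)$, since the entropy term $H(\hat{P}_n)$ does not depend on $\theta$.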
Classification: 62B10, 62F12, 62M10
Keywords: maximum likelihood estimate; information divergence; exponential families; discrete time process; autoregressive sequences
@article{KYB_1998_34_3_a1,
     author = {Mich\'alek, Ji\v{r}{\'\i}},
     title = {Maximum likelihood principle and $I$-divergence: discrete time observations},
     journal = {Kybernetika},
     pages = {265--288},
     year = {1998},
     volume = {34},
     number = {3},
     mrnumber = {1640966},
     zbl = {1274.62066},
     language = {en},
     url = {http://geodesic.mathdoc.fr/item/KYB_1998_34_3_a1/}
}
TY  - JOUR
AU  - Michálek, Jiří
TI  - Maximum likelihood principle and $I$-divergence: discrete time observations
JO  - Kybernetika
PY  - 1998
SP  - 265
EP  - 288
VL  - 34
IS  - 3
UR  - http://geodesic.mathdoc.fr/item/KYB_1998_34_3_a1/
LA  - en
ID  - KYB_1998_34_3_a1
ER  - 
%0 Journal Article
%A Michálek, Jiří
%T Maximum likelihood principle and $I$-divergence: discrete time observations
%J Kybernetika
%D 1998
%P 265-288
%V 34
%N 3
%U http://geodesic.mathdoc.fr/item/KYB_1998_34_3_a1/
%G en
%F KYB_1998_34_3_a1
Michálek, Jiří. Maximum likelihood principle and $I$-divergence: discrete time observations. Kybernetika, Volume 34 (1998) no. 3, pp. 265-288. http://geodesic.mathdoc.fr/item/KYB_1998_34_3_a1/

[1] Anderson T. W.: The Statistical Analysis of Time Series. Wiley, New York 1971 | MR | Zbl

[2] Basseville M., Benveniste A.: Detection of Abrupt Changes in Signals and Dynamical Systems. Springer–Verlag, Berlin 1986 | Zbl

[3] Krishnaiah P. R., Miao B. Q.: Review about estimation of change points. In: Handbook of Statistics (P. R. Krishnaiah and C. R. Rao, eds.), Elsevier Sci. Publishers, Amsterdam 1988, Vol. 7, pp. 375–402

[4] Kullback S.: Information Theory and Statistics (in Russian). Nauka, Moscow 1967. Translated from the English original

[5] Kupperman M.: Further application of information theory to multivariate analysis and statistical inference. Ann. Math. Statist. 27 (1956), 1184

[6] Küchler U., Sørensen M.: Exponential families of stochastic processes: A unifying semimartingale approach. Internat. Statist. Rev. (1989), 123–144 | DOI

[7] Michálek J.: Yule–Walker estimates and asymptotic $I$-divergence rate. Problems Control Inform. Theory 19 (1990), 5–6, 387–398 | MR | Zbl

[8] Michálek J.: A method of detecting changes in the behaviour of locally stationary sequences. Kybernetika 31 (1995), 1, 17–29 | MR | Zbl

[9] Morales D., Pardo L., Vajda I.: About classical and some new statistics for testing hypothesis in parametric models. J. Multivariate Anal. (to appear) | MR

[10] Page E.: Continuous inspection schemes. Biometrika 41 (1954), 100–115 | DOI | MR | Zbl