Keywords: Cencov's comments; inverse problems in distribution estimation; $L_1$ density estimation; variational distance; $\phi$-divergence
@article{KYB_2011_47_6_a2,
author = {Gy\"orfi, L\'aszl\'o and Krzy\.zak, Adam},
title = {Why $L_1$ view and what is next?},
journal = {Kybernetika},
pages = {840--854},
year = {2011},
volume = {47},
number = {6},
mrnumber = {2907845},
zbl = {06047589},
language = {en},
url = {http://geodesic.mathdoc.fr/item/KYB_2011_47_6_a2/}
}
Györfi, László; Krzyżak, Adam. Why $L_1$ view and what is next? Kybernetika, Volume 47 (2011) no. 6, pp. 840–854. http://geodesic.mathdoc.fr/item/KYB_2011_47_6_a2/
[1] Abou-Jaoude, S.: Conditions nécessaires et suffisantes de convergence $L_1$ en probabilité de l’histogramme pour une densité. Ann. Inst. H. Poincaré XII (1976), 213–231. | MR
[2] Barndorff-Nielsen, O.: Information and Exponential Families in Statistical Theory. Wiley, 1978. | MR | Zbl
[3] Barron, A. R., Györfi, L., Meulen, E. C. van der: Distribution estimation consistent in total variation and two types of information divergence. IEEE Trans. Inform. Theory 38 (1992), 1437–1454. | DOI | MR
[4] Cencov, N. N.: Estimation of unknown density function from observations. (in Russian) Trans. SSSR Acad. Sci. 147 (1962), 45–48. | MR
[5] Cencov, N. N.: Categories of mathematical statistics. (in Russian) Trans. SSSR Acad. Sci. 164 (1965), 511–514. | MR
[6] Cencov, N. N.: General theory of exponential families of distribution functions. Theory Probab. Appl. 11 (1966), 483–494. | MR
[7] Cencov, N. N.: Asymmetric distance between distribution functions, entropy and Pythagoras theorem. (in Russian) Math. Notes 4 (1968), 323–332. | MR
[8] Cencov, N. N.: Statistical Decision Rules and Optimal Inference. (in Russian) Nauka, Moscow 1972. | MR
[9] Cencov, N. N.: Algebraic foundation of mathematical statistics. Math. Operationsforsch. Statist., Ser. Statistics 9 (1978), 267–276. | MR
[10] Cencov, N. N.: On basic concepts of mathematical statistics. Banach Center Publ. 6 (1980), 85–94. | MR
[11] Cencov, N. N.: On correctness of the pointwise estimation problem. (in Russian) Theory Probab. Appl. 26 (1981), 15–31. | MR
[12] Csiszár, I., Fischer, J.: Informationsentfernungen im Raum der Wahrscheinlichkeitsverteilungen. Publ. Math. Inst. Hungar. Acad. Sci. 7 (1962), 159–180. | MR
[13] Csiszár, I.: Information-type measures of divergence of probability distributions and indirect observations. Studia Sci. Math. Hungar. 2 (1967), 299–318. | MR
[14] Csiszár, I.: On topological properties of $f$-divergence. Studia Sci. Math. Hungar. 2 (1967), 329–339.
[15] Devroye, L., Györfi, L.: Nonparametric Density Estimation: The $L_1$ View. Wiley, 1985. Russian translation: Mir, Moscow 1988 (translated by A. Tsybakov). | MR
[16] Devroye, L., Györfi, L.: No empirical probability measure can converge in the total variation sense for all distributions. Ann. Statist. 18 (1990), 1496–1499. | DOI | MR
[17] Frolov, A. S., Cencov, N. N.: Application of dependent observations in the Monte Carlo method for recovering smooth curves. (in Russian) In: Proc. 6th Russian Conference on Probability Theory and Mathematical Statistics, Vilnius 1962, pp. 425–437. | MR
[18] Györfi, L., Páli, I., Meulen, E. C. van der: There is no universal source code for an infinite source alphabet. IEEE Trans. Inform. Theory 40 (1994), 267–271. | DOI | MR
[19] Györfi, L., Páli, I., Meulen, E. C. van der: On universal noiseless source coding for infinite source alphabets. Europ. Trans. Telecomm. 4 (1993), 9–16.
[20] Hartigan, J. A.: The likelihood and invariance principles. Ann. Math. Statist. 38 (1967), 533–539. | MR
[21] Ibragimov, I. A., Hasminski, R. Z.: On estimation of density. (in Russian) Scientific Notes of LOMI Seminars 98 (1980), 61–86.
[22] Kafka, P., Österreicher, F., Vincze, I.: On powers of $f$-divergences defining a distance. Studia Sci. Math. Hungar. 26 (1991), 415–422. | MR
[23] Kemperman, J. H. B.: An optimum rate of transmitting information. Ann. Math. Statist. 40 (1969), 2156–2177. | DOI | MR
[24] Khosravifard, M., Fooladivanda, D., Gulliver, T. A.: Confliction of the convexity and metric properties in $f$-divergences. IEICE Trans. Fundamentals E90-A (2007), 1848–1853.
[25] Kolmogorov, A. N.: Sulla determinazione empirica di una legge di distribuzione. Giornale dell’Istituto Italiano degli Attuari 4 (1933), 83–91. | Zbl
[26] Kriz, T. A., Talacko, J. V.: Equivalence of the maximum likelihood estimator to a minimum entropy estimator. Trab. Estadist. Invest. Oper. 19 (1968), 55–65. | DOI | MR | Zbl
[27] Kullback, S.: A lower bound for discrimination in terms of variation. IEEE Trans. Inform. Theory 13 (1967), 126–127. | DOI
[28] Kullback, S.: Correction to “A lower bound for discrimination in terms of variation”. IEEE Trans. Inform. Theory 16 (1970), 652. | DOI
[29] Morse, N., Sacksteder, R.: Statistical isomorphism. Ann. Math. Statist. 37 (1966), 203–214. | DOI | MR | Zbl
[30] Le Cam, L.: On some asymptotic properties of maximum likelihood estimates and related Bayes estimates. Univ. Calif. Publ. Statist. 1 (1953), 267–329. | MR
[31] Liese, F., Vajda, I.: Convex Statistical Distances. Teubner, Leipzig 1987. | MR | Zbl
[32] Morozova, E. A., Cencov, N. N.: Markov maps in noncommutative probability theory and mathematical statistics. (in Russian) In: Proc. 4th Internat. Vilnius Conf. Probability Theory and Mathematical Statistics, VNU Science Press 2 (1987), pp. 287–310. | MR | Zbl
[33] Nadaraya, E. A.: On nonparametric estimation of Bayes risk in classification problems. (in Russian) Trans. Georgian Acad. Sci. 82 (1976), 277–280. | MR
[34] Nadaraya, E. A.: Nonparametric Estimation of Probability Density and Regression Curve. (in Russian) Tbilisi State University, Georgia 1983. | MR
[35] Österreicher, F., Vajda, I.: A new class of metric divergences on probability spaces and its statistical applications. Ann. Inst. Statist. Math. 55 (2003), 639–653. | DOI | MR
[36] Sobol, I. M.: Multidimensional Quadrature Formulas and Haar Functions. (in Russian) Nauka, Moscow 1969. | MR
[37] Statulavicius, W. W.: On Some Asymptotic Properties of Minimax Density Estimates. (in Russian) Ph.D. Thesis, Vilnius State University 1986.
[38] Stratonovich, R. L.: Rate of convergence of probability density estimates. (in Russian) Trans. SSSR Acad. Sci., Ser. Technical Cybernetics 6 (1969), 3–15.
[39] Toussaint, G. T.: Sharper lower bounds for information in terms of variation. IEEE Trans. Inform. Theory 21 (1975), 99–103. | DOI | MR
[40] Vajda, I.: Note on discrimination information and variation. IEEE Trans. Inform. Theory 16 (1970), 771–773. | DOI | MR | Zbl
[41] Vajda, I.: On the $f$-divergence and singularity of probability measures. Period. Math. Hungar. 2 (1972), 223–234. | DOI | MR | Zbl
[42] Vajda, I.: On metric divergences of probability measures. Kybernetika 45 (2009), 885–900. | MR | Zbl
[43] Wald, A.: Contributions to the theory of statistical estimation and testing hypotheses. Ann. Math. Statist. 10 (1939), 299–326. | DOI | MR | Zbl