Mathematical methods of randomized machine teaching
Itogi nauki i tehniki. Sovremennaâ matematika i eë priloženiâ. Tematičeskie obzory, Mathematical Analysis, Volume 155 (2018), pp. 65-88.


This paper presents a review of mathematical methods of randomized machine teaching.
Keywords: mathematical methods, machine teaching.
@article{INTO_2018_155_a3,
     author = {Yu. S. Popkov},
     title = {Mathematical methods of randomized machine teaching},
     journal = {Itogi nauki i tehniki. Sovremenna\^a matematika i e\"e prilo\v{z}eni\^a. Temati\v{c}eskie obzory},
     pages = {65--88},
     publisher = {mathdoc},
     volume = {155},
     year = {2018},
     language = {ru},
     url = {http://geodesic.mathdoc.fr/item/INTO_2018_155_a3/}
}
TY  - JOUR
AU  - Yu. S. Popkov
TI  - Mathematical methods of randomized machine teaching
JO  - Itogi nauki i tehniki. Sovremennaâ matematika i eë priloženiâ. Tematičeskie obzory
PY  - 2018
SP  - 65
EP  - 88
VL  - 155
PB  - mathdoc
UR  - http://geodesic.mathdoc.fr/item/INTO_2018_155_a3/
LA  - ru
ID  - INTO_2018_155_a3
ER  - 
%0 Journal Article
%A Yu. S. Popkov
%T Mathematical methods of randomized machine teaching
%J Itogi nauki i tehniki. Sovremennaâ matematika i eë priloženiâ. Tematičeskie obzory
%D 2018
%P 65-88
%V 155
%I mathdoc
%U http://geodesic.mathdoc.fr/item/INTO_2018_155_a3/
%G ru
%F INTO_2018_155_a3
Yu. S. Popkov. Mathematical methods of randomized machine teaching. Itogi nauki i tehniki. Sovremennaâ matematika i eë priloženiâ. Tematičeskie obzory, Mathematical Analysis, Volume 155 (2018), pp. 65-88. http://geodesic.mathdoc.fr/item/INTO_2018_155_a3/
