Lotka–Volterra model with mutations and generative adversarial networks
Teoretičeskaâ i matematičeskaâ fizika, Volume 218 (2024) no. 2, pp. 320-329. This article was harvested from the Math-Net.Ru source.


A population-genetics model of Lotka–Volterra type with mutations on a statistical manifold is introduced. Mutations are described by diffusion on the statistical manifold whose generator is the Laplace–Beltrami operator for the Fisher–Rao metric; the model thus combines population genetics with information geometry. It generalizes a model from machine learning theory, the generative adversarial network (GAN), to populations of generative adversarial networks, and describes the control of overfitting for such networks.
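One plausible reading of the construction described above (a sketch under stated assumptions, not the paper's exact equations): a population density $f(\theta,t)$ of networks with parameters $\theta$ on a statistical manifold evolves by replicator-type selection plus Fisher–Rao diffusion,

$$
\frac{\partial f}{\partial t}(\theta,t)
  = \bigl(\Phi(\theta)-\langle\Phi\rangle_f\bigr)\,f(\theta,t)
  + D\,\Delta_{g}\,f(\theta,t),
\qquad
\Delta_{g} f
  = \frac{1}{\sqrt{\det g}}\,
    \partial_i\!\left(\sqrt{\det g}\;g^{ij}\,\partial_j f\right),
$$

where $\Phi$ is a fitness (here, a hypothetical GAN training objective), $\langle\Phi\rangle_f$ its population mean, $g_{ij}$ the Fisher–Rao metric, and $\Delta_g$ the associated Laplace–Beltrami operator generating the mutation diffusion. The symbols $\Phi$, $D$, and $f$ are illustrative notation, not taken from the paper.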
Keywords: learning theory, population genetics, theory of evolution.
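A minimal finite-dimensional caricature of such selection–mutation dynamics (an illustrative sketch, not the paper's model) is replicator–mutator dynamics on the probability simplex: fitness drives selection while a column-stochastic mutation matrix diffuses probability mass between types. All fitness values and the mutation rate below are assumed for illustration.

```python
import numpy as np

def replicator_mutator_step(x, fitness, Q, dt):
    """One Euler step of dx_i/dt = sum_j Q_ij f_j x_j - phi(x) x_i,
    where phi(x) = sum_j f_j x_j is the mean fitness."""
    phi = fitness @ x
    dx = Q @ (fitness * x) - phi * x
    return x + dt * dx

fitness = np.array([1.0, 1.2, 0.9])   # hypothetical fitness values
mu = 0.01                             # assumed small mutation rate

# Each column of Q sums to 1, so total probability is conserved.
Q = (1 - mu) * np.eye(3) + (mu / 2) * (np.ones((3, 3)) - np.eye(3))

x = np.array([1 / 3, 1 / 3, 1 / 3])   # uniform initial population
for _ in range(2000):
    x = replicator_mutator_step(x, fitness, Q, dt=0.01)

print(x)  # mass concentrates on the fittest type, index 1
```

Because mutation is built into the column-stochastic matrix `Q`, the dynamics conserve total probability while selection concentrates the population near the fittest type, with a small mutation-maintained residue on the others.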
@article{TMF_2024_218_2_a6,
     author = {S. V. Kozyrev},
     title = {Lotka{\textendash}Volterra~model with mutations and generative adversarial networks},
     journal = {Teoreti\v{c}eska\^a i matemati\v{c}eska\^a fizika},
     pages = {320--329},
     year = {2024},
     volume = {218},
     number = {2},
     language = {ru},
     url = {http://geodesic.mathdoc.fr/item/TMF_2024_218_2_a6/}
}