Simple robust neural network
Sibirskij žurnal industrialʹnoj matematiki, Tome 24 (2021) no. 4, pp. 126-138.


The classification problem and the use of simple neural networks for solving it are considered. A robust modification of the error backpropagation algorithm used for training neural networks is proposed. A proposition that makes it possible to construct the proposed modification with the Huber loss function is proved. To study the properties of the resulting neural network, a number of computational experiments have been carried out, covering different values of the outlier fraction, the noise level, and the sizes of the training and test samples. The analysis of the results shows that the proposed modification can significantly improve the classification accuracy and the learning speed of a neural network when working with noisy data.
Keywords: classification problem, neural network, Huber loss function, error backpropagation algorithm.
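
The abstract describes replacing the usual squared-error criterion in error backpropagation with the Huber loss so that outlying observations have bounded influence on the weight updates. The sketch below (Python, not taken from the article) illustrates that idea on a single-layer sigmoid classifier: the residual is passed through the clipped Huber gradient before it enters the update. The architecture, the threshold `delta`, the learning-rate value, and the toy data are illustrative assumptions, not the authors' settings.

```python
# Minimal sketch of a Huber-based (robust) backpropagation-style update.
# Single sigmoid output unit; all parameter values are illustrative assumptions.
import numpy as np

def huber_loss(residual, delta=1.0):
    """Huber loss: quadratic for small residuals, linear for large ones."""
    small = np.abs(residual) <= delta
    return np.where(small, 0.5 * residual**2, delta * (np.abs(residual) - 0.5 * delta))

def huber_grad(residual, delta=1.0):
    """Derivative of the Huber loss w.r.t. the residual: the residual clipped at +/- delta."""
    return np.clip(residual, -delta, delta)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_one_layer(X, y, delta=1.0, lr=0.1, epochs=200, seed=0):
    """Train a single-layer sigmoid classifier with a Huber-based error signal.

    With the squared-error loss the output delta would be (y_hat - y) * sigmoid'(z);
    here the residual is first passed through the clipped Huber gradient, which
    bounds the influence of outlying (mislabelled or heavily noisy) observations.
    """
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.1, size=X.shape[1])
    b = 0.0
    for _ in range(epochs):
        z = X @ w + b
        y_hat = sigmoid(z)
        residual = y_hat - y
        # Robust error signal: clipped residual times the sigmoid derivative.
        delta_out = huber_grad(residual, delta) * y_hat * (1.0 - y_hat)
        w -= lr * X.T @ delta_out / len(y)
        b -= lr * delta_out.mean()
    return w, b, huber_loss(y_hat - y, delta).mean()

# Toy usage (illustrative data only).
X = np.array([[0.0, 1.0], [1.0, 0.5], [2.0, 2.0], [3.0, 2.5]])
y = np.array([0.0, 0.0, 1.0, 1.0])
w, b, final_loss = train_one_layer(X, y, delta=1.0)
```

Because the Huber gradient is bounded by `delta`, a single grossly mislabelled point cannot dominate the update the way it would under a squared-error criterion, which is the mechanism behind the robustness claimed in the abstract.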
@article{SJIM_2021_24_4_a8,
     author = {V. S. Timofeev and M. A. Sivak},
     title = {Simple robust neural network},
     journal = {Sibirskij \v{z}urnal industrialʹnoj matematiki},
     pages = {126--138},
     publisher = {mathdoc},
     volume = {24},
     number = {4},
     year = {2021},
     language = {ru},
     url = {http://geodesic.mathdoc.fr/item/SJIM_2021_24_4_a8/}
}
