Hierarchical method of parameter setting for population-based metaheuristic optimization algorithms
Sibirskij žurnal industrialʹnoj matematiki, Volume 25 (2022), no. 4, pp. 164-178.


Metaheuristic algorithms for the global optimization problem have free strategy parameters that affect both solution accuracy and algorithm efficiency. The problem of determining optimal values of these parameters is known as the parameter setting problem; it can be solved by static parameter tuning methods, applied before the algorithm runs, or by dynamic parameter control methods, applied during the run. The paper introduces a novel hierarchical parameter setting method for the class of population-based metaheuristic optimization algorithms. The distinctive feature of the method is its use of a hierarchical algorithm model: the lower level represents a sequential algorithm from this class, and the upper level represents a parallel algorithm based on the island model. Parameter setting is performed hierarchically, combining parameter tuning for the sequential algorithm with adaptive parameter control for the parallel algorithm. Parameter control is driven by a vector fitness criterion whose components are a convergence rate and a solution value. An approach to estimating the convergence rate of a multistep optimization method is proposed. Experimental results for CEC benchmark problems are presented and discussed.
Keywords: global optimization, metaheuristic algorithms, parameter setting, parameter control.
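To make the scheme concrete, the following is a minimal, hypothetical Python sketch of the hierarchical method described in the abstract. It assumes particle swarm optimization as the lower-level sequential algorithm and an island model with periodic migration at the upper level. The geometric convergence-rate estimate and the lexicographic comparison of the vector criterion (solution value, convergence rate) are illustrative assumptions, not the paper's exact formulas, and all names (Island, hierarchical_run, sphere) are invented for the example.

import random

def sphere(x):
    # Toy objective standing in for a CEC benchmark: global minimum 0 at the origin.
    return sum(xi * xi for xi in x)

class Island:
    # Lower level: one sequential PSO run with its own strategy parameters (w, c1, c2).
    def __init__(self, f, dim, swarm_size, w, c1, c2, bounds=(-5.0, 5.0)):
        self.f, self.dim = f, dim
        self.w, self.c1, self.c2 = w, c1, c2
        lo, hi = bounds
        self.x = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm_size)]
        self.v = [[0.0] * dim for _ in range(swarm_size)]
        self.pbest = [p[:] for p in self.x]
        self.pbest_f = [f(p) for p in self.x]
        g = min(range(swarm_size), key=lambda i: self.pbest_f[i])
        self.gbest, self.gbest_f = self.pbest[g][:], self.pbest_f[g]
        self.history = [self.gbest_f]  # best value per step, for the rate estimate

    def step(self):
        for i, (xi, vi) in enumerate(zip(self.x, self.v)):
            for d in range(self.dim):
                r1, r2 = random.random(), random.random()
                vi[d] = (self.w * vi[d]
                         + self.c1 * r1 * (self.pbest[i][d] - xi[d])
                         + self.c2 * r2 * (self.gbest[d] - xi[d]))
                xi[d] += vi[d]
            fx = self.f(xi)
            if fx < self.pbest_f[i]:
                self.pbest[i], self.pbest_f[i] = xi[:], fx
                if fx < self.gbest_f:
                    self.gbest, self.gbest_f = xi[:], fx
        self.history.append(self.gbest_f)

    def convergence_rate(self, window=10):
        # Crude geometric contraction factor of the best value over the last
        # `window` steps (valid for objectives with minimum 0); lower is faster.
        h = self.history[-window:]
        if len(h) < 2 or h[0] <= 0.0:
            return 1.0
        return 0.0 if h[-1] <= 0.0 else (h[-1] / h[0]) ** (1.0 / (len(h) - 1))

def hierarchical_run(f, dim=10, n_islands=4, swarm=20, steps=200, epoch=20):
    # Upper level: island-parallel model with adaptive parameter control.
    pool = [Island(f, dim, swarm,
                   w=random.uniform(0.4, 0.9),
                   c1=random.uniform(1.0, 2.5),
                   c2=random.uniform(1.0, 2.5)) for _ in range(n_islands)]
    for t in range(1, steps + 1):
        for isl in pool:
            isl.step()
        if t % epoch == 0:
            # Vector criterion (solution value, convergence rate), compared
            # lexicographically here; the paper's exact comparison may differ.
            ranked = sorted(pool, key=lambda s: (s.gbest_f, s.convergence_rate()))
            best, worst = ranked[0], ranked[-1]
            # Adaptive control: the worst island adopts perturbed copies of the
            # best island's parameters.
            worst.w = min(0.95, max(0.1, best.w + random.gauss(0.0, 0.05)))
            worst.c1 = max(0.5, best.c1 + random.gauss(0.0, 0.1))
            worst.c2 = max(0.5, best.c2 + random.gauss(0.0, 0.1))
            # Migration: inject the best island's incumbent into the worst island.
            worst.x[0] = best.gbest[:]
    champ = min(pool, key=lambda s: s.gbest_f)
    return champ.gbest_f, (champ.w, champ.c1, champ.c2)

if __name__ == "__main__":
    best_f, params = hierarchical_run(sphere)
    print("best value %.3e with (w, c1, c2) = %r" % (best_f, params))

In a real implementation the islands would run as parallel processes with asynchronous migration; here they are stepped sequentially to keep the sketch self-contained.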
@article{SJIM_2022_25_4_a12,
     author = {E. U. Seliverstov},
     title = {Hierarchical method of parameter setting for population-based metaheuristic optimization algorithms},
     journal = {Sibirskij \v{z}urnal industrialʹnoj matematiki},
     pages = {164--178},
     publisher = {mathdoc},
     volume = {25},
     number = {4},
     year = {2022},
     language = {ru},
     url = {http://geodesic.mathdoc.fr/item/SJIM_2022_25_4_a12/}
}
TY  - JOUR
AU  - E. U. Seliverstov
TI  - Hierarchical method of parameter setting for population-based metaheuristic optimization algorithms
JO  - Sibirskij žurnal industrialʹnoj matematiki
PY  - 2022
SP  - 164
EP  - 178
VL  - 25
IS  - 4
PB  - mathdoc
UR  - http://geodesic.mathdoc.fr/item/SJIM_2022_25_4_a12/
LA  - ru
ID  - SJIM_2022_25_4_a12
ER  - 
%0 Journal Article
%A E. U. Seliverstov
%T Hierarchical method of parameter setting for population-based metaheuristic optimization algorithms
%J Sibirskij žurnal industrialʹnoj matematiki
%D 2022
%P 164-178
%V 25
%N 4
%I mathdoc
%U http://geodesic.mathdoc.fr/item/SJIM_2022_25_4_a12/
%G ru
%F SJIM_2022_25_4_a12
E. U. Seliverstov. Hierarchical method of parameter setting for population-based metaheuristic optimization algorithms. Sibirskij žurnal industrialʹnoj matematiki, Volume 25 (2022), no. 4, pp. 164-178. http://geodesic.mathdoc.fr/item/SJIM_2022_25_4_a12/
