A New Collective Optimization Method Based on the Cooperation of Biology Related Algorithms
Authors: Akhmedova Shakhnaz Agasuvar Kyzy, Semenkin Evgeny Stanislavovich
Journal: Siberian Aerospace Journal @vestnik-sibsau
Section: 2nd International Conference on Mathematical Models and Their Applications
Article in issue: 4 (50), 2013.
A new self-tuning collective approach called Co-Operation of Biology Related Algorithms (COBRA) is described and examined. It is based on five well-known biology related algorithms: particle swarm optimization, wolf pack search, the firefly algorithm, cuckoo search and the bat algorithm. In addition, two modifications of the COBRA algorithm are included. The new collective method is also used for tuning the weight coefficients of a neural network when solving various classification problems.
Keywords: self-tuning, neural network, classification, biology related algorithm
Short address: https://sciup.org/148177168
IDR: 148177168
Text of the scientific article "A New Collective Optimization Method Based on the Cooperation of Biology Related Algorithms"
Existing metaheuristic algorithms such as Particle Swarm Optimization or the Firefly Algorithm have started to demonstrate their power in dealing with tough optimization problems and even NP-hard problems. Five well-known and closely related nature-inspired algorithms, namely Particle Swarm Optimization (PSO) [1], Wolf Pack Search (WPS) [2], the Firefly Algorithm (FFA) [3], the Cuckoo Search Algorithm (CSA) [4] and the Bat Algorithm (BA) [5], were investigated by the authors of this paper. Each of these algorithms was originally developed for solving real-parameter unconstrained optimization problems and imitates a natural process or the behavior of an animal group. For instance, the Bat Algorithm is based on the echolocation behavior of bats, while the Cuckoo Search Algorithm was inspired by the obligate brood parasitism of some cuckoo species, which lay their eggs in the nests of host birds of other species.
Unconstrained optimization. Various unconstrained test optimization problems of various dimensions were used for the preliminary investigation of all the mentioned algorithms: Rosenbrock's function, the Sphere function, Ackley's function, Griewank's function, the Hyper-Ellipsoidal function and Rastrigin's function [6]. The comparison of the obtained results showed that it cannot be said which approach is the most appropriate for any function and any dimension. The best results were obtained by different methods for different problems and for different dimensions; in some cases the best algorithm differs even for the same test problem if the dimension varies. Each strategy has its advantages and disadvantages, so a natural question is whether it is possible to combine the major advantages of the listed algorithms and thereby develop a potentially better algorithm.
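For concreteness, the following minimal Python sketch shows two of the benchmark functions named above in their standard textbook form; the exact parameter settings and search ranges used in [6] are not reproduced here and may differ.

```python
import numpy as np

def sphere(x):
    """Sphere function; global minimum f(0, ..., 0) = 0."""
    x = np.asarray(x, dtype=float)
    return float(np.sum(x ** 2))

def rastrigin(x):
    """Rastrigin function in its common textbook form (A = 10);
    highly multimodal, global minimum f(0, ..., 0) = 0."""
    x = np.asarray(x, dtype=float)
    a = 10.0
    return float(a * x.size + np.sum(x ** 2 - a * np.cos(2.0 * np.pi * x)))

# Example: evaluate both functions at a random 3-dimensional point.
if __name__ == "__main__":
    point = np.random.uniform(-5.0, 5.0, size=3)
    print(sphere(point), rastrigin(point))
```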
These observations brought the researchers to the idea of formulating a new metaheuristic approach that combines the major advantages of the various algorithms. Thus a new optimization method based on PSO, WPS, FFA, CSA and BA was implemented and investigated. The proposed approach was called Co-Operation of Biology Related Algorithms (COBRA). Its basic idea consists in generating five populations (one for each algorithm), which are then executed in parallel, cooperating with each other (the so-called island model).
The proposed algorithm is a self-tuning metaheuristic, so there is no need to choose the population size for each algorithm in advance. The number of individuals in each algorithm's population can increase or decrease depending on whether the fitness value was improving at the current stage. If the fitness value was not improved during a given number of generations, the size of all populations increases; vice versa, if the fitness value was constantly improved, the size of all populations decreases.
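The paper gives no pseudocode at this point, so the following Python sketch only illustrates the scheme just described: five populations run in parallel, share the overall best solution, and grow or shrink depending on whether the best fitness keeps improving. The component algorithms are replaced by a generic random-step placeholder, and the names and values `init_size`, `stagnation_limit` and `resize_step`, as well as the migration rule, are hypothetical and not taken from COBRA itself.

```python
import random

def cobra_sketch(fitness, dim, bounds, max_evals=10_000,
                 init_size=10, stagnation_limit=10, resize_step=5):
    """Illustrative island-model loop with self-tuned population sizes."""
    # One population per component algorithm (PSO, WPS, FFA, CSA, BA).
    populations = [
        [[random.uniform(*bounds) for _ in range(dim)] for _ in range(init_size)]
        for _ in range(5)
    ]
    best = list(min((ind for pop in populations for ind in pop), key=fitness))
    best_value, stagnation, evals = fitness(best), 0, 0

    while evals < max_evals:
        for pop in populations:
            # Placeholder for one generation of the component algorithm:
            # here each individual simply takes a small random step.
            for ind in pop:
                candidate = [min(max(x + random.gauss(0.0, 0.1), bounds[0]), bounds[1])
                             for x in ind]
                evals += 1
                if fitness(candidate) < fitness(ind):
                    ind[:] = candidate

        # Cooperation: every population receives a copy of the overall best.
        current_best = list(min((ind for pop in populations for ind in pop),
                                key=fitness))
        for pop in populations:
            pop[random.randrange(len(pop))] = list(current_best)

        # Self-tuning of the population sizes, as described in the text:
        # shrink all populations while the best fitness keeps improving,
        # grow them after a given number of generations without improvement.
        if fitness(current_best) < best_value:
            best, best_value, stagnation = current_best, fitness(current_best), 0
            for pop in populations:
                del pop[max(init_size, len(pop) - resize_step):]
        else:
            stagnation += 1
            if stagnation >= stagnation_limit:
                for pop in populations:
                    pop.extend([random.uniform(*bounds) for _ in range(dim)]
                               for _ in range(resize_step))
                stagnation = 0

    return best, best_value
```

With a concrete objective, for example the Sphere function defined earlier, the sketch could be run as `cobra_sketch(sphere, dim=3, bounds=(-5.0, 5.0))`.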
Table 1
Results obtained by COBRA for the first six test problems

Func | Dimension | Success rate (%) | Average population size | Average number of function evaluations | Average function value | STD
1 | 2 | 100 | 20 | 263 | 0.000652238 | 0.000346687
1 | 3 | 100 | 24 | 605 | 0.000750922 | 0.000290652
1 | 4 | 100 | 27 | 757 | 0.000790054 | 0.000297926
2 | 2 | 100 | 20 | 284 | 0.000753919 | 0.000272061
2 | 3 | 100 | 22 | 552 | 0.000783528 | 0.000290029
2 | 4 | 100 | 27 | 932 | 0.000817905 | 0.00028088
3 | 2 | 100 | 29 | 867 | 0.000588745 | 0.000307145
3 | 3 | 100 | 33 | 1470 | 0.000774339 | 0.000282613
3 | 4 | 100 | 32 | 1604 | 0.000739637 | 0.000372214
4 | 2 | 100 | 20 | 202 | 0.000678884 | 0.000320224
4 | 3 | 100 | 25 | 581 | 0.000749783 | 0.000282332
4 | 4 | 100 | 28 | 1085 | 0.000756105 | 0.000286405
5 | 2 | 100 | 22 | 369 | 0.000806724 | 0.000140685
5 | 3 | 100 | 22 | 574 | 0.000989866 | 0.00140048
5 | 4 | 100 | 28 | 885 | 0.000695163 | 0.000159342
6 | 2 | 100 | 27 | 860 | 180.001 | 0.000273844
6 | 3 | 100 | 40 | 2082 | 170.001 | 0.000247725
6 | 4 | 100 | 56 | 3877 | 160.001 | 0.000336504
Table 2
Performance comparison of COBRA and component algorithms (28 test functions from [7])

Func | Best D = 10 | Mean D = 10 | Best D = 30 | Mean D = 30
1 | PSO | PSO | PSO | COBRA
2 | COBRA | COBRA | COBRA | COBRA
3 | COBRA | COBRA | COBRA | COBRA
4 | COBRA | COBRA | COBRA | COBRA
5 | PSO | PSO | WPS | COBRA
6 | COBRA | COBRA | COBRA | COBRA
7 | COBRA | COBRA | COBRA | COBRA
8 | WPS | COBRA | COBRA | COBRA
9 | COBRA | COBRA | COBRA | COBRA
10 | COBRA | COBRA | COBRA | COBRA
11 | COBRA | COBRA | COBRA | COBRA
12 | COBRA | COBRA | COBRA | COBRA
13 | COBRA | COBRA | COBRA | COBRA
14 | WPS | COBRA | COBRA | COBRA
15 | COBRA | COBRA | COBRA | COBRA
16 | COBRA | COBRA | COBRA | COBRA
17 | PSO | COBRA | COBRA | COBRA
18 | COBRA | COBRA | COBRA | COBRA
19 | COBRA | COBRA | COBRA | COBRA
20 | WPS | COBRA | COBRA | COBRA
21 | PSO | COBRA | COBRA | COBRA
22 | COBRA | COBRA | COBRA | COBRA
23 | WPS | COBRA | COBRA | COBRA
24 | COBRA | COBRA | COBRA | COBRA
25 | FFA | COBRA | COBRA | COBRA
26 | COBRA | COBRA | COBRA | COBRA
27 | COBRA | COBRA | COBRA | COBRA
28 | PSO | COBRA | COBRA | COBRA
Table 3

Func | Best | Worst | Mean | STD | Feasibility rate (%)
1 | -0.727373 | -0.559235 | -0.637199 | 0.0594299 | 100
2 | 1.82813 | 4.34865 | 2.75755 | 0.926259 | 100
3 | 8.87653 | 8.89164 | 8.87895 | 0.00318718 | 92
4 | 5.57793 | 5.58908 | 5.58215 | 0.00312711 | 28
5 | 189.952 | 516.713 | 338.063 | 82.963 | 32
6 | 103.515 | 563.247 | 369.431 | 72.4124 | 24
7 | 0.529035 | 0.888604 | 0.566389 | 0.0676125 | 100
8 | 21.8649 | 53.8881 | 23.4468 | 6.23478 | 100
9 | 1.9133e+012 | 2.4654e+012 | 2.04001e+012 | 2.12584e+012 | 68
10 | 2.30765e+011 | 2.84432e+012 | 6.58941e+011 | 8.31906e+011 | 84
11 | 0.0002073305 | 0.00843533 | 0.00190705 | 0.000479151 | 68
12 | -169.36 | 671.962 | -102.82 | 228.253 | 0
13 | -57.1851 | -54.9831 | -57.0463 | 0.429771 | 100
14 | 7.9878 | 1.72346e+007 | 6.97436e+006 | 8.37255e+006 | 100
15 | 6.9986e+009 | 4.94244e+011 | 7.56845e+010 | 1.22078e+011 | 100
16 | 0.724961 | 1.17519 | 0.826752 | 0.165962 | 56
17 | 103.275 | 321.092 | 111.988 | 42.6833 | 100
18 | 378.76 | 671.905 | 425.911 | 77.383 | 96
Table 4

Func | Best | Worst | Mean | STD | Feasibility rate (%)
1 | -0.625073 | -0.270351 | -0.42015 | 0.132247 | 100
2 | 4.0493 | 5.03476 | 4.57562 | 0.395698 | 92
3 | 28.6807 | 28.707 | 28.6861 | 0.00729205 | 36
4 | 9.13159 | 9.13525 | 9.13303 | 0.00120324 | 20
5 | 478.746 | 555.016 | 485.285 | 18.7327 | 40
6 | 493.301 | 600.586 | 504.521 | 25.1907 | 32
7 | 0.334309 | 4.23197 | 1.61077 | 1.30831 | 100
8 | 470.46 | 1023.44 | 896.351 | 220.978 | 100
9 | 2.63268e+012 | 7.38962e+012 | 3.12913e+012 | 1.05068e+012 | 56
10 | 1.25487e+012 | 3.94721e+012 | 1.96828e+012 | 5.87269e+011 | 44
11 | -0.00825323 | -0.00281327 | -0.00443743 | 0.000266256 | 16
12 | 155.45 | 720.707 | 403.327 | 105.636 | 0
13 | -64.3938 | -53.8018 | -58.2296 | 4.54096 | 100
14 | 398.499 | 2.90799e+007 | 1.24046e+006 | 5.68939e+006 | 100
15 | 4.93728e+009 | 4.92773e+012 | 2.14661e+012 | 1.80896e+012 | 100
16 | 1.15237 | 1.37727 | 1.19964 | 0.0478142 | 8
17 | 332.563 | 381.055 | 344.201 | 20.7101 | 92
18 | 290.033 | 357.092 | 355.187 | 13.2994 | 100
Table 5
Results obtained by the binary modification of COBRA

Func | Dimension | Success rate (%) | Average population size | Average number of function evaluations | Average function value | STD
1 | 2 | 100 | 31 | 740 | 0.000182069 | 0.000248506
1 | 3 | 99 | 68 | 3473 | 0.000188191 | 0.000689647
1 | 4 | 93 | 80 | 6730 | 0.00579879 | 0.0242297
2 | 2 | 100 | 27 | 567 | 0.000236274 | 0.000265351
2 | 3 | 100 | 30 | 775 | 0.000150127 | 0.000168235
2 | 4 | 100 | 32 | 916 | 0.000355086 | 0.00029257
3 | 2 | 100 | 32 | 1439 | 0.00019874 | 0.000330485
3 | 3 | 91 | 51 | 2046 | 0.00150713 | 0.00245315
3 | 4 | 83 | 62 | 3030 | 0.00126295 | 0.00281119
4 | 2 | 100 | 33 | 931 | 0.000209168 | 0.000268542
4 | 3 | 100 | 32 | 868 | 0.000191162 | 0.000233884
4 | 4 | 92 | 79 | 1710 | 0.000347666 | 0.000257291
5 | 2 | 100 | 30 | 899 | 0.00032841 | 0.000468681
5 | 3 | 90 | 65 | 1332 | 0.000506847 | 0.00140048
5 | 4 | 90 | 160 | 2258 | 0.00411721 | 0.158903
6 | 2 | 100 | 28 | 1734 | 180.0002 | 0.000185362
6 | 3 | 98 | 36 | 3294 | 169.801 | 0.169149
6 | 4 | 96 | 41 | 5462 | 159.2 | 0.279294
Table 6

Classifier | Scoring in Australia | Scoring in Germany
2SGP | 0.9027 | 0.8015
C4.5 | 0.8986 | 0.7773
Fuzzy | 0.8910 | 0.7940
GP | 0.8889 | 0.7834
CART | 0.8744 | 0.7565
LR | 0.8696 | 0.7837
CCEL | 0.8660 | 0.7460
RSM | 0.8520 | 0.6770
Bagging | 0.8470 | 0.6840
Bayesian | 0.8470 | 0.6790
Boosting | 0.7600 | 0.7000
k-NN | 0.7150 | 0.7151
This study: ANN+COBRA (5) | 0.8907 | 0.7829
This study: ANN+COBRA (3) | 0.8898 | 0.7809
Table 7

Author (year) | Method | Classification accuracy (%)
Quinlan (1996) | C4.5 | 94.74
Hamilton et al. (1996) | RAIC | 95.00
Ster and Dobnikar (1996) | LDA | 96.80
Nauck and Kruse (1999) | NEFCLASS | 95.06
Pena-Reyes and Sipper (1999) | Fuzzy-GA1 | 97.36
Setiono (2000) | Neuro-rule 2a | 98.10
Albrecht et al. (2002) | LSA machine | 98.80
Abonyi and Szeifert (2003) | SFC | 95.57
Übeyli (2007) | SVM | 99.54
Polat and Günes (2007) | LS-SVM | 98.53
Guijarro-Berdias et al. (2007) | LLS | 96.00
Akay (2009) | SVM-CFS | 99.51
Karabatak and Cevdet-Ince (2009) | AR + NN | 97.40
Peng et al. (2009) | CFW | 99.50
A. Marcano-Cedeño, J. Quintanilla-Domínguez, D. Andina (2011) | AMMLP | 99.26
This study (2013) | ANN+COBRA (3x1) | 97.62
 | ANN+COBRA (5x1) | 97.67
 | ANN+COBRA (3x3) | 98.16
Table 8
Classification accuracies obtained with COBRA and other classifiers for the diabetes problem

Author (year) | Method | Classification accuracy (%)
Mehmet Recep Bozkurt, Nilüfer Yurtay, Ziynet Yilmaz, Cengiz Sertkaya (2012) | PNN | 72.00
 | LVQ | 73.60
 | FFN | 68.80
 | CFN | 68.00
 | DTDN | 76.00
 | TDN | 66.80
 | Gini | 65.97
 | AIS | 68.80
H. Temurtas, N. Yumusak, F. Temurtas (2009) | MLNN with LM (10xFC) | 79.62
 | PNN (10xFC) | 78.05
 | MLNN with LM | 82.37
 | PNN | 78.13
S. M. Kamruzzaman, Ahmed Ryadh Hasan (2005) | FCNN with PA | 77.344
K. Kayaer, T. Yildirim (2003) | GRNN | 80.21
 | MLNN with LM | 77.08
L. Meng, P. Putten, H. Wang (2005) | AIRS | 67.40
This study (2013) | ANN+COBRA (3x1) | 79.65
 | ANN+COBRA (5x1) | 79.71
 | ANN+COBRA (3x3) | 79.83
Optimization. Zhengzhou University, Computational Intelligence Laboratory, Zhengzhou, China, and Nanyang Technological University, Singapore, 2012.

8. Eiben A. E., Smith J. E. Introduction to Evolutionary Computation. Springer, Berlin, 2003.

9. Deb K. An efficient constraint handling method for genetic algorithms. Computer Methods in Applied Mechanics and Engineering, 186 (2-4), 2000, P. 311–338.

10. Liang J. J., Shang Zhigang, Li Zhihui. Coevolutionary Comprehensive Learning Particle Swarm Optimizer. In Proceedings of the Congress on Evolutionary Computation (CEC), 2010, P. 1505–1512.

11. Mallipeddi R., Suganthan P. N. Problem Definitions and Evaluation Criteria for the CEC 2010 Competition on Constrained Real-Parameter Optimization. Nanyang Technological University, Singapore, 2009.

12. Kennedy J., Eberhart R. C. A discrete binary version of the particle swarm algorithm. In Proceedings of the World Multiconference on Systemics, Cybernetics and Informatics, Piscataway, NJ, 1997, P. 4104–4109.

13. Frank A., Asuncion A. UCI Machine Learning Repository. Irvine, CA: University of California, School of Information and Computer Science, 2010. Available at: http://archive.ics.uci.edu/ml.