Mate selection plays a key role in the natural evolution process. Although a variety of mating strategies have been proposed in the evolutionary computation community, the importance of mate selection has largely been ignored. In this paper, we propose a clustering based mate selection (CMS) strategy for evolutionary algorithms (EAs). In CMS, the population is partitioned into clusters, and only solutions in the same cluster are chosen for offspring reproduction. Instead of running a complete clustering process in each EA generation, the clustering iteration is combined with the evolution iteration. This combination benefits EAs by saving the cost of discovering the population structure. To demonstrate the idea, a CMS based on the k-means clustering method is proposed and applied to a state-of-the-art EA. The experimental results show that the CMS strategy is a promising way to improve the performance of the EA.
Citation: Jinyuan Zhang, Aimin Zhou, Guixu Zhang, Hu Zhang. 2017: A clustering based mate selection for evolutionary optimization, Big Data and Information Analytics, 2(1): 77-85. doi: 10.3934/bdia.2017010
In this paper, we consider the following continuous global optimization problem:

$$\min f(x) \quad \text{s.t.} \quad x \in \prod_{i=1}^{n}[a_i, b_i] \tag{1}$$

where $x = (x_1, \ldots, x_n)$ is the decision variable vector, $f(x): \mathbb{R}^n \to \mathbb{R}$ is the objective function, and $[a_i, b_i]$ ($i = 1, \ldots, n$) is the feasible range of the $i$-th variable.
The evolutionary algorithm (EA) is a type of heuristic optimization method inspired by the natural evolution process [1], and it has become a major approach to tackling (1). The major components of a general EA are a reproduction operator and a selection operator. A key operation in natural evolution, known as mate selection, chooses mating pairs or groups for breeding and plays a central role in sexual propagation. In EAs, a proper mate selection can likewise control population convergence and diversity efficiently [13,6,12]. Over the last decades, a number of mating strategies have been proposed [16], including random mating, roulette wheel selection, truncation selection, tournament selection, gender based selection [7,15,19], niche based selection [2], dissortative selection [3,4], and other methods [5,14,18]. Despite these proposals, mate selection has not attracted much attention in the evolutionary computation community [16]. The major reasons might be that (a) most existing mating strategies require problem specific control parameters or are computationally expensive, and (b) some widely used EAs work well by simply choosing mating pairs at random. In this paper, we shall demonstrate that existing EAs can be improved by properly designed mating strategies.
Statistical and machine learning (SML) techniques aim to extract information from data sets and transform it into understandable patterns or structures for further use [8]. It is arguable that in EAs, an individual can be regarded as a training example and its fitness value as a label. From the SML viewpoint, the population of an EA thus forms a training data set, and SML techniques can be naturally applied to EAs to extract population information and guide the search. Some algorithms, such as estimation of distribution algorithms [10] and surrogate assisted evolutionary algorithms [9], follow this direction. However, SML techniques are generally computationally expensive compared to plain EAs, which limits their usage; how to use SML techniques in EAs efficiently is still an open question.
In this paper, we present a new way to combine SML methods with EAs. The basic idea is to interleave the SML training step and the EA evolving step: in the SML step, the current population is used to train a model that captures the population structure, and in the EA step, the structure information extracted by the SML step guides the search. Interleaving the SML iteration and the EA iteration allows the population structure to be found and refined incrementally, and thus saves SML cost. In multi-objective evolutionary optimization, there are some works with a similar idea [20]; for scalar-objective optimization, however, this strategy is still new. Based on this idea, this paper proposes a clustering based mate selection (CMS) operator for EAs. In CMS, the population is partitioned into clusters in each generation, and only solutions in the same cluster are allowed to mate with each other. A CMS utilizing the k-means clustering method [11] is proposed and applied to a state-of-the-art EA to show its advantages.
The rest of the paper is organized as follows. Section 2 presents the proposed CMS strategy and introduces an EA integrated with CMS in detail. Section 3 compares the proposed CMS strategy with some other mating strategies and studies the influence of the control parameters. Finally, the paper is concluded in Section 4.
A major challenge in applying SML techniques in EAs is their high computational cost. This section introduces a clustering based mate selection (CMS) to address this challenge. The basic idea is to combine an EA with an iterative clustering method. Taking the k-means method as an example, in each generation (iteration) of the combined process, the clustering part uses the EA population to assign points and update cluster centers, and then, based on the resulting partition, the EA chooses parents from the same cluster to generate new trial solutions. It should be noted that the CMS assisted EA does not run a complete clustering method in each generation: it combines the clustering iteration with the EA iteration, and only one clustering iteration is performed per generation. In effect, the clustering process is spread sequentially along the EA process, which saves computational cost. The major components of an iterative clustering method and an EA are thus combined in the CMS assisted EA (CMS-EA for short). A restart checking component is added to re-initialize the clustering, mainly to avoid the clustering getting trapped in local optima.
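To make the interleaving concrete, the single k-means iteration performed per generation might look like the following minimal sketch. The helper name `kmeans_step` and the toy setup are illustrative assumptions, not the authors' implementation; the population is assumed to be a NumPy array of shape (N, n).

```python
import numpy as np

def kmeans_step(pop, centers):
    """One k-means iteration: assign each solution to its nearest
    center, then update each center as the mean of its members."""
    # squared distances between every solution and every center
    d2 = ((pop[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    labels = d2.argmin(axis=1)
    new_centers = centers.copy()
    for k in range(len(centers)):
        members = pop[labels == k]
        if len(members) > 0:  # keep the old center if the cluster is empty
            new_centers[k] = members.mean(axis=0)
    return labels, new_centers

# toy usage: 10 solutions in 2-D, 3 clusters seeded from the population
rng = np.random.default_rng(0)
pop = rng.uniform(-5.0, 5.0, size=(10, 2))
centers = pop[rng.choice(10, size=3, replace=False)]
labels, centers = kmeans_step(pop, centers)
```

Calling `kmeans_step` once per EA generation, on the population that the EA is already maintaining, is what distinguishes CMS from running k-means to convergence every generation.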
In this paper, we use the CMS strategy to improve the performance of the composite differential evolution (CoDE) algorithm [21]. In CoDE, each solution produces three candidate offspring solutions by using three reproduction operators with randomly selected control parameter settings, and the best candidate is chosen as the offspring solution for updating; more details can be found in [21]. The k-means clustering method is used to partition the population. In the following, we give the framework of the proposed approach, named CMS-CoDE.
1. Randomly initialize a population and the cluster centers.
2. If the restart condition is met, re-initialize the cluster centers.
3. For each solution, assign it to the cluster with the nearest center.
4. For each cluster, update the center as the mean of its member solutions.
5. For each solution in the population:
   5.1 Generate three candidate solutions with the CoDE reproduction operators, using parents selected from the same cluster.
   5.2 Set the trial solution to be the best of the three candidates.
   5.3 Replace the current solution by the trial solution if the trial one is better.
6. If the stop condition is not satisfied, go to Step 2; otherwise, output the best solution found so far.
We make the following comments on the above algorithm.
• In Step 2, the clustering process is re-initialized every given number of generations (the restart period).
• In Step 5.1, CoDE generates three candidate solutions for each solution $x$ by the following three operators:

$$u_{1,j}=\begin{cases} x_{r_1,j}+F\cdot(x_{r_2,j}-x_{r_3,j}) & \text{if } rand<Cr \text{ or } j=j_{rnd}\\ x_j & \text{otherwise}\end{cases}$$

$$u_{2,j}=\begin{cases} x_{r_1,j}+F\cdot(x_{r_2,j}-x_{r_3,j})+F\cdot(x_{r_4,j}-x_{r_5,j}) & \text{if } rand<Cr \text{ or } j=j_{rnd}\\ x_j & \text{otherwise}\end{cases}$$

$$u_{3,j}=x_j+rand\cdot(x_{r_1,j}-x_j)+F\cdot(x_{r_2,j}-x_{r_3,j})$$

where $r_1,\ldots,r_5$ are distinct parent indices randomly selected from the same cluster as $x$, $F$ is the scale factor, $Cr$ is the crossover rate, $rand$ denotes a uniform random number in $[0,1]$, and $j_{rnd}$ is a randomly chosen variable index ensuring that at least one dimension is inherited from the mutant vector.
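The three operators above, restricted to a mating pool (e.g., the indices of one cluster), could be sketched as follows. The function `code_trials` and its argument names are illustrative assumptions, not the authors' implementation, and the scalar use of `rand` in the third operator follows the formula above.

```python
import numpy as np

rng = np.random.default_rng(1)

def code_trials(pop, i, idx_pool, F=0.5, Cr=0.9):
    """Generate the three CoDE-style candidates for solution i,
    drawing parents only from idx_pool (e.g. the same cluster)."""
    n = pop.shape[1]
    x = pop[i]
    # five distinct parents, all different from i
    pool = [j for j in idx_pool if j != i]
    r1, r2, r3, r4, r5 = rng.choice(pool, size=5, replace=False)
    jrnd = rng.integers(n)  # dimension always taken from the mutant
    # rand/1/bin
    mask1 = rng.random(n) < Cr
    mask1[jrnd] = True
    u1 = np.where(mask1, pop[r1] + F * (pop[r2] - pop[r3]), x)
    # rand/2/bin
    mask2 = rng.random(n) < Cr
    mask2[jrnd] = True
    u2 = np.where(mask2, pop[r1] + F * (pop[r2] - pop[r3])
                         + F * (pop[r4] - pop[r5]), x)
    # current-to-rand/1 (no crossover, single scalar rand)
    k = rng.random()
    u3 = x + k * (pop[r1] - x) + F * (pop[r2] - pop[r3])
    return u1, u2, u3

# toy usage: 8 solutions in 5-D, mating pool = whole population
pop = rng.uniform(-5.0, 5.0, size=(8, 5))
u1, u2, u3 = code_trials(pop, 0, list(range(8)))
```

In CMS-CoDE the pool passed in would be the index set of the cluster containing solution $i$, which is the only change relative to plain CoDE.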
• In Step 6, CoDE terminates when the number of function evaluations exceeds a given threshold.
• It is required that each cluster contains enough solutions for the reproduction operators to select the distinct parents they need.
In this section, we compare the proposed CMS strategy with the following related strategies. Random mating strategy (RND): the parent solutions are randomly chosen from the whole population; the original CoDE algorithm actually uses this strategy. Nearest neighbor strategy (NNS): for each solution, the parents are chosen from among its nearest neighbors in the population. BCS: a complete k-means clustering, iterated until convergence, is performed in each generation, and parents are chosen from the same cluster as in CMS.
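The difference between the mating pools of these strategies can be sketched as follows; the helper names are hypothetical, and `labels` stands in for a clustering result such as the one k-means produces.

```python
import numpy as np

rng = np.random.default_rng(2)
pop = rng.uniform(-5.0, 5.0, size=(12, 3))

def rnd_pool(pop, i):
    # RND: any other solution may serve as a parent
    return [j for j in range(len(pop)) if j != i]

def nns_pool(pop, i, m=5):
    # NNS: the m nearest neighbours of solution i
    d = np.linalg.norm(pop - pop[i], axis=1)
    return [j for j in np.argsort(d) if j != i][:m]

def cms_pool(pop, i, labels):
    # CMS/BCS: only solutions sharing solution i's cluster
    return [j for j in range(len(pop))
            if labels[j] == labels[i] and j != i]

labels = rng.integers(0, 3, size=12)  # stand-in for k-means labels
p_rnd = rnd_pool(pop, 0)
p_nns = nns_pool(pop, 0)
p_cms = cms_pool(pop, 0, labels)
```

RND ignores the population structure entirely, NNS recomputes distances for every solution, and CMS reuses the cluster labels maintained across generations, which is where its cost advantage comes from.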
To assess the performance of the compared strategies, the first 20 test instances (F1-F20) of the CEC 2005 test suite [17] are used.
The experimental results are given in Table 1, and the population partitions of a typical run for BCS and CMS strategies with CoDE are plotted in Fig. 1 on two instances.
Table 1. Results of CoDE with the four mating strategies on F1-F20 ("+", "−", "≈" indicate that CMS performs better than, worse than, or comparably to the corresponding strategy).

| | RND | NNS | BCS | CMS |
|---|---|---|---|---|
| F1 | 3.21e-08+ | 0.00e+00≈ | 0.00e+00≈ | 0.00e+00 |
| F2 | 3.14e-01+ | 8.17e-04+ | 2.71e-05− | 5.66e-05 |
| F3 | 1.21e+05− | 2.13e+05− | 2.22e+05≈ | 2.64e+05 |
| F4 | 1.08e+01+ | 3.74e+00+ | 1.87e-01− | 2.72e-01 |
| F5 | 3.90e+02− | 1.00e+03≈ | 6.85e+02− | 8.69e+02 |
| F6 | 2.65e+01+ | 7.19e+01+ | 4.55e+01≈ | 3.80e+01 |
| F7 | 4.70e+03+ | 4.70e+03+ | 4.70e+03≈ | 4.70e+03 |
| F8 | 2.09e+01+ | 2.08e+01+ | 2.02e+01− | 2.03e+01 |
| F9 | 1.67e+01+ | 5.15e-06− | 4.65e+00− | 7.31e+00 |
| F10 | 1.63e+02+ | 3.52e+01− | 4.64e+01+ | 4.11e+01 |
| F11 | 3.37e+01+ | 9.85e+00− | 1.33e+01≈ | 1.35e+01 |
| F12 | 1.88e+05+ | 1.54e+05+ | 7.16e+04≈ | 7.90e+04 |
| F13 | 8.18e+00+ | 2.65e+00− | 2.58e+00− | 2.91e+00 |
| F14 | 1.33e+01+ | 1.21e+01− | 1.24e+01≈ | 1.23e+01 |
| F15 | 6.40e+02+ | 5.14e+02+ | 4.71e+02≈ | 4.78e+02 |
| F16 | 4.25e+02+ | 3.00e+02− | 3.10e+02≈ | 3.15e+02 |
| F17 | 4.66e+02+ | 3.03e+02− | 3.16e+02≈ | 3.19e+02 |
| F18 | 9.26e+02− | 9.26e+02≈ | 9.24e+02≈ | 9.24e+02 |
| F19 | 9.26e+02≈ | 9.26e+02− | 9.25e+02≈ | 9.26e+02 |
| F20 | 9.26e+02≈ | 9.24e+02− | 9.25e+02≈ | 9.25e+02 |
| + | 15 | 7 | 1 | |
| − | 3 | 10 | 6 | |
| ≈ | 2 | 3 | 13 | |
RND vs. CMS: From Table 1, CMS-CoDE performs better than RND-CoDE on 15 test instances and worse on 3. This suggests that restricting the mating parents to similar individuals can improve the algorithm performance significantly.
NNS vs. CMS: Table 1 shows that CMS-CoDE outperforms NNS-CoDE on 7 instances and is outperformed on 10. In NNS, the parents are the closest solutions, with characteristics similar to the current one, which may help the search converge to optima quickly, especially when the problem has no variable dependencies. In CMS, the parents are likely to be close, but there is still some probability that they are far away from each other, which helps maintain population diversity; this might explain the different performances of NNS and CMS. Although the results are comparable, the next section shows that CMS-CoDE has a lower theoretical computational complexity than NNS-CoDE.
BCS vs. CMS: It is surprising that BCS-CoDE performs slightly better than CMS-CoDE. Table 1 shows little difference between the two algorithms on 13 out of 20 test instances. The reason might be that the k-means result depends highly on the initial cluster centers and is likely to converge to local optima; the resulting mis-clustering injects some randomness into the population and prevents premature convergence. Although BCS-CoDE is slightly superior to CMS-CoDE, it has a higher computational complexity, as analyzed in the next section.
Regarding the population partitions produced by the BCS and CMS strategies on F2 and F3, Fig. 1 shows that, for CMS-CoDE, the partitions change little over 9 consecutive generations, while BCS-CoDE produces quite different partitions in each generation. The reason is that the clustering in CMS-CoDE iterates only once per generation, while that in BCS-CoDE iterates many times. A stable population partition should be more helpful for generating high-quality solutions.
The additional time complexity brought by the mating strategies is a major concern. Let $N$ denote the population size, $n$ the number of variables, and $K$ the number of clusters. Per generation, the four strategies cost roughly: RND takes $O(N)$ time, since parents are chosen uniformly at random; NNS takes $O(N^2 n)$ time to compute the pairwise distances among the solutions; BCS takes $O(IKNn)$ time to run a complete k-means with $I$ iterations; and CMS takes $O(KNn)$ time, since only one k-means iteration is performed per generation. It is reasonable to assume that $K \ll N$ and $I > 1$, so CMS has a lower theoretical complexity than both NNS and BCS, while adding little cost over RND.
We also record the CPU run time in Table 2, although it depends on the implementation. Comparing the RND and CMS columns shows that the additional cost of CMS is small; on some instances, the CPU time of CMS is even slightly less than that of RND. On all instances, BCS needs more time than CMS, which is consistent with the above analysis.
Table 2. CPU run time of CoDE with the four mating strategies on F1-F20.

| | RND | NNS | BCS | CMS |
|---|---|---|---|---|
| F1 | 11.68 | 12.84 | 17.61 | 12.99 |
| F2 | 12.21 | 12.70 | 15.02 | 12.82 |
| F3 | 12.93 | 12.83 | 15.11 | 13.09 |
| F4 | 12.99 | 13.29 | 15.39 | 13.31 |
| F5 | 16.07 | 15.18 | 17.08 | 15.09 |
| F6 | 11.72 | 12.76 | 500.28 | 12.71 |
| F7 | 14.54 | 15.73 | 17.33 | 15.58 |
| F8 | 16.89 | 17.65 | 16.89 | 14.39 |
| F9 | 14.98 | 14.46 | 126.06 | 12.82 |
| F10 | 15.06 | 13.14 | 13.98 | 13.09 |
| F11 | 61.29 | 58.76 | 58.64 | 57.58 |
| F12 | 46.49 | 47.59 | 44.44 | 42.98 |
| F13 | 13.06 | 13.15 | 14.53 | 13.00 |
| F14 | 17.08 | 16.40 | 16.32 | 14.39 |
| F15 | 113.89 | 115.58 | 180.81 | 115.32 |
| F16 | 119.73 | 116.78 | 171.78 | 116.16 |
| F17 | 120.10 | 117.87 | 452.28 | 117.31 |
| F18 | 123.71 | 121.59 | 121.74 | 120.65 |
| F19 | 149.03 | 152.96 | 152.79 | 147.89 |
| F20 | 220.64 | 217.06 | 218.65 | 215.53 |
There are two control parameters in CMS: the number of clusters and the restart period, i.e., the number of generations after which the clustering is re-initialized.
Fig. 2 plots the error bars of the results obtained by CMS-CoDE with different combinations of the control parameters over 30 runs on the 6 instances. On F2, the influence of the parameter settings is clearly visible.
In this paper, we proposed a strategy to integrate statistical and machine learning (SML) techniques into evolutionary algorithms (EAs) to guide the search efficiently. The idea is to combine the SML iteration and the EA iteration, performing the learning process and the optimization process alternately. As an example, a general clustering based mate selection (CMS) assisted EA framework was proposed: the population is partitioned into clusters, and only parents in the same cluster are allowed to reproduce. More specifically, a CMS utilizing the k-means clustering technique was designed and integrated into a state-of-the-art EA. The experimental results suggested that CMS can improve the performance of existing EAs, and the time complexity analysis showed that the proposed approach adds little cost to the EA it improves.
It should be noted that the current work is preliminary, and a variety of directions are worth exploring: the combination of CMS and EAs can be further improved, how to organize the data for model building deserves study, and it is worthwhile to apply the CMS strategy to multi-objective optimization problems.
[1] T. Bäck, D. B. Fogel and Z. Michalewicz (eds.), Handbook of Evolutionary Computation, Oxford University Press, 1997.
[2] K. Deb and D. E. Goldberg, An investigation of niche and species formation in genetic function optimization, in Proceedings of the 3rd International Conference on Genetic Algorithms, Morgan Kaufmann Publishers Inc., 1989, 42-50.
[3] L. J. Eshelman and J. D. Schaffer, Preventing premature convergence in genetic algorithms by preventing incest, in International Conference on Genetic Algorithms, 1991, 115-122.
[4] C. M. Fernandes and A. C. Rosa, Evolutionary algorithms with dissortative mating on static and dynamic environments, in Advances in Evolutionary Algorithms, 2008, 181-206.
[5] S. F. Galán, O. J. Mengshoel and R. Pinter, A novel mating approach for genetic algorithms, Evolutionary Computation, 21 (2013), 197-229.
[6] A. Gog, C. Chira, D. Dumitrescu and D. Zaharie, Analysis of some mating and collaboration strategies in evolutionary algorithms, in 10th International Symposium on Symbolic and Numeric Algorithms for Scientific Computing, IEEE, 2008, 538-542. doi: 10.1109/SYNASC.2008.87
[7] K. S. Goh, A. Lim and B. Rodrigues, Sexual selection for genetic algorithms, Artificial Intelligence Review, 19 (2003), 123-152.
[8] T. Hastie, R. Tibshirani and J. Friedman, The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd edition, Springer Series in Statistics, Springer, New York, 2009. doi: 10.1007/978-0-387-84858-7
[9] Y. Jin, Surrogate-assisted evolutionary computation: Recent advances and future challenges, Swarm and Evolutionary Computation, 1 (2011), 61-70. doi: 10.1016/j.swevo.2011.05.001
[10] P. Larrañaga and J. A. Lozano, Estimation of Distribution Algorithms: A New Tool for Evolutionary Computation, Kluwer Academic Publishers, 2002. doi: 10.1007/978-1-4615-1539-5
[11] J. B. MacQueen, Some methods for classification and analysis of multivariate observations, in Proceedings of the 5th Berkeley Symposium on Mathematical Statistics and Probability, University of California Press, 1967, 281-297.
[12] G. Ochoa, C. Mädler-Kron, R. Rodriguez and K. Jaffe, Assortative mating in genetic algorithms for dynamic problems, in Applications of Evolutionary Computing, Springer, 2005, 617-622. doi: 10.1007/978-3-540-32003-6_65
[13] T. S. Quirino, Improving Search in Genetic Algorithms Through Instinct-Based Mating Strategies, Ph.D. dissertation, The University of Miami, 2012.
[14] T. Quirino, M. Kubat and N. J. Bryan, Instinct-based mating in genetic algorithms applied to the tuning of 1-NN classifiers, IEEE Transactions on Knowledge and Data Engineering, 22 (2010), 1724-1737. doi: 10.1109/TKDE.2009.211
[15] J. Sanchez-Velazco and J. A. Bullinaria, Sexual selection with competitive/co-operative operators for genetic algorithms, in Neural Networks and Computational Intelligence (NCI), ACTA Press, 2003, 191-196.
[16] R. Sivaraj and T. Ravichandran, A review of selection methods in genetic algorithm, International Journal of Engineering Science and Technology (IJEST), 3 (2011), 3792-3797.
[17] P. N. Suganthan, N. Hansen, J. J. Liang, K. Deb, Y. P. Chen, A. Auger and S. Tiwari, Problem Definitions and Evaluation Criteria for the CEC 2005 Special Session on Real-Parameter Optimization, Technical report, Nanyang Technological University, Singapore, and Kanpur Genetic Algorithms Laboratory, IIT Kanpur, 2005.
[18] C.-K. Ting, S.-T. Li and C. Lee, On the harmonious mating strategy through tabu search, Information Sciences, 156 (2003), 189-214. doi: 10.1016/S0020-0255(03)00176-2
[19] S. Wagner and M. Affenzeller, SexualGA: Gender-specific selection for genetic algorithms, in Proceedings of the 9th World Multi-Conference on Systemics, Cybernetics and Informatics (WMSCI), 4 (2005), 76-81.
[20] R. Wang, P. J. Fleming and R. C. Purshouse, General framework for localised multi-objective evolutionary algorithms, Information Sciences, 258 (2014), 29-53. doi: 10.1016/j.ins.2013.08.049
[21] Y. Wang, Z. Cai and Q. Zhang, Differential evolution with composite trial vector generation strategies and control parameters, IEEE Transactions on Evolutionary Computation, 15 (2011), 55-66. doi: 10.1109/TEVC.2010.2087271