Research article

An improved particle swarm optimization combined with double-chaos search


  • Received: 28 May 2023; Revised: 28 June 2023; Accepted: 9 July 2023; Published: 28 July 2023
  • Particle swarm optimization (PSO) has been successfully applied to various complex optimization problems due to its simplicity and efficiency. However, the standard PSO update strategy learns from the global best particle, which makes it difficult to maintain population diversity and leaves the swarm prone to premature convergence once it is trapped in a local optimum. The chaos search mechanism is an optimization technique based on chaotic dynamics: it exploits the randomness and nonlinearity of a chaotic system for global search and can escape from local optima. To overcome these limitations of PSO, an improved particle swarm optimization combined with double-chaos search (DCS-PSO) is proposed in this paper. In DCS-PSO, we first introduce a double-chaos search mechanism to narrow the search space, which enables PSO to focus on the neighborhood of the optimal solution and reduces the probability that the swarm becomes trapped in a local optimum. Second, to enhance population diversity, the logistic map is employed to perform a global search in the narrowed search space, and the best solution found across the logistic search and the population search guides the swarm to converge. Experimental results show that DCS-PSO can effectively narrow the search space and, in most cases, achieves better convergence accuracy and speed.

    Citation: Xuepeng Zheng, Bin Nie, Jiandong Chen, Yuwen Du, Yuchao Zhang, Haike Jin. An improved particle swarm optimization combined with double-chaos search[J]. Mathematical Biosciences and Engineering, 2023, 20(9): 15737-15764. doi: 10.3934/mbe.2023701
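    The abstract relies on two standard building blocks: the logistic map as a chaos generator and the classical global-best PSO update rule. The sketch below illustrates only these generic components in Python; the function names and parameter values (inertia weight w, acceleration coefficients c1 and c2, control parameter mu = 4) are illustrative assumptions and do not reproduce the authors' DCS-PSO implementation.

```python
import numpy as np

# Logistic map: x_{k+1} = mu * x_k * (1 - x_k). With mu = 4 the iteration is
# chaotic on (0, 1), so the sequence can be stretched onto a search interval
# and used to sample candidate solutions (the basic idea of chaos search).
def logistic_sequence(x0, length, mu=4.0):
    xs = np.empty(length)
    x = x0
    for k in range(length):
        x = mu * x * (1.0 - x)
        xs[k] = x
    return xs

# Classical PSO update (the "learn from the global best particle" rule the
# abstract refers to); w, c1, c2 are commonly used illustrative values.
def pso_step(x, v, pbest, gbest, w=0.729, c1=1.494, c2=1.494, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return x + v, v

# Example: map a chaotic sequence onto the interval [lo, hi] to generate
# candidate points for a chaos-based global search.
lo, hi = -5.0, 5.0
candidates = lo + (hi - lo) * logistic_sequence(x0=0.3157, length=10)
print(candidates)
```

    Per the abstract, DCS-PSO uses the chaos search to narrow the interval [lo, hi] to a neighborhood of the optimum before the PSO update above takes over; the exact narrowing rule is given in the paper itself.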

  • © 2023 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)

