Research article

Consensus-based global optimization with personal best

  • Received: 15 May 2020; Accepted: 17 August 2020; Published: 11 September 2020
  • In this paper we propose a variant of a consensus-based global optimization (CBO) method that uses personal best information in order to compute the global minimum of a non-convex, locally Lipschitz continuous function. The proposed approach is motivated by the original particle swarm algorithms, in which particles adjust their position with respect to their personal best, the current global best, and some additive noise. The personal best information along an individual trajectory is included with the help of a weighted mean. This weighted mean can be computed very efficiently due to its accumulative structure, and it enters the dynamics via an additional drift term. We illustrate the performance with a toy example, analyze the respective memory-dependent stochastic system, and compare the performance with the original CBO with component-wise noise for several benchmark problems. The proposed method has a higher success rate in computational experiments with a small number of particles and with an initial particle distribution that is disadvantageous with respect to the global minimum.
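
    The abstract outlines the algorithmic ingredients: a drift towards a weighted average of the current particle positions (the consensus point), a second drift towards a weighted mean along each particle's own trajectory (the personal best), and multiplicative noise. The following Python sketch is a minimal illustration of that structure, not the authors' reference implementation; the exponential weights, the Euler-Maruyama time discretization, and all parameter names and default values (lam, mu, sigma, alpha, beta) are assumptions made for illustration only.

        import numpy as np

        def cbo_personal_best(f, X0, lam=1.0, mu=0.4, sigma=0.7, alpha=30.0,
                              beta=1.0, dt=0.01, n_steps=2000, rng=None):
            # Euler-Maruyama sketch of CBO with an additional personal-best drift.
            # X0: (N, d) array of initial positions; f maps a d-vector to a scalar.
            # Parameter names and default values are illustrative assumptions.
            rng = np.random.default_rng() if rng is None else rng
            X = X0.astype(float)
            N, d = X.shape
            num = np.zeros((N, d))   # running numerator of the weighted trajectory mean
            den = np.zeros(N)        # running denominator

            for _ in range(n_steps):
                fX = np.array([f(x) for x in X])
                w = np.exp(-alpha * (fX - fX.min()))        # Gibbs-type weights, shifted for stability
                v = (w[:, None] * X).sum(axis=0) / w.sum()  # weighted average of current positions

                # accumulate the weighted mean along each trajectory ("personal best");
                # only the running sums are stored, so the update is O(N*d) per step
                wp = np.exp(-beta * fX)
                num += wp[:, None] * X * dt
                den += wp * dt
                p = num / den[:, None]

                # isotropic noise scaled by the distance to the consensus point;
                # a component-wise variant would use np.abs(X - v) instead of dist
                dist = np.linalg.norm(X - v, axis=1, keepdims=True)
                noise = sigma * dist * np.sqrt(dt) * rng.standard_normal((N, d))

                # consensus drift + personal-best drift + diffusion
                X = X - lam * (X - v) * dt - mu * (X - p) * dt + noise
            return X, v

    For example, cbo_personal_best(lambda x: np.sum(x**2), np.random.default_rng(0).uniform(-3.0, 3.0, (50, 2))) drives the consensus point towards the minimizer at the origin in this convex test case. Because only the running numerator and denominator of the weighted mean are stored, maintaining the personal best costs O(Nd) per time step, which reflects the accumulative structure mentioned in the abstract.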

    Citation: Claudia Totzeck, Marie-Therese Wolfram. Consensus-based global optimization with personal best. Mathematical Biosciences and Engineering, 2020, 17(5): 6026-6044. doi: 10.3934/mbe.2020320

  • © 2020 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)