
Improving diversification by a hybrid bat-Nelder-Mead algorithm and DDE for rapid convergence to solve global optimization

  • Delay differential equations and algorithms hold a crucial position in the exploration of some biological systems and several models in real-world applications, so such algorithms contribute to improving mathematical models related to natural-life problems and global optimization. A novel hybridization between the downhill Nelder-Mead simplex algorithm (NM) and the classic bat algorithm (BA) was presented. The classic BA suffers from premature convergence, which is due to its weak global search. In this research, this weakness was overcome by introducing the NM into the velocity-updating formula of the particles as an additional term. This improvement diverts particles from the single route toward only the best solution found, so that they explore the search space more thoroughly. Once this improvement detects a promising area, sequential expansions are performed to explore the area deeply. This mechanism provides rapid convergence for the algorithm. A deep analysis of the algorithm's behaviour was provided, and thoughtful experiments were conducted and evaluated utilizing several evaluation metrics together with the Wilcoxon signed-rank test to accentuate the effectiveness and efficiency of the proposed algorithm.

    Citation: Enas Suhail, Mahmoud El-Alem, Omar Bazighifan, Ahmed Zekri. Improving diversification by a hybrid bat-Nelder-Mead algorithm and DDE for rapid convergence to solve global optimization[J]. AIMS Mathematics, 2024, 9(12): 35655-35677. doi: 10.3934/math.20241692




    The study of several real-world applications is crucially related to delay differential equations and algorithms. Different algorithms are designed to solve global optimization problems, which are inherently intractable. Conventional (deterministic) algorithms are not efficient at solving global problems. One of the main reasons for this is that deterministic algorithms seek the exact global optimum over the entire feasible region, which is a time-consuming procedure for problems of moderate to high dimension. Ideal surrogates for deterministic algorithms in solving these problems are algorithms with a stochastic element, called heuristic algorithms, as the randomness in the global search guides the search toward regions where a better solution is likely to be found. Metaheuristic algorithms are a type of heuristic applicable to a wide range of problems. In the past few decades, several nature-inspired metaheuristics have been developed. One class of them is based on the social interactions of animal communities, called swarm intelligence (SI) algorithms, such as the genetic algorithm [1], particle swarm optimization (PSO) algorithm [2], ant colony optimization (ACO) [3], artificial fish swarm algorithm (AFSA) [4], and bat algorithm (BA) [5]. Recently, several other metaheuristic algorithms were proposed, such as the liver cancer algorithm (LCA) [6], fata morgana algorithm (FATA) [7], snow geese algorithm (SGA) [8], and more.

    Over time, diligent research has revealed defects that affect the performance of standard metaheuristics in solving certain problems. Accordingly, numerous impressive algorithms were reported that combine metaheuristics with different techniques or with other algorithms to produce new versions that overcome the deficiencies of the component algorithms. In [9], the author combined the firefly algorithm (FA) with extremal optimization to tackle the defects of the FA, namely slow convergence and falling into local minima. In [10], the author built a framework for conducting an effective analysis of recurrent spontaneous abortion. This framework combined the joint self-adaptive slime mould algorithm (JASMA) with a common kernel learning support vector machine based on maximum-margin hyperplane theory. The BA was not excluded from this cutting-edge phenomenon. The BA has some advantages over other algorithms that make it promising and interesting, such as its simplicity in structure, speed in processing, and ease of hybridization with other algorithms. Furthermore, the BA has efficiently dealt with several optimization problems in different fields, such as power systems, economic load dispatch problems, image processing, and medical applications; see [11] for a detailed description. However, the exploration in the BA is weak, as the BA focuses on exploitation rather than exploration, which makes it fall into local optima. For that reason, many works proposed variants of the BA to improve its performance by introducing different techniques. For instance, the researcher in [12] incorporated a chaotic sequence and chaotic Levy flight schemes to enrich the searching behavior and improve the local and global search capabilities. In [13], the author introduced, first, an iterative local search in the local search of the BA to avoid falling into local minima and, second, a stochastic inertia weight in the velocity-updating formula of the BA. The researcher in [14] incorporated quantum evolution and an annealing strategy into the BA to achieve a balance between intensification and diversification. Other works combine the BA with other algorithms, a combination referred to as hybrid metaheuristics. Such hybridization accentuates the strengths of its components and creates an algorithm capable of solving many hard optimization problems. For instance, [15] introduced a step of the harmony search (HS) algorithm into the BA and added an HS attribute as an operator. The author in [16] introduced a hybrid strategy that combines the genetic algorithm (GA) with the BA to improve the efficiency of the global search of the standard BA; in that work, the mean magnitude of relative error was used as the objective function. Other variants of the BA were proposed by combining the benefits of the BA with the local search Nelder-Mead algorithm (NM). In [17], the author started by following the BA routine to generate a new set of solutions; if any inferior solution was obtained, the NM was performed, taking the inferior solution as the starting point of the simplex in order to refine it. In [18], by contrast, the researcher took the best solution obtained after performing the BA routine as the starting point of the simplex in the NM algorithm. In [19], the NM method was used as a local search instead of the random walk applied in the standard BA to refine the best solution found so far, which helped to accelerate the search process.
    In [20], the author replaced the random walk used as a local search in the standard BA by the pattern search method to increase the intensification ability. Furthermore, the NM method was employed in the final stage of the algorithm to improve the best solution found.

    The motivation of this work is to tackle the main drawback of the BA, its weakness in exploration, by making changes in its structure, specifically in the updating process, using the NM algorithm, rather than the technique used in [17,18,19] of refining the solution found by the standard BA with the NM method. For that, a novel hybridization of the BA and NMA is carried out in which the global search ability of the standard BA is enhanced in the proposed algorithm (HBNMA) during the early iterations by the intervention of the NMA in the updating process of the solutions. This helps, first, to divert solutions from heading only toward the best solution found so far, so as to avoid falling into local minima. Second, it helps to penetrate deeply into a region where further potential best solutions may be found, using the sequential expansion steps defined in the NM algorithm; this can be seen in the Experiments and results section. As iterations elapse, exploitation takes over through the usual local search mechanism of the classic BA. This balance between exploration and exploitation enables the HBNMA to solve complex global optimization problems. The contributions of this paper are as follows:

    ● A fast hybrid bat-NM algorithm is proposed in which the NM plays a new role in the hybridization. The NM in this work is used to tackle the main drawback of the BA by improving the updating process of the BA. The proposed algorithm is able to solve complex and high-dimensional global problems.

    ● Breaking the tradition of following a uniform route for all particles, in the proposed algorithm, particles determine their next step by choosing between the proposed hybrid updating process and the classic one of the standard BA, according to which of the two achieves better results.

    ● A precise analysis of the working mechanism of the proposed algorithm and thoughtful experiments are presented with statistical measurements to thoroughly evaluate the merit of the HBNMA.

    The rest of this paper is organized as follows: Section 2 gives an overview of the classic NM and classic BA and, furthermore, describes the proposed algorithm. In Section 3, experiments and numerical results are illustrated. The time complexity is described in Section 4. In Section 5, the study's conclusion is presented. Finally, future work is described in Section 6.

    In this section, the classic NMA is presented as stated in the original version [21]. The main thrust of the NM is to solve the multidimensional, unconstrained optimization problem, with no need for derivatives,

    $\min f(x)$, (2.1)

    where the objective function $f:\mathbb{R}^d \rightarrow \mathbb{R}$ and $x \in \mathbb{R}^d$.

    This algorithm constructs a geometric framework called a simplex, with $d+1$ vertices for a $d$-dimensional problem, in which the worst vertex is replaced by a better one at each iteration. The surrogate vertex is obtained utilizing one or more procedures: reflection, expansion, contraction, or reduction.

    The NM starts by initializing the points $x_i$ (the vertices of the simplex). Next, the fitness value (objective function) $f$ is evaluated at each point $x_i$. The points are reordered according to their objective function values to determine the best $x_b$, the second worst $x_{w2}$, and the worst $x_w$, which satisfy:

    $f(x_b) < \cdots < f(x_{w2}) < f(x_w)$.

    After that, the centroid $x_c$ of all points except $x_w$ is calculated using:

    $x_c = \frac{1}{d}\sum_{i=1}^{d} x_i$. (2.2)

    The NMA implements different procedures to decide the most effective step along the direction of the line segment joining $x_c$ with $x_w$. These procedures are described in the following, in which the hyperparameters $\mu_r$, $\mu_e$, $\mu_{oc}$, $\mu_{ic}$, and $\mu_{re}$ used in these procedures are assigned the same values as in [21].

    (1) Reflection:

    This step indicates a reflection of the point $x_w$ through the point $x_c$. The reflection point $x_r$ is calculated utilizing the following equation:

    $x_r = x_c + (x_c - x_w)$. (2.3)

    (2) Expansion:

    This step extends the reflection step as long as an improvement is achieved. The expansion point $x_e$ is calculated utilizing:

    $x_e = x_c + 2(x_c - x_w)$. (2.4)

    (3) Contraction:

    This step shrinks the size of the simplex utilizing either an outside contraction $x_{oc}$ or an inside contraction $x_{ic}$. The points $x_{oc}$ and $x_{ic}$ are calculated as follows:

    $x_{oc} = x_c + 0.5(x_c - x_w)$, (2.5)
    $x_{ic} = x_c - 0.5(x_c - x_w)$. (2.6)

    (4) Reduction:

    This is the last resort after the failure of all previous attempts. In this step, all points $x_i$ are updated toward the best point $x_b$ using:

    $x_i = x_b + 0.5(x_i - x_b)$. (2.7)
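    To make the four procedures concrete, the following is a minimal Python sketch of one NM iteration. The acceptance logic is the standard simplex logic of [21] with the coefficient values listed above; the function name and array layout are illustrative assumptions, not the authors' implementation.

        import numpy as np

        def nm_iteration(simplex, f):
            """One Nelder-Mead iteration over a (d+1) x d simplex, per Eqs (2.2)-(2.7)."""
            simplex = simplex[np.argsort([f(v) for v in simplex])]  # order: best ... worst
            xb, xw2, xw = simplex[0], simplex[-2], simplex[-1]
            xc = simplex[:-1].mean(axis=0)             # centroid of all but the worst, Eq (2.2)
            xr = xc + (xc - xw)                        # reflection, Eq (2.3)
            if f(xr) < f(xb):
                xe = xc + 2.0 * (xc - xw)              # expansion, Eq (2.4)
                simplex[-1] = xe if f(xe) < f(xr) else xr
            elif f(xr) < f(xw2):
                simplex[-1] = xr                       # accept the reflected point
            else:
                # outside contraction if xr improves on xw, inside otherwise, Eqs (2.5)-(2.6)
                xk = xc + 0.5 * (xc - xw) if f(xr) < f(xw) else xc - 0.5 * (xc - xw)
                if f(xk) < min(f(xr), f(xw)):
                    simplex[-1] = xk
                else:
                    simplex = xb + 0.5 * (simplex - xb)  # reduction toward the best vertex, Eq (2.7)
            return simplex

    Repeating nm_iteration until the simplex collapses below a tolerance gives the usual derivative-free minimizer.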

    The bat algorithm emulates the echolocation trait and hunting behavior of microbats. Yang [5] designed the classic BA from the following idealized rules:

    (1) Different bats utilize echolocation to sense the distance between them and the target with a magical ability to distinguish between food and background barriers.

    (2) Bats fly randomly with velocity $v_i$ at position $x_i$ with a fixed frequency $f_{\min}$, a varied wavelength $\lambda_i$, and loudness $A_i$ to seek out prey. They can automatically adjust their wavelength (or frequency) according to the proximity of their target.

    (3) Loudness is assumed to vary from a large value $A_0$ to a minimum value $A_{\min}$.

    According to these rules, the BA is designed as follows:

    (1) Initialize randomly a population of $n$ solutions $x_i \in \mathbb{R}^d$.

    (2) Update the frequency $f_i$ and the velocity $v_i$ using:

    $f_i = f_{\min} + \mathrm{rand}\,(f_{\max} - f_{\min})$, (2.8)
    $v_i^t = v_i^{t-1} + (x_i^{t-1} - x^*) f_i$, (2.9)

    where $f_{\min}$ and $f_{\max}$ are prespecified minimum and maximum values for $f$, rand is a random number, $t$ is the current iteration, and $x^*$ is the current global solution. Then a new solution $x_i$ is generated as follows:

    $x_i^t = x_i^{t-1} + v_i^t$. (2.10)

    (3) If rand $< r_i$, evaluate a new local solution around the current global one using:

    $x_{new} = x_i + \epsilon A^t$, (2.11)

    where $\epsilon \in [-1, 1]$ and $A^t$ is the average loudness of all solutions at iteration $t$.

    (4) Update the best individual solutions and both the loudness and the rate of pulse emission, using the following formulae, if an improvement is achieved:

    $A_i^t = \alpha A_i^{t-1}$, (2.12)
    $r_i^t = r_i^0 (1 - \exp(-\gamma t))$, (2.13)

    where $\alpha \in (0,1)$, $\gamma > 0$, $A_i^t \rightarrow 0$, and $r_i^t \rightarrow r_i^0$ as $t \rightarrow \infty$.

    (5) Select the current global solution from the best individual solutions.
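    For reference, the five steps can be assembled into a compact Python sketch of the classic BA. The bounds, the frequency range, and the acceptance rule coupling loudness to replacement are illustrative assumptions; the experiments below use the parameter values of Table 1.

        import numpy as np

        def bat_algorithm(f, d, n=40, max_it=500, f_min=0.0, f_max=1.0,
                          alpha=0.5, gamma=0.5, lb=-100.0, ub=100.0, seed=0):
            """Minimal classic BA following Eqs (2.8)-(2.13)."""
            rng = np.random.default_rng(seed)
            x = lb + rng.random((n, d)) * (ub - lb)             # step (1): random population
            v = np.zeros((n, d))
            A, r0, r = np.ones(n), rng.random(n), np.zeros(n)   # loudness, base and current pulse rates
            fit = np.array([f(xi) for xi in x])
            best = x[fit.argmin()].copy()                       # current global solution x*
            for t in range(1, max_it + 1):
                for i in range(n):
                    fi = f_min + rng.random() * (f_max - f_min)    # Eq (2.8)
                    v[i] = v[i] + (x[i] - best) * fi               # Eq (2.9)
                    cand = np.clip(x[i] + v[i], lb, ub)            # Eq (2.10)
                    if rng.random() < r[i]:                        # step (3): local walk, Eq (2.11)
                        cand = np.clip(best + rng.uniform(-1, 1, d) * A.mean(), lb, ub)
                    f_cand = f(cand)
                    if f_cand < fit[i] and rng.random() < A[i]:    # step (4): accept, update A_i and r_i
                        x[i], fit[i] = cand, f_cand
                        A[i] *= alpha                              # Eq (2.12)
                        r[i] = r0[i] * (1.0 - np.exp(-gamma * t))  # Eq (2.13)
                best = x[fit.argmin()].copy()                      # step (5)
            return best, fit.min()

    For example, bat_algorithm(lambda z: float(np.sum(z**2)), d=10) minimizes a 10-dimensional sphere function.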

    A novel hybridization strategy between the BA and NMA, called the HBNMA, is introduced in this section. The standard BA guides particles toward the best solution, whereas the standard NM guides the worst particle (vertex) toward the centroid, or center of gravity, of the other particles. The HBNMA is based on the concept that a solution, at the beginning of the iterations, should experience a different direction besides the direction toward the best solution in order to avoid falling into a local minimum. Accordingly, the HBNMA introduces a new updating formula for the velocity of a particle, composed of the usual direction of the BA and the direction of the NM, in order to enhance the global search ability of the proposed HBNMA.

    The proposed algorithm starts by initializing $n$ solutions $x_i$ randomly using:

    $x_i = x_{\min} + \mathrm{rand}\,(x_{\max} - x_{\min})$, (2.14)

    where $[x_{\min}, x_{\max}]$ is the range of the search space, $i = 1, 2, \ldots, n$, and rand $\in (0,1)$ is a random number. Further, the velocity, frequency, loudness, and pulse emission rate are initialized. Utilizing the solutions' initial information, the HBNMA evaluates the fitness value. The solution with the best fitness is stored as the current global solution $x^*$.

    Each particle has two choices from which its step can be determined. The first is a combination of the classic BA step $v_{bat}$ (2.9) and the reflection step $v_r$ of the NM:

    $v_r = x_c + (x_c - x_i)$, (2.15)

    where $x_c$ is defined in (2.2). Figure 1 shows this combination geometrically in green-colored lines, or in more detail in blue-colored lines that generate the step from $v^t$ to $v^{t+1}$. Hence, the improved updating formula of each particle's velocity $v_i$ becomes:

    $v_i = v_{bat} + v_r$. (2.16)
    Figure 1.  Clarification of the direction $v_{bat}^{t+1}$ and the proposed direction $v^{t+1}$.

    If progress is achieved in the fitness value of $x_i$, that is, $f(x_i^{t+1}) < f(x_i^t)$, sequential expansion steps are carried out for as long as progress is made, to exploit the area deeply. The expansion steps are performed by replacing $v_r$ in (2.16) with $v_e$, which is calculated as:

    $v_e = x_c + \mu (x_c - x_i)$, (2.17)

    with $\mu = 2\mu_{pr}$, where $\mu_{pr}$ is the value of $\mu$ in the previous expansion step. The expansion steps are of the form:

    $v_i = v_{bat} + v_e$. (2.18)

    The term $v_{bat}$ alone guides particles toward the current global solution, as in the classic BA and as shown in Figure 1 with red-colored lines, which may make particles fall easily into a local optimum. Combining the term $v_r$ or $v_e$ with $v_{bat}$ in (2.16)–(2.18), shown graphically in Figure 1, diverts the particles' concentration from exclusive dependence on the direction toward the best solution, especially at the beginning of the iterations. This helps to target new areas of the search space which may be promising and, accordingly, allows the region to be explored effectively. This choice of the particle's step is accepted if progress in the fitness value is achieved. If not, the second choice is to keep the same updating process as in the standard BA (2.9). The proposed mechanism takes place at the beginning of the iterations, while in later iterations, as is usually done in the standard BA, if rand $< r_i$, solutions are updated locally around the current global solution using (2.11). This local intensification occurs in the area where the global optimum is expected.

    The full description of the HBNMA is shown in the following algorithm and flowchart (Figure 2):

    Figure 2.  Flowchart of the proposed HBNMA.

    Algorithm 1: HBNMA
    Input: Initial population of solutions $x_i$; the prespecified parameters
    Output: The global optimal solution $x^*$
    Evaluate the fitness value $f$ of each solution;
    Store the current global solution $x^*$;
    ...
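    Since the listing above is abridged, the core of the hybrid updating process can be sketched as follows. The helper signature, the centroid taken over the other particles, and the explicit fallback to the classic BA move are assumptions consistent with the description in this section, not the authors' code.

        import numpy as np

        def hbnma_step(x, i, v_bat, fit_i, f, max_expansions=50):
            """One particle update in the HBNMA, per Eqs (2.15)-(2.18).
            x: (n, d) population; v_bat: the BA velocity term of Eq (2.9);
            fit_i: current fitness of particle i."""
            xc = np.delete(x, i, axis=0).mean(axis=0)  # centroid of the other particles
            vr = xc + (xc - x[i])                      # reflection term, Eq (2.15)
            cand = x[i] + v_bat + vr                   # hybrid move, Eqs (2.16) and (2.10)
            if f(cand) < fit_i:                        # progress: perform sequential expansions
                mu = 2.0                               # mu = 2*mu_pr, starting from the reflection
                for _ in range(max_expansions):        # cap added for safety in this sketch
                    ve = xc + mu * (xc - x[i])         # expansion term, Eq (2.17)
                    nxt = x[i] + v_bat + ve            # Eq (2.18)
                    if not f(nxt) < f(cand):
                        break                          # keep the last improving point
                    cand, mu = nxt, 2.0 * mu
                return cand
            return x[i] + v_bat                        # no progress: classic BA move, Eq (2.10)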

    In this section, we validate the efficiency of the proposed HBNMA against its counterparts using classical and complex benchmark test functions along with the Wilcoxon signed-rank test. Furthermore, the working mechanism of the HBNMA is analyzed numerically.

    The performance of the HBNMA is examined utilizing two sets of test functions which provide a good representation of real-world optimization problems and show the merit of the proposed algorithm. The first set comprises classical test functions whose definitions are found in [22]. The selected functions vary in their characteristics: some are unimodal, to test the intensification ability of the proposed algorithm, while the others are multimodal, to test the ability to jump out of local optima. Moreover, the selected test functions vary in separability, which affects the difficulty of the problem. The dimensions of the selected functions, the ranges used, their optimum values, and their characteristics are described in the table found in the Supplementary Materials. The second set is the IEEE CEC2014 special session and competition on single-objective real-parameter numerical optimization benchmark suite [23]. This set of complex test functions contains 3 rotated unimodal functions F1–F3, 13 shifted and rotated multimodal functions F4–F16, 6 hybrid functions F17–F22, and 8 composition functions F23–F30, which validate the performance of the proposed algorithm.

    The significance of the proposed HBNMA is measured using some evaluation metrics and the Wilcoxon signed-rank test on the results of the experiments in Section 3.3.

    The performance of the proposed HBNMA in optimizing an objective function is evaluated using three metrics: the best value (BV), the mean value (MV), and the standard deviation (SD) over several runs.
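    As a brief illustration, the three metrics can be computed directly from the per-run results; the numbers below are made up.

        import numpy as np

        def summarize(run_bests):
            """Best value (BV), mean value (MV), and standard deviation (SD)
            of the best objective values collected over independent runs."""
            r = np.asarray(run_bests, dtype=float)
            return {"BV": r.min(), "MV": r.mean(), "SD": r.std()}

        print(summarize([3.2e-8, 0.0, 1.1e-7, 4.5e-8]))  # illustrative run results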

    The Wilcoxon signed-rank test is a non-parametric statistical test that aims to detect whether or not there is a significant difference between two sample means.

    In this research, the Wilcoxon signed-rank test is applied at a significance level of α = 0.05. The sum of the ranks for the functions in which the HBNMA outperformed its counterpart is denoted by R+, while the sum of the ranks for the opposite is denoted by R−. The two criteria R+ and R− are calculated as described in [24], in which average ranks are used to deal with ties. The Wilcoxon test produces a test statistic value (Z), which is converted into a probability (p-value). A low p-value (p < 0.05) reflects a significant difference between the HBNMA and its counterpart. IBM SPSS software was used in this work to calculate the p-values.
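    As an illustration, the same kind of paired test can be reproduced with SciPy, whose implementation assigns average ranks to tied differences; the arrays below are made-up mean errors, not the paper's data.

        from scipy.stats import wilcoxon

        hbnma_mv = [0.0, 1.2e-4, 3.4e-1, 1.9e-6, 6.7e-1, 2.2e-3, 5.0e-2]
        rival_mv = [2.7e-3, 4.3e-1, 8.4e-1, 3.3e-5, 9.2e-1, 2.0e-1, 6.7e-2]

        stat, p = wilcoxon(hbnma_mv, rival_mv)  # two-sided signed-rank test on paired differences
        print(f"W = {stat}, p = {p:.4f}")       # p < 0.05 indicates a significant difference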

    In this section, three experiments are performed to show the merit of the proposed HBNMA on problems of different dimensions. The first experiment is a comparison against the standard BA using problems with d = 5 up to d = 1000 to show the positive impact of hybridizing the BA with the NMA on the performance of the proposed algorithm; furthermore, Experiment 1 provides a numerical analysis of this positive impact. The second experiment is a comparison against different variants of the BA, described in Section 3.3.2, using standard benchmark test functions with different properties to thoroughly validate the proposed algorithm. The third experiment is against state-of-the-art heuristics utilizing CEC2014. The results obtained from these experiments are evaluated utilizing three evaluation metrics and the Wilcoxon signed-rank test, as described in Section 3.2. The code of the HBNMA was written and run in MATLAB 2013, on Windows 10 with an Intel Core i5-1035G CPU and 8.00 GB RAM.

    This experiment is divided into two parts. The first part is a comparison between the proposed variant of the BA, the HBNMA, and the standard BA on low- and high-dimensional problems to show the effect of hybridizing the BA with the NM on the performance of the HBNMA. The second part is an analysis of the algorithm's working mechanism, which explains the good results obtained by the HBNMA.

    Tables 2 and 3 show the results of the comparison utilizing several test functions of dimensions d = 5, 10, 100, and 1000 each. The experimental parameter values are the same in this comparison for both the BA and HBNMA to ensure fairness, and they are shown in Table 1. The results are evaluated using the error between the global optimum and the obtained solution, the mean number of function evaluations (NFE) over 40 independent runs needed to obtain the indicated accuracy, as well as the average time (AVT) taken to perform one run. Each algorithm terminates if either the error equals zero or the maximum number of function evaluations is reached (MaxFE = 2×10^4). The results show the ability of the proposed HBNMA to converge to the global optimum with zero error and a very low number of function evaluations in most functions, whether of low or high dimension. In contrast, the BA failed to obtain good results in most of the indicated functions, even though it used the prespecified maximum number of function evaluations, and its results got worse with increasing function dimension, as illustrated in Tables 2 and 3 and clarified in the example shown in Figure 3 using the Schwefel function. The figure demonstrates that the error value of the BA worsened as the dimension increased, up to 90 in the left-hand figure and from 100 up to 800 in the right-hand figure, because of its weakness in handling the high dimensionality of the problems. Meanwhile, the performance of the HBNMA remained stable as the dimension increased, with an error equal to zero. This comparison indicates that the hybridization in the HBNMA, first, enhances the exploration ability of the proposed algorithm beyond that of the standard BA. This enhancement enables the HBNMA to search deeply for the global optimum and jump out of any local optimum no matter the dimension of the test function, as illustrated in the error values of the HBNMA and BA. Second, the hybridization decreases the time consumed to reach the global optimum compared with the BA. This proves the merit of this work.

    Table 1.  Conditions and common parameters of all experiments.

    Definition                                       Experiment 1    Experiment 2   Experiment 3
    Conditions:
      Population size                                n = 40          n = 30         n = 50
      Number of runs                                 r = 40          r = 30         r = 51
      Maximum number of function evaluations         MaxFE = 2×10^4  -              MaxFE = d×10^5
      Maximum number of iterations                   -               MaxIt = 200    -
    Common parameters:
      Minimum frequency                              fmin = 1        fmin = 1       fmin = 1
      Maximum frequency                              fmax = 1        fmax = 1       fmax = 1
      Decay coefficient of loudness                  α = 0.5         α = 0.9        α = 0.5
      Enhancement coefficient of pulse emission rate γ = 0.5         γ = 0.6        γ = 0.5
    Table 2.  Results of comparison in Experiment 1.
    Function Dim. Algorithm Error NFE AVT | Function Dim. Algorithm Error NFE AVT
    Sphere 5 HBNMA 0 400 0.04 | Sumsquares 5 HBNMA 0 320 0.02
    BA 1.38×10^-2 2×10^4 0.56 | BA 1.89×10^-4 2×10^4 0.48
    10 HBNMA 0 400 0.02 | 10 HBNMA 0 400 0.02
    BA 1.45×10^1 2×10^4 0.44 | BA 4.26×10^-1 2×10^4 0.46
    100 HBNMA 0 560 0.04 | 100 HBNMA 0 480 0.05
    BA 5.97×10^3 2×10^4 0.70 | BA 2.80×10^3 2×10^4 0.53
    1000 HBNMA 0 560 0.11 | 1000 HBNMA 0 640 0.11
    BA 1.39×10^5 2×10^4 0.67 | BA 8.54×10^5 2×10^4 0.65
    Schwefel 2.21 5 HBNMA 0 400 0.03 | Schwefel 2.22 5 HBNMA 0 320 0.02
    BA 3.35×10^-1 2×10^4 0.43 | BA 5.48×10^-2 2×10^4 0.45
    10 HBNMA 0 480 0.04 | 10 HBNMA 0 400 0.02
    BA 2.37×10^0 2×10^4 0.42 | BA 2.00×10^0 2×10^4 0.76
    100 HBNMA 0 1360 0.12 | 100 HBNMA 0 480 0.03
    BA 1.27×10^1 2×10^4 0.49 | BA 1.85×10^2 2×10^4 0.97
    1000 HBNMA 0 9760 1.74 | 1000 HBNMA 0 720 0.12
    BA 3.34×10^1 2×10^4 0.59 | BA 2.12×10^3 2×10^4 0.76
    Step 5 HBNMA 0 320 0.02 | Dixon-Price 5 HBNMA 2.15×10^-1 2×10^4 0.71
    BA 8.50×10^-1 2×10^4 0.57 | BA 2.14×10^-1 2×10^4 1.97
    10 HBNMA 0 400 0.03 | 10 HBNMA 6.67×10^-1 2×10^4 0.77
    BA 8.57×10^0 2×10^4 0.54 | BA 3.61×10^1 2×10^4 0.51
    100 HBNMA 0 480 0.04 | 100 HBNMA 6.67×10^-1 2×10^4 0.94
    BA 4.73×10^2 2×10^4 0.80 | BA 2.52×10^4 2×10^4 0.50

    Table 3.  Results of comparison in Experiment 1 (continued).
    Function Dim. Algorithm Error NFE AVT | Function Dim. Algorithm Error NFE AVT
    Step 1000 HBNMA 0 560 0.14 | Dixon-Price 1000 HBNMA 9.85×10^-1 2×10^4 3.72
    BA 6.27×10^3 2×10^4 0.76 | BA 4.70×10^7 2×10^4 0.79
    Sum of Different Powers 5 HBNMA 0 320 0.02 | Griewank 5 HBNMA 0 320 0.02
    BA 2.71×10^-8 2×10^4 0.53 | BA 3.08×10^-1 2×10^4 0.54
    10 HBNMA 0 320 0.02 | 10 HBNMA 0 480 0.03
    BA 2.00×10^-7 2×10^4 0.56 | BA 8.78×10^-1 2×10^4 0.53
    100 HBNMA 0 480 0.05 | 100 HBNMA 0 560 0.05
    BA 1.70×10^-5 2×10^4 0.64 | BA 5.15×10^1 2×10^4 0.60
    1000 HBNMA 0 560 0.19 | 1000 HBNMA 0 640 0.18
    BA 3.91×10^-4 2×10^4 1.82 | BA 1.47×10^3 2×10^4 1.29
    Ackley 5 HBNMA 8.8818×10^-16 2×10^4 0.79 | Alpine 5 HBNMA 0 400 0.02
    BA 8.14×10^-1 2×10^4 0.47 | BA 2.43×10^-1 2×10^4 0.43
    10 HBNMA 8.8818×10^-16 2×10^4 0.77 | 10 HBNMA 0 400 0.02
    BA 1.50×10^0 2×10^4 0.45 | BA 2.02×10^0 2×10^4 0.39
    100 HBNMA 8.8818×10^-16 2×10^4 1.05 | 100 HBNMA 0 560 0.04
    BA 2.36×10^0 2×10^4 0.58 | BA 3.95×10^1 2×10^4 0.52
    1000 HBNMA 8.8818×10^-16 2×10^4 5.45 | 1000 HBNMA 0 560 0.15
    BA 2.87×10^0 2×10^4 1.09 | BA 3.41×10^2 2×10^4 0.88
    Rastrigin 5 HBNMA 0 320 0.04 | Xin-She Yang N.1 5 HBNMA 0 320 0.02
    BA 5.88×10^0 2×10^4 0.71 | BA 2.77×10^-2 2×10^4 0.46
    10 HBNMA 0 400 0.04 | 10 HBNMA 0 400 0.02
    BA 2.01×10^1 2×10^4 0.42 | BA 4.02×10^-1 2×10^4 0.57
    100 HBNMA 0 480 0.05 | 100 HBNMA 0 480 0.05
    BA 5.13×10^2 2×10^4 0.52 | BA 1.43×10^30 2×10^4 0.55
    1000 HBNMA 0 480 0.14 | 1000 HBNMA 0 3440 1.08
    BA 4.57×10^3 2×10^4 0.96 | BA Inf 2×10^4 2.02
    Zakharov 5 HBNMA 0 400 0.03 | Salomon 5 HBNMA 0 320 0.02
    BA 3.80×10^-3 2×10^4 0.41 | BA 2.53×10^-1 2×10^4 0.44
    10 HBNMA 0 400 0.03 | 10 HBNMA 0 400 0.03
    BA 2.10×10^1 2×10^4 0.42 | BA 8.29×10^-1 2×10^4 0.48
    100 HBNMA 0 560 0.03 | 100 HBNMA 0 560 0.04
    BA 1.81×10^10 2×10^4 0.54 | BA 9.88×10^0 2×10^4 0.54
    1000 HBNMA 0 560 0.10 | 1000 HBNMA 0 560 0.12
    BA 1.79×10^22 2×10^4 0.78 | BA 3.62×10^1 2×10^4 0.67

    Figure 3.  Effect of increasing dimension on the performance of the BA and HBNMA.
    Figure 4.  Clarification of the behavior of bats in the HBNMA.

    Further in this experiment, the behavior of the particles in the HBNMA is demonstrated using four metrics to clarify the superiority of the algorithm. Table 4 illustrates the percentage of the mean number of particles that follow the improved updating formulae (2.16) and (2.18) (M_im) in several iterations of one random run and of those that follow the standard updating formula of the BA (2.9) (M_ba). Also, the success in any iteration t+1 of achieving an improvement in the global solution, $f(x^{t+1}) < f(x^t)$, is studied according to whether it is due to a particle that follows the improved formulae or the standard formula. The percentages of these successes are calculated and denoted in Table 4 as Suc_im and Suc_ba, respectively. The results indicate that the majority, around two-thirds, of the particles follow the standard updating process of the BA. The reason is that particles follow the improved updating process only if progress over their previous fitness value is achieved, which does not happen continuously. However, the high percentage of success in improving the global solution is due to particles that follow the improved updating process in all multimodal and some unimodal functions. The high percentage of success in a few unimodal functions is in favour of particles that follow the classic updating formula, as there is no need for high diversification in these functions. This proves the positive impact of the improved version in searching precisely for better solutions, as clarified in Figure 4.

    Table 4.  Performance analysis of the HBNMA.
    Function M_im(%) M_ba(%) Suc_im(%) Suc_ba(%)
    Sphere 34.8649% 65.1351% 38.4615% 61.5385%
    Sumsquares 33.1383% 66.8617% 39.1304% 60.8696%
    Schwefel 2.21 43.2065% 56.7935% 69.5652% 30.4348%
    Schwefel 2.22 33.2622% 66.7378% 100% 0%
    Step 34.2692% 65.7308% 62.5000% 37.5000%
    Dixon-Price 32.6144% 67.3856% 50% 50%
    Sum of Different Powers 34.1429% 65.8571% 60.6061% 39.3939%
    Griewank 35.2727% 64.7273% 64.2857% 35.7143%
    Ackley 37.5000% 62.5000% 88.8889% 11.1111%
    Alpine 40.7317% 59.2683% 65.3846% 34.6154%
    Rastrigin 41.5714% 58.4286% 78.9474% 21.0526%
    Xin-She Yang N.1 42.6250% 57.3750% 66.6667% 33.3333%
    Xin-She Yang N.2 39.4565% 60.5435% 83.3333% 16.6667%
    Salomon 38.9063% 61.0938% 72.7273% 27.2727%


    In this experiment, the operational results of the HBNMA on several standard benchmark test functions, with dimension d = 10 for all test functions except those of fixed dimension, are compared with the standard BA and four variants of the BA to examine the ability of the proposed variant to compete with different efficient variants. The HBNMA in this comparison competes with the quantum annealing bat algorithm (QABA) [14], improved quantum-behaved bat algorithm (IQBA) [25], group evolution hybrid bat algorithm (LMBA) [26], bat differential hybrid algorithm (BADE) [27], and the BA [5]. All algorithms involved in this experiment undergo the same conditions and common parameter values as stated in [14] and shown in Table 1 for a fair comparison. The values of the other parameters specific to each algorithm used in this comparison are found in the corresponding original literature. Following the same conditions stated in established literature while yielding better results shows the adaptability and efficiency of the HBNMA under any circumstances. Tables 5 and 6 show that the HBNMA attained superior mean values (MV) compared to the BA and BADE in all test functions. In comparison with the LMBA, the HBNMA achieved the best MV in 13 of 15 functions, an equal value in 1 function, and the worst value in 1 function. The competition increased when comparing the HBNMA against the IQBA and QABA, but the superiority remained in favor of the HBNMA. With respect to the mean value, the HBNMA achieved the best values in 6 functions, the same in 8 functions, and the worst in only one function against the IQBA. In comparison with the QABA, the HBNMA recorded the best in 4, the same in 9, and the worst in 2. To statistically compare the mean results of the HBNMA against its counterparts, the Wilcoxon signed-rank test was used, as illustrated in Table 7. The p-value indicates whether or not there is a significant difference between the competing algorithms. Table 7 indicates that the HBNMA provided significantly better results than the IQBA, LMBA, BADE, and BA with p < 0.05; however, no significant difference was seen between the HBNMA and QABA, as p > 0.05. In terms of standard deviation (SD), the HBNMA recorded the best in 4, the same in 8, and the worst in 3 when compared with the QABA. In comparison with the IQBA, the HBNMA had the best values in 11 functions, the same in 3, and the worst in 1. With respect to the LMBA and BADE, the HBNMA outperformed their SD results in most test functions. When comparing the HBNMA against the BA in SD, the superiority was in favor of the HBNMA. The similarity of some results between the HBNMA and another algorithm means that both algorithms achieved the optimum value. The low SD results reflect the stability of the HBNMA.

    Table 5.  Results of Experiment 2.
    Function HBNMA QABA IQBA LMBA BADE BA
    Sphere BV 0.000 0.000 2.658×10^-137 7.060×10^-29 4.777×10^-35 2.912×10^-1
    MV 0.000 0.000 2.743×10^-31 4.584×10^-10 8.720×10^-11 9.552×10^-1
    SD 0.000 0.000 1.480×10^-30 1.570×10^-9 4.154×10^-10 4.532×10^-1
    Rank (MV) 1 1 2 4 3 5
    Sumsquares BV 0.000 0.000 9.882×10^-148 8.600×10^-29 2.089×10^-38 1.617×10^0
    MV 0.000 0.000 1.977×10^-41 4.704×10^-9 7.812×10^-11 5.836×10^0
    SD 0.000 0.000 1.060×10^-40 1.950×10^-8 2.978×10^-10 2.121×10^0
    Rank (MV) 1 1 2 4 3 5
    Schwefel's 2.20 BV 0.000 0.000 1.843×10^-78 1.743×10^-15 4.767×10^-21 1.116×10^0
    MV 0.000 1.301×10^-309 5.869×10^-28 2.406×10^-5 1.226×10^-7 2.393×10^0
    SD 0.000 0.000 3.160×10^-27 5.680×10^-5 3.977×10^-7 5.264×10^-1
    Rank (MV) 1 2 3 5 4 6
    Step BV 0.000 2.982×10^-19 3.813×10^-3 1.046×10^-6 3.920×10^-2 4.111×10^-1
    MV 0.000 2.819×10^-9 4.323×10^-1 8.421×10^-3 4.551×10^-1 1.340×10^0
    SD 0.000 4.534×10^-9 4.601×10^-1 1.396×10^-2 2.892×10^-1 6.187×10^-1
    Rank (MV) 1 2 4 3 5 6
    Dixon-Price BV 6.667×10^-1 6.322×10^-2 2.426×10^-1 2.263×10^-1 6.667×10^-1 2.342×10^0
    MV 6.668×10^-1 1.791×10^-1 3.626×10^-1 2.748×10^-1 9.245×10^-1 9.981×10^0
    SD 2.3×10^-4 5.924×10^-2 1.604×10^-1 4.146×10^-2 9.194×10^-2 6.783×10^0
    Rank (MV) 4 1 3 2 5 6
    Sum of Different Powers BV 0.000 0.000 6.107×10^-157 3.394×10^-21 2.748×10^-40 1.142×10^-2
    MV 0.000 0.000 1.992×10^-54 2.817×10^-9 3.252×10^-18 2.295×10^-1
    SD 0.000 0.000 1.070×10^-53 1.172×10^-8 1.746×10^-17 2.039×10^-1
    Rank (MV) 1 1 2 4 3 5
    Griewank BV 0.000 0.000 0.000 0.000 0.000 3.485×10^-2
    MV 0.000 0.000 0.000 6.655×10^-11 5.361×10^-13 1.207×10^-1
    SD 0.000 0.000 0.000 1.719×10^-10 2.592×10^-12 4.767×10^-2
    Rank (MV) 1 1 1 3 2 4

    Table 6.  Results of Experiment 2 (continued).
    Function HBNMA QABA IQBA LMBA BADE BA
    Ackley BV 8.882×10^-16 8.882×10^-16 8.882×10^-16 8.882×10^-16 8.882×10^-16 1.835×10^0
    MV 8.882×10^-16 8.882×10^-16 8.882×10^-16 3.260×10^-5 2.810×10^-1 2.727×10^0
    SD 0.000 0.000 0.000 9.776×10^-5 7.249×10^-1 4.061×10^-1
    Rank (MV) 1 1 1 2 3 4
    Alpine BV 0.000 0.000 5.506×10^-70 7.231×10^-14 1.271×10^-19 2.632×10^-1
    MV 0.000 1.785×10^-306 6.224×10^-19 5.990×10^-5 3.797×10^-3 9.019×10^-1
    SD 0.000 0.000 2.650×10^-18 2.306×10^-4 4.400×10^-3 4.235×10^-1
    Rank (MV) 1 2 3 4 5 6
    Rastrigin BV 0.000 0.000 0.000 0.000 3.330×10^0 2.700×10^1
    MV 0.000 0.000 0.000 1.047×10^-5 3.051×10^1 5.209×10^1
    SD 0.000 0.000 0.000 5.512×10^-5 1.320×10^1 1.025×10^1
    Rank (MV) 1 1 1 2 3 4
    Michalewicz BV -8.299 -7.768 -5.736 -4.441 -5.566 -5.202
    MV -6.000 -5.782 -2.529 -3.694 -3.695 -3.575
    SD 1.298×10^0 1.101×10^0 7.607×10^-1 5.159×10^-1 9.096×10^-1 6.495×10^-1
    Rank (MV) 1 2 6 4 3 5
    Goldstein-Price BV 3.000 3.000 3.000 3.000 3.000 3.003
    MV 3.000 3.009 22.450 11.605 12.005 5.236
    SD 5.888×10^-5 3.135×10^-2 2.025×10^1 1.236×10^1 2.124×10^1 6.072×10^0
    Rank (MV) 1 2 6 4 5 3
    Schubert BV -186.7309 -186.7309 -186.3490 -186.7309 -186.7309 -186.7272
    MV -186.7308 -186.7307 -142.4349 -186.2162 -164.2457 -179.5836
    SD 1.2217×10^-4 5.618×10^-4 4.594×10^1 9.798×10^-1 3.054×10^1 1.088×10^1
    Rank (MV) 1 2 6 3 5 4
    Hartmann 3-D BV -3.8628 -3.8627 -3.8039 -3.8540 -3.8628 -3.8566
    MV -3.7339 -3.8615 -2.5718 -3.4375 -3.5619 -3.4144
    SD 7.052×10^-1 1.097×10^-3 8.506×10^-1 3.661×10^-1 4.712×10^-1 4.684×10^-1
    Rank (MV) 2 1 6 4 3 5
    Six-Hump Camel BV -1.0316 -1.0316 -1.0316 -1.0316 -1.0316 -1.0316
    MV -1.0316 -1.0316 -0.9460 -1.0316 -1.0068 -1.0260
    SD 5.820×10^-5 1.970×10^-7 1.001×10^-1 1.384×10^-4 9.946×10^-2 1.215×10^-2
    Rank (MV) 1 1 4 1 3 2

    Table 7.  Wilcoxon signed-ranks test of the results of Experiment 2.
    Comparison R+ R− Z p-value Better Equal Worse
    HBNMA versus QABA 69.5 50.5 0.105 0.917 4 9 2
    HBNMA versus IQBA 92 28 2.028 0.043 6 8 1
    HBNMA versus LMBA 107.5 12.5 2.605 0.009 13 1 1
    HBNMA versus BADE 120 0 3.296 <0.001 15 0 0
    HBNMA versus BA 120 0 3.408 <0.001 15 0 0


    This experiment aims to show the efficiency of the proposed HBNMA on the 30 functions of the IEEE CEC2014 benchmark suite against state-of-the-art heuristics, namely the genetic algorithm (GA) [1], particle swarm optimization (PSO) [2], differential evolution (DE) [28], and the bat algorithm (BA) [5]. Each function, for all competing algorithms, is of dimension d = 10 and is run 51 times independently with d×10^5 function evaluations per run as a stopping criterion and a population size equal to 50, as shown in Table 1. Also, the values of the common parameters between the HBNMA and BA are stated in Table 1. The crossover and mutation probabilities of the GA are assigned as 0.9 and 0.05, respectively. In PSO, the inertia weight ranges from 0.9 to 0.4 and the acceleration coefficients are set to 1.2. The parameters used in DE are the scale factor, which belongs to [0.5, 0.9], and the crossover rate, which belongs to [0.1, 0.9]. Tables 8 and 9 illustrate the findings of the experiment in terms of MV and SD. The HBNMA succeeded in obtaining the best mean error in 29 of 30 test functions and the same in 1 function when compared with the GA and BA. In comparison with the PSO, the HBNMA obtained the best mean value in 21 of 30 test functions, the same value in 1 function, and the worst in 8 functions. Furthermore, the HBNMA achieved the best mean value in 23 of 30 test functions, the same in 2 functions, and the worst in 5 functions when compared against DE. The results show the superiority of the HBNMA over all other algorithms in this experiment. This superiority is also translated into the p-values in Table 10, which demonstrate a significant difference between the HBNMA and the indicated algorithms.

    Table 8.  Results of Experiment 3.
    Function HBNMA GA PSO DE BA
    F1 MV 1.50×10^3 1.93×10^7 3.71×10^4 7.68×10^4 1.51×10^8
    SD 9.54×10^2 5.20×10^6 1.99×10^4 2.50×10^4 9.11×10^7
    F2 MV 2.02×10^3 6.01×10^3 2.10×10^3 2.99×10^6 5.76×10^9
    SD 4.04×10^3 5.01×10^3 2.37×10^3 1.50×10^6 4.22×10^10
    F3 MV 4.08×10^1 9.53×10^2 2.20×10^2 9.08×10^1 4.01×10^5
    SD 6.44×10^2 1.56×10^4 2.54×10^3 1.70×10^1 2.98×10^6
    F4 MV 2.29×10^1 3.80×10^1 3.40×10^1 1.51×10^1 7.35×10^2
    SD 1.61×10^1 1.62×10^1 2.40×10^1 2.66×10^1 7.89×10^2
    F5 MV 1.80×10^1 2.00×10^1 2.02×10^1 2.05×10^1 2.00×10^1
    SD 5.91×10^0 4.52×10^-3 7.89×10^-2 9.30×10^-2 3.17×10^-6
    F6 MV 2.82×10^0 4.79×10^0 1.32×10^0 3.13×10^0 2.21×10^2
    SD 1.27×10^0 1.73×10^0 1.11×10^0 3.21×10^0 1.33×10^0
    F7 MV 3.30×10^-1 5.19×10^-1 1.20×10^-1 1.25×10^0 9.15×10^1
    SD 1.34×10^-1 1.80×10^-1 6.15×10^-2 1.20×10^0 3.51×10^2
    F8 MV 0.00×10^0 1.78×10^0 1.12×10^0 2.30×10^1 6.51×10^1
    SD 0.00×10^0 1.30×10^0 1.32×10^0 0.29×10^1 2.12×10^1
    F9 MV 1.52×10^1 1.13×10^1 9.02×10^0 3.27×10^1 7.11×10^1
    SD 5.75×10^0 9.09×10^0 2.92×10^0 1.09×10^1 2.67×10^1
    F10 MV 0.00×10^0 1.08×10^2 9.86×10^1 1.31×10^3 1.20×10^3
    SD 0.00×10^0 1.48×10^3 7.31×10^2 2.14×10^3 4.18×10^3
    F11 MV 1.94×10^2 6.40×10^2 2.11×10^2 1.41×10^3 1.23×10^3
    SD 1.65×10^2 3.13×10^2 1.40×10^2 2.35×10^3 3.40×10^2
    F12 MV 8.57×10^-2 2.70×10^-1 2.81×10^-1 2.31×10^0 0.74×10^0
    SD 6.42×10^-3 9.50×10^-3 2.30×10^-2 2.60×10^-2 5.69×10^-2
    F13 MV 3.25×10^-1 4.95×10^-1 0.56×10^-1 2.50×10^-1 3.34×10^0
    SD 1.36×10^-1 1.33×10^-1 8.22×10^-2 1.50×10^-1 1.30×10^0

    Table 9.  Results of Experiment 3 (continued).
    Function HBNMA GA PSO DE BA
    F14 MV 1.71×10^-2 2.60×10^-2 1.40×10^-2 2.10×10^-2 3.42×10^0
    SD 1.65×10^-1 1.13×10^-1 4.34×10^-2 1.86×10^-1 1.36×10^-1
    F15 MV 1.46×10^0 2.60×10^0 9.58×10^-1 3.74×10^0 1.13×10^4
    SD 5.79×10^-1 1.24×10^0 3.72×10^-1 1.21×10^0 1.86×10^4
    F16 MV 2.32×10^0 2.49×10^0 0.94×10^0 4.36×10^0 5.45×10^0
    SD 4.10×10^-1 3.88×10^-1 3.98×10^-1 4.63×10^-1 1.60×10^-1
    F17 MV 6.05×10^3 3.81×10^6 7.06×10^3 1.03×10^2 1.36×10^6
    SD 8.23×10^3 2.05×10^6 3.30×10^3 1.90×10^2 5.01×10^6
    F18 MV 7.95×10^2 7.45×10^3 6.06×10^3 1.30×10^3 7.94×10^6
    SD 2.79×10^3 9.20×10^3 9.81×10^3 1.57×10^3 1.26×10^5
    F19 MV 5.96×10^-1 1.49×10^1 1.51×10^1 1.41×10^1 5.44×10^1
    SD 4.32×10^-1 8.75×10^-1 8.15×10^0 1.10×10^1 4.24×10^1
    F20 MV 1.93×10^1 7.52×10^4 1.03×10^4 6.43×10^3 8.58×10^6
    SD 1.63×10^0 8.60×10^3 2.71×10^3 5.11×10^2 3.21×10^5
    F21 MV 7.93×10^2 1.70×10^6 2.68×10^2 1.30×10^2 3.61×10^5
    SD 4.13×10^3 2.81×10^7 2.65×10^2 1.39×10^2 9.68×10^4
    F22 MV 8.04×10^1 2.24×10^2 2.82×10^2 2.13×10^2 4.53×10^2
    SD 8.89×10^1 5.17×10^1 1.30×10^2 3.70×10^2 2.24×10^2
    F23 MV 2.00×10^2 3.31×10^2 3.29×10^2 3.34×10^2 5.18×10^2
    SD 0.00×10^0 1.15×10^1 0.00×10^0 6.67×10^-2 9.74×10^1
    F24 MV 1.33×10^2 1.46×10^2 1.74×10^2 1.33×10^2 2.01×10^2
    SD 9.45×10^0 2.03×10^1 5.51×10^1 1.18×10^1 3.03×10^1
    F25 MV 1.39×10^2 1.80×10^2 3.01×10^2 1.96×10^2 2.00×10^2
    SD 1.07×10^1 2.75×10^1 4.68×10^1 3.92×10^1 2.29×10^1
    F26 MV 1.00×10^2 1.00×10^2 1.00×10^2 1.00×10^2 1.00×10^2
    SD 1.05×10^-1 1.21×10^-1 5.60×10^-2 2.49×10^-1 1.92×10^0
    F27 MV 4.64×10^1 2.91×10^2 2.37×10^2 2.27×10^2 5.79×10^2
    SD 9.68×10^1 1.78×10^2 1.73×10^2 2.62×10^2 1.23×10^2
    F28 MV 2.03×10^2 5.63×10^2 5.39×10^2 5.82×10^2 7.89×10^2
    SD 2.50×10^1 1.04×10^2 5.87×10^1 1.10×10^2 3.26×10^2
    F29 MV 4.64×10^1 4.67×10^5 1.20×10^5 8.54×10^2 7.35×10^4
    SD 9.68×10^1 7.57×10^6 6.39×10^6 2.20×10^3 1.97×10^5
    F30 MV 2.00×10^2 2.63×10^3 9.55×10^2 6.50×10^2 9.40×10^3
    SD 0.00×10^0 1.25×10^3 3.16×10^2 2.23×10^2 3.40×10^3

    Table 10.  Wilcoxon signed-ranks test of the results of Experiment 3.
    Comparison R+ R− Z p-value Better Equal Worse
    HBNMA versus GA 435 0 4.703 <0.001 29 1 0
    HBNMA versus PSO 375 60 3.406 <0.001 21 1 8
    HBNMA versus DE 346 60 3.256 0.001 23 2 5
    HBNMA versus BA 435 0 4.703 <0.001 29 1 0


    The experimental results illustrate the superior behavior of the HBNMA on different test functions, which mimic real-world problems. This is indicated in the following:

    (1) In classical unimodal functions, there exists a single global optimum and no local optima, so these functions require fair intensification from the algorithm to be solved. The HBNMA easily succeeded in obtaining the global optimum for these functions because it has a good intensification ability, as indicated in Experiments 1 and 2.

    (2) The rotated unimodal functions, specifically functions F1–F3 from the CEC2014 benchmark suite, are harder than the classical ones, so this set of problems is well suited to evaluating the local search ability of the algorithm. The HBNMA obtained the best mean value in these functions relative to the state-of-the-art heuristics in Experiment 3 due to its strong local search.

    (3) In classical multimodal functions, there are many local minima but only one global optimum, so these functions are good tests of the global search ability of any algorithm. The HBNMA performed superiorly on these functions due to the strong diversification ability that is enhanced in this work, as shown in Experiments 1 and 2.

    (4) In shifted and rotated multimodal functions, specifically functions F4–F16 from the CEC2014 benchmark suite, the HBNMA succeeded in obtaining the global optimum in the separable functions F8 and F10 but failed to obtain the global optimum in the other, non-separable functions, as shown in Experiment 3.

    (5) In the hybrid functions, F17–F22, which more closely approximate real-world benchmarks, the HBNMA obtained the best mean value in 4 of 6 when compared against the other algorithms. Furthermore, the HBNMA achieved superior results on all composition functions, F23–F30, when compared with the others.

    In this section, the time complexity of the chosen algorithms is discussed using big-O notation. In the standard BA, the time complexity is O(n·MaxIt) [29], where n is the population size and MaxIt is the maximum number of iterations. For simplicity, the maximum number of iterations is taken as the stopping criterion for all algorithms when computing the time complexity, instead of the number of function evaluations. In the proposed HBNMA, the population initialization process has a computational time of O(n). In the main optimization loop of the proposed algorithm, the calculation of the centroid of the simplex, which is needed to compute the reflection and expansion steps, has time complexity O(n·d), where d is the dimension of the problem. In addition, each particle undergoes one of two updating processes, which are constant-time operations, O(1). Also, continued expansions may occur as long as the best solution found so far keeps being refined. These continued expansions have a time complexity of O(k), where k is the maximum number of improving expansions found. The value of the parameter k in the HBNMA is small because, after one improving expansion, the probability of obtaining another improving expansion in sequence is small, and this probability decreases further during subsequent sequential expansions. Consequently, this parameter can be neglected in the time complexity of the HBNMA despite its importance for the robustness of the algorithm. Therefore, the overall time complexity of the HBNMA is O(n·d·MaxIt). The time complexity of the DE algorithm, PSO algorithm, and GA is likewise O(n·d·MaxIt) [30,31].

    Hence, the proposed HBNMA has the same time complexity as the state-of-the-art metaheuristics GA, PSO, and DE. In comparison with the BA on low-dimensional problems, the HBNMA has almost the same time complexity as the BA. On high-dimensional problems, the HBNMA is more complex than the BA, but this does not affect the robustness of the HBNMA, as it needs a lower number of iterations than the BA to reach the global optimum.

    In this work, a hybrid bat-Nelder-Mead algorithm (HBNMA) is presented to solve multidimensional, unconstrained optimization problems. This algorithm is based on introducing two steps of the NM into the updating formula of the particle's velocity in the standard BA. The reflection step is first introduced alongside the standard updating formula of the BA. If an improvement in the fitness value is achieved, sequential expansion steps are then introduced to explore the area deeply, which accelerates the performance of the algorithm. Otherwise, the classic updating process, as in the standard BA, is performed. This mechanism enables particles to discover the search space globally and to exploit any progress by searching more deeply for even better results. Accordingly, a satisfactory balance between global and local search is achieved in the proposed algorithm. Finally, several numerical experiments were established, and the simulation results showed the superiority of the proposed HBNMA.

    In future research, multi-objective and constrained optimization will be addressed, as most real-world continuous problems are of that form. New techniques will be devised to deal with the selection of solutions in multi-objective problems, rather than the commonly used approaches. These new techniques will be integrated with the HBNMA, and with any other single-objective continuous optimization algorithm, to handle complex multi-objective problems. Furthermore, greater emphasis can be devoted to creating new techniques that handle constraints in constrained optimization problems.

    The authors declare they have not used Artificial Intelligence (AI) tools in the creation of this article.

    Enas Suhail: Formal analysis, Original draft preparation, Visualization, Software; Mahmoud El-Alem: Investigation, Methodology, Writing—review and editing, Supervision; Omar Bazighifan: Resources, Writing—review and editing; Ahmed S. Zekri: Conceptualization, Validation, Writing—review and editing, Supervision. All authors have read and approved the final version of the manuscript for publication.

    All authors declare that they have no conflicts of interest.

    The following table describes the benchmark test functions used in Section 3.3:

    Function Dimension Range Optimum Description
    Sphere vary [-100, 100] 0 Unimodal, separable
    Sumsquares vary [-10, 10] 0 Unimodal, separable
    Schwefel 2.20 vary [-100, 100] 0 Unimodal, separable
    Schwefel 2.21 vary [-100, 100] 0 Unimodal, separable
    Schwefel 2.22 vary [-10, 10] 0 Unimodal, non-separable
    Step vary [-100, 100] 0 Unimodal, separable
    Dixon-Price vary [-10, 10] 0 Unimodal, non-separable
    Sum of Different Powers vary [-1, 1] 0 Unimodal
    Griewank vary [-600, 600] 0 Multimodal, non-separable
    Ackley vary [-30, 30] 0 Multimodal, non-separable
    Alpine vary [-10, 10] 0 Multimodal, separable
    Rastrigin vary [-5.12, 5.12] 0 Multimodal
    Michalewicz vary [0, π] -9.6602 Multimodal
    Goldstein-Price 2 [-2, 2] 3 Multimodal, non-separable
    Shubert 2 [-10, 10] -186.7309 Multimodal
    Hartmann 3-D 3 [0, 1] -3.8628 Multimodal, non-separable
    Six-Hump Camel 2 [-2, 2] -1.0316 Multimodal
    Qing vary [-500, 500] 0 Multimodal, separable
    Xin-She Yang N.1 vary [-5, 5] 0 Multimodal, separable
    Salomon vary [-100, 100] 0 Multimodal, non-separable
    Xin-She Yang N.2 vary [-2π, 2π] 0 Multimodal, non-separable
    Penalized vary [-50, 50] 0 Multimodal
    Zakharov vary [-5, 10] 0 Multimodal, non-separable



    [1] J. Holland, Genetic algorithms, Sci. Am., 267 (1992), 66–73.
    [2] J. Kennedy, R. Eberhart, Particle swarm optimization, Proceedings of ICNN'95-International Conference on Neural Networks, 1995, 1942–1948. http://doi.org/10.1109/ICNN.1995.488968
    [3] M. Dorigo, V. Maniezzo, A. Colorni, Ant system: Optimization by a colony of cooperating agents, IEEE T. Syst. Man Cy. B, 26 (1996), 29–41. http://doi.org/10.1109/3477.484436
    [4] X. L. Li, Z. J. Shao, J. X. Qian, An optimizing method based on autonomous animats: Fish-swarm algorithm, Syst. Eng.-Theory Pract., 22 (2002), 32–38. https://doi.org/10.12011/1000-6788(2002)11-32
    [5] X.-S. Yang, A new metaheuristic bat-inspired algorithm, In: Nature Inspired Cooperative Strategies for Optimization (NICSO 2010), Berlin, Heidelberg: Springer, 2010, 65–74. https://doi.org/10.1007/978-3-642-12538-6_6
    [6] E. H. Houssein, D. Oliva, N. A. Samee, N. F. Mahmoud, M. M. Emam, Liver cancer algorithm: A novel bio-inspired optimizer, Comput. Biol. Med., 165 (2023), 107389. https://doi.org/10.13140/RG.2.2.11139.27688
    [7] A. Qi, D. Zhao, A. A. Heidari, L. Liu, Y. Chen, H. Chen, FATA: An efficient optimization method based on geophysics, Neurocomputing, 607 (2024), 128289. https://doi.org/10.1016/j.neucom.2024.128289
    [8] A.-Q. Tian, F.-F. Liu, H.-X. Lv, Snow geese algorithm: A novel migration-inspired meta-heuristic algorithm for constrained engineering optimization problems, Appl. Math. Modell., 126 (2024), 327–347. https://doi.org/10.1016/j.apm.2023.10.045
    [9] M. Chen, L. Yang, G. Zeng, K. Lu, Y. Huang, IFA-EO: An improved firefly algorithm hybridized with extremal optimization for continuous unconstrained optimization problems, 2021. Available from: https://doi.org/10.21203/rs.3.rs-190790/v1
    [10] B. Shi, J. Chen, H. Chen, W. Lin, Prediction of recurrent spontaneous abortion using evolutionary machine learning with joint self-adaptive slime mould algorithm, Comput. Biol. Med., 148 (2022), 105885. https://doi.org/10.1016/j.compbiomed.2022.105885
    [11] M. Shehab, M. A. Abu-Hashem, M. K. Y. Shambour, A. I. Alsalibi, O. A. Alomari, A. Gupta, et al., A comprehensive review of bat inspired algorithm: Variants, applications, and hybridization, Arch. Computat. Methods Eng., 30 (2023), 765–797. https://doi.org/10.1007/s11831-022-09817-5
    [12] J.-H. Lin, C.-W. Chou, C.-H. Yang, H.-L. Tsai, A chaotic Levy flight bat algorithm for parameter estimation in nonlinear dynamic biological systems, CIT, 2 (2012), 56–63.
    [13] C. Gan, W. Cao, M. Wu, X. Chen, A new bat algorithm based on iterative local search and stochastic inertia weight, Expert Syst. Appl., 104 (2018), 202–212. https://doi.org/10.1016/j.eswa.2018.03.015
    [14] S. Yu, J. Zhu, C. Lv, A quantum annealing bat algorithm for node localization in wireless sensor networks, Sensors, 23 (2023), 782. https://doi.org/10.3390/s23020782
    [15] G. Wang, L. Guo, A novel hybrid bat algorithm with harmony search for global numerical optimization, J. Appl. Math., 2013 (2013). https://doi.org/10.1155/2013/696491
    [16] R. Gupta, N. Chaudhary, S. K. Pal, Hybrid model to improve bat algorithm performance, 2014 International Conference on Advances in Computing, Communications and Informatics (ICACCI), 2014, 1967–1970. https://doi.org/10.1109/ICACCI.2014.6968649
    [17] G. B. Mahanta, A. Rout, G. B. Muralia, B. Deepak, B. B. Biswal, Application of hybrid Nelder-Mead bat algorithm to improve the grasp quality during automated robotic grasping, Proc. Comput. Sci., 133 (2018), 612–619. https://doi.org/10.1016/j.procs.2018.07.093
    [18] B. Kheireddine, B. Zoubida, H. Tarik, Improvements of bat algorithm using crossover technique and hybridization with Nelder-Mead simplex method, COMPEL, 38 (2019), 977–989.
    [19] A. F. Ali, Accelerated bat algorithm for solving integer programming problems, Egypt. Comput. Sci. J., 39 (2015), 507–518.
    [20] A. F. Ali, M. A. Tawhid, Solving integer programming problems by hybrid bat algorithm and direct search method, Trends Artif. Intell., 2 (2018), 46–59. http://doi.org/10.36959/643/303
    [21] J. A. Nelder, R. Mead, A simplex method for function minimization, Comput. J., 7 (1965), 308–313. http://doi.org/10.1093/COMJNL/7.4.308
    [22] S. Surjanovic, D. Bingham, Virtual Library of Simulation Experiments: Test Functions and Datasets, accessed on 31 December 2022. Available from: https://www.sfu.ca/ssurjano.
    [23] J. Liang, B. Qu, P. Suganthan, Problem definitions and evaluation criteria for the CEC 2014 special session and competition on single objective real-parameter numerical optimization, Computational Intelligence Laboratory, Zhengzhou University, Zhengzhou, China and Nanyang Technological University, Singapore, Technical Report 201311, 2013.
    [24] J. Derrac, S. García, D. Molina, F. Herrera, A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms, Swarm Evol. Comput., 1 (2011), 3–18. https://doi.org/10.1016/j.swevo.2011.02.002
    [25] Z. Zhou, Z. Sun, Research and application of improved quantum-behaved bat algorithm, Comp. Eng. D, 40 (2019), 84–91.
    [26] Z. Li, Improved bat algorithm based on grouping evolution and hybrid optimization, Math. Pr. Th., 50 (2020), 141–149.
    [27] Z. Zhao, M. Zeng, H.-M. Mo, Z. Li, T. Wen, Cooperatively intelligent hybrid bat and differential evolution algorithm, Comp. Eng. D, 41 (2020), 402–410.
    [28] R. Storn, K. Price, Differential evolution–a simple and efficient heuristic for global optimization over continuous spaces, J. Global Optim., 11 (1997), 341–359. https://doi.org/10.1023/A:1008202821328
    [29] M.-R. Chen, Y.-Y. Huang, G.-Q. Zeng, K.-D. Lu, L.-Q. Yang, An improved bat algorithm hybridized with extremal optimization and Boltzmann selection, Expert Syst. Appl., 175 (2021), 114812. https://doi.org/10.1016/j.eswa.2021.114812
    [30] M. Omran, A. Engelbrecht, Time complexity of population-based metaheuristics, MENDEL, 29 (2023), 255–260. https://doi.org/10.13164/mendel.2023.2.255
    [31] L. Zhou, K. Chen, H. Dong, S. Chi, Z. Chen, An improved beetle swarm optimization algorithm for the intelligent navigation control of autonomous sailing robots, IEEE Access, 9 (2020), 5296–5311. https://doi.org/10.1109/ACCESS.2020.3047816
  • © 2024 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)