Citation: Huangjing Yu, Heming Jia, Jianping Zhou, Abdelazim G. Hussien. Enhanced Aquila optimizer algorithm for global optimization and constrained engineering problems[J]. Mathematical Biosciences and Engineering, 2022, 19(12): 14173-14211. doi: 10.3934/mbe.2022660
Abstract
The Aquila optimizer (AO) is a recently developed swarm algorithm that simulates the hunting behavior of Aquila birds. On complex optimization problems, especially highly complex ones, the AO may converge slowly or become trapped in sub-optimal regions. This paper addresses these problems using three different strategies: a restart strategy, opposition-based learning and chaotic local search. The developed algorithm, named mAO, was tested on 29 CEC 2017 functions and five constrained engineering problems. The results demonstrate the superiority and efficiency of mAO in solving many optimization problems.
1. Introduction
The process of selecting appropriate and applicable variable values for a specific task is known as optimization [1,2,3]. Optimization exists in almost every domain, including job shop scheduling [4], feature selection [5,6,7], image processing [8,9], face detection and recognition [10], predicting chemical activities [11], classification [12,13], network allocation [14], the internet of vehicles [15], routing [16], and neural networks [17]. Due to the nature of real-world problems, optimization becomes very challenging and involves many difficulties, such as multiobjectivity [18], memetic optimization [19], large-scale optimization [20], fuzzy optimization [21], uncertainties [22] and parameter estimation [23]. Metaheuristic algorithms have been used to solve such problems due to their advantages, such as flexibility, efficiency and the ability to obtain a near-optimal solution in a reasonable time.
Table 1.
Summary of literature review on Aquila optimizer (AO) variants and applications.
Examples of metaheuristic algorithms include particle swarm optimization (PSO) [24], the artificial bee colony [25], the coot bird algorithm [26], genetic algorithms (GAs) [27], the krill herd algorithm [28], the harmony search (HS) algorithm [29], the snake optimizer [30], monarch butterfly optimization [31], the slime mold algorithm [32], the moth search algorithm [33], the hunger games search [34], the Runge-Kutta method [35], the weighted mean of vectors [36], the virus colony search [37], the lightning search algorithm [38], ant lion optimization [39], the crow search algorithm [40], moth-flame optimization [41], the wild horse optimizer [42], the remora optimization algorithm [43], the artificial rabbit optimizer [44], the artificial hummingbird algorithm [45], the grasshopper optimization algorithm (GOA) [46], the grey wolf optimizer (GWO) [47] and the whale optimization algorithm (WOA) [48].
The Aquila optimizer (AO) is a recently developed algorithm proposed by Abualigah et al. [49] that simulates the four phases of Aquila hunting behavior. Wang et al. [50] developed an improved version of the AO by replacing the AO's original exploitation phase with the Harris hawks optimizer's exploitation phase. Moreover, they embedded a random opposition-based learning strategy and a nonlinear escaping operator in their proposed algorithm. They argued that the proposed algorithm achieves the best results compared with five other metaheuristic optimizers. Also, Mahajan et al. [51] hybridized the AO with the arithmetic optimization algorithm (AOA) [52]. They tested their algorithm, called AO-AOA, against the original AO, the original AOA, the WOA, the GOA and the GWO. Another hybrid of the AO and the AOA was developed by Zhang et al. [53]. Likewise, Zhao et al. [54] developed another version of the AO, called the simplified AO algorithm, by removing the control equation of the exploitation and exploration procedures (the latter strategies) and keeping the former two techniques. They reported that their developed algorithm, IAO, achieved better results than many newly developed swarm algorithms. Another enhancement was made by Ma et al. [55], in which the grey wolf optimizer is hybridized with the Aquila algorithm so that some wolves are able to fly, improving their search techniques and avoiding getting stuck in local optima. They compared their developed algorithm with many optimizers on 23 functions. Also, Gao et al. [56] employed three different strategies to enhance the AO algorithm: Gaussian mutation (GM), random opposition-based learning, and a search control operator. They argued that their algorithm, an improved AO, achieves superior results compared to other optimizers.
The AO has been successfully used in many applications. For example, AlRassas et al. [57] used the AO to optimize an adaptive neuro-fuzzy inference system model for forecasting oil production. Also, Abdelaziz et al. [58] classified COVID-19 images using the AO algorithm and MobileNetV3. Likewise, Fatani et al. [59] developed a feature extraction and selection approach using the AO and deep learning for IoT intrusion detection systems.
Despite the power and superiority of the algorithm, and as stated by the no free lunch theorem, the AO cannot solve all optimization problems. So, the AO still needs further enhancement and development.
This paper introduces a novel version of the AO in which three different strategies have been used to overcome the original optimizer drawbacks such as getting stuck in local optima and slow convergence. These strategies are the chaotic local search (CLS), opposition-based learning (OBL) and the restart strategy (RS). Using OBL and the RS enhances the AO exploratory search capabilities whereas the CLS improves AO exploitative search abilities.
The main contributions of this paper are as follows:
● A novel Aquila algorithm has been developed using three strategies: OBL, the RS and the CLS.
● The developed optimizer has been compared with the original AO and nine other algorithms, namely, the CSA [40], EHO [60], GOA [46], LSHADE [61], Lshade-EpSin [62], MFO [63], MVO [64], and PSO [24].
● A scalability test and removing one strategy from the developed algorithm experiments have been carried out.
● mAO was tested using 29 functions and five constrained ones.
This paper is organized as follows: Section 2 discusses the background and preliminaries of the original algorithm, OBL, the CLS and the RS, whereas Section 3 introduces the structure of the modified optimizer and its complexity. Sections 4 and 5 discuss the results of the proposed mAO and other competitors in CEC2017 and five different constrained engineering problems whereas Section 6 concludes the paper.
2. Preliminaries
2.1. Aquila optimizer (AO)
The Aquila algorithm is one of the latest population-based swarm intelligence optimizers, developed by Abualigah et al. [49]. The Aquila can be considered among the best-known birds of prey in the Northern Hemisphere. It is brown with a golden-colored back. It uses its agility and strength, along with its wide talons and strong feet, to catch various types of prey, usually squirrels, rabbits, marmots, and hares [65].
Aquila optimizer (AO) simulates the four different Aquila strategies in hunting. The next subsection shows Aquila's mathematical model.
2.2. Mathematical model
The AO begins with a random set of individuals, represented mathematically as follows:

X = [X_{i,j}], i = 1, ..., N, j = 1, ..., D

(2.1)

where X is the population matrix of agent positions (solutions), each entry of which is computed using the following equation:

X_{i,j} = rand × (UB_j − LB_j) + LB_j, i = 1, ..., N, j = 1, ..., D

(2.2)

where D refers to the number of decision variables, N indicates the number of agents, UB_j and LB_j are the jth upper and lower boundaries, and X_{i,j} refers to the jth decision variable of the ith agent.
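The random initialization of Eq (2.2) can be sketched in a few lines of NumPy (a sketch only; the function name and generator argument are illustrative, not from the paper):

```python
import numpy as np

def initialize_population(n_agents, dim, lb, ub, rng=None):
    """Random initialization per Eq (2.2): X_ij = rand * (UB_j - LB_j) + LB_j."""
    rng = np.random.default_rng() if rng is None else rng
    lb = np.broadcast_to(np.asarray(lb, dtype=float), (dim,))
    ub = np.broadcast_to(np.asarray(ub, dtype=float), (dim,))
    # One uniform draw per agent per variable, scaled into [lb_j, ub_j]
    return rng.random((n_agents, dim)) * (ub - lb) + lb

X = initialize_population(30, 10, -100.0, 100.0)
```

Scalar bounds are broadcast to every dimension, so the same helper also covers problems whose variables share one search range.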
The AO simulates Aquila hunting in four different phases, and the optimizer switches between exploration and exploitation using the following condition: if t ≤ (2/3)T, an exploration step is performed; otherwise, an exploitation step is performed.
2.2.1. Expanded exploration (X1)
In this phase, the Aquila determines the hunting area and selects the prey through a high soar and a vertical stoop. The mathematical formula for this behavior is given by the following two equations:
X_1(t+1) = X_best(t) × (1 − t/T) + (X_M(t) − X_best(t) × rand)
(2.3)
X_M(t) = (1/N) × Σ_{i=1}^{N} X_i(t), ∀ j = 1, 2, ..., D
(2.4)
where X_M(t) indicates the mean position of all agents at the tth generation, X_best(t) is the best Aquila position found so far, rand is a randomly generated number in the interval [0, 1], t is the current generation, T is the maximum number of generations, and N is the number of Aquilas.
2.2.2. Narrowed exploration (X2)
This is the hunting technique the Aquila uses most often. To attack the prey, it uses a short glide with contour flight. The Aquila's position is updated as follows:
X_2(t+1) = X_best(t) × Levy(D) + X_R(t) + (y − x) × rand
(2.5)
where X_R indicates a randomly selected Aquila position, rand is a random real number between 0 and 1, D is the number of variables, and Levy refers to the Lévy flight function, which is defined as follows:
Levy(D) = s × (u × σ) / |v|^(1/β)
(2.6)
σ = (Γ(1 + β) × sin(πβ/2)) / (Γ((1 + β)/2) × β × 2^((β−1)/2))
(2.7)
where s is a fixed value equal to 0.01, u and v are random numbers between 0 and 1, and β is a constant equal to 1.5. Both y and x are used to model the spiral shape and can be computed using the following two equations:
y=r×cos(θ)
(2.8)
x=r×sin(θ)
(2.9)
where r and θ can be calculated as follows:
r = r_1 + U × D_1
(2.10)
θ = −ω × D_1 + θ_1
(2.11)
θ_1 = 3π/2
(2.12)
where U equals 0.00565, ω equals 0.005, and r1 has a value between 1 and 20.
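The Lévy step of Eqs (2.6) and (2.7) can be sketched as below. Following the text's description, u and v are drawn from U(0, 1); note that many Lévy-flight implementations draw them from normal distributions instead, so this is a sketch under the paper's stated assumptions:

```python
import math
import numpy as np

def levy(dim, beta=1.5, s=0.01, rng=None):
    """Levy step per Eqs (2.6)-(2.7): Levy(D) = s * u * sigma / |v|^(1/beta)."""
    rng = np.random.default_rng() if rng is None else rng
    # Eq (2.7): sigma depends only on the constant beta
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)))
    u, v = rng.random(dim), rng.random(dim)  # u, v ~ U(0, 1) as stated in the text
    return s * u * sigma / np.abs(v) ** (1 / beta)  # Eq (2.6)

step = levy(5, rng=np.random.default_rng(0))
```

The small scale factor s = 0.01 keeps the occasional long Lévy jump from leaving the search space too violently.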
2.2.3. Expanded exploitation (X3)
In the third technique, the prey area is determined and the agents perform a preliminary vertical attack with a low flight. The agents attack the prey as follows:

X_3(t+1) = (X_best(t) − X_M(t)) × α − rand + ((UB − LB) × rand + LB) × δ

(2.13)

where X_M(t) indicates the mean position at the tth generation, X_best(t) is the best Aquila position found so far, rand is a randomly generated number in the interval [0, 1], α and δ are exploitation adjustment parameters equal to 0.1, and UB and LB refer to the upper and lower boundaries.
2.2.4. Narrowed exploitation (X4)
In this phase, the Aquila chases the prey along its escape trajectory and attacks it, which can be modeled as follows:

X_4(t+1) = QF(t) × X_best(t) − (G_1 × X(t) × rand) − G_2 × Levy(D) + rand × G_1

(2.14)

where QF(t) is the quality function value used to balance the search, G_1 refers to the various AO motions during the chase, and G_2 refers to the slope of the chasing flight.
2.3. Opposition-based learning
Opposition-based learning (OBL) is a technique introduced by Tizhoosh [66] that has been employed by many researchers to improve swarm optimizers. For example, Hussien [67] embedded OBL in the SSA to avoid getting trapped in local optima. Moreover, Hussien and Amin combined OBL with chaotic local search to improve the exploration abilities of the HHO [7]. Zhao et al. employed OBL with the arithmetic optimization algorithm [1]. OBL works by comparing the original solution with its opposite. Let x be a real number in the interval [lb, ub]; then its opposite can be calculated from the following equation:
x̄ = ub + lb − x
(2.18)
where lb and ub are the lower and upper boundaries, respectively, and x̄ indicates the opposite solution. If x is a vector with multiple values, then x̄ can be computed from the following equation:
x̄_j = ub_j + lb_j − x_j
(2.19)
where x_j indicates the jth value of x, and ub_j and lb_j refer to the jth upper and lower boundaries, respectively.
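Eq (2.19) can be sketched as follows; the pooled selection in the second helper mirrors how mAO later uses OBL at initialization (keep the best N from X ∪ X̄). Function names are illustrative:

```python
import numpy as np

def opposite(x, lb, ub):
    """Opposite solution, Eq (2.19): opp_j = ub_j + lb_j - x_j."""
    return np.asarray(ub) + np.asarray(lb) - np.asarray(x)

def obl_init_select(X, fitness, lb, ub):
    """Pool the population with its opposites and keep the best N (minimization)."""
    pool = np.vstack([X, opposite(X, lb, ub)])
    f = np.apply_along_axis(fitness, 1, pool)
    return pool[np.argsort(f)[: len(X)]]

X = np.array([[3.0, 3.0], [0.1, 0.1]])
selected = obl_init_select(X, lambda x: float(np.sum(x ** 2)), 0.0, 4.0)
```

With the sphere function on [0, 4]^2, the opposite of (3, 3) is (1, 1), which is fitter, so the selected pair keeps (0.1, 0.1) and (1, 1).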
2.4. Chaotic local search
The chaotic local search (CLS) technique has been integrated with many swarm optimizers, such as the WOA [68], the HHO [7], brain storm optimization [69], and the Jaya algorithm (JAYA) [70]. The CLS technique is most often used with the logistic map, which is given by the following equation:
o_{s+1} = C × o_s × (1 − o_s)
(2.20)
where s is the index of the current iteration and C is a control parameter equal to 4; the initial value o_1 must not equal 0.25, 0.50 or 0.75. Local search is used to explore the neighborhood of the best solution found so far. The CLS step can be represented by the following equation:
C_s = (1 − μ) × X_best + μ × C'_i, i = 1, 2, ..., n

(2.21)

where C_s refers to the value generated by the CLS at iteration i, X_best is the best solution found so far, and C'_i is calculated as follows:

C'_i = LB + C_i × (UB − LB)

(2.22)
where μ is a shrinking factor that can be computed from the following equation:
μ = (T − t + 1) / T
(2.23)
where t and T refer to the current iteration and the maximum number of iterations, respectively.
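One CLS step can be sketched as below. Anchoring the step on the current best solution is an interpretation consistent with the text's statement that CLS searches the neighborhood of the best solution found so far; the function names are illustrative:

```python
import numpy as np

def logistic_map(o, C=4.0):
    """Logistic map, Eq (2.20): o_{s+1} = C * o_s * (1 - o_s)."""
    return C * o * (1.0 - o)

def chaotic_local_search(best, lb, ub, t, T, o):
    """One CLS step around the best solution (Eqs (2.21)-(2.23))."""
    o = logistic_map(o)                          # advance the chaotic variable
    c_prime = lb + o * (ub - lb)                 # Eq (2.22): map o into [lb, ub]
    mu = (T - t + 1) / T                         # Eq (2.23): shrinks as t grows
    candidate = (1 - mu) * best + mu * c_prime   # Eq (2.21)
    return candidate, o

candidate, o_next = chaotic_local_search(np.array([1.0, -2.0]),
                                         lb=-5.0, ub=5.0, t=1, T=10, o=0.7)
```

Early on μ is close to 1, so the candidate is dominated by the chaotic point; near the end μ shrinks and the candidate collapses onto the best solution, turning CLS into a fine-grained local refinement.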
2.5. Restart strategy
During the search, some agents may fail to find better positions because they are trapped in local optima. Such agents can hurt the overall search: they consume generations without improving the search process. The restart strategy (RS), proposed by Zhang et al. [71], helps such poor agents jump out of local regions. The RS counts, for each individual, the number of consecutive generations in which it has not been improved. If the ith agent is updated, its trial counter is reset to zero; otherwise, the trial counter is increased by 1. Once the trial counter reaches a certain threshold, the individual's position is changed using the following two equations:
X(t+1) = lb + rand × (ub − lb)
(2.24)
X(t+1) = rand × (ub + lb) − X(t)
(2.25)
where ub and lb refer to the upper and lower boundaries and rand indicates a random number in the interval [0, 1].
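The restart step can be sketched as follows. The text does not specify how Eqs (2.24) and (2.25) are combined, so this sketch makes the assumption of evaluating both candidates and keeping the fitter one; the function name is illustrative:

```python
import numpy as np

def restart_if_stuck(x, trial, threshold, lb, ub, fitness, rng=None):
    """Regenerate a stagnant agent via Eq (2.24) or Eq (2.25).
    Combining rule (keep the fitter candidate) is an assumption."""
    if trial < threshold:
        return x, trial                               # agent still has budget
    rng = np.random.default_rng() if rng is None else rng
    x1 = lb + rng.random(x.shape) * (ub - lb)         # Eq (2.24): uniform restart
    x2 = rng.random(x.shape) * (ub + lb) - x          # Eq (2.25): opposition-like restart
    x_new = x1 if fitness(x1) <= fitness(x2) else x2
    return x_new, 0                                   # reset the trial counter

x = np.array([2.0, 2.0])
x_new, t_new = restart_if_stuck(x, 5, 5, -3.0, 3.0,
                                lambda z: float(np.sum(z ** 2)),
                                rng=np.random.default_rng(1))
```

Agents below the threshold pass through untouched, so the helper can be called unconditionally for every agent each generation.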
3. Enhanced Aquila optimizer
3.1. Shortcoming of Aquila algorithm
Like other swarm optimizers, the AO may stagnate in sub-optimal areas and converge slowly, especially when handling complex, high-dimensional problems.
3.2. Architecture of modified AO
Our proposed algorithm, termed mAO, addresses the limitations of the original optimizer. In mAO, three different strategies are used to improve the classical AO: opposition-based learning, the restart strategy, and chaotic local search. The OBL strategy is used both in the initialization phase and in the position-updating process. In initialization, OBL selects the best N solutions from the pool X ∪ X̄ to ensure the algorithm starts with a good set of agents, whereas in the updating process it improves the algorithm's exploration abilities. Moreover, a chaotic local search mechanism is used to refine the best solution found so far, which in turn improves the whole population. Finally, a restart strategy is employed to change the positions of the worst individuals once they have fallen into local regions. The pseudo-code of the developed optimizer is given in Algorithm 1.
3.3. Complexity of mAO
The complexity of the proposed algorithm can be computed by calculating the complexity of each phase separately, i.e., initialization, evaluation, and the updating process. So, O(mAO) = O(Initialization) + O(Evaluation) + O(Updating Position) + O(CLS + OBL + RS). If D is the number of dimensions, N the number of individuals, and T the maximum number of iterations, the following is obtained:
O(Initialization)=O(N)
O(Evaluation)=O(N×T)
O(UpdatingPosition)=O(N×T×D)
O(CLS)=O(N×T)
O(OBL)=O(N×T×D)
O(RS)=O(N×D)
O(CLS+OBL+RS)=O(N×T×D)
O(mAO)=O(N)+O(N×T)+O(N×T×D)+O(N×T×D)=O(N×T×D)
4. Experiments and discussion
4.1. Parameter setting
To validate the proposed approach, 29 functions from CEC2017 were used to test mAO's performance. These CEC2017 functions are very challenging and cover different function types (unimodal, multimodal, composite, and hybrid). The description of the CEC2017 functions is given in Table 4, where opt. refers to the global optimum value. All experiments were performed in Matlab 2021b on an Intel Core i7 machine with 8 GB of RAM. The parameter settings of all experiments are given in Table 2. mAO is compared with the original Aquila optimizer and nine other well-known and powerful swarm algorithms, namely the crow search algorithm [40], the elephant herding optimizer [60], the grasshopper optimization algorithm [46], LSHADE [61], LSHADE-EpSin [62], moth-flame optimization [63], multi-verse optimization [64], and the particle swarm algorithm [24]. The parameters of each algorithm are given in Table 3.
Algorithm 1 Improved Aquila optimizer
1: Initialize the population X of the AO
2: Calculate X̄ and select the best N from (X ∪ X̄)
3: Initialize AO parameters
4: while (t < T) do
5:  Compute the objective function values
6:  Select the best agent Xbest
7:  for (i = 1, 2, ..., N) do
8:   Update the current solution mean
9:   Compute y, x, G1, G2 and Levy(D)
10:   if (t ≤ (2/3)T) then
11:    if rand ≤ 0.5 then
12:     Update the current position using Eq (2.3)
13:     Compute the opposite position using Eq (2.19)
14:    else
15:     Update the current position using Eq (2.5)
16:     Compute the opposite position using Eq (2.19)
17:    end if
18:   else
19:    if rand ≤ 0.5 then
20:     Update the current solution using Eq (2.13)
21:     Compute the opposite position using Eq (2.19)
22:    else
23:     Update the current solution using Eq (2.14)
24:     Compute the opposite position using Eq (2.19)
25:    end if
26:   end if
27:  end for
28:  Apply the RS strategy using Eqs (2.24) and (2.25)
29:  Apply the CLS strategy using Eq (2.21)
30: end while
31: Return the best solution
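The loop of Algorithm 1 can be condensed into a runnable skeleton. This is a simplified sketch, not the authors' implementation: the placeholder Gaussian step stands in for the four AO update equations (Eqs (2.3), (2.5), (2.13), (2.14)), and the function name, seed, and threshold default are assumptions:

```python
import numpy as np

def mao(fitness, n, dim, lb, ub, T, threshold=6, seed=0):
    rng = np.random.default_rng(seed)
    # OBL initialization: keep the best n from the pool X U X-bar
    X = rng.random((n, dim)) * (ub - lb) + lb
    pool = np.vstack([X, ub + lb - X])
    f = np.array([fitness(x) for x in pool])
    order = np.argsort(f)[:n]
    X, f = pool[order], f[order]
    trial = np.zeros(n, dtype=int)
    o = 0.7                                  # chaotic variable, o1 not in {0.25, 0.5, 0.75}
    for t in range(1, T + 1):
        best = X[int(np.argmin(f))].copy()
        for i in range(n):
            # Placeholder: shrinking random step around the best agent
            cand = best + rng.normal(0.0, 0.1 * (1 - t / T) + 1e-3, dim) * (ub - lb)
            cand = np.clip(cand, lb, ub)
            opp = ub + lb - cand             # OBL on the candidate (Eq (2.19))
            if fitness(opp) < fitness(cand):
                cand = opp
            fc = fitness(cand)
            if fc < f[i]:
                X[i], f[i], trial[i] = cand, fc, 0
            else:
                trial[i] += 1
                if trial[i] >= threshold:    # RS (Eq (2.24))
                    X[i] = rng.random(dim) * (ub - lb) + lb
                    f[i], trial[i] = fitness(X[i]), 0
        o = 4.0 * o * (1.0 - o)              # CLS (Eqs (2.20)-(2.23))
        mu = (T - t + 1) / T
        cand = (1 - mu) * best + mu * (lb + o * (ub - lb))
        j = int(np.argmin(f))
        if fitness(cand) < f[j]:
            X[j], f[j] = cand, fitness(cand)
    j = int(np.argmin(f))
    return X[j], f[j]

best_x, best_f = mao(lambda x: float(np.sum(x ** 2)), n=20, dim=5, lb=-10.0, ub=10.0, T=50)
```

On the sphere function, the greedy acceptance plus OBL and CLS drive the population quickly toward the origin, while the trial counters restart agents that stop improving.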
The results of the developed optimizer and its competitors are shown in Table 5 in terms of best (min), worst (max), mean (average), and standard deviation. From that table, it can be seen that the suggested technique performs well on all function types. For example, in terms of the average, it ranked first on all unimodal functions (F1 and F3) and all multimodal functions (F4–F10). Overall, mAO achieved better results than the original optimizer and the other algorithms: it ranked first on almost all functions, and on the composite problems it ranked first on 5 functions out of 10. Besides statistical measures, the convergence curve is a powerful tool for comparing a new algorithm with its competitors and checking whether its convergence is fast or slow. As shown in Figures 1–3, mAO achieves fast convergence on all of the mentioned function types.
Table 5.
The comparison results of all algorithms over 30 functions.
Furthermore, a statistical comparison using the Wilcoxon test [72,73] was carried out between the developed algorithm and all competitors. Table 6 shows the p-values, which reveal a significant difference between the outputs of the different optimizers. The results in Table 6 confirm the superiority of the mAO algorithm in finding near-optimal solutions. To show the power and efficiency of the proposed algorithm, a scalability test was performed on 10 and 50 dimensions using the same functions and the same competing algorithms. The results of this scalability test are shown in Table 7: mAO is better than the other competitors on almost all functions.
Table 6.
Wilcoxon rank sum test results for mAO against other algorithms CEC2017.
To show the benefit of integrating the three strategies with the AO, we also test the standard AO with each operator separately. Table 8 shows the average and standard deviation of four algorithms: AO with OBL (AOOBL), AO with CLS (AOCLS), AO with RS (AORS), and the developed algorithm mAO, which combines the AO with CLS, RS, and OBL.
Table 8.
Mean and Standard Deviation values obtained by various Enhanced AO.
5. Constrained engineering problems
In this section, the performance of the developed optimizer is tested on several real-world constrained problems involving multiple inequality constraints. These problems are the welded beam design problem, the pressure vessel design problem, the tension/compression spring design problem, the speed reducer design problem, and the three-bar truss design problem. The mathematical formulations of these problems can be found in [68,74,75].
5.1. Welded beam design problem
The first constrained problem used in this study is the welded beam design (WBD) problem, proposed by Coello [76]. The aim of this problem is to minimize the welded beam cost; its design structure is shown in Figure 4. WBD has 7 constraints and 4 design variables, namely bar thickness (b), bar height (t), weld thickness (h), and the length of the attached part of the bar (l). The mathematical representation of WBD can be formulated as follows:
The results for WBD are shown in Table 9, where mAO is compared with the classical AO, GSA [46], GA [77], SSA [78], MPA [79], HHO [80,81], WOA [82], and CSA [40]. From that table, it is notable that mAO outperformed the other swarm optimizers, with an objective value of 1.6565 and decision values (x1, x2, x3, x4) = (0.1625, 3.4705, 9.0234, 0.2057).
Table 9.
Comparison of optimum results for Welded beam design problem.
5.2. Pressure vessel design problem
The second constrained problem in this study is a mixed-integer optimization problem termed the pressure vessel design (PVD) problem, proposed by Kannan and Kramer [83]. PVD aims to minimize the raw-material cost of the cylindrical vessel shown in Figure 5. PVD has 4 parameters, namely head thickness (Th), shell thickness (Ts), the length of the cylindrical section (L), and the inner radius (R). PVD is modeled mathematically as follows:
The results for PVD are given in Table 10, where the suggested optimizer is compared with the original AO, WOA [82], PSO-SCA [84], HS [85], SMA [86], CPSO [87], GWO [47], HHO [80], GOA [46], TEO [88], and SO [30]. From that table, it can be seen that the developed optimizer ranked first, with an objective value of 5946.3358 and decision values (x1, x2, x3, x4) = (1.0530, 0.181884, 58.619, 38.8080).
Table 10.
Comparison of optimum results for pressure vessel design.
5.3. Tension/compression spring design problem
The third constrained engineering problem discussed here is the tension/compression spring design (TCSD) problem, introduced by Arora [89]. Its main objective is to minimize the spring weight by determining the optimal design variable values that satisfy the constraints. TCSD has 3 variables, namely the mean coil diameter (D), the wire diameter (d), and the number of active coils (N). The TCSD design is given in Figure 6 and its mathematical formulation is as follows:
The results for TCSD are listed in Table 11, from which we can conclude that mAO achieved better results than the original AO and the other competitors. It achieves a fitness value of 0.011056 with decision values (x1, x2, x3) = (0.0502339, 0.32282, 10.5244).
5.4. Speed reducer design problem
The fourth engineering problem discussed in this section is the speed reducer design (SRD) problem [94], whose main aim is to minimize the speed reducer weight subject to constraints on the bending stress of the gear teeth, the stresses in the shafts, and the transverse deflections of the shafts. It has seven variables and its design is shown in Figure 7. The SRD problem is described mathematically as follows:
mAO is compared with different metaheuristic optimizers, including PSO [95], MDA [96], GSA [93], HS [29], SCA [97], SES [98], SBSM [99], and hHHO-SCA [100], as shown in Table 12. From this table, it is obvious that mAO outperformed the other algorithms, ranking first with a fitness value of 3002.7328 and decision values (x1, x2, x3, x4, x5, x6, x7) = (3.5012, 0.7, 17, 7.3100, 7.8873, 3.0541, 5.2994).
Table 12.
Comparison of optimum results for Speed Reducer problem.
5.5. Three-bar truss design problem
The last engineering problem addressed in this manuscript is the three-bar truss design (TBD) problem. TBD is a fractional, nonlinear civil engineering problem introduced by Nowacki [101]. Its objective is to minimize the truss weight. It has two variables and its mathematical formulation is shown below:
The mAO results are compared with those of CS [102], GOA [46], DEDS [103], MBA [104], PSO-DE [84], AAA [105], and the AO. The results for TBD are listed in Table 13, in which mAO outperformed the other competitors. mAO ranked first with a fitness value of 231.8681 and decision values (x1, x2) = (30.7886, 0.3844).
Table 13.
Optimization results for the Three-bar truss design problem.
6. Conclusions
In this study, a novel AO variant called mAO is proposed to tackle various optimization problems. mAO is based on three techniques: 1) opposition-based learning, to improve the optimizer's exploration phase; 2) a restart strategy, to remove the worst agents and replace them with random ones; and 3) chaotic local search, to add more exploitation ability to the original algorithm. mAO was tested on 29 CEC2017 functions and five different engineering optimization problems. The statistical analysis and experimental results show the significance of the suggested optimizer in solving various optimization problems. However, like other swarm-based algorithms, mAO converges slowly on high-dimensional problems, so it cannot solve all types of optimization problems.
In the future, we can apply mAO to feature selection, job scheduling, combinatorial optimization problems, and stress suitability. Binary and multi-objective versions may also be proposed.
Acknowledgments
The authors would like to thank the Digital Fujian Research Institute for Industrial Energy Big Data, the Fujian Province University Key Lab for Industry Big Data Analysis and Application, the Fujian Key Lab of Agriculture IOT Application, the IOT Application Engineering Research Center of Fujian Province Colleges and Universities, and the Sanming City 5G Innovation Laboratory, as well as the anonymous reviewers and the editor for their careful reviews and constructive suggestions that helped improve the quality of this paper. This work was supported by the Educational Research Projects of Young and Middle-aged Teachers in Fujian Province (JAT200648) and the Fujian Natural Science Foundation Project (2021J011128).
Conflict of interest
The authors declare there is no conflict of interest.
References
[1]
Y. J. Zhang, Y. F. Wang, Y. X. Yan, J. Zhao, Z. M. Gao, Lmraoa: An improved arithmetic optimization algorithm with multi-leader and high-speed jumping based on opposition-based learning solving engineering and numerical problems, Alexandria Eng. J., 61 (2022), 12367–12403. https://doi.org/10.1016/j.aej.2022.06.017 doi: 10.1016/j.aej.2022.06.017
[2]
S. Singh, H. Singh, N. Mittal, A. G. Hussien, F. Sroubek, A feature level image fusion for night-vision context enhancement using arithmetic optimization algorithm based image segmentation, Expert Syst. Appl., 209 (2022), 118272. https://doi.org/10.1016/j.eswa.2022.118272 doi: 10.1016/j.eswa.2022.118272
[3]
A. G. Hussien, A. E. Hassanien, E. H. Houssein, M. Amin, A. T. Azar, New binary whale optimization algorithm for discrete optimization problems, Eng. Optimiz., 52 (2020), 945–959. https://doi.org/10.1080/0305215X.2019.1624740 doi: 10.1080/0305215X.2019.1624740
[4]
L. D. Giovanni, F. Pezzella, An improved genetic algorithm for the distributed and flexible job-shop scheduling problem, Eur. J. Oper. Res., 200 (2010), 395–408. https://doi.org/10.1016/j.ejor.2009.01.008 doi: 10.1016/j.ejor.2009.01.008
[5]
A. G. Hussien, E. H. Houssein, A. E. Hassanien, A binary whale optimization algorithm with hyperbolic tangent fitness function for feature selection, in IEEE 2017 Eighth international conference on intelligent computing and information systems (ICICIS), (2017), 166–172. https://doi.org/10.1109/INTELCIS.2017.8260031
[6]
A. G. Hussien, D. Oliva, E. H. Houssein, A. A. Juan, X. Yu, Binary whale optimization algorithm for dimensionality reduction, Mathematics, 8 (2020), 1821. https://doi.org/10.3390/math8101821 doi: 10.3390/math8101821
[7]
A. G. Hussien, M. Amin, A self-adaptive harris hawks optimization algorithm with opposition-based learning and chaotic local search strategy for global optimization and feature selection, Int. J. Mach. Learn. Cyb., 13 (2022), 309–336. https://doi.org/10.1007/s13042-021-01326-4 doi: 10.1007/s13042-021-01326-4
[8]
Q. Liu, N. Li, H. Jia, Q. Qi, L. Abualigah, Modified remora optimization algorithm for global optimization and multilevel thresholding image segmentation, Mathematics, 10 (2022), 1014. https://doi.org/10.3390/math10071014 doi: 10.3390/math10071014
[9]
A. A. Ewees, L. Abualigah, D. Yousri, A. T. Sahlol, M. A. Al-qaness, S. Alshathri, et al., Modified artificial ecosystem-based optimization for multilevel thresholding image segmentation, Mathematics, 9 (2021), 2363. https://doi.org/10.3390/math9192363 doi: 10.3390/math9192363
[10]
M. Besnassi, N. Neggaz, A. Benyettou, Face detection based on evolutionary haar filter, Pattern Anal. Appl., 23 (2020), 309–330. https://doi.org/10.1007/s10044-019-00784-5 doi: 10.1007/s10044-019-00784-5
[11]
E. H. Houssein, M. Amin, A. G. Hussien, A. E. Hassanien, Swarming behaviour of salps algorithm for predicting chemical compound activities, in IEEE 2017 eighth international conference on intelligent computing and information systems (ICICIS), (2017), 315–320. https://doi.org/10.1109/INTELCIS.2017.8260072
[12]
H. Fathi, H. AlSalman, A. Gumaei, I. I. Manhrawy, A. G. Hussien, P. El-Kafrawy, An efficient cancer classification model using microarray and high-dimensional data, Comput. Intell. Neurosci., 2021 (2021). https://doi.org/10.1155/2021/7231126
[13]
L. Abualigah, A. H. Gandomi, M. A. Elaziz, A. G. Hussien, A. M. Khasawneh, M. Alshinwan, et al., Nature-inspired optimization algorithms for text document clustering—a comprehensive analysis, Algorithms, 13 (2020), 345. https://doi.org/10.3390/a13120345 doi: 10.3390/a13120345
[14]
A. S. Sadiq, A. A. Dehkordi, S. Mirjalili, Q. V. Pham, Nonlinear marine predator algorithm: A cost-effective optimizer for fair power allocation in noma-vlc-b5g networks, Expert Syst. Appl., 203 (2022), 117395. https://doi.org/10.1016/j.eswa.2022.117395 doi: 10.1016/j.eswa.2022.117395
[15]
A. A. Dehkordi, A. S. Sadiq, S. Mirjalili, K. Z. Ghafoor, Nonlinear-based chaotic harris hawks optimizer: algorithm and internet of vehicles application, Appl. Soft Comput., 109 (2021), 107574. https://doi.org/10.1016/j.asoc.2021.107574 doi: 10.1016/j.asoc.2021.107574
[16]
A. S. Sadiq, A. A. Dehkordi, S. Mirjalili, J. Too, P. Pillai, Trustworthy and efficient routing algorithm for iot-fintech applications using non-linear lévy brownian generalized normal distribution optimization, IEEE Internet Things J., 2021. https://doi.org/10.1109/JIOT.2021.3109075
[17]
H. Faris, S. Mirjalili, I. Aljarah, Automatic selection of hidden neurons and weights in neural networks using grey wolf optimizer based on a hybrid encoding scheme, Int. J. Mach. Learn. Cyb., 10 (2019), 2901–2920. https://doi.org/10.1007/s13042-018-00913-2 doi: 10.1007/s13042-018-00913-2
[18]
B. Cao, J. Zhao, P. Yang, Y. Gu, K. Muhammad, J. J. Rodrigues, et al., Multiobjective 3-d topology optimization of next-generation wireless data center network, IEEE Trans. Ind. Inf., 16 (2019), 3597–3605. https://doi.org/10.1109/TII.2019.2952565 doi: 10.1109/TII.2019.2952565
[19]
X. Fu, P. Pace, G. Aloi, L. Yang, G. Fortino, Topology optimization against cascading failures on wireless sensor networks using a memetic algorithm, Comput. Networks, 177 (2020), 107327. https://doi.org/10.1016/j.comnet.2020.107327 doi: 10.1016/j.comnet.2020.107327
[20]
L. Abualigah, A. Diabat, A comprehensive survey of the grasshopper optimization algorithm: results, variants, and applications, Neural Comput. Appl., 32 (2020), 15533–15556. https://doi.org/10.1007/s00521-020-04789-8 doi: 10.1007/s00521-020-04789-8
[21]
H. Chen, H. Qiao, L. Xu, Q. Feng, K. Cai, A fuzzy optimization strategy for the implementation of rbf lssvr model in vis–nir analysis of pomelo maturity, IEEE Trans. Ind. Inf., 15 (2019), 5971–5979. https://doi.org/10.1109/TII.2019.2933582 doi: 10.1109/TII.2019.2933582
[22]
H. G. Beyer, B. Sendhoff, Robust optimization – a comprehensive survey, Comput. Methods Appl. Mech. Eng., 196 (2007), 3190–3218. https://doi.org/10.1016/j.cma.2007.03.003 doi: 10.1016/j.cma.2007.03.003
[23]
D. Oliva, A. A. Ewees, M. A. E. Aziz, A. E. Hassanien, M. P. Cisneros, A chaotic improved artificial bee colony for parameter estimation of photovoltaic cells, Energies, 10 (2017), 865. https://doi.org/10.3390/en10070865 doi: 10.3390/en10070865
[24]
J. Kennedy, R. Eberhart, Particle swarm optimization, in IEEE Proceedings of ICNN'95-International Conference on Neural Networks, 4 (1995), 1942–1948. https://doi.org/10.1109/ICNN.1995.488968
[25]
D. Karaboga, C. Ozturk, A novel clustering approach: Artificial bee colony (abc) algorithm, Appl. Soft Comput., 11 (2011), 652–657. https://doi.org/10.1016/j.asoc.2009.12.025 doi: 10.1016/j.asoc.2009.12.025
[26]
R. R. Mostafa, A. G. Hussien, M. A. Khan, S. Kadry, F. A. Hashim, Enhanced coot optimization algorithm for dimensionality reduction, in IEEE 2022 Fifth International Conference of Women in Data Science at Prince Sultan University (WiDS PSU), (2022), 43–48. https://doi.org/10.1109/WiDS-PSU54548.2022.00020
[27]
J. H. Holland, Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence, MIT Press, 1992.
[28]
A. H. Gandomi, A. H. Alavi, Krill herd: A new bio-inspired optimization algorithm, Commun. Nonlinear Sci., 17 (2012), 4831–4845. https://doi.org/10.1016/j.cnsns.2012.05.010 doi: 10.1016/j.cnsns.2012.05.010
F. A. Hashim, A. G. Hussien, Snake optimizer: A novel meta-heuristic optimization algorithm, Knowl.-Based Syst., 242 (2022), 108320. https://doi.org/10.1016/j.knosys.2022.108320 doi: 10.1016/j.knosys.2022.108320
[31]
G. G. Wang, S. Deb, Z. Cui, Monarch butterfly optimization, Neural Comput. Appl., 31 (2019), 1995–2014. https://doi.org/10.1007/s00521-015-1923-y doi: 10.1007/s00521-015-1923-y
[32]
S. Li, H. Chen, M. Wang, A. A. Heidari, S. Mirjalili, Slime mould algorithm: A new method for stochastic optimization, Future Gener. Comput. Syst., 111 (2020), 300–323. https://doi.org/10.1016/j.future.2020.03.055 doi: 10.1016/j.future.2020.03.055
[33]
G. G. Wang, Moth search algorithm: a bio-inspired metaheuristic algorithm for global optimization problems, Memet. Comput., 10 (2018), 151–164. https://doi.org/10.1007/s12293-016-0212-3 doi: 10.1007/s12293-016-0212-3
[34]
Y. Yang, H. Chen, A. A. Heidari, A. H. Gandomi, Hunger games search: Visions, conception, implementation, deep analysis, perspectives, and towards performance shifts, Expert Syst. Appl., 177 (2021), 114864. https://doi.org/10.1016/j.eswa.2021.114864 doi: 10.1016/j.eswa.2021.114864
[35]
I. Ahmadianfar, A. A. Heidari, A. H. Gandomi, X. Chu, H. Chen, Run beyond the metaphor: An efficient optimization algorithm based on runge kutta method, Expert Syst. Appl., 181 (2021), 115079. https://doi.org/10.1016/j.eswa.2021.115079 doi: 10.1016/j.eswa.2021.115079
[36]
I. Ahmadianfar, A. A. Heidari, S. Noshadian, H. Chen, A. H. Gandomi, Info: An efficient optimization algorithm based on weighted mean of vectors, Expert Syst. Appl., 195 (2022), 116516. https://doi.org/10.1016/j.eswa.2022.116516 doi: 10.1016/j.eswa.2022.116516
[37]
A. G. Hussien, A. A. Heidari, X. Ye, G. Liang, H. Chen, Z. Pan, Boosting whale optimization with evolution strategy and gaussian random walks: an image segmentation method, Eng. Comput., (2022), 1–45. https://doi.org/10.1007/s00366-021-01542-0
[38]
L. Abualigah, M. A. Elaziz, A. G. Hussien, B. Alsalibi, S. M. J. Jalali, A. H. Gandomi, Lightning search algorithm: A comprehensive survey, Appl. Intell., 51 (2021), 2353–2376. https://doi.org/10.1007/s10489-020-01947-2 doi: 10.1007/s10489-020-01947-2
[39]
A. S. Assiri, A. G. Hussien, M. Amin, Ant lion optimization: variants, hybrids, and applications, IEEE Access, 8 (2020), 77746–77764. https://doi.org/10.1109/ACCESS.2020.2990338 doi: 10.1109/ACCESS.2020.2990338
[40]
A. G. Hussien, M. Amin, M. Wang, G. Liang, A. Alsanad, A. Gumaei, et al., Crow search algorithm: Theory, recent advances, and applications, IEEE Access, 8 (2020), 173548–173565. https://doi.org/10.1109/ACCESS.2020.3024108 doi: 10.1109/ACCESS.2020.3024108
[41]
A. G. Hussien, M. Amin, M. A. E. Aziz, A comprehensive review of moth-flame optimisation: variants, hybrids, and applications, J. Exp. Theor. Artif., 32 (2020), 705–725. https://doi.org/10.1080/0952813X.2020.1737246 doi: 10.1080/0952813X.2020.1737246
[42]
R. Zheng, A. G. Hussien, H. M. Jia, L. Abualigah, S. Wang, D. Wu, An improved wild horse optimizer for solving optimization problems, Mathematics, 10 (2022), 1311. https://doi.org/10.3390/math10081311 doi: 10.3390/math10081311
[43]
S. Wang, A. G. Hussien, H. Jia, L. Abualigah, R. Zheng, Enhanced remora optimization algorithm for solving constrained engineering optimization problems, Mathematics, 10 (2022), 1696. https://doi.org/10.3390/math10101696 doi: 10.3390/math10101696
[44]
L. Wang, Q. Cao, Z. Zhang, S. Mirjalili, W. Zhao, Artificial rabbits optimization: A new bio-inspired meta-heuristic algorithm for solving engineering optimization problems, Eng. Appl. Artif. Intell., 114 (2022), 105082. https://doi.org/10.1016/j.engappai.2022.105082 doi: 10.1016/j.engappai.2022.105082
[45]
W. Zhao, Z. Zhang, S. Mirjalili, L. Wang, N. Khodadadi, S. M. Mirjalili, An effective multi-objective artificial hummingbird algorithm with dynamic elimination-based crowding distance for solving engineering design problems, Comput. Method. Appl. M., 398 (2022), 115223. https://doi.org/10.1016/j.cma.2022.115223 doi: 10.1016/j.cma.2022.115223
[46]
S. Saremi, S. Mirjalili, A. Lewis, Grasshopper optimisation algorithm: theory and application, Adv. Eng. Software, 105 (2017), 30–47. https://doi.org/10.1016/j.advengsoft.2017.01.004 doi: 10.1016/j.advengsoft.2017.01.004
[47]
S. Mirjalili, S. M. Mirjalili, A. Lewis, Grey wolf optimizer, Adv. Eng. Software, 69 (2014), 46–61. https://doi.org/10.1016/j.advengsoft.2013.12.007 doi: 10.1016/j.advengsoft.2013.12.007
[48]
E. H. Houssein, A. G. Hussien, A. E. Hassanien, S. Bhattacharyya, M. Amin, S-shaped binary whale optimization algorithm for feature selection, in First International Symposium on Signal and Image Processing (ISSIP 2017), (2017), 79–87.
[49]
L. Abualigah, D. Yousri, M. A. Elaziz, A. A. Ewees, M. A. Al-Qaness, A. H. Gandomi, Aquila optimizer: a novel meta-heuristic optimization algorithm, Comput. Ind. Eng., 157 (2021), 107250. https://doi.org/10.1016/j.cie.2021.107250 doi: 10.1016/j.cie.2021.107250
[50]
S. Wang, H. Jia, L. Abualigah, Q. Liu, R. Zheng, An improved hybrid aquila optimizer and harris hawks algorithm for solving industrial engineering optimization problems, Processes, 9 (2021), 1551. https://doi.org/10.3390/pr9091551 doi: 10.3390/pr9091551
[51]
S. Mahajan, L. Abualigah, A. K. Pandit, M. Altalhi, Hybrid aquila optimizer with arithmetic optimization algorithm for global optimization tasks, Soft Comput., 26 (2022), 4863–4881. https://doi.org/10.1007/s00500-022-06873-8 doi: 10.1007/s00500-022-06873-8
[52]
L. Abualigah, A. Diabat, S. Mirjalili, M. A. Elaziz, A. H. Gandomi, The arithmetic optimization algorithm, Comput. Methods Appl. Mech. Eng., 376 (2021), 113609. https://doi.org/10.1016/j.cma.2020.113609 doi: 10.1016/j.cma.2020.113609
[53]
Y. J. Zhang, Y. X. Yan, J. Zhao, Z. M. Gao, Aoaao: The hybrid algorithm of arithmetic optimization algorithm with aquila optimizer, IEEE Access, 10 (2022), 10907–10933. https://doi.org/10.1109/ACCESS.2022.3144431 doi: 10.1109/ACCESS.2022.3144431
[54]
J. Zhao, Z. M. Gao, H. F. Chen, The simplified aquila optimization algorithm, IEEE Access, 10 (2022), 22487–22515. https://doi.org/10.1109/ACCESS.2022.3153727 doi: 10.1109/ACCESS.2022.3153727
[55]
C. Ma, H. Huang, Q. Fan, J. Wei, Y. Du, W. Gao, Grey wolf optimizer based on aquila exploration method, Expert Syst. Appl., 205 (2022), 117629. https://doi.org/10.1016/j.eswa.2022.117629 doi: 10.1016/j.eswa.2022.117629
[56]
B. Gao, Y. Shi, F. Xu, X. Xu, An improved aquila optimizer based on search control factor and mutations, Processes, 10 (2022), 1451. https://doi.org/10.3390/pr10081451 doi: 10.3390/pr10081451
[57]
A. M. AlRassas, M. A. Al-qaness, A. A. Ewees, S. Ren, M. A. Elaziz, R. Damaševičius, et al., Optimized anfis model using aquila optimizer for oil production forecasting, Processes, 9 (2021), 1194. https://doi.org/10.3390/pr9071194 doi: 10.3390/pr9071194
[58]
M. A. Elaziz, A. Dahou, N. A. Alsaleh, A. H. Elsheikh, A. I. Saba, M. Ahmadein, Boosting covid-19 image classification using mobilenetv3 and aquila optimizer algorithm, Entropy, 23 (2021), 1383. https://doi.org/10.3390/e23111383 doi: 10.3390/e23111383
[59]
A. Fatani, A. Dahou, M. A. Al-Qaness, S. Lu, M. A. Elaziz, Advanced feature extraction and selection approach using deep learning and aquila optimizer for iot intrusion detection system, Sensors, 22 (2021), 140. https://doi.org/10.3390/s22010140 doi: 10.3390/s22010140
[60]
G. G. Wang, S. Deb, L. D. S. Coelho, Elephant herding optimization, in IEEE 2015 3rd International Symposium on Computational and Business Intelligence (ISCBI), (2015), 1–5. https://doi.org/10.1109/ISCBI.2015.8
[61]
R. Tanabe, A. S. Fukunaga, Improving the search performance of shade using linear population size reduction, in 2014 IEEE Congress on Evolutionary Computation (CEC), (2014), 1658–1665. https://doi.org/10.1109/CEC.2014.6900380
[62]
N. H. Awad, M. Z. Ali, P. N. Suganthan, Ensemble sinusoidal differential covariance matrix adaptation with euclidean neighborhood for solving cec2017 benchmark problems, in 2017 IEEE Congress on Evolutionary Computation (CEC), (2017), 372–379. https://doi.org/10.1109/CEC.2017.7969336
[63]
S. Mirjalili, Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm, Knowl.-Based Syst., 89 (2015), 228–249. https://doi.org/10.1016/j.knosys.2015.07.006 doi: 10.1016/j.knosys.2015.07.006
[64]
S. Mirjalili, S. M. Mirjalili, A. Hatamlou, Multi-verse optimizer: a nature-inspired algorithm for global optimization, Neural Comput. Appl., 27 (2016), 495–513. https://doi.org/10.1007/s00521-015-1870-7 doi: 10.1007/s00521-015-1870-7
[65]
K. Steenhof, M. N. Kochert, T. L. Mcdonald, Interactive effects of prey and weather on golden eagle reproduction, J. Anim. Ecol., 66 (1997), 350–362.
[66]
H. R. Tizhoosh, Opposition-based learning: a new scheme for machine intelligence, in IEEE International Conference on Computational Intelligence for Modelling, Control and Automation and International Conference on Intelligent Agents, Web Technologies and Internet Commerce (CIMCA-IAWTIC'06), 1 (2005), 695–701. https://doi.org/10.1109/CIMCA.2005.1631345
[67]
A. G. Hussien, An enhanced opposition-based salp swarm algorithm for global optimization and engineering problems, J. Amb. Intell. Hum. Comput., 13 (2022), 129–150. https://doi.org/10.1007/s12652-021-02892-9 doi: 10.1007/s12652-021-02892-9
[68]
H. Chen, Y. Xu, M. Wang, X. Zhao, A balanced whale optimization algorithm for constrained engineering design problems, Appl. Math. Modell., 71 (2019), 45–59. https://doi.org/10.1016/j.apm.2019.02.004 doi: 10.1016/j.apm.2019.02.004
[69]
Y. Yu, S. Gao, S. Cheng, Y. Wang, S. Song, F. Yuan, Cbso: A memetic brain storm optimization with chaotic local search, Memet. Comput., 10 (2018), 353–367. https://doi.org/10.1007/s12293-017-0247-0 doi: 10.1007/s12293-017-0247-0
[70]
J. Zhao, Y. Zhang, S. Li, Y. Wang, Y. Yan, Z. Gao, A chaotic self-adaptive jaya algorithm for parameter extraction of photovoltaic models, Math. Biosci. Eng., 19 (2022), 5638–5670. https://doi.org/10.3934/mbe.2022264 doi: 10.3934/mbe.2022264
[71]
H. Zhang, Z. Wang, W. Chen, A. A. Heidari, M. Wang, X. Zhao, et al., Ensemble mutation-driven salp swarm algorithm with restart mechanism: Framework and fundamental analysis, Expert Syst. Appl., 165 (2021), 113897. https://doi.org/10.1016/j.eswa.2020.113897 doi: 10.1016/j.eswa.2020.113897
[72]
Y. Zhang, Y. Wang, S. Li, F. Yao, L. Tao, Y. Yan, et al., An enhanced adaptive comprehensive learning hybrid algorithm of rao-1 and jaya algorithm for parameter extraction of photovoltaic models, Math. Biosci. Eng., 19 (2022), 5610–5637. https://doi.org/10.3934/mbe.2022263 doi: 10.3934/mbe.2022263
[73]
Y. J. Zhang, Y. X. Yan, J. Zhao, Z. M. Gao, Cscahho: Chaotic hybridization algorithm of the sine cosine with harris hawk optimization algorithms for solving global optimization problems, Plos One, 17 (2022), e0263387. https://doi.org/10.1371/journal.pone.0263387 doi: 10.1371/journal.pone.0263387
[74]
M. Y. Cheng, D. Prayogo, A novel fuzzy adaptive teaching–learning-based optimization (fatlbo) for solving structural optimization problems, Eng. Comput., 33 (2017), 55–69. https://doi.org/10.1007/s00366-016-0456-z doi: 10.1007/s00366-016-0456-z
[75]
H. Samma, J. Mohamad-Saleh, S. A. Suandi, B. Lahasan, Q-learning-based simulated annealing algorithm for constrained engineering design problems, Neural Comput. Appl., 32 (2020), 5147–5161. https://doi.org/10.1007/s00521-019-04008-z doi: 10.1007/s00521-019-04008-z
[76]
C. A. C. Coello, Use of a self-adaptive penalty approach for engineering optimization problems, Comput. Ind., 41 (2000), 113–127. https://doi.org/10.1016/S0166-3615(99)00046-9 doi: 10.1016/S0166-3615(99)00046-9
[77]
K. Deb, Optimal design of a welded beam via genetic algorithms, AIAA J., 29 (1991), 2013–2015. https://doi.org/10.2514/3.10834 doi: 10.2514/3.10834
[78]
S. Mirjalili, A. H. Gandomi, S. Z. Mirjalili, S. Saremi, H. Faris, S. M. Mirjalili, Salp swarm algorithm: A bio-inspired optimizer for engineering design problems, Adv. Eng. Software, 114 (2017), 163–191. https://doi.org/10.1016/j.advengsoft.2017.07.002 doi: 10.1016/j.advengsoft.2017.07.002
[79]
A. Faramarzi, M. Heidarinejad, S. Mirjalili, A. H. Gandomi, Marine predators algorithm: A nature-inspired metaheuristic, Expert Syst. Appl., 152 (2020), 113377. https://doi.org/10.1016/j.eswa.2020.113377 doi: 10.1016/j.eswa.2020.113377
[80]
A. A. Heidari, S. Mirjalili, H. Faris, I. Aljarah, M. Mafarja, H. Chen, Harris hawks optimization: Algorithm and applications, Future Gener. Comput. Syst., 97 (2019), 849–872. https://doi.org/10.1016/j.future.2019.02.028 doi: 10.1016/j.future.2019.02.028
[81]
A. G. Hussien, L. Abualigah, R. A. Zitar, F. A. Hashim, M. Amin, A. Saber, et al., Recent advances in harris hawks optimization: A comparative study and applications, Electronics, 11 (2022), 1919. https://doi.org/10.3390/electronics11121919 doi: 10.3390/electronics11121919
[82]
S. Mirjalili, A. Lewis, The whale optimization algorithm, Adv. Eng. Software, 95 (2016), 51–67. https://doi.org/10.1016/j.advengsoft.2016.01.008 doi: 10.1016/j.advengsoft.2016.01.008
[83]
B. Kannan, S. N. Kramer, An augmented lagrange multiplier based method for mixed integer discrete continuous optimization and its applications to mechanical design, J. Mech. Design, 116 (1994), 405–411. https://doi.org/10.1115/1.2919393 doi: 10.1115/1.2919393
[84]
H. Liu, Z. Cai, Y. Wang, Hybridizing particle swarm optimization with differential evolution for constrained numerical and engineering optimization, Appl. Soft Comput., 10 (2010), 629–640. https://doi.org/10.1016/j.asoc.2009.08.031 doi: 10.1016/j.asoc.2009.08.031
[85]
M. Mahdavi, M. Fesanghary, E. Damangir, An improved harmony search algorithm for solving optimization problems, Appl. Math. Comput., 188 (2007), 1567–1579. https://doi.org/10.1016/j.amc.2006.11.033 doi: 10.1016/j.amc.2006.11.033
[86]
J. Zhao, Z. M. Gao, W. Sun, The improved slime mould algorithm with levy flight, in Journal of Physics: Conference Series, 1617 (2020), 012033. https://doi.org/10.1088/1742-6596/1617/1/012033
[87]
Q. He, L. Wang, An effective co-evolutionary particle swarm optimization for constrained engineering design problems, Eng. Appl. Artif. Intell., 20 (2007), 89–99. https://doi.org/10.1016/j.engappai.2006.03.003 doi: 10.1016/j.engappai.2006.03.003
[88]
A. Kaveh, A. Dadras, A novel meta-heuristic optimization algorithm: Thermal exchange optimization, Adv. Eng. Software, 110 (2017), 69–84. https://doi.org/10.1016/j.advengsoft.2017.03.014 doi: 10.1016/j.advengsoft.2017.03.014
[89]
J. S. Arora, Introduction to Optimum Design, Elsevier, 2004.
[90]
A. Kaveh, M. Khayatazad, A new meta-heuristic method: Ray optimization, Comput. Struct., 112 (2012), 283–294. https://doi.org/10.1016/j.compstruc.2012.09.003 doi: 10.1016/j.compstruc.2012.09.003
[91]
E. Mezura-Montes, C. A. C. Coello, An empirical study about the usefulness of evolution strategies to solve constrained optimization problems, Int. J. Gen. Syst., 37 (2008), 443–473. https://doi.org/10.1080/03081070701303470 doi: 10.1080/03081070701303470
[92]
M. A. Elaziz, D. Oliva, S. Xiong, An improved opposition-based sine cosine algorithm for global optimization, Expert Syst. Appl., 90 (2017), 484–500. https://doi.org/10.1016/j.eswa.2017.07.043 doi: 10.1016/j.eswa.2017.07.043
[93]
E. Rashedi, H. Nezamabadi-Pour, S. Saryazdi, Gsa: A gravitational search algorithm, Inf. Sci, 179 (2009), 2232–2248. https://doi.org/10.1016/j.ins.2009.03.004 doi: 10.1016/j.ins.2009.03.004
[94]
E. Mezura-Montes, C. A. C. Coello, Useful infeasible solutions in engineering optimization with evolutionary algorithms, in Mexican International Conference on Artificial Intelligence, 3789 (2005), 652–662. https://doi.org/10.1007/11579427_66
[95]
S. Stephen, D. Christu, A. Dalvi, Design optimization of weight of speed reducer problem through matlab and simulation using ansys, Int. J. Mech. Eng. Technol., 9 (2018), 339–349.
[96]
S. Lu, H. M. Kim, A regularized inexact penalty decomposition algorithm for multidisciplinary design optimization problems with complementarity constraints, J. Mech. Design, 132 (2010), 041005. https://doi.org/10.1115/1.4001206 doi: 10.1115/1.4001206
[97]
S. Mirjalili, Sca: A sine cosine algorithm for solving optimization problems, Knowl.-Based Syst., 96 (2016), 120–133. https://doi.org/10.1016/j.knosys.2015.12.022 doi: 10.1016/j.knosys.2015.12.022
[98]
E. Mezura-Montes, C. C. Coello, R. Landa-Becerra, Engineering optimization using simple evolutionary algorithm, in Proceedings. 15th IEEE International Conference on Tools with Artificial Intelligence, (2003), 149–156. https://doi.org/10.1109/TAI.2003.1250183
[99]
S. Akhtar, K. Tai, T. Ray, A socio-behavioural simulation model for engineering design optimization, Eng. Optimiz., 34 (2002), 341–354. https://doi.org/10.1080/03052150212723 doi: 10.1080/03052150212723
[100]
V. K. Kamboj, A. Nandi, A. Bhadoria, S. Sehgal, An intensify harris hawks optimizer for numerical and engineering optimization problems, Appl. Soft Comput., 89 (2020), 106018. https://doi.org/10.1016/j.asoc.2019.106018 doi: 10.1016/j.asoc.2019.106018
[101]
H. Nowacki, Optimization in pre-contract ship design, in International Conference on Computer Applications in the Automation of Shipyard Operation and Ship Design, 1973.
[102]
A. H. Gandomi, X. S. Yang, A. H. Alavi, Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems, Eng. Comput., 29 (2013), 17–35. https://doi.org/10.1007/s00366-011-0241-y doi: 10.1007/s00366-011-0241-y
[103]
M. Zhang, W. Luo, X. Wang, Differential evolution with dynamic stochastic selection for constrained optimization, Inf. Sci., 178 (2008), 3043–3074. https://doi.org/10.1016/j.ins.2008.02.014 doi: 10.1016/j.ins.2008.02.014
[104]
A. Sadollah, A. Bahreininejad, H. Eskandar, M. Hamdi, Mine blast algorithm: A new population based algorithm for solving constrained engineering optimization problems, Appl. Soft Comput., 13 (2013), 2592–2612. https://doi.org/10.1016/j.asoc.2012.11.026 doi: 10.1016/j.asoc.2012.11.026
[105]
A. E. Yildirim, A. Karci, Application of three bar truss problem among engineering design optimization problems using artificial atom algorithm, in IEEE 2018 International Conference on Artificial Intelligence and Data Processing (IDAP), (2018), 1–5. https://doi.org/10.1109/IDAP.2018.8620762
Amit Chhabra, Abdelazim G. Hussien, Fatma A. Hashim, Improved bald eagle search algorithm for global optimization and feature selection, 2023, 68, 11100168, 141, 10.1016/j.aej.2022.12.045
3. Rong Zheng, Abdelazim G Hussien, Raneem Qaddoura, Heming Jia, Laith Abualigah, Shuang Wang, Abeer Saber, A multi-strategy enhanced African vultures optimization algorithm for global optimization problems, 2023, 10, 2288-5048, 329, 10.1093/jcde/qwac135
4. Xiang Liu, Min Tian, Jie Zhou, Jinyan Liang, An efficient coverage method for SEMWSNs based on adaptive chaotic Gaussian variant snake optimization algorithm, 2022, 20, 1551-0018, 3191, 10.3934/mbe.2023150
5. Lei Wu, Jiawei Wu, Tengbin Wang, Enhancing grasshopper optimization algorithm (GOA) with levy flight for engineering applications, 2023, 13, 2045-2322, 10.1038/s41598-022-27144-4
6. Fatma A. Hashim, Reham R. Mostafa, Abdelazim G. Hussien, Seyedali Mirjalili, Karam M. Sallam, Fick’s Law Algorithm: A physical law-based algorithm for numerical optimization, 2023, 260, 09507051, 110146, 10.1016/j.knosys.2022.110146
7. Gang Hu, Rui Yang, Muhammad Abbas, Guo Wei, BEESO: Multi-strategy Boosted Snake-Inspired Optimizer for Engineering Applications, 2023, 1672-6529, 10.1007/s42235-022-00330-w
8. Yufei Wang, Yujun Zhang, Yuxin Yan, Juan Zhao, Zhengming Gao, An enhanced aquila optimization algorithm with velocity-aided global search mechanism and adaptive opposition-based learning, 2023, 20, 1551-0018, 6422, 10.3934/mbe.2023278
9. Chao Shang, Ting-ting Zhou, Shuai Liu, Optimization of complex engineering problems using modified sine cosine algorithm, 2022, 12, 2045-2322, 10.1038/s41598-022-24840-z
10. Abdelazim G. Hussien, Fatma A. Hashim, Raneem Qaddoura, Laith Abualigah, Adrian Pop, An Enhanced Evaporation Rate Water-Cycle Algorithm for Global Optimization, 2022, 10, 2227-9717, 2254, 10.3390/pr10112254
11. Abdelazim G. Hussien, Ruba Abu Khurma, Abdullah Alzaqebah, Mohamed Amin, Fatma A. Hashim, Novel memetic of beluga whale optimization with self-adaptive exploration–exploitation balance for global optimization and engineering problems, 2023, 27, 1432-7643, 13951, 10.1007/s00500-023-08468-3
12. Fei Pan, Xingwei Sun, Heran Yang, Yin Liu, Sirui Chen, Hongxun Zhao, Prediction of Abrasive Belt Wear Height for Screw Rotor Belt Grinding Based on BP Neural Network with Improved Skyhawk Algorithm, 2024, 2234-7593, 10.1007/s12541-024-01110-8
13. Yahya Abdulfattah Hamoodi, Ban Ahmed Mitras, 2023, Improved Chaotic Aquila Optimization Algorithm Based on Artificial Hummingbird Algorithm, 979-8-3503-4440-0, 52, 10.1109/ICESAT58213.2023.10347310
14. Esra’a Alhenawi, Ruba Abu Khurma, Robertas Damaševic̆ius, Abdelazim G. Hussien, Solving Traveling Salesman Problem Using Parallel River Formation Dynamics Optimization Algorithm on Multi-core Architecture Using Apache Spark, 2024, 17, 1875-6883, 10.1007/s44196-023-00385-5
15. Fatma A. Hashim, Ruba Abu Khurma, Dheeb Albashish, Mohamed Amin, Abdelazim G. Hussien, Novel hybrid of AOA-BSA with double adaptive and random spare for global optimization and engineering problems, 2023, 73, 11100168, 543, 10.1016/j.aej.2023.04.052
16. Ibrahim Al-Shourbaji, Pramod Kachare, Sajid Fadlelseed, Abdoh Jabbari, Abdelazim G. Hussien, Faisal Al-Saqqar, Laith Abualigah, Abdalla Alameen, Artificial Ecosystem-Based Optimization with Dwarf Mongoose Optimization for Feature Selection and Global Optimization Problems, 2023, 16, 1875-6883, 10.1007/s44196-023-00279-6
17. Gopi S., Prabhujit Mohapatra, Chaotic Aquila Optimization algorithm for solving global optimization and engineering problems, 2024, 108, 11100168, 135, 10.1016/j.aej.2024.07.058
18. Abdelazim G Hussien, Sumit Kumar, Simrandeep Singh, Jeng-Shyang Pan, Fatma A Hashim, An enhanced dynamic differential annealed algorithm for global optimization and feature selection, 2023, 11, 2288-5048, 49, 10.1093/jcde/qwad108
19. Mainak Deb, Krishna Gopal Dhal, Arunita Das, Abdelazim G. Hussien, Laith Abualigah, Arpan Garai, A CNN-based model to count the leaves of rosette plants (LC-Net), 2024, 14, 2045-2322, 10.1038/s41598-024-51983-y
20. Reham R. Mostafa, Essam H. Houssein, Abdelazim G. Hussien, Birmohan Singh, Marwa M. Emam, An enhanced chameleon swarm algorithm for global optimization and multi-level thresholding medical image segmentation, 2024, 36, 0941-0643, 8775, 10.1007/s00521-024-09524-1
21. Arnapurna Panda, Digital Channel Equalizer Using Functional Link Artificial Neural Network Trained with Quantum Aquila Optimizer, 2024, 5, 2661-8907, 10.1007/s42979-024-02632-8
22. Amr A. Abd El-Mageed, Amr A. Abohany, Ahmed Elashry, Effective Feature Selection Strategy for Supervised Classification based on an Improved Binary Aquila Optimization Algorithm, 2023, 181, 03608352, 109300, 10.1016/j.cie.2023.109300
23. Kangjian Sun, Ju Huo, Heming Jia, Lin Yue, Reinforcement learning guided Spearman dynamic opposite Gradient-based optimizer for numerical optimization and anchor clustering, 2023, 11, 2288-5048, 12, 10.1093/jcde/qwad109
24. Buddhadev Sasmal, Abdelazim G. Hussien, Arunita Das, Krishna Gopal Dhal, A Comprehensive Survey on Aquila Optimizer, 2023, 30, 1134-3060, 4449, 10.1007/s11831-023-09945-6
25. Dana Marsetiya Utama, Nabilah Sanafa, A modified Aquila optimizer algorithm for optimization energy-efficient no-idle permutation flow shop scheduling problem, 2023, 7, 2580-2895, 95, 10.30656/jsmi.v7i2.6446
Abeer Saber, Abdelazim G. Hussien, Wael A. Awad, Amena Mahmoud, Alaa Allakany, Adapting the pre-trained convolutional neural networks to improve the anomaly detection and classification in mammographic images, 2023, 13, 2045-2322, 10.1038/s41598-023-41633-0
28. Hao Cui, Yaning Xiao, Abdelazim G. Hussien, Yanling Guo, Multi-strategy boosted Aquila optimizer for function optimization and engineering design problems, 2024, 27, 1386-7857, 7147, 10.1007/s10586-024-04319-4
29. S. Gopi, Prabhujit Mohapatra, Fast random opposition-based learning Aquila optimization algorithm, 2024, 10, 24058440, e26187, 10.1016/j.heliyon.2024.e26187
30. Chengzhi Fang, Yushen Chen, Xiaolei Deng, Sangyinhuan Lu, Wanjun Zhang, Yao Chen, A Novel Temperature Rise Prediction Method of Multi-component Feed System for CNC Machine Tool Based on Multi-source Fusion of Heterogeneous Correlation Information, 2024, 25, 2234-7593, 1571, 10.1007/s12541-024-01022-7
31. Sarada Mohapatra, Prabhujit Mohapatra, Fast random opposition-based learning Golden Jackal Optimization algorithm, 2023, 275, 09507051, 110679, 10.1016/j.knosys.2023.110679
32. Nabil Neggaz, Imene Neggaz, Mohamed Abd Elaziz, Abdelazim G. Hussien, Laith Abulaigh, Robertas Damaševičius, Gang Hu, Boosting manta rays foraging optimizer by trigonometry operators: a case study on medical dataset, 2024, 36, 0941-0643, 9405, 10.1007/s00521-024-09565-6
33. Megha Varshney, Pravesh Kumar, Laith Abualigah, Hybridizing remora and aquila optimizer with dynamic oppositional learning for structural engineering design problems, 2025, 462, 03770427, 116475, 10.1016/j.cam.2024.116475
34. Xiaowei Wang, An intensified northern goshawk optimization algorithm for solving optimization problems, 2024, 6, 2631-8695, 045267, 10.1088/2631-8695/ada222
35. Khaled Mohammed Elgamily, M. A. Mohamed, Ahmed Mohamed Abou-Taleb, Mohamed Maher Ata, Enhanced object detection in remote sensing images by applying metaheuristic and hybrid metaheuristic optimizers to YOLOv7 and YOLOv8, 2025, 15, 2045-2322, 10.1038/s41598-025-89124-8
36. Dongning Chen, Xinwei Du, Haowen Wang, Qinggui Xian, Jianhao Sha, Chengyu Yao, 2024, An Improved Hybrid Aquila Optimizer and Pigeon-Inspired Optimization Algorithm with Its Application in PID Parameter Optimization, 979-8-3503-8028-6, 367, 10.1109/ISCSIC64297.2024.00082