In this research, we propose an optimal choice for the non-negative parameter in the Dai-Liao conjugate gradient formula, derived from the well-known Barzilai-Borwein approach and the properties of the Frobenius matrix norm. The global convergence of the new method is established under standard assumptions. Numerical comparisons with similar algorithms show that the new approach is reliable in terms of the number of iterations, computing time, and function evaluations for unconstrained minimization, portfolio selection, and image restoration problems.
Citation: Jamilu Sabi'u, Ibrahim Mohammed Sulaiman, P. Kaelo, Maulana Malik, Saadi Ahmad Kamaruddin. An optimal choice Dai-Liao conjugate gradient algorithm for unconstrained optimization and portfolio selection[J]. AIMS Mathematics, 2024, 9(1): 642-664. doi: 10.3934/math.2024034
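For illustration, the following is a minimal Python sketch of a Dai-Liao conjugate gradient iteration in which the non-negative parameter is set from a Barzilai-Borwein-type quotient. The function name dai_liao_cg, the Armijo backtracking line search, and the particular value t_k = s_k^T y_k / s_k^T s_k are illustrative assumptions; they are not the Frobenius-norm-based optimal choice or the line search analysed in the paper.

```python
import numpy as np


def dai_liao_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Sketch of a Dai-Liao conjugate gradient method (illustrative only).

    Direction update:
        d_{k+1} = -g_{k+1} + beta_k d_k,
        beta_k  = (g_{k+1}^T y_k - t_k g_{k+1}^T s_k) / (d_k^T y_k),
    with s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k.  The parameter
    t_k = s_k^T y_k / s_k^T s_k used below is an assumed BB-type quotient,
    not the optimal choice derived in the paper.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        if g.dot(d) >= 0.0:
            d = -g  # safeguard: fall back to steepest descent
        # Simple Armijo backtracking line search (stand-in for a Wolfe search).
        alpha, c1, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d) and alpha > 1e-12:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        dy, ss = d.dot(y), s.dot(s)
        if abs(dy) < 1e-12 or ss < 1e-12:
            d = -g_new  # restart if the denominators degenerate
        else:
            t_k = max(s.dot(y) / ss, 0.0)  # assumed BB-type parameter choice
            beta = (g_new.dot(y) - t_k * g_new.dot(s)) / dy
            d = -g_new + beta * d
        x, g = x_new, g_new
    return x


# Usage: minimize the convex quadratic f(x) = x^T x.
x_min = dai_liao_cg(lambda x: x.dot(x), lambda x: 2.0 * x, np.ones(5))
```

In the setting of the paper, t_k would instead be the proposed Frobenius-norm-based optimal value, and a Wolfe-type line search would typically replace the backtracking step to support the descent and global convergence arguments.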