Research article

An accelerated alternating directional method with non-monotone technique for matrix recovery

  • Received: 31 January 2023 Revised: 26 March 2023 Accepted: 29 March 2023 Published: 14 April 2023
  • MSC : 90C06, 90C25

  • Low-rank and sparse structures have been frequently exploited in matrix recovery and robust PCA problems. In this paper, we develop an alternating directional method, together with a variant equipped with a non-monotone search procedure, for solving a non-convex optimization model of low-rank and sparse matrix recovery, in which the matrix of interest, observed with incomplete data, is separable into a low-rank part and a sparse part. The main idea is to update the low-rank part by an alternating minimization method and the sparse part by a non-monotone line search technique, iterating between the two. To some extent, the non-monotone strategy relaxes the single-step descent requirement into a multi-step descent one, which greatly improves the performance of the alternating directional method. Theoretically, we prove the global convergence of the two proposed algorithms under mild conditions. Finally, numerical comparisons show that the alternating directional method with the non-monotone technique is more effective than both the original monotone method and a previous method. The efficiency and effectiveness of the proposed algorithms are demonstrated on random incomplete matrix recovery problems and on background modeling problems in video processing.
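    The abstract describes the method only at a high level, so the following is a minimal, illustrative sketch of an alternating scheme with a non-monotone acceptance rule of this kind, not the authors' algorithm. The truncated-SVD update for the low-rank part, the soft-thresholding step for the sparse part, the Grippo-Lampariello-Lucidi-style reference value `f_ref`, and all parameter names (`rank`, `lam`, `memory`) are assumptions introduced for illustration.

    ```python
    # A minimal sketch (not the paper's exact algorithm) of an alternating scheme
    # for recovering a low-rank part L and a sparse part S from data M observed
    # on an index set given by a boolean mask.  The low-rank part is updated by a
    # truncated SVD (a stand-in for the alternating minimization step mentioned
    # in the abstract); the sparse part is updated by a gradient/shrinkage step
    # accepted under a non-monotone rule: the new objective only has to beat the
    # maximum of the last `memory` objective values, not the previous one.
    import numpy as np

    def soft_threshold(X, tau):
        """Entrywise soft-thresholding, the proximal map of tau * ||.||_1."""
        return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

    def recover(M, mask, rank, lam=1e-2, memory=5, max_iter=200, tol=1e-6):
        """mask: True where M is observed. rank, lam, memory are illustrative."""
        L = np.zeros_like(M)
        S = np.zeros_like(M)
        history = []                      # recent objective values for the non-monotone test
        for _ in range(max_iter):
            # --- low-rank update: best rank-r fit to the observed residual ---
            R = np.where(mask, M - S, L)  # keep the current L on unobserved entries
            U, sig, Vt = np.linalg.svd(R, full_matrices=False)
            L = (U[:, :rank] * sig[:rank]) @ Vt[:rank]

            # --- sparse update with a non-monotone acceptance rule ---
            grad = np.where(mask, (L + S) - M, 0.0)   # gradient of the data-fit term in S
            step = 1.0
            f_ref = max(history[-memory:]) if history else np.inf
            while True:
                S_new = soft_threshold(S - step * grad, step * lam)
                f_new = 0.5 * np.sum(np.where(mask, L + S_new - M, 0.0) ** 2) \
                        + lam * np.sum(np.abs(S_new))
                # accept if below the worst of the last `memory` objectives
                if f_new <= f_ref - 1e-4 * step * np.sum(grad ** 2) or step < 1e-8:
                    break
                step *= 0.5
            S = S_new
            history.append(f_new)
            if len(history) > 1 and abs(history[-2] - f_new) <= tol * max(1.0, f_new):
                break
        return L, S
    ```

    The non-monotone rule is what the abstract refers to as relaxing a single-step descent into a multi-step descent: a trial sparse update is accepted as long as it improves on the worst of the last few objective values, so occasional non-decreasing steps are allowed and the backtracking loop typically terminates after fewer halvings.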

    Citation: Ruiping Wen, Wenwei Li. An accelerated alternating directional method with non-monotone technique for matrix recovery[J]. AIMS Mathematics, 2023, 8(6): 14047-14063. doi: 10.3934/math.2023718







  • © 2023 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)
