Research article

Non-negative Tucker decomposition with double constraints for multiway dimensionality reduction

  • Received: 23 May 2024 Revised: 28 June 2024 Accepted: 01 July 2024 Published: 08 July 2024
  • MSC : 15A69, 62H30, 68U10

  • Nonnegative Tucker decomposition (NTD) is a renowned technique for feature extraction and representation of nonnegative high-dimensional tensor data. The central question behind NTD-like models is how to factorize the data so as to obtain a high-quality representation along multiple dimensions. However, existing NTD-like models neither consider the relationships and properties among the columns of the factor matrices nor preserve the geometric structure of the data space. In this paper, we capture nonlinear local features of the data space and further enhance the expressiveness of NTD-based clustering by organically combining an approximately orthogonal constraint with a graph-regularized constraint. First, based on uni-side and bi-side approximate orthogonality, we propose two novel approximately orthogonal, graph-regularized NTD models, which not only push the factor matrices toward orthogonality but also preserve the geometric information of high-dimensional tensor data. Second, we develop iterative updating algorithms based on multiplicative update rules to solve the proposed models, and we analyze their convergence and computational complexity. Finally, numerical experiments on real-world image datasets demonstrate the effectiveness, robustness, and efficiency of the proposed methods.

    Citation: Xiang Gao, Linzhang Lu, Qilong Liu. Non-negative Tucker decomposition with double constraints for multiway dimensionality reduction[J]. AIMS Mathematics, 2024, 9(8): 21755-21785. doi: 10.3934/math.20241058
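The multiplicative-update strategy described in the abstract can be illustrated in the simpler matrix setting by the well-known graph-regularized NMF (GNMF) updates, where the data matrix X is factorized as X ≈ UVᵀ subject to nonnegativity and a graph-Laplacian penalty tr(VᵀLV). This is only a sketch of the general idea, not the authors' NTD algorithm; the function names (`gnmf`, `gnmf_objective`) and the penalty weight `lam` are illustrative assumptions.

```python
import numpy as np

def gnmf(X, W, rank, lam=0.1, n_iter=200, seed=0):
    """Graph-regularized NMF sketch: minimize ||X - U V^T||_F^2 + lam * tr(V^T L V)
    with U, V >= 0, where L = D - W is the graph Laplacian of the affinity W."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    U = rng.random((m, rank)) + 1e-3   # nonnegative random init
    V = rng.random((n, rank)) + 1e-3
    D = np.diag(W.sum(axis=1))         # degree matrix of the affinity graph
    eps = 1e-12                        # avoid division by zero
    for _ in range(n_iter):
        # multiplicative updates: nonnegativity is preserved automatically
        U *= (X @ V) / (U @ (V.T @ V) + eps)
        V *= (X.T @ U + lam * (W @ V)) / (V @ (U.T @ U) + lam * (D @ V) + eps)
    return U, V

def gnmf_objective(X, U, V, W, lam=0.1):
    """Reconstruction error plus graph-regularization penalty."""
    L = np.diag(W.sum(axis=1)) - W
    return np.linalg.norm(X - U @ V.T, 'fro') ** 2 + lam * np.trace(V.T @ L @ V)
```

Under these updates the objective is nonincreasing, which is the same convergence mechanism the paper's NTD algorithms rely on; the tensor case applies analogous updates to each factor matrix of the mode-n unfoldings and to the core tensor.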




    Since its initiation in 1979, the Canadian Applied and Industrial Mathematics Society — Société Canadienne de Mathématiques Appliquées et Industrielles (CAIMS–SCMAI) has gained a growing presence in industrial, mathematical, scientific, and technological circles within and outside of Canada. Its members contribute to state-of-the-art research in industry, natural sciences, medicine and health, finance, physics, engineering, and more. The annual meetings are a highlight of the year. CAIMS–SCMAI is an active member society of the International Council for Industrial and Applied Mathematics, which hosts the prestigious ICIAM Congresses every four years.

    Canadian Applied and Industrial Mathematics is at the forefront of scientific and technological development. We use advanced mathematics to tackle real-world problems in science and industry and develop new theories to analyse structures that arise from the modelling of real-world problems.

    Applied Mathematics has evolved from traditional applications in areas such as fluids, mechanics, and physics, to modern topics such as medicine, health, biology, data science, finance, nano-tech, etc. Its growing importance in all aspects of life, health, and management increases the need for publication venues for high-level applied and industrial mathematics. Hence CAIMS–SCMAI decided to start a scientific journal called Mathematics in Science and Industry (MSI) to add value to the discussion of applied and industrial mathematics worldwide.

    Submissions to MSI in all areas of applied and industrial mathematics are welcome (https://caims.ca/mathematics_in_science_and_industry/). We offer a timely and high-quality review process, and papers are published online as open access, with the publication fee being covered by CAIMS for the first five years.

    MSI is honored that leading experts in industrial and applied mathematics have offered their support as editors:

    Editors in Chief:

    ● Thomas Hillen (University of Alberta, thillen@ualberta.ca)

    ● Ray Spiteri (University of Saskatchewan, spiteri@cs.usask.ca)

    Associate Editors:

    ● Lia Bronsard (McMaster University)

    ● Richard Craster (Imperial College London, UK)

    ● David Earn (McMaster University)

    ● Ronald Haynes (Memorial University)

    ● Jane Heffernan (York University)

    ● Nicholas Kevlahan (McMaster University)

    ● Yong-Jung Kim (KAIST, Korea)

    ● Mark Lewis (University of Alberta)

    ● Kevin J. Painter (Heriot-Watt University, UK)

    ● Vakhtang Putkaradze (ATCO)

    ● Katrin Rohlf (Ryerson University)

    ● John Stockie (Simon Fraser University)

    ● Jie Sun (Huawei, Hong Kong)

    ● Justin Wan (University of Waterloo)

    ● Michael Ward (University of British Columbia)

    ● Tony Ware (University of Calgary)

    ● Brian Wetton (University of British Columbia)

    The first eight papers of MSI, presented here, are published as a special issue of AIMS Mathematics. They showcase a broad range of applied mathematics that touches the interests of Canadian researchers and our many collaborators around the world. The science presented here is not exclusively "Canadian", but we hope that, through the new journal MSI, we can contribute to the scientific dissemination of knowledge and add a Canadian voice to the scientific discussion.

    The next issue of MSI is planned for the fall of 2020 and is expected to appear again as a special issue of AIMS Mathematics.



  • © 2024 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)
