Research article

Inequalities on 2×2 block accretive partial transpose matrices

  • Received: 27 December 2023 Revised: 04 February 2024 Accepted: 20 February 2024 Published: 01 March 2024
  • MSC : 15A18, 15A42, 15A45, 15A60

  • In this note, we first correct a result of Alakhrass [1], then present some inequalities related to $2\times 2$ block accretive partial transpose matrices, which generalize some results on block positive partial transpose matrices.

    Citation: Lihong Hu, Junjian Yang. Inequalities on 2×2 block accretive partial transpose matrices[J]. AIMS Mathematics, 2024, 9(4): 8805-8813. doi: 10.3934/math.2024428




    Let $M_n$ be the set of $n\times n$ complex matrices and $M_n(M_k)$ the set of $n\times n$ block matrices with each block in $M_k$. For $A\in M_n$, the conjugate transpose of $A$ is denoted by $A^*$. When $A$ is Hermitian, we denote the eigenvalues of $A$ in nonincreasing order $\lambda_1(A)\ge\lambda_2(A)\ge\cdots\ge\lambda_n(A)$; see [2,7,8,9]. The singular values of $A$, denoted by $s_1(A),s_2(A),\ldots,s_n(A)$, are the eigenvalues of the positive semi-definite matrix $|A|=(A^*A)^{1/2}$, arranged in nonincreasing order and repeated according to multiplicity as $s_1(A)\ge s_2(A)\ge\cdots\ge s_n(A)$. If $A\in M_n$ is positive semi-definite (definite), then we write $A\ge 0$ ($A>0$). Every $A\in M_n$ admits what is called the Cartesian decomposition $A=\mathrm{Re}A+i\,\mathrm{Im}A$, where $\mathrm{Re}A=\frac{A+A^*}{2}$ and $\mathrm{Im}A=\frac{A-A^*}{2i}$. A matrix $A\in M_n$ is called accretive if $\mathrm{Re}A$ is positive definite. Recall that a norm $\|\cdot\|$ on $M_n$ is unitarily invariant if $\|UAV\|=\|A\|$ for any $A\in M_n$ and unitary matrices $U,V\in M_n$. The Hilbert-Schmidt norm is defined as $\|A\|_2^2=\mathrm{tr}(A^*A)$.
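    As a quick numerical illustration of these definitions, the following minimal NumPy sketch (our own addition, not part of the original article; the variable names are chosen only for illustration) verifies the Cartesian decomposition and the description of the singular values through $|A|=(A^*A)^{1/2}$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# Cartesian decomposition: A = ReA + i*ImA with ReA, ImA Hermitian.
ReA = (A + A.conj().T) / 2
ImA = (A - A.conj().T) / 2j
assert np.allclose(A, ReA + 1j * ImA)
assert np.allclose(ReA, ReA.conj().T) and np.allclose(ImA, ImA.conj().T)

# Singular values: eigenvalues of |A| = (A*A)^(1/2), in nonincreasing order.
s_from_abs = np.sqrt(np.linalg.eigvalsh(A.conj().T @ A))[::-1]
assert np.allclose(s_from_abs, np.linalg.svd(A, compute_uv=False))

# Hilbert-Schmidt norm: ||A||_2^2 = tr(A*A) = sum of squared singular values.
assert np.isclose(np.trace(A.conj().T @ A).real, np.sum(s_from_abs**2))
```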

    For $A,B>0$ and $t\in[0,1]$, the weighted geometric mean of $A$ and $B$ is defined as follows:

    $$A\sharp_t B=A^{1/2}\left(A^{-1/2}BA^{-1/2}\right)^{t}A^{1/2}.$$

    When $t=\frac{1}{2}$, $A\sharp_{1/2}B$ is called the geometric mean of $A$ and $B$, which is often denoted by $A\sharp B$. It is known that the notion of the (weighted) geometric mean can be extended to cover all positive semi-definite matrices; see [3, Chapter 4].
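    The weighted geometric mean can be computed directly from the defining formula. The sketch below (our own illustration, assuming NumPy; hpow and gmean are hypothetical helper names) also checks the identity $A\sharp_t B=B\sharp_{1-t}A$, and in particular $A\sharp B=B\sharp A$.

```python
import numpy as np

def hpow(H, p):
    """p-th power of a Hermitian positive definite matrix via eigendecomposition."""
    H = (H + H.conj().T) / 2              # symmetrize against rounding noise
    w, V = np.linalg.eigh(H)
    return (V * w**p) @ V.conj().T

def gmean(A, B, t=0.5):
    """Weighted geometric mean A #_t B = A^(1/2) (A^(-1/2) B A^(-1/2))^t A^(1/2)."""
    Ah, Aih = hpow(A, 0.5), hpow(A, -0.5)
    return Ah @ hpow(Aih @ B @ Aih, t) @ Ah

rng = np.random.default_rng(1)
n = 4
G1 = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
G2 = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = G1 @ G1.conj().T + np.eye(n)          # positive definite
B = G2 @ G2.conj().T + np.eye(n)

# A #_t B = B #_{1-t} A; in particular the geometric mean is symmetric: A # B = B # A.
t = 0.3
assert np.allclose(gmean(A, B, t), gmean(B, A, 1 - t))
assert np.allclose(gmean(A, B), gmean(B, A))
```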

    Let $A,B,X\in M_n$. For a $2\times 2$ block matrix $M$ of the form

    $$M=\begin{pmatrix}A & X\\ X^* & B\end{pmatrix}\in M_{2n}$$

    with each block in $M_n$, the partial transpose of $M$ is defined by

    $$M^{\tau}=\begin{pmatrix}A & X^*\\ X & B\end{pmatrix}.$$

    If $M\ge 0$ and $M^{\tau}\ge 0$, then we say that $M$ is positive partial transpose (PPT). We extend this notion to accretive matrices. If

    $$M=\begin{pmatrix}A & X\\ Y & B\end{pmatrix}\in M_{2n}$$

    and

    $$M^{\tau}=\begin{pmatrix}A & Y\\ X & B\end{pmatrix}\in M_{2n}$$

    are both accretive, then we say that $M$ is APT (i.e., accretive partial transpose). It is easy to see that the class of APT matrices includes the class of PPT matrices; see [6,10,13].
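    The partial transpose and the PPT/APT membership tests used throughout the paper are easy to spell out numerically. In the sketch below (our own illustration; the helper names are hypothetical), a PPT example is produced, as a convenient assumption, by summing Kronecker products of positive definite factors, since the partial transpose then only transposes the $2\times 2$ factor.

```python
import numpy as np

def partial_transpose(M, n):
    """For M = [[A, X], [Y, B]] with n-by-n blocks, return M^tau = [[A, Y], [X, B]]."""
    A, X, Y, B = M[:n, :n], M[:n, n:], M[n:, :n], M[n:, n:]
    return np.block([[A, Y], [X, B]])

def is_psd(H, tol=1e-10):
    """Test positive semi-definiteness of a (numerically) Hermitian matrix."""
    return np.linalg.eigvalsh((H + H.conj().T) / 2).min() > -tol

def is_ppt(M, n):
    return is_psd(M) and is_psd(partial_transpose(M, n))

def is_apt(M, n):
    """M is APT iff M and M^tau are accretive, i.e. have positive (semi)definite real parts."""
    re = lambda Z: (Z + Z.conj().T) / 2
    return is_psd(re(M)) and is_psd(re(partial_transpose(M, n)))

rng = np.random.default_rng(2)
n = 3
# A sum of Kronecker products of positive definite factors is PPT (the partial
# transpose merely transposes the 2-by-2 factor), hence in particular APT.
M = np.zeros((2 * n, 2 * n), dtype=complex)
for _ in range(4):
    p = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
    q = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    M += np.kron(p @ p.conj().T, q @ q.conj().T)
assert is_ppt(M, n) and is_apt(M, n)
```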

    Recently, many results involving the off-diagonal block of a PPT matrix and its diagonal blocks have been presented; see [5,11,12]. In 2023, Alakhrass [1] presented the following two results on $2\times 2$ block PPT matrices.

    Theorem 1.1 ([1], Theorem 3.1). Let $\begin{pmatrix}A & X\\ X^* & B\end{pmatrix}$ be PPT and let $X=U|X|$ be the polar decomposition of $X$, then

    $$|X|\le(A\sharp_t B)\,\sharp\,\bigl(U^*(A\sharp_{1-t}B)U\bigr),\quad t\in[0,1].$$

    Theorem 1.2 ([1], Theorem 3.2). Let $\begin{pmatrix}A & X\\ X^* & B\end{pmatrix}$ be PPT, then for $t\in[0,1]$,

    $$\mathrm{Re}X\le(A\sharp_t B)\,\sharp\,(A\sharp_{1-t}B)\le\frac{(A\sharp_t B)+(A\sharp_{1-t}B)}{2},$$

    and

    $$\mathrm{Im}X\le(A\sharp_t B)\,\sharp\,(A\sharp_{1-t}B)\le\frac{(A\sharp_t B)+(A\sharp_{1-t}B)}{2}.$$

    By Theorem 1.1 and the fact that $s_{i+j-1}(XY)\le s_i(X)s_j(Y)$ for $i+j\le n+1$, the author obtained the following corollary.

    Corollary 1.3 ([1], Corollary 3.5). Let $\begin{pmatrix}A & X\\ X^* & B\end{pmatrix}$ be PPT, then for $t\in[0,1]$,

    $$s_{i+j-1}(X)\le s_i(A\sharp_t B)\,s_j(A\sharp_{1-t}B).$$

    Consequently,

    $$s_{2j-1}(X)\le s_j(A\sharp_t B)\,s_j(A\sharp_{1-t}B).$$

    A careful examination of Alakhrass' proof of Corollary 1.3 reveals an error. The correct results are $s_{i+j-1}(X)\le s_i\bigl((A\sharp_t B)^{1/2}\bigr)s_j\bigl((A\sharp_{1-t}B)^{1/2}\bigr)$ and $s_{2j-1}(X)\le s_j\bigl((A\sharp_t B)^{1/2}\bigr)s_j\bigl((A\sharp_{1-t}B)^{1/2}\bigr)$. Thus, in this note, we will give a correct proof of Corollary 1.3 and extend the above inequalities to the class of $2\times 2$ block APT matrices. At the same time, some relevant results will be obtained.
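    The corrected bound can be spot-checked numerically. The following sketch (our own illustration, not part of the original argument) builds a PPT matrix as a sum of Kronecker products of positive definite factors and compares $s_{2j-1}(X)$ with $s_j\bigl((A\sharp_t B)^{1/2}\bigr)s_j\bigl((A\sharp_{1-t}B)^{1/2}\bigr)$.

```python
import numpy as np

def hpow(H, p):
    H = (H + H.conj().T) / 2
    w, V = np.linalg.eigh(H)
    return (V * w**p) @ V.conj().T

def gmean(A, B, t=0.5):
    Ah, Aih = hpow(A, 0.5), hpow(A, -0.5)
    return Ah @ hpow(Aih @ B @ Aih, t) @ Ah

rng = np.random.default_rng(3)
n, t = 4, 0.4
# A PPT matrix built as a sum of Kronecker products of positive definite factors.
M = np.zeros((2 * n, 2 * n), dtype=complex)
for _ in range(5):
    p = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
    q = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    M += np.kron(p @ p.conj().T + 0.1 * np.eye(2), q @ q.conj().T + 0.1 * np.eye(n))
A, X, B = M[:n, :n], M[:n, n:], M[n:, n:]

s_X = np.linalg.svd(X, compute_uv=False)
s_t = np.linalg.svd(hpow(gmean(A, B, t), 0.5), compute_uv=False)
s_1t = np.linalg.svd(hpow(gmean(A, B, 1 - t), 0.5), compute_uv=False)

# Corrected bound: s_{2j-1}(X) <= s_j((A #_t B)^(1/2)) * s_j((A #_{1-t} B)^(1/2)).
for j in range(1, (n + 1) // 2 + 1):
    assert s_X[2 * j - 2] <= s_t[j - 1] * s_1t[j - 1] + 1e-9
```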

    Before presenting and proving our results, we need the following lemmas on the weighted geometric mean of two positive definite matrices.

    Lemma 2.1. [3, Chapter 4] Let $X,Y\in M_n$ be positive definite, then

    1) $X\sharp Y=\max\left\{Z:\ Z=Z^*,\ \begin{pmatrix}X & Z\\ Z & Y\end{pmatrix}\ge 0\right\}$.

    2) $X\sharp Y=X^{1/2}UY^{1/2}$ for some unitary matrix $U$.

    Lemma 2.2. [4, Theorem 3] Let $X,Y\in M_n$ be positive definite, then for every unitarily invariant norm $\|\cdot\|$,

    $$\|X\sharp_t Y\|\le\|X^{1-t}Y^{t}\|\le\|(1-t)X+tY\|.$$
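    For concreteness, Lemma 2.2 can be checked numerically for particular unitarily invariant norms, say the trace norm and the Hilbert-Schmidt norm (a minimal sketch of ours, assuming NumPy).

```python
import numpy as np

def hpow(H, p):
    H = (H + H.conj().T) / 2
    w, V = np.linalg.eigh(H)
    return (V * w**p) @ V.conj().T

def gmean(X, Y, t):
    Xh, Xih = hpow(X, 0.5), hpow(X, -0.5)
    return Xh @ hpow(Xih @ Y @ Xih, t) @ Xh

trace_norm = lambda Z: np.linalg.svd(Z, compute_uv=False).sum()
hs_norm = lambda Z: np.linalg.norm(Z)        # Frobenius = Hilbert-Schmidt norm

rng = np.random.default_rng(4)
n, t = 4, 0.25
G1 = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
G2 = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
X = G1 @ G1.conj().T + np.eye(n)
Y = G2 @ G2.conj().T + np.eye(n)

# ||X #_t Y|| <= ||X^{1-t} Y^t|| <= ||(1-t)X + tY|| for unitarily invariant norms.
for norm in (trace_norm, hs_norm):
    a = norm(gmean(X, Y, t))
    b = norm(hpow(X, 1 - t) @ hpow(Y, t))
    c = norm((1 - t) * X + t * Y)
    assert a <= b + 1e-9 and b <= c + 1e-9
```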

    Now, we give a lemma that will play an important role in the later proofs.

    Lemma 2.3. Let $M=\begin{pmatrix}A & X\\ Y & B\end{pmatrix}\in M_{2n}$ be APT, then for $t\in[0,1]$,

    $$\begin{pmatrix}\mathrm{Re}A\sharp_t\mathrm{Re}B & \frac{X+Y^*}{2}\\ \left(\frac{X+Y^*}{2}\right)^* & \mathrm{Re}A\sharp_{1-t}\mathrm{Re}B\end{pmatrix}$$

    is PPT.

    Proof: Since M is APT, we have that

    $$\mathrm{Re}M=\begin{pmatrix}\mathrm{Re}A & \frac{X+Y^*}{2}\\ \left(\frac{X+Y^*}{2}\right)^* & \mathrm{Re}B\end{pmatrix}$$

    is PPT.

    Therefore, $\mathrm{Re}M\ge 0$ and $(\mathrm{Re}M)^{\tau}\ge 0$.

    By the Schur complement theorem, we have

    $$\mathrm{Re}B-\left(\frac{X+Y^*}{2}\right)^*(\mathrm{Re}A)^{-1}\left(\frac{X+Y^*}{2}\right)\ge 0,$$

    and

    $$\mathrm{Re}A-\left(\frac{X+Y^*}{2}\right)^*(\mathrm{Re}B)^{-1}\left(\frac{X+Y^*}{2}\right)\ge 0.$$

    Compute

    $$\begin{aligned}\left(\frac{X+Y^*}{2}\right)^*(\mathrm{Re}A\sharp_t\mathrm{Re}B)^{-1}\left(\frac{X+Y^*}{2}\right)&=\left(\frac{X+Y^*}{2}\right)^*\bigl((\mathrm{Re}A)^{-1}\sharp_t(\mathrm{Re}B)^{-1}\bigr)\left(\frac{X+Y^*}{2}\right)\\ &=\left(\left(\frac{X+Y^*}{2}\right)^*(\mathrm{Re}A)^{-1}\left(\frac{X+Y^*}{2}\right)\right)\sharp_t\left(\left(\frac{X+Y^*}{2}\right)^*(\mathrm{Re}B)^{-1}\left(\frac{X+Y^*}{2}\right)\right)\\ &\le\mathrm{Re}B\sharp_t\mathrm{Re}A.\end{aligned}$$

    Thus,

    $$(\mathrm{Re}B\sharp_t\mathrm{Re}A)-\left(\frac{X+Y^*}{2}\right)^*(\mathrm{Re}A\sharp_t\mathrm{Re}B)^{-1}\left(\frac{X+Y^*}{2}\right)\ge 0.$$

    By utilizing $\mathrm{Re}B\sharp_t\mathrm{Re}A=\mathrm{Re}A\sharp_{1-t}\mathrm{Re}B$ and the Schur complement theorem again, we have

    $$\begin{pmatrix}\mathrm{Re}A\sharp_t\mathrm{Re}B & \frac{X+Y^*}{2}\\ \left(\frac{X+Y^*}{2}\right)^* & \mathrm{Re}A\sharp_{1-t}\mathrm{Re}B\end{pmatrix}\ge 0.$$

    Similarly, we have

    $$\begin{pmatrix}\mathrm{Re}A\sharp_t\mathrm{Re}B & \left(\frac{X+Y^*}{2}\right)^*\\ \frac{X+Y^*}{2} & \mathrm{Re}A\sharp_{1-t}\mathrm{Re}B\end{pmatrix}\ge 0.$$

    This completes the proof.
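    Lemma 2.3 can also be checked numerically. In the sketch below (our own illustration), a random APT matrix is generated, as an assumption of ours, by adding an arbitrary skew-Hermitian part to a Hermitian positive definite PPT matrix, since only the real part matters for the APT property.

```python
import numpy as np

def hpow(H, p):
    H = (H + H.conj().T) / 2
    w, V = np.linalg.eigh(H)
    return (V * w**p) @ V.conj().T

def gmean(A, B, t):
    Ah, Aih = hpow(A, 0.5), hpow(A, -0.5)
    return Ah @ hpow(Aih @ B @ Aih, t) @ Ah

def partial_transpose(M, n):
    A, X, Y, B = M[:n, :n], M[:n, n:], M[n:, :n], M[n:, n:]
    return np.block([[A, Y], [X, B]])

def is_psd(H, tol=1e-9):
    return np.linalg.eigvalsh((H + H.conj().T) / 2).min() > -tol

rng = np.random.default_rng(5)
n, t = 3, 0.6
# Real part: a positive definite PPT matrix (sum of Kronecker products of PD factors).
H = np.zeros((2 * n, 2 * n), dtype=complex)
for _ in range(4):
    p = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
    q = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    H += np.kron(p @ p.conj().T + 0.1 * np.eye(2), q @ q.conj().T + 0.1 * np.eye(n))
# Adding any skew-Hermitian part keeps Re M = H, so M and M^tau stay accretive (M is APT).
S = rng.standard_normal((2 * n, 2 * n)) + 1j * rng.standard_normal((2 * n, 2 * n))
M = H + (S - S.conj().T) / 2

A, X, Y, B = M[:n, :n], M[:n, n:], M[n:, :n], M[n:, n:]
ReA, ReB = (A + A.conj().T) / 2, (B + B.conj().T) / 2
C = (X + Y.conj().T) / 2                      # the block (X + Y*)/2 of Lemma 2.3
N = np.block([[gmean(ReA, ReB, t), C], [C.conj().T, gmean(ReA, ReB, 1 - t)]])
assert is_psd(N) and is_psd(partial_transpose(N, n))   # N is PPT, as Lemma 2.3 asserts
```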

    First, we give the correct proof of Corollary 1.3.

    Proof: By Theorem 1.1, there exists a unitary matrix $U\in M_n$ such that $|X|\le(A\sharp_t B)\,\sharp\,\bigl(U^*(A\sharp_{1-t}B)U\bigr)$. Moreover, by Lemma 2.1, there is a unitary matrix $V\in M_n$ such that $(A\sharp_t B)\,\sharp\,\bigl(U^*(A\sharp_{1-t}B)U\bigr)=(A\sharp_t B)^{1/2}V\bigl(U^*(A\sharp_{1-t}B)U\bigr)^{1/2}=(A\sharp_t B)^{1/2}VU^*(A\sharp_{1-t}B)^{1/2}U$. Now, by $s_{i+j-1}(AB)\le s_i(A)s_j(B)$, we have

    $$\begin{aligned}s_{i+j-1}(X)&\le s_{i+j-1}\Bigl((A\sharp_t B)\,\sharp\,\bigl(U^*(A\sharp_{1-t}B)U\bigr)\Bigr)=s_{i+j-1}\Bigl((A\sharp_t B)^{1/2}VU^*(A\sharp_{1-t}B)^{1/2}U\Bigr)\\ &\le s_i\bigl((A\sharp_t B)^{1/2}\bigr)s_j\bigl(VU^*(A\sharp_{1-t}B)^{1/2}U\bigr)=s_i\bigl((A\sharp_t B)^{1/2}\bigr)s_j\bigl((A\sharp_{1-t}B)^{1/2}\bigr),\end{aligned}$$

    which completes the proof.

    Next, we generalize Theorem 1.1 to the class of APT matrices.

    Theorem 2.4. Let $M=\begin{pmatrix}A & X\\ Y & B\end{pmatrix}$ be APT, then for $t\in[0,1]$,

    $$\left|\frac{X+Y^*}{2}\right|\le(\mathrm{Re}A\sharp_t\mathrm{Re}B)\,\sharp\,\bigl(U^*(\mathrm{Re}A\sharp_{1-t}\mathrm{Re}B)U\bigr),$$

    where $U\in M_n$ is any unitary matrix such that $\frac{X+Y^*}{2}=U\left|\frac{X+Y^*}{2}\right|$.

    Proof: Since $M$ is an APT matrix, we know by Lemma 2.3 that

    $$\begin{pmatrix}\mathrm{Re}A\sharp_t\mathrm{Re}B & \frac{X+Y^*}{2}\\ \left(\frac{X+Y^*}{2}\right)^* & \mathrm{Re}A\sharp_{1-t}\mathrm{Re}B\end{pmatrix}$$

    is PPT; in particular, its partial transpose is positive semi-definite.

    Let $W$ be the unitary matrix $W=\begin{pmatrix}I & 0\\ 0 & U\end{pmatrix}$. Thus,

    $$W^*\begin{pmatrix}\mathrm{Re}A\sharp_t\mathrm{Re}B & \left(\frac{X+Y^*}{2}\right)^*\\ \frac{X+Y^*}{2} & \mathrm{Re}A\sharp_{1-t}\mathrm{Re}B\end{pmatrix}W=\begin{pmatrix}\mathrm{Re}A\sharp_t\mathrm{Re}B & \left|\frac{X+Y^*}{2}\right|\\ \left|\frac{X+Y^*}{2}\right| & U^*(\mathrm{Re}A\sharp_{1-t}\mathrm{Re}B)U\end{pmatrix}\ge 0.$$

    By Lemma 2.1, we have

    $$\left|\frac{X+Y^*}{2}\right|\le(\mathrm{Re}A\sharp_t\mathrm{Re}B)\,\sharp\,\bigl(U^*(\mathrm{Re}A\sharp_{1-t}\mathrm{Re}B)U\bigr).$$

    Remark 1. When $M=\begin{pmatrix}A & X\\ X^* & B\end{pmatrix}$ is PPT in Theorem 2.4 (i.e., $Y=X^*$ and $A,B\ge 0$), our result reduces to Theorem 1.1. Thus, our result is a generalization of Theorem 1.1.
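    A numerical spot-check of Theorem 2.4 (our own sketch, with the same illustrative construction of a random APT matrix as above; the polar factor $U$ is obtained from an SVD):

```python
import numpy as np

def hpow(H, p):
    H = (H + H.conj().T) / 2
    w, V = np.linalg.eigh(H)
    return (V * w**p) @ V.conj().T

def gmean(A, B, t=0.5):
    Ah, Aih = hpow(A, 0.5), hpow(A, -0.5)
    return Ah @ hpow(Aih @ B @ Aih, t) @ Ah

rng = np.random.default_rng(6)
n, t = 3, 0.35
# Random APT matrix: positive definite PPT real part plus a skew-Hermitian part.
H = np.zeros((2 * n, 2 * n), dtype=complex)
for _ in range(4):
    p = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
    q = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    H += np.kron(p @ p.conj().T + 0.1 * np.eye(2), q @ q.conj().T + 0.1 * np.eye(n))
S = rng.standard_normal((2 * n, 2 * n)) + 1j * rng.standard_normal((2 * n, 2 * n))
M = H + (S - S.conj().T) / 2

A, X, Y, B = M[:n, :n], M[:n, n:], M[n:, :n], M[n:, n:]
ReA, ReB = (A + A.conj().T) / 2, (B + B.conj().T) / 2
C = (X + Y.conj().T) / 2

# Polar decomposition C = U|C| through the SVD C = W diag(s) Vh.
W, s, Vh = np.linalg.svd(C)
U = W @ Vh
absC = Vh.conj().T @ np.diag(s) @ Vh

# Theorem 2.4: |C| <= (ReA #_t ReB) # (U* (ReA #_{1-t} ReB) U) in the Loewner order.
rhs = gmean(gmean(ReA, ReB, t), U.conj().T @ gmean(ReA, ReB, 1 - t) @ U)
gap = rhs - absC
assert np.linalg.eigvalsh((gap + gap.conj().T) / 2).min() > -1e-9
```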

    Using Theorem 2.4 and Lemma 2.2, we have the following.

    Corollary 2.5. Let $M=\begin{pmatrix}A & X\\ Y & B\end{pmatrix}$ be APT and let $t\in[0,1]$, then for every unitarily invariant norm $\|\cdot\|$ and some unitary matrix $U\in M_n$,

    $$\begin{aligned}\left\|\frac{X+Y^*}{2}\right\|&\le\bigl\|(\mathrm{Re}A\sharp_t\mathrm{Re}B)\,\sharp\,\bigl(U^*(\mathrm{Re}A\sharp_{1-t}\mathrm{Re}B)U\bigr)\bigr\|\le\left\|\frac{(\mathrm{Re}A\sharp_t\mathrm{Re}B)+U^*(\mathrm{Re}A\sharp_{1-t}\mathrm{Re}B)U}{2}\right\|\\ &\le\frac{\|\mathrm{Re}A\sharp_t\mathrm{Re}B\|+\|\mathrm{Re}A\sharp_{1-t}\mathrm{Re}B\|}{2}\le\frac{\|(\mathrm{Re}A)^{1-t}(\mathrm{Re}B)^{t}\|+\|(\mathrm{Re}A)^{t}(\mathrm{Re}B)^{1-t}\|}{2}\\ &\le\frac{\|(1-t)\mathrm{Re}A+t\,\mathrm{Re}B\|+\|t\,\mathrm{Re}A+(1-t)\mathrm{Re}B\|}{2}.\end{aligned}$$

    Proof: The first inequality follows from Theorem 2.4. The third one follows from the triangle inequality together with the unitary invariance of $\|\cdot\|$. The other conclusions hold by Lemma 2.2.

    In particular, when t=12, we have the following result.

    Corollary 2.6. Let $M=\begin{pmatrix}A & X\\ Y & B\end{pmatrix}$ be APT, then for every unitarily invariant norm $\|\cdot\|$ and some unitary matrix $U\in M_n$,

    $$\begin{aligned}\left\|\frac{X+Y^*}{2}\right\|&\le\bigl\|(\mathrm{Re}A\sharp\mathrm{Re}B)\,\sharp\,\bigl(U^*(\mathrm{Re}A\sharp\mathrm{Re}B)U\bigr)\bigr\|\le\left\|\frac{(\mathrm{Re}A\sharp\mathrm{Re}B)+U^*(\mathrm{Re}A\sharp\mathrm{Re}B)U}{2}\right\|\\ &\le\|\mathrm{Re}A\sharp\mathrm{Re}B\|\le\bigl\|(\mathrm{Re}A)^{1/2}(\mathrm{Re}B)^{1/2}\bigr\|\le\left\|\frac{\mathrm{Re}A+\mathrm{Re}B}{2}\right\|.\end{aligned}$$

    Squaring the inequalities in Corollary 2.6, applied to the Hilbert-Schmidt norm, we get the following quick consequence.

    Corollary 2.7. If $M=\begin{pmatrix}A & X\\ Y & B\end{pmatrix}$ is APT, then

    $$\mathrm{tr}\left(\left(\frac{X+Y^*}{2}\right)^*\left(\frac{X+Y^*}{2}\right)\right)\le\mathrm{tr}\bigl((\mathrm{Re}A\sharp\mathrm{Re}B)^{2}\bigr)\le\mathrm{tr}\bigl(\mathrm{Re}A\,\mathrm{Re}B\bigr)\le\mathrm{tr}\left(\left(\frac{\mathrm{Re}A+\mathrm{Re}B}{2}\right)^{2}\right).$$

    Proof: Compute

    $$\mathrm{tr}\left(\left(\frac{X+Y^*}{2}\right)^*\left(\frac{X+Y^*}{2}\right)\right)\le\mathrm{tr}\bigl((\mathrm{Re}A\sharp\mathrm{Re}B)^*(\mathrm{Re}A\sharp\mathrm{Re}B)\bigr)=\mathrm{tr}\bigl((\mathrm{Re}A\sharp\mathrm{Re}B)^{2}\bigr)\le\mathrm{tr}\bigl(\mathrm{Re}A\,\mathrm{Re}B\bigr)\le\mathrm{tr}\left(\left(\frac{\mathrm{Re}A+\mathrm{Re}B}{2}\right)^{2}\right).$$
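    The trace chain of Corollary 2.7 can be confirmed on random examples as well (our own sketch, same illustrative APT construction as before):

```python
import numpy as np

def hpow(H, p):
    H = (H + H.conj().T) / 2
    w, V = np.linalg.eigh(H)
    return (V * w**p) @ V.conj().T

def gmean(A, B):
    Ah, Aih = hpow(A, 0.5), hpow(A, -0.5)
    return Ah @ hpow(Aih @ B @ Aih, 0.5) @ Ah

rng = np.random.default_rng(7)
n = 3
H = np.zeros((2 * n, 2 * n), dtype=complex)
for _ in range(4):
    p = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
    q = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    H += np.kron(p @ p.conj().T + 0.1 * np.eye(2), q @ q.conj().T + 0.1 * np.eye(n))
S = rng.standard_normal((2 * n, 2 * n)) + 1j * rng.standard_normal((2 * n, 2 * n))
M = H + (S - S.conj().T) / 2                  # a random APT matrix

A, X, Y, B = M[:n, :n], M[:n, n:], M[n:, :n], M[n:, n:]
ReA, ReB = (A + A.conj().T) / 2, (B + B.conj().T) / 2
C = (X + Y.conj().T) / 2
G = gmean(ReA, ReB)

t0 = np.trace(C.conj().T @ C).real
t1 = np.trace(G @ G).real
t2 = np.trace(ReA @ ReB).real
t3 = np.trace(((ReA + ReB) / 2) @ ((ReA + ReB) / 2)).real
assert t0 <= t1 + 1e-8 and t1 <= t2 + 1e-8 and t2 <= t3 + 1e-8
```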

    It is known that for any $X,Y\in M_n$ and any indices $i,j$ such that $i+j\le n+1$, we have $s_{i+j-1}(XY)\le s_i(X)s_j(Y)$ (see [2, Page 75]). By utilizing this fact and Theorem 2.4, we can obtain the following result.

    Corollary 2.8. Let $M=\begin{pmatrix}A & X\\ Y & B\end{pmatrix}$ be APT, then for any $t\in[0,1]$, we have

    $$s_{i+j-1}\left(\frac{X+Y^*}{2}\right)\le s_i\bigl((\mathrm{Re}A\sharp_t\mathrm{Re}B)^{1/2}\bigr)\,s_j\bigl((\mathrm{Re}A\sharp_{1-t}\mathrm{Re}B)^{1/2}\bigr).$$

    Consequently,

    $$s_{2j-1}\left(\frac{X+Y^*}{2}\right)\le s_j\bigl((\mathrm{Re}A\sharp_t\mathrm{Re}B)^{1/2}\bigr)\,s_j\bigl((\mathrm{Re}A\sharp_{1-t}\mathrm{Re}B)^{1/2}\bigr).$$

    Proof: By Lemma 2.1 and Theorem 2.4, observe that

    $$\begin{aligned}s_{i+j-1}\left(\frac{X+Y^*}{2}\right)&=s_{i+j-1}\left(\left|\frac{X+Y^*}{2}\right|\right)\le s_{i+j-1}\Bigl((\mathrm{Re}A\sharp_t\mathrm{Re}B)\,\sharp\,\bigl(U^*(\mathrm{Re}A\sharp_{1-t}\mathrm{Re}B)U\bigr)\Bigr)\\ &=s_{i+j-1}\Bigl((\mathrm{Re}A\sharp_t\mathrm{Re}B)^{1/2}V\bigl(U^*(\mathrm{Re}A\sharp_{1-t}\mathrm{Re}B)U\bigr)^{1/2}\Bigr)\\ &\le s_i\bigl((\mathrm{Re}A\sharp_t\mathrm{Re}B)^{1/2}V\bigr)\,s_j\Bigl(\bigl(U^*(\mathrm{Re}A\sharp_{1-t}\mathrm{Re}B)U\bigr)^{1/2}\Bigr)\\ &=s_i\bigl((\mathrm{Re}A\sharp_t\mathrm{Re}B)^{1/2}\bigr)\,s_j\bigl((\mathrm{Re}A\sharp_{1-t}\mathrm{Re}B)^{1/2}\bigr).\end{aligned}$$

    Finally, we study the relationship between the diagonal blocks and the real part of the off-diagonal blocks of the APT matrix M.

    Theorem 2.9. Let $M=\begin{pmatrix}A & X\\ Y & B\end{pmatrix}$ be APT, then for all $t\in[0,1]$,

    $$\mathrm{Re}\left(\frac{X+Y^*}{2}\right)\le(\mathrm{Re}A\sharp_t\mathrm{Re}B)\,\sharp\,(\mathrm{Re}A\sharp_{1-t}\mathrm{Re}B)\le\frac{(\mathrm{Re}A\sharp_t\mathrm{Re}B)+(\mathrm{Re}A\sharp_{1-t}\mathrm{Re}B)}{2},$$

    and

    $$\mathrm{Im}\left(\frac{X+Y^*}{2}\right)\le(\mathrm{Re}A\sharp_t\mathrm{Re}B)\,\sharp\,(\mathrm{Re}A\sharp_{1-t}\mathrm{Re}B)\le\frac{(\mathrm{Re}A\sharp_t\mathrm{Re}B)+(\mathrm{Re}A\sharp_{1-t}\mathrm{Re}B)}{2}.$$

    Proof: Since M is APT, we have that

    $$\mathrm{Re}M=\begin{pmatrix}\mathrm{Re}A & \frac{X+Y^*}{2}\\ \left(\frac{X+Y^*}{2}\right)^* & \mathrm{Re}B\end{pmatrix}$$

    is PPT.

    Therefore, by Lemma 2.3,

    $$\begin{pmatrix}\mathrm{Re}A\sharp_t\mathrm{Re}B & \mathrm{Re}\left(\frac{X+Y^*}{2}\right)\\ \mathrm{Re}\left(\frac{X+Y^*}{2}\right) & \mathrm{Re}A\sharp_{1-t}\mathrm{Re}B\end{pmatrix}=\frac{1}{2}\begin{pmatrix}\mathrm{Re}A\sharp_t\mathrm{Re}B & \frac{X+Y^*}{2}\\ \left(\frac{X+Y^*}{2}\right)^* & \mathrm{Re}A\sharp_{1-t}\mathrm{Re}B\end{pmatrix}+\frac{1}{2}\begin{pmatrix}\mathrm{Re}A\sharp_t\mathrm{Re}B & \left(\frac{X+Y^*}{2}\right)^*\\ \frac{X+Y^*}{2} & \mathrm{Re}A\sharp_{1-t}\mathrm{Re}B\end{pmatrix}\ge 0.$$

    So, by Lemma 2.1, we have

    $$\mathrm{Re}\left(\frac{X+Y^*}{2}\right)\le(\mathrm{Re}A\sharp_t\mathrm{Re}B)\,\sharp\,(\mathrm{Re}A\sharp_{1-t}\mathrm{Re}B).$$

    Together with the arithmetic-geometric mean inequality $(\mathrm{Re}A\sharp_t\mathrm{Re}B)\,\sharp\,(\mathrm{Re}A\sharp_{1-t}\mathrm{Re}B)\le\frac{(\mathrm{Re}A\sharp_t\mathrm{Re}B)+(\mathrm{Re}A\sharp_{1-t}\mathrm{Re}B)}{2}$, this implies the first inequality.

    Since $\mathrm{Re}M$ is PPT, we have

    $$\begin{pmatrix}\mathrm{Re}A & -i\frac{X+Y^*}{2}\\ i\left(\frac{X+Y^*}{2}\right)^* & \mathrm{Re}B\end{pmatrix}=\begin{pmatrix}I & 0\\ 0 & -iI\end{pmatrix}^*(\mathrm{Re}M)\begin{pmatrix}I & 0\\ 0 & -iI\end{pmatrix}\ge 0,\qquad \begin{pmatrix}\mathrm{Re}A & i\left(\frac{X+Y^*}{2}\right)^*\\ -i\frac{X+Y^*}{2} & \mathrm{Re}B\end{pmatrix}=\begin{pmatrix}I & 0\\ 0 & iI\end{pmatrix}^*\bigl((\mathrm{Re}M)^{\tau}\bigr)\begin{pmatrix}I & 0\\ 0 & iI\end{pmatrix}\ge 0.$$

    Thus,

    $$\begin{pmatrix}\mathrm{Re}A & -i\frac{X+Y^*}{2}\\ i\left(\frac{X+Y^*}{2}\right)^* & \mathrm{Re}B\end{pmatrix}$$

    is PPT.

    By Lemma 2.3,

    $$\begin{pmatrix}\mathrm{Re}A\sharp_t\mathrm{Re}B & -i\frac{X+Y^*}{2}\\ i\left(\frac{X+Y^*}{2}\right)^* & \mathrm{Re}A\sharp_{1-t}\mathrm{Re}B\end{pmatrix}$$

    is also PPT.

    So,

    $$\frac{1}{2}\begin{pmatrix}\mathrm{Re}A\sharp_t\mathrm{Re}B & -i\frac{X+Y^*}{2}\\ i\left(\frac{X+Y^*}{2}\right)^* & \mathrm{Re}A\sharp_{1-t}\mathrm{Re}B\end{pmatrix}+\frac{1}{2}\begin{pmatrix}\mathrm{Re}A\sharp_t\mathrm{Re}B & i\left(\frac{X+Y^*}{2}\right)^*\\ -i\frac{X+Y^*}{2} & \mathrm{Re}A\sharp_{1-t}\mathrm{Re}B\end{pmatrix}\ge 0,$$

    which means that

    $$\begin{pmatrix}\mathrm{Re}A\sharp_t\mathrm{Re}B & \mathrm{Im}\left(\frac{X+Y^*}{2}\right)\\ \mathrm{Im}\left(\frac{X+Y^*}{2}\right) & \mathrm{Re}A\sharp_{1-t}\mathrm{Re}B\end{pmatrix}\ge 0.$$

    By Lemma 2.1, we have

    $$\mathrm{Im}\left(\frac{X+Y^*}{2}\right)\le(\mathrm{Re}A\sharp_t\mathrm{Re}B)\,\sharp\,(\mathrm{Re}A\sharp_{1-t}\mathrm{Re}B).$$

    This completes the proof.

    Corollary 2.10. Let $\begin{pmatrix}\mathrm{Re}A & \frac{X+Y^*}{2}\\ \left(\frac{X+Y^*}{2}\right)^* & \mathrm{Re}B\end{pmatrix}\ge 0$. If $\frac{X+Y^*}{2}$ is Hermitian and $t\in[0,1]$, then

    $$\frac{X+Y^*}{2}\le(\mathrm{Re}A\sharp_t\mathrm{Re}B)\,\sharp\,(\mathrm{Re}A\sharp_{1-t}\mathrm{Re}B)\le\frac{(\mathrm{Re}A\sharp_t\mathrm{Re}B)+(\mathrm{Re}A\sharp_{1-t}\mathrm{Re}B)}{2}.$$
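    Finally, Theorem 2.9 and Corollary 2.10 can be spot-checked in the same way (our own sketch, same illustrative APT construction; the Loewner order is tested through the smallest eigenvalue of the difference):

```python
import numpy as np

def hpow(H, p):
    H = (H + H.conj().T) / 2
    w, V = np.linalg.eigh(H)
    return (V * w**p) @ V.conj().T

def gmean(A, B, t=0.5):
    Ah, Aih = hpow(A, 0.5), hpow(A, -0.5)
    return Ah @ hpow(Aih @ B @ Aih, t) @ Ah

def loewner_leq(P, Q, tol=1e-9):
    """Check P <= Q in the Loewner order via the smallest eigenvalue of Q - P."""
    D = Q - P
    return np.linalg.eigvalsh((D + D.conj().T) / 2).min() > -tol

rng = np.random.default_rng(8)
n, t = 3, 0.7
H = np.zeros((2 * n, 2 * n), dtype=complex)
for _ in range(4):
    p = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
    q = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    H += np.kron(p @ p.conj().T + 0.1 * np.eye(2), q @ q.conj().T + 0.1 * np.eye(n))
S = rng.standard_normal((2 * n, 2 * n)) + 1j * rng.standard_normal((2 * n, 2 * n))
M = H + (S - S.conj().T) / 2                  # a random APT matrix

A, X, Y, B = M[:n, :n], M[:n, n:], M[n:, :n], M[n:, n:]
ReA, ReB = (A + A.conj().T) / 2, (B + B.conj().T) / 2
C = (X + Y.conj().T) / 2
ReC, ImC = (C + C.conj().T) / 2, (C - C.conj().T) / 2j

Gt, G1t = gmean(ReA, ReB, t), gmean(ReA, ReB, 1 - t)
mid, amean = gmean(Gt, G1t), (Gt + G1t) / 2
# Theorem 2.9: Re C and Im C are dominated by the geometric mean, which is
# dominated by the arithmetic mean.
assert loewner_leq(ReC, mid) and loewner_leq(ImC, mid) and loewner_leq(mid, amean)
```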

    The authors declare they have not used Artificial Intelligence (AI) tools in the creation of this article.

    The work is supported by National Natural Science Foundation (grant No. 12261030), Hainan Provincial Natural Science Foundation for High-level Talents (grant No. 123RC474), Hainan Provincial Natural Science Foundation of China (grant No. 124RC503), the Hainan Provincial Graduate Innovation Research Program (grant No. Qhys2023-383 and Qhys2023-385), and the Key Laboratory of Computational Science and Application of Hainan Province.

    The authors declare that they have no conflict of interest.



    [1] M. Alakhrass, A note on positive partial transpose blocks, AIMS Mathematics, 8 (2023), 23747–23755. https://doi.org/10.3934/math.20231208
    [2] R. Bhatia, Matrix analysis, New York: Springer, 1997. https://doi.org/10.1007/978-1-4612-0653-8
    [3] R. Bhatia, Positive definite matrices, Princeton: Princeton University Press, 2007.
    [4] R. Bhatia, P. Grover, Norm inequalities related to the matrix geometric mean, Linear Algebra Appl., 437 (2012), 726–733. https://doi.org/10.1016/j.laa.2012.03.001
    [5] X. Fu, P. S. Lau, T. Y. Tam, Inequalities on 2×2 block positive semidefinite matrices, Linear Multilinear A., 70 (2022), 6820–6829. https://doi.org/10.1080/03081087.2021.1969327
    [6] X. Fu, L. Hu, S. A. Haseeb, Inequalities for partial determinants of accretive block matrices, J. Inequal. Appl., 2023 (2023), 101. https://doi.org/10.1186/s13660-023-03008-x
    [7] S. Hayat, J. H. Koolen, F. Liu, Z. Qiao, A note on graphs with exactly two main eigenvalues, Linear Algebra Appl., 511 (2016), 318–327. https://doi.org/10.1016/j.laa.2016.09.019
    [8] S. Hayat, M. Javaid, J. H. Koolen, Graphs with two main and two plain eigenvalues, Appl. Anal. Discr. Math., 11 (2017), 244–257. https://doi.org/10.2298/AADM1702244H
    [9] J. H. Koolen, S. Hayat, Q. Iqbal, Hypercubes are determined by their distance spectra, Linear Algebra Appl., 505 (2016), 97–108. https://doi.org/10.1016/j.laa.2016.04.036
    [10] L. Kuai, An extension of the Fiedler-Markham determinant inequality, Linear Multilinear A., 66 (2018), 547–553. https://doi.org/10.1080/03081087.2017.1304521
    [11] E. Y. Lee, The off-diagonal block of a PPT matrix, Linear Algebra Appl., 486 (2015), 449–453. https://doi.org/10.1016/j.laa.2015.08.018
    [12] M. Lin, Inequalities related to 2×2 block PPT matrices, Oper. Matrices, 9 (2015), 917–924. http://doi.org/10.7153/oam-09-54
    [13] H. Xu, X. Fu, S. A. Haseeb, Trace inequalities related to 2×2 block sector matrices, Oper. Matrices, 17 (2023), 367–374. http://doi.org/10.7153/oam-2023-17-26
  • © 2024 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)