Research article

Equivalent analysis of different estimations under a multivariate general linear model

  • Received: 12 May 2024 Revised: 23 July 2024 Accepted: 26 July 2024 Published: 06 August 2024
  • MSC : 15A10, 62F10, 62H12, 62J05

  • This article explores the mathematical and statistical properties of, and the connections between, two well-known estimators of the unknown parameter matrices in a multivariate general linear model (MGLM) for regression: the ordinary least-squares estimators (OLSEs) and the best linear unbiased estimators (BLUEs), which are defined under two different optimality criteria. Tian and Zhang [38] collected a series of existing and novel conditions that identify when OLSEs coincide with BLUEs under general linear models. In this paper, we show how to extend results of this kind to multivariate general linear models. Using matrix analysis tools concerning ranks, ranges, and generalized inverses of matrices, we give a direct algebraic procedure for deriving explicit formulas for the OLSEs and BLUEs of the parameter matrices in a given MGLM, discuss the relationships between these OLSEs and BLUEs, establish a collection of algebraic equalities related to their equivalence, and give various intrinsic statistical interpretations of this equivalence.
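    In the classical univariate special case y = Xβ + ε with Cov(ε) = V, the prototype of these equivalence results is the well-known condition that OLSE(Xβ) = BLUE(Xβ) for every y if and only if the orthogonal projector onto the range of X commutes with V (see, e.g., Puntanen and Styan [25]). The following is a minimal numerical sketch of that condition, assuming the standard projector and Aitken formulas and simulated X, V, y (none of which are taken from the article); the multivariate model Y = XΘ + E is handled column by column in the same way.

        # Minimal illustrative sketch (not the authors' code) of OLSE vs. BLUE under
        # a general linear model y = X*beta + e with Cov(e) = V.
        import numpy as np

        rng = np.random.default_rng(0)            # illustrative simulated data
        n, p = 8, 3
        X = rng.standard_normal((n, p))
        A = rng.standard_normal((n, n))
        V = A @ A.T + np.eye(n)                   # a positive definite covariance matrix

        P = X @ np.linalg.pinv(X)                 # orthogonal projector onto the range of X

        # OLSE of X*beta is P @ y; BLUE of X*beta (V nonsingular) is the Aitken estimator
        # X (X' V^{-1} X)^{-} X' V^{-1} y.
        Vinv = np.linalg.inv(V)
        B = X @ np.linalg.pinv(X.T @ Vinv @ X) @ X.T @ Vinv

        y = rng.standard_normal(n)
        olse, blue = P @ y, B @ y

        # Classical identifying condition (see, e.g., Puntanen and Styan [25]):
        # OLSE(X*beta) = BLUE(X*beta) for all y  iff  P V = V P,
        # i.e., the range of V X is contained in the range of X.
        print(np.allclose(P @ V, V @ P), np.allclose(olse, blue))   # both False for a generic V
        print(np.allclose(P @ np.eye(n), np.eye(n) @ P))            # True: with V = I they coincide

    When V is the identity, or more generally any matrix commuting with P, the two hat matrices agree and the estimators coincide; the paper extends characterizations of this kind to the parameter matrices of an MGLM.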

    Citation: Bo Jiang, Yongge Tian. Equivalent analysis of different estimations under a multivariate general linear model[J]. AIMS Mathematics, 2024, 9(9): 23544-23563. doi: 10.3934/math.20241144





    References


    [1] I. S. Alalouf, G. P. H. Styan, Characterizations of estimability in the general linear model, Ann. Statist., 7 (1979), 194–200. https://doi.org/10.1214/aos/1176344564
    [2] T. W. Anderson, An introduction to multivariate statistical analysis, 2 Eds., New York: Wiley, 1984.
    [3] A. Basilevsky, Applied matrix algebra in the statistical sciences, New York: Dover Publications, 2013.
    [4] D. Bertsimas, M. S. Copenhaver, Characterization of the equivalence of robustification and regularization in linear and matrix regression, Euro. J. Oper. Res., 270 (2018), 931–942. https://doi.org/10.1016/j.ejor.2017.03.051
    [5] N. H. Bingham, W. J. Krzanowski, Linear algebra and multivariate analysis in statistics: development and interconnections in the twentieth century, British Journal for the History of Mathematics, 37 (2022), 43–63. https://doi.org/10.1080/26375451.2022.2045811
    [6] R. Christensen, Linear models for multivariate, time series, and spatial data, New York: Springer, 1991. https://doi.org/10.1007/978-1-4757-4103-2
    [7] M. H. Ding, H. Y. Liu, G. H. Zheng, On inverse problems for several coupled PDE systems arising in mathematical biology, J. Math. Biol., 87 (2023), 86. https://doi.org/10.1007/s00285-023-02021-4
    [8] R. W. Farebrother, A. C. Aitken and the consolidation of matrix theory, Linear Algebra Appl., 264 (1997), 3–12. https://doi.org/10.1016/S0024-3795(96)00398-9
    [9] J. E. Gentle, Matrix algebra: theory, computations, and applications in statistics, 2 Eds., New York: Springer, 2017. https://doi.org/10.1007/978-0-387-70873-7
    [10] R. Gnanadesikan, Methods for statistical data analysis of multivariate observations, 2 Eds., New York: Wiley, 1997. https://doi.org/10.1002/9781118032671
    [11] D. A. Harville, Matrix algebra from a statistician's perspective, New York: Springer, 1997. https://doi.org/10.1007/b98818
    [12] B. Jiang, Y. G. Tian, On additive decompositions of estimators under a multivariate general linear model and its two submodels, J. Multivariate Anal., 162 (2017), 193–214. https://doi.org/10.1016/j.jmva.2017.09.007
    [13] B. Jiang, Y. G. Tian, On equivalence of predictors/estimators under a multivariate general linear model with augmentation, J. Korean Stat. Soc., 46 (2017), 551–561. https://doi.org/10.1016/j.jkss.2017.04.001
    [14] K. Kim, N. Timm, Univariate and multivariate general linear models: theory and applications with SAS, 2 Eds., New York: CRC Press, 2006.
    [15] H. Y. Liu, C. W. K. Lo, Determining a parabolic system by boundary observation of its non-negative solutions with biological applications, Inverse Probl., 40 (2024), 025009. https://doi.org/10.1088/1361-6420/ad149f
    [16] R. Ma, Y. G. Tian, A matrix approach to a general partitioned linear model with partial parameter restrictions, Linear Multilinear A., 70 (2022), 2513–2532. https://doi.org/10.1080/03081087.2020.1804521
    [17] A. Markiewicz, S. Puntanen, All about the $\perp$ with its applications in the linear statistical models, Open Math., 13 (2015), 33–50. https://doi.org/10.1515/math-2015-0005
    [18] A. Markiewicz, S. Puntanen, G. P. H. Styan, The legend of the equality of OLSE and BLUE: highlighted by C. R. Rao in 1967, In: Methodology and applications of statistics, Cham: Springer, 2021, 51–76. https://doi.org/10.1007/978-3-030-83670-2_3
    [19] G. Marsaglia, G. P. H. Styan, Equalities and inequalities for ranks of matrices, Linear Multilinear A., 2 (1974), 269–292. https://doi.org/10.1080/03081087408817070
    [20] S. K. Mitra, Generalized inverse of matrices and applications to linear models, Handbook of Statistics, 1 (1980), 471–512. https://doi.org/10.1016/S0169-7161(80)80045-9
    [21] K. E. Muller, P. W. Stewart, Linear model theory: univariate, multivariate, and mixed models, New York: Wiley, 2006. https://doi.org/10.1002/0470052147
    [22] S. C. Narula, P. J. Korhonen, Multivariate multiple linear regression based on the minimum sum of absolute errors criterion, Euro. J. Oper. Res., 73 (1994), 70–75. https://doi.org/10.1016/0377-2217(94)90144-9
    [23] S. C. Narula, J. F. Wellington, Multiple criteria linear regression, Euro. J. Oper. Res., 181 (2007), 767–772. https://doi.org/10.1016/j.ejor.2006.06.026
    [24] R. Penrose, A generalized inverse for matrices, Math. Proc. Cambridge, 51 (1955), 406–413. https://doi.org/10.1017/S0305004100030401
    [25] S. Puntanen, G. P. H. Styan, The equality of the ordinary least squares estimator and the best linear unbiased estimator, with comments by O. Kempthorne, S. R. Searle, and a reply by the authors, Am. Stat., 43 (1989), 153–161. https://doi.org/10.1080/00031305.1989.10475644
    [26] S. Puntanen, G. P. H. Styan, J. Isotalo, Matrix tricks for linear statistical models: Our personal top twenty, Berlin: Springer, 2011. https://doi.org/10.1007/978-3-642-10473-2
    [27] C. R. Rao, S. K. Mitra, Generalized inverse of matrices and its applications, New York: Wiley, 1972.
    [28] C. R. Rao, M. B. Rao, Matrix algebra and its applications to statistics and econometrics, Singapore: World Scientific, 1998. https://doi.org/10.1142/9789812779281
    [29] G. C. Reinsel, R. P. Velu, Multivariate reduced-rank regression: theory and applications, New York: Springer, 1998. https://doi.org/10.1007/978-1-4757-2853-8
    [30] J. S. Respondek, Matrix black box algorithms–a survey, B. Pol. Acad. Sci.-Tech., 70 (2022), e140535. https://doi.org/10.24425/bpasts.2022.140535
    [31] S. R. Searle, A. I. Khuri, Matrix algebra useful for statistics, 2 Eds., Hoboken: Wiley, 2017.
    [32] G. A. F. Seber, Multivariate observations, Hoboken: Wiley, 2004. https://doi.org/10.1002/9780470316641
    [33] Y. G. Tian, On equalities of estimations of parametric functions under a general linear model and its restricted models, Metrika, 72 (2010), 313–330. https://doi.org/10.1007/s00184-009-0255-2
    [34] Y. G. Tian, A new derivation of BLUPs under random-effects model, Metrika, 78 (2015), 905–918. https://doi.org/10.1007/s00184-015-0533-0
    [35] Y. G. Tian, Matrix rank and inertia formulas in the analysis of general linear models, Open Math., 15 (2017), 126–150. https://doi.org/10.1515/math-2017-0013
    [36] Y. G. Tian, B. Jiang, Matrix rank/inertia formulas for least-squares solutions with statistical applications, Spec. Matrices, 4 (2016), 130–140. https://doi.org/10.1515/spma-2016-0013
    [37] Y. G. Tian, C. Wang, On simultaneous prediction in a multivariate general linear model with future observations, Stat. Probabil. Lett., 128 (2017), 52–59. https://doi.org/10.1016/j.spl.2017.04.007
    [38] Y. G. Tian, X. Zhang, On connections among OLSEs and BLUEs of whole and partial parameters under a general linear model, Stat. Probabil. Lett., 112 (2016), 105–112. https://doi.org/10.1016/j.spl.2016.01.019
    [39] Y. W. Yin, W. S. Yin, P. C. Meng, H. Y. Liu, The interior inverse scattering problem for a two-layered cavity using the Bayesian method, Inverse Probl. Imag., 16 (2022), 673–690. https://doi.org/10.3934/ipi.2021069
  • © 2024 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)
