Research article Special Issues

Further properties of Tsallis extropy and some of its related measures

  • Received: 24 August 2023 Revised: 23 September 2023 Accepted: 27 September 2023 Published: 13 October 2023
  • MSC : 62B10, 94A15, 94A17

  • This article introduces the residual and past Tsallis extropy as information measures for continuous distributions. Their properties and relationships with other models are evaluated, several stochastic comparisons are provided, and results concerning order statistics are obtained. Closed-form expressions are derived for particular cases, including the uniform and power function distributions. The monotonicity of the measure is examined, and results characterizing it are presented. In addition, an alternative representation of the Tsallis extropy, expressed in terms of the hazard rate function, is introduced. The Tsallis extropy of the lifetime of both mixed and coherent systems is then explored; for mixed systems, the component lifetimes are assumed to be independent and identically distributed. Bounds on the Tsallis extropy of these systems are established, and their practical applicability is clarified. Finally, non-parametric estimation based on an alternative form of the Tsallis extropy is performed for simulated and real data.

    Citation: Mohamed Said Mohamed, Haroon M. Barakat, Aned Al Mutairi, Manahil SidAhmed Mustafa. Further properties of Tsallis extropy and some of its related measures[J]. AIMS Mathematics, 2023, 8(12): 28219-28245. doi: 10.3934/math.20231445
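
    For a quick numerical feel for the measures summarized above, the following is a minimal sketch, not the paper's procedure: it estimates the classical extropy J(X) = -(1/2) ∫ f²(x) dx, the baseline measure that Tsallis-type extropy generalizes, by a kernel-density plug-in applied to simulated uniform data. The sample size, integration grid, bandwidth rule (SciPy's default), and variable names are illustrative assumptions, not choices made in the paper.

        # Hedged sketch: kernel plug-in estimate of the classical extropy
        # J(X) = -(1/2) * integral of f(x)^2 dx, for simulated U(0, 1) data.
        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(0)
        x = rng.uniform(0.0, 1.0, size=2000)          # simulated U(0, 1) sample

        kde = gaussian_kde(x)                          # non-parametric density estimate
        grid = np.linspace(-0.2, 1.2, 4001)            # grid covering the support
        dx = grid[1] - grid[0]

        f_hat = kde(grid)                              # estimated density values
        extropy_hat = -0.5 * np.sum(f_hat ** 2) * dx   # Riemann-sum plug-in estimate

        print(f"estimated extropy: {extropy_hat:.4f} (exact value for U(0,1) is -0.5)")

    For the uniform distribution on (0, 1) the exact extropy is -1/2, so the printed estimate should fall close to that value. The paper's own non-parametric estimation targets an alternative form of the Tsallis extropy itself, as noted in the abstract.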

  • © 2023 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)