It is well established that a diverse range of entropic measures, while remarkably adaptable, must be complemented by innovative approaches to enhance their effectiveness across various domains. These measures play a crucial role in fields such as communication and coding theory, motivating researchers to develop numerous new information measures applicable across a wide array of disciplines. This paper introduces a new discrete entropic measure and its applications to queueing theory, the study of variations of uncertainty. Using the newly developed discrete entropy, we formulate an optimization principle in which the space capacity is prespecified and the only available information concerns the mean size. Additionally, we present applications of the maximum entropy principle to obtain maximum-entropy probability distributions.
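To make the optimization principle concrete, here is a minimal sketch of the constrained problem, written with the classical Shannon entropy as a stand-in for the paper's new discrete measure (an assumption; the paper optimizes its own entropy, which changes the functional but not the structure of the problem). With the space capacity $N$ fixed and only the mean size $m$ known, the maximum entropy principle prescribes

\begin{align*}
\max_{p_0,\dots,p_N}\; & -\sum_{n=0}^{N} p_n \ln p_n \\
\text{subject to}\; & \sum_{n=0}^{N} p_n = 1, \qquad \sum_{n=0}^{N} n\, p_n = m,
\end{align*}

whose Lagrangian solution is the truncated geometric distribution

\[
p_n = \frac{(1-r)\, r^{n}}{1 - r^{N+1}}, \qquad n = 0, 1, \dots, N,
\]

with $r$ chosen so that the mean equals $m$ (the uniform distribution arises in the limit $r \to 1$, i.e., $m = N/2$). This is the same form as the steady-state queue-length distribution of an M/M/1/N queue with traffic intensity $r = \lambda/\mu$, which is the classical bridge between the maximum entropy principle and queueing theory.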
Citation: Vikramjeet Singh, Sunil Kumar Sharma, Om Parkash, Retneer Sharma, Shivam Bhardwaj. A newfangled isolated entropic measure in probability spaces and its applications to queueing theory. AIMS Mathematics, 2024, 9(10): 27293–27307. doi: 10.3934/math.20241326