
Because of advancements in manufacturing and technology, products and devices are becoming increasingly reliable, making typical life tests under normal operating conditions difficult, if not impossible. For industrial markets, such tests are too time-consuming and costly to yield sufficient information about the lifetime distribution of a product or even a prototype. As a result, the accelerated life test (ALT) is becoming more popular and more important, as it gives information on a highly reliable product's lifetime in a short amount of time, see [1,2]. More failures can be collected quickly by conducting the life test at stress levels higher than normal operating conditions. A suitable stress-response regression model is then used to estimate the lifetime distribution at the usage stress. As a particular class of ALT, the progressive-stress test implements a special stress-loading scheme in which the stress increases with time, see [3,4,5,6].
Censoring often occurs when some product lifetimes are missing or when certain experimental design objectives are implemented. Type-I and type-II censoring are the most prevalent schemes. These two types of censoring do not give the experimenter the flexibility to withdraw units from a life test at different stages during the experiment. Because of this lack of flexibility, progressive censoring has been proposed as a more general censoring technique; it enables the experimenter to withdraw units from the life test at different stages of the experiment, not only at its final point.
In reliability, engineering, biomedicine, physical studies, and other fields, a unit may fail due to one of several risk factors. If the risks are unclear, it may not be possible to identify the factor that caused the unit to fail, and as a result, the lifetime associated with a particular risk cannot be determined. In such cases, only the minimum (maximum) lifetime value over all risks can be observed. This is referred to as a "series (parallel) system" in the literature. These two systems can be joined to form a new system known as a "series-parallel system"; see its description in Figure 1. In a series system, components are connected in such a way that the failure of a single component leads to the failure of the system. On the other hand, a system in which the failure of all components leads to the failure of the system is called a parallel system. Finally, a series-parallel system is a system in which m subsystems are connected in series and each subsystem consists of k components connected in parallel, see Figure 1. For such a system, failure is observed if any subsystem fails. The technique of compounding distribution functions may be used to construct the lifetime distribution of the series-parallel system.
Abdel-Hamid and Hashem [7] introduced a new distribution for the series-parallel system by compounding two Poisson distributions (truncated at zero) with an exponential distribution. They used six estimation methods to estimate the included parameters. In [8], the same authors obtained a new distribution by compounding two discrete distributions with a mixture of continuous distributions based on a parallel-series system. Nadarajah et al. [9] applied the progressive-stress ALT technique, using type-I progressively hybrid censored data with binomial removals, to components connected in a parallel-series structure. Hu et al. [10] suggested an ideally distributed series-parallel system, as well as two analytical reliability assessment approaches to analyze the reliability of a distributed power system. Hashem and Alyami [11] introduced a new distribution based on a parallel-series system.
In this article, we introduce a new distribution that describes the lifetime of a series-parallel system when the lifetimes of the included components are subject to a mixture of exponential and gamma distributions. The distribution will be called the Poisson-geometric-Lomax distribution (PGLD), which can be obtained by compounding truncated Poisson, geometric, and Lomax distributions.
The rest of the article is organized as follows: The distribution of a series-parallel system based on a finite mixture of distributions is constructed in Section 2, and some of its important properties are given in Section 3. In Section 4, the model is described based on progressive-stress ALT and progressive type-II censoring. In Section 5, estimation of the parameters is obtained using the maximum likelihood and Bayes methods. An illustrative example, based on two real data sets, is studied in Section 6. A simulation study is carried out in Section 7. Concluding remarks, followed by certain features and motivations of the PGLD as well as future work, are presented in Section 8.
A mixture of distributions is a combination of statistical distributions that arises when sampling from an inhomogeneous (or mixed) population in which each component has a different probability density function (PDF).
Let F(y∣ω) be the cumulative distribution function (CDF) of a continuous random variable (RV) Y, with realization y, that is dependent on a continuous RV Ω, with realization ω, and Π(ω) be the CDF of Ω. The marginal CDF F(y) is given by
\begin{equation} F(y) = \int_{\Omega}F(y\mid\omega)\,\text{d}\Pi(\omega), \end{equation} | (2.1) |
which is called a mixture (according to Teicher [12]) of the CDFs F(y∣ω) and Π(ω). Fisher [13] called F(y) "compound distribution".
The corresponding PDF is given by
\begin{equation} f(y) = \int_{\Omega}f(y\mid\omega)\,\pi(\omega)\,\text{d}\omega. \end{equation} | (2.2) |
If the RV Ω assumes only a finite number of points {ω_j, j=1,…,κ}, then Π(ω) is a mass function that assigns positive probabilities only to the ω_j. The integral in (2.1) is then replaced by a sum to give a finite mixture of the form
\begin{equation} F(y) = \sum\limits_{j = 1}^{\kappa}F(y;\omega_j)\,\Pi(\omega_j). \end{equation} | (2.3) |
Suppose, in (2.3), Π(ωj)=pj, j=1,…,κ, and F(y;ωj)=Fj(y). Then (2.3) takes the form
\begin{equation} F(y) = \sum\limits_{j = 1}^{\kappa}p_j F_j(y). \end{equation} | (2.4) |
A corresponding finite mixture of PDFs is given by
\begin{equation} f(y) = \sum\limits_{j = 1}^{\kappa}p_j f_j(y). \end{equation} | (2.5) |
In (2.4) and (2.5) the masses p_j are called mixing proportions with the conditions:
0\leq p_j\leq 1,\; j = 1,\dots,\kappa, \quad \text{and} \quad \sum\limits_{j = 1}^{\kappa}p_j = 1.
The functions Fj and fj are called jth components in the finite mixture of CDFs (2.4) and PDFs (2.5), respectively. For more details on finite mixture of distributions, see [14,15]. Several distributions may arise due to compound distribution (2.2) and mixture distribution (2.5), see [16], such as
(i) If, in (2.2), f(y∣ω) is the PDF of an exp(ω) distribution and π(ω) is the PDF of a gamma(α,γ) distribution, then we get the Lomax(α,γ) distribution (LD) as follows:
\begin{equation} \text{Lomax}(\alpha,\gamma)\equiv f(y;\alpha,\gamma) = \int_{0}^{\infty}\omega e^{-\omega y}\,\frac{\gamma^{\alpha}}{\Gamma(\alpha)}\,\omega^{\alpha-1}e^{-\gamma\omega}\,\text{d}\omega = \frac{\alpha}{\gamma}\left(1+\frac{y}{\gamma}\right)^{-(\alpha+1)},\; y > 0,\,(\alpha,\gamma > 0). \end{equation} | (2.6) |
Therefore, the corresponding CDF is given by
\begin{equation} F(y;\alpha,\gamma) = 1-\left(1+\frac{y}{\gamma}\right)^{-\alpha},\; y > 0,\,(\alpha,\gamma > 0). \end{equation} | (2.7) |
(ii) If, in (2.5), κ = 2, p_1 = θ/(θ+1), f_1(y) is the PDF of an exp(θ) distribution and f_2(y) is the PDF of a gamma(2,θ) distribution, then we get the Lindley distribution as follows:
\text{Lindley}(\theta)\equiv f(y;\theta) = \frac{\theta}{\theta+1}\left[\theta e^{-\theta y}\right]+\frac{1}{\theta+1}\left[\theta^{2}ye^{-\theta y}\right] = \frac{\theta^{2}}{\theta+1}(1+y)e^{-\theta y},\; y > 0,\,(\theta > 0).
(iii) If, in (2.2), f(y∣Ω = ω) = ωf_1(y)+(1−ω)f_2(y) and the RV Ω follows a beta distribution, B(a,b), then
\begin{equation} f(y) = \int_{0}^{1}f(y\mid\omega)\,\pi(\omega)\,\text{d}\omega = \frac{1}{B(a,b)}\int_{0}^{1}\left(\omega f_1(y)+(1-\omega)f_2(y)\right)\omega^{a-1}(1-\omega)^{b-1}\,\text{d}\omega = \frac{a}{a+b}f_1(y)+\frac{b}{a+b}f_2(y),\; y > 0,\,(a,b > 0). \end{equation} | (2.8) |
Several new distributions of the series-parallel system may emerge by choosing different distributions for the number of subsystems as well as those describing the lifetimes of the included components, as explained in [7,8], but in exchange, several questions may arise, for example:
● Does the presented distribution have any interesting properties?
● Does it arise naturally in some observable process like natural phenomena?
● Does it have any motivations?
There is no need for new distributions if there are no positive answers to the above questions. A response to these questions will be illustrated in Section 8.
According to [7,8], the following theorem gives the CDF and PDF of the PGLD which represents the failure time distribution of the series-parallel system when the number of series subsystems and the number of their components, that are connected as a parallel structure, are RVs subject to truncated Poisson and geometric distributions, respectively. At the same time, the failure times of the components have a finite mixture of distribution functions each of which may be responsible for a different cause of failure.
Theorem 2.1. Suppose that, for i = 1,2,\dots,k_j, j = 1,2,\dots,m, a mixed system of series-parallel structure type has the lifetime X = \min_j\left(\max_i Y_{ij}\right), where the Y_{ij} are IID RVs, see Figure 1, with PDF (CDF) f_Y(y) (F_Y(y)) of the Lomax distribution given by (2.6) ((2.7)). Consider M and K_j to be two discrete RVs subject to truncated Poisson and geometric distributions with PMFs P(M = m) = \frac{e^{-\lambda}\lambda^{m}}{m!\,(1-e^{-\lambda})},\; m = 1,2,\dots,\,(\lambda > 0), and P(K_j = k_j) = (1-\theta)\theta^{k_j-1},\; k_j = 1,2,\dots,\,(0 < \theta < 1), respectively. Then the distribution of X has the PGLD with CDF and PDF given, respectively, by
\begin{equation} F_X(x) = \frac{1-e^{-\lambda\Omega(x)}}{1-e^{-\lambda}}, \end{equation} | (2.9) |
\begin{equation} f_X(x) = \frac{\lambda\,\Omega^{2}(x)\,e^{-\lambda\Omega(x)}\,f_Y(x)}{(1-\theta)(1-e^{-\lambda})\,F_Y^{2}(x)}, \end{equation} | (2.10) |
where
\begin{equation} \Omega(x) = \frac{(1-\theta)F_Y(x)}{1-\theta F_Y(x)},\; x > 0. \end{equation} | (2.11) |
Proof. The proof is similar to that in [7].
Remark 2.1. It is worth noting that while γ is a scale parameter for LD with CDF (2.7), it is also a scale parameter for PGLD with CDF (2.9).
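For readers who wish to evaluate the PGLD numerically, the following is a minimal sketch (in Python with NumPy; the tooling choice and the function names are our own assumptions, not part of the paper) of Eqs (2.6)–(2.11). The PDF is coded with the ratio Ω(x)/F_Y(x) = (1−θ)/(1−θF_Y(x)) factored out so that the expression stays finite as x → 0:

```python
import numpy as np

def lomax_cdf(y, alpha, gamma):
    # Lomax CDF F_Y(y), Eq (2.7)
    return 1.0 - (1.0 + y / gamma) ** (-alpha)

def lomax_pdf(y, alpha, gamma):
    # Lomax PDF f_Y(y), Eq (2.6)
    return (alpha / gamma) * (1.0 + y / gamma) ** (-(alpha + 1.0))

def pgld_cdf(x, alpha, gamma, theta, lam):
    # PGLD CDF F_X(x), Eqs (2.9) and (2.11)
    Fy = lomax_cdf(x, alpha, gamma)
    omega = (1.0 - theta) * Fy / (1.0 - theta * Fy)
    return (1.0 - np.exp(-lam * omega)) / (1.0 - np.exp(-lam))

def pgld_pdf(x, alpha, gamma, theta, lam):
    # PGLD PDF f_X(x), Eq (2.10); Omega(x)/F_Y(x) = (1-theta)/(1-theta*F_Y(x))
    # is factored out so the formula remains finite as x -> 0
    Fy = lomax_cdf(x, alpha, gamma)
    fy = lomax_pdf(x, alpha, gamma)
    omega = (1.0 - theta) * Fy / (1.0 - theta * Fy)
    ratio = (1.0 - theta) / (1.0 - theta * Fy)
    return lam * ratio ** 2 * np.exp(-lam * omega) * fy / (
        (1.0 - theta) * (1.0 - np.exp(-lam)))

def pgld_hrf(x, alpha, gamma, theta, lam):
    # hazard rate h(x) = f(x) / (1 - F(x)), as plotted in Figure 2
    return pgld_pdf(x, alpha, gamma, theta, lam) / (
        1.0 - pgld_cdf(x, alpha, gamma, theta, lam))
```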
The PDF and hazard rate function (HRF), h(x)=f(x)/(1−F(x)), of the PGLD are plotted in Figure 2, where one can observe that the PDF exhibits decreasing and unimodal shapes, while the HRF exhibits a unimodal shape and may show sudden fluctuations at its end. Such fluctuations usually imply that the product's performance has degraded over time. Non-stationary data can exhibit this characteristic, and the PGLD can help to represent such data. The non-stationary nature of failure times may help the researcher forecast how some items will behave in the environment.
The features and motivations of the PGLD with CDF and PDF (2.9) and (2.10) are summarized in the last section.
In the following section, some important properties of the PGLD are given.
In this section, some important properties of the PGLD, such as the q-th quantile, mode, r-th moment, mean residual lifetime, Bonferroni and Lorenz curves, Rényi and Shannon's entropies, PDF and CDF of the i-th order statistic, are given.
Theorem 3.1. The q-th quantile xq of the PGLD with CDF (2.9) can be obtained as
\begin{equation} x_q = \gamma\left[\left(1+\frac{\ln\left[1-q(1-e^{-\lambda})\right]}{\lambda-\lambda\theta-\theta\ln\left[1-q(1-e^{-\lambda})\right]}\right)^{-\frac{1}{\alpha}}-1\right]. \end{equation} | (3.1) |
Proof. By solving the equation FX(xq)=q with respect to xq, the proof can be achieved immediately.
Remark 3.1. As a particular case, the median of PGLD with CDF (2.9) can be obtained by putting q=1/2 in Eq (3.1).
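Equation (3.1) also doubles as an inverse-transform sampler, since x_U with U ~ Uniform(0,1) follows the PGLD. A small sketch (Python/NumPy assumed; the parameter values are simply the PGLD fits to the first data set in Table 1 and are used here only for illustration):

```python
import numpy as np

def pgld_quantile(q, alpha, gamma, theta, lam):
    # q-th quantile x_q of the PGLD, Eq (3.1)
    L = np.log(1.0 - q * (1.0 - np.exp(-lam)))
    inner = 1.0 + L / (lam - lam * theta - theta * L)
    return gamma * (inner ** (-1.0 / alpha) - 1.0)

# median (Remark 3.1) and inverse-transform sampling; illustrative parameter values
pars = dict(alpha=1.40525, gamma=1.34963, theta=0.94075, lam=0.76069)
median = pgld_quantile(0.5, **pars)
rng = np.random.default_rng(1)
sample = pgld_quantile(rng.uniform(size=1000), **pars)
```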
Theorem 3.2. Let X be a RV subject to the PGLD with PDF (2.10). Then the mode is given by
\begin{equation} x^{*} = \gamma\left(\left(\frac{-D_2+\sqrt{D_2^{2}-4D_1D_3}}{2D_1}\right)^{\frac{1}{\alpha}}-1\right), \end{equation} | (3.2) |
where
D_1 = (\alpha+1)(1-\theta)^{2}, \quad D_2 = 2\theta(1-\theta)(\alpha+1)+\alpha(\lambda-2\theta)(1-\theta), \quad D_3 = \theta^{2}(1-\alpha).
Proof. The mode can be directly obtained by solving \frac{\text{d}\ln[f_X(x)]}{\text{d}x} = 0 with respect to x.
Theorem 3.3. Let X be a RV subject to PGLD with PDF (2.10). Then, for r=1,2,…, the r-th moment of X is given by
m^{(r)} = \sum\limits_{\imath = 0}^{N}\frac{2\nu_{\imath}}{(1-y_{\imath})^{2}}\,\Phi^{(r)}(y^{*}_{\imath}),
where yı,νı are the zeros and the corresponding Christoffel numbers of the Legendre-Gauss quadrature formula on the interval (-1, 1), see Canuto et al. [17].
Proof. The r-th moment of X is given by
m^{(r)} = \int_{0}^{\infty}x^{r}f_X(x)\,\text{d}x = \frac{\alpha\lambda}{\gamma(1-\theta)(1-e^{-\lambda})}\int_{-1}^{1}\frac{2}{(1-y)^{2}}\,\Phi^{(r)}(y^{*})\,\text{d}y,
where y^{*} = \frac{1+y}{1-y}, and
\begin{equation} \Phi^{(r)}(y) = \frac{y^{r}\,\Omega^{2}(y)\,e^{-\lambda\Omega(y)}\left(1+\frac{y}{\gamma}\right)^{-\alpha-1}}{\left(1-\left(1+\frac{y}{\gamma}\right)^{-\alpha}\right)^{2}}. \end{equation} | (3.3) |
The last integral can be approximated, by using Legendre-Gauss quadrature formula, as
\begin{equation} m^{(r)} = \sum\limits_{\imath = 0}^{N}\frac{2\nu_{\imath}}{(1-y_{\imath})^{2}}\,\Phi^{(r)}(y^{*}_{\imath}), \end{equation} | (3.4) |
where
\begin{equation} \nu_{\imath} = \frac{2}{(1-y_{\imath}^{2})\left[L'_{N+1}(y_{\imath})\right]^{2}} \quad \text{and} \quad L'_{N+1}(y_{\imath}) = \left.\frac{\text{d}L_{N+1}(y)}{\text{d}y}\right|_{y = y_{\imath}}, \end{equation} | (3.5) |
and LN(.) is the Legendre polynomial of degree N.
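The quadrature of Theorem 3.3 is straightforward to reproduce with a standard Legendre-Gauss routine; the sketch below (Python/NumPy assumed, with NumPy's leggauss supplying the zeros y_ı and weights ν_ı) evaluates m^{(r)}. Because of the Lomax-type tail of the PGLD, the r-th moment is finite only when r < α, so the routine should only be trusted in that range:

```python
import numpy as np
from numpy.polynomial.legendre import leggauss

def pgld_moment(r, alpha, gamma, theta, lam, N=100):
    # r-th moment m^(r) by the Legendre-Gauss quadrature of Theorem 3.3
    nodes, weights = leggauss(N)            # zeros y_i and Christoffel numbers nu_i
    x = (1.0 + nodes) / (1.0 - nodes)       # y* = (1+y)/(1-y) maps (-1,1) onto (0,inf)
    jac = 2.0 / (1.0 - nodes) ** 2          # Jacobian factor 2/(1-y)^2
    Fy = 1.0 - (1.0 + x / gamma) ** (-alpha)
    omega = (1.0 - theta) * Fy / (1.0 - theta * Fy)
    phi = (x ** r * omega ** 2 * np.exp(-lam * omega)
           * (1.0 + x / gamma) ** (-alpha - 1.0) / Fy ** 2)   # Phi^(r), Eq (3.3)
    const = alpha * lam / (gamma * (1.0 - theta) * (1.0 - np.exp(-lam)))
    return const * np.sum(weights * jac * phi)
```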
Theorem 3.4. The mean residual lifetime of the PGLD is given by
\text{MRL}(x_0) = \frac{2x_0}{e^{-\lambda\Omega(x_0)}-e^{-\lambda}}\sum\limits_{\imath = 0}^{N}\frac{\nu_{\imath}}{(1-y_{\imath})^{2}}\left(e^{-\lambda\Omega\left(\frac{2x_0}{1-y_{\imath}}\right)}-e^{-\lambda}\right),
where \nu_{\imath} is given by (3.5).
Proof. The mean residual lifetime of the PGLD is given by
\text{MRL}(x_0) = E[X-x_0\mid X > x_0] = \frac{1}{S(x_0)}\int_{x_0}^{\infty}S(x)\,\text{d}x = \frac{1}{S(x_0)}\int_{-1}^{1}\frac{2x_0}{(1-y)^{2}}\,S\left(\frac{2x_0}{1-y}\right)\text{d}y,
where S(x)=1−F(x) is the survival function.
The last integral can be approximated, by using Legendre-Gauss quadrature formula, as
\text{MRL}(x_0) = \frac{2x_0}{e^{-\lambda\Omega(x_0)}-e^{-\lambda}}\sum\limits_{\imath = 0}^{N}\frac{\nu_{\imath}}{(1-y_{\imath})^{2}}\left(e^{-\lambda\Omega\left(\frac{2x_0}{1-y_{\imath}}\right)}-e^{-\lambda}\right),
where νı is given by (3.5).
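The same quadrature idea gives a quick numerical evaluation of Theorem 3.4; the following is a sketch under the same Python/NumPy assumptions as above:

```python
import numpy as np
from numpy.polynomial.legendre import leggauss

def pgld_mrl(x0, alpha, gamma, theta, lam, N=100):
    # mean residual lifetime MRL(x0) of Theorem 3.4
    def omega(x):
        Fy = 1.0 - (1.0 + x / gamma) ** (-alpha)
        return (1.0 - theta) * Fy / (1.0 - theta * Fy)
    nodes, weights = leggauss(N)
    x = 2.0 * x0 / (1.0 - nodes)                      # substitution x = 2*x0/(1-y)
    terms = weights / (1.0 - nodes) ** 2 * (np.exp(-lam * omega(x)) - np.exp(-lam))
    return 2.0 * x0 * np.sum(terms) / (np.exp(-lam * omega(x0)) - np.exp(-lam))
```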
Theorem 3.5. Let X be a RV subject to the PGLD with PDF (2.10). Then, the Bonferroni curve (BC) and Lorenz curve (LC) are given, respectively, by
\begin{equation} \text{BC}(\eta) = A_{\eta}\sum\limits_{\imath = 0}^{N}\nu_{\imath}\,\Phi^{(1)}\left(\frac{p_{\eta}}{2}(y_{\imath}+1)\right), \qquad \text{LC}(\eta) = \eta\,A_{\eta}\sum\limits_{\imath = 0}^{N}\nu_{\imath}\,\Phi^{(1)}\left(\frac{p_{\eta}}{2}(y_{\imath}+1)\right), \end{equation} | (3.6) |
where \nu_{\imath} is given by (3.5), 0 < \eta < 1 and
A_{\eta} = \frac{\alpha\lambda\,p_{\eta}}{2\,m^{(1)}\eta\,\gamma(1-\theta)(1-e^{-\lambda})},
p_{\eta} = F^{-1}(\eta) = \gamma\left(\left(1+\frac{\ln\left[1-\eta(1-e^{-\lambda})\right]}{\lambda-\lambda\theta-\theta\ln\left[1-\eta(1-e^{-\lambda})\right]}\right)^{-\frac{1}{\alpha}}-1\right),
and \Phi^{(1)}(\cdot) and m^{(1)} are given, respectively, by (3.3) and (3.4) at r = 1.
Proof. The Bonferroni curve of PGLD is given by
\text{BC}(\eta) = \frac{1}{\eta\,m^{(1)}}\int_{0}^{p_{\eta}}x f_X(x)\,\text{d}x = A_{\eta}\int_{-1}^{1}\Phi^{(1)}\left(\frac{p_{\eta}}{2}(y+1)\right)\text{d}y = A_{\eta}\sum\limits_{\imath = 0}^{N}\nu_{\imath}\,\Phi^{(1)}\left(\frac{p_{\eta}}{2}(y_{\imath}+1)\right).
The Lorenz curve of PGLD is given by
\text{LC}(\eta) = \frac{1}{m^{(1)}}\int_{0}^{p_{\eta}}x f_X(x)\,\text{d}x = \eta\,A_{\eta}\sum\limits_{\imath = 0}^{N}\nu_{\imath}\,\Phi^{(1)}\left(\frac{p_{\eta}}{2}(y_{\imath}+1)\right).
The Bonferroni and Lorenz curves are plotted in Figure 3.
Theorem 3.6. Let X be a RV subject to the PGLD with PDF (2.10). Then, the Rényi and Shannon's entropies of X are given, respectively, by
\begin{equation} \begin{split} \text{RE}(\ell) & = \frac{1}{1-\ell}\left(\ell\ln\left[\frac{\alpha\lambda}{\gamma(1-\theta)(1-e^{-\lambda})}\right]+\ln\left[\sum\limits_{\imath = 0}^{N}\frac{2\nu_{\imath}}{(1-y_{\imath})^{2}}\,W_{\ell}\left(\frac{1+y_{\imath}}{1-y_{\imath}}\right)\right]\right),\\ \text{SHE} & = \ln\left[\frac{\gamma(1-\theta)(1-e^{-\lambda})}{\alpha\lambda}\right]+\frac{2\alpha\lambda}{\gamma(1-\theta)(1-e^{-\lambda})}\sum\limits_{\imath = 0}^{N}\frac{\nu_{\imath}}{(1-y_{\imath})^{2}}\,W^{*}\left(\frac{1+y_{\imath}}{1-y_{\imath}}\right)\Phi^{(0)}\left(\frac{1+y_{\imath}}{1-y_{\imath}}\right), \end{split} \end{equation} | (3.7) |
where νı is given by (3.5) and
W_{\ell}(y) = \frac{\Omega^{2\ell}(y)\,e^{-\lambda\ell\Omega(y)}\left(1+\frac{y}{\gamma}\right)^{-\alpha\ell-\ell}}{\left[1-\left(1+\frac{y}{\gamma}\right)^{-\alpha}\right]^{2\ell}},
W^{*}(y) = -2\ln[\Omega(y)]+\lambda\Omega(y)+(\alpha+1)\ln\left[1+\frac{y}{\gamma}\right]+2\ln\left[1-\left(1+\frac{y}{\gamma}\right)^{-\alpha}\right].
Proof. The Rényi entropy of X is given by
\text{RE}(\ell) = \frac{1}{1-\ell}\ln\left[\int_{0}^{\infty}f_X^{\ell}(x)\,\text{d}x\right],
where ℓ>0 and ℓ≠1.
Based on PDF (2.10), we obtain
\int_{0}^{\infty}f_X^{\ell}(x)\,\text{d}x = \left[\frac{\alpha\lambda}{\gamma(1-\theta)(1-e^{-\lambda})}\right]^{\ell}\int_{0}^{\infty}W_{\ell}(x)\,\text{d}x = \left[\frac{\alpha\lambda}{\gamma(1-\theta)(1-e^{-\lambda})}\right]^{\ell}\sum\limits_{\imath = 0}^{N}\frac{2\nu_{\imath}}{(1-y_{\imath})^{2}}\,W_{\ell}\left(\frac{1+y_{\imath}}{1-y_{\imath}}\right).
Then,
\text{RE}(\ell) = \frac{1}{1-\ell}\left(\ell\ln\left[\frac{\alpha\lambda}{\gamma(1-\theta)(1-e^{-\lambda})}\right]+\ln\left[\sum\limits_{\imath = 0}^{N}\frac{2\nu_{\imath}}{(1-y_{\imath})^{2}}\,W_{\ell}\left(\frac{1+y_{\imath}}{1-y_{\imath}}\right)\right]\right).
The Shannon's entropy of X is given by
\text{SHE} = E\left[-\ln[f_X(X)]\right] = \ln\left[\frac{\gamma(1-\theta)(1-e^{-\lambda})}{\alpha\lambda}\right]+E[W^{*}(X)] = \ln\left[\frac{\gamma(1-\theta)(1-e^{-\lambda})}{\alpha\lambda}\right]+\frac{2\alpha\lambda}{\gamma(1-\theta)(1-e^{-\lambda})}\sum\limits_{\imath = 0}^{N}\frac{\nu_{\imath}}{(1-y_{\imath})^{2}}\,W^{*}\left(\frac{1+y_{\imath}}{1-y_{\imath}}\right)\Phi^{(0)}\left(\frac{1+y_{\imath}}{1-y_{\imath}}\right).
Theorem 3.7. Let X_1,\dots,X_n be a random sample from the PGLD with CDF (2.9) and PDF (2.10). Then, the PDF and CDF of the j-th order statistic, say X_{j:n}, are given, respectively, by
\begin{equation} \begin{split} f_{j:n}(x) = & j\binom{n}{j}\frac{\alpha\lambda}{\gamma(1-\theta)}\,\frac{\Omega^{2}(x)\left(1+\frac{x}{\gamma}\right)^{-\alpha-1}}{\left[1-\left(1+\frac{x}{\gamma}\right)^{-\alpha}\right]^{2}}\\ & \times\sum\limits_{r_1 = 0}^{n-j}\sum\limits_{r_2 = 0}^{j+r_1-1}(-1)^{r_1+r_2}\binom{n-j}{r_1}\binom{j+r_1-1}{r_2}\frac{e^{-\lambda(1+r_2)\Omega(x)}}{(1-e^{-\lambda})^{r_1+j}}, \end{split} \end{equation} | (3.8) |
\begin{equation} F_{j:n}(x) = \sum\limits_{r_3 = j}^{n}\sum\limits_{r_4 = 0}^{n-r_3}\sum\limits_{r_5 = 0}^{r_3+r_4}(-1)^{r_4+r_5}\binom{n}{r_3}\binom{n-r_3}{r_4}\binom{r_3+r_4}{r_5}\frac{e^{-\lambda r_5\Omega(x)}}{(1-e^{-\lambda})^{r_3+r_4}}. \end{equation} | (3.9) |
Proof. The PDF f_{j:n}(x) of the j-th order statistic, see [18,19], is given by
\begin{equation} f_{j:n}(x) = j\binom{n}{j}f_X(x)\left[F_X(x)\right]^{j-1}\left[1-F_X(x)\right]^{n-j}, \end{equation} | (3.10) |
where FX(x) and fX(x) are given by Eqs (2.9) and (2.10), respectively.
Therefore,
\begin{split} f_{j:n}(x) = & j\binom{n}{j}\sum\limits_{r_1 = 0}^{n-j}(-1)^{r_1}\binom{n-j}{r_1}f_X(x)\left[F_X(x)\right]^{j+r_1-1}\\ = & j\binom{n}{j}\frac{\alpha\lambda}{\gamma(1-\theta)}\,\frac{\Omega^{2}(x)\left(1+\frac{x}{\gamma}\right)^{-\alpha-1}}{\left[1-\left(1+\frac{x}{\gamma}\right)^{-\alpha}\right]^{2}}\times\sum\limits_{r_1 = 0}^{n-j}\sum\limits_{r_2 = 0}^{j+r_1-1}(-1)^{r_1+r_2}\binom{n-j}{r_1}\binom{j+r_1-1}{r_2}\frac{e^{-\lambda(1+r_2)\Omega(x)}}{(1-e^{-\lambda})^{r_1+j}}. \end{split}
The CDF F_{j:n}(x), corresponding to PDF (3.10), is given by
\begin{split} F_{j:n}(x) = & \sum\limits_{r_3 = j}^{n}\binom{n}{r_3}\left[F_X(x)\right]^{r_3}\left[1-F_X(x)\right]^{n-r_3}\\ = & \sum\limits_{r_3 = j}^{n}\sum\limits_{r_4 = 0}^{n-r_3}\sum\limits_{r_5 = 0}^{r_3+r_4}(-1)^{r_4+r_5}\binom{n}{r_3}\binom{n-r_3}{r_4}\binom{r_3+r_4}{r_5}\frac{e^{-\lambda r_5\Omega(x)}}{(1-e^{-\lambda})^{r_3+r_4}}. \end{split}
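In practice it is often simpler to evaluate f_{j:n}(x) directly from (3.10) than from the expanded form (3.8); a short sketch, reusing the pgld_pdf and pgld_cdf helpers sketched in Section 2 (Python/SciPy assumed), is:

```python
import numpy as np
from scipy.special import comb

def pgld_order_stat_pdf(x, j, n, alpha, gamma, theta, lam):
    # PDF of the j-th order statistic via Eq (3.10)
    f = pgld_pdf(x, alpha, gamma, theta, lam)
    F = pgld_cdf(x, alpha, gamma, theta, lam)
    return j * comb(n, j) * f * F ** (j - 1) * (1.0 - F) ** (n - j)
```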
Based on units connected in a series-parallel structure, we discuss, in this section, the application of a progressive-stress model to units whose lifetime distribution follows the PGLD with CDF (2.9). We assume that the units are subject to progressive type-II censoring and that the number of surviving units eliminated at each failure follows a binomial distribution.
In the previous literature on the progressive-stress model, the stress is usually considered an increasing linear function of time. In the current article, the stress is instead represented by an increasing nonlinear function of time.
(1) The lifetime of units under normal conditions is governed by CDF (2.9) of PGLD with scale parameter γ.
(2) According to the progressive-stress model, it is assumed that the stress s is a function of time t and affects the scale parameter γ of CDF (2.7) which is also a scale parameter of CDF (2.9).
(3) The parameter γ follows the inverse power law with two parameters c and d. This means that \gamma\equiv\gamma(t) = \frac{1}{c\,(s(t))^{d}}. For the sake of simplicity, we assume from now on that the parameter d takes the value 1.
(4) The progressive stress s(t) is an increasing nonlinear function of time, s(t) = \sinh(vt), which is continuous and differentiable for t > 0.
(5) To start the testing procedure, the total of N units to be tested is split into \hbar\,(\geq 2) groups, each of which consists of n_i units under progressive stress s_i(t) = \sinh(v_i t),\; i = 1,2,\dots,\hbar, such that the stress rates satisfy 0 < v_1 < v_2 < \dots < v_{\hbar}.
(6) At whatever stress level vi,i=1,2,…,ℏ, a unit's failure mechanisms are the same.
(7) The cumulative exposure model holds for the effect of changing stress, see [1].
According to Assumptions 1, 3, and 5, and using Assumption 7, we can write
\begin{equation} \Phi_i(x) = \int_{0}^{x}\frac{1}{\gamma(s_i(t))}\,\text{d}t = \frac{c}{v_i}\left(\cosh(v_i x)-1\right),\; i = 1,2,\dots,\hbar, \end{equation} | (4.1) |
where c is a parameter that should be estimated.
Therefore, if G_{iY}(\cdot) denotes the CDF for a unit in group i under the progressive-stress model, then using Assumption 7, it takes the form
\begin{equation} G_{iY}(x) = F_{iY}(\Phi_i(x)) = 1-\left[1+\frac{c}{v_i}\left(\cosh(v_i x)-1\right)\right]^{-\alpha} = 1-\left[\Psi_i(x)\right]^{-\alpha}, \end{equation} | (4.2) |
where FiY(.) is the CDF (2.7) of LD, included in (2.9), under group i, with scale parameter value equal to 1 and
\begin{equation} \Psi_i(x) = \Phi_i(x)+1, \end{equation} | (4.3) |
where Φi(x) is given by (4.1).
The corresponding PDF of (4.2) is given by
\begin{equation} g_{iY}(x) = \alpha c\,\sinh(v_i x)\left[\Psi_i(x)\right]^{-\alpha-1}. \end{equation} | (4.4) |
Then, by replacing FY(y) and fY(y) with CDF (4.2) and PDF (4.4), respectively, CDF (2.9) under progressive-stress ALT becomes
\begin{equation} F_{iX}(x) = \frac{1-e^{-\lambda\Omega_i(x)}}{1-e^{-\lambda}}. \end{equation} | (4.5) |
The corresponding PDF of (4.5) is given by
\begin{equation} f_{iX}(x) = \frac{\lambda\alpha c\,\Omega_i^{2}(x)\,\sinh(v_i x)\,\Psi_i^{-\alpha-1}(x)}{(1-\theta)(1-e^{-\lambda})\,e^{\lambda\Omega_i(x)}\left(1-\Psi_i^{-\alpha}(x)\right)^{2}}, \end{equation} | (4.6) |
where
\begin{equation} \Omega_i(x) = \frac{(1-\theta)\left(1-\Psi_i^{-\alpha}(x)\right)}{1-\theta\left(1-\Psi_i^{-\alpha}(x)\right)},\; x > 0, \end{equation} | (4.7) |
and Ψi(x) is given by (4.3).
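A compact numerical version of Eqs (4.1)–(4.7), under the same Python/NumPy assumptions as before (the group index i is suppressed and v denotes the stress rate v_i), might look as follows; as with Eq (2.10), the formulas are intended for x > 0:

```python
import numpy as np

def pgld_alt_cdf(x, v, alpha, theta, lam, c):
    # F_iX(x) under progressive stress s(t) = sinh(v t), Eqs (4.1)-(4.5)
    psi = 1.0 + (c / v) * (np.cosh(v * x) - 1.0)        # Psi(x) = Phi(x) + 1
    G = 1.0 - psi ** (-alpha)                           # G_iY(x), Eq (4.2)
    omega = (1.0 - theta) * G / (1.0 - theta * G)       # Omega_i(x), Eq (4.7)
    return (1.0 - np.exp(-lam * omega)) / (1.0 - np.exp(-lam))

def pgld_alt_pdf(x, v, alpha, theta, lam, c):
    # f_iX(x), Eq (4.6)
    psi = 1.0 + (c / v) * (np.cosh(v * x) - 1.0)
    G = 1.0 - psi ** (-alpha)
    omega = (1.0 - theta) * G / (1.0 - theta * G)
    num = lam * alpha * c * omega ** 2 * np.sinh(v * x) * psi ** (-alpha - 1.0)
    den = ((1.0 - theta) * (1.0 - np.exp(-lam))
           * np.exp(lam * omega) * (1.0 - psi ** (-alpha)) ** 2)
    return num / den
```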
There are a variety of censored tests available, and progressive type-II censoring is one of the most widely used. It is implemented under the progressive-stress model as follows: Suppose that, in the i-th group, i = 1,2,\dots,\hbar, n_i units are put on a life test and the experimenter specifies beforehand the quantity w_i, the number of failures that will be observed. When the first failure occurs, r_{i1} of the remaining n_i-1 surviving units are eliminated from the experiment at random. Continuing, when the second failure occurs, r_{i2} of the remaining n_i-r_{i1}-2 surviving units are eliminated from the experiment at random. Finally, when the w_i-th failure occurs, all the remaining r_{iw_i} = n_i-w_i-r_{i1}-r_{i2}-\dots-r_{i(w_i-1)} surviving units are eliminated from the experiment. Note that, in this scheme, r_{i1},r_{i2},\dots,r_{iw_i} are all prefixed. However, in some practical situations, these numbers may occur at random. For example, in some reliability experiments, an experimenter may assess that it is unsuitable or too risky to continue the test on some of the tested units even though these units have not failed. In such situations, the pattern of elimination at each failure is random. Such a situation leads to progressive censoring with random removals, see [20,21].
We discuss, in this section, two estimation methods (maximum likelihood estimation (MLE) and Bayesian estimation (BE)) for the parameters included in CDF (4.5) under progressive type-II censoring with binomial removals.
For group i, i = 1,2,\dots,\hbar, suppose x_{i1}, x_{i2},\dots,x_{in_i} are n_i lifetimes from a population with CDF (4.5) and PDF (4.6), and {\bf x} = (x^{(r_{i1},\dots,r_{iw_i})}_{i1:w_i:n_i}, x^{(r_{i1},\dots,r_{iw_i})}_{i2:w_i:n_i},\dots, x^{(r_{i1},\dots,r_{iw_i})}_{iw_i:w_i:n_i}) is a vector of w_i progressively type-II ordered lifetimes out of n_i with progressive censoring scheme (r_{i1},\dots,r_{iw_i}). Suppose also w_i and r_{ij} are all predetermined before the test. From now on, we will write x_{ij} instead of x^{(r_{i1},\dots,r_{iw_i})}_{ij:w_i:n_i}, j = 1,2,\dots,w_i, for simplicity. Then, based on Eqs (4.2)–(4.7), the conditional likelihood function is given by
\begin{equation} \begin{split} L_1(\alpha,\lambda,\theta,c;{\bf x}\mid {\bf R} = {\bf r}) & \propto \prod\limits_{i = 1}^{\hbar}\prod\limits_{j = 1}^{w_i}f_{iX}(x_{ij})\left[1-F_{iX}(x_{ij})\right]^{r_{ij}}\\ & = \left(\frac{\lambda\alpha c}{1-\theta}\right)^{\sum\limits_{i = 1}^{\hbar}w_i}\times\prod\limits_{i = 1}^{\hbar}\prod\limits_{j = 1}^{w_i}\frac{\Omega_i^{2}(x_{ij})\,\text{sinh}(v_i\,x_{ij})\,\Psi_i^{-\alpha-1}(x_{ij})}{(1-e^{-\lambda})^{r_{ij}+1}\,e^{\lambda\Omega_i(x_{ij})}\left(1-\Psi_i^{-\alpha}(x_{ij})\right)^{2}}\left(e^{-\lambda\Omega_i(x_{ij})}-e^{-\lambda}\right)^{r_{ij}}. \end{split} \end{equation} | (5.1) |
The numbers r_{ij} may occur at random in some practical scenarios as a result of the unanticipated dropout of experimental units. Therefore, we assume that R_{ij}\,(i = 1,2,\dots,\hbar,\; j = 1,2,\dots,w_i-1) are RVs, with realizations r_{ij}, subject to the following binomial distributions:
R_{i1}\sim b(n_i-w_i,\,p),
whereas,
\left(R_{ij}\mid r_{i1},r_{i2},\dots,r_{i(j-1)}\right)\sim b\left(n_i-w_i-\sum\limits_{k = 1}^{j-1}r_{ik},\; p\right),\; j = 2,3,\dots,w_i-1.
Then
\begin{equation} L(\alpha,\lambda,\theta,c;{\bf x}) = L_1(\alpha,\lambda,\theta,c;{\bf x}\mid {\bf R} = {\bf r})\,P(R_{i1} = r_{i1})\times\prod\limits_{k = 2}^{w_i-1}P\left(R_{ik} = r_{ik}\mid R_{i1} = r_{i1},\dots,R_{i(k-1)} = r_{i(k-1)}\right), \end{equation} | (5.2) |
where, for fixed i and j, r_{ij} is an integer satisfying 0\leq r_{ij}\leq n_i-w_i-(r_{i1}+r_{i2}+\dots+r_{i(j-1)}),\; i = 1,2,\dots,\hbar,\; j = 2,3,\dots,w_i-1, and
P\left(R_{ij} = r_{ij}\mid R_{i(j-1)} = r_{i(j-1)},\dots,R_{i1} = r_{i1}\right) = \binom{n_i-w_i-\sum_{k = 1}^{j-1}r_{ik}}{r_{ij}}\,p^{r_{ij}}\,(1-p)^{n_i-w_i-\sum_{k = 1}^{j}r_{ik}}.
Furthermore, suppose that Rij is independent of Xij for i=1,…,ℏ, j=1,…,wi. Then
\begin{equation} \begin{split} P({\bf R} = {\bf r}) & = \prod\limits_{i = 1}^{\hbar}\left[P\left(R_{i(w_i-1)} = r_{i(w_i-1)}\mid R_{i(w_i-2)} = r_{i(w_i-2)},\dots,R_{i1} = r_{i1}\right)\cdots P(R_{i1} = r_{i1})\right]\\ & = \prod\limits_{i = 1}^{\hbar}\left[\frac{(n_i-w_i)!}{\left(n_i-w_i-\sum_{k = 1}^{w_i-1}r_{ik}\right)!\,\prod_{k = 1}^{w_i-1}r_{ik}!}\;p^{\sum_{k = 1}^{w_i-1}r_{ik}}\,(1-p)^{(w_i-1)(n_i-w_i)-\sum_{k = 1}^{w_i-1}r_{ik}(w_i-k)}\right]. \end{split} \end{equation} | (5.3) |
Since P({\bf R} = {\bf r}) does not depend on the parameters (\alpha,\lambda,\theta,c), their MLEs can be derived by maximizing (5.1). Similarly, since (5.1) does not involve the binomial parameter p, the MLE of p can be found by maximizing (5.3) directly. Thus,
\frac{\partial(\ln L)}{\partial p} = 0 = \frac{1}{p}\sum\limits_{i = 1}^{\hbar}\sum\limits_{k = 1}^{w_i-1}r_{ik}-\sum\limits_{i = 1}^{\hbar}\left\{\frac{(w_i-1)(n_i-w_i)-\sum_{k = 1}^{w_i-1}(w_i-k)r_{ik}}{1-p}\right\}.
Therefore,
\begin{equation} \hat{p} = \frac{\sum_{i = 1}^{\hbar}\sum_{k = 1}^{w_i-1}r_{ik}}{\sum_{i = 1}^{\hbar}(w_i-1)(n_i-w_i)-\sum_{i = 1}^{\hbar}\sum_{k = 1}^{w_i-1}(w_i-k-1)r_{ik}}. \end{equation} | (5.4) |
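Equation (5.4) is a simple closed form; a sketch of its computation from the observed removal counts (Python/NumPy assumed, with r[i] holding r_{i1},…,r_{i,w_i−1} for group i; the names are ours) is:

```python
import numpy as np

def p_hat(r, n, w):
    # MLE of the binomial removal probability p, Eq (5.4);
    # r[i] lists the removals r_{i1}, ..., r_{i,w_i-1} for group i
    num, den = 0.0, 0.0
    for i in range(len(n)):
        ri = np.asarray(r[i], dtype=float)
        k = np.arange(1, w[i])                 # k = 1, ..., w_i - 1
        num += ri.sum()
        den += (w[i] - 1) * (n[i] - w[i]) - np.sum((w[i] - k - 1) * ri)
    return num / den

# e.g., two groups with n_i = 40, w_i = 20 and observed removal lists r1, r2:
# p = p_hat(r=[r1, r2], n=[40, 40], w=[20, 20])
```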
The local Fisher information matrix, {\bf{I}}, for (\hat{\alpha},\hat{\lambda},\hat{\theta},\hat{c}) is the 4\times 4 symmetric matrix of negative second partial derivatives of £ = \ln L_1 with respect to \alpha , \lambda , \theta , and c , see [1]. Let (\varphi_1 = \alpha, \, \varphi_2 = \lambda, \, \varphi_3 = \theta, \, \varphi_4 = c) . Therefore, matrix {\bf{I}} is given by
\begin{equation*} {\bf{I}} = -\left(\frac{\partial^2\hat{£}}{\partial \varphi_{i} \partial \varphi_{j}} \right)_{4\times4}, i,j = 1,\dots,4, \end{equation*} |
where the caret \hat{} denotes that the derivative is computed at (\hat{\varphi_1} = \hat{\alpha}, \, \hat{\varphi_2} = \hat{\lambda}, \, \hat{\varphi_3} = \hat{\theta}, \, \hat{\varphi_4} = \hat{c}) . It is easy to get the matrix's elements.
The local estimate {\bf{V}} of the asymptotic variance-covariance matrix of (\hat{\alpha}, \hat{\lambda}, \hat{\theta}, \hat{c}) can be obtained by inverting matrix {\bf{I}} . Therefore,
\begin{equation} {\bf{V}} = {\bf{I^{-1}}} = \left(\text{cov}\left(\varphi_{i},\varphi_{j}\right) \right)_{4\times4}, i,j = 1,\dots,4. \end{equation} | (5.5) |
The sampling distribution of \frac{\hat{\varphi_i}-\varphi_i}{\sqrt{\text{var}(\hat{\varphi_i})}}, \; i = 1, \dots, 4, follows the general asymptotic theory of MLEs and hence it may be approximated by a standard normal distribution which can be used to create confidence intervals (CIs) for unknown parameters.
Therefore, two-sided (1-\eta^{\star})100\% asymptotic CIs for the parameters \varphi_i, \; i = 1, \dots, 4 , can then be created as follows:
\hat{\varphi_i}\mp z_{\eta^{\star}/2}\sqrt{\text{var}(\hat{\varphi_i})} |
where z_{\eta^{\star}/2} is the value of a standard normal RV that leaves an area \eta^{\star}/2 to the right and \sqrt{\text{var}(\hat{\varphi_i})} can be determined from (5.5).
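A sketch of the resulting Wald-type intervals, given the observed information matrix {\bf I} evaluated at the MLEs (Python/NumPy/SciPy assumed; the function name is ours):

```python
import numpy as np
from scipy.stats import norm

def wald_ci(estimates, info_matrix, level=0.95):
    # two-sided asymptotic CIs built from the local Fisher information, Eq (5.5)
    V = np.linalg.inv(info_matrix)            # variance-covariance matrix V = I^{-1}
    se = np.sqrt(np.diag(V))                  # standard errors of the MLEs
    z = norm.ppf(1.0 - (1.0 - level) / 2.0)   # z_{eta*/2}
    est = np.asarray(estimates)
    return np.column_stack((est - z * se, est + z * se))
```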
In practical applications, it is often beneficial to have an idea of how long a test will last, because the time taken to complete the test is proportional to its cost. For i = 1, \dots, \hbar , the expected termination time under the progressive type-II censoring scheme with binomial removals is given by, see [22],
\begin{equation} \begin{split} E[X_{iw_i} \mid {{\bf R}_i} = {{\bf r}_i}] = &C({{\bf r}_i})\sum\limits_{\ell_1 = 0}^{r_{i1}}\dots \sum\limits_{\ell_{w_i} = 0}^{r_{iw_i}}(-1)^{\mathcal{A}_i} \frac{ \left( \begin{array}{c} {r_{i1}}\\ {\ell_1} \\ \end{array} \right) \dots \left( \begin{array}{c} {r_{iw_i}}\\ {\ell_{w_i}} \\ \end{array} \right) }{\prod_{k = 1}^{w_i-1}h(\ell_k)} \int_{0}^{\infty}x f_X(x) F_X^{h(\ell_{w_i})-1}(x)\text{d}x, \end{split} \end{equation} | (5.6) |
where {{\bf R}_i} = (R_{i1} = r_{i1}, \dots, R_{i(w_i-1)} = r_{i(w_i-1)}) , ( F_X(x), f_X(x) ) is given by ((4.5), (4.6)), \mathcal{A}_i = \sum_{k = 1}^{w_i}\ell_k , h(\ell_k) = \ell_1+\dots+ \ell_k+k , and C({{\bf r}_i}) = n_i(n_i-r_{i1}-1)(n_i-r_{i1}-r_{i2}-2)\dots(n_i-\sum_{k = 1}^{w_i-1}(r_{ik}+1)) . Therefore,
\begin{equation*} \begin{split} E[X_{iw_i}] = &E_{{\bf R}}[E(X_{iw_i} \mid {{\bf R}_i})]\\ = & \sum\limits_{r_{i1} = 0}^{q(r_{i1})} \dots \sum\limits_{r_{i(w_i-1)} = 0}^{q(r_{i(w_i-1)})}P({\bf R} = {\bf r})E(X_{iw_i}\mid {\bf R}), \end{split} \end{equation*} |
where q(r_{i1}) = n_i-w_i, \, q(r_{ij}) = n_i-w_i-r_{i1}-r_{i2}-\dots-r_{i(j-1)}, \, i = 1, 2, \dots, \hbar, \, j = 2, 3, \dots, w_i-1, and P({\bf R} = {\bf r}) is given by Eq (5.3). Then compute the ratio of expected experiment time (REET) as follows:
REET = \frac{E(X_{iw_i})}{E(X_{in_i})}. |
Based on two asymmetric loss functions (general entropy (GE) and linear exponential (LINEX)), we discuss the Bayesian estimation of the parameters of CDF (4.5). Symmetric loss functions may be unsuitable in many real-life situations because they weigh overestimation and underestimation of the parameters equally, whereas overestimating the parameters can have more (or less) severe repercussions than underestimating them. As a result, asymmetric loss functions have been the subject of research, see [23,24].
The LINEX loss function was suggested by Varian [25]. It is given by
\mathcal{L}(\xi)\propto e^{\nu \xi}- \nu \xi-1, \nu\neq 0, |
where \xi = \tilde{\Theta}-\Theta and \tilde{\Theta} is the LINEX estimator of \Theta .
The Bayes estimate under LINEX (BEL) loss function of \Theta is given by
\begin{equation} \begin{split} \tilde{\Theta} = \frac{-1}{\nu}\ln[E({e^{-\nu \Theta}}|{{\bf{x}}})]. \end{split} \end{equation} | (5.7) |
The GE loss function was introduced by Calabria and Pulcini [26]. It is given by
\mathcal{L}(\ddot{\Theta},\Theta)\propto \left(\frac{\ddot{\Theta}}{\Theta}\right)^{\nu}-\nu\ln\left(\frac{\ddot{\Theta}}{\Theta}\right)-1, \nu\neq 0. |
The Bayes estimate under GE (BEG) loss function of \Theta is given by
\begin{equation} \begin{split} \ddot{\Theta} = \left[E(\Theta^{-\nu})\right]^{\frac{-1}{\nu}}. \end{split} \end{equation} | (5.8) |
Assume that the experimenter's prior belief is measured by a function \vartheta(\alpha, \lambda, \theta, c) , with all parameters being independent and having log-normal distributions except \theta , which has a beta distribution. Therefore, if (\varphi_1 = \alpha, \, \varphi_2 = \lambda, \varphi_3 = c, \varphi_4 = \theta) , then the prior function of \varphi_i , i = 1, 2, 3 , is given by
\begin{equation} \vartheta_i(\varphi_i) = \frac{1}{\sigma_i\varphi_i \sqrt{2 \pi}}\text{exp}\left\{-\frac{1}{2}\left(\frac{\text{ln } \varphi_i-\mu_i}{\sigma_i}\right)^{2}\right\},\; \varphi_i > 0,\; (-\infty < \mu_i < \infty,\, \sigma_i > 0), \end{equation} | (5.9) |
and the prior function of \varphi_4 is given by
\begin{equation} \vartheta_4(\varphi_4) = \frac{1}{B(a_1,a_2)}\varphi_4^{a_1-1}(1-\varphi_4)^{a_2-1},\; 0 < \varphi_4 < 1, \, (a_1,a_2 > 0). \end{equation} | (5.10) |
The joint prior density function is then calculated as follows:
\begin{equation} \vartheta(\alpha,\lambda,\theta,c) = \vartheta_{1}(\alpha)\, \vartheta_{2}(\lambda)\, \vartheta_{3}(c)\, \vartheta_{4}(\theta)\propto \frac{1}{\alpha\lambda c}e^{-\Delta}\theta^{a_1-1}(1-\theta)^{a_2-1}, \end{equation} | (5.11) |
\Delta = \frac{1}{2}\left\{\left(\frac{\text{ln } \alpha-\mu_1}{\sigma_1}\right)^{2}+\left(\frac{\text{ln } \lambda-\mu_2}{\sigma_2}\right)^{2}+\left(\frac{\text{ln } c-\mu_3}{\sigma_3}\right)^{2}\right\}. |
From (5.1) and (5.11), the joint posterior density function is then calculated as follows:
\begin{equation} \begin{split} \vartheta^{*}(\alpha,\lambda,\theta,c|{{\bf{x}}}, {{\bf{r}}})& = K^{-1}\frac{\alpha^{\sum\limits_{i = 1}^\hbar w_i-1} \lambda^{\sum\limits_{i = 1}^\hbar w_i-1} c^{\sum\limits_{i = 1}^\hbar w_i-1}}{(1-e^{-\lambda})^{\sum\limits_{i = 1}^{\hbar}(w_i+\sum\limits_{j = 1}^{w_i}r_{ij})}} \frac{\theta^{a_1-1}e^ {-\Delta}}{(1-\theta)^{\sum\limits_{i = 1}^{\hbar}w_i-a_2+1}}\\\\& \times \prod\limits_{i = 1}^{\hbar}\prod\limits_{j = 1}^{w_i} \frac{\Omega_{i}^{2}(x_{ij})\text{ sinh}(v_i\,x_{ij})(e^{-\lambda \Omega_i(x_{ij})}-e^{-\lambda})^{r_{ij}}}{[1-(\Psi_{i}(x_{ij}))^{-\alpha}]^2[\Psi_{i}(x_{ij})]^{\alpha+1}e^{\lambda \Omega_i(x_{ij})}}, \end{split} \end{equation} | (5.12) |
where
K = \int_{0}^{\infty}\int_{0}^{1}\int_{0}^{\infty}\int_{0}^{\infty}\vartheta(\alpha,\lambda,\theta,c) L(\alpha,\lambda,\theta,c)\text{ d}\alpha\text{ d}\lambda\text{ d}\theta\text{ d}c. |
It can be noted that K involves a quadruple integral that cannot be reduced to a closed form, and hence generating samples directly from the joint posterior density is not possible. The MCMC algorithm, presented in the following subsection, can be implemented in this case, for which we need the following conditional posterior distributions of the parameters \alpha , \lambda , \theta and c ,
\begin{equation} \left. \begin{split} \vartheta^{*}(\alpha\mid \lambda,\theta,c)& \propto \alpha^{\sum\limits_{i = 1}^{\hbar}w_i-1}\text{exp}\left[-\frac{1}{2}\left(\frac{\text{ln }\alpha-\mu_1}{\sigma_1}\right)^{2}\right] \\& \times\prod\limits_{i = 1}^{\hbar}\prod\limits_{j = 1}^{w_i}\frac{\Omega_{i}^{2}(x_{ij})(e^{-\lambda \Omega_i(x_{ij})}-e^{-\lambda})^{r_{ij}}}{[1-(\Psi_{i}(x_{ij}))^{-\alpha}]^2[\Psi_{i}(x_{ij})]^{\alpha+1}e^{\lambda \Omega_i(x_{ij})}}, \\ \vartheta^{*}(\lambda\mid \alpha,\theta,c)& \propto \frac{\lambda^{\sum\limits_{i = 1}^\hbar w_i-1} }{(1-e^{-\lambda})^{\sum\limits_{i = 1}^{\hbar}(w_i+\sum\limits_{j = 1}^{w_i}r_{ij})}} \text{exp}\left[-\frac{1}{2}\left(\frac{\text{ln }\lambda-\mu_2}{\sigma_2}\right)^{2}\right] \\ & \times \prod\limits_{i = 1}^{\hbar}\prod\limits_{j = 1}^{w_i} \frac{(e^{-\lambda \Omega_i(x_{ij})}-e^{-\lambda})^{r_{ij}}}{e^{\lambda \Omega_i(x_{ij})}}, \\ \vartheta^{*}(\theta\mid \alpha,\lambda,c)& \propto \frac{\theta^{a_1-1}}{(1-\theta)^{\sum\limits_{i = 1}^{\hbar}w_i-a_2+1}}\prod\limits_{i = 1}^{\hbar}\prod\limits_{j = 1}^{w_i} \frac{\Omega_{i}^{2}(x_{ij})(e^{-\lambda \Omega_i(x_{ij})}-e^{-\lambda})^{r_{ij}}}{e^{\lambda \Omega_i(x_{ij})}}, \\ \vartheta^{*}(c\mid \alpha,\lambda,\theta)&\propto c^{\sum\limits_{i = 1}^{\hbar}w_i-1}\text{exp}\left[-\frac{1}{2}\left(\frac{\text{ln }c-\mu_3}{\sigma_3}\right)^{2}\right] \\ & \times\prod\limits_{i = 1}^{\hbar}\prod\limits_{j = 1}^{w_i}\frac{\Omega_{i}^{2}(x_{ij})(e^{-\lambda \Omega_i(x_{ij})}-e^{-\lambda})^{r_{ij}}}{[1-(\Psi_{i}(x_{ij}))^{-\alpha}]^2[\Psi_{i}(x_{ij})]^{\alpha+1}e^{\lambda \Omega_i(x_{ij})}}. \end{split}\right\} \end{equation} | (5.13) |
The Metropolis-Hastings technique can be used when a conditional posterior distribution is not one of the known parametric distributions. It can then be used, within the MCMC algorithm, to generate samples from the conditional posterior distributions, see [27].
The following procedure can be used to compute BELs and BEGs of \alpha , \lambda , \theta , and c ,
(1) Assign some initial values of \alpha , \lambda , \theta and c say \alpha_{0} , \lambda_{0} , \theta_{0} and c_{0} .
(2) For i = 1 , using Metropolis-Hastings technique, generate \alpha_{i} , \lambda_{i} , \theta_{i} and c_{i} from the conditional posterior distributions presented in (5.13).
(3) Repeat Step 2, \mathbb{M} times.
(4) Calculate the BELs of \alpha , \lambda , \theta and c , using Eq (5.7), as
\begin{equation} \left.\begin{split} \tilde{\alpha}& = \frac{-1}{\nu}\ln\left[ \frac{1}{\mathbb{M}-\mathbb{W}}\sum\limits_{i = \mathbb{W}+1}^{\mathbb{M}}e^{-\nu \alpha_{i}}\right], \tilde{\lambda} = \frac{-1}{\nu}\ln\left[ \frac{1}{\mathbb{M}-\mathbb{W}}\sum\limits_{i = \mathbb{W}+1}^{\mathbb{M}}e^{-\nu \lambda_{i}}\right],\\ \tilde{\theta}& = \frac{-1}{\nu}\ln\left[ \frac{1}{\mathbb{M}-\mathbb{W}}\sum\limits_{i = \mathbb{W}+1}^{\mathbb{M}}e^{-\nu \theta_{i}}\right], \tilde{c} = \frac{-1}{\nu}\ln\left[ \frac{1}{\mathbb{M}-\mathbb{W}}\sum\limits_{i = \mathbb{W}+1}^{\mathbb{M}}e^{-\nu c_{i}}\right], \end{split}\right\} \end{equation} | (5.14) |
where \mathbb{W} is the burn-in period.
(5) Calculate the BEGs of \alpha , \lambda , \theta and c , using Eq (5.8), as follows:
\begin{equation} \left.\begin{split} \ddot{\alpha}& = \left( \frac{1}{\mathbb{M}-\mathbb{W}}\sum\limits_{i = \mathbb{W}+1}^{\mathbb{M}}\alpha_{i}^{-\nu}\right)^{\frac{-1}{\nu}}, \ddot{\lambda} = \left( \frac{1}{\mathbb{M}-\mathbb{W}}\sum\limits_{i = \mathbb{W}+1}^{\mathbb{M}}\lambda_{i}^{-\nu}\right)^{\frac{-1}{\nu}},\\ \ddot{\theta}& = \left( \frac{1}{\mathbb{M}-\mathbb{W}}\sum\limits_{i = \mathbb{W}+1}^{\mathbb{M}}\theta_{i}^{-\nu}\right)^{\frac{-1}{\nu}}, \ddot{c} = \left( \frac{1}{\mathbb{M}-\mathbb{W}}\sum\limits_{i = \mathbb{W}+1}^{\mathbb{M}}c_{i}^{-\nu}\right)^{\frac{-1}{\nu}}. \end{split}\right\} \end{equation} | (5.15) |
(6) The highest posterior density (HPD) credible interval of parameter \alpha can be computed by arranging in an ascending order \alpha_{\mathbb{W}+1} , \dots, \alpha_{\mathbb{M}} as \alpha_{[1]} < \dots < \alpha_{[\mathbb{M}-\mathbb{W}]} , then compute the lower and upper bounds of all (1-\eta^{\star})100\% credible intervals of \alpha as follows:
\left(\alpha_{[1]},\alpha_{[(\mathbb{M}-\mathbb{W})(1-\eta^{\star})+1]}\right),\dots,\left(\alpha_{[(\mathbb{M}-\mathbb{W})\eta^{\star}]}, \alpha_{[\mathbb{M}-\mathbb{W}]}\right), |
where [x] denotes the largest integer number less than or equal to x . Then the HPD credible interval of \alpha is the one with the shortest length. The HPD credible interval of \lambda , \theta and c can also be computed in the same way.
(7) The symmetric credible intervals of \alpha can be computed as follows:
\left(\alpha_{[(\mathbb{M}-\mathbb{W})\frac{\eta^{\star}}{2}]}, \alpha_{[(\mathbb{M}-\mathbb{W})(1-\frac{\eta^{\star}}{2})]}\right). |
The symmetric credible intervals of \lambda , \theta and c can also be computed in the same way.
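Given the MCMC draws of a parameter after the burn-in period \mathbb{W}, Steps 4–7 reduce to a few array operations; the following is a sketch under our Python/NumPy assumptions (function names are ours):

```python
import numpy as np

def bel(draws, nu):
    # Bayes estimate under the LINEX loss, Eqs (5.7) and (5.14)
    return -np.log(np.mean(np.exp(-nu * draws))) / nu

def beg(draws, nu):
    # Bayes estimate under the general entropy loss, Eqs (5.8) and (5.15)
    return np.mean(draws ** (-nu)) ** (-1.0 / nu)

def hpd_interval(draws, level=0.95):
    # shortest (HPD) credible interval from the sorted draws, Step 6
    s = np.sort(draws)
    k = int(np.floor(level * len(s)))
    widths = s[k:] - s[:len(s) - k]
    j = int(np.argmin(widths))
    return s[j], s[j + k]

# e.g., for the chain of alpha after discarding the burn-in period W:
# alpha_draws = alpha_chain[W:]
# bel(alpha_draws, nu=0.5), beg(alpha_draws, nu=0.5), hpd_interval(alpha_draws)
```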
Two real data sets are considered, in this section, for fitting and comparing the PGLD, Poisson-Lomax distribution (PLD) [28], geometric-Lomax distribution (GLD), exponentiated LD (ELD) [29], and LD with CDF (2.7).
The CDFs of PLD, GLD, and ELD are given, respectively, by
\begin{equation} \left. \begin{split} F_{PLD}& = \frac{e^{-\lambda \left(1+\frac{x}{\gamma}\right)^{-\alpha} } -e^{-\lambda}} {1-e^{-\lambda}}, \\ F_{GLD}& = \frac{(1-\theta)\left[1-\left(1+\frac{x}{\gamma}\right)^{-\alpha}\right]}{1-\theta\left[1-\left(1+\frac{x}{\gamma}\right)^{-\alpha}\right]},\\ F_{ELD}& = \left[1-\left(1+\frac{x}{\gamma}\right)^{-\alpha}\right]^{\lambda}.\\ \end{split}\right\} \end{equation} | (6.1) |
It can be noticed that the GLD is a special case of the PGLD as \lambda\rightarrow 0^+ .
● The first data set is taken from [30]. It consists of 72 exceedances for the years from 1958 to 1984, rounded to one decimal place. The data are given as follows:
1.7, 2.2, 14.4, 1.1, 0.4, 20.6, 5.3, 0.7, 1.9, 13.0, 12.0, 9.3, 1.4, 18.7, 8.5, 25.5, 11.6, 14.1, 22.1, 1.1, 2.5, 14.4, 1.7, 37.6, 0.6, 2.2, 39.0, 0.3, 15.0, 11.0, 7.3, 22.9, 1.7, 0.1, 1.1, 0.6, 9.0, 1.7, 7.0, 20.1, 0.4, 2.8, 14.1, 9.9, 10.4, 10.7, 30.0, 3.6, 5.6, 30.8, 13.3, 4.2, 25.5, 3.4, 11.9, 21.5, 27.6, 36.4, 2.7, 64.0, 1.5, 2.5, 27.4, 1.0, 27.1, 20.2, 16.8, 5.3, 9.7, 27.5, 2.5, 27.0.
● The second data set represents the marks of 48 slow-pace students in Mathematics in the final examination of the Indian Institute of Technology, Kanpur, in the year 2003, [31]. The data are given as follows:
29, 25, 50, 15, 13, 27, 15, 18, 7, 7, 8, 19, 12, 18, 5, 21, 15, 86, 21, 15, 14, 39, 15, 14, 70, 44, 6, 23, 58, 19, 50, 23, 11, 6, 34, 18, 28, 34, 12, 37, 4, 60, 20, 23, 40, 65, 19, 31.
The Kolmogorov-Smirnov (K-S) statistic and its corresponding p-value are used to check the validity of the PGLD, PLD, GLD, ELD, and LD to fit the above two data sets. The results are shown in Table 1. Also, the results of a comparison among these five distributions using some criteria such as the Akaike information criterion (AIC), consistent AIC (CAIC), and Bayesian information criterion (BIC), are shown in Table 1, where
\text{AIC} = 2\,r-2£(\hat{\beta}), \text{CAIC} = \frac{2\,r\, w}{w-r-1}-2£(\hat{\beta}), \text{BIC} = r\,\ln[w]-2£(\hat{\beta}), |
The first data set | |||||||||
Model | \hat{\alpha} | \hat{\gamma} | \hat{\theta} | \hat{\lambda} | K-S | p-value | AIC | CAIC | BIC |
PGLD | 1.40525 | 1.34963 | 0.94075 | 0.76069 | 0.109102 | 0.35816 | 522.134 | 522.731 | 531.241 |
PLD | 1.12565 | 5.44626 | --- | 0.07480 | 0.178738 | 0.02010 | 525.967 | 526.320 | 532.797 |
GLD | 1.00741 | 2.49201 | 0.60639 | --- | 0.111083 | 0.33669 | 525.289 | 525.642 | 532.119 |
ELD | 0.66113 | 1.19104 | --- | 1.59821 | 0.165565 | 0.03861 | 536.672 | 537.025 | 543.502 |
LD | 1.10933 | 4.34354 | --- | --- | 0.233859 | 0.00076 | 529.953 | 530.127 | 534.506 |
The second data set | |||||||||
Model | \hat{\alpha} | \hat{\gamma} | \hat{\theta} | \hat{\lambda} | K-S | p-value | AIC | CAIC | BIC |
PGLD | 2.49409 | 0.63477 | 0.99986 | 0.20566 | 0.05039 | 0.99971 | 402.596 | 403.526 | 410.08 |
PLD | 149.855 | 3788.06 | --- | 0.72799 | 0.11165 | 0.58773 | 412.330 | 412.875 | 417.943 |
GLD | 1.42011 | 5.41074 | 0.84701 | --- | 0.21077 | 0.02811 | 425.597 | 426.143 | 431.211 |
ELD | 0.27741 | 2.35554 | --- | 0.79642 | 0.31822 | 0.00012 | 498.599 | 499.145 | 504.213 |
LD | 22.3092 | 552.211 | --- | --- | 0.19730 | 0.04765 | 413.502 | 413.769 | 417.244 |
where £(\hat{\beta}) stands for the log-likelihood function calculated at the MLE \hat{\beta} of \beta , r and w denote the number of parameters and the sample size, respectively.
According to the values of the K-S statistic and its corresponding p-value presented in Table 1, it is worth noting that the PGLD has smaller K-S values (larger p-values) than the PLD, GLD, ELD, and LD. Therefore, the PGLD fits these data better than the other four distributions. Since the PGLD also has the smallest values of AIC, CAIC, and BIC, this is another indicator of its superiority. Figure 4 shows the comparison graphically by plotting the empirical CDF against the CDFs of the PGLD, PLD, GLD, ELD, and LD. In Table 1, we note that the PLD, ELD, and LD (GLD, ELD, and LD) do not fit the first (second) data set well (based on the p-value ( < 0.05 )), but we use them for comparison purposes.
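The comparison in Table 1 can be reproduced, for any of the five candidate CDFs, with a short helper along the following lines (Python/SciPy assumed; kstest accepts a callable CDF). In the example call, the PGLD log-likelihood for the first data set is recovered from AIC = 2r − 2£(β̂), i.e., £ = (8 − 522.134)/2 ≈ −257.067:

```python
import numpy as np
from scipy.stats import kstest

def fit_summary(data, loglik, n_params, cdf):
    # K-S statistic/p-value and the AIC, CAIC, BIC criteria used in Table 1;
    # `loglik` is the maximized log-likelihood and `cdf` the fitted CDF as a callable
    w = len(data)
    res = kstest(data, cdf)
    aic = 2 * n_params - 2 * loglik
    caic = 2 * n_params * w / (w - n_params - 1) - 2 * loglik
    bic = n_params * np.log(w) - 2 * loglik
    return dict(KS=res.statistic, p=res.pvalue, AIC=aic, CAIC=caic, BIC=bic)

# e.g., PGLD fit to the first data set, using the pgld_cdf sketch from Section 2:
# fit_summary(data1, loglik=-257.067, n_params=4,
#             cdf=lambda x: pgld_cdf(x, 1.40525, 1.34963, 0.94075, 0.76069))
```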
In this section, a simulation study is performed to evaluate the performance of the estimation methods presented in Section 5. The MLEs, BELs and BEGs of the parameters \alpha , \lambda , \theta and c are computed and compared, through their mean squared errors (MSEs) and relative absolute biases (RABs), via a Monte Carlo simulation. The 95\% CIs, symmetric and HPD credible intervals are also computed and compared through their average interval lengths (AILs). The following algorithm is used to perform a simulation study:
(1) For i = 1, \dots, \hbar , generate the values of progressive censoring with binomial removals, r_{ij} , such that R_{ij}\sim Binomial (n_{i}-w_{i}-\sum_{k = 1}^{j-1}R_{ik}, p) distribution, where p is the removal probability, j = 1, \dots, w_{i}-1 and R_{i w_{i}} \; = \; n_{i} \; - \; w_{i} \; - \; \sum_{j = 1}^{w_{i}-1} \; R_{ij} .
(2) For given values of the prior parameters ( \mu_{1} , \mu_{2} , \mu_{3} , \sigma_{1} , \sigma_{2} , \sigma_{3} , a_{1} , a_{2} ), generate values for the parameters ( \alpha , \lambda , \theta, c ).
(3) For i = 1, \dots, \hbar , generate a progressively type-II censored sample of size w_{i} from PGLD with CDF (2.9), according to the algorithm given in [32].
(4) The MLEs, BELs and BEGs of the parameters \alpha , \lambda , \theta and c are computed as shown in Section 5. The BELs and BEGs of the parameters \alpha , \lambda , \theta and c are computed based on \mathbb{M}( = 5500) MCMC samples and discard the first 500 values as burn-in period.
(5) Repeat the above steps \mathbb{N}( = 1,000) times.
(6) If \hat{\Theta} is an estimate of \Theta , then the average of estimates, MSE and RAB of \hat{\Theta} over \mathbb{N} samples are given, respectively, by
\begin{split} \overline{\widehat{\Theta}} = \frac{1}{\mathbb{N}}\sum\limits_{i = 1}^{\mathbb{N}}\hat{\Theta}_{i}, \text{MSE}(\hat{\Theta}) = \frac{1}{\mathbb{N}}\sum\limits_{i = 1}^{\mathbb{N}}(\hat{\Theta}_{i}-\Theta)^{2}, \text{RAB}(\hat{\Theta}) = \frac{1}{\mathbb{N}}\sum\limits_{i = 1}^{\mathbb{N}}\frac{|\hat{\Theta}_{i}-\Theta|}{\Theta}. \end{split} |
(7) Calculate the average of estimates of the parameters \alpha , \lambda , \theta and c and their MSEs and RABs as shown in Step 6. Calculate also the average of the MSEs (MMSE) and the average of the RABs (MRAB) over the four parameter estimates.
(8) Calculate the 95\% CIs (as shown in Subsection 5.1), symmetric and HPD credible intervals (as shown in Subsection 5.3) of the parameters and then calculate the average interval lengths (AILs) of them. Calculate also the average of the AILs (MAIL) over the four parameter estimates.
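Steps 1 and 3 of the algorithm above can be sketched as follows (Python/NumPy assumed; function names are ours). For Step 3 we use the well-known uniform-based algorithm for generating progressively type-II censored samples, which we assume is the algorithm referenced in [32]; the parameter values in the example call are purely illustrative:

```python
import numpy as np

def binomial_removals(n, w, p, rng):
    # Step 1: draw removals r_1, ..., r_{w-1} from the binomial scheme and set
    # r_w = n - w - (sum of the earlier removals)
    r, left = [], n - w
    for _ in range(w - 1):
        rj = rng.binomial(left, p)
        r.append(rj)
        left -= rj
    r.append(left)
    return np.array(r)

def progressive_sample(r, quantile, rng):
    # Step 3: progressively type-II censored sample for the scheme r, generated
    # through uniforms and the inverse CDF `quantile`
    w = len(r)
    W = rng.uniform(size=w)
    expo = np.arange(1, w + 1) + np.cumsum(r[::-1])   # j + r_w + ... + r_{w-j+1}
    V = W ** (1.0 / expo)
    U = 1.0 - np.cumprod(V[::-1])                     # ordered censored uniforms
    return quantile(U)

rng = np.random.default_rng(7)
r = binomial_removals(n=40, w=20, p=0.15, rng=rng)
# illustrative PGLD values (alpha, gamma, theta, lambda); pgld_quantile is the Eq (3.1) sketch
x = progressive_sample(r, lambda u: pgld_quantile(u, 1.6, 1.0, 0.9, 2.3), rng)
```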
Tables 2–4 provide the computational results, which take into account the prior parameter values: \mu_1 = 0.154 , \mu_2 = 0.8 , \mu_3 = -1.65 , \sigma_1 = 0.8 , \sigma_2 = 0.07 , \sigma_3 = 0.3 , a_{1} = 2.3 and a_{2} = 0.25 to generate population parameter values: \alpha = 1.6 , \lambda = 2.3 , \theta = 0.9 and c = 0.2 . Three distinct removal probabilities p = 0.15, 0.55, and 0.95 are considered.
n_{1} | w_{1} | p=0.15 | p=0.55 | p=0.95 | |||||||||||
. | . | \overline{\hat{\alpha}} | MSE(\hat{\alpha}) | RAB(\hat{\alpha}) | MMSE | \overline{\hat{\alpha}} | MSE(\hat{\alpha}) | RAB(\hat{\alpha}) | MMSE | \overline{\hat{\alpha}} | MSE(\hat{\alpha}) | RAB(\hat{\alpha}) | MMSE | ||
. | . | \overline{\hat{\lambda}} | MSE(\hat{\lambda}) | RAB(\hat{\lambda}) | MRAB | \overline{\hat{\lambda}} | MSE(\hat{\lambda}) | RAB(\hat{\lambda}) | MRAB | \overline{\hat{\lambda}} | MSE(\hat{\lambda}) | RAB(\hat{\lambda}) | MRAB | ||
. | . | \overline{\hat{\theta}} | MSE(\hat{\theta}) | RAB(\hat{\theta}) | \overline{\hat{\theta}} | MSE(\hat{\theta}) | RAB(\hat{\theta}) | \overline{\hat{\theta}} | MSE(\hat{\theta}) | RAB(\hat{\theta}) | |||||
\hbar | N | n_{\hbar} | w_{\hbar} | \overline{\hat{c}} | MSE(\hat{c}) | RAB(\hat{c}) | \overline{\hat{c}} | MSE(\hat{c}) | RAB(\hat{c}) | \overline{\hat{c}} | MSE(\hat{c}) | RAB(\hat{c}) | |||
2 | 80 | 40 | 20 | 1.7692 | 0.4273 | 0.2736 | 0.3766 | 1.7793 | 0.4334 | 0.2855 | 0.4536 | 1.7900 | 0.4439 | 0.2821 | 0.4166 |
40 | 20 | 2.2500 | 1.3743 | 0.3825 | 0.3156 | 2.5275 | 1.7338 | 0.4473 | 0.3599 | 2.4325 | 1.5187 | 0.4291 | 0.3729 | ||
0.8399 | 0.0247 | 0.1060 | 0.8214 | 0.0323 | 0.1297 | 0.8232 | 0.0338 | 0.1322 | |||||||
0.2503 | 0.0567 | 0.8159 | 0.2491 | 0.0684 | 0.9367 | 0.2703 | 0.0868 | 1.0212 | |||||||
32 | 1.7112 | 0.2631 | 0.2218 | 0.4094 | 1.7118 | 0.2594 | 0.2231 | 0.4408 | 1.7260 | 0.2935 | 0.2333 | 0.4465 | |||
32 | 2.4709 | 1.6977 | 0.4517 | 0.3352 | 2.4682 | 1.8436 | 0.4612 | 0.3533 | 2.4738 | 1.8411 | 0.4750 | 0.3548 | |||
0.8411 | 0.0218 | 0.1107 | 0.8308 | 0.0274 | 0.1234 | 0.8205 | 0.0298 | 0.1328 | |||||||
0.2607 | 0.0643 | 0.8917 | 0.2649 | 0.0735 | 0.9590 | 0.2546 | 0.0682 | 0.9331 | |||||||
140 | 70 | 35 | 1.7235 | 0.2732 | 0.2152 | 0.3939 | 1.7104 | 0.2902 | 0.2283 | 0.4677 | 1.6972 | 0.2575 | 0.2121 | 0.4528 | |
70 | 35 | 2.5368 | 1.6200 | 0.4369 | 0.3218 | 2.5238 | 1.9496 | 0.4756 | 0.3508 | 2.5618 | 1.9192 | 0.4820 | 0.3414 | ||
0.8555 | 0.0153 | 0.0934 | 0.8402 | 0.0231 | 0.1128 | 0.8418 | 0.0224 | 0.1137 | |||||||
0.2588 | 0.0612 | 0.8638 | 0.2685 | 0.0756 | 0.9375 | 0.2636 | 0.0650 | 0.8993 | |||||||
56 | 1.7144 | 0.2323 | 0.2087 | 0.4615 | 1.6803 | 0.2312 | 0.2047 | 0.4594 | 1.6890 | 0.2324 | 0.2054 | 0.5033 | |||
56 | 2.5501 | 1.9964 | 0.4725 | 0.3284 | 2.4981 | 1.9889 | 0.4798 | 0.3312 | 2.5337 | 2.1999 | 0.4905 | 0.3413 | |||
0.8505 | 0.0184 | 0.1038 | 0.8517 | 0.0177 | 0.1013 | 0.8474 | 0.0201 | 0.1077 | |||||||
0.2606 | 0.0606 | 0.8572 | 0.2670 | 0.0593 | 0.8702 | 0.2702 | 0.0641 | 0.9027 | |||||||
4 | 80 | 20 | 10 | 1.7224 | 0.3564 | 0.2653 | 0.3963 | 1.7933 | 0.4206 | 0.2728 | 0.4094 | 1.7689 | 0.4255 | 0.2812 | 0.4622 |
20 | 10 | 2.1620 | 1.5467 | 0.4146 | 0.3086 | 2.4281 | 1.5124 | 0.4312 | 0.3627 | 2.5324 | 1.7754 | 0.4676 | 0.3609 | ||
20 | 10 | 0.8276 | 0.0223 | 0.1074 | 0.8293 | 0.0285 | 0.1225 | 0.8083 | 0.0379 | 0.1416 | |||||
20 | 10 | 0.2309 | 0.0559 | 0.7555 | 0.2638 | 0.0856 | 0.9872 | 0.2405 | 0.0723 | 0.9142 | |||||
16 | 1.7504 | 0.3104 | 0.2411 | 0.3767 | 1.7214 | 0.2853 | 0.2379 | 0.4418 | 1.6927 | 0.2635 | 0.2251 | 0.4439 | |||
16 | 2.4045 | 1.4904 | 0.4204 | 0.3243 | 2.6169 | 1.8321 | 0.4668 | 0.3464 | 2.4949 | 1.8477 | 0.4627 | 0.3608 | |||
16 | 0.8357 | 0.0229 | 0.1121 | 0.8406 | 0.0222 | 0.1121 | 0.8407 | 0.0263 | 0.1165 | ||||||
16 | 0.2460 | 0.0595 | 0.8481 | 0.2565 | 0.0694 | 0.9153 | 0.2862 | 0.0817 | 0.9996 | ||||||
140 | 35 | 18 | 1.6976 | 0.2483 | 0.2171 | 0.3170 | 1.7063 | 0.2604 | 0.2181 | 0.4347 | 1.7152 | 0.2896 | 0.2212 | 0.4394 | |
35 | 18 | 2.3230 | 1.2643 | 0.3762 | 0.2965 | 2.4760 | 1.8169 | 0.4667 | 0.3469 | 2.5477 | 1.8211 | 0.4638 | 0.3350 | ||
35 | 18 | 0.8554 | 0.0137 | 0.0869 | 0.8357 | 0.0240 | 0.1178 | 0.8412 | 0.0219 | 0.1093 | |||||
35 | 18 | 0.2548 | 0.0588 | 0.8022 | 0.2669 | 0.0723 | 0.9320 | 0.2561 | 0.0645 | 0.8805 | |||||
28 | 1.6775 | 0.1812 | 0.1846 | 0.4331 | 1.6684 | 0.1758 | 0.1888 | 0.4781 | 1.6867 | 0.1760 | 0.1864 | 0.4443 | |||
28 | 2.5388 | 1.9152 | 0.4670 | 0.3084 | 2.5493 | 2.1323 | 0.4914 | 0.3330 | 2.4572 | 1.9641 | 0.4774 | 0.3251 | |||
28 | 0.8499 | 0.0179 | 0.0998 | 0.8572 | 0.0176 | 0.0985 | 0.8508 | 0.0179 | 0.1009 | ||||||
28 | 0.2490 | 0.0510 | 0.7907 | 0.2744 | 0.0648 | 0.8863 | 0.2647 | 0.0633 | 0.8609 |
n_{1} | m_{1} | BEL | BEG | |||||||||||||||||
. | . | \nu=-0.5 | \nu=0.5 | \nu=-0.5 | \nu=0.5 | |||||||||||||||
. | . | \overline{\tilde{\alpha}} | MSE(\tilde{\alpha}) | RAB(\tilde{\alpha}) | MMSE | \overline{\tilde{\alpha}} | MSE(\tilde{\alpha}) | RAB(\tilde{\alpha}) | MMSE | \overline{\ddot{\alpha}} | MSE(\ddot{\alpha}) | RAB(\ddot{\alpha}) | MMSE | \overline{\ddot{\alpha}} | MSE(\ddot{\alpha}) | RAB(\ddot{\alpha}) | MMSE | |||
. | . | \overline{\tilde{\lambda}} | MSE(\tilde{\lambda}) | RAB(\tilde{\lambda}) | MRAB | \overline{\tilde{\lambda}} | MSE(\tilde{\lambda}) | RAB(\tilde{\lambda}) | MRAB | \overline{\ddot{\lambda}} | MSE(\ddot{\lambda}) | RAB(\ddot{\lambda}) | MRAB | \overline{\ddot{\lambda}} | MSE(\ddot{\lambda}) | RAB(\ddot{\lambda}) | MRAB | |||
. | . | \overline{\tilde{\theta}} | MSE(\tilde{\theta}) | RAB(\tilde{\theta}) | \overline{\tilde{\theta}} | MSE(\tilde{\theta}) | RAB(\tilde{\theta}) | \overline{\ddot{\theta}} | MSE(\ddot{\theta}) | RAB(\ddot{\theta}) | \overline{\ddot{\theta}} | MSE(\ddot{\theta}) | RAB(\ddot{\theta}) | |||||||
\hbar | N | n_{\hbar} | m_{\hbar} | p | \overline{\tilde{c}} | MSE(\tilde{c}) | RAB(\tilde{c}) | \overline{\tilde{c}} | MSE(\tilde{c}) | RAB(\tilde{c}) | \overline{\ddot{c}} | MSE(\ddot{c}) | RAB(\ddot{c}) | \overline{\ddot{c}} | MSE(\ddot{c}) | RAB(\ddot{c}) | ||||
2 | 80 | 40 | 20 | 0.15 | 1.6520 | 0.1912 | 0.1819 | 0.0505 | 1.5559 | 0.1342 | 0.1819 | 0.0369 | 1.5742 | 0.1662 | 0.0161 | 0.0448 | 1.5138 | 0.1567 | 0.0539 | 0.0433 |
40 | 20 | 2.2385 | 0.0042 | 0.0268 | 0.0790 | 2.2262 | 0.0058 | 0.0321 | 0.0819 | 2.2296 | 0.0053 | 0.0306 | 0.0356 | 2.2241 | 0.0061 | 0.0330 | 0.0593 | |||
0.8493 | 0.0064 | 0.0675 | 0.8444 | 0.0073 | 0.0723 | 0.8432 | 0.0076 | 0.0631 | 0.8334 | 0.0098 | 0.0740 | |||||||||
0.1990 | 0.0002 | 0.0398 | 0.1970 | 0.0002 | 0.0414 | 0.1935 | 0.0003 | 0.0325 | 0.1847 | 0.0004 | 0.0764 | |||||||||
0.55 | 1.6766 | 0.1464 | 0.1786 | 0.0391 | 1.5945 | 0.1086 | 0.1786 | 0.0302 | 1.6098 | 0.1170 | 0.0061 | 0.0323 | 1.5596 | 0.1139 | 0.0253 | 0.0322 | ||||
2.2386 | 0.0040 | 0.0267 | 0.0775 | 2.2264 | 0.0057 | 0.0320 | 0.0801 | 2.2297 | 0.0052 | 0.0306 | 0.0317 | 2.2242 | 0.0060 | 0.0329 | 0.0506 | |||||
0.8556 | 0.0056 | 0.0634 | 0.8511 | 0.0064 | 0.0676 | 0.8502 | 0.0066 | 0.0554 | 0.8418 | 0.0084 | 0.0647 | |||||||||
0.1986 | 0.0003 | 0.0411 | 0.1966 | 0.0002 | 0.0420 | 0.1930 | 0.0002 | 0.0349 | 0.1841 | 0.0004 | 0.0796 | |||||||||
0.95 | 1.6936 | 0.2794 | 0.1926 | 0.0728 | 1.5992 | 0.1265 | 0.1926 | 0.0351 | 1.6212 | 0.1806 | 0.0132 | 0.0486 | 1.5665 | 0.1388 | 0.0210 | 0.0387 | ||||
2.2370 | 0.0045 | 0.0274 | 0.0833 | 2.2247 | 0.0062 | 0.0327 | 0.0856 | 2.2281 | 0.0057 | 0.0313 | 0.0315 | 2.2226 | 0.0065 | 0.0337 | 0.0482 | |||||
0.8545 | 0.0058 | 0.0638 | 0.8501 | 0.0066 | 0.0680 | 0.8491 | 0.0069 | 0.0566 | 0.8406 | 0.0087 | 0.0660 | |||||||||
0.2009 | 0.0015 | 0.0495 | 0.1987 | 0.0012 | 0.0491 | 0.1950 | 0.0011 | 0.0250 | 0.1856 | 0.0007 | 0.0722 | |||||||||
32 | 0.15 | 1.6671 | 0.1794 | 0.1567 | 0.0477 | 1.6099 | 0.1384 | 0.1567 | 0.0379 | 1.6220 | 0.1559 | 0.0137 | 0.0422 | 1.5873 | 0.1452 | 0.0079 | 0.0400 | |||
32 | 2.2369 | 0.0050 | 0.0274 | 0.0744 | 2.2245 | 0.0067 | 0.0328 | 0.0765 | 2.2279 | 0.0062 | 0.0313 | 0.0278 | 2.2224 | 0.0071 | 0.0337 | 0.0395 | ||||
0.8637 | 0.0041 | 0.0533 | 0.8607 | 0.0046 | 0.0561 | 0.8601 | 0.0047 | 0.0443 | 0.8549 | 0.0057 | 0.0501 | |||||||||
0.2012 | 0.0022 | 0.0603 | 0.1992 | 0.0021 | 0.0605 | 0.1956 | 0.0020 | 0.0218 | 0.1867 | 0.0019 | 0.0663 | |||||||||
0.55 | 1.6684 | 0.0972 | 0.1476 | 0.0265 | 1.6131 | 0.0793 | 0.1476 | 0.0225 | 1.6235 | 0.0837 | 0.0147 | 0.0235 | 1.5901 | 0.0810 | 0.0062 | 0.0233 | ||||
2.2386 | 0.0041 | 0.0267 | 0.0697 | 2.2263 | 0.0058 | 0.0321 | 0.0717 | 2.2297 | 0.0053 | 0.0306 | 0.0297 | 2.2241 | 0.0061 | 0.0330 | 0.0409 | |||||
0.8654 | 0.0040 | 0.0523 | 0.8623 | 0.0045 | 0.0551 | 0.8617 | 0.0046 | 0.0425 | 0.8562 | 0.0056 | 0.0486 | |||||||||
0.1995 | 0.0006 | 0.0522 | 0.1974 | 0.0004 | 0.0520 | 0.1938 | 0.0004 | 0.0311 | 0.1848 | 0.0004 | 0.0760 | |||||||||
0.95 | 1.6736 | 0.1876 | 0.1626 | 0.0493 | 1.6072 | 0.1008 | 0.1626 | 0.0281 | 1.6207 | 0.1193 | 0.0129 | 0.0326 | 1.5849 | 0.1060 | 0.0094 | 0.0297 | ||||
2.2375 | 0.0044 | 0.0272 | 0.0756 | 2.2250 | 0.0061 | 0.0326 | 0.0777 | 2.2285 | 0.0056 | 0.0311 | 0.0284 | 2.2229 | 0.0065 | 0.0335 | 0.0409 | |||||
0.8645 | 0.0041 | 0.0546 | 0.8614 | 0.0046 | 0.0576 | 0.8608 | 0.0047 | 0.0436 | 0.8553 | 0.0057 | 0.0497 | |||||||||
0.2004 | 0.0009 | 0.0583 | 0.1984 | 0.0008 | 0.0581 | 0.1948 | 0.0008 | 0.0259 | 0.1858 | 0.0006 | 0.0708 | |||||||||
2 | 140 | 70 | 35 | 0.15 | 1.6590 | 0.1425 | 0.1527 | 0.0379 | 1.5986 | 0.0848 | 0.1527 | 0.0239 | 1.6106 | 0.0964 | 0.0066 | 0.0267 | 1.5760 | 0.0870 | 0.0150 | 0.0248 |
70 | 35 | 2.2388 | 0.0042 | 0.0266 | 0.0732 | 2.2264 | 0.0059 | 0.0320 | 0.0750 | 2.2298 | 0.0054 | 0.0305 | 0.0261 | 2.2242 | 0.0063 | 0.0329 | 0.0411 | |||
0.8654 | 0.0037 | 0.0501 | 0.8627 | 0.0040 | 0.0527 | 0.8622 | 0.0041 | 0.0420 | 0.8575 | 0.0050 | 0.0472 | |||||||||
0.2006 | 0.0011 | 0.0634 | 0.1985 | 0.0010 | 0.0628 | 0.1950 | 0.0010 | 0.0251 | 0.1861 | 0.0009 | 0.0694 | |||||||||
0.55 | 1.6450 | 0.1210 | 0.1422 | 0.0325 | 1.5941 | 0.0912 | 0.1422 | 0.0256 | 1.6039 | 0.0985 | 0.0025 | 0.0273 | 1.5735 | 0.0942 | 0.0166 | 0.0266 | ||||
2.2374 | 0.0046 | 0.0272 | 0.0697 | 2.2249 | 0.0063 | 0.0326 | 0.0717 | 2.2283 | 0.0058 | 0.0312 | 0.0255 | 2.2228 | 0.0066 | 0.0336 | 0.0420 | |||||
0.8664 | 0.0035 | 0.0486 | 0.8637 | 0.0038 | 0.0511 | 0.8632 | 0.0039 | 0.0409 | 0.8586 | 0.0047 | 0.0460 | |||||||||
0.2000 | 0.0012 | 0.0606 | 0.1980 | 0.0010 | 0.0606 | 0.1945 | 0.0010 | 0.0276 | 0.1857 | 0.0010 | 0.0717 | |||||||||
0.95 | 1.6732 | 0.1466 | 0.1516 | 0.0392 | 1.6171 | 0.1051 | 0.1516 | 0.0292 | 1.6285 | 0.1147 | 0.0178 | 0.0315 | 1.5960 | 0.1067 | 0.0025 | 0.0298 | ||||
2.2373 | 0.0045 | 0.0273 | 0.0743 | 2.2249 | 0.0062 | 0.0327 | 0.0758 | 2.2283 | 0.0057 | 0.0312 | 0.0279 | 2.2227 | 0.0065 | 0.0336 | 0.0381 | |||||
0.8660 | 0.0038 | 0.0504 | 0.8631 | 0.0042 | 0.0530 | 0.8626 | 0.0043 | 0.0416 | 0.8577 | 0.0052 | 0.0470 | |||||||||
0.2018 | 0.0018 | 0.0681 | 0.1995 | 0.0013 | 0.0660 | 0.1958 | 0.0012 | 0.0211 | 0.1861 | 0.0007 | 0.0693 | |||||||||
56 | 0.15 | 1.6251 | 0.0461 | 0.1064 | 0.0135 | 1.5991 | 0.0443 | 0.1064 | 0.0135 | 1.6038 | 0.0451 | 0.0024 | 0.0136 | 1.5871 | 0.0459 | 0.0081 | 0.0140 | |||
56 | 2.2382 | 0.0047 | 0.0275 | 0.0676 | 2.2258 | 0.0061 | 0.0327 | 0.0693 | 2.2292 | 0.0057 | 0.0308 | 0.0286 | 2.2237 | 0.0064 | 0.0332 | 0.0354 | ||||
0.8732 | 0.0026 | 0.0437 | 0.8717 | 0.0027 | 0.0451 | 0.8715 | 0.0028 | 0.0317 | 0.8694 | 0.0031 | 0.0340 | |||||||||
0.1920 | 0.0007 | 0.0926 | 0.1914 | 0.0007 | 0.0929 | 0.1901 | 0.0007 | 0.0497 | 0.1867 | 0.0008 | 0.0665 | |||||||||
0.55 | 1.6321 | 0.0503 | 0.1114 | 0.0144 | 1.6059 | 0.0477 | 0.1114 | 0.0143 | 1.6108 | 0.0486 | 0.0067 | 0.0144 | 1.5943 | 0.0486 | 0.0036 | 0.0147 | ||||
2.2383 | 0.0041 | 0.0268 | 0.0688 | 2.2261 | 0.0058 | 0.0321 | 0.0707 | 2.2295 | 0.0053 | 0.0307 | 0.0280 | 2.2240 | 0.0061 | 0.0330 | 0.0325 | |||||
0.8742 | 0.0027 | 0.0441 | 0.8727 | 0.0028 | 0.0455 | 0.8725 | 0.0029 | 0.0305 | 0.8703 | 0.0032 | 0.0330 | |||||||||
0.1931 | 0.0007 | 0.0930 | 0.1925 | 0.0007 | 0.0937 | 0.1912 | 0.0007 | 0.0439 | 0.1879 | 0.0008 | 0.0603 | |||||||||
0.95 | 1.6471 | 0.0524 | 0.1129 | 0.0150 | 1.6211 | 0.0491 | 0.1129 | 0.0146 | 1.6260 | 0.0501 | 0.0163 | 0.0147 | 1.6098 | 0.0497 | 0.0061 | 0.0149 | ||||
2.2388 | 0.0041 | 0.0266 | 0.0692 | 2.2266 | 0.0057 | 0.0319 | 0.0708 | 2.2299 | 0.0052 | 0.0305 | 0.0310 | 2.2245 | 0.0060 | 0.0328 | 0.0339 | |||||
0.8754 | 0.0026 | 0.0438 | 0.8739 | 0.0028 | 0.0451 | 0.8737 | 0.0028 | 0.0292 | 0.8716 | 0.0031 | 0.0316 | |||||||||
0.1924 | 0.0009 | 0.0935 | 0.1917 | 0.0008 | 0.0934 | 0.1904 | 0.0008 | 0.0481 | 0.1870 | 0.0008 | 0.0652 | |||||||||
4 | 80 | 20 | 10 | 0.15 | 1.6816 | 0.2985 | 0.2183 | 0.0779 | 1.5568 | 0.1760 | 0.2183 | 0.0480 | 1.5815 | 0.2294 | 0.0116 | 0.0613 | 1.5056 | 0.2066 | 0.0590 | 0.0566 |
20 | 10 | 2.2379 | 0.0047 | 0.0270 | 0.0922 | 2.2255 | 0.0064 | 0.0324 | 0.0951 | 2.2289 | 0.0059 | 0.0309 | 0.0344 | 2.2234 | 0.0067 | 0.0333 | 0.0617 | |||
20 | 10 | 0.8448 | 0.0075 | 0.0751 | 0.8391 | 0.0087 | 0.0808 | 0.8376 | 0.0091 | 0.0693 | 0.8255 | 0.0122 | 0.0827 | |||||||
20 | 10 | 0.2006 | 0.0010 | 0.0484 | 0.1986 | 0.0009 | 0.0487 | 0.1949 | 0.0009 | 0.0256 | 0.1856 | 0.0008 | 0.0718 | |||||||
0.55 | 1.6668 | 0.1261 | 0.1730 | 0.0340 | 1.5845 | 0.1048 | 0.1730 | 0.0292 | 1.5992 | 0.1108 | 0.0005 | 0.0307 | 1.5476 | 0.1108 | 0.0328 | 0.0314 | ||||
2.2381 | 0.0041 | 0.0269 | 0.0753 | 2.2259 | 0.0057 | 0.0322 | 0.0780 | 2.2292 | 0.0053 | 0.0308 | 0.0303 | 2.2237 | 0.0061 | 0.0332 | 0.0523 | |||||
0.8549 | 0.0055 | 0.0628 | 0.8505 | 0.0063 | 0.0671 | 0.8495 | 0.0065 | 0.0561 | 0.8411 | 0.0083 | 0.0654 | |||||||||
0.1987 | 0.0001 | 0.0385 | 0.1968 | 0.0001 | 0.0399 | 0.1932 | 0.0001 | 0.0340 | 0.1844 | 0.0003 | 0.0779 | |||||||||
0.95 | 1.6906 | 0.2507 | 0.1933 | 0.0656 | 1.5998 | 0.1302 | 0.1933 | 0.0360 | 1.6194 | 0.1676 | 0.0121 | 0.0453 | 1.5667 | 0.1410 | 0.0208 | 0.0393 | ||||
2.2378 | 0.0044 | 0.0271 | 0.0835 | 2.2254 | 0.0061 | 0.0324 | 0.0858 | 2.2288 | 0.0056 | 0.0310 | 0.0302 | 2.2233 | 0.0064 | 0.0334 | 0.0471 | |||||
0.8568 | 0.0059 | 0.0631 | 0.8524 | 0.0066 | 0.0670 | 0.8515 | 0.0069 | 0.0539 | 0.8431 | 0.0087 | 0.0632 | |||||||||
0.2011 | 0.0016 | 0.0507 | 0.1989 | 0.0013 | 0.0504 | 0.1952 | 0.0012 | 0.0238 | 0.1858 | 0.0009 | 0.0709 | |||||||||
16 | 0.15 | 1.6605 | 0.1649 | 0.1614 | 0.0439 | 1.5951 | 0.1106 | 0.1614 | 0.0308 | 1.6083 | 0.1254 | 0.0052 | 0.0344 | 1.5705 | 0.1164 | 0.0185 | 0.0326 | |||
16 | 2.2374 | 0.0048 | 0.0272 | 0.0770 | 2.2249 | 0.0066 | 0.0327 | 0.0790 | 2.2283 | 0.0061 | 0.0312 | 0.0271 | 2.2227 | 0.0069 | 0.0336 | 0.0442 | ||||
16 | 0.8620 | 0.0042 | 0.0546 | 0.8587 | 0.0047 | 0.0577 | 0.8581 | 0.0049 | 0.0466 | 0.8523 | 0.0060 | 0.0530 | ||||||||
16 | 0.2007 | 0.0016 | 0.0648 | 0.1986 | 0.0014 | 0.0643 | 0.1949 | 0.0014 | 0.0255 | 0.1856 | 0.0012 | 0.0718 | ||||||||
0.55 | 1.6862 | 0.2183 | 0.1632 | 0.0570 | 1.6170 | 0.1042 | 0.1632 | 0.0290 | 1.6330 | 0.1422 | 0.0206 | 0.0384 | 1.5955 | 0.1136 | 0.0028 | 0.0316 | ||||
2.2372 | 0.0046 | 0.0273 | 0.0761 | 2.2246 | 0.0063 | 0.0328 | 0.0781 | 2.2280 | 0.0058 | 0.0313 | 0.0315 | 2.2224 | 0.0067 | 0.0337 | 0.0406 | |||||
0.8645 | 0.0040 | 0.0524 | 0.8614 | 0.0045 | 0.0552 | 0.8609 | 0.0046 | 0.0435 | 0.8556 | 0.0056 | 0.0493 | |||||||||
0.1997 | 0.0011 | 0.0616 | 0.1975 | 0.0009 | 0.0612 | 0.1939 | 0.0008 | 0.0306 | 0.1847 | 0.0007 | 0.0765 | |||||||||
0.95 | 1.6709 | 0.1323 | 0.1523 | 0.0353 | 1.6149 | 0.1007 | 0.1523 | 0.0279 | 1.6261 | 0.1143 | 0.0163 | 0.0312 | 1.5927 | 0.1076 | 0.0046 | 0.0300 | ||||
2.2373 | 0.0044 | 0.0273 | 0.0709 | 2.2249 | 0.0061 | 0.0327 | 0.0732 | 2.2283 | 0.0056 | 0.0312 | 0.0314 | 2.2227 | 0.0065 | 0.0336 | 0.0415 | |||||
0.8652 | 0.0040 | 0.0512 | 0.8621 | 0.0045 | 0.0540 | 0.8615 | 0.0046 | 0.0427 | 0.8562 | 0.0056 | 0.0486 | |||||||||
0.1985 | 0.0005 | 0.0529 | 0.1965 | 0.0004 | 0.0537 | 0.1929 | 0.0004 | 0.0353 | 0.1841 | 0.0005 | 0.0794 | |||||||||
4 | 140 | 35 | 18 | 0.15 | 1.6763 | 0.2148 | 0.1731 | 0.0564 | 1.6019 | 0.1085 | 0.1731 | 0.0303 | 1.6175 | 0.1303 | 0.0109 | 0.0356 | 1.5767 | 0.1137 | 0.0145 | 0.0318 |
35 | 18 | 2.2362 | 0.0048 | 0.0278 | 0.0825 | 2.2236 | 0.0067 | 0.0332 | 0.0841 | 2.2270 | 0.0061 | 0.0317 | 0.0269 | 2.2213 | 0.0071 | 0.0342 | 0.0416 | |||
35 | 18 | 0.8644 | 0.0041 | 0.0524 | 0.8615 | 0.0045 | 0.0551 | 0.8609 | 0.0046 | 0.0434 | 0.8559 | 0.0056 | 0.0491 | |||||||
35 | 18 | 0.2016 | 0.0022 | 0.0768 | 0.1993 | 0.0017 | 0.0750 | 0.1957 | 0.0016 | 0.0216 | 0.1863 | 0.0010 | 0.0685 | |||||||
0.55 | 1.6899 | 0.1523 | 0.1532 | 0.0403 | 1.6366 | 0.0997 | 0.1532 | 0.0277 | 1.6488 | 0.1222 | 0.0305 | 0.0332 | 1.6177 | 0.1049 | 0.0111 | 0.0292 | ||||
2.2373 | 0.0047 | 0.0273 | 0.0727 | 2.2248 | 0.0064 | 0.0327 | 0.0745 | 2.2282 | 0.0059 | 0.0312 | 0.0304 | 2.2227 | 0.0068 | 0.0336 | 0.0383 | |||||
0.8710 | 0.0032 | 0.0467 | 0.8685 | 0.0035 | 0.0488 | 0.8682 | 0.0035 | 0.0354 | 0.8643 | 0.0042 | 0.0397 | |||||||||
0.2006 | 0.0012 | 0.0638 | 0.1986 | 0.0010 | 0.0634 | 0.1951 | 0.0010 | 0.0246 | 0.1862 | 0.0009 | 0.0688 | |||||||||
0.95 | 1.6470 | 0.1052 | 0.1398 | 0.0284 | 1.5957 | 0.0697 | 0.1398 | 0.0200 | 1.6053 | 0.0744 | 0.0033 | 0.0211 | 1.5757 | 0.0717 | 0.0152 | 0.0208 | ||||
2.2387 | 0.0041 | 0.0266 | 0.0689 | 2.2262 | 0.0058 | 0.0321 | 0.0707 | 2.2296 | 0.0053 | 0.0306 | 0.0264 | 2.2240 | 0.0061 | 0.0330 | 0.0424 | |||||
0.8657 | 0.0036 | 0.0495 | 0.8630 | 0.0040 | 0.0520 | 0.8625 | 0.0041 | 0.0417 | 0.8580 | 0.0048 | 0.0466 | |||||||||
0.1997 | 0.0007 | 0.0596 | 0.1976 | 0.0006 | 0.0589 | 0.1940 | 0.0005 | 0.0301 | 0.1850 | 0.0005 | 0.0749 | |||||||||
28 | 0.15 | 1.6453 | 0.0505 | 0.1087 | 0.0144 | 1.6178 | 0.0469 | 0.1087 | 0.0140 | 1.6230 | 0.0480 | 0.0143 | 0.0141 | 1.6058 | 0.0475 | 0.0036 | 0.0143 | |||
28 | 2.2397 | 0.0039 | 0.0262 | 0.0682 | 2.2273 | 0.0056 | 0.0316 | 0.0701 | 2.2307 | 0.0051 | 0.0301 | 0.0312 | 2.2252 | 0.0059 | 0.0325 | 0.0339 | ||||
28 | 0.8757 | 0.0024 | 0.0420 | 0.8742 | 0.0026 | 0.0433 | 0.8740 | 0.0026 | 0.0288 | 0.8720 | 0.0029 | 0.0311 | ||||||||
28 | 0.1915 | 0.0008 | 0.0961 | 0.1910 | 0.0008 | 0.0968 | 0.1897 | 0.0008 | 0.0516 | 0.1863 | 0.0009 | 0.0683 | ||||||||
0.55 | 1.6337 | 0.0485 | 0.1090 | 0.0139 | 1.6079 | 0.0459 | 0.1090 | 0.0137 | 1.6127 | 0.0468 | 0.0079 | 0.0138 | 1.5964 | 0.0467 | 0.0023 | 0.0141 | ||||
2.2386 | 0.0041 | 0.0267 | 0.0676 | 2.2265 | 0.0057 | 0.0320 | 0.0694 | 2.2298 | 0.0052 | 0.0305 | 0.0280 | 2.2244 | 0.0060 | 0.0329 | 0.0319 | |||||
0.8756 | 0.0024 | 0.0417 | 0.8741 | 0.0026 | 0.0431 | 0.8739 | 0.0027 | 0.0290 | 0.8717 | 0.0030 | 0.0315 | |||||||||
0.1929 | 0.0007 | 0.0930 | 0.1923 | 0.0007 | 0.0935 | 0.1911 | 0.0007 | 0.0447 | 0.1878 | 0.0008 | 0.0611 | |||||||||
0.95 | 1.6310 | 0.0469 | 0.1062 | 0.0135 | 1.6051 | 0.0445 | 0.1062 | 0.0134 | 1.6099 | 0.0453 | 0.0062 | 0.0135 | 1.5935 | 0.0454 | 0.0041 | 0.0138 | ||||
2.2386 | 0.0040 | 0.0267 | 0.0675 | 2.2264 | 0.0057 | 0.0320 | 0.0693 | 2.2298 | 0.0052 | 0.0305 | 0.0294 | 2.2243 | 0.0060 | 0.0329 | 0.0342 | |||||
0.8733 | 0.0026 | 0.0439 | 0.8718 | 0.0028 | 0.0453 | 0.8715 | 0.0028 | 0.0316 | 0.8693 | 0.0031 | 0.0341 | |||||||||
0.1921 | 0.0007 | 0.0934 | 0.1915 | 0.0007 | 0.0938 | 0.1902 | 0.0007 | 0.0491 | 0.1869 | 0.0008 | 0.0657 |
n_{1} | w_{1} | Credible interval | ||||||||
. | . | CI | Symmetric | HPD | ||||||
. | . | AIL(\alpha) | AIL(\alpha) | AIL(\alpha) | ||||||
. | . | AIL(\lambda) | AIL(\lambda) | AIL(\lambda) | ||||||
. | . | AIL(\theta) | AIL(\theta) | AIL(\theta) | ||||||
\hbar | N | n_{\hbar} | w_{\hbar} | p | AIL(c) | AIL(c) | AIL(c) | |||
2 | 80 | 40 | 20 | 0.15 | 3.4146 | 3.0673 | 1.6523 | 0.7144 | 1.5547 | 0.6651 |
40 | 20 | 9.3383 | 0.6109 | 0.5824 | ||||||
1.0622 | 0.3577 | 0.3046 | ||||||||
1.5215 | 0.2369 | 0.2186 | ||||||||
0.55 | 3.2483 | 3.0103 | 1.5301 | 0.6792 | 1.4479 | 0.6347 | ||||
9.0989 | 0.6125 | 0.5825 | ||||||||
1.1630 | 0.3370 | 0.2883 | ||||||||
1.5413 | 0.2372 | 0.2199 | ||||||||
0.95 | 3.2459 | 3.009 | 1.5365 | 0.6822 | 1.4581 | 0.6387 | ||||
9.0252 | 0.6136 | 0.5840 | ||||||||
1.1423 | 0.3394 | 0.2911 | ||||||||
1.6318 | 0.2392 | 0.2214 | ||||||||
32 | 0.15 | 2.7872 | 2.7832 | 1.2546 | 0.5955 | 1.1861 | 0.5567 | |||
32 | 8.7304 | 0.6134 | 0.5833 | |||||||
1.0233 | 0.2796 | 0.2412 | ||||||||
1.3750 | 0.2345 | 0.2164 | ||||||||
0.55 | 2.8056 | 2.7834 | 1.2551 | 0.5965 | 1.1847 | 0.5568 | ||||
8.6419 | 0.6141 | 0.5831 | ||||||||
1.0474 | 0.2795 | 0.2408 | ||||||||
1.4224 | 0.2372 | 0.2188 | ||||||||
0.95 | 2.8068 | 2.765 | 1.2616 | 0.5985 | 1.1935 | 0.5598 | ||||
8.5702 | 0.6150 | 0.5858 | ||||||||
1.0997 | 0.2818 | 0.2425 | ||||||||
1.3484 | 0.2357 | 0.2174 | ||||||||
140 | 70 | 35 | 0.15 | 2.8299 | 2.7787 | 1.2480 | 0.5913 | 1.1732 | 0.5512 | |
70 | 35 | 8.8112 | 0.6169 | 0.5869 | ||||||
0.9287 | 0.2635 | 0.2278 | ||||||||
1.3237 | 0.2366 | 0.2171 | ||||||||
0.55 | 2.6886 | 2.7106 | 1.1792 | 0.5731 | 1.1093 | 0.5345 | ||||
8.5189 | 0.6155 | 0.5842 | ||||||||
0.9782 | 0.2632 | 0.2284 | ||||||||
1.3673 | 0.2346 | 0.2160 | ||||||||
0.95 | 2.7318 | 2.7344 | 1.2057 | 0.5815 | 1.1348 | 0.5427 | ||||
8.6176 | 0.6126 | 0.5838 | ||||||||
0.9898 | 0.2684 | 0.2324 | ||||||||
1.3326 | 0.2392 | 0.2198 | ||||||||
56 | 0.15 | 2.4536 | 2.5300 | 0.8753 | 0.4421 | 0.8257 | 0.4148 | |||
56 | 8.1685 | 0.6119 | 0.5802 | |||||||
0.8347 | 0.1885 | 0.1668 | ||||||||
1.1931 | 0.0926 | 0.0864 | ||||||||
0.55 | 2.3747 | 2.5060 | 0.8788 | 0.4427 | 0.8261 | 0.4143 | ||||
8.0651 | 0.6095 | 0.5785 | ||||||||
0.8548 | 0.1898 | 0.1667 | ||||||||
1.2352 | 0.0927 | 0.0858 | ||||||||
0.95 | 2.3900 | 2.5214 | 0.8759 | 0.4417 | 0.8271 | 0.4148 | ||||
8.1405 | 0.6118 | 0.5818 | ||||||||
0.8549 | 0.1877 | 0.1661 | ||||||||
1.2216 | 0.0913 | 0.0844 | ||||||||
4 | 80 | 20 | 10 | 0.15 | 3.5514 | 2.9879 | 1.8141 | 0.7625 | 1.7046 | 0.7083 |
20 | 10 | 9.0555 | 0.6117 | 0.5818 | ||||||
20 | 10 | 0.9926 | 0.3838 | 0.3253 | ||||||
20 | 10 | 1.3399 | 0.2405 | 0.2214 | ||||||
0.55 | 3.3072 | 3.0253 | 1.5485 | 0.6838 | 1.4686 | 0.6395 | ||||
9.1629 | 0.6124 | 0.5826 | ||||||||
1.1114 | 0.3375 | 0.2875 | ||||||||
1.5448 | 0.2368 | 0.2192 | ||||||||
0.95 | 3.3013 | 3.0365 | 1.5334 | 0.6802 | 1.4531 | 0.6366 | ||||
9.2126 | 0.6123 | 0.5846 | ||||||||
1.2443 | 0.3335 | 0.2853 | ||||||||
1.4244 | 0.2416 | 0.2234 | ||||||||
16 | 0.15 | 3.0018 | 2.8487 | 1.2939 | 0.6085 | 1.2209 | 0.5677 | |||
16 | 8.8203 | 0.6151 | 0.5846 | |||||||
16 | 1.0622 | 0.2872 | 0.2464 | |||||||
16 | 1.3593 | 0.2380 | 0.2187 | |||||||
0.55 | 2.7877 | 2.8122 | 1.2833 | 0.6045 | 1.2083 | 0.5638 | ||||
8.8808 | 0.6181 | 0.5875 | ||||||||
1.0125 | 0.2804 | 0.2416 | ||||||||
1.3797 | 0.2362 | 0.2180 | ||||||||
0.95 | 2.6809 | 2.7709 | 1.2539 | 0.5955 | 1.1762 | 0.5541 | ||||
8.6718 | 0.6139 | 0.5841 | ||||||||
1.0059 | 0.2782 | 0.2396 | ||||||||
1.4957 | 0.2359 | 0.2165 | ||||||||
140 | 35 | 18 | 0.15 | 2.9768 | 2.8485 | 1.3322 | 0.6132 | 1.2466 | 0.5702 | |
35 | 18 | 8.9141 | 0.6170 | 0.5870 | ||||||
35 | 18 | 0.9581 | 0.2688 | 0.2318 | ||||||
35 | 18 | 1.3934 | 0.2349 | 0.2153 | ||||||
0.55 | 2.7722 | 2.727 | 1.1958 | 0.5740 | 1.1280 | 0.5365 | ||||
8.5091 | 0.6177 | 0.5868 | ||||||||
0.9857 | 0.2487 | 0.2158 | ||||||||
1.3680 | 0.2336 | 0.2155 | ||||||||
0.95 | 2.7294 | 2.7466 | 1.1760 | 0.5731 | 1.1088 | 0.5356 | ||||
8.7286 | 0.6159 | 0.5867 | ||||||||
0.9769 | 0.2643 | 0.2290 | ||||||||
1.2979 | 0.2362 | 0.2177 | ||||||||
28 | 0.15 | 2.4586 | 2.5493 | 0.9017 | 0.4483 | 0.8475 | 0.4203 | |||
28 | 8.2247 | 0.6144 | 0.5847 | |||||||
28 | 0.8842 | 0.1858 | 0.1640 | |||||||
28 | 1.1792 | 0.0913 | 0.0849 | |||||||
0.55 | 2.3382 | 2.5347 | 0.8769 | 0.4407 | 0.8265 | 0.4129 | ||||
8.2832 | 0.6080 | 0.5768 | ||||||||
0.8100 | 0.1871 | 0.1651 | ||||||||
1.2424 | 0.0907 | 0.0834 | ||||||||
0.95 | 2.3738 | 2.5307 | 0.8765 | 0.4429 | 0.8254 | 0.4150 | ||||
8.1678 | 0.6098 | 0.5791 | ||||||||
0.8702 | 0.1914 | 0.1689 | ||||||||
1.2415 | 0.0937 | 0.0867 |
The total number of observations N is divided into two groups (\hbar = 2) and then into four groups (\hbar = 4) in order to compare the MLEs, BELs, and BEGs; a minimal sketch of this allocation is given after the list below.
● When there are two groups (\hbar = 2),
\begin{split} n_{1} & = n_{2} = N/2 \,\, \text{and} \,\, w_{1} = w_{2} = 50\% \,\, \text{and} \,\, 80\% \,\, \text{of the group size}, \\ v_{1} & = 0.5 \,\, \text{and} \,\, v_{2} = 1. \end{split}
● When there are four groups (\hbar = 4),
\begin{split} n_{1} & = n_{2} = n_{3} = n_{4} = N/4 \,\, \text{and} \,\, w_{1} = w_{2} = w_{3} = w_{4} = 50\% \,\, \text{and} \,\, 80\% \,\, \text{of the group size}, \\ v_{1} & = 0.5, \, v_{2} = 1.0, \, v_{3} = 1.5, \,\, \text{and} \,\, v_{4} = 2.0. \end{split}
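To make this allocation concrete, the following is a minimal Python sketch (the function name and the rounding convention are ours, not the paper's) of how the N units can be split into \hbar equal groups, with w_i taken as 50% or 80% of each group size and the stress-change times v_i spaced by 0.5:

```python
def allocate_groups(N, hbar, w_frac):
    """Split N units into hbar equal groups; w_i is a fraction of each
    group size n_i, and the stress-change times are v_i = 0.5 * i."""
    n = [N // hbar] * hbar                      # n_1 = ... = n_hbar = N / hbar
    w = [round(w_frac * ni) for ni in n]        # w_i = 50% or 80% of n_i
    v = [0.5 * (i + 1) for i in range(hbar)]    # v_1 = 0.5, v_2 = 1.0, ...
    return n, w, v

# Reproduces the scenarios reported in the tables, e.g.:
print(allocate_groups(80, 2, 0.50))   # ([40, 40], [20, 20], [0.5, 1.0])
print(allocate_groups(140, 4, 0.80))  # ([35, 35, 35, 35], [28, 28, 28, 28], [0.5, 1.0, 1.5, 2.0])
```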
The MLEs of the parameters \alpha, \lambda, \theta, and c, together with their MSEs, RABs, MMSEs, and MRABs, are displayed in Table 2, and the corresponding BELs and BEGs are displayed in Table 3. The HPD credible intervals, symmetric credible intervals, and CIs, with their AILs and MAILs, are presented in Table 4; a brief sketch of how these summary measures can be computed follows.
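The exact definitions of these summary measures are not restated in this section, so the Python sketch below assumes the usual conventions: MSE and RAB are computed from the Monte Carlo estimates of each parameter, MMSE and MRAB average the per-parameter MSEs and RABs within one scenario, and AIL is the average length of the interval estimates (with MAIL averaging the AILs). The helper names are illustrative only.

```python
import numpy as np

def mse_rab(estimates, true_value):
    """MSE and relative absolute bias of one parameter over the replications."""
    est = np.asarray(estimates, dtype=float)
    mse = np.mean((est - true_value) ** 2)
    rab = abs(est.mean() - true_value) / abs(true_value)
    return mse, rab

def mmse_mrab(per_parameter):
    """Average the (MSE, RAB) pairs of alpha, lambda, theta, and c in one scenario."""
    mses, rabs = zip(*per_parameter)
    return float(np.mean(mses)), float(np.mean(rabs))

def ail(lower, upper):
    """Average interval length of a set of interval estimates (CI, symmetric, or HPD)."""
    return float(np.mean(np.asarray(upper) - np.asarray(lower)))
```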
We can observe from Tables 2–4 that:
(1) In terms of the MMSEs and MRABs, the BELs and BEGs outperform the MLEs.
(2) In terms of the MAILs, the HPD and symmetric credible intervals are shorter than the CIs, which again confirms that the BELs and BEGs are superior to the MLEs.
(3) Based on the MAILs, the HPD credible intervals are shorter than the symmetric credible intervals.
(4) For fixed p, \hbar, N, and n_{i} (or w_{i}), i = 1, \dots, \hbar, increasing w_{i} (or n_{i}) decreases the MSEs, RABs, MMSEs, MRABs, AILs, and MAILs. This confirms that the more data we collect, the more accurate the results become.
(5) Comparing the MMSEs, the BELs and BEGs perform better at \nu = 0.5 than at \nu = -0.5.
(6) Comparing the MMSEs, the BELs outperform the BEGs at \nu = 0.5, whereas the converse holds at \nu = -0.5.
(7) Comparing the MMSEs, the MLEs perform better at p = 0.15 than at p = 0.55 or p = 0.95.
(8) Better results are obtained at \hbar = 2 than at \hbar = 4, since for \hbar = 4 the number of observations in each subgroup is smaller than at \hbar = 2.
Except in a few rare cases, which could be attributed to sampling fluctuations, the above results are satisfactory.
Furthermore, if the hyperparameters are unknown, they can be estimated from past samples using the empirical Bayes approach, see [33]. Alternatively, the hierarchical Bayes technique, which places a suitable prior on the hyperparameters, could be used, see [34].
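As a rough illustration of the empirical Bayes idea mentioned above, hyperparameters could be fitted by moment matching to parameter estimates obtained from past samples. The gamma-prior assumption and the moment-matching rule in this sketch are our own simplification, not a prescription of the paper:

```python
import numpy as np

def gamma_hyperparams_from_past(past_estimates):
    """Moment-matching sketch: if a parameter is assigned a Gamma(a, b) prior
    (mean a/b, variance a/b**2), fit (a, b) to estimates from past samples."""
    past = np.asarray(past_estimates, dtype=float)
    m, v = past.mean(), past.var(ddof=1)
    b = m / v          # rate
    a = m * b          # shape
    return a, b
```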
Owing to the progress in device manufacturing over the last decades, physicists and engineers may face difficulties in assessing the lifetime distribution of devices connected in a mixed configuration such as a series-parallel system. To overcome this problem, we have introduced a new distribution called the PGLD. This distribution can describe the lifetime of series-parallel systems in which the number of series subsystems, as well as the number of parallel components within each subsystem, are random variables. The PGLD arises by compounding two discrete distributions (truncated Poisson and geometric) with a continuous mixture distribution, the LD. Some important properties of the PGLD have been investigated, such as the q-th quantile, mode, r-th moment, mean residual lifetime, Bonferroni and Lorenz curves, Rényi and Shannon entropies, and the PDF and CDF of the i-th order statistic. The progressive-stress model has been applied to units connected in a series-parallel structure. The lifetimes of these units under normal stress conditions are assumed to follow the LD, which is a mixture of exponential and gamma distributions. The progressive-stress model was used with stress increasing nonlinearly over time, and the inverse power law model established the relationship between the stress and the scale parameter of the proposed distribution. Based on progressively type-II censored data with binomial removals, two estimation methods were applied to estimate the unknown parameters. The Bayesian estimation was performed using two asymmetric loss functions (LINEX and GE). CIs, symmetric credible intervals, and HPD credible intervals for the unknown parameters were constructed. The numerical results showed that the Bayes estimates performed better than the MLEs. An illustrative example, based on two real data sets, demonstrated the superiority of the proposed distribution over four other distributions.
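The following Monte Carlo sketch illustrates the series-parallel mechanism that motivates the PGLD: a zero-truncated Poisson number of subsystems in series, a geometric number of parallel components inside each subsystem, and component lifetimes drawn from the LD via its exponential–gamma mixture representation. The parameter names and the particular sampling conventions (NumPy's geometric variate starting at 1, rate parameterization of the LD) are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def lindley(theta, size, rng):
    """Lindley(theta) lifetimes via the mixture representation:
    Exp(theta) with probability theta/(1+theta), Gamma(shape=2, rate=theta) otherwise."""
    is_exp = rng.random(size) < theta / (1.0 + theta)
    exp_part = rng.exponential(1.0 / theta, size)
    gamma_part = rng.gamma(2.0, 1.0 / theta, size)
    return np.where(is_exp, exp_part, gamma_part)

def zero_truncated_poisson(lam, rng):
    """Number of series subsystems: Poisson(lam) conditioned on being >= 1."""
    k = rng.poisson(lam)
    while k == 0:
        k = rng.poisson(lam)
    return k

def system_lifetime(lam, q, theta, rng):
    """Failure time of one series-parallel system: the system fails when the first
    subsystem fails (min), and a subsystem fails only when all of its parallel
    components have failed (max)."""
    subsystem_lifetimes = []
    for _ in range(zero_truncated_poisson(lam, rng)):
        n_components = rng.geometric(q)                 # >= 1 parallel components
        subsystem_lifetimes.append(lindley(theta, n_components, rng).max())
    return min(subsystem_lifetimes)

rng = np.random.default_rng(2024)
lifetimes = np.array([system_lifetime(1.5, 0.4, 0.8, rng) for _ in range(10_000)])
print(lifetimes.mean(), lifetimes.std())
```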
Finally, the features of and motivations for the PGLD can be summarized as follows:
(1) The CDF of the PGLD has a closed form, which simplifies its use.
(2) The four parameters in the CDF of the PGLD give it the flexibility to fit a wide variety of data.
(3) The CDF of the PGLD includes the CDFs of the GLD, PLD, and LD as special cases.
(4) The PGLD can represent the failure time of a series-parallel system. This is one of the main motivations for introducing the PGLD.
(5) The HRF of the PGLD has a unimodal shape, which gives it more flexibility to fit and analyze data arising from both increasing and decreasing hazard rates.
(6) The PGLD can represent non-stationary data. This feature may help the experimenter forecast how products would behave in different environments.
(7) The PGLD fits the data better than several other distributions, such as the PLD, GLD, ELD, and LD (the goodness-of-fit criteria behind this comparison are sketched after this list).
(8) Other distributions may emerge from Theorem 2.1 by choosing continuous distributions other than the LD.
Based on the above features and motivations, we hope that the PGLD will attract more attention from physicists and engineers in the near future.
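For reference, the sketch below shows the goodness-of-fit quantities used in that comparison (K-S, AIC, CAIC, BIC), assuming the usual definitions: AIC = -2ℓ + 2k, BIC = -2ℓ + k ln n, and CAIC taken here as the small-sample corrected AIC. These conventions, and the callable-CDF usage of SciPy's kstest, are our assumptions for illustration; pgld_cdf and mle_params in the comment are hypothetical names.

```python
import numpy as np
from scipy.stats import kstest

def information_criteria(loglik, k, n):
    """AIC, corrected AIC (denoted CAIC here), and BIC for a model with
    k fitted parameters and n observations."""
    aic = -2.0 * loglik + 2.0 * k
    caic = aic + 2.0 * k * (k + 1) / (n - k - 1)   # small-sample correction
    bic = -2.0 * loglik + k * np.log(n)
    return aic, caic, bic

def ks_statistic(data, fitted_cdf):
    """Kolmogorov-Smirnov distance and p-value between the data and a fitted
    CDF supplied as a callable, e.g. lambda x: pgld_cdf(x, *mle_params)."""
    result = kstest(data, fitted_cdf)
    return result.statistic, result.pvalue
```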
The following points may be investigated in future work:
(1) The inference procedure may be implemented under a more general progressive censoring scheme, such as hybrid progressive censoring.
(2) Other estimation methods, such as the method of moments and probability weighted moments, may be explored.
(3) Other types of ALTs, such as the step-stress ALT, may be considered.
(4) Prediction of future order statistics based on the PGLD may be investigated.
The authors would like to thank the Deanship of Scientific Research at Umm Al-Qura University for supporting this work by grant code (19-SCI-1-03-0009). They are also grateful to the editor and reviewers for their valuable comments and suggestions, which improved the paper.
The authors declare that they have no conflicts of interest regarding the publication of this article.
[1] R. Khalil, M. Al Horani, A. Yousef, M. Sababheh, A new definition of fractional derivative, J. Comput. Appl. Math., 264 (2014), 65–70. https://doi.org/10.1016/j.cam.2014.01.002
[2] S. Khirsariya, S. Rao, J. Chauhan, Solution of fractional modified Kawahara equation: a semi-analytic approach, Math. Appl. Sci. Eng., 4 (2023), 249–350. https://doi.org/10.5206/mase/16369
[3] S. R. Khirsariya, J. P. Chauhan, S. B. Rao, A robust computational analysis of residual power series involving general transform to solve fractional differential equations, Math. Comput. Simul., 216 (2024), 168–186. https://doi.org/10.1016/j.matcom.2023.09.007
[4] S. R. Khirsariya, S. B. Rao, J. P. Chauhan, Semi-analytic solution of time-fractional Korteweg-de Vries equation using fractional residual power series method, Results Nonlinear Anal., 5 (2022), 222–234.
[5] L. K. B. Kuroda, A. V. Gomes, R. Tavoni, P. F. de Arruda Mancera, N. Varalta, R. de Figueiredo Camargo, Unexpected behavior of Caputo fractional derivative, Comp. Appl. Math., 36 (2017), 1173–1183. https://doi.org/10.1007/s40314-015-0301-9
[6] R. Almeida, N. Bastos, M. Teresa, T. Monteiro, A prelude to the fractional calculus applied to tumor dynamic, Math. Methods Appl. Sci., 39 (2016), 4846–4855.
[7] J. Losada, J. J. Nieto, Properties of a new fractional derivative without singular kernel, Progr. Fract. Differ. Appl., 1 (2015), 87–92.
[8] M. Caputo, M. Fabrizio, A new definition of fractional derivative without singular kernel, Progr. Fract. Differ. Appl., 1 (2015), 73–85.
[9] S. Alshammari, M. M. Al-Sawalha, R. Shah, Approximate analytical methods for a fractional-order nonlinear system of Jaulent–Miodek equation with energy-dependent Schrodinger potential, Fractal Fract., 7 (2023), 140. https://doi.org/10.3390/fractalfract7020140
[10] A. Atangana, J. F. Gomez-Aguilar, A new derivative with normal distribution kernel: theory, methods and applications, Phys. A: Stat. Mech. Appl., 476 (2017), 1–14. https://doi.org/10.1016/j.physa.2017.02.016
[11] M. Alesemi, N. Iqbal, M. S. Abdo, Novel investigation of fractional-order Cauchy-reaction diffusion equation involving Caputo-Fabrizio operator, J. Funct. Spaces, 2022 (2022), 1–14. https://doi.org/10.1155/2022/4284060
[12] N. Iqbal, H. Yasmin, A. Rezaiguia, J. Kafle, A. O. Almatroud, T. S. Hassan, Analysis of the fractional-order Kaup-Kupershmidt equation via novel transforms, J. Math., 2021 (2021), 1–13. https://doi.org/10.1155/2021/2567927
[13] N. Iqbal, H. Yasmin, A. Ali, A. Bariq, M. M. Al-Sawalha, W. W. Mohammed, Numerical methods for fractional-order Fornberg-Whitham equations in the sense of Atangana-Baleanu derivative, J. Funct. Spaces, 2021 (2021), 1–10. https://doi.org/10.1155/2021/2197247
[14] P. Sunthrayuth, A. M. Zidan, S. W. Yao, R. Shah, M. Inc, The comparative study for solving fractional-order Fornberg-Whitham equation via \rho-Laplace transform, Symmetry, 13 (2021), 784. https://doi.org/10.3390/sym13050784
[15] T. Kakutani, H. Ono, Weak non-linear hydromagnetic waves in a cold collision-free plasma, J. Phys. Soc. Jpn., 26 (1969), 1305–1318. https://doi.org/10.1143/JPSJ.26.1305
[16] R. Shah, H. Khan, D. Baleanu, P. Kumam, M. Arif, A novel method for the analytical solution of fractional Zakharov-Kuznetsov equations, Adv. Differ. Equ., 2019 (2019), 1–14. https://doi.org/10.1186/s13662-019-2441-5
[17] A. Goswami, J. Singh, D. Kumar, Numerical simulation of fifth order KdV equations occurring in magneto-acoustic waves, Ain Shams Eng. J., 9 (2018), 2265–2273. https://doi.org/10.1016/j.asej.2017.03.004
[18] M. O. Miansari, M. E. Miansari, A. Barari, D. D. Ganji, Application of He's variational iteration method to nonlinear Helmholtz and fifth-order KdV equations, J. Appl. Math. Stat. Inf. (JAMSI), 5 (2009).
[19] S. Abbasbandy, F. S. Zakaria, Soliton solutions for the fifth-order KdV equation with the homotopy analysis method, Nonlinear Dyn., 51 (2008), 83–87. https://doi.org/10.1007/s11071-006-9193-y
[20] A. M. Wazwaz, Solitons and periodic solutions for the fifth-order KdV equation, Appl. Math. Lett., 19 (2006), 1162–1167. https://doi.org/10.1016/j.aml.2005.07.014
[21] M. T. Darvishi, F. Khani, Numerical and explicit solutions of the fifth-order Korteweg-de Vries equations, Chaos, Solitons Fract., 39 (2009), 2484–2490. https://doi.org/10.1016/j.chaos.2007.07.034
[22] I. Ahmad, H. Seno, An epidemic dynamics model with limited isolation capacity, Theory Biosci., 142 (2023), 259–273. https://doi.org/10.1007/s12064-023-00399-9
[23] H. Khan, R. Shah, P. Kumam, D. Baleanu, M. Arif, Laplace decomposition for solving nonlinear system of fractional order partial differential equations, Adv. Differ. Equ., 2020 (2020), 1–18. https://doi.org/10.1186/s13662-020-02839-y
[24] S. Zhang, Application of Exp-function method to a KdV equation with variable coefficients, Phys. Lett. A, 365 (2007), 448–453. https://doi.org/10.1016/j.physleta.2007.02.004
[25] O. A. Arqub, Series solution of fuzzy differential equations under strongly generalized differentiability, J. Adv. Res. Appl. Math., 5 (2013), 31–52.
[26] O. Abu Arqub, Z. Abo-Hammour, R. Al-Badarneh, S. Momani, A reliable analytical method for solving higher-order initial value problems, Discrete Dyn. Nat. Soc., 2013 (2013), 673829. https://doi.org/10.1155/2013/673829
[27] J. Zhang, Z. Wei, L. Li, C. Zhou, Least-squares residual power series method for the time-fractional differential equations, Complexity, 2019 (2019), 1–15. https://doi.org/10.1155/2019/6159024
[28] I. Jaradat, M. Alquran, R. Abdel-Muhsen, An analytical framework of 2D diffusion, wave-like, telegraph, and Burgers' models with twofold Caputo derivatives ordering, Nonlinear Dyn., 93 (2018), 1911–1922. https://doi.org/10.1007/s11071-018-4297-8
[29] Y. Xie, I. Ahmad, T. I. S. Ikpe, E. F. Sofia, H. Seno, What influence could the acceptance of visitors cause on the epidemic dynamics of a reinfectious disease?: a mathematical model, Acta Biotheor., 72 (2024), 3. https://doi.org/10.1007/s10441-024-09478-w
[30] S. Mukhtar, M. Sohaib, I. Ahmad, A numerical approach to solve volume-based batch crystallization model with fines dissolution unit, Processes, 7 (2019), 453. https://doi.org/10.3390/pr7070453
[31] M. F. Zhang, Y. Q. Liu, X. S. Zhou, Efficient homotopy perturbation method for fractional non-linear equations using Sumudu transform, Therm. Sci., 19 (2015), 1167–1171.
[32] M. I. Liaqat, S. Etemad, S. Rezapour, C. Park, A novel analytical Aboodh residual power series method for solving linear and nonlinear time-fractional partial differential equations with variable coefficients, AIMS Math., 7 (2022), 16917–16948. https://doi.org/10.3934/math.2022929
[33] M. I. Liaqat, A. Akgul, H. Abu-Zinadah, Analytical investigation of some time-fractional Black-Scholes models by the Aboodh residual power series method, Mathematics, 11 (2023), 276. https://doi.org/10.3390/math11020276
[34] G. O. Ojo, N. I. Mahmudov, Aboodh transform iterative method for spatial diffusion of a biological population with fractional-order, Mathematics, 9 (2021), 155. https://doi.org/10.3390/math9020155
[35] M. A. Awuya, G. O. Ojo, N. I. Mahmudov, Solution of space-time fractional differential equations using Aboodh transform iterative method, J. Math., 2022 (2022), 4861588. https://doi.org/10.1155/2022/4861588
[36] M. A. Awuya, D. Subasi, Aboodh transform iterative method for solving fractional partial differential equation with Mittag-Leffler kernel, Symmetry, 13 (2021), 2055. https://doi.org/10.3390/sym13112055
[37] K. S. Aboodh, The new integral transform 'Aboodh transform', Global J. Pure Appl. Math., 9 (2013), 35–43.
[38] S. Aggarwal, R. Chauhan, A comparative study of Mohand and Aboodh transforms, Int. J. Res. Adv. Technol., 7 (2019), 520–529.
[39] M. E. Benattia, K. Belghaba, Application of the Aboodh transform for solving fractional delay differential equations, Univ. J. Math. Appl., 3 (2020), 93–101. https://doi.org/10.32323/ujma.702033
[40] B. B. Delgado, J. E. Macias-Diaz, On the general solutions of some non-homogeneous Div-curl systems with Riemann-Liouville and Caputo fractional derivatives, Fractal Fract., 5 (2021), 117. https://doi.org/10.3390/fractalfract5030117
[41] S. Alshammari, M. Al-Smadi, I. Hashim, M. A. Alias, Residual power series technique for simulating fractional Bagley-Torvik problems emerging in applied physics, Appl. Sci., 9 (2019), 5029. https://doi.org/10.3390/app9235029
The first data set | |||||||||
Model | \hat{\alpha} | \hat{\gamma} | \hat{\theta} | \hat{\lambda} | K-S | p-value | AIC | CAIC | BIC |
PGLD | 1.40525 | 1.34963 | 0.94075 | 0.76069 | 0.109102 | 0.35816 | 522.134 | 522.731 | 531.241 |
PLD | 1.12565 | 5.44626 | --- | 0.07480 | 0.178738 | 0.02010 | 525.967 | 526.320 | 532.797 |
GLD | 1.00741 | 2.49201 | 0.60639 | --- | 0.111083 | 0.33669 | 525.289 | 525.642 | 532.119 |
ELD | 0.66113 | 1.19104 | --- | 1.59821 | 0.165565 | 0.03861 | 536.672 | 537.025 | 543.502 |
LD | 1.10933 | 4.34354 | --- | --- | 0.233859 | 0.00076 | 529.953 | 530.127 | 534.506 |
The second data set | |||||||||
Model | \hat{\alpha} | \hat{\gamma} | \hat{\theta} | \hat{\lambda} | K-S | p-value | AIC | CAIC | BIC |
PGLD | 2.49409 | 0.63477 | 0.99986 | 0.20566 | 0.05039 | 0.99971 | 402.596 | 403.526 | 410.08 |
PLD | 149.855 | 3788.06 | --- | 0.72799 | 0.11165 | 0.58773 | 412.330 | 412.875 | 417.943 |
GLD | 1.42011 | 5.41074 | 0.84701 | --- | 0.21077 | 0.02811 | 425.597 | 426.143 | 431.211 |
ELD | 0.27741 | 2.35554 | --- | 0.79642 | 0.31822 | 0.00012 | 498.599 | 499.145 | 504.213 |
LD | 22.3092 | 552.211 | --- | --- | 0.19730 | 0.04765 | 413.502 | 413.769 | 417.244 |
n_{1} | w_{1} | p=0.15 | p=0.55 | p=0.95 | |||||||||||
. | . | \overline{\hat{\alpha}} | MSE(\hat{\alpha}) | RAB(\hat{\alpha}) | MMSE | \overline{\hat{\alpha}} | MSE(\hat{\alpha}) | RAB(\hat{\alpha}) | MMSE | \overline{\hat{\alpha}} | MSE(\hat{\alpha}) | RAB(\hat{\alpha}) | MMSE | ||
. | . | \overline{\hat{\lambda}} | MSE(\hat{\lambda}) | RAB(\hat{\lambda}) | MRAB | \overline{\hat{\lambda}} | MSE(\hat{\lambda}) | RAB(\hat{\lambda}) | MRAB | \overline{\hat{\lambda}} | MSE(\hat{\lambda}) | RAB(\hat{\lambda}) | MRAB | ||
. | . | \overline{\hat{\theta}} | MSE(\hat{\theta}) | RAB(\hat{\theta}) | \overline{\hat{\theta}} | MSE(\hat{\theta}) | RAB(\hat{\theta}) | \overline{\hat{\theta}} | MSE(\hat{\theta}) | RAB(\hat{\theta}) | |||||
\hbar | N | n_{\hbar} | w_{\hbar} | \overline{\hat{c}} | MSE(\hat{c}) | RAB(\hat{c}) | \overline{\hat{c}} | MSE(\hat{c}) | RAB(\hat{c}) | \overline{\hat{c}} | MSE(\hat{c}) | RAB(\hat{c}) | |||
2 | 80 | 40 | 20 | 1.7692 | 0.4273 | 0.2736 | 0.3766 | 1.7793 | 0.4334 | 0.2855 | 0.4536 | 1.7900 | 0.4439 | 0.2821 | 0.4166 |
40 | 20 | 2.2500 | 1.3743 | 0.3825 | 0.3156 | 2.5275 | 1.7338 | 0.4473 | 0.3599 | 2.4325 | 1.5187 | 0.4291 | 0.3729 | ||
0.8399 | 0.0247 | 0.1060 | 0.8214 | 0.0323 | 0.1297 | 0.8232 | 0.0338 | 0.1322 | |||||||
0.2503 | 0.0567 | 0.8159 | 0.2491 | 0.0684 | 0.9367 | 0.2703 | 0.0868 | 1.0212 | |||||||
32 | 1.7112 | 0.2631 | 0.2218 | 0.4094 | 1.7118 | 0.2594 | 0.2231 | 0.4408 | 1.7260 | 0.2935 | 0.2333 | 0.4465 | |||
32 | 2.4709 | 1.6977 | 0.4517 | 0.3352 | 2.4682 | 1.8436 | 0.4612 | 0.3533 | 2.4738 | 1.8411 | 0.4750 | 0.3548 | |||
0.8411 | 0.0218 | 0.1107 | 0.8308 | 0.0274 | 0.1234 | 0.8205 | 0.0298 | 0.1328 | |||||||
0.2607 | 0.0643 | 0.8917 | 0.2649 | 0.0735 | 0.9590 | 0.2546 | 0.0682 | 0.9331 | |||||||
140 | 70 | 35 | 1.7235 | 0.2732 | 0.2152 | 0.3939 | 1.7104 | 0.2902 | 0.2283 | 0.4677 | 1.6972 | 0.2575 | 0.2121 | 0.4528 | |
70 | 35 | 2.5368 | 1.6200 | 0.4369 | 0.3218 | 2.5238 | 1.9496 | 0.4756 | 0.3508 | 2.5618 | 1.9192 | 0.4820 | 0.3414 | ||
0.8555 | 0.0153 | 0.0934 | 0.8402 | 0.0231 | 0.1128 | 0.8418 | 0.0224 | 0.1137 | |||||||
0.2588 | 0.0612 | 0.8638 | 0.2685 | 0.0756 | 0.9375 | 0.2636 | 0.0650 | 0.8993 | |||||||
56 | 1.7144 | 0.2323 | 0.2087 | 0.4615 | 1.6803 | 0.2312 | 0.2047 | 0.4594 | 1.6890 | 0.2324 | 0.2054 | 0.5033 | |||
56 | 2.5501 | 1.9964 | 0.4725 | 0.3284 | 2.4981 | 1.9889 | 0.4798 | 0.3312 | 2.5337 | 2.1999 | 0.4905 | 0.3413 | |||
0.8505 | 0.0184 | 0.1038 | 0.8517 | 0.0177 | 0.1013 | 0.8474 | 0.0201 | 0.1077 | |||||||
0.2606 | 0.0606 | 0.8572 | 0.2670 | 0.0593 | 0.8702 | 0.2702 | 0.0641 | 0.9027 | |||||||
4 | 80 | 20 | 10 | 1.7224 | 0.3564 | 0.2653 | 0.3963 | 1.7933 | 0.4206 | 0.2728 | 0.4094 | 1.7689 | 0.4255 | 0.2812 | 0.4622 |
20 | 10 | 2.1620 | 1.5467 | 0.4146 | 0.3086 | 2.4281 | 1.5124 | 0.4312 | 0.3627 | 2.5324 | 1.7754 | 0.4676 | 0.3609 | ||
20 | 10 | 0.8276 | 0.0223 | 0.1074 | 0.8293 | 0.0285 | 0.1225 | 0.8083 | 0.0379 | 0.1416 | |||||
20 | 10 | 0.2309 | 0.0559 | 0.7555 | 0.2638 | 0.0856 | 0.9872 | 0.2405 | 0.0723 | 0.9142 | |||||
16 | 1.7504 | 0.3104 | 0.2411 | 0.3767 | 1.7214 | 0.2853 | 0.2379 | 0.4418 | 1.6927 | 0.2635 | 0.2251 | 0.4439 | |||
16 | 2.4045 | 1.4904 | 0.4204 | 0.3243 | 2.6169 | 1.8321 | 0.4668 | 0.3464 | 2.4949 | 1.8477 | 0.4627 | 0.3608 | |||
16 | 0.8357 | 0.0229 | 0.1121 | 0.8406 | 0.0222 | 0.1121 | 0.8407 | 0.0263 | 0.1165 | ||||||
16 | 0.2460 | 0.0595 | 0.8481 | 0.2565 | 0.0694 | 0.9153 | 0.2862 | 0.0817 | 0.9996 | ||||||
140 | 35 | 18 | 1.6976 | 0.2483 | 0.2171 | 0.3170 | 1.7063 | 0.2604 | 0.2181 | 0.4347 | 1.7152 | 0.2896 | 0.2212 | 0.4394 | |
35 | 18 | 2.3230 | 1.2643 | 0.3762 | 0.2965 | 2.4760 | 1.8169 | 0.4667 | 0.3469 | 2.5477 | 1.8211 | 0.4638 | 0.3350 | ||
35 | 18 | 0.8554 | 0.0137 | 0.0869 | 0.8357 | 0.0240 | 0.1178 | 0.8412 | 0.0219 | 0.1093 | |||||
35 | 18 | 0.2548 | 0.0588 | 0.8022 | 0.2669 | 0.0723 | 0.9320 | 0.2561 | 0.0645 | 0.8805 | |||||
28 | 1.6775 | 0.1812 | 0.1846 | 0.4331 | 1.6684 | 0.1758 | 0.1888 | 0.4781 | 1.6867 | 0.1760 | 0.1864 | 0.4443 | |||
28 | 2.5388 | 1.9152 | 0.4670 | 0.3084 | 2.5493 | 2.1323 | 0.4914 | 0.3330 | 2.4572 | 1.9641 | 0.4774 | 0.3251 | |||
28 | 0.8499 | 0.0179 | 0.0998 | 0.8572 | 0.0176 | 0.0985 | 0.8508 | 0.0179 | 0.1009 | ||||||
28 | 0.2490 | 0.0510 | 0.7907 | 0.2744 | 0.0648 | 0.8863 | 0.2647 | 0.0633 | 0.8609 |
n_{1} | m_{1} | BEL | BEG | |||||||||||||||||
. | . | \nu=-0.5 | \nu=0.5 | \nu=-0.5 | \nu=0.5 | |||||||||||||||
. | . | \overline{\tilde{\alpha}} | MSE(\tilde{\alpha}) | RAB(\tilde{\alpha}) | MMSE | \overline{\tilde{\alpha}} | MSE(\tilde{\alpha}) | RAB(\tilde{\alpha}) | MMSE | \overline{\ddot{\alpha}} | MSE(\ddot{\alpha}) | RAB(\ddot{\alpha}) | MMSE | \overline{\ddot{\alpha}} | MSE(\ddot{\alpha}) | RAB(\ddot{\alpha}) | MMSE | |||
. | . | \overline{\tilde{\lambda}} | MSE(\tilde{\lambda}) | RAB(\tilde{\lambda}) | MRAB | \overline{\tilde{\lambda}} | MSE(\tilde{\lambda}) | RAB(\tilde{\lambda}) | MRAB | \overline{\ddot{\lambda}} | MSE(\ddot{\lambda}) | RAB(\ddot{\lambda}) | MRAB | \overline{\ddot{\lambda}} | MSE(\ddot{\lambda}) | RAB(\ddot{\lambda}) | MRAB | |||
. | . | \overline{\tilde{\theta}} | MSE(\tilde{\theta}) | RAB(\tilde{\theta}) | \overline{\tilde{\theta}} | MSE(\tilde{\theta}) | RAB(\tilde{\theta}) | \overline{\ddot{\theta}} | MSE(\ddot{\theta}) | RAB(\ddot{\theta}) | \overline{\ddot{\theta}} | MSE(\ddot{\theta}) | RAB(\ddot{\theta}) | |||||||
\hbar | N | n_{\hbar} | m_{\hbar} | p | \overline{\tilde{c}} | MSE(\tilde{c}) | RAB(\tilde{c}) | \overline{\tilde{c}} | MSE(\tilde{c}) | RAB(\tilde{c}) | \overline{\ddot{c}} | MSE(\ddot{c}) | RAB(\ddot{c}) | \overline{\ddot{c}} | MSE(\ddot{c}) | RAB(\ddot{c}) | ||||
2 | 80 | 40 | 20 | 0.15 | 1.6520 | 0.1912 | 0.1819 | 0.0505 | 1.5559 | 0.1342 | 0.1819 | 0.0369 | 1.5742 | 0.1662 | 0.0161 | 0.0448 | 1.5138 | 0.1567 | 0.0539 | 0.0433 |
40 | 20 | 2.2385 | 0.0042 | 0.0268 | 0.0790 | 2.2262 | 0.0058 | 0.0321 | 0.0819 | 2.2296 | 0.0053 | 0.0306 | 0.0356 | 2.2241 | 0.0061 | 0.0330 | 0.0593 | |||
0.8493 | 0.0064 | 0.0675 | 0.8444 | 0.0073 | 0.0723 | 0.8432 | 0.0076 | 0.0631 | 0.8334 | 0.0098 | 0.0740 | |||||||||
0.1990 | 0.0002 | 0.0398 | 0.1970 | 0.0002 | 0.0414 | 0.1935 | 0.0003 | 0.0325 | 0.1847 | 0.0004 | 0.0764 | |||||||||
0.55 | 1.6766 | 0.1464 | 0.1786 | 0.0391 | 1.5945 | 0.1086 | 0.1786 | 0.0302 | 1.6098 | 0.1170 | 0.0061 | 0.0323 | 1.5596 | 0.1139 | 0.0253 | 0.0322 | ||||
2.2386 | 0.0040 | 0.0267 | 0.0775 | 2.2264 | 0.0057 | 0.0320 | 0.0801 | 2.2297 | 0.0052 | 0.0306 | 0.0317 | 2.2242 | 0.0060 | 0.0329 | 0.0506 | |||||
0.8556 | 0.0056 | 0.0634 | 0.8511 | 0.0064 | 0.0676 | 0.8502 | 0.0066 | 0.0554 | 0.8418 | 0.0084 | 0.0647 | |||||||||
0.1986 | 0.0003 | 0.0411 | 0.1966 | 0.0002 | 0.0420 | 0.1930 | 0.0002 | 0.0349 | 0.1841 | 0.0004 | 0.0796 | |||||||||
0.95 | 1.6936 | 0.2794 | 0.1926 | 0.0728 | 1.5992 | 0.1265 | 0.1926 | 0.0351 | 1.6212 | 0.1806 | 0.0132 | 0.0486 | 1.5665 | 0.1388 | 0.0210 | 0.0387 | ||||
2.2370 | 0.0045 | 0.0274 | 0.0833 | 2.2247 | 0.0062 | 0.0327 | 0.0856 | 2.2281 | 0.0057 | 0.0313 | 0.0315 | 2.2226 | 0.0065 | 0.0337 | 0.0482 | |||||
0.8545 | 0.0058 | 0.0638 | 0.8501 | 0.0066 | 0.0680 | 0.8491 | 0.0069 | 0.0566 | 0.8406 | 0.0087 | 0.0660 | |||||||||
0.2009 | 0.0015 | 0.0495 | 0.1987 | 0.0012 | 0.0491 | 0.1950 | 0.0011 | 0.0250 | 0.1856 | 0.0007 | 0.0722 | |||||||||
32 | 0.15 | 1.6671 | 0.1794 | 0.1567 | 0.0477 | 1.6099 | 0.1384 | 0.1567 | 0.0379 | 1.6220 | 0.1559 | 0.0137 | 0.0422 | 1.5873 | 0.1452 | 0.0079 | 0.0400 | |||
32 | 2.2369 | 0.0050 | 0.0274 | 0.0744 | 2.2245 | 0.0067 | 0.0328 | 0.0765 | 2.2279 | 0.0062 | 0.0313 | 0.0278 | 2.2224 | 0.0071 | 0.0337 | 0.0395 | ||||
0.8637 | 0.0041 | 0.0533 | 0.8607 | 0.0046 | 0.0561 | 0.8601 | 0.0047 | 0.0443 | 0.8549 | 0.0057 | 0.0501 | |||||||||
0.2012 | 0.0022 | 0.0603 | 0.1992 | 0.0021 | 0.0605 | 0.1956 | 0.0020 | 0.0218 | 0.1867 | 0.0019 | 0.0663 | |||||||||
0.55 | 1.6684 | 0.0972 | 0.1476 | 0.0265 | 1.6131 | 0.0793 | 0.1476 | 0.0225 | 1.6235 | 0.0837 | 0.0147 | 0.0235 | 1.5901 | 0.0810 | 0.0062 | 0.0233 | ||||
2.2386 | 0.0041 | 0.0267 | 0.0697 | 2.2263 | 0.0058 | 0.0321 | 0.0717 | 2.2297 | 0.0053 | 0.0306 | 0.0297 | 2.2241 | 0.0061 | 0.0330 | 0.0409 | |||||
0.8654 | 0.0040 | 0.0523 | 0.8623 | 0.0045 | 0.0551 | 0.8617 | 0.0046 | 0.0425 | 0.8562 | 0.0056 | 0.0486 | |||||||||
0.1995 | 0.0006 | 0.0522 | 0.1974 | 0.0004 | 0.0520 | 0.1938 | 0.0004 | 0.0311 | 0.1848 | 0.0004 | 0.0760 | |||||||||
0.95 | 1.6736 | 0.1876 | 0.1626 | 0.0493 | 1.6072 | 0.1008 | 0.1626 | 0.0281 | 1.6207 | 0.1193 | 0.0129 | 0.0326 | 1.5849 | 0.1060 | 0.0094 | 0.0297 | ||||
2.2375 | 0.0044 | 0.0272 | 0.0756 | 2.2250 | 0.0061 | 0.0326 | 0.0777 | 2.2285 | 0.0056 | 0.0311 | 0.0284 | 2.2229 | 0.0065 | 0.0335 | 0.0409 | |||||
0.8645 | 0.0041 | 0.0546 | 0.8614 | 0.0046 | 0.0576 | 0.8608 | 0.0047 | 0.0436 | 0.8553 | 0.0057 | 0.0497 | |||||||||
0.2004 | 0.0009 | 0.0583 | 0.1984 | 0.0008 | 0.0581 | 0.1948 | 0.0008 | 0.0259 | 0.1858 | 0.0006 | 0.0708 | |||||||||
2 | 140 | 70 | 35 | 0.15 | 1.6590 | 0.1425 | 0.1527 | 0.0379 | 1.5986 | 0.0848 | 0.1527 | 0.0239 | 1.6106 | 0.0964 | 0.0066 | 0.0267 | 1.5760 | 0.0870 | 0.0150 | 0.0248 |
70 | 35 | 2.2388 | 0.0042 | 0.0266 | 0.0732 | 2.2264 | 0.0059 | 0.0320 | 0.0750 | 2.2298 | 0.0054 | 0.0305 | 0.0261 | 2.2242 | 0.0063 | 0.0329 | 0.0411 | |||
0.8654 | 0.0037 | 0.0501 | 0.8627 | 0.0040 | 0.0527 | 0.8622 | 0.0041 | 0.0420 | 0.8575 | 0.0050 | 0.0472 | |||||||||
0.2006 | 0.0011 | 0.0634 | 0.1985 | 0.0010 | 0.0628 | 0.1950 | 0.0010 | 0.0251 | 0.1861 | 0.0009 | 0.0694 | |||||||||
0.55 | 1.6450 | 0.1210 | 0.1422 | 0.0325 | 1.5941 | 0.0912 | 0.1422 | 0.0256 | 1.6039 | 0.0985 | 0.0025 | 0.0273 | 1.5735 | 0.0942 | 0.0166 | 0.0266 | ||||
2.2374 | 0.0046 | 0.0272 | 0.0697 | 2.2249 | 0.0063 | 0.0326 | 0.0717 | 2.2283 | 0.0058 | 0.0312 | 0.0255 | 2.2228 | 0.0066 | 0.0336 | 0.0420 | |||||
0.8664 | 0.0035 | 0.0486 | 0.8637 | 0.0038 | 0.0511 | 0.8632 | 0.0039 | 0.0409 | 0.8586 | 0.0047 | 0.0460 | |||||||||
0.2000 | 0.0012 | 0.0606 | 0.1980 | 0.0010 | 0.0606 | 0.1945 | 0.0010 | 0.0276 | 0.1857 | 0.0010 | 0.0717 | |||||||||
0.95 | 1.6732 | 0.1466 | 0.1516 | 0.0392 | 1.6171 | 0.1051 | 0.1516 | 0.0292 | 1.6285 | 0.1147 | 0.0178 | 0.0315 | 1.5960 | 0.1067 | 0.0025 | 0.0298 | ||||
2.2373 | 0.0045 | 0.0273 | 0.0743 | 2.2249 | 0.0062 | 0.0327 | 0.0758 | 2.2283 | 0.0057 | 0.0312 | 0.0279 | 2.2227 | 0.0065 | 0.0336 | 0.0381 | |||||
0.8660 | 0.0038 | 0.0504 | 0.8631 | 0.0042 | 0.0530 | 0.8626 | 0.0043 | 0.0416 | 0.8577 | 0.0052 | 0.0470 | |||||||||
0.2018 | 0.0018 | 0.0681 | 0.1995 | 0.0013 | 0.0660 | 0.1958 | 0.0012 | 0.0211 | 0.1861 | 0.0007 | 0.0693 | |||||||||
56 | 0.15 | 1.6251 | 0.0461 | 0.1064 | 0.0135 | 1.5991 | 0.0443 | 0.1064 | 0.0135 | 1.6038 | 0.0451 | 0.0024 | 0.0136 | 1.5871 | 0.0459 | 0.0081 | 0.0140 | |||
56 | 2.2382 | 0.0047 | 0.0275 | 0.0676 | 2.2258 | 0.0061 | 0.0327 | 0.0693 | 2.2292 | 0.0057 | 0.0308 | 0.0286 | 2.2237 | 0.0064 | 0.0332 | 0.0354 | ||||
0.8732 | 0.0026 | 0.0437 | 0.8717 | 0.0027 | 0.0451 | 0.8715 | 0.0028 | 0.0317 | 0.8694 | 0.0031 | 0.0340 | |||||||||
0.1920 | 0.0007 | 0.0926 | 0.1914 | 0.0007 | 0.0929 | 0.1901 | 0.0007 | 0.0497 | 0.1867 | 0.0008 | 0.0665 | |||||||||
0.55 | 1.6321 | 0.0503 | 0.1114 | 0.0144 | 1.6059 | 0.0477 | 0.1114 | 0.0143 | 1.6108 | 0.0486 | 0.0067 | 0.0144 | 1.5943 | 0.0486 | 0.0036 | 0.0147 | ||||
2.2383 | 0.0041 | 0.0268 | 0.0688 | 2.2261 | 0.0058 | 0.0321 | 0.0707 | 2.2295 | 0.0053 | 0.0307 | 0.0280 | 2.2240 | 0.0061 | 0.0330 | 0.0325 | |||||
0.8742 | 0.0027 | 0.0441 | 0.8727 | 0.0028 | 0.0455 | 0.8725 | 0.0029 | 0.0305 | 0.8703 | 0.0032 | 0.0330 | |||||||||
0.1931 | 0.0007 | 0.0930 | 0.1925 | 0.0007 | 0.0937 | 0.1912 | 0.0007 | 0.0439 | 0.1879 | 0.0008 | 0.0603 | |||||||||
0.95 | 1.6471 | 0.0524 | 0.1129 | 0.0150 | 1.6211 | 0.0491 | 0.1129 | 0.0146 | 1.6260 | 0.0501 | 0.0163 | 0.0147 | 1.6098 | 0.0497 | 0.0061 | 0.0149 | ||||
2.2388 | 0.0041 | 0.0266 | 0.0692 | 2.2266 | 0.0057 | 0.0319 | 0.0708 | 2.2299 | 0.0052 | 0.0305 | 0.0310 | 2.2245 | 0.0060 | 0.0328 | 0.0339 | |||||
0.8754 | 0.0026 | 0.0438 | 0.8739 | 0.0028 | 0.0451 | 0.8737 | 0.0028 | 0.0292 | 0.8716 | 0.0031 | 0.0316 | |||||||||
0.1924 | 0.0009 | 0.0935 | 0.1917 | 0.0008 | 0.0934 | 0.1904 | 0.0008 | 0.0481 | 0.1870 | 0.0008 | 0.0652 | |||||||||
4 | 80 | 20 | 10 | 0.15 | 1.6816 | 0.2985 | 0.2183 | 0.0779 | 1.5568 | 0.1760 | 0.2183 | 0.0480 | 1.5815 | 0.2294 | 0.0116 | 0.0613 | 1.5056 | 0.2066 | 0.0590 | 0.0566 |
20 | 10 | 2.2379 | 0.0047 | 0.0270 | 0.0922 | 2.2255 | 0.0064 | 0.0324 | 0.0951 | 2.2289 | 0.0059 | 0.0309 | 0.0344 | 2.2234 | 0.0067 | 0.0333 | 0.0617 | |||
20 | 10 | 0.8448 | 0.0075 | 0.0751 | 0.8391 | 0.0087 | 0.0808 | 0.8376 | 0.0091 | 0.0693 | 0.8255 | 0.0122 | 0.0827 | |||||||
20 | 10 | 0.2006 | 0.0010 | 0.0484 | 0.1986 | 0.0009 | 0.0487 | 0.1949 | 0.0009 | 0.0256 | 0.1856 | 0.0008 | 0.0718 | |||||||
0.55 | 1.6668 | 0.1261 | 0.1730 | 0.0340 | 1.5845 | 0.1048 | 0.1730 | 0.0292 | 1.5992 | 0.1108 | 0.0005 | 0.0307 | 1.5476 | 0.1108 | 0.0328 | 0.0314 | ||||
2.2381 | 0.0041 | 0.0269 | 0.0753 | 2.2259 | 0.0057 | 0.0322 | 0.0780 | 2.2292 | 0.0053 | 0.0308 | 0.0303 | 2.2237 | 0.0061 | 0.0332 | 0.0523 | |||||
0.8549 | 0.0055 | 0.0628 | 0.8505 | 0.0063 | 0.0671 | 0.8495 | 0.0065 | 0.0561 | 0.8411 | 0.0083 | 0.0654 | |||||||||
0.1987 | 0.0001 | 0.0385 | 0.1968 | 0.0001 | 0.0399 | 0.1932 | 0.0001 | 0.0340 | 0.1844 | 0.0003 | 0.0779 | |||||||||
0.95 | 1.6906 | 0.2507 | 0.1933 | 0.0656 | 1.5998 | 0.1302 | 0.1933 | 0.0360 | 1.6194 | 0.1676 | 0.0121 | 0.0453 | 1.5667 | 0.1410 | 0.0208 | 0.0393 | ||||
2.2378 | 0.0044 | 0.0271 | 0.0835 | 2.2254 | 0.0061 | 0.0324 | 0.0858 | 2.2288 | 0.0056 | 0.0310 | 0.0302 | 2.2233 | 0.0064 | 0.0334 | 0.0471 | |||||
0.8568 | 0.0059 | 0.0631 | 0.8524 | 0.0066 | 0.0670 | 0.8515 | 0.0069 | 0.0539 | 0.8431 | 0.0087 | 0.0632 | |||||||||
0.2011 | 0.0016 | 0.0507 | 0.1989 | 0.0013 | 0.0504 | 0.1952 | 0.0012 | 0.0238 | 0.1858 | 0.0009 | 0.0709 | |||||||||
16 | 0.15 | 1.6605 | 0.1649 | 0.1614 | 0.0439 | 1.5951 | 0.1106 | 0.1614 | 0.0308 | 1.6083 | 0.1254 | 0.0052 | 0.0344 | 1.5705 | 0.1164 | 0.0185 | 0.0326 | |||
16 | 2.2374 | 0.0048 | 0.0272 | 0.0770 | 2.2249 | 0.0066 | 0.0327 | 0.0790 | 2.2283 | 0.0061 | 0.0312 | 0.0271 | 2.2227 | 0.0069 | 0.0336 | 0.0442 | ||||
16 | 0.8620 | 0.0042 | 0.0546 | 0.8587 | 0.0047 | 0.0577 | 0.8581 | 0.0049 | 0.0466 | 0.8523 | 0.0060 | 0.0530 | ||||||||
16 | 0.2007 | 0.0016 | 0.0648 | 0.1986 | 0.0014 | 0.0643 | 0.1949 | 0.0014 | 0.0255 | 0.1856 | 0.0012 | 0.0718 | ||||||||
0.55 | 1.6862 | 0.2183 | 0.1632 | 0.0570 | 1.6170 | 0.1042 | 0.1632 | 0.0290 | 1.6330 | 0.1422 | 0.0206 | 0.0384 | 1.5955 | 0.1136 | 0.0028 | 0.0316 | ||||
2.2372 | 0.0046 | 0.0273 | 0.0761 | 2.2246 | 0.0063 | 0.0328 | 0.0781 | 2.2280 | 0.0058 | 0.0313 | 0.0315 | 2.2224 | 0.0067 | 0.0337 | 0.0406 | |||||
0.8645 | 0.0040 | 0.0524 | 0.8614 | 0.0045 | 0.0552 | 0.8609 | 0.0046 | 0.0435 | 0.8556 | 0.0056 | 0.0493 | |||||||||
0.1997 | 0.0011 | 0.0616 | 0.1975 | 0.0009 | 0.0612 | 0.1939 | 0.0008 | 0.0306 | 0.1847 | 0.0007 | 0.0765 | |||||||||
0.95 | 1.6709 | 0.1323 | 0.1523 | 0.0353 | 1.6149 | 0.1007 | 0.1523 | 0.0279 | 1.6261 | 0.1143 | 0.0163 | 0.0312 | 1.5927 | 0.1076 | 0.0046 | 0.0300 | ||||
2.2373 | 0.0044 | 0.0273 | 0.0709 | 2.2249 | 0.0061 | 0.0327 | 0.0732 | 2.2283 | 0.0056 | 0.0312 | 0.0314 | 2.2227 | 0.0065 | 0.0336 | 0.0415 | |||||
0.8652 | 0.0040 | 0.0512 | 0.8621 | 0.0045 | 0.0540 | 0.8615 | 0.0046 | 0.0427 | 0.8562 | 0.0056 | 0.0486 | |||||||||
0.1985 | 0.0005 | 0.0529 | 0.1965 | 0.0004 | 0.0537 | 0.1929 | 0.0004 | 0.0353 | 0.1841 | 0.0005 | 0.0794 | |||||||||
4 | 140 | 35 | 18 | 0.15 | 1.6763 | 0.2148 | 0.1731 | 0.0564 | 1.6019 | 0.1085 | 0.1731 | 0.0303 | 1.6175 | 0.1303 | 0.0109 | 0.0356 | 1.5767 | 0.1137 | 0.0145 | 0.0318 |
35 | 18 | 2.2362 | 0.0048 | 0.0278 | 0.0825 | 2.2236 | 0.0067 | 0.0332 | 0.0841 | 2.2270 | 0.0061 | 0.0317 | 0.0269 | 2.2213 | 0.0071 | 0.0342 | 0.0416 | |||
35 | 18 | 0.8644 | 0.0041 | 0.0524 | 0.8615 | 0.0045 | 0.0551 | 0.8609 | 0.0046 | 0.0434 | 0.8559 | 0.0056 | 0.0491 | |||||||
35 | 18 | 0.2016 | 0.0022 | 0.0768 | 0.1993 | 0.0017 | 0.0750 | 0.1957 | 0.0016 | 0.0216 | 0.1863 | 0.0010 | 0.0685 | |||||||
0.55 | 1.6899 | 0.1523 | 0.1532 | 0.0403 | 1.6366 | 0.0997 | 0.1532 | 0.0277 | 1.6488 | 0.1222 | 0.0305 | 0.0332 | 1.6177 | 0.1049 | 0.0111 | 0.0292 | ||||
2.2373 | 0.0047 | 0.0273 | 0.0727 | 2.2248 | 0.0064 | 0.0327 | 0.0745 | 2.2282 | 0.0059 | 0.0312 | 0.0304 | 2.2227 | 0.0068 | 0.0336 | 0.0383 | |||||
0.8710 | 0.0032 | 0.0467 | 0.8685 | 0.0035 | 0.0488 | 0.8682 | 0.0035 | 0.0354 | 0.8643 | 0.0042 | 0.0397 | |||||||||
0.2006 | 0.0012 | 0.0638 | 0.1986 | 0.0010 | 0.0634 | 0.1951 | 0.0010 | 0.0246 | 0.1862 | 0.0009 | 0.0688 | |||||||||
0.95 | 1.6470 | 0.1052 | 0.1398 | 0.0284 | 1.5957 | 0.0697 | 0.1398 | 0.0200 | 1.6053 | 0.0744 | 0.0033 | 0.0211 | 1.5757 | 0.0717 | 0.0152 | 0.0208 | ||||
2.2387 | 0.0041 | 0.0266 | 0.0689 | 2.2262 | 0.0058 | 0.0321 | 0.0707 | 2.2296 | 0.0053 | 0.0306 | 0.0264 | 2.2240 | 0.0061 | 0.0330 | 0.0424 | |||||
0.8657 | 0.0036 | 0.0495 | 0.8630 | 0.0040 | 0.0520 | 0.8625 | 0.0041 | 0.0417 | 0.8580 | 0.0048 | 0.0466 | |||||||||
0.1997 | 0.0007 | 0.0596 | 0.1976 | 0.0006 | 0.0589 | 0.1940 | 0.0005 | 0.0301 | 0.1850 | 0.0005 | 0.0749 | |||||||||
28 | 0.15 | 1.6453 | 0.0505 | 0.1087 | 0.0144 | 1.6178 | 0.0469 | 0.1087 | 0.0140 | 1.6230 | 0.0480 | 0.0143 | 0.0141 | 1.6058 | 0.0475 | 0.0036 | 0.0143 | |||
28 | 2.2397 | 0.0039 | 0.0262 | 0.0682 | 2.2273 | 0.0056 | 0.0316 | 0.0701 | 2.2307 | 0.0051 | 0.0301 | 0.0312 | 2.2252 | 0.0059 | 0.0325 | 0.0339 | ||||
28 | 0.8757 | 0.0024 | 0.0420 | 0.8742 | 0.0026 | 0.0433 | 0.8740 | 0.0026 | 0.0288 | 0.8720 | 0.0029 | 0.0311 | ||||||||
28 | 0.1915 | 0.0008 | 0.0961 | 0.1910 | 0.0008 | 0.0968 | 0.1897 | 0.0008 | 0.0516 | 0.1863 | 0.0009 | 0.0683 | ||||||||
0.55 | 1.6337 | 0.0485 | 0.1090 | 0.0139 | 1.6079 | 0.0459 | 0.1090 | 0.0137 | 1.6127 | 0.0468 | 0.0079 | 0.0138 | 1.5964 | 0.0467 | 0.0023 | 0.0141 | ||||
2.2386 | 0.0041 | 0.0267 | 0.0676 | 2.2265 | 0.0057 | 0.0320 | 0.0694 | 2.2298 | 0.0052 | 0.0305 | 0.0280 | 2.2244 | 0.0060 | 0.0329 | 0.0319 | |||||
0.8756 | 0.0024 | 0.0417 | 0.8741 | 0.0026 | 0.0431 | 0.8739 | 0.0027 | 0.0290 | 0.8717 | 0.0030 | 0.0315 | |||||||||
0.1929 | 0.0007 | 0.0930 | 0.1923 | 0.0007 | 0.0935 | 0.1911 | 0.0007 | 0.0447 | 0.1878 | 0.0008 | 0.0611 | |||||||||
0.95 | 1.6310 | 0.0469 | 0.1062 | 0.0135 | 1.6051 | 0.0445 | 0.1062 | 0.0134 | 1.6099 | 0.0453 | 0.0062 | 0.0135 | 1.5935 | 0.0454 | 0.0041 | 0.0138 | ||||
2.2386 | 0.0040 | 0.0267 | 0.0675 | 2.2264 | 0.0057 | 0.0320 | 0.0693 | 2.2298 | 0.0052 | 0.0305 | 0.0294 | 2.2243 | 0.0060 | 0.0329 | 0.0342 | |||||
0.8733 | 0.0026 | 0.0439 | 0.8718 | 0.0028 | 0.0453 | 0.8715 | 0.0028 | 0.0316 | 0.8693 | 0.0031 | 0.0341 | |||||||||
0.1921 | 0.0007 | 0.0934 | 0.1915 | 0.0007 | 0.0938 | 0.1902 | 0.0007 | 0.0491 | 0.1869 | 0.0008 | 0.0657 |
n_{1} | w_{1} | Credible interval | ||||||||
. | . | CI | Symmetric | HPD | ||||||
. | . | AIL(\alpha) | AIL(\alpha) | AIL(\alpha) | ||||||
. | . | AIL(\lambda) | AIL(\lambda) | AIL(\lambda) | ||||||
. | . | AIL(\theta) | AIL(\theta) | AIL(\theta) | ||||||
\hbar | N | n_{\hbar} | w_{\hbar} | p | AIL(c) | AIL(c) | AIL(c) | |||
2 | 80 | 40 | 20 | 0.15 | 3.4146 | 3.0673 | 1.6523 | 0.7144 | 1.5547 | 0.6651 |
40 | 20 | 9.3383 | 0.6109 | 0.5824 | ||||||
1.0622 | 0.3577 | 0.3046 | ||||||||
1.5215 | 0.2369 | 0.2186 | ||||||||
0.55 | 3.2483 | 3.0103 | 1.5301 | 0.6792 | 1.4479 | 0.6347 | ||||
9.0989 | 0.6125 | 0.5825 | ||||||||
1.1630 | 0.3370 | 0.2883 | ||||||||
1.5413 | 0.2372 | 0.2199 | ||||||||
0.95 | 3.2459 | 3.009 | 1.5365 | 0.6822 | 1.4581 | 0.6387 | ||||
9.0252 | 0.6136 | 0.5840 | ||||||||
1.1423 | 0.3394 | 0.2911 | ||||||||
1.6318 | 0.2392 | 0.2214 | ||||||||
32 | 0.15 | 2.7872 | 2.7832 | 1.2546 | 0.5955 | 1.1861 | 0.5567 | |||
32 | 8.7304 | 0.6134 | 0.5833 | |||||||
1.0233 | 0.2796 | 0.2412 | ||||||||
1.3750 | 0.2345 | 0.2164 | ||||||||
0.55 | 2.8056 | 2.7834 | 1.2551 | 0.5965 | 1.1847 | 0.5568 | ||||
8.6419 | 0.6141 | 0.5831 | ||||||||
1.0474 | 0.2795 | 0.2408 | ||||||||
1.4224 | 0.2372 | 0.2188 | ||||||||
0.95 | 2.8068 | 2.765 | 1.2616 | 0.5985 | 1.1935 | 0.5598 | ||||
8.5702 | 0.6150 | 0.5858 | ||||||||
1.0997 | 0.2818 | 0.2425 | ||||||||
1.3484 | 0.2357 | 0.2174 | ||||||||
140 | 70 | 35 | 0.15 | 2.8299 | 2.7787 | 1.2480 | 0.5913 | 1.1732 | 0.5512 | |
70 | 35 | 8.8112 | 0.6169 | 0.5869 | ||||||
0.9287 | 0.2635 | 0.2278 | ||||||||
1.3237 | 0.2366 | 0.2171 | ||||||||
0.55 | 2.6886 | 2.7106 | 1.1792 | 0.5731 | 1.1093 | 0.5345 | ||||
8.5189 | 0.6155 | 0.5842 | ||||||||
0.9782 | 0.2632 | 0.2284 | ||||||||
1.3673 | 0.2346 | 0.2160 | ||||||||
0.95 | 2.7318 | 2.7344 | 1.2057 | 0.5815 | 1.1348 | 0.5427 | ||||
8.6176 | 0.6126 | 0.5838 | ||||||||
0.9898 | 0.2684 | 0.2324 | ||||||||
1.3326 | 0.2392 | 0.2198 | ||||||||
56 | 0.15 | 2.4536 | 2.5300 | 0.8753 | 0.4421 | 0.8257 | 0.4148 | |||
56 | 8.1685 | 0.6119 | 0.5802 | |||||||
0.8347 | 0.1885 | 0.1668 | ||||||||
1.1931 | 0.0926 | 0.0864 | ||||||||
0.55 | 2.3747 | 2.5060 | 0.8788 | 0.4427 | 0.8261 | 0.4143 | ||||
8.0651 | 0.6095 | 0.5785 | ||||||||
0.8548 | 0.1898 | 0.1667 | ||||||||
1.2352 | 0.0927 | 0.0858 | ||||||||
0.95 | 2.3900 | 2.5214 | 0.8759 | 0.4417 | 0.8271 | 0.4148 | ||||
8.1405 | 0.6118 | 0.5818 | ||||||||
0.8549 | 0.1877 | 0.1661 | ||||||||
1.2216 | 0.0913 | 0.0844 | ||||||||
4 | 80 | 20 | 10 | 0.15 | 3.5514 | 2.9879 | 1.8141 | 0.7625 | 1.7046 | 0.7083 |
20 | 10 | 9.0555 | 0.6117 | 0.5818 | ||||||
20 | 10 | 0.9926 | 0.3838 | 0.3253 | ||||||
20 | 10 | 1.3399 | 0.2405 | 0.2214 | ||||||
0.55 | 3.3072 | 3.0253 | 1.5485 | 0.6838 | 1.4686 | 0.6395 | ||||
9.1629 | 0.6124 | 0.5826 | ||||||||
1.1114 | 0.3375 | 0.2875 | ||||||||
1.5448 | 0.2368 | 0.2192 | ||||||||
0.95 | 3.3013 | 3.0365 | 1.5334 | 0.6802 | 1.4531 | 0.6366 | ||||
9.2126 | 0.6123 | 0.5846 | ||||||||
1.2443 | 0.3335 | 0.2853 | ||||||||
1.4244 | 0.2416 | 0.2234 | ||||||||
16 | 0.15 | 3.0018 | 2.8487 | 1.2939 | 0.6085 | 1.2209 | 0.5677 | |||
16 | 8.8203 | 0.6151 | 0.5846 | |||||||
16 | 1.0622 | 0.2872 | 0.2464 | |||||||
16 | 1.3593 | 0.2380 | 0.2187 | |||||||
0.55 | 2.7877 | 2.8122 | 1.2833 | 0.6045 | 1.2083 | 0.5638 | ||||
8.8808 | 0.6181 | 0.5875 | ||||||||
1.0125 | 0.2804 | 0.2416 | ||||||||
1.3797 | 0.2362 | 0.2180 | ||||||||
0.95 | 2.6809 | 2.7709 | 1.2539 | 0.5955 | 1.1762 | 0.5541 | ||||
8.6718 | 0.6139 | 0.5841 | ||||||||
1.0059 | 0.2782 | 0.2396 | ||||||||
1.4957 | 0.2359 | 0.2165 | ||||||||
140 | 35 | 18 | 0.15 | 2.9768 | 2.8485 | 1.3322 | 0.6132 | 1.2466 | 0.5702 | |
35 | 18 | 8.9141 | 0.6170 | 0.5870 | ||||||
35 | 18 | 0.9581 | 0.2688 | 0.2318 | ||||||
35 | 18 | 1.3934 | 0.2349 | 0.2153 | ||||||
0.55 | 2.7722 | 2.727 | 1.1958 | 0.5740 | 1.1280 | 0.5365 | ||||
8.5091 | 0.6177 | 0.5868 | ||||||||
0.9857 | 0.2487 | 0.2158 | ||||||||
1.3680 | 0.2336 | 0.2155 | ||||||||
0.95 | 2.7294 | 2.7466 | 1.1760 | 0.5731 | 1.1088 | 0.5356 | ||||
8.7286 | 0.6159 | 0.5867 | ||||||||
0.9769 | 0.2643 | 0.2290 | ||||||||
1.2979 | 0.2362 | 0.2177 | ||||||||
28 | 0.15 | 2.4586 | 2.5493 | 0.9017 | 0.4483 | 0.8475 | 0.4203 | |||
28 | 8.2247 | 0.6144 | 0.5847 | |||||||
28 | 0.8842 | 0.1858 | 0.1640 | |||||||
28 | 1.1792 | 0.0913 | 0.0849 | |||||||
0.55 | 2.3382 | 2.5347 | 0.8769 | 0.4407 | 0.8265 | 0.4129 | ||||
8.2832 | 0.6080 | 0.5768 | ||||||||
0.8100 | 0.1871 | 0.1651 | ||||||||
1.2424 | 0.0907 | 0.0834 | ||||||||
0.95 | 2.3738 | 2.5307 | 0.8765 | 0.4429 | 0.8254 | 0.4150 | ||||
8.1678 | 0.6098 | 0.5791 | ||||||||
0.8702 | 0.1914 | 0.1689 | ||||||||
1.2415 | 0.0937 | 0.0867 |
The first data set | |||||||||
Model | \hat{\alpha} | \hat{\gamma} | \hat{\theta} | \hat{\lambda} | K-S | p-value | AIC | CAIC | BIC |
PGLD | 1.40525 | 1.34963 | 0.94075 | 0.76069 | 0.109102 | 0.35816 | 522.134 | 522.731 | 531.241 |
PLD | 1.12565 | 5.44626 | --- | 0.07480 | 0.178738 | 0.02010 | 525.967 | 526.320 | 532.797 |
GLD | 1.00741 | 2.49201 | 0.60639 | --- | 0.111083 | 0.33669 | 525.289 | 525.642 | 532.119 |
ELD | 0.66113 | 1.19104 | --- | 1.59821 | 0.165565 | 0.03861 | 536.672 | 537.025 | 543.502 |
LD | 1.10933 | 4.34354 | --- | --- | 0.233859 | 0.00076 | 529.953 | 530.127 | 534.506 |
The second data set | |||||||||
Model | \hat{\alpha} | \hat{\gamma} | \hat{\theta} | \hat{\lambda} | K-S | p-value | AIC | CAIC | BIC |
PGLD | 2.49409 | 0.63477 | 0.99986 | 0.20566 | 0.05039 | 0.99971 | 402.596 | 403.526 | 410.08 |
PLD | 149.855 | 3788.06 | --- | 0.72799 | 0.11165 | 0.58773 | 412.330 | 412.875 | 417.943 |
GLD | 1.42011 | 5.41074 | 0.84701 | --- | 0.21077 | 0.02811 | 425.597 | 426.143 | 431.211 |
ELD | 0.27741 | 2.35554 | --- | 0.79642 | 0.31822 | 0.00012 | 498.599 | 499.145 | 504.213 |
LD | 22.3092 | 552.211 | --- | --- | 0.19730 | 0.04765 | 413.502 | 413.769 | 417.244 |
n_{1} | w_{1} | p=0.15 | p=0.55 | p=0.95 | |||||||||||
. | . | \overline{\hat{\alpha}} | MSE(\hat{\alpha}) | RAB(\hat{\alpha}) | MMSE | \overline{\hat{\alpha}} | MSE(\hat{\alpha}) | RAB(\hat{\alpha}) | MMSE | \overline{\hat{\alpha}} | MSE(\hat{\alpha}) | RAB(\hat{\alpha}) | MMSE | ||
. | . | \overline{\hat{\lambda}} | MSE(\hat{\lambda}) | RAB(\hat{\lambda}) | MRAB | \overline{\hat{\lambda}} | MSE(\hat{\lambda}) | RAB(\hat{\lambda}) | MRAB | \overline{\hat{\lambda}} | MSE(\hat{\lambda}) | RAB(\hat{\lambda}) | MRAB | ||
. | . | \overline{\hat{\theta}} | MSE(\hat{\theta}) | RAB(\hat{\theta}) | \overline{\hat{\theta}} | MSE(\hat{\theta}) | RAB(\hat{\theta}) | \overline{\hat{\theta}} | MSE(\hat{\theta}) | RAB(\hat{\theta}) | |||||
\hbar | N | n_{\hbar} | w_{\hbar} | \overline{\hat{c}} | MSE(\hat{c}) | RAB(\hat{c}) | \overline{\hat{c}} | MSE(\hat{c}) | RAB(\hat{c}) | \overline{\hat{c}} | MSE(\hat{c}) | RAB(\hat{c}) | |||
2 | 80 | 40 | 20 | 1.7692 | 0.4273 | 0.2736 | 0.3766 | 1.7793 | 0.4334 | 0.2855 | 0.4536 | 1.7900 | 0.4439 | 0.2821 | 0.4166 |
40 | 20 | 2.2500 | 1.3743 | 0.3825 | 0.3156 | 2.5275 | 1.7338 | 0.4473 | 0.3599 | 2.4325 | 1.5187 | 0.4291 | 0.3729 | ||
0.8399 | 0.0247 | 0.1060 | 0.8214 | 0.0323 | 0.1297 | 0.8232 | 0.0338 | 0.1322 | |||||||
0.2503 | 0.0567 | 0.8159 | 0.2491 | 0.0684 | 0.9367 | 0.2703 | 0.0868 | 1.0212 | |||||||
32 | 1.7112 | 0.2631 | 0.2218 | 0.4094 | 1.7118 | 0.2594 | 0.2231 | 0.4408 | 1.7260 | 0.2935 | 0.2333 | 0.4465 | |||
32 | 2.4709 | 1.6977 | 0.4517 | 0.3352 | 2.4682 | 1.8436 | 0.4612 | 0.3533 | 2.4738 | 1.8411 | 0.4750 | 0.3548 | |||
0.8411 | 0.0218 | 0.1107 | 0.8308 | 0.0274 | 0.1234 | 0.8205 | 0.0298 | 0.1328 | |||||||
0.2607 | 0.0643 | 0.8917 | 0.2649 | 0.0735 | 0.9590 | 0.2546 | 0.0682 | 0.9331 | |||||||
140 | 70 | 35 | 1.7235 | 0.2732 | 0.2152 | 0.3939 | 1.7104 | 0.2902 | 0.2283 | 0.4677 | 1.6972 | 0.2575 | 0.2121 | 0.4528 | |
70 | 35 | 2.5368 | 1.6200 | 0.4369 | 0.3218 | 2.5238 | 1.9496 | 0.4756 | 0.3508 | 2.5618 | 1.9192 | 0.4820 | 0.3414 | ||
0.8555 | 0.0153 | 0.0934 | 0.8402 | 0.0231 | 0.1128 | 0.8418 | 0.0224 | 0.1137 | |||||||
0.2588 | 0.0612 | 0.8638 | 0.2685 | 0.0756 | 0.9375 | 0.2636 | 0.0650 | 0.8993 | |||||||
56 | 1.7144 | 0.2323 | 0.2087 | 0.4615 | 1.6803 | 0.2312 | 0.2047 | 0.4594 | 1.6890 | 0.2324 | 0.2054 | 0.5033 | |||
56 | 2.5501 | 1.9964 | 0.4725 | 0.3284 | 2.4981 | 1.9889 | 0.4798 | 0.3312 | 2.5337 | 2.1999 | 0.4905 | 0.3413 | |||
0.8505 | 0.0184 | 0.1038 | 0.8517 | 0.0177 | 0.1013 | 0.8474 | 0.0201 | 0.1077 | |||||||
0.2606 | 0.0606 | 0.8572 | 0.2670 | 0.0593 | 0.8702 | 0.2702 | 0.0641 | 0.9027 | |||||||
4 | 80 | 20 | 10 | 1.7224 | 0.3564 | 0.2653 | 0.3963 | 1.7933 | 0.4206 | 0.2728 | 0.4094 | 1.7689 | 0.4255 | 0.2812 | 0.4622 |
20 | 10 | 2.1620 | 1.5467 | 0.4146 | 0.3086 | 2.4281 | 1.5124 | 0.4312 | 0.3627 | 2.5324 | 1.7754 | 0.4676 | 0.3609 | ||
20 | 10 | 0.8276 | 0.0223 | 0.1074 | 0.8293 | 0.0285 | 0.1225 | 0.8083 | 0.0379 | 0.1416 | |||||
20 | 10 | 0.2309 | 0.0559 | 0.7555 | 0.2638 | 0.0856 | 0.9872 | 0.2405 | 0.0723 | 0.9142 | |||||
16 | 1.7504 | 0.3104 | 0.2411 | 0.3767 | 1.7214 | 0.2853 | 0.2379 | 0.4418 | 1.6927 | 0.2635 | 0.2251 | 0.4439 | |||
16 | 2.4045 | 1.4904 | 0.4204 | 0.3243 | 2.6169 | 1.8321 | 0.4668 | 0.3464 | 2.4949 | 1.8477 | 0.4627 | 0.3608 | |||
16 | 0.8357 | 0.0229 | 0.1121 | 0.8406 | 0.0222 | 0.1121 | 0.8407 | 0.0263 | 0.1165 | ||||||
16 | 0.2460 | 0.0595 | 0.8481 | 0.2565 | 0.0694 | 0.9153 | 0.2862 | 0.0817 | 0.9996 | ||||||
140 | 35 | 18 | 1.6976 | 0.2483 | 0.2171 | 0.3170 | 1.7063 | 0.2604 | 0.2181 | 0.4347 | 1.7152 | 0.2896 | 0.2212 | 0.4394 | |
35 | 18 | 2.3230 | 1.2643 | 0.3762 | 0.2965 | 2.4760 | 1.8169 | 0.4667 | 0.3469 | 2.5477 | 1.8211 | 0.4638 | 0.3350 | ||
35 | 18 | 0.8554 | 0.0137 | 0.0869 | 0.8357 | 0.0240 | 0.1178 | 0.8412 | 0.0219 | 0.1093 | |||||
35 | 18 | 0.2548 | 0.0588 | 0.8022 | 0.2669 | 0.0723 | 0.9320 | 0.2561 | 0.0645 | 0.8805 | |||||
28 | 1.6775 | 0.1812 | 0.1846 | 0.4331 | 1.6684 | 0.1758 | 0.1888 | 0.4781 | 1.6867 | 0.1760 | 0.1864 | 0.4443 | |||
28 | 2.5388 | 1.9152 | 0.4670 | 0.3084 | 2.5493 | 2.1323 | 0.4914 | 0.3330 | 2.4572 | 1.9641 | 0.4774 | 0.3251 | |||
28 | 0.8499 | 0.0179 | 0.0998 | 0.8572 | 0.0176 | 0.0985 | 0.8508 | 0.0179 | 0.1009 | ||||||
28 | 0.2490 | 0.0510 | 0.7907 | 0.2744 | 0.0648 | 0.8863 | 0.2647 | 0.0633 | 0.8609 |
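The accuracy measures in the table above (the averaged estimate, MSE, and relative absolute bias RAB of each parameter, together with the MMSE and MRAB summaries) are standard Monte Carlo quantities. A minimal sketch is given below; the replicate estimates and true values are illustrative placeholders, and MMSE and MRAB are taken here as simple averages of the per-parameter MSEs and RABs, which may differ from the paper's exact aggregation.

```python
import numpy as np

def mc_summary(estimates, true_values):
    """Averaged estimate, MSE, and RAB per parameter, plus overall MMSE and MRAB.

    estimates[r, j] is the estimate of parameter j in replication r;
    true_values[j] is the corresponding true value.
    """
    estimates = np.asarray(estimates, dtype=float)
    true_values = np.asarray(true_values, dtype=float)
    mean_est = estimates.mean(axis=0)                                   # averaged estimate
    mse = ((estimates - true_values) ** 2).mean(axis=0)                 # MSE per parameter
    rab = (np.abs(estimates - true_values) / np.abs(true_values)).mean(axis=0)  # RAB per parameter
    return mean_est, mse, rab, mse.mean(), rab.mean()                   # MMSE, MRAB (simple means)

# Illustrative use with placeholder replicates around "true" (alpha, lambda, theta, c).
rng = np.random.default_rng(7)
true = np.array([1.5, 2.2, 0.9, 0.2])
reps = true + rng.normal(scale=[0.4, 1.0, 0.1, 0.05], size=(1000, 4))
print(mc_summary(reps, true))
```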
n_{1} | m_{1} | BEL (\nu=-0.5) | BEL (\nu=0.5) | BEG (\nu=-0.5) | BEG (\nu=0.5)
\hbar | N | n_{\hbar} | m_{\hbar} | p | \overline{\tilde{\cdot}} or \overline{\ddot{\cdot}} | MSE | RAB | MMSE or MRAB (repeated for each \nu-group)
Note: each design occupies four stacked rows reporting, for \alpha, \lambda, \theta, and c in that order, the BEL estimates (\tilde{\cdot}) and the BEG estimates (\ddot{\cdot}); the fourth entry in each group is the MMSE on the \alpha row and the MRAB on the \lambda row.
2 | 80 | 40 | 20 | 0.15 | 1.6520 | 0.1912 | 0.1819 | 0.0505 | 1.5559 | 0.1342 | 0.1819 | 0.0369 | 1.5742 | 0.1662 | 0.0161 | 0.0448 | 1.5138 | 0.1567 | 0.0539 | 0.0433 |
40 | 20 | 2.2385 | 0.0042 | 0.0268 | 0.0790 | 2.2262 | 0.0058 | 0.0321 | 0.0819 | 2.2296 | 0.0053 | 0.0306 | 0.0356 | 2.2241 | 0.0061 | 0.0330 | 0.0593 | |||
0.8493 | 0.0064 | 0.0675 | 0.8444 | 0.0073 | 0.0723 | 0.8432 | 0.0076 | 0.0631 | 0.8334 | 0.0098 | 0.0740 | |||||||||
0.1990 | 0.0002 | 0.0398 | 0.1970 | 0.0002 | 0.0414 | 0.1935 | 0.0003 | 0.0325 | 0.1847 | 0.0004 | 0.0764 | |||||||||
0.55 | 1.6766 | 0.1464 | 0.1786 | 0.0391 | 1.5945 | 0.1086 | 0.1786 | 0.0302 | 1.6098 | 0.1170 | 0.0061 | 0.0323 | 1.5596 | 0.1139 | 0.0253 | 0.0322 | ||||
2.2386 | 0.0040 | 0.0267 | 0.0775 | 2.2264 | 0.0057 | 0.0320 | 0.0801 | 2.2297 | 0.0052 | 0.0306 | 0.0317 | 2.2242 | 0.0060 | 0.0329 | 0.0506 | |||||
0.8556 | 0.0056 | 0.0634 | 0.8511 | 0.0064 | 0.0676 | 0.8502 | 0.0066 | 0.0554 | 0.8418 | 0.0084 | 0.0647 | |||||||||
0.1986 | 0.0003 | 0.0411 | 0.1966 | 0.0002 | 0.0420 | 0.1930 | 0.0002 | 0.0349 | 0.1841 | 0.0004 | 0.0796 | |||||||||
0.95 | 1.6936 | 0.2794 | 0.1926 | 0.0728 | 1.5992 | 0.1265 | 0.1926 | 0.0351 | 1.6212 | 0.1806 | 0.0132 | 0.0486 | 1.5665 | 0.1388 | 0.0210 | 0.0387 | ||||
2.2370 | 0.0045 | 0.0274 | 0.0833 | 2.2247 | 0.0062 | 0.0327 | 0.0856 | 2.2281 | 0.0057 | 0.0313 | 0.0315 | 2.2226 | 0.0065 | 0.0337 | 0.0482 | |||||
0.8545 | 0.0058 | 0.0638 | 0.8501 | 0.0066 | 0.0680 | 0.8491 | 0.0069 | 0.0566 | 0.8406 | 0.0087 | 0.0660 | |||||||||
0.2009 | 0.0015 | 0.0495 | 0.1987 | 0.0012 | 0.0491 | 0.1950 | 0.0011 | 0.0250 | 0.1856 | 0.0007 | 0.0722 | |||||||||
32 | 0.15 | 1.6671 | 0.1794 | 0.1567 | 0.0477 | 1.6099 | 0.1384 | 0.1567 | 0.0379 | 1.6220 | 0.1559 | 0.0137 | 0.0422 | 1.5873 | 0.1452 | 0.0079 | 0.0400 | |||
32 | 2.2369 | 0.0050 | 0.0274 | 0.0744 | 2.2245 | 0.0067 | 0.0328 | 0.0765 | 2.2279 | 0.0062 | 0.0313 | 0.0278 | 2.2224 | 0.0071 | 0.0337 | 0.0395 | ||||
0.8637 | 0.0041 | 0.0533 | 0.8607 | 0.0046 | 0.0561 | 0.8601 | 0.0047 | 0.0443 | 0.8549 | 0.0057 | 0.0501 | |||||||||
0.2012 | 0.0022 | 0.0603 | 0.1992 | 0.0021 | 0.0605 | 0.1956 | 0.0020 | 0.0218 | 0.1867 | 0.0019 | 0.0663 | |||||||||
0.55 | 1.6684 | 0.0972 | 0.1476 | 0.0265 | 1.6131 | 0.0793 | 0.1476 | 0.0225 | 1.6235 | 0.0837 | 0.0147 | 0.0235 | 1.5901 | 0.0810 | 0.0062 | 0.0233 | ||||
2.2386 | 0.0041 | 0.0267 | 0.0697 | 2.2263 | 0.0058 | 0.0321 | 0.0717 | 2.2297 | 0.0053 | 0.0306 | 0.0297 | 2.2241 | 0.0061 | 0.0330 | 0.0409 | |||||
0.8654 | 0.0040 | 0.0523 | 0.8623 | 0.0045 | 0.0551 | 0.8617 | 0.0046 | 0.0425 | 0.8562 | 0.0056 | 0.0486 | |||||||||
0.1995 | 0.0006 | 0.0522 | 0.1974 | 0.0004 | 0.0520 | 0.1938 | 0.0004 | 0.0311 | 0.1848 | 0.0004 | 0.0760 | |||||||||
0.95 | 1.6736 | 0.1876 | 0.1626 | 0.0493 | 1.6072 | 0.1008 | 0.1626 | 0.0281 | 1.6207 | 0.1193 | 0.0129 | 0.0326 | 1.5849 | 0.1060 | 0.0094 | 0.0297 | ||||
2.2375 | 0.0044 | 0.0272 | 0.0756 | 2.2250 | 0.0061 | 0.0326 | 0.0777 | 2.2285 | 0.0056 | 0.0311 | 0.0284 | 2.2229 | 0.0065 | 0.0335 | 0.0409 | |||||
0.8645 | 0.0041 | 0.0546 | 0.8614 | 0.0046 | 0.0576 | 0.8608 | 0.0047 | 0.0436 | 0.8553 | 0.0057 | 0.0497 | |||||||||
0.2004 | 0.0009 | 0.0583 | 0.1984 | 0.0008 | 0.0581 | 0.1948 | 0.0008 | 0.0259 | 0.1858 | 0.0006 | 0.0708 | |||||||||
2 | 140 | 70 | 35 | 0.15 | 1.6590 | 0.1425 | 0.1527 | 0.0379 | 1.5986 | 0.0848 | 0.1527 | 0.0239 | 1.6106 | 0.0964 | 0.0066 | 0.0267 | 1.5760 | 0.0870 | 0.0150 | 0.0248 |
70 | 35 | 2.2388 | 0.0042 | 0.0266 | 0.0732 | 2.2264 | 0.0059 | 0.0320 | 0.0750 | 2.2298 | 0.0054 | 0.0305 | 0.0261 | 2.2242 | 0.0063 | 0.0329 | 0.0411 | |||
0.8654 | 0.0037 | 0.0501 | 0.8627 | 0.0040 | 0.0527 | 0.8622 | 0.0041 | 0.0420 | 0.8575 | 0.0050 | 0.0472 | |||||||||
0.2006 | 0.0011 | 0.0634 | 0.1985 | 0.0010 | 0.0628 | 0.1950 | 0.0010 | 0.0251 | 0.1861 | 0.0009 | 0.0694 | |||||||||
0.55 | 1.6450 | 0.1210 | 0.1422 | 0.0325 | 1.5941 | 0.0912 | 0.1422 | 0.0256 | 1.6039 | 0.0985 | 0.0025 | 0.0273 | 1.5735 | 0.0942 | 0.0166 | 0.0266 | ||||
2.2374 | 0.0046 | 0.0272 | 0.0697 | 2.2249 | 0.0063 | 0.0326 | 0.0717 | 2.2283 | 0.0058 | 0.0312 | 0.0255 | 2.2228 | 0.0066 | 0.0336 | 0.0420 | |||||
0.8664 | 0.0035 | 0.0486 | 0.8637 | 0.0038 | 0.0511 | 0.8632 | 0.0039 | 0.0409 | 0.8586 | 0.0047 | 0.0460 | |||||||||
0.2000 | 0.0012 | 0.0606 | 0.1980 | 0.0010 | 0.0606 | 0.1945 | 0.0010 | 0.0276 | 0.1857 | 0.0010 | 0.0717 | |||||||||
0.95 | 1.6732 | 0.1466 | 0.1516 | 0.0392 | 1.6171 | 0.1051 | 0.1516 | 0.0292 | 1.6285 | 0.1147 | 0.0178 | 0.0315 | 1.5960 | 0.1067 | 0.0025 | 0.0298 | ||||
2.2373 | 0.0045 | 0.0273 | 0.0743 | 2.2249 | 0.0062 | 0.0327 | 0.0758 | 2.2283 | 0.0057 | 0.0312 | 0.0279 | 2.2227 | 0.0065 | 0.0336 | 0.0381 | |||||
0.8660 | 0.0038 | 0.0504 | 0.8631 | 0.0042 | 0.0530 | 0.8626 | 0.0043 | 0.0416 | 0.8577 | 0.0052 | 0.0470 | |||||||||
0.2018 | 0.0018 | 0.0681 | 0.1995 | 0.0013 | 0.0660 | 0.1958 | 0.0012 | 0.0211 | 0.1861 | 0.0007 | 0.0693 | |||||||||
56 | 0.15 | 1.6251 | 0.0461 | 0.1064 | 0.0135 | 1.5991 | 0.0443 | 0.1064 | 0.0135 | 1.6038 | 0.0451 | 0.0024 | 0.0136 | 1.5871 | 0.0459 | 0.0081 | 0.0140 | |||
56 | 2.2382 | 0.0047 | 0.0275 | 0.0676 | 2.2258 | 0.0061 | 0.0327 | 0.0693 | 2.2292 | 0.0057 | 0.0308 | 0.0286 | 2.2237 | 0.0064 | 0.0332 | 0.0354 | ||||
0.8732 | 0.0026 | 0.0437 | 0.8717 | 0.0027 | 0.0451 | 0.8715 | 0.0028 | 0.0317 | 0.8694 | 0.0031 | 0.0340 | |||||||||
0.1920 | 0.0007 | 0.0926 | 0.1914 | 0.0007 | 0.0929 | 0.1901 | 0.0007 | 0.0497 | 0.1867 | 0.0008 | 0.0665 | |||||||||
0.55 | 1.6321 | 0.0503 | 0.1114 | 0.0144 | 1.6059 | 0.0477 | 0.1114 | 0.0143 | 1.6108 | 0.0486 | 0.0067 | 0.0144 | 1.5943 | 0.0486 | 0.0036 | 0.0147 | ||||
2.2383 | 0.0041 | 0.0268 | 0.0688 | 2.2261 | 0.0058 | 0.0321 | 0.0707 | 2.2295 | 0.0053 | 0.0307 | 0.0280 | 2.2240 | 0.0061 | 0.0330 | 0.0325 | |||||
0.8742 | 0.0027 | 0.0441 | 0.8727 | 0.0028 | 0.0455 | 0.8725 | 0.0029 | 0.0305 | 0.8703 | 0.0032 | 0.0330 | |||||||||
0.1931 | 0.0007 | 0.0930 | 0.1925 | 0.0007 | 0.0937 | 0.1912 | 0.0007 | 0.0439 | 0.1879 | 0.0008 | 0.0603 | |||||||||
0.95 | 1.6471 | 0.0524 | 0.1129 | 0.0150 | 1.6211 | 0.0491 | 0.1129 | 0.0146 | 1.6260 | 0.0501 | 0.0163 | 0.0147 | 1.6098 | 0.0497 | 0.0061 | 0.0149 | ||||
2.2388 | 0.0041 | 0.0266 | 0.0692 | 2.2266 | 0.0057 | 0.0319 | 0.0708 | 2.2299 | 0.0052 | 0.0305 | 0.0310 | 2.2245 | 0.0060 | 0.0328 | 0.0339 | |||||
0.8754 | 0.0026 | 0.0438 | 0.8739 | 0.0028 | 0.0451 | 0.8737 | 0.0028 | 0.0292 | 0.8716 | 0.0031 | 0.0316 | |||||||||
0.1924 | 0.0009 | 0.0935 | 0.1917 | 0.0008 | 0.0934 | 0.1904 | 0.0008 | 0.0481 | 0.1870 | 0.0008 | 0.0652 | |||||||||
4 | 80 | 20 | 10 | 0.15 | 1.6816 | 0.2985 | 0.2183 | 0.0779 | 1.5568 | 0.1760 | 0.2183 | 0.0480 | 1.5815 | 0.2294 | 0.0116 | 0.0613 | 1.5056 | 0.2066 | 0.0590 | 0.0566 |
20 | 10 | 2.2379 | 0.0047 | 0.0270 | 0.0922 | 2.2255 | 0.0064 | 0.0324 | 0.0951 | 2.2289 | 0.0059 | 0.0309 | 0.0344 | 2.2234 | 0.0067 | 0.0333 | 0.0617 | |||
20 | 10 | 0.8448 | 0.0075 | 0.0751 | 0.8391 | 0.0087 | 0.0808 | 0.8376 | 0.0091 | 0.0693 | 0.8255 | 0.0122 | 0.0827 | |||||||
20 | 10 | 0.2006 | 0.0010 | 0.0484 | 0.1986 | 0.0009 | 0.0487 | 0.1949 | 0.0009 | 0.0256 | 0.1856 | 0.0008 | 0.0718 | |||||||
0.55 | 1.6668 | 0.1261 | 0.1730 | 0.0340 | 1.5845 | 0.1048 | 0.1730 | 0.0292 | 1.5992 | 0.1108 | 0.0005 | 0.0307 | 1.5476 | 0.1108 | 0.0328 | 0.0314 | ||||
2.2381 | 0.0041 | 0.0269 | 0.0753 | 2.2259 | 0.0057 | 0.0322 | 0.0780 | 2.2292 | 0.0053 | 0.0308 | 0.0303 | 2.2237 | 0.0061 | 0.0332 | 0.0523 | |||||
0.8549 | 0.0055 | 0.0628 | 0.8505 | 0.0063 | 0.0671 | 0.8495 | 0.0065 | 0.0561 | 0.8411 | 0.0083 | 0.0654 | |||||||||
0.1987 | 0.0001 | 0.0385 | 0.1968 | 0.0001 | 0.0399 | 0.1932 | 0.0001 | 0.0340 | 0.1844 | 0.0003 | 0.0779 | |||||||||
0.95 | 1.6906 | 0.2507 | 0.1933 | 0.0656 | 1.5998 | 0.1302 | 0.1933 | 0.0360 | 1.6194 | 0.1676 | 0.0121 | 0.0453 | 1.5667 | 0.1410 | 0.0208 | 0.0393 | ||||
2.2378 | 0.0044 | 0.0271 | 0.0835 | 2.2254 | 0.0061 | 0.0324 | 0.0858 | 2.2288 | 0.0056 | 0.0310 | 0.0302 | 2.2233 | 0.0064 | 0.0334 | 0.0471 | |||||
0.8568 | 0.0059 | 0.0631 | 0.8524 | 0.0066 | 0.0670 | 0.8515 | 0.0069 | 0.0539 | 0.8431 | 0.0087 | 0.0632 | |||||||||
0.2011 | 0.0016 | 0.0507 | 0.1989 | 0.0013 | 0.0504 | 0.1952 | 0.0012 | 0.0238 | 0.1858 | 0.0009 | 0.0709 | |||||||||
16 | 0.15 | 1.6605 | 0.1649 | 0.1614 | 0.0439 | 1.5951 | 0.1106 | 0.1614 | 0.0308 | 1.6083 | 0.1254 | 0.0052 | 0.0344 | 1.5705 | 0.1164 | 0.0185 | 0.0326 | |||
16 | 2.2374 | 0.0048 | 0.0272 | 0.0770 | 2.2249 | 0.0066 | 0.0327 | 0.0790 | 2.2283 | 0.0061 | 0.0312 | 0.0271 | 2.2227 | 0.0069 | 0.0336 | 0.0442 | ||||
16 | 0.8620 | 0.0042 | 0.0546 | 0.8587 | 0.0047 | 0.0577 | 0.8581 | 0.0049 | 0.0466 | 0.8523 | 0.0060 | 0.0530 | ||||||||
16 | 0.2007 | 0.0016 | 0.0648 | 0.1986 | 0.0014 | 0.0643 | 0.1949 | 0.0014 | 0.0255 | 0.1856 | 0.0012 | 0.0718 | ||||||||
0.55 | 1.6862 | 0.2183 | 0.1632 | 0.0570 | 1.6170 | 0.1042 | 0.1632 | 0.0290 | 1.6330 | 0.1422 | 0.0206 | 0.0384 | 1.5955 | 0.1136 | 0.0028 | 0.0316 | ||||
2.2372 | 0.0046 | 0.0273 | 0.0761 | 2.2246 | 0.0063 | 0.0328 | 0.0781 | 2.2280 | 0.0058 | 0.0313 | 0.0315 | 2.2224 | 0.0067 | 0.0337 | 0.0406 | |||||
0.8645 | 0.0040 | 0.0524 | 0.8614 | 0.0045 | 0.0552 | 0.8609 | 0.0046 | 0.0435 | 0.8556 | 0.0056 | 0.0493 | |||||||||
0.1997 | 0.0011 | 0.0616 | 0.1975 | 0.0009 | 0.0612 | 0.1939 | 0.0008 | 0.0306 | 0.1847 | 0.0007 | 0.0765 | |||||||||
0.95 | 1.6709 | 0.1323 | 0.1523 | 0.0353 | 1.6149 | 0.1007 | 0.1523 | 0.0279 | 1.6261 | 0.1143 | 0.0163 | 0.0312 | 1.5927 | 0.1076 | 0.0046 | 0.0300 | ||||
2.2373 | 0.0044 | 0.0273 | 0.0709 | 2.2249 | 0.0061 | 0.0327 | 0.0732 | 2.2283 | 0.0056 | 0.0312 | 0.0314 | 2.2227 | 0.0065 | 0.0336 | 0.0415 | |||||
0.8652 | 0.0040 | 0.0512 | 0.8621 | 0.0045 | 0.0540 | 0.8615 | 0.0046 | 0.0427 | 0.8562 | 0.0056 | 0.0486 | |||||||||
0.1985 | 0.0005 | 0.0529 | 0.1965 | 0.0004 | 0.0537 | 0.1929 | 0.0004 | 0.0353 | 0.1841 | 0.0005 | 0.0794 | |||||||||
4 | 140 | 35 | 18 | 0.15 | 1.6763 | 0.2148 | 0.1731 | 0.0564 | 1.6019 | 0.1085 | 0.1731 | 0.0303 | 1.6175 | 0.1303 | 0.0109 | 0.0356 | 1.5767 | 0.1137 | 0.0145 | 0.0318 |
35 | 18 | 2.2362 | 0.0048 | 0.0278 | 0.0825 | 2.2236 | 0.0067 | 0.0332 | 0.0841 | 2.2270 | 0.0061 | 0.0317 | 0.0269 | 2.2213 | 0.0071 | 0.0342 | 0.0416 | |||
35 | 18 | 0.8644 | 0.0041 | 0.0524 | 0.8615 | 0.0045 | 0.0551 | 0.8609 | 0.0046 | 0.0434 | 0.8559 | 0.0056 | 0.0491 | |||||||
35 | 18 | 0.2016 | 0.0022 | 0.0768 | 0.1993 | 0.0017 | 0.0750 | 0.1957 | 0.0016 | 0.0216 | 0.1863 | 0.0010 | 0.0685 | |||||||
0.55 | 1.6899 | 0.1523 | 0.1532 | 0.0403 | 1.6366 | 0.0997 | 0.1532 | 0.0277 | 1.6488 | 0.1222 | 0.0305 | 0.0332 | 1.6177 | 0.1049 | 0.0111 | 0.0292 | ||||
2.2373 | 0.0047 | 0.0273 | 0.0727 | 2.2248 | 0.0064 | 0.0327 | 0.0745 | 2.2282 | 0.0059 | 0.0312 | 0.0304 | 2.2227 | 0.0068 | 0.0336 | 0.0383 | |||||
0.8710 | 0.0032 | 0.0467 | 0.8685 | 0.0035 | 0.0488 | 0.8682 | 0.0035 | 0.0354 | 0.8643 | 0.0042 | 0.0397 | |||||||||
0.2006 | 0.0012 | 0.0638 | 0.1986 | 0.0010 | 0.0634 | 0.1951 | 0.0010 | 0.0246 | 0.1862 | 0.0009 | 0.0688 | |||||||||
0.95 | 1.6470 | 0.1052 | 0.1398 | 0.0284 | 1.5957 | 0.0697 | 0.1398 | 0.0200 | 1.6053 | 0.0744 | 0.0033 | 0.0211 | 1.5757 | 0.0717 | 0.0152 | 0.0208 | ||||
2.2387 | 0.0041 | 0.0266 | 0.0689 | 2.2262 | 0.0058 | 0.0321 | 0.0707 | 2.2296 | 0.0053 | 0.0306 | 0.0264 | 2.2240 | 0.0061 | 0.0330 | 0.0424 | |||||
0.8657 | 0.0036 | 0.0495 | 0.8630 | 0.0040 | 0.0520 | 0.8625 | 0.0041 | 0.0417 | 0.8580 | 0.0048 | 0.0466 | |||||||||
0.1997 | 0.0007 | 0.0596 | 0.1976 | 0.0006 | 0.0589 | 0.1940 | 0.0005 | 0.0301 | 0.1850 | 0.0005 | 0.0749 | |||||||||
28 | 0.15 | 1.6453 | 0.0505 | 0.1087 | 0.0144 | 1.6178 | 0.0469 | 0.1087 | 0.0140 | 1.6230 | 0.0480 | 0.0143 | 0.0141 | 1.6058 | 0.0475 | 0.0036 | 0.0143 | |||
28 | 2.2397 | 0.0039 | 0.0262 | 0.0682 | 2.2273 | 0.0056 | 0.0316 | 0.0701 | 2.2307 | 0.0051 | 0.0301 | 0.0312 | 2.2252 | 0.0059 | 0.0325 | 0.0339 | ||||
28 | 0.8757 | 0.0024 | 0.0420 | 0.8742 | 0.0026 | 0.0433 | 0.8740 | 0.0026 | 0.0288 | 0.8720 | 0.0029 | 0.0311 | ||||||||
28 | 0.1915 | 0.0008 | 0.0961 | 0.1910 | 0.0008 | 0.0968 | 0.1897 | 0.0008 | 0.0516 | 0.1863 | 0.0009 | 0.0683 | ||||||||
0.55 | 1.6337 | 0.0485 | 0.1090 | 0.0139 | 1.6079 | 0.0459 | 0.1090 | 0.0137 | 1.6127 | 0.0468 | 0.0079 | 0.0138 | 1.5964 | 0.0467 | 0.0023 | 0.0141 | ||||
2.2386 | 0.0041 | 0.0267 | 0.0676 | 2.2265 | 0.0057 | 0.0320 | 0.0694 | 2.2298 | 0.0052 | 0.0305 | 0.0280 | 2.2244 | 0.0060 | 0.0329 | 0.0319 | |||||
0.8756 | 0.0024 | 0.0417 | 0.8741 | 0.0026 | 0.0431 | 0.8739 | 0.0027 | 0.0290 | 0.8717 | 0.0030 | 0.0315 | |||||||||
0.1929 | 0.0007 | 0.0930 | 0.1923 | 0.0007 | 0.0935 | 0.1911 | 0.0007 | 0.0447 | 0.1878 | 0.0008 | 0.0611 | |||||||||
0.95 | 1.6310 | 0.0469 | 0.1062 | 0.0135 | 1.6051 | 0.0445 | 0.1062 | 0.0134 | 1.6099 | 0.0453 | 0.0062 | 0.0135 | 1.5935 | 0.0454 | 0.0041 | 0.0138 | ||||
2.2386 | 0.0040 | 0.0267 | 0.0675 | 2.2264 | 0.0057 | 0.0320 | 0.0693 | 2.2298 | 0.0052 | 0.0305 | 0.0294 | 2.2243 | 0.0060 | 0.0329 | 0.0342 | |||||
0.8733 | 0.0026 | 0.0439 | 0.8718 | 0.0028 | 0.0453 | 0.8715 | 0.0028 | 0.0316 | 0.8693 | 0.0031 | 0.0341 | |||||||||
0.1921 | 0.0007 | 0.0934 | 0.1915 | 0.0007 | 0.0938 | 0.1902 | 0.0007 | 0.0491 | 0.1869 | 0.0008 | 0.0657 |
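The BEL and BEG columns are read here, as an assumption, as Bayes estimates under the LINEX loss and the general entropy loss with shape parameter \nu, a reading consistent with the reported values \nu = -0.5 and \nu = 0.5. Under that reading the estimators have the standard closed forms \tilde{\theta} = -(1/\nu)\ln E[e^{-\nu\theta}] and \ddot{\theta} = (E[\theta^{-\nu}])^{-1/\nu}, which can be evaluated from posterior draws as sketched below; the posterior samples used are placeholders, not the paper's MCMC output.

```python
import numpy as np

def bayes_linex(draws, nu):
    """Bayes estimate under LINEX loss: -(1/nu) * ln E[exp(-nu * theta)]."""
    return -np.log(np.mean(np.exp(-nu * draws))) / nu

def bayes_gen_entropy(draws, nu):
    """Bayes estimate under general entropy loss: (E[theta^(-nu)])^(-1/nu)."""
    return np.mean(draws ** (-nu)) ** (-1.0 / nu)

# Placeholder posterior draws for a positive parameter.
rng = np.random.default_rng(3)
draws = rng.gamma(shape=4.0, scale=0.4, size=20_000)
for nu in (-0.5, 0.5):
    print(f"nu = {nu:+.1f}: LINEX = {bayes_linex(draws, nu):.4f}, "
          f"gen. entropy = {bayes_gen_entropy(draws, nu):.4f}")
```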
n_{1} | w_{1} | Credible interval
\hbar | N | n_{\hbar} | w_{\hbar} | p | CI | Symmetric | HPD
Note: each design occupies four stacked rows reporting AIL(\alpha), AIL(\lambda), AIL(\theta), and AIL(c) for each interval type; the \alpha row carries a second value for each interval type in addition to AIL(\alpha).
2 | 80 | 40 | 20 | 0.15 | 3.4146 | 3.0673 | 1.6523 | 0.7144 | 1.5547 | 0.6651 |
40 | 20 | 9.3383 | 0.6109 | 0.5824 | ||||||
1.0622 | 0.3577 | 0.3046 | ||||||||
1.5215 | 0.2369 | 0.2186 | ||||||||
0.55 | 3.2483 | 3.0103 | 1.5301 | 0.6792 | 1.4479 | 0.6347 | ||||
9.0989 | 0.6125 | 0.5825 | ||||||||
1.1630 | 0.3370 | 0.2883 | ||||||||
1.5413 | 0.2372 | 0.2199 | ||||||||
0.95 | 3.2459 | 3.009 | 1.5365 | 0.6822 | 1.4581 | 0.6387 | ||||
9.0252 | 0.6136 | 0.5840 | ||||||||
1.1423 | 0.3394 | 0.2911 | ||||||||
1.6318 | 0.2392 | 0.2214 | ||||||||
32 | 0.15 | 2.7872 | 2.7832 | 1.2546 | 0.5955 | 1.1861 | 0.5567 | |||
32 | 8.7304 | 0.6134 | 0.5833 | |||||||
1.0233 | 0.2796 | 0.2412 | ||||||||
1.3750 | 0.2345 | 0.2164 | ||||||||
0.55 | 2.8056 | 2.7834 | 1.2551 | 0.5965 | 1.1847 | 0.5568 | ||||
8.6419 | 0.6141 | 0.5831 | ||||||||
1.0474 | 0.2795 | 0.2408 | ||||||||
1.4224 | 0.2372 | 0.2188 | ||||||||
0.95 | 2.8068 | 2.765 | 1.2616 | 0.5985 | 1.1935 | 0.5598 | ||||
8.5702 | 0.6150 | 0.5858 | ||||||||
1.0997 | 0.2818 | 0.2425 | ||||||||
1.3484 | 0.2357 | 0.2174 | ||||||||
140 | 70 | 35 | 0.15 | 2.8299 | 2.7787 | 1.2480 | 0.5913 | 1.1732 | 0.5512 | |
70 | 35 | 8.8112 | 0.6169 | 0.5869 | ||||||
0.9287 | 0.2635 | 0.2278 | ||||||||
1.3237 | 0.2366 | 0.2171 | ||||||||
0.55 | 2.6886 | 2.7106 | 1.1792 | 0.5731 | 1.1093 | 0.5345 | ||||
8.5189 | 0.6155 | 0.5842 | ||||||||
0.9782 | 0.2632 | 0.2284 | ||||||||
1.3673 | 0.2346 | 0.2160 | ||||||||
0.95 | 2.7318 | 2.7344 | 1.2057 | 0.5815 | 1.1348 | 0.5427 | ||||
8.6176 | 0.6126 | 0.5838 | ||||||||
0.9898 | 0.2684 | 0.2324 | ||||||||
1.3326 | 0.2392 | 0.2198 | ||||||||
56 | 0.15 | 2.4536 | 2.5300 | 0.8753 | 0.4421 | 0.8257 | 0.4148 | |||
56 | 8.1685 | 0.6119 | 0.5802 | |||||||
0.8347 | 0.1885 | 0.1668 | ||||||||
1.1931 | 0.0926 | 0.0864 | ||||||||
0.55 | 2.3747 | 2.5060 | 0.8788 | 0.4427 | 0.8261 | 0.4143 | ||||
8.0651 | 0.6095 | 0.5785 | ||||||||
0.8548 | 0.1898 | 0.1667 | ||||||||
1.2352 | 0.0927 | 0.0858 | ||||||||
0.95 | 2.3900 | 2.5214 | 0.8759 | 0.4417 | 0.8271 | 0.4148 | ||||
8.1405 | 0.6118 | 0.5818 | ||||||||
0.8549 | 0.1877 | 0.1661 | ||||||||
1.2216 | 0.0913 | 0.0844 | ||||||||
4 | 80 | 20 | 10 | 0.15 | 3.5514 | 2.9879 | 1.8141 | 0.7625 | 1.7046 | 0.7083 |
20 | 10 | 9.0555 | 0.6117 | 0.5818 | ||||||
20 | 10 | 0.9926 | 0.3838 | 0.3253 | ||||||
20 | 10 | 1.3399 | 0.2405 | 0.2214 | ||||||
0.55 | 3.3072 | 3.0253 | 1.5485 | 0.6838 | 1.4686 | 0.6395 | ||||
9.1629 | 0.6124 | 0.5826 | ||||||||
1.1114 | 0.3375 | 0.2875 | ||||||||
1.5448 | 0.2368 | 0.2192 | ||||||||
0.95 | 3.3013 | 3.0365 | 1.5334 | 0.6802 | 1.4531 | 0.6366 | ||||
9.2126 | 0.6123 | 0.5846 | ||||||||
1.2443 | 0.3335 | 0.2853 | ||||||||
1.4244 | 0.2416 | 0.2234 | ||||||||
16 | 0.15 | 3.0018 | 2.8487 | 1.2939 | 0.6085 | 1.2209 | 0.5677 | |||
16 | 8.8203 | 0.6151 | 0.5846 | |||||||
16 | 1.0622 | 0.2872 | 0.2464 | |||||||
16 | 1.3593 | 0.2380 | 0.2187 | |||||||
0.55 | 2.7877 | 2.8122 | 1.2833 | 0.6045 | 1.2083 | 0.5638 | ||||
8.8808 | 0.6181 | 0.5875 | ||||||||
1.0125 | 0.2804 | 0.2416 | ||||||||
1.3797 | 0.2362 | 0.2180 | ||||||||
0.95 | 2.6809 | 2.7709 | 1.2539 | 0.5955 | 1.1762 | 0.5541 | ||||
8.6718 | 0.6139 | 0.5841 | ||||||||
1.0059 | 0.2782 | 0.2396 | ||||||||
1.4957 | 0.2359 | 0.2165 | ||||||||
140 | 35 | 18 | 0.15 | 2.9768 | 2.8485 | 1.3322 | 0.6132 | 1.2466 | 0.5702 | |
35 | 18 | 8.9141 | 0.6170 | 0.5870 | ||||||
35 | 18 | 0.9581 | 0.2688 | 0.2318 | ||||||
35 | 18 | 1.3934 | 0.2349 | 0.2153 | ||||||
0.55 | 2.7722 | 2.727 | 1.1958 | 0.5740 | 1.1280 | 0.5365 | ||||
8.5091 | 0.6177 | 0.5868 | ||||||||
0.9857 | 0.2487 | 0.2158 | ||||||||
1.3680 | 0.2336 | 0.2155 | ||||||||
0.95 | 2.7294 | 2.7466 | 1.1760 | 0.5731 | 1.1088 | 0.5356 | ||||
8.7286 | 0.6159 | 0.5867 | ||||||||
0.9769 | 0.2643 | 0.2290 | ||||||||
1.2979 | 0.2362 | 0.2177 | ||||||||
28 | 0.15 | 2.4586 | 2.5493 | 0.9017 | 0.4483 | 0.8475 | 0.4203 | |||
28 | 8.2247 | 0.6144 | 0.5847 | |||||||
28 | 0.8842 | 0.1858 | 0.1640 | |||||||
28 | 1.1792 | 0.0913 | 0.0849 | |||||||
0.55 | 2.3382 | 2.5347 | 0.8769 | 0.4407 | 0.8265 | 0.4129 | ||||
8.2832 | 0.6080 | 0.5768 | ||||||||
0.8100 | 0.1871 | 0.1651 | ||||||||
1.2424 | 0.0907 | 0.0834 | ||||||||
0.95 | 2.3738 | 2.5307 | 0.8765 | 0.4429 | 0.8254 | 0.4150 | ||||
8.1678 | 0.6098 | 0.5791 | ||||||||
0.8702 | 0.1914 | 0.1689 | ||||||||
1.2415 | 0.0937 | 0.0867 |
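Finally, the interval comparison above rests on average interval lengths (AIL). The sketch below shows one way to obtain the equal-tailed (symmetric) credible interval and the HPD interval from posterior draws and to aggregate their lengths over replications; the 95% level, the sliding-window HPD construction, and the placeholder posterior samples are assumptions, not the paper's exact settings.

```python
import numpy as np

def symmetric_ci(draws, level=0.95):
    """Equal-tailed (symmetric) credible interval from posterior draws."""
    tail = (1.0 - level) / 2.0
    return np.quantile(draws, [tail, 1.0 - tail])

def hpd_interval(draws, level=0.95):
    """Highest posterior density interval: the shortest window holding `level` mass."""
    s = np.sort(draws)
    n = len(s)
    m = int(np.ceil(level * n))
    widths = s[m - 1:] - s[: n - m + 1]        # widths of all windows containing m draws
    i = int(np.argmin(widths))
    return np.array([s[i], s[i + m - 1]])

def average_interval_length(replications, interval_fn):
    """AIL: mean interval length over Monte Carlo replications of posterior draws."""
    return float(np.mean([np.diff(interval_fn(d))[0] for d in replications]))

# Illustrative use with placeholder posterior samples for a single parameter.
rng = np.random.default_rng(11)
replications = [rng.gamma(4.0, 0.4, size=5000) for _ in range(200)]
print(average_interval_length(replications, symmetric_ci),
      average_interval_length(replications, hpd_interval))
```

Because the HPD interval is the shortest region of the required posterior mass, its AIL is never larger than that of the symmetric interval, which is the pattern visible throughout the table.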