
It is of great importance for physicists and engineers to assess the lifetime distribution of a series-parallel system when its components' lifetimes follow a finite mixture of distributions. The present article addresses this problem by introducing a new distribution called the "Poisson-geometric-Lomax distribution". Important properties of the proposed distribution are discussed. When the stress is an increasing nonlinear function of time, the progressive-stress model is considered and the inverse power-law model is used to relate the stress to the scale parameter of the proposed distribution. Based on progressive type-II censoring with binomial removals, estimation of the included parameters is discussed using maximum likelihood and Bayes methods. An example, based on two real data sets, demonstrates the superiority of the proposed distribution over some other known distributions. To compare the performance of the implemented estimation methods, a simulation study is carried out. Finally, some concluding remarks, followed by certain features of and motivations for the proposed distribution, are presented.
Citation: Tahani A. Abushal, Alaa H. Abdel-Hamid. Inference on a new distribution under progressive-stress accelerated life tests and progressive type-II censoring based on a series-parallel system[J]. AIMS Mathematics, 2022, 7(1): 425-454. doi: 10.3934/math.2022028
Because of advancements in manufacturing and technical development, products and devices are becoming increasingly reliable, making typical life tests under normal conditions difficult, if not impossible. For industrial markets, such tests are too time-consuming and costly to yield sufficient information about the lifetime distribution of a product or even a prototype. As a result, the accelerated life test (ALT) is becoming more popular as well as more important, as it gives information on a highly reliable product's lifetime in a short amount of time, see [1,2]. More failures can be collected quickly by conducting the life test at stress levels higher than normal operating conditions. A suitable stress-response regression model is then used to estimate the lifetime distribution at the usage stress. As a particular class of ALT, the progressive-stress test implements a special stress-loading scheme in which the stress increases over time, see [3,4,5,6].
Censoring often occurs when some lifetimes of products are missing or when certain experimental design purposes are being pursued. Type-I and type-II censoring are the most prevalent schemes. These two types of censoring do not give the experimenter the flexibility to withdraw units from a life test at different stages during the experiment. Because of this lack of flexibility, progressive censoring has been proposed as a more general censoring technique. It enables the experimenter to withdraw units from a life test at various stages of the experiment, not only at its final point.
In reliability, engineering, biomedicine, physical studies, and other fields, units may fail due to one of several risk factors. If the risks are not identified, it may be unclear which factor caused a unit to fail, and as a result the lifetime associated with a particular risk cannot be determined. In such cases, only the minimum (maximum) lifetime value over all risks can be observed. This is referred to as a "series (parallel) system" in the literature. These two systems can be joined to form a new system known as a "series-parallel system". In a series system, components are connected in such a way that the failure of a single component leads to the failure of the system. On the other hand, a system in which the failure of all components leads to the failure of the system is called a parallel system. Finally, a series-parallel system is a system in which m subsystems are connected in series and each subsystem consists of k components connected in parallel, see its description in Figure 1. Such a system fails as soon as one subsystem fails. The technique of compounding distribution functions may be used to construct the lifetime distribution of the series-parallel system.
Abdel-Hamid and Hashem [7] introduced a new distribution for the series-parallel system by compounding two Poisson distributions (truncated at zero) with an exponential distribution. They used six estimation methods to estimate the included parameters. In [8], the same authors obtained a new distribution by compounding two discrete distributions with a mixture of continuous distributions based on a parallel-series system. Nadarajah et al. [9] applied the progressive-stress ALT technique, using type-I progressively hybrid censored data with binomial removals, to components connected in a parallel-series structure. Hu et al. [10] suggested an ideally distributed series-parallel system, as well as two analytical reliability assessment approaches, to analyze the reliability of a distributed power system. Hashem and Alyami [11] introduced a new distribution based on a parallel-series system.
In this article, we introduce a new distribution that describes the lifetime of a series-parallel system when the lifetimes of the included components are subject to a mixture of exponential and gamma distributions. The distribution, called the Poisson-geometric-Lomax distribution (PGLD), is obtained by compounding truncated Poisson, geometric, and Lomax distributions.
The rest of the article is organized as follows: The distribution of a series-parallel system based on a finite mixture of distributions is constructed in Section 2. Important properties of the PGLD are given in Section 3. In Section 4, the model is described based on progressive-stress ALT and progressive type-II censoring. In Section 5, estimation of the parameters is discussed using the maximum likelihood and Bayes methods. An illustrative example, based on two real data sets, is studied in Section 6. A simulation study is carried out in Section 7. Concluding remarks, followed by certain features and motivations of the PGLD as well as future work, are presented in Section 8.
A mixture of distributions is a combination of statistical distributions that occurs when sampling from inhomogeneous (or mixed) populations, with a different probability density function (PDF) in each component.
Let F(y∣ω) be the cumulative distribution function (CDF) of a continuous random variable (RV) Y, with realization y, that is dependent on a continuous RV Ω, with realization ω, and Π(ω) be the CDF of Ω. The marginal CDF F(y) is given by
\begin{equation} F(y) = \int_{\Omega}F(y\mid \omega)\,\text{d}\Pi(\omega), \end{equation} | (2.1) |
which is called a mixture (according to Teicher [12]) of the CDFs F(y∣ω) and Π(ω). Fisher [13] called F(y) "compound distribution".
The corresponding PDF is given by
\begin{equation} f(y) = \int_{\Omega}f(y\mid \omega)\,\pi(\omega)\,\text{d}\omega. \end{equation} | (2.2) |
If the RV \Omega assumes only a finite number of points \{\omega_{j},\, j = 1,\dots,\kappa\} , then \Pi(\omega) is a mass function assigning positive probabilities only to the \omega_{j} . The integral in (2.1) is then replaced by a sum to give a finite mixture of the form
\begin{equation} F(y) = \sum\limits_{j = 1}^{\kappa}F(y;\omega_{j})\,\Pi(\omega_{j}). \end{equation} | (2.3) |
Suppose, in (2.3), \Pi(\omega_{j}) = p_{j} , j = 1,\dots,\kappa , and F(y;\omega_{j}) = F_{j}(y) . Then (2.3) takes the form
\begin{equation} F(y) = \sum\limits_{j = 1}^{\kappa}p_{j}F_{j}(y). \end{equation} | (2.4) |
A corresponding finite mixture of PDFs is given by
\begin{equation} f(y) = \sum\limits_{j = 1}^{\kappa}p_{j}f_{j}(y). \end{equation} | (2.5) |
In (2.4) and (2.5) the masses pj are called mixing proportions with the conditions:
0\leq p_{j}\leq 1,\; j = 1,\dots,\kappa, \quad \text{and} \quad \sum\limits_{j = 1}^{\kappa}p_{j} = 1.
The functions F_{j} and f_{j} are called the j th components in the finite mixtures of CDFs (2.4) and PDFs (2.5), respectively. For more details on finite mixtures of distributions, see [14,15]. Several distributions may arise from the compound distribution (2.2) and the mixture distribution (2.5), see [16], such as
(i) If in (2.2), f(y\mid\omega) has the exp (\omega) distribution and \pi(\omega) has the gamma (\alpha,\gamma) distribution, then we get the Lomax (\alpha,\gamma) distribution (LD) as follows:
\begin{equation} \text{Lomax}(\alpha,\gamma)\equiv f(y;\alpha,\gamma) = \int_{0}^{\infty}\omega e^{-\omega y}\,\frac{\gamma^{\alpha}}{\Gamma(\alpha)}\,\omega^{\alpha-1}e^{-\gamma \omega}\,\text{d}\omega = \frac{\alpha}{\gamma}\left(1+\frac{y}{\gamma}\right)^{-(\alpha+1)},\; y > 0,\;(\alpha,\gamma > 0). \end{equation} | (2.6) |
Therefore, the corresponding CDF is given by
\begin{equation} F(y;\alpha,\gamma) = 1-\left(1+\frac{y}{\gamma}\right)^{-\alpha},\; y > 0,\;(\alpha,\gamma > 0). \end{equation} | (2.7) |
(ii) If in (2.5), \kappa = 2 , p_{1} = \frac{\theta}{\theta+1} , f_{1}(y) has the exp (\theta) distribution and f_{2}(y) has the gamma (2,\theta) distribution, then we get the Lindley distribution as follows:
\text{Lindley}(\theta)\equiv f(y;\theta) = \frac{\theta}{\theta+1}\left[\theta e^{-\theta y}\right]+\frac{1}{\theta+1}\left[\theta^{2}ye^{-\theta y}\right] = \frac{\theta^{2}}{\theta+1}(1+y)e^{-\theta y},\; y > 0,\;(\theta > 0).
(iii) If in (2.2), f(y\mid \Omega = \omega) = \omega f_{1}(y)+(1-\omega)f_{2}(y) and the RV \Omega follows a beta distribution, B(a,b) , then
\begin{equation} f(y) = \int_{0}^{1}f(y\mid\omega)\,\pi(\omega)\,\text{d}\omega = \frac{1}{B(a,b)}\int_{0}^{1}\left(\omega f_{1}(y)+(1-\omega)f_{2}(y)\right)\omega^{a-1}(1-\omega)^{b-1}\,\text{d}\omega = \frac{a}{a+b}f_{1}(y)+\frac{b}{a+b}f_{2}(y),\; y > 0,\;(a,b > 0). \end{equation} | (2.8) |
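The exponential-gamma compounding in (2.6) can be checked numerically. The following minimal NumPy sketch (all parameter values are illustrative, not taken from the paper) draws from the two-stage hierarchy and compares the empirical CDF with the Lomax CDF (2.7):

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, gamma_ = 2.0, 3.0        # illustrative shape and scale
n = 200_000

# Step 1: draw the mixing rate omega from gamma(alpha, gamma_) (shape alpha, rate gamma_)
omega = rng.gamma(shape=alpha, scale=1.0 / gamma_, size=n)
# Step 2: draw y | omega from an exponential distribution with rate omega
y = rng.exponential(scale=1.0 / omega)

# The marginal CDF should be Lomax: F(y) = 1 - (1 + y/gamma_)**(-alpha), Eq (2.7)
for t in (0.5, 1.0, 2.0, 5.0):
    empirical = (y <= t).mean()
    lomax = 1.0 - (1.0 + t / gamma_) ** (-alpha)
    assert abs(empirical - lomax) < 0.01
```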
Several new distributions of the series-parallel system may emerge by choosing different distributions for the number of subsystems as well as those describing the lifetimes of the included components, as explained in [7,8], but in exchange, several questions may arise, for example:
● Does the presented distribution have any interesting properties?
● Does it arise naturally in some observable process like natural phenomena?
● Does it have any motivations?
There is no need for new distributions if there are no positive answers to the above questions. A response to these questions will be illustrated in Section 8.
According to [7,8], the following theorem gives the CDF and PDF of the PGLD which represents the failure time distribution of the series-parallel system when the number of series subsystems and the number of their components, that are connected as a parallel structure, are RVs subject to truncated Poisson and geometric distributions, respectively. At the same time, the failure times of the components have a finite mixture of distribution functions each of which may be responsible for a different cause of failure.
Theorem 2.1. Suppose that, for i = 1,2,\dots,k_{j} , j = 1,2,\dots,m , a mixed system of series-parallel structure type has the lifetime X = \min_{j}(\max_{i}Y_{ij}) , where Y_{ij} are IID RVs, see Figure 1, with PDF (CDF) f_{Y}(y) ( F_{Y}(y) ) of the Lomax distribution given by (2.6) ((2.7)). Suppose further that M and K_{j} are two discrete RVs following truncated Poisson and geometric distributions with PMFs P(M = m) = \frac{e^{-\lambda}\lambda^{m}}{m!\,(1-e^{-\lambda})},\; m = 1,2,\dots,\;(\lambda > 0) and P(K_{j} = k_{j}) = (1-\theta)\theta^{k_{j}-1},\; k_{j} = 1,2,\dots,\;(0 < \theta < 1) , respectively. Then the distribution of X is the PGLD with CDF and PDF given, respectively, by
\begin{equation} F_{X}(x) = \frac{1-e^{-\lambda \Omega(x)}}{1-e^{-\lambda}}, \end{equation} | (2.9) |
\begin{equation} f_{X}(x) = \frac{\lambda\, \Omega^{2}(x)\, e^{-\lambda \Omega(x)}\, f_{Y}(x)}{(1-\theta)(1-e^{-\lambda})\, F_{Y}^{2}(x)}, \end{equation} | (2.10) |
where
\begin{equation} \Omega(x) = \frac{(1-\theta)F_{Y}(x)}{1-\theta F_{Y}(x)},\; x > 0. \end{equation} | (2.11) |
Proof. The proof is similar to that in [7].
Remark 2.1. It is worth noting that while γ is a scale parameter for LD with CDF (2.7), it is also a scale parameter for PGLD with CDF (2.9).
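To make Theorem 2.1 concrete, here is a minimal NumPy sketch of the CDF (2.9) and PDF (2.10); the parameter values are illustrative, not estimates from the paper, and the PDF is validated against a numerical derivative of the CDF:

```python
import numpy as np

def pgld_cdf(x, alpha, gamma_, lam, theta):
    """CDF (2.9) of the PGLD, built from the Lomax CDF (2.7) via Omega in (2.11)."""
    F_Y = 1.0 - (1.0 + x / gamma_) ** (-alpha)
    Omega = (1.0 - theta) * F_Y / (1.0 - theta * F_Y)
    return (1.0 - np.exp(-lam * Omega)) / (1.0 - np.exp(-lam))

def pgld_pdf(x, alpha, gamma_, lam, theta):
    """PDF (2.10) of the PGLD (x > 0)."""
    F_Y = 1.0 - (1.0 + x / gamma_) ** (-alpha)
    f_Y = (alpha / gamma_) * (1.0 + x / gamma_) ** (-(alpha + 1.0))  # Lomax PDF (2.6)
    Omega = (1.0 - theta) * F_Y / (1.0 - theta * F_Y)
    return (lam * Omega**2 * np.exp(-lam * Omega) * f_Y
            / ((1.0 - theta) * (1.0 - np.exp(-lam)) * F_Y**2))

# Sanity check: the PDF matches the numerical derivative of the CDF
params = (2.0, 3.0, 1.5, 0.4)  # illustrative (alpha, gamma, lambda, theta)
h = 1e-5
for x0 in (0.5, 1.0, 3.0):
    slope = (pgld_cdf(x0 + h, *params) - pgld_cdf(x0 - h, *params)) / (2 * h)
    assert abs(slope - pgld_pdf(x0, *params)) < 1e-6
```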
The PDF and hazard rate function (HRF), h(x) = f(x)/(1-F(x)) , of the PGLD are plotted in Figure 2, where one can observe that the PDF exhibits decreasing and unimodal shapes, while the HRF exhibits a unimodal shape with possible sudden fluctuations at its end. Such fluctuations usually imply that the product's performance has degraded over time. Non-stationary data can exhibit this characteristic, and the PGLD can help to represent such data. The non-stationary nature of failure times may help the researcher forecast how some items will behave in the environment.
The features and motivations of the PGLD with CDF and PDF (2.9) and (2.10) are summarized in the last section.
In the following section, some important properties of the PGLD are given.
In this section, some important properties of the PGLD, such as the q-th quantile, mode, r-th moment, mean residual lifetime, Bonferroni and Lorenz curves, Rényi and Shannon's entropies, PDF and CDF of the i-th order statistic, are given.
Theorem 3.1. The q-th quantile xq of the PGLD with CDF (2.9) can be obtained as
\begin{equation} x_{q} = \gamma\left[\left(1+\frac{\ln\left[1-q(1-e^{-\lambda})\right]}{\lambda-\lambda\theta-\theta \ln\left[1-q(1-e^{-\lambda})\right]}\right)^{-\frac{1}{\alpha}}-1\right]. \end{equation} | (3.1) |
Proof. By solving the equation FX(xq)=q with respect to xq, the proof can be achieved immediately.
Remark 3.1. As a particular case, the median of PGLD with CDF (2.9) can be obtained by putting q=1/2 in Eq (3.1).
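The closed form (3.1) can be verified numerically by feeding the quantile back through the CDF (2.9). A minimal sketch, with illustrative parameter values:

```python
import numpy as np

def pgld_quantile(q, alpha, gamma_, lam, theta):
    """q-th quantile of the PGLD, Eq (3.1)."""
    A = np.log(1.0 - q * (1.0 - np.exp(-lam)))
    inner = 1.0 + A / (lam - lam * theta - theta * A)
    return gamma_ * (inner ** (-1.0 / alpha) - 1.0)

def pgld_cdf(x, alpha, gamma_, lam, theta):
    """CDF (2.9), reproduced here to verify the closed-form quantile."""
    F_Y = 1.0 - (1.0 + x / gamma_) ** (-alpha)
    Omega = (1.0 - theta) * F_Y / (1.0 - theta * F_Y)
    return (1.0 - np.exp(-lam * Omega)) / (1.0 - np.exp(-lam))

params = (2.0, 3.0, 1.5, 0.4)  # illustrative (alpha, gamma, lambda, theta)
for q in (0.1, 0.5, 0.9):      # q = 0.5 gives the median of Remark 3.1
    xq = pgld_quantile(q, *params)
    assert abs(pgld_cdf(xq, *params) - q) < 1e-10
```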
Theorem 3.2. Let X be a RV subject to the PGLD with PDF (2.10). Then the mode is given by
\begin{equation} x^{*} = \gamma\left(\left(\frac{-D_{2}+\sqrt{D_{2}^{2}-4D_{1}D_{3}}}{2D_{1}}\right)^{\frac{1}{\alpha}}-1\right), \end{equation} | (3.2) |
where
D_{1} = (\alpha+1)(1-\theta)^{2},\quad D_{2} = 2\theta(1-\theta)(\alpha+1)+\alpha(\lambda-2\theta)(1-\theta),\quad D_{3} = \theta^{2}(1-\alpha).
Proof. The mode can be obtained directly by solving \frac{\text{d} \ln[f_{X}(x)]}{\text{d}x} = 0 with respect to x .
Theorem 3.3. Let X be a RV subject to PGLD with PDF (2.10). Then, for r=1,2,…, the r-th moment of X is given by
m^{(r)} = \sum\limits_{\imath = 0}^{N}\frac{2\nu_{\imath}}{(1-y_{\imath})^{2}}\,\Phi^{(r)}(y_{\imath}^{*}),
where y_{\imath} and \nu_{\imath} are the zeros and the corresponding Christoffel numbers of the Legendre-Gauss quadrature formula on the interval (-1, 1) , see Canuto et al. [17].
Proof. The r-th moment of X is given by
m^{(r)} = \int_{0}^{\infty}x^{r}f_{X}(x)\,\text{d}x = \frac{\alpha\lambda}{\gamma(1-\theta)(1-e^{-\lambda})}\int_{-1}^{1}\frac{2}{(1-y)^{2}}\,\Phi^{(r)}(y^{*})\,\text{d}y,
where y^{*} = \frac{1+y}{1-y} , and
\begin{equation} \Phi^{(r)}(y) = \frac{y^{r}\,\Omega^{2}(y)\,e^{-\lambda \Omega(y)}\left(1+\frac{y}{\gamma}\right)^{-\alpha-1}}{\left(1-\left(1+\frac{y}{\gamma}\right)^{-\alpha}\right)^{2}}. \end{equation} | (3.3) |
The last integral can be approximated, by using Legendre-Gauss quadrature formula, as
\begin{equation} m^{(r)} = \sum\limits_{\imath = 0}^{N}\frac{2\nu_{\imath}}{(1-y_{\imath})^{2}}\,\Phi^{(r)}(y_{\imath}^{*}), \end{equation} | (3.4) |
where
\begin{equation} \nu_{\imath} = \frac{2}{(1-y_{\imath}^{2})\left[L_{N+1}^{'}(y_{\imath})\right]^{2}}\quad\text{and}\quad L_{N+1}^{'}(y_{\imath}) = \left.\frac{\text{d}L_{N+1}(y)}{\text{d}y}\right|_{y = y_{\imath}}, \end{equation} | (3.5) |
and L_{N}(.) is the Legendre polynomial of degree N .
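Theorem 3.3 is easy to implement with off-the-shelf Legendre-Gauss nodes and weights, which play the roles of y_ı and ν_ı. The sketch below (illustrative parameter values, not from the paper) cross-checks the quadrature mean against inverse-transform Monte Carlo using the quantile (3.1):

```python
import numpy as np

def pgld_moment(r, alpha, gamma_, lam, theta, N=100):
    """r-th moment of the PGLD via the Legendre-Gauss form (3.3)-(3.5).
    leggauss returns the nodes y_i and Christoffel numbers nu_i on (-1, 1)."""
    y, nu = np.polynomial.legendre.leggauss(N)
    x = (1.0 + y) / (1.0 - y)                      # map (-1, 1) onto (0, infinity)
    F_Y = 1.0 - (1.0 + x / gamma_) ** (-alpha)
    Omega = (1.0 - theta) * F_Y / (1.0 - theta * F_Y)
    Phi = (x**r * Omega**2 * np.exp(-lam * Omega)
           * (1.0 + x / gamma_) ** (-alpha - 1.0) / F_Y**2)
    const = alpha * lam / (gamma_ * (1.0 - theta) * (1.0 - np.exp(-lam)))
    return const * np.sum(2.0 * nu / (1.0 - y) ** 2 * Phi)

# Cross-check the mean (r = 1) against inverse-transform Monte Carlo
alpha, gamma_, lam, theta = 3.0, 3.0, 1.5, 0.4     # illustrative values
m1 = pgld_moment(1, alpha, gamma_, lam, theta)
rng = np.random.default_rng(1)
u = rng.uniform(size=400_000)
A = np.log(1.0 - u * (1.0 - np.exp(-lam)))         # quantile (3.1)
xs = gamma_ * ((1.0 + A / (lam - lam * theta - theta * A)) ** (-1.0 / alpha) - 1.0)
assert abs(m1 - xs.mean()) / m1 < 0.05
```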
Theorem 3.4. The mean residual lifetime of the PGLD is given by
MRL(x_{0}) = \frac{2x_{0}}{e^{-\lambda \Omega(x_{0})}-e^{-\lambda}}\sum\limits_{\imath = 0}^{N}\frac{\nu_{\imath}}{(1-y_{\imath})^{2}}\left(e^{-\lambda \Omega\left(\frac{2x_{0}}{1-y_{\imath}}\right)}-e^{-\lambda}\right),
where \nu_{\imath} is given by (3.5).
Proof. The mean residual lifetime of the PGLD is given by
MRL(x_{0}) = E[X-x_{0}\mid X > x_{0}] = \frac{1}{S(x_{0})}\int_{x_{0}}^{\infty}S(x)\,\text{d}x = \frac{1}{S(x_{0})}\int_{-1}^{1}\frac{2x_{0}}{(1-y)^{2}}\,S\!\left(\frac{2x_{0}}{1-y}\right)\text{d}y,
where S(x)=1−F(x) is the survival function.
The last integral can be approximated, by using Legendre-Gauss quadrature formula, as
MRL(x_{0}) = \frac{2x_{0}}{e^{-\lambda \Omega(x_{0})}-e^{-\lambda}}\sum\limits_{\imath = 0}^{N}\frac{\nu_{\imath}}{(1-y_{\imath})^{2}}\left(e^{-\lambda \Omega\left(\frac{2x_{0}}{1-y_{\imath}}\right)}-e^{-\lambda}\right),
where \nu_{\imath} is given by (3.5).
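Theorem 3.4 can likewise be checked numerically. The sketch below (illustrative parameter values) evaluates the Legendre-Gauss form of the mean residual lifetime and compares it with direct trapezoidal integration of the survival function:

```python
import numpy as np

alpha, gamma_, lam, theta = 3.0, 3.0, 1.5, 0.4   # illustrative parameter values

def survival(x):
    """Survival function S(x) = 1 - F_X(x) of the PGLD, from (2.9) and (2.11)."""
    F_Y = 1.0 - (1.0 + x / gamma_) ** (-alpha)
    Omega = (1.0 - theta) * F_Y / (1.0 - theta * F_Y)
    return (np.exp(-lam * Omega) - np.exp(-lam)) / (1.0 - np.exp(-lam))

def mrl(x0, N=100):
    """Mean residual lifetime via the Legendre-Gauss form of Theorem 3.4."""
    y, nu = np.polynomial.legendre.leggauss(N)   # nodes y_i, Christoffel numbers nu_i
    return np.sum(nu * 2.0 * x0 / (1.0 - y) ** 2
                  * survival(2.0 * x0 / (1.0 - y))) / survival(x0)

# Cross-check against direct numerical integration of S over (x0, infinity)
x0 = 1.0
grid = np.linspace(x0, 2000.0, 400_000)
vals = survival(grid)
direct = np.sum((vals[1:] + vals[:-1]) / 2 * np.diff(grid)) / survival(x0)
assert abs(mrl(x0) - direct) / direct < 0.01
```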
Theorem 3.5. Let X be a RV subject to the PGLD with PDF (2.10). Then, the Bonferroni curve (BC) and Lorenz curve (LC) are given, respectively, by
\begin{equation} BC(\eta) = A_{\eta}\sum\limits_{\imath = 0}^{N}\nu_{\imath}\,\Phi^{(1)}\!\left(\frac{p_{\eta}}{2}(y_{\imath}+1)\right),\qquad LC(\eta) = \eta\, A_{\eta}\sum\limits_{\imath = 0}^{N}\nu_{\imath}\,\Phi^{(1)}\!\left(\frac{p_{\eta}}{2}(y_{\imath}+1)\right), \end{equation} | (3.6) |
where \nu_{\imath} is given by (3.5), 0 < \eta < 1 , and
A_{\eta} = \frac{\alpha\lambda\, p_{\eta}}{2\,m^{(1)}\eta\,\gamma(1-\theta)(1-e^{-\lambda})},
p_{\eta} = F^{-1}(\eta) = \gamma\left(\left(1+\frac{\ln\left[1-\eta(1-e^{-\lambda})\right]}{\lambda-\lambda\theta-\theta \ln\left[1-\eta(1-e^{-\lambda})\right]}\right)^{-\frac{1}{\alpha}}-1\right),
and \Phi^{(1)}(.) and m^{(1)} are given, respectively, by (3.3) and (3.4) at r = 1 .
Proof. The Bonferroni curve of PGLD is given by
BC(\eta) = \frac{1}{\eta\, m^{(1)}}\int_{0}^{p_{\eta}}x f_{X}(x)\,\text{d}x = A_{\eta}\int_{-1}^{1}\Phi^{(1)}\!\left(\frac{p_{\eta}}{2}(y+1)\right)\text{d}y = A_{\eta}\sum\limits_{\imath = 0}^{N}\nu_{\imath}\,\Phi^{(1)}\!\left(\frac{p_{\eta}}{2}(y_{\imath}+1)\right).
The Lorenz curve of PGLD is given by
LC(\eta) = \frac{1}{m^{(1)}}\int_{0}^{p_{\eta}}x f_{X}(x)\,\text{d}x = \eta\, A_{\eta}\sum\limits_{\imath = 0}^{N}\nu_{\imath}\,\Phi^{(1)}\!\left(\frac{p_{\eta}}{2}(y_{\imath}+1)\right).
The Bonferroni and Lorenz curves are plotted in Figure 3.
Theorem 3.6. Let X be a RV subject to the PGLD with PDF (2.10). Then, the Rényi and Shannon's entropies of X are given, respectively, by
\begin{equation} \begin{split} RE(\ell) = &\frac{1}{1-\ell}\left(\ell \ln\left[\frac{\alpha\lambda}{\gamma(1-\theta)(1-e^{-\lambda})}\right]+\ln\left[\sum\limits_{\imath = 0}^{N}\frac{2\nu_{\imath}}{(1-y_{\imath})^{2}}\,W_{\ell}\!\left(\frac{1+y_{\imath}}{1-y_{\imath}}\right)\right]\right),\\ SHE = &\ln\left[\frac{\gamma(1-\theta)(1-e^{-\lambda})}{\alpha\lambda}\right]+\frac{\alpha\lambda}{\gamma(1-\theta)(1-e^{-\lambda})}\sum\limits_{\imath = 0}^{N}\frac{2\nu_{\imath}}{(1-y_{\imath})^{2}}\,W^{*}\!\left(\frac{1+y_{\imath}}{1-y_{\imath}}\right)\Phi^{(0)}\!\left(\frac{1+y_{\imath}}{1-y_{\imath}}\right), \end{split} \end{equation} | (3.7) |
where \nu_{\imath} is given by (3.5) and
W_{\ell}(y) = \frac{\Omega^{2\ell}(y)\,e^{-\lambda\ell \Omega(y)}\left(1+\frac{y}{\gamma}\right)^{-\alpha\ell-\ell}}{\left[1-\left(1+\frac{y}{\gamma}\right)^{-\alpha}\right]^{2\ell}},
W^{*}(y) = -2\ln[\Omega(y)]+\lambda \Omega(y)+(\alpha+1)\ln\left[1+\frac{y}{\gamma}\right]+2\ln\left[1-\left(1+\frac{y}{\gamma}\right)^{-\alpha}\right].
Proof. The Rényi entropy of X is given by
RE(\ell) = \frac{1}{1-\ell}\ln\left[\int_{0}^{\infty}f_{X}^{\ell}(x)\,\text{d}x\right],
where \ell > 0 and \ell \neq 1 .
Based on PDF (2.10), we obtain
\int_{0}^{\infty}f_{X}^{\ell}(x)\,\text{d}x = \left[\frac{\alpha\lambda}{\gamma(1-\theta)(1-e^{-\lambda})}\right]^{\ell}\int_{0}^{\infty}W_{\ell}(x)\,\text{d}x = \left[\frac{\alpha\lambda}{\gamma(1-\theta)(1-e^{-\lambda})}\right]^{\ell}\sum\limits_{\imath = 0}^{N}\frac{2\nu_{\imath}}{(1-y_{\imath})^{2}}\,W_{\ell}\!\left(\frac{1+y_{\imath}}{1-y_{\imath}}\right).
Then,
RE(\ell) = \frac{1}{1-\ell}\left(\ell \ln\left[\frac{\alpha\lambda}{\gamma(1-\theta)(1-e^{-\lambda})}\right]+\ln\left[\sum\limits_{\imath = 0}^{N}\frac{2\nu_{\imath}}{(1-y_{\imath})^{2}}\,W_{\ell}\!\left(\frac{1+y_{\imath}}{1-y_{\imath}}\right)\right]\right).
The Shannon entropy of X is given by
SHE = E\left[-\ln[f_{X}(x)]\right] = \ln\left[\frac{\gamma(1-\theta)(1-e^{-\lambda})}{\alpha\lambda}\right]+E[W^{*}(x)] = \ln\left[\frac{\gamma(1-\theta)(1-e^{-\lambda})}{\alpha\lambda}\right]+\frac{\alpha\lambda}{\gamma(1-\theta)(1-e^{-\lambda})}\sum\limits_{\imath = 0}^{N}\frac{2\nu_{\imath}}{(1-y_{\imath})^{2}}\,W^{*}\!\left(\frac{1+y_{\imath}}{1-y_{\imath}}\right)\Phi^{(0)}\!\left(\frac{1+y_{\imath}}{1-y_{\imath}}\right).
Theorem 3.7. Let X_{1},\dots,X_{n} be a random sample from the PGLD with CDF (2.9) and PDF (2.10). Then, the PDF and CDF of the j -th order statistic, say X_{j:n} , are given, respectively, by
\begin{equation} \begin{split} f_{j:n}(x) = &j\; \binom{n}{j} \frac{\alpha\lambda}{\gamma (1-\theta)} \frac{\Omega^{2}(x) \left(1+\frac{x}{\gamma} \right)^{-\alpha-1}}{\left[ 1-\left(1+\frac{x}{\gamma} \right)^{-\alpha}\right]^{2}} \\ \times& \sum\limits_{r_{1} = 0}^{n-j} \sum\limits_{r_{2} = 0}^{j+r_{1}-1} (-1)^{r_{1}+r_{2}} \binom{n-j}{r_{1}} \binom{j+r_{1}-1}{r_{2}} \frac{e^{-\lambda (1+r_{2}) \Omega(x)}}{(1-e^{-\lambda})^{r_{1}+j}}, \end{split} \end{equation} | (3.8) |
\begin{equation} \begin{split} F_{j:n}(x)& = \sum\limits_{r_{3} = j}^{n} \sum\limits_{r_{4} = 0}^{n-r_{3}}\sum\limits_{r_{5} = 0}^{r_{3}+r_{4}}(-1)^{r_{4}+r_{5}} \binom{n}{r_{3}} \binom{n-r_{3}}{r_{4}}\binom{r_{3}+r_{4}}{r_{5}} \frac{e^{-\lambda r_{5} \Omega(x)}}{(1-e^{-\lambda})^{r_{3}+r_{4}}}. \end{split} \end{equation} | (3.9) |
Proof. The PDF f_{j:n}(x) of the j -th order statistic, see [18,19], is given by
\begin{equation} \begin{split} f_{j:n}(x)& = j \binom{n}{j} f_{X}(x) [F_{X}(x)]^{j-1} [1-F_{X}(x)]^{n-j}, \end{split} \end{equation} | (3.10) |
where F_{X}(x) and f_{X}(x) are given by Eqs (2.9) and (2.10), respectively.
Therefore,
\begin{split} f_{j:n}(x) = &j\; \binom{n}{j}\sum\limits_{r_{1} = 0}^{n-j}(-1)^{r_{1}}\binom{n-j}{r_{1}} f_{X}(x) [F_{X}(x)]^{j+r_{1}-1}\\ = &j\; \binom{n}{j} \frac{\alpha\lambda}{\gamma (1-\theta)} \frac{\Omega^{2}(x) \left(1+\frac{x}{\gamma} \right)^{-\alpha-1}}{\left[ 1-\left(1+\frac{x}{\gamma} \right)^{-\alpha}\right]^{2}}\\ \times& \sum\limits_{r_{1} = 0}^{n-j} \sum\limits_{r_{2} = 0}^{j+r_{1}-1} (-1)^{r_{1}+r_{2}} \binom{n-j}{r_{1}} \binom{j+r_{1}-1}{r_{2}} \frac{e^{-\lambda (1+r_{2}) \Omega(x)}}{(1-e^{-\lambda})^{r_{1}+j}}. \end{split} |
The CDF F_{j:n}(x) , corresponding to PDF (3.10), is given by
\begin{split} F_{j:n}(x)& = \sum\limits_{r_{3} = j}^{n} \binom{n}{r_{3}} [F_{X}(x)]^{r_{3}} [1-F_{X}(x)]^{n-r_{3}}\\ & = \sum\limits_{r_{3} = j}^{n} \sum\limits_{r_{4} = 0}^{n-r_{3}}\sum\limits_{r_{5} = 0}^{r_{3}+r_{4}}(-1)^{r_{4}+r_{5}} \binom{n}{r_{3}} \binom{n-r_{3}}{r_{4}}\binom{r_{3}+r_{4}}{r_{5}} \frac{e^{-\lambda r_{5} \Omega(x)}}{(1-e^{-\lambda})^{r_{3}+r_{4}}}. \end{split} |
Based on units connected in a series-parallel structure, we discuss, in this section, the application of a progressive-stress model to units whose lifetimes follow the PGLD with CDF (2.9). We assume that the units are subject to progressive type-II censoring and that the number of surviving units removed at each failure follows a binomial distribution.
In the previous literature on the progressive-stress model, the stress is usually considered an increasing linear function of time. In the current article, the stress is instead assumed to be an increasing nonlinear function of time. The model is based on the following assumptions:
(1) The lifetime of units under normal conditions is governed by CDF (2.9) of PGLD with scale parameter \gamma .
(2) According to the progressive-stress model, it is assumed that the stress s is a function of time t and affects the scale parameter \gamma of CDF (2.7) which is also a scale parameter of CDF (2.9).
(3) The parameter \gamma follows the inverse power law with two parameters c and d . This means that \gamma \equiv \gamma(t) = \frac{1}{c(s(t))^{d}} . For the sake of simplicity, we assume from now on that the parameter d takes the value 1.
(4) The progressive stress s(t) is an increasing nonlinear function of time, s(t) = \text{sinh}(v\, t) , which is continuous and differentiable for t > 0 .
(5) To start the testing procedure, the total N units to be tested are split into \hbar(\geq 2) groups, each of them consists of n_i units under progressive-stress s_i(t) = \text{sinh}(v_i t), \; i = 1, 2, \dots, \hbar, such that the stress rates satisfy 0 < v_1 < v_2 < \dots < v_\hbar.
(6) At any stress level v_i, \; i = 1, 2, \dots, \hbar , a unit's failure mechanisms are the same.
(7) The cumulative exposure model holds for the effect of changing stress, see [1].
According to Assumptions 1, 3, and 5, and using Assumption 7, we can write
\begin{equation} \Phi_i(x) = \int_{0}^{x}\frac{1}{\gamma (s_i(t))}\text{d}t = \frac{c}{v_i}(\text{cosh}(v_i x)-1), i = 1,2,\dots, \hbar, \end{equation} | (4.1) |
where c is a parameter that should be estimated.
Therefore, if G_{iY}(.) denotes the CDF for a unit in group i under progressive-stress model, then using Assumption 7, it takes the form
\begin{equation} G_{iY}(x) = F_{iY}(\Phi_{i}(x)) = 1-\left[1+\frac{c}{v_i}(\text{cosh}(v_i x)-1)\right]^{-\alpha} = 1-[\Psi_i(x)]^{-\alpha}, \end{equation} | (4.2) |
where F_{iY}(.) is the CDF (2.7) of LD, included in (2.9), under group i , with scale parameter value equal to 1 and
\begin{equation} \Psi_i(x) = \Phi_i(x)+1, \end{equation} | (4.3) |
where \Phi_i(x) is given by (4.1).
The corresponding PDF of (4.2) is given by
\begin{equation} g_{iY}(x) = \alpha\, c\, \text{sinh}(v_ix) \left[\Psi_i(x)\right]^{-\alpha-1}. \end{equation} | (4.4) |
Then, by replacing F_Y(y) and f_Y(y) with CDF (4.2) and PDF (4.4), respectively, CDF (2.9) under progressive-stress ALT becomes
\begin{equation} F_{iX}(x) = \frac{1-e^{-\lambda \Omega_i(x)}}{1-e^{-\lambda}}. \end{equation} | (4.5) |
The corresponding PDF of (4.5) is given by
\begin{equation} f_{iX}(x) = \frac{\lambda\, \alpha\, c\, \Omega_i^{2}(x)\,\text{sinh}(v_i x)\,\Psi_{i}^{-\alpha-1}(x)}{(1-\theta)(1-e^{-\lambda})e^{\lambda \Omega_i(x)}(1-\Psi_{i}^{-\alpha}(x))^{2}}, \end{equation} | (4.6) |
where
\begin{equation} \Omega_i(x) = \frac{(1-\theta)(1-\Psi_{i}^{-\alpha}(x))}{1-\theta(1-\Psi_{i}^{-\alpha}(x))}, x > 0, \end{equation} | (4.7) |
and \Psi_i(x) is given by (4.3).
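The chain (4.1)-(4.7) can be sketched in a few lines of NumPy. The checks below (with illustrative, not fitted, parameter values) confirm that (4.5) is a proper CDF for each stress rate and that a higher stress rate yields stochastically smaller lifetimes:

```python
import numpy as np

def pgld_stress_cdf(x, alpha, lam, theta, c, v):
    """CDF (4.5) under progressive stress s(t) = sinh(v t), via (4.1)-(4.3), (4.7)."""
    Psi = 1.0 + (c / v) * (np.cosh(v * x) - 1.0)   # Psi_i(x), Eqs (4.1) and (4.3)
    G = 1.0 - Psi ** (-alpha)                       # G_{iY}(x), Eq (4.2)
    Omega = (1.0 - theta) * G / (1.0 - theta * G)   # Omega_i(x), Eq (4.7)
    return (1.0 - np.exp(-lam * Omega)) / (1.0 - np.exp(-lam))

alpha, lam, theta, c = 2.0, 1.5, 0.4, 0.8           # illustrative values
x = np.linspace(0.0, 4.0, 200)
for v in (0.5, 1.0, 2.0):                            # increasing stress rates v_1 < v_2 < v_3
    F = pgld_stress_cdf(x, alpha, lam, theta, c, v)
    assert F[0] == 0.0 and np.all(np.diff(F) > 0) and F[-1] < 1.0
# A higher stress rate should give a larger CDF at every x > 0
assert np.all(pgld_stress_cdf(x[1:], alpha, lam, theta, c, 2.0)
              > pgld_stress_cdf(x[1:], alpha, lam, theta, c, 0.5))
```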
There is a variety of censoring schemes available, and progressive type-II censoring is one of the most widely used. It is implemented under the progressive-stress model as follows: Suppose that, in the i -th group, i = 1, 2, \dots, \hbar , n_i units are put on a life test and the experimenter specifies beforehand the quantity w_i , the number of failures that will be observed. When the first failure occurs, r_{i1} of the remaining n_i-1 surviving units are removed from the experiment at random. When the second failure occurs, r_{i2} of the remaining {n_i}-r_{i1}-2 surviving units are removed from the experiment at random. Finally, when the w_i -th failure occurs, all the remaining r_{iw_i} = n_i-w_i-r_{i1}-r_{i2}-\dots-r_{i\, w_i-1} surviving units are removed from the experiment. Note that, in this scheme, r_{i1}, r_{i2}, \dots, r_{iw_i} are all prefixed. In some practical situations, however, these numbers may occur at random. For example, in some reliability experiments, an experimenter may judge that it is unsuitable or too risky to keep some of the tested units on test even though they have not failed. In such situations, the pattern of removal at each failure is random. This leads to progressive censoring with random removals, see [20,21].
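The censoring mechanism just described is easy to simulate. The following sketch (assuming NumPy; the exponential lifetimes and the values of n, w, p are illustrative stand-ins) runs one group of the experiment with binomial removals:

```python
import numpy as np

rng = np.random.default_rng(2)

def progressive_type2_binomial(lifetimes, w, p):
    """Simulate progressive type-II censoring with binomial removals.
    Returns the w observed failure times and the removal counts r_1, ..., r_w."""
    alive = list(np.sort(lifetimes))
    n = len(alive)
    observed, removals = [], []
    for j in range(1, w + 1):
        observed.append(alive.pop(0))          # next failure is the smallest survivor
        if j < w:
            max_r = n - w - sum(removals)      # units still available for removal
            r = rng.binomial(max_r, p)         # R_ij ~ b(n - w - sum of prior r, p)
        else:
            r = len(alive)                     # all survivors leave at the w-th failure
        for _ in range(r):                     # withdraw r survivors at random
            alive.pop(rng.integers(len(alive)))
        removals.append(r)
    return np.array(observed), np.array(removals)

n, w, p = 30, 12, 0.3
x, r = progressive_type2_binomial(rng.exponential(size=n), w, p)
assert len(x) == w and np.all(np.diff(x) > 0)  # ordered failure times
assert r.sum() == n - w                        # failures + removals account for all n units
```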
We discuss, in this section, two estimation methods (maximum likelihood estimation (MLE) and Bayesian estimation (BE)) for the parameters included in CDF (4.5) under progressive type-II censoring with binomial removals.
For group i , i = 1, 2, \dots, \hbar , suppose x_{i1}, x_{i2}, \dots, x_{in_i} are n_i lifetimes from a population with CDF (4.5) and PDF (4.6), and {\bf x} = \left(x_{i1:w_i:n_i}^{(r_{i1}\dots, r_{iw_i})}, x_{i2:w_i:n_i}^{(r_{i1}\dots, r_{iw_i})}\right. , \left.\dots, x_{iw_i:w_i:n_i}^{(r_{i1}\dots, r_{iw_i})}\right) is a vector of w_i progressively type-II ordered lifetimes out of n_i with progressive censoring scheme (r_{i1}\dots, r_{iw_i}) . Suppose also w_i and r_{ij} are all predetermined before the test. From now on, we will write x_{ij} instead of x_{ij:w_i:n_i}^{(r_{i1}\dots, r_{iw_i})} , j = 1, 2, \dots, w_i , for simplicity. Then, based on Eqs (4.2)–(4.7), the conditional likelihood function is given by
\begin{equation} \begin{split} L_1(\alpha,\lambda,\theta,c; {\bf x} \mid {\bf R} = {\bf r})\propto &\prod\limits_{i = 1}^{\hbar} \prod\limits_{j = 1}^{w_i} f_{iX}(x_{ij})[1-F_{iX}(x_{ij})]^{r_{ij}}\\ = &\left(\frac{\lambda\, \alpha\, c}{1-\theta}\right)^{\sum\limits_{i = 1}^{\hbar}w_i} \\ &\times \prod\limits_{i = 1}^{\hbar}\prod\limits_{j = 1}^{w_i}\frac{\, \Omega_i^{2}(x_{ij})\,\text{sinh}(v_i x_{ij})\,\Psi_{i}^{-\alpha-1}(x_{ij})}{(1-e^{-\lambda})^{r_{ij}+1}e^{\lambda \Omega_i(x_{ij})}(1-\Psi_{i}^{-\alpha}(x_{ij}))^{2}} (e^{-\lambda \Omega_i(x_{ij})}-e^{-\lambda})^{r_{ij}}. \end{split} \end{equation} | (5.1) |
The numbers r_{ij} may occur at random in some practical scenarios as a result of unanticipated dropout of experimental units. Therefore, we assume that R_{ij} \; (i = 1, 2, \dots, \hbar, \; j = 1, 2, \dots, w_i-1) are RVs, with realizations r_{ij} , subject to the following binomial distributions
R_{i1}\sim \text{b}(n_i-w_i,p), |
whereas,
(R_{ij}\mid r_{i1},r_{i2},\dots, r_{i(j-1)})\sim\text{b}(n_i-w_i-\sum\limits_{k = 1}^{j-1}r_{ik},p),\; j = 2,3,\dots,w_i-1. |
Then
\begin{equation} \begin{split} L(\alpha,\lambda,\theta,c;{\bf x}) = &L_1(\alpha,\lambda,\theta,c;{\bf x} \mid{\bf R} = {\bf r})P(R_{i1} = r_{i1})\\ \times& \prod\limits_{k = 2}^{w_i-1}P(R_{ik} = r_{ik}\mid R_{i1} = r_{i1},\dots, R_{i(k-1)} = r_{i(k-1)}), \end{split} \end{equation} | (5.2) |
where, for fixed i and j , r_{ij} is an integer satisfying 0\leq r_{ij} \leq n_i-w_i-(r_{i1}+r_{i2}+\dots+r_{i(j-1)}), \; i = 1, 2, \dots, \hbar, \, j = 2, 3, \dots, w_i-1, and
P(R_{ij} = r_{ij} \mid R_{i(j-1)} = r_{i(j-1)}, \dots, R_{i1} = r_{i1}) = \binom{n_i-w_i-\sum_{k = 1}^{j-1}r_{ik}}{r_{ij}}p^{r_{ij}}(1-p)^{n_i-w_i-\sum_{k = 1}^{j}r_{ik}}.
Furthermore, suppose that R_{ij} is independent of X_{ij} for i = 1, \dots, \hbar , j = 1, \dots, w_i . Then
\begin{equation} \begin{split} P({\bf R} = {\bf r}) = &\prod\limits_{i = 1}^{\hbar}[P(R_{i(w_i-1)} = r_{i(w_i-1)}\mid R_{i(w_i-2)} = r_{i(w_i-2)},\dots, R_{i1} = r_{i1})\dots P(R_{i1} = r_{i1})]\\ = & \prod\limits_{i = 1}^{\hbar}\left[\frac{(n_i-w_i)!}{(n_i-w_i-\sum_{k = 1}^{w_i-1}r_{ik})!\prod\limits_{k = 1}^{w_i-1}r_{ik}!}p^{\sum_{k = 1}^{w_i-1}r_{ik}}(1-p)^{(w_i-1)(n_i-w_i)- \sum_{k = 1}^{w_i-1}r_{ik}(w_i-k)}\right]. \end{split} \end{equation} | (5.3) |
Since P({\bf R} = {\bf r}) does not depend on the parameters (\alpha, \lambda, \theta, c) , their MLEs can be derived by maximizing (5.1). Similarly, since (5.1) does not involve the binomial parameter p , the MLE of p can be found by maximizing (5.3) directly. Thus
\begin{equation*} \frac{\partial(\text{ln} L)}{\partial p} = 0 = \frac{1}{p}\sum\limits_{i = 1}^{\hbar}\sum\limits_{k = 1}^{w_i-1}r_{ik}-\sum\limits_{i = 1}^{\hbar}\left\{\frac{(w_i-1)(n_i-w_i)-\sum_{k = 1}^{w_i-1}(w_i-k)r_{ik}}{1-p}\right\}. \end{equation*} |
Therefore,
\begin{equation} \hat{p} = \frac{\sum_{i = 1}^{\hbar} \sum_{k = 1}^{w_i-1}r_{ik}}{\sum_{i = 1}^{\hbar}(w_i-1)(n_i-w_i)-\sum_{i = 1}^{\hbar}\sum_{k = 1}^{w_i-1}(w_i-k-1)r_{ik}}. \end{equation} | (5.4) |
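Because (5.4) is simply "total removals over total binomial trials", it can be sanity-checked by simulating the removal scheme alone. In the sketch below (group sizes and the true p are illustrative), the closed-form estimator recovers the generating value:

```python
import numpy as np

rng = np.random.default_rng(3)
p_true, n_i, w_i, groups = 0.3, 40, 15, 2000   # illustrative design

num = den = 0
for _ in range(groups):
    left = n_i - w_i                       # survivors available for removal
    for k in range(1, w_i):                # k = 1, ..., w_i - 1
        r = rng.binomial(left, p_true)     # R_ik ~ b(n_i - w_i - sum of prior r, p)
        num += r
        den += (w_i - k - 1) * r
        left -= r

# Closed-form MLE (5.4)
p_hat = num / (groups * (w_i - 1) * (n_i - w_i) - den)
assert abs(p_hat - p_true) < 0.01
```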
The local Fisher information matrix, {\bf{I}} , for (\hat{\alpha}, \hat{\lambda}, \hat{\theta}, \hat{c}) is the 4\times 4 symmetric matrix of negative second partial derivatives of £ = \text{ln} L_1 with respect to \alpha , \lambda , \theta , and c , see [1]. Let (\varphi_1 = \alpha, \, \varphi_2 = \lambda, \, \varphi_3 = \theta, \, \varphi_4 = c) . Therefore, matrix {\bf{I}} is given by
\begin{equation*} {\bf{I}} = -\left(\frac{\partial^2\hat{£}}{\partial \varphi_{i} \partial \varphi_{j}} \right)_{4\times4}, i,j = 1,\dots,4, \end{equation*} |
where the caret \hat{} denotes that the derivative is computed at (\hat{\varphi_1} = \hat{\alpha}, \, \hat{\varphi_2} = \hat{\lambda}, \, \hat{\varphi_3} = \hat{\theta}, \, \hat{\varphi_4} = \hat{c}) . It is easy to get the matrix's elements.
The local estimate {\bf{V}} of the asymptotic variance-covariance matrix of (\hat{\alpha}, \hat{\lambda}, \hat{\theta}, \hat{c}) can be obtained by inverting matrix {\bf{I}} . Therefore,
\begin{equation} {\bf{V}} = {\bf{I^{-1}}} = \left(\text{cov}\left(\varphi_{i},\varphi_{j}\right) \right)_{4\times4}, i,j = 1,\dots,4. \end{equation} | (5.5) |
The sampling distribution of \frac{\hat{\varphi_i}-\varphi_i}{\sqrt{\text{var}(\hat{\varphi_i})}}, \; i = 1, \dots, 4, follows the general asymptotic theory of MLEs and hence it may be approximated by a standard normal distribution which can be used to create confidence intervals (CIs) for unknown parameters.
Therefore, a two-sided (1-\eta^{\star})100\% asymptotic CI for the parameter \varphi_i, \; i = 1, \dots, 4 , can be constructed as follows:
\hat{\varphi_i}\mp z_{\eta^{\star}/2}\sqrt{\text{var}(\hat{\varphi_i})} |
where z_{\eta^{\star}/2} is the value of a standard normal RV that leaves an area \eta^{\star}/2 to the right and \sqrt{\text{var}(\hat{\varphi_i})} can be determined from (5.5).
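As a small worked example of this interval (the point estimate and variance below are made-up illustrative numbers, not estimates from the paper's data):

```python
from statistics import NormalDist

def asymptotic_ci(phi_hat, var_hat, level=0.95):
    """Two-sided (1 - eta*)100% normal-approximation CI for one parameter,
    using the corresponding diagonal entry of V = I^{-1} as var(phi_hat)."""
    z = NormalDist().inv_cdf(0.5 + level / 2.0)   # z_{eta*/2}
    half = z * var_hat ** 0.5
    return phi_hat - half, phi_hat + half

lo, hi = asymptotic_ci(2.4, 0.09)                 # hypothetical MLE and variance
assert abs(lo - (2.4 - 1.959964 * 0.3)) < 1e-5
assert abs(hi - (2.4 + 1.959964 * 0.3)) < 1e-5
```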
In practical applications it is often beneficial to have an idea of how long a test will last, because the time needed to complete the test is proportional to its cost. For i = 1, \dots, \hbar , the expected termination time under the progressive type-II censoring scheme with binomial removals is given by, see [22],
\begin{equation} \begin{split} E[X_{iw_i} \mid {{\bf R}_i} = {{\bf r}_i}] = &C({{\bf r}_i})\sum\limits_{\ell_1 = 0}^{r_{i1}}\dots \sum\limits_{\ell_{w_i} = 0}^{r_{iw_i}}(-1)^{\mathcal{A}_i} \frac{ \left( \begin{array}{c} {r_{i1}}\\ {\ell_1} \\ \end{array} \right) \dots \left( \begin{array}{c} {r_{iw_i}}\\ {\ell_{w_i}} \\ \end{array} \right) }{\prod_{k = 1}^{w_i-1}h(\ell_k)} \int_{0}^{\infty}x f_X(x) F_X^{h(\ell_{w_i})-1}(x)\text{d}x, \end{split} \end{equation} | (5.6) |
where {{\bf R}_i} = (R_{i1} = r_{i1}, \dots, R_{i(w_i-1)} = r_{i(w_i-1)}) , ( F_X(x), f_X(x) ) is given by ((4.5), (4.6)), \mathcal{A}_i = \sum_{k = 1}^{w_i}\ell_k , h(\ell_k) = \ell_1+\dots+ \ell_k+k , and C({{\bf r}_i}) = n_i(n_i-r_{i1}-1)(n_i-r_{i1}-r_{i2}-2)\dots(n_i-\sum_{k = 1}^{w_i-1}(r_{ik}+1)) . Therefore,
\begin{equation*} \begin{split} E[X_{iw_i}] = &E_{{\bf R}}[E(X_{iw_i} \mid {{\bf R}_i})]\\ = & \sum\limits_{r_{i1} = 0}^{q(r_{i1})} \dots \sum\limits_{r_{i(w_i-1)} = 0}^{q(r_{i(w_i-1)})}P({\bf R} = {\bf r})E(X_{iw_i}\mid {\bf R}), \end{split} \end{equation*} |
where q(r_{i1}) = n_i-w_i, \, q(r_{ij}) = n_i-w_i-r_{i1}-r_{i2}-\dots-r_{i(j-1)}, \, i = 1, 2, \dots, \hbar, \, j = 2, 3, \dots, w_i-1, and P({\bf R} = {\bf r}) is given by Eq (5.3). Then compute the ratio of expected experiment time (REET) as follows:
REET = \frac{E(X_{iw_i})}{E(X_{in_i})}. |
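Equation (5.6) can be evaluated numerically once F_X and f_X are specified. The sketch below implements the alternating sum for a single stress level, using a unit exponential lifetime as a stand-in for (F_X, f_X); the function name and interface are illustrative only.

```python
import itertools
import math
import numpy as np
from scipy.integrate import quad

def expected_termination_time(n, r, F, f):
    """E[X_w | R = r] for one stress level via the alternating-sum
    identity (5.6); r = (r_1, ..., r_w) with sum(r) = n - w."""
    w = len(r)
    # C(r) = n (n - r_1 - 1)(n - r_1 - r_2 - 2)..., w factors in total
    C = n
    for k in range(1, w):
        C *= n - sum(rj + 1 for rj in r[:k])
    total = 0.0
    for ells in itertools.product(*(range(rk + 1) for rk in r)):
        h = np.cumsum(ells) + np.arange(1, w + 1)  # h(l_k) = l_1+...+l_k+k
        coef = (-1) ** sum(ells)                   # (-1)^{A}
        for rk, lk in zip(r, ells):
            coef *= math.comb(rk, lk)              # binomial coefficients
        integral, _ = quad(lambda x: x * f(x) * F(x) ** (h[-1] - 1),
                           0, np.inf)
        total += coef * integral / np.prod(h[:-1])
    return C * total
```

With n = 2, w = 1 and r = (1,) (one unit removed at the first failure), this returns 0.5 for the unit exponential, i.e. E[min(X_1, X_2)]; dividing by the complete-sample value E[X_{2:2}] = 1.5 gives REET = 1/3.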
Based on two asymmetric loss functions (general entropy (GE) and linear exponential (LINEX)), we discuss the Bayesian estimation of the parameters of CDF (4.5). Symmetric loss functions may be unsuitable in many real-life situations because they penalize overestimation and underestimation of the parameters equally, whereas overestimating a parameter can have more (or less) severe repercussions than underestimating it. As a result, asymmetric loss functions have been the subject of research, see [23,24].
The LINEX loss function was suggested by Varian [25]. It is given by
\mathcal{L}(\xi)\propto e^{\nu \xi}- \nu \xi-1, \nu\neq 0, |
where \xi = \tilde{\Theta}-\Theta and \tilde{\Theta} is the LINEX estimator of \Theta .
The Bayes estimate under LINEX (BEL) loss function of \Theta is given by
\begin{equation} \begin{split} \tilde{\Theta} = \frac{-1}{\nu}\ln[E({e^{-\nu \Theta}}|{{\bf{x}}})]. \end{split} \end{equation} | (5.7) |
The GE loss function was introduced by Calabria and Pulcini [26]. It is given by
\mathcal{L}(\ddot{\Theta},\Theta)\propto \left(\frac{\ddot{\Theta}}{\Theta}\right)^{\nu}-\nu\ln\left(\frac{\ddot{\Theta}}{\Theta}\right)-1, \nu\neq 0. |
The Bayes estimate under GE (BEG) loss function of \Theta is given by
\begin{equation} \begin{split} \ddot{\Theta} = \left[E(\Theta^{-\nu}|{{\bf{x}}})\right]^{\frac{-1}{\nu}}. \end{split} \end{equation} | (5.8) |
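In practice the posterior expectations in (5.7) and (5.8) are approximated by averages over post-burn-in MCMC draws. A minimal sketch, where `draws` stands for such samples of a single parameter and the function names are illustrative:

```python
import numpy as np

def bayes_linex(draws, nu):
    """BEL (Eq 5.7): -(1/nu) * ln E[exp(-nu * Theta) | x], with the
    posterior expectation replaced by an average over MCMC draws."""
    return -np.log(np.mean(np.exp(-nu * draws))) / nu

def bayes_ge(draws, nu):
    """BEG (Eq 5.8): (E[Theta^{-nu} | x])^{-1/nu} over MCMC draws."""
    return np.mean(draws ** (-nu)) ** (-1.0 / nu)
```

If every draw equals the same value, both estimators return that value; for a dispersed posterior and \nu > 0 , the LINEX estimate falls below the posterior mean, reflecting the heavier penalty on overestimation.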
Assume that the experimenter's prior belief is measured by a function \vartheta(\alpha, \lambda, \theta, c) , with all parameters being independent and having log-normal distributions except \theta , which has a beta distribution. Therefore, if (\varphi_1 = \alpha, \, \varphi_2 = \lambda, \varphi_3 = c, \varphi_4 = \theta) , then the prior function of \varphi_i , i = 1, 2, 3 , is given by
\begin{equation} \vartheta_i(\varphi_i) = \frac{1}{\sigma_i\varphi_i \sqrt{2 \pi}}\text{exp}\left\{-\frac{1}{2}\left(\frac{\text{ln } \varphi_i-\mu_i}{\sigma_i}\right)^{2}\right\},\; \varphi_i > 0,\; (-\infty < \mu_i < \infty,\, \sigma_i > 0), \end{equation} | (5.9) |
and the prior function of \varphi_4 is given by
\begin{equation} \vartheta_4(\varphi_4) = \frac{1}{B(a_1,a_2)}\varphi_4^{a_1-1}(1-\varphi_4)^{a_2-1},\; 0 < \varphi_4 < 1, \, (a_1,a_2 > 0). \end{equation} | (5.10) |
The joint prior density function is then calculated as follows:
\begin{equation} \vartheta(\alpha,\lambda,\theta,c) = \vartheta_{1}(\alpha)\, \vartheta_{2}(\lambda)\, \vartheta_{3}(c)\, \vartheta_{4}(\theta)\propto \frac{1}{\alpha\lambda c}e^{-\Delta}\theta^{a_1-1}(1-\theta)^{a_2-1}, \end{equation} | (5.11) |
where \Delta = \frac{1}{2}\left\{\left(\frac{\text{ln } \alpha-\mu_1}{\sigma_1}\right)^{2}+\left(\frac{\text{ln } \lambda-\mu_2}{\sigma_2}\right)^{2}+\left(\frac{\text{ln } c-\mu_3}{\sigma_3}\right)^{2}\right\}.
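Drawing parameter values from these priors is straightforward. The sketch below uses the hyperparameter values quoted later in the simulation section ( \mu_1 = 0.154 , \mu_2 = 0.8 , \mu_3 = -1.65 , \sigma_1 = 0.8 , \sigma_2 = 0.07 , \sigma_3 = 0.3 , a_1 = 2.3 , a_2 = 0.25 ); the function name is illustrative.

```python
import numpy as np

def draw_from_priors(rng, size=1):
    """Draw alpha, lambda, c from the log-normal priors (5.9) and
    theta from the beta prior (5.10)."""
    mu = (0.154, 0.8, -1.65)      # means of ln(alpha), ln(lambda), ln(c)
    sigma = (0.8, 0.07, 0.3)
    alpha, lam, c = (rng.lognormal(m, s, size) for m, s in zip(mu, sigma))
    theta = rng.beta(2.3, 0.25, size)   # a_1 = 2.3, a_2 = 0.25
    return alpha, lam, theta, c
```

Note that the log-normal mean is exp(\mu_i + \sigma_i^2/2) , so these hyperparameters concentrate \lambda around 2.3, consistent with the population values used in the simulation.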
From (5.1) and (5.11), the joint posterior density function is then calculated as follows:
\begin{equation} \begin{split} \vartheta^{*}(\alpha,\lambda,\theta,c|{{\bf{x}}}, {{\bf{r}}})& = K^{-1}\frac{\alpha^{\sum\limits_{i = 1}^\hbar w_i-1} \lambda^{\sum\limits_{i = 1}^\hbar w_i-1} c^{\sum\limits_{i = 1}^\hbar w_i-1}}{(1-e^{-\lambda})^{\sum\limits_{i = 1}^{\hbar}(w_i+\sum\limits_{j = 1}^{w_i}r_{ij})}} \frac{\theta^{a_1-1}e^ {-\Delta}}{(1-\theta)^{\sum\limits_{i = 1}^{\hbar}w_i-a_2+1}}\\\\& \times \prod\limits_{i = 1}^{\hbar}\prod\limits_{j = 1}^{w_i} \frac{\Omega_{i}^{2}(x_{ij})\text{ sinh}(v_i\,x_{ij})(e^{-\lambda \Omega_i(x_{ij})}-e^{-\lambda})^{r_{ij}}}{[1-(\Psi_{i}(x_{ij}))^{-\alpha}]^2[\Psi_{i}(x_{ij})]^{\alpha+1}e^{\lambda \Omega_i(x_{ij})}}, \end{split} \end{equation} | (5.12) |
where
K = \int_{0}^{\infty}\int_{0}^{1}\int_{0}^{\infty}\int_{0}^{\infty}\vartheta(\alpha,\lambda,\theta,c) L(\alpha,\lambda,\theta,c)\text{ d}\alpha\text{ d}\lambda\text{ d}\theta\text{ d}c. |
It can be noted that K involves a quadruple integral that is not reducible to a closed form, so generating samples directly from the joint posterior density is not possible. The MCMC algorithm, presented in the following subsection, can be implemented in this case; it requires the following conditional posterior distributions of the parameters \alpha , \lambda , \theta and c ,
\begin{equation} \left. \begin{split} \vartheta^{*}(\alpha\mid \lambda,\theta,c)& \propto \alpha^{\sum\limits_{i = 1}^{\hbar}w_i-1}\text{exp}\left[-\frac{1}{2}\left(\frac{\text{ln }\alpha-\mu_1}{\sigma_1}\right)^{2}\right] \\& \times\prod\limits_{i = 1}^{\hbar}\prod\limits_{j = 1}^{w_i}\frac{\Omega_{i}^{2}(x_{ij})(e^{-\lambda \Omega_i(x_{ij})}-e^{-\lambda})^{r_{ij}}}{[1-(\Psi_{i}(x_{ij}))^{-\alpha}]^2[\Psi_{i}(x_{ij})]^{\alpha+1}e^{\lambda \Omega_i(x_{ij})}}, \\ \vartheta^{*}(\lambda\mid \alpha,\theta,c)& \propto \frac{\lambda^{\sum\limits_{i = 1}^\hbar w_i-1} }{(1-e^{-\lambda})^{\sum\limits_{i = 1}^{\hbar}(w_i+\sum\limits_{j = 1}^{w_i}r_{ij})}} \text{exp}\left[-\frac{1}{2}\left(\frac{\text{ln }\lambda-\mu_2}{\sigma_2}\right)^{2}\right] \\ & \times \prod\limits_{i = 1}^{\hbar}\prod\limits_{j = 1}^{w_i} \frac{(e^{-\lambda \Omega_i(x_{ij})}-e^{-\lambda})^{r_{ij}}}{e^{\lambda \Omega_i(x_{ij})}}, \\ \vartheta^{*}(\theta\mid \alpha,\lambda,c)& \propto \frac{\theta^{a_1-1}}{(1-\theta)^{\sum\limits_{i = 1}^{\hbar}w_i-a_2+1}}\prod\limits_{i = 1}^{\hbar}\prod\limits_{j = 1}^{w_i} \frac{\Omega_{i}^{2}(x_{ij})(e^{-\lambda \Omega_i(x_{ij})}-e^{-\lambda})^{r_{ij}}}{e^{\lambda \Omega_i(x_{ij})}}, \\ \vartheta^{*}(c\mid \alpha,\lambda,\theta)&\propto c^{\sum\limits_{i = 1}^{\hbar}w_i-1}\text{exp}\left[-\frac{1}{2}\left(\frac{\text{ln }c-\mu_3}{\sigma_3}\right)^{2}\right] \\ & \times\prod\limits_{i = 1}^{\hbar}\prod\limits_{j = 1}^{w_i}\frac{\Omega_{i}^{2}(x_{ij})(e^{-\lambda \Omega_i(x_{ij})}-e^{-\lambda})^{r_{ij}}}{[1-(\Psi_{i}(x_{ij}))^{-\alpha}]^2[\Psi_{i}(x_{ij})]^{\alpha+1}e^{\lambda \Omega_i(x_{ij})}}. \end{split}\right\} \end{equation} | (5.13) |
The Metropolis-Hastings technique can be used when a conditional posterior distribution is not one of the known parametric distributions; it generates samples from the conditional posterior distributions within the MCMC algorithm, see [27].
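A generic random-walk Metropolis-Hastings sampler, applicable to any of the unnormalized conditional posteriors in (5.13) once they are supplied as a log-density; the interface and step size below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def metropolis_hastings(log_target, x0, n_iter, step=0.1, rng=None):
    """Random-walk Metropolis-Hastings: propose x' ~ N(x, step^2) and
    accept with probability min(1, target(x') / target(x)), working on
    the log scale for numerical stability."""
    rng = rng or np.random.default_rng(0)
    x, lp = x0, log_target(x0)
    chain = np.empty(n_iter)
    for i in range(n_iter):
        prop = x + step * rng.standard_normal()
        lp_prop = log_target(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject
            x, lp = prop, lp_prop
        chain[i] = x
    return chain
```

For the positive parameters \alpha , \lambda , c (and \theta \in (0, 1) ), one would in practice propose on a transformed scale or reject out-of-range proposals; this sketch omits such details.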
The following procedure can be used to compute BELs and BEGs of \alpha , \lambda , \theta , and c ,
(1) Assign some initial values of \alpha , \lambda , \theta and c say \alpha_{0} , \lambda_{0} , \theta_{0} and c_{0} .
(2) For i = 1 , using Metropolis-Hastings technique, generate \alpha_{i} , \lambda_{i} , \theta_{i} and c_{i} from the conditional posterior distributions presented in (5.13).
(3) Repeat Step 2, \mathbb{M} times.
(4) Calculate the BELs of \alpha , \lambda , \theta and c , using Eq (5.7), as
\begin{equation} \left.\begin{split} \tilde{\alpha}& = \frac{-1}{\nu}\ln\left[ \frac{1}{\mathbb{M}-\mathbb{W}}\sum\limits_{i = \mathbb{W}+1}^{\mathbb{M}}e^{-\nu \alpha_{i}}\right], \tilde{\lambda} = \frac{-1}{\nu}\ln\left[ \frac{1}{\mathbb{M}-\mathbb{W}}\sum\limits_{i = \mathbb{W}+1}^{\mathbb{M}}e^{-\nu \lambda_{i}}\right],\\ \tilde{\theta}& = \frac{-1}{\nu}\ln\left[ \frac{1}{\mathbb{M}-\mathbb{W}}\sum\limits_{i = \mathbb{W}+1}^{\mathbb{M}}e^{-\nu \theta_{i}}\right], \tilde{c} = \frac{-1}{\nu}\ln\left[ \frac{1}{\mathbb{M}-\mathbb{W}}\sum\limits_{i = \mathbb{W}+1}^{\mathbb{M}}e^{-\nu c_{i}}\right], \end{split}\right\} \end{equation} | (5.14) |
where \mathbb{W} is the burn-in period.
(5) Calculate the BEGs of \alpha , \lambda , \theta and c , using Eq (5.8), as follows:
\begin{equation} \left.\begin{split} \ddot{\alpha}& = \left( \frac{1}{\mathbb{M}-\mathbb{W}}\sum\limits_{i = \mathbb{W}+1}^{\mathbb{M}}\alpha_{i}^{-\nu}\right)^{\frac{-1}{\nu}}, \ddot{\lambda} = \left( \frac{1}{\mathbb{M}-\mathbb{W}}\sum\limits_{i = \mathbb{W}+1}^{\mathbb{M}}\lambda_{i}^{-\nu}\right)^{\frac{-1}{\nu}},\\ \ddot{\theta}& = \left( \frac{1}{\mathbb{M}-\mathbb{W}}\sum\limits_{i = \mathbb{W}+1}^{\mathbb{M}}\theta_{i}^{-\nu}\right)^{\frac{-1}{\nu}}, \ddot{c} = \left( \frac{1}{\mathbb{M}-\mathbb{W}}\sum\limits_{i = \mathbb{W}+1}^{\mathbb{M}}c_{i}^{-\nu}\right)^{\frac{-1}{\nu}}. \end{split}\right\} \end{equation} | (5.15) |
(6) The highest posterior density (HPD) credible interval of parameter \alpha can be computed by arranging in an ascending order \alpha_{\mathbb{W}+1} , \dots, \alpha_{\mathbb{M}} as \alpha_{[1]} < \dots < \alpha_{[\mathbb{M}-\mathbb{W}]} , then compute the lower and upper bounds of all (1-\eta^{\star})100\% credible intervals of \alpha as follows:
\left(\alpha_{[1]},\alpha_{[(\mathbb{M}-\mathbb{W})(1-\eta^{\star})+1]}\right),\dots,\left(\alpha_{[(\mathbb{M}-\mathbb{W})\eta^{\star}]}, \alpha_{[\mathbb{M}-\mathbb{W}]}\right), |
where [x] denotes the largest integer number less than or equal to x . Then the HPD credible interval of \alpha is the one with the shortest length. The HPD credible interval of \lambda , \theta and c can also be computed in the same way.
(7) The symmetric credible intervals of \alpha can be computed as follows:
\left(\alpha_{[(\mathbb{M}-\mathbb{W})\frac{\eta^{\star}}{2}]}, \alpha_{[(\mathbb{M}-\mathbb{W})(1-\frac{\eta^{\star}}{2})]}\right). |
The symmetric credible intervals of \lambda , \theta and c can also be computed in the same way.
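Steps (6) and (7) can be sketched as follows; the order-statistic indexing is one reasonable reading of the formulas above, with the floor function [x] mapped to integer truncation.

```python
import numpy as np

def hpd_interval(draws, eta_star=0.05):
    """HPD credible interval (Step 6): among all candidate intervals
    spanning (1 - eta*)100% of the sorted draws, return the shortest."""
    s = np.sort(draws)
    n = len(s)
    k = int((1 - eta_star) * n)      # window width in order statistics
    widths = s[k:] - s[:n - k]
    j = np.argmin(widths)
    return s[j], s[j + k]

def symmetric_interval(draws, eta_star=0.05):
    """Equal-tail (symmetric) credible interval (Step 7)."""
    s = np.sort(draws)
    n = len(s)
    return s[int(n * eta_star / 2)], s[int(n * (1 - eta_star / 2)) - 1]
```

For a symmetric, unimodal posterior the two intervals nearly coincide; for a skewed posterior the HPD interval is shorter.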
In this section, two real data sets are used to fit and compare the PGLD, the Poisson-Lomax distribution (PLD) [28], the geometric-Lomax distribution (GLD), the exponentiated LD (ELD) [29] and the LD with CDF (2.7).
The CDFs of PLD, GLD, and ELD are given, respectively, by
\begin{equation} \left. \begin{split} F_{PLD}& = \frac{e^{-\lambda \left(1+\frac{x}{\gamma}\right)^{-\alpha} } -e^{-\lambda}} {1-e^{-\lambda}}, \\ F_{GLD}& = \frac{(1-\theta)\left[1-\left(1+\frac{x}{\gamma}\right)^{-\alpha}\right]}{1-\theta\left[1-\left(1+\frac{x}{\gamma}\right)^{-\alpha}\right]},\\ F_{ELD}& = \left[1-\left(1+\frac{x}{\gamma}\right)^{-\alpha}\right]^{\lambda}.\\ \end{split}\right\} \end{equation} | (6.1) |
It can be noticed that the GLD arises as a limiting case of the PGLD as \lambda\rightarrow 0^+ .
● The first data set is taken from [30]. It consists of 72 exceedances for the years from 1958 to 1984, rounded to one decimal place. The data are given as follows:
1.7, 2.2, 14.4, 1.1, 0.4, 20.6, 5.3, 0.7, 1.9, 13.0, 12.0, 9.3, 1.4, 18.7, 8.5, 25.5, 11.6, 14.1, 22.1, 1.1, 2.5, 14.4, 1.7, 37.6, 0.6, 2.2, 39.0, 0.3, 15.0, 11.0, 7.3, 22.9, 1.7, 0.1, 1.1, 0.6, 9.0, 1.7, 7.0, 20.1, 0.4, 2.8, 14.1, 9.9, 10.4, 10.7, 30.0, 3.6, 5.6, 30.8, 13.3, 4.2, 25.5, 3.4, 11.9, 21.5, 27.6, 36.4, 2.7, 64.0, 1.5, 2.5, 27.4, 1.0, 27.1, 20.2, 16.8, 5.3, 9.7, 27.5, 2.5, 27.0.
● The second data set represents the marks in Mathematics of 48 slow-pace students in the final examination of the Indian Institute of Technology, Kanpur, in the year 2003, [31]. The data are given as follows:
29, 25, 50, 15, 13, 27, 15, 18, 7, 7, 8, 19, 12, 18, 5, 21, 15, 86, 21, 15, 14, 39, 15, 14, 70, 44, 6, 23, 58, 19, 50, 23, 11, 6, 34, 18, 28, 34, 12, 37, 4, 60, 20, 23, 40, 65, 19, 31.
The Kolmogorov-Smirnov (K-S) statistic and its corresponding p-value are used to check how well the PGLD, PLD, GLD, ELD, and LD fit the above two data sets. The results are shown in Table 1, together with a comparison among these five distributions based on the Akaike information criterion (AIC), the consistent AIC (CAIC), and the Bayesian information criterion (BIC), where
\text{AIC} = 2\,r-2£(\hat{\beta}), \text{CAIC} = \frac{2\,r\, w}{w-r-1}-2£(\hat{\beta}), \text{BIC} = r\,\ln[w]-2£(\hat{\beta}), |
The first data set | |||||||||
Model | \hat{\alpha} | \hat{\gamma} | \hat{\theta} | \hat{\lambda} | K-S | p-value | AIC | CAIC | BIC |
PGLD | 1.40525 | 1.34963 | 0.94075 | 0.76069 | 0.109102 | 0.35816 | 522.134 | 522.731 | 531.241 |
PLD | 1.12565 | 5.44626 | --- | 0.07480 | 0.178738 | 0.02010 | 525.967 | 526.320 | 532.797 |
GLD | 1.00741 | 2.49201 | 0.60639 | --- | 0.111083 | 0.33669 | 525.289 | 525.642 | 532.119 |
ELD | 0.66113 | 1.19104 | --- | 1.59821 | 0.165565 | 0.03861 | 536.672 | 537.025 | 543.502 |
LD | 1.10933 | 4.34354 | --- | --- | 0.233859 | 0.00076 | 529.953 | 530.127 | 534.506 |
The second data set | |||||||||
Model | \hat{\alpha} | \hat{\gamma} | \hat{\theta} | \hat{\lambda} | K-S | p-value | AIC | CAIC | BIC |
PGLD | 2.49409 | 0.63477 | 0.99986 | 0.20566 | 0.05039 | 0.99971 | 402.596 | 403.526 | 410.08 |
PLD | 149.855 | 3788.06 | --- | 0.72799 | 0.11165 | 0.58773 | 412.330 | 412.875 | 417.943 |
GLD | 1.42011 | 5.41074 | 0.84701 | --- | 0.21077 | 0.02811 | 425.597 | 426.143 | 431.211 |
ELD | 0.27741 | 2.35554 | --- | 0.79642 | 0.31822 | 0.00012 | 498.599 | 499.145 | 504.213 |
LD | 22.3092 | 552.211 | --- | --- | 0.19730 | 0.04765 | 413.502 | 413.769 | 417.244 |
where £(\hat{\beta}) stands for the log-likelihood function calculated at the MLE \hat{\beta} of \beta , r and w denote the number of parameters and the sample size, respectively.
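The three criteria are simple functions of the maximized log-likelihood. As a check, the log-likelihood value implied by the PGLD row of Table 1 for the first data set ( £(\hat{\beta}) \approx -257.067 , with r = 4 and w = 72 ) reproduces the tabulated values:

```python
import math

def information_criteria(loglik, r, w):
    """AIC, CAIC and BIC as defined above, from the maximized
    log-likelihood, the number of parameters r and the sample size w."""
    aic = 2 * r - 2 * loglik
    caic = 2 * r * w / (w - r - 1) - 2 * loglik
    bic = r * math.log(w) - 2 * loglik
    return aic, caic, bic
```

Smaller values of all three criteria indicate a better trade-off between fit and model complexity.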
According to the values of the K-S statistic and its corresponding p-value presented in Table 1, the PGLD has a smaller K-S value (and a larger p-value) than the PLD, GLD, ELD, and LD. Therefore, the PGLD fits the data better than the other four distributions. The PGLD also has the smallest values of AIC, CAIC, and BIC, which is another indicator of its superiority. Figure 4 shows the comparison graphically by plotting the empirical CDF against the CDFs of the PGLD, PLD, GLD, ELD, and LD. In Table 1, we note that the PLD, ELD, and LD (GLD, ELD, and LD) do not fit the first (second) data set well (based on the p-value ( < 0.05 )), but we use them for comparison purposes.
In this section, a simulation study is performed to evaluate the performance of the estimation methods presented in Section 5. The MLEs, BELs and BEGs of the parameters \alpha , \lambda , \theta and c are computed and compared, through their mean squared errors (MSEs) and relative absolute biases (RABs), via a Monte Carlo simulation. The 95\% CIs, symmetric and HPD credible intervals are also computed and compared through their average interval lengths (AILs). The following algorithm is used to perform a simulation study:
(1) For i = 1, \dots, \hbar , generate the values of progressive censoring with binomial removals, r_{ij} , such that R_{ij}\sim Binomial (n_{i}-w_{i}-\sum_{k = 1}^{j-1}R_{ik}, p) distribution, where p is the removal probability, j = 1, \dots, w_{i}-1 and R_{i w_{i}} \; = \; n_{i} \; - \; w_{i} \; - \; \sum_{j = 1}^{w_{i}-1} \; R_{ij} .
(2) For given values of the prior parameters ( \mu_{1} , \mu_{2} , \mu_{3} , \sigma_{1} , \sigma_{2} , \sigma_{3} , a_{1} , a_{2} ), generate values for the parameters ( \alpha , \lambda , \theta, c ).
(3) For i = 1, \dots, \hbar , generate a progressively type-II censored sample of size w_{i} from PGLD with CDF (2.9), according to the algorithm given in [32].
(4) The MLEs, BELs and BEGs of the parameters \alpha , \lambda , \theta and c are computed as shown in Section 5. The BELs and BEGs are computed based on \mathbb{M}( = 5500) MCMC samples, discarding the first 500 values as a burn-in period.
(5) Repeat the above steps \mathbb{N}( = 1,000) times.
(6) If \hat{\Theta} is an estimate of \Theta , then the average of estimates, MSE and RAB of \hat{\Theta} over \mathbb{N} samples are given, respectively, by
\begin{split} \overline{\widehat{\Theta}} = \frac{1}{\mathbb{N}}\sum\limits_{i = 1}^{\mathbb{N}}\hat{\Theta}_{i}, \text{MSE}(\hat{\Theta}) = \frac{1}{\mathbb{N}}\sum\limits_{i = 1}^{\mathbb{N}}(\hat{\Theta}_{i}-\Theta)^{2}, \text{RAB}(\hat{\Theta}) = \frac{1}{\mathbb{N}}\sum\limits_{i = 1}^{\mathbb{N}}\frac{|\hat{\Theta}_{i}-\Theta|}{\Theta}. \end{split} |
(7) Calculate the average of estimates of the parameters \alpha , \lambda , \theta and c and their MSEs and RABs as shown in Step 6. Calculate also the average of the MSEs (MMSE) and the average of the RABs (MRAB) over the four parameter estimates.
(8) Calculate the 95\% CIs (as shown in Subsection 5.1), symmetric and HPD credible intervals (as shown in Subsection 5.3) of the parameters and then calculate the average interval lengths (AILs) of them. Calculate also the average of the AILs (MAIL) over the four parameter estimates.
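Step (1) above can be sketched as follows for a single group, with illustrative arguments matching one of the simulation settings ( n = 40 , w = 20 ); the function name is hypothetical.

```python
import numpy as np

def binomial_removals(n, w, p, rng=None):
    """Generate removals R_1, ..., R_{w-1} ~ Binomial(remaining, p) and
    set R_w so that the w observed failures plus all removals exhaust
    the n units on test (Step 1 of the simulation algorithm)."""
    rng = rng or np.random.default_rng(0)
    r = []
    for _ in range(w - 1):
        r.append(rng.binomial(n - w - sum(r), p))  # remaining removable units
    r.append(n - w - sum(r))                       # forced final removal
    return r
```

By construction the removals are nonnegative and always sum to n - w , whatever the removal probability p .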
Tables 2–4 provide the computational results, based on the prior parameter values \mu_1 = 0.154 , \mu_2 = 0.8 , \mu_3 = -1.65 , \sigma_1 = 0.8 , \sigma_2 = 0.07 , \sigma_3 = 0.3 , a_{1} = 2.3 and a_{2} = 0.25 , which are used to generate the population parameter values \alpha = 1.6 , \lambda = 2.3 , \theta = 0.9 and c = 0.2 . Three distinct removal probabilities, p = 0.15, 0.55, and 0.95, are considered.
n_{1} | w_{1} | p=0.15 | p=0.55 | p=0.95 | |||||||||||
. | . | \overline{\hat{\alpha}} | MSE(\hat{\alpha}) | RAB(\hat{\alpha}) | MMSE | \overline{\hat{\alpha}} | MSE(\hat{\alpha}) | RAB(\hat{\alpha}) | MMSE | \overline{\hat{\alpha}} | MSE(\hat{\alpha}) | RAB(\hat{\alpha}) | MMSE | ||
. | . | \overline{\hat{\lambda}} | MSE(\hat{\lambda}) | RAB(\hat{\lambda}) | MRAB | \overline{\hat{\lambda}} | MSE(\hat{\lambda}) | RAB(\hat{\lambda}) | MRAB | \overline{\hat{\lambda}} | MSE(\hat{\lambda}) | RAB(\hat{\lambda}) | MRAB | ||
. | . | \overline{\hat{\theta}} | MSE(\hat{\theta}) | RAB(\hat{\theta}) | \overline{\hat{\theta}} | MSE(\hat{\theta}) | RAB(\hat{\theta}) | \overline{\hat{\theta}} | MSE(\hat{\theta}) | RAB(\hat{\theta}) | |||||
\hbar | N | n_{\hbar} | w_{\hbar} | \overline{\hat{c}} | MSE(\hat{c}) | RAB(\hat{c}) | \overline{\hat{c}} | MSE(\hat{c}) | RAB(\hat{c}) | \overline{\hat{c}} | MSE(\hat{c}) | RAB(\hat{c}) | |||
2 | 80 | 40 | 20 | 1.7692 | 0.4273 | 0.2736 | 0.3766 | 1.7793 | 0.4334 | 0.2855 | 0.4536 | 1.7900 | 0.4439 | 0.2821 | 0.4166 |
40 | 20 | 2.2500 | 1.3743 | 0.3825 | 0.3156 | 2.5275 | 1.7338 | 0.4473 | 0.3599 | 2.4325 | 1.5187 | 0.4291 | 0.3729 | ||
0.8399 | 0.0247 | 0.1060 | 0.8214 | 0.0323 | 0.1297 | 0.8232 | 0.0338 | 0.1322 | |||||||
0.2503 | 0.0567 | 0.8159 | 0.2491 | 0.0684 | 0.9367 | 0.2703 | 0.0868 | 1.0212 | |||||||
32 | 1.7112 | 0.2631 | 0.2218 | 0.4094 | 1.7118 | 0.2594 | 0.2231 | 0.4408 | 1.7260 | 0.2935 | 0.2333 | 0.4465 | |||
32 | 2.4709 | 1.6977 | 0.4517 | 0.3352 | 2.4682 | 1.8436 | 0.4612 | 0.3533 | 2.4738 | 1.8411 | 0.4750 | 0.3548 | |||
0.8411 | 0.0218 | 0.1107 | 0.8308 | 0.0274 | 0.1234 | 0.8205 | 0.0298 | 0.1328 | |||||||
0.2607 | 0.0643 | 0.8917 | 0.2649 | 0.0735 | 0.9590 | 0.2546 | 0.0682 | 0.9331 | |||||||
140 | 70 | 35 | 1.7235 | 0.2732 | 0.2152 | 0.3939 | 1.7104 | 0.2902 | 0.2283 | 0.4677 | 1.6972 | 0.2575 | 0.2121 | 0.4528 | |
70 | 35 | 2.5368 | 1.6200 | 0.4369 | 0.3218 | 2.5238 | 1.9496 | 0.4756 | 0.3508 | 2.5618 | 1.9192 | 0.4820 | 0.3414 | ||
0.8555 | 0.0153 | 0.0934 | 0.8402 | 0.0231 | 0.1128 | 0.8418 | 0.0224 | 0.1137 | |||||||
0.2588 | 0.0612 | 0.8638 | 0.2685 | 0.0756 | 0.9375 | 0.2636 | 0.0650 | 0.8993 | |||||||
56 | 1.7144 | 0.2323 | 0.2087 | 0.4615 | 1.6803 | 0.2312 | 0.2047 | 0.4594 | 1.6890 | 0.2324 | 0.2054 | 0.5033 | |||
56 | 2.5501 | 1.9964 | 0.4725 | 0.3284 | 2.4981 | 1.9889 | 0.4798 | 0.3312 | 2.5337 | 2.1999 | 0.4905 | 0.3413 | |||
0.8505 | 0.0184 | 0.1038 | 0.8517 | 0.0177 | 0.1013 | 0.8474 | 0.0201 | 0.1077 | |||||||
0.2606 | 0.0606 | 0.8572 | 0.2670 | 0.0593 | 0.8702 | 0.2702 | 0.0641 | 0.9027 | |||||||
4 | 80 | 20 | 10 | 1.7224 | 0.3564 | 0.2653 | 0.3963 | 1.7933 | 0.4206 | 0.2728 | 0.4094 | 1.7689 | 0.4255 | 0.2812 | 0.4622 |
20 | 10 | 2.1620 | 1.5467 | 0.4146 | 0.3086 | 2.4281 | 1.5124 | 0.4312 | 0.3627 | 2.5324 | 1.7754 | 0.4676 | 0.3609 | ||
20 | 10 | 0.8276 | 0.0223 | 0.1074 | 0.8293 | 0.0285 | 0.1225 | 0.8083 | 0.0379 | 0.1416 | |||||
20 | 10 | 0.2309 | 0.0559 | 0.7555 | 0.2638 | 0.0856 | 0.9872 | 0.2405 | 0.0723 | 0.9142 | |||||
16 | 1.7504 | 0.3104 | 0.2411 | 0.3767 | 1.7214 | 0.2853 | 0.2379 | 0.4418 | 1.6927 | 0.2635 | 0.2251 | 0.4439 | |||
16 | 2.4045 | 1.4904 | 0.4204 | 0.3243 | 2.6169 | 1.8321 | 0.4668 | 0.3464 | 2.4949 | 1.8477 | 0.4627 | 0.3608 | |||
16 | 0.8357 | 0.0229 | 0.1121 | 0.8406 | 0.0222 | 0.1121 | 0.8407 | 0.0263 | 0.1165 | ||||||
16 | 0.2460 | 0.0595 | 0.8481 | 0.2565 | 0.0694 | 0.9153 | 0.2862 | 0.0817 | 0.9996 | ||||||
140 | 35 | 18 | 1.6976 | 0.2483 | 0.2171 | 0.3170 | 1.7063 | 0.2604 | 0.2181 | 0.4347 | 1.7152 | 0.2896 | 0.2212 | 0.4394 | |
35 | 18 | 2.3230 | 1.2643 | 0.3762 | 0.2965 | 2.4760 | 1.8169 | 0.4667 | 0.3469 | 2.5477 | 1.8211 | 0.4638 | 0.3350 | ||
35 | 18 | 0.8554 | 0.0137 | 0.0869 | 0.8357 | 0.0240 | 0.1178 | 0.8412 | 0.0219 | 0.1093 | |||||
35 | 18 | 0.2548 | 0.0588 | 0.8022 | 0.2669 | 0.0723 | 0.9320 | 0.2561 | 0.0645 | 0.8805 | |||||
28 | 1.6775 | 0.1812 | 0.1846 | 0.4331 | 1.6684 | 0.1758 | 0.1888 | 0.4781 | 1.6867 | 0.1760 | 0.1864 | 0.4443 | |||
28 | 2.5388 | 1.9152 | 0.4670 | 0.3084 | 2.5493 | 2.1323 | 0.4914 | 0.3330 | 2.4572 | 1.9641 | 0.4774 | 0.3251 | |||
28 | 0.8499 | 0.0179 | 0.0998 | 0.8572 | 0.0176 | 0.0985 | 0.8508 | 0.0179 | 0.1009 | ||||||
28 | 0.2490 | 0.0510 | 0.7907 | 0.2744 | 0.0648 | 0.8863 | 0.2647 | 0.0633 | 0.8609 |
n_{1} | m_{1} | BEL | BEG | |||||||||||||||||
. | . | \nu=-0.5 | \nu=0.5 | \nu=-0.5 | \nu=0.5 | |||||||||||||||
. | . | \overline{\tilde{\alpha}} | MSE(\tilde{\alpha}) | RAB(\tilde{\alpha}) | MMSE | \overline{\tilde{\alpha}} | MSE(\tilde{\alpha}) | RAB(\tilde{\alpha}) | MMSE | \overline{\ddot{\alpha}} | MSE(\ddot{\alpha}) | RAB(\ddot{\alpha}) | MMSE | \overline{\ddot{\alpha}} | MSE(\ddot{\alpha}) | RAB(\ddot{\alpha}) | MMSE | |||
. | . | \overline{\tilde{\lambda}} | MSE(\tilde{\lambda}) | RAB(\tilde{\lambda}) | MRAB | \overline{\tilde{\lambda}} | MSE(\tilde{\lambda}) | RAB(\tilde{\lambda}) | MRAB | \overline{\ddot{\lambda}} | MSE(\ddot{\lambda}) | RAB(\ddot{\lambda}) | MRAB | \overline{\ddot{\lambda}} | MSE(\ddot{\lambda}) | RAB(\ddot{\lambda}) | MRAB | |||
. | . | \overline{\tilde{\theta}} | MSE(\tilde{\theta}) | RAB(\tilde{\theta}) | \overline{\tilde{\theta}} | MSE(\tilde{\theta}) | RAB(\tilde{\theta}) | \overline{\ddot{\theta}} | MSE(\ddot{\theta}) | RAB(\ddot{\theta}) | \overline{\ddot{\theta}} | MSE(\ddot{\theta}) | RAB(\ddot{\theta}) | |||||||
\hbar | N | n_{\hbar} | m_{\hbar} | p | \overline{\tilde{c}} | MSE(\tilde{c}) | RAB(\tilde{c}) | \overline{\tilde{c}} | MSE(\tilde{c}) | RAB(\tilde{c}) | \overline{\ddot{c}} | MSE(\ddot{c}) | RAB(\ddot{c}) | \overline{\ddot{c}} | MSE(\ddot{c}) | RAB(\ddot{c}) | ||||
2 | 80 | 40 | 20 | 0.15 | 1.6520 | 0.1912 | 0.1819 | 0.0505 | 1.5559 | 0.1342 | 0.1819 | 0.0369 | 1.5742 | 0.1662 | 0.0161 | 0.0448 | 1.5138 | 0.1567 | 0.0539 | 0.0433 |
40 | 20 | 2.2385 | 0.0042 | 0.0268 | 0.0790 | 2.2262 | 0.0058 | 0.0321 | 0.0819 | 2.2296 | 0.0053 | 0.0306 | 0.0356 | 2.2241 | 0.0061 | 0.0330 | 0.0593 | |||
0.8493 | 0.0064 | 0.0675 | 0.8444 | 0.0073 | 0.0723 | 0.8432 | 0.0076 | 0.0631 | 0.8334 | 0.0098 | 0.0740 | |||||||||
0.1990 | 0.0002 | 0.0398 | 0.1970 | 0.0002 | 0.0414 | 0.1935 | 0.0003 | 0.0325 | 0.1847 | 0.0004 | 0.0764 | |||||||||
0.55 | 1.6766 | 0.1464 | 0.1786 | 0.0391 | 1.5945 | 0.1086 | 0.1786 | 0.0302 | 1.6098 | 0.1170 | 0.0061 | 0.0323 | 1.5596 | 0.1139 | 0.0253 | 0.0322 | ||||
2.2386 | 0.0040 | 0.0267 | 0.0775 | 2.2264 | 0.0057 | 0.0320 | 0.0801 | 2.2297 | 0.0052 | 0.0306 | 0.0317 | 2.2242 | 0.0060 | 0.0329 | 0.0506 | |||||
0.8556 | 0.0056 | 0.0634 | 0.8511 | 0.0064 | 0.0676 | 0.8502 | 0.0066 | 0.0554 | 0.8418 | 0.0084 | 0.0647 | |||||||||
0.1986 | 0.0003 | 0.0411 | 0.1966 | 0.0002 | 0.0420 | 0.1930 | 0.0002 | 0.0349 | 0.1841 | 0.0004 | 0.0796 | |||||||||
0.95 | 1.6936 | 0.2794 | 0.1926 | 0.0728 | 1.5992 | 0.1265 | 0.1926 | 0.0351 | 1.6212 | 0.1806 | 0.0132 | 0.0486 | 1.5665 | 0.1388 | 0.0210 | 0.0387 | ||||
2.2370 | 0.0045 | 0.0274 | 0.0833 | 2.2247 | 0.0062 | 0.0327 | 0.0856 | 2.2281 | 0.0057 | 0.0313 | 0.0315 | 2.2226 | 0.0065 | 0.0337 | 0.0482 | |||||
0.8545 | 0.0058 | 0.0638 | 0.8501 | 0.0066 | 0.0680 | 0.8491 | 0.0069 | 0.0566 | 0.8406 | 0.0087 | 0.0660 | |||||||||
0.2009 | 0.0015 | 0.0495 | 0.1987 | 0.0012 | 0.0491 | 0.1950 | 0.0011 | 0.0250 | 0.1856 | 0.0007 | 0.0722 | |||||||||
32 | 0.15 | 1.6671 | 0.1794 | 0.1567 | 0.0477 | 1.6099 | 0.1384 | 0.1567 | 0.0379 | 1.6220 | 0.1559 | 0.0137 | 0.0422 | 1.5873 | 0.1452 | 0.0079 | 0.0400 | |||
32 | 2.2369 | 0.0050 | 0.0274 | 0.0744 | 2.2245 | 0.0067 | 0.0328 | 0.0765 | 2.2279 | 0.0062 | 0.0313 | 0.0278 | 2.2224 | 0.0071 | 0.0337 | 0.0395 | ||||
0.8637 | 0.0041 | 0.0533 | 0.8607 | 0.0046 | 0.0561 | 0.8601 | 0.0047 | 0.0443 | 0.8549 | 0.0057 | 0.0501 | |||||||||
0.2012 | 0.0022 | 0.0603 | 0.1992 | 0.0021 | 0.0605 | 0.1956 | 0.0020 | 0.0218 | 0.1867 | 0.0019 | 0.0663 | |||||||||
0.55 | 1.6684 | 0.0972 | 0.1476 | 0.0265 | 1.6131 | 0.0793 | 0.1476 | 0.0225 | 1.6235 | 0.0837 | 0.0147 | 0.0235 | 1.5901 | 0.0810 | 0.0062 | 0.0233 | ||||
2.2386 | 0.0041 | 0.0267 | 0.0697 | 2.2263 | 0.0058 | 0.0321 | 0.0717 | 2.2297 | 0.0053 | 0.0306 | 0.0297 | 2.2241 | 0.0061 | 0.0330 | 0.0409 | |||||
0.8654 | 0.0040 | 0.0523 | 0.8623 | 0.0045 | 0.0551 | 0.8617 | 0.0046 | 0.0425 | 0.8562 | 0.0056 | 0.0486 | |||||||||
0.1995 | 0.0006 | 0.0522 | 0.1974 | 0.0004 | 0.0520 | 0.1938 | 0.0004 | 0.0311 | 0.1848 | 0.0004 | 0.0760 | |||||||||
0.95 | 1.6736 | 0.1876 | 0.1626 | 0.0493 | 1.6072 | 0.1008 | 0.1626 | 0.0281 | 1.6207 | 0.1193 | 0.0129 | 0.0326 | 1.5849 | 0.1060 | 0.0094 | 0.0297 | ||||
2.2375 | 0.0044 | 0.0272 | 0.0756 | 2.2250 | 0.0061 | 0.0326 | 0.0777 | 2.2285 | 0.0056 | 0.0311 | 0.0284 | 2.2229 | 0.0065 | 0.0335 | 0.0409 | |||||
0.8645 | 0.0041 | 0.0546 | 0.8614 | 0.0046 | 0.0576 | 0.8608 | 0.0047 | 0.0436 | 0.8553 | 0.0057 | 0.0497 | |||||||||
0.2004 | 0.0009 | 0.0583 | 0.1984 | 0.0008 | 0.0581 | 0.1948 | 0.0008 | 0.0259 | 0.1858 | 0.0006 | 0.0708 | |||||||||
2 | 140 | 70 | 35 | 0.15 | 1.6590 | 0.1425 | 0.1527 | 0.0379 | 1.5986 | 0.0848 | 0.1527 | 0.0239 | 1.6106 | 0.0964 | 0.0066 | 0.0267 | 1.5760 | 0.0870 | 0.0150 | 0.0248 |
70 | 35 | 2.2388 | 0.0042 | 0.0266 | 0.0732 | 2.2264 | 0.0059 | 0.0320 | 0.0750 | 2.2298 | 0.0054 | 0.0305 | 0.0261 | 2.2242 | 0.0063 | 0.0329 | 0.0411 | |||
0.8654 | 0.0037 | 0.0501 | 0.8627 | 0.0040 | 0.0527 | 0.8622 | 0.0041 | 0.0420 | 0.8575 | 0.0050 | 0.0472 | |||||||||
0.2006 | 0.0011 | 0.0634 | 0.1985 | 0.0010 | 0.0628 | 0.1950 | 0.0010 | 0.0251 | 0.1861 | 0.0009 | 0.0694 | |||||||||
0.55 | 1.6450 | 0.1210 | 0.1422 | 0.0325 | 1.5941 | 0.0912 | 0.1422 | 0.0256 | 1.6039 | 0.0985 | 0.0025 | 0.0273 | 1.5735 | 0.0942 | 0.0166 | 0.0266 | ||||
2.2374 | 0.0046 | 0.0272 | 0.0697 | 2.2249 | 0.0063 | 0.0326 | 0.0717 | 2.2283 | 0.0058 | 0.0312 | 0.0255 | 2.2228 | 0.0066 | 0.0336 | 0.0420 | |||||
0.8664 | 0.0035 | 0.0486 | 0.8637 | 0.0038 | 0.0511 | 0.8632 | 0.0039 | 0.0409 | 0.8586 | 0.0047 | 0.0460 | |||||||||
0.2000 | 0.0012 | 0.0606 | 0.1980 | 0.0010 | 0.0606 | 0.1945 | 0.0010 | 0.0276 | 0.1857 | 0.0010 | 0.0717 | |||||||||
0.95 | 1.6732 | 0.1466 | 0.1516 | 0.0392 | 1.6171 | 0.1051 | 0.1516 | 0.0292 | 1.6285 | 0.1147 | 0.0178 | 0.0315 | 1.5960 | 0.1067 | 0.0025 | 0.0298 | ||||
2.2373 | 0.0045 | 0.0273 | 0.0743 | 2.2249 | 0.0062 | 0.0327 | 0.0758 | 2.2283 | 0.0057 | 0.0312 | 0.0279 | 2.2227 | 0.0065 | 0.0336 | 0.0381 | |||||
0.8660 | 0.0038 | 0.0504 | 0.8631 | 0.0042 | 0.0530 | 0.8626 | 0.0043 | 0.0416 | 0.8577 | 0.0052 | 0.0470 | |||||||||
0.2018 | 0.0018 | 0.0681 | 0.1995 | 0.0013 | 0.0660 | 0.1958 | 0.0012 | 0.0211 | 0.1861 | 0.0007 | 0.0693 | |||||||||
56 | 0.15 | 1.6251 | 0.0461 | 0.1064 | 0.0135 | 1.5991 | 0.0443 | 0.1064 | 0.0135 | 1.6038 | 0.0451 | 0.0024 | 0.0136 | 1.5871 | 0.0459 | 0.0081 | 0.0140 | |||
56 | 2.2382 | 0.0047 | 0.0275 | 0.0676 | 2.2258 | 0.0061 | 0.0327 | 0.0693 | 2.2292 | 0.0057 | 0.0308 | 0.0286 | 2.2237 | 0.0064 | 0.0332 | 0.0354 | ||||
0.8732 | 0.0026 | 0.0437 | 0.8717 | 0.0027 | 0.0451 | 0.8715 | 0.0028 | 0.0317 | 0.8694 | 0.0031 | 0.0340 | |||||||||
0.1920 | 0.0007 | 0.0926 | 0.1914 | 0.0007 | 0.0929 | 0.1901 | 0.0007 | 0.0497 | 0.1867 | 0.0008 | 0.0665 | |||||||||
0.55 | 1.6321 | 0.0503 | 0.1114 | 0.0144 | 1.6059 | 0.0477 | 0.1114 | 0.0143 | 1.6108 | 0.0486 | 0.0067 | 0.0144 | 1.5943 | 0.0486 | 0.0036 | 0.0147 | ||||
2.2383 | 0.0041 | 0.0268 | 0.0688 | 2.2261 | 0.0058 | 0.0321 | 0.0707 | 2.2295 | 0.0053 | 0.0307 | 0.0280 | 2.2240 | 0.0061 | 0.0330 | 0.0325 | |||||
0.8742 | 0.0027 | 0.0441 | 0.8727 | 0.0028 | 0.0455 | 0.8725 | 0.0029 | 0.0305 | 0.8703 | 0.0032 | 0.0330 | |||||||||
0.1931 | 0.0007 | 0.0930 | 0.1925 | 0.0007 | 0.0937 | 0.1912 | 0.0007 | 0.0439 | 0.1879 | 0.0008 | 0.0603 | |||||||||
0.95 | 1.6471 | 0.0524 | 0.1129 | 0.0150 | 1.6211 | 0.0491 | 0.1129 | 0.0146 | 1.6260 | 0.0501 | 0.0163 | 0.0147 | 1.6098 | 0.0497 | 0.0061 | 0.0149 | ||||
2.2388 | 0.0041 | 0.0266 | 0.0692 | 2.2266 | 0.0057 | 0.0319 | 0.0708 | 2.2299 | 0.0052 | 0.0305 | 0.0310 | 2.2245 | 0.0060 | 0.0328 | 0.0339 | |||||
0.8754 | 0.0026 | 0.0438 | 0.8739 | 0.0028 | 0.0451 | 0.8737 | 0.0028 | 0.0292 | 0.8716 | 0.0031 | 0.0316 | |||||||||
0.1924 | 0.0009 | 0.0935 | 0.1917 | 0.0008 | 0.0934 | 0.1904 | 0.0008 | 0.0481 | 0.1870 | 0.0008 | 0.0652 | |||||||||
4 | 80 | 20 | 10 | 0.15 | 1.6816 | 0.2985 | 0.2183 | 0.0779 | 1.5568 | 0.1760 | 0.2183 | 0.0480 | 1.5815 | 0.2294 | 0.0116 | 0.0613 | 1.5056 | 0.2066 | 0.0590 | 0.0566 |
20 | 10 | 2.2379 | 0.0047 | 0.0270 | 0.0922 | 2.2255 | 0.0064 | 0.0324 | 0.0951 | 2.2289 | 0.0059 | 0.0309 | 0.0344 | 2.2234 | 0.0067 | 0.0333 | 0.0617 | |||
20 | 10 | 0.8448 | 0.0075 | 0.0751 | 0.8391 | 0.0087 | 0.0808 | 0.8376 | 0.0091 | 0.0693 | 0.8255 | 0.0122 | 0.0827 | |||||||
20 | 10 | 0.2006 | 0.0010 | 0.0484 | 0.1986 | 0.0009 | 0.0487 | 0.1949 | 0.0009 | 0.0256 | 0.1856 | 0.0008 | 0.0718 | |||||||
0.55 | 1.6668 | 0.1261 | 0.1730 | 0.0340 | 1.5845 | 0.1048 | 0.1730 | 0.0292 | 1.5992 | 0.1108 | 0.0005 | 0.0307 | 1.5476 | 0.1108 | 0.0328 | 0.0314 | ||||
2.2381 | 0.0041 | 0.0269 | 0.0753 | 2.2259 | 0.0057 | 0.0322 | 0.0780 | 2.2292 | 0.0053 | 0.0308 | 0.0303 | 2.2237 | 0.0061 | 0.0332 | 0.0523 | |||||
0.8549 | 0.0055 | 0.0628 | 0.8505 | 0.0063 | 0.0671 | 0.8495 | 0.0065 | 0.0561 | 0.8411 | 0.0083 | 0.0654 | |||||||||
0.1987 | 0.0001 | 0.0385 | 0.1968 | 0.0001 | 0.0399 | 0.1932 | 0.0001 | 0.0340 | 0.1844 | 0.0003 | 0.0779 | |||||||||
0.95 | 1.6906 | 0.2507 | 0.1933 | 0.0656 | 1.5998 | 0.1302 | 0.1933 | 0.0360 | 1.6194 | 0.1676 | 0.0121 | 0.0453 | 1.5667 | 0.1410 | 0.0208 | 0.0393 | ||||
2.2378 | 0.0044 | 0.0271 | 0.0835 | 2.2254 | 0.0061 | 0.0324 | 0.0858 | 2.2288 | 0.0056 | 0.0310 | 0.0302 | 2.2233 | 0.0064 | 0.0334 | 0.0471 | |||||
0.8568 | 0.0059 | 0.0631 | 0.8524 | 0.0066 | 0.0670 | 0.8515 | 0.0069 | 0.0539 | 0.8431 | 0.0087 | 0.0632 | |||||||||
0.2011 | 0.0016 | 0.0507 | 0.1989 | 0.0013 | 0.0504 | 0.1952 | 0.0012 | 0.0238 | 0.1858 | 0.0009 | 0.0709 | |||||||||
16 | 0.15 | 1.6605 | 0.1649 | 0.1614 | 0.0439 | 1.5951 | 0.1106 | 0.1614 | 0.0308 | 1.6083 | 0.1254 | 0.0052 | 0.0344 | 1.5705 | 0.1164 | 0.0185 | 0.0326 | |||
16 | 2.2374 | 0.0048 | 0.0272 | 0.0770 | 2.2249 | 0.0066 | 0.0327 | 0.0790 | 2.2283 | 0.0061 | 0.0312 | 0.0271 | 2.2227 | 0.0069 | 0.0336 | 0.0442 | ||||
16 | 0.8620 | 0.0042 | 0.0546 | 0.8587 | 0.0047 | 0.0577 | 0.8581 | 0.0049 | 0.0466 | 0.8523 | 0.0060 | 0.0530 | ||||||||
16 | 0.2007 | 0.0016 | 0.0648 | 0.1986 | 0.0014 | 0.0643 | 0.1949 | 0.0014 | 0.0255 | 0.1856 | 0.0012 | 0.0718 | ||||||||
0.55 | 1.6862 | 0.2183 | 0.1632 | 0.0570 | 1.6170 | 0.1042 | 0.1632 | 0.0290 | 1.6330 | 0.1422 | 0.0206 | 0.0384 | 1.5955 | 0.1136 | 0.0028 | 0.0316 | ||||
2.2372 | 0.0046 | 0.0273 | 0.0761 | 2.2246 | 0.0063 | 0.0328 | 0.0781 | 2.2280 | 0.0058 | 0.0313 | 0.0315 | 2.2224 | 0.0067 | 0.0337 | 0.0406 | |||||
0.8645 | 0.0040 | 0.0524 | 0.8614 | 0.0045 | 0.0552 | 0.8609 | 0.0046 | 0.0435 | 0.8556 | 0.0056 | 0.0493 | |||||||||
0.1997 | 0.0011 | 0.0616 | 0.1975 | 0.0009 | 0.0612 | 0.1939 | 0.0008 | 0.0306 | 0.1847 | 0.0007 | 0.0765 | |||||||||
0.95 | 1.6709 | 0.1323 | 0.1523 | 0.0353 | 1.6149 | 0.1007 | 0.1523 | 0.0279 | 1.6261 | 0.1143 | 0.0163 | 0.0312 | 1.5927 | 0.1076 | 0.0046 | 0.0300 | ||||
2.2373 | 0.0044 | 0.0273 | 0.0709 | 2.2249 | 0.0061 | 0.0327 | 0.0732 | 2.2283 | 0.0056 | 0.0312 | 0.0314 | 2.2227 | 0.0065 | 0.0336 | 0.0415 | |||||
0.8652 | 0.0040 | 0.0512 | 0.8621 | 0.0045 | 0.0540 | 0.8615 | 0.0046 | 0.0427 | 0.8562 | 0.0056 | 0.0486 | |||||||||
0.1985 | 0.0005 | 0.0529 | 0.1965 | 0.0004 | 0.0537 | 0.1929 | 0.0004 | 0.0353 | 0.1841 | 0.0005 | 0.0794 | |||||||||
4 | 140 | 35 | 18 | 0.15 | 1.6763 | 0.2148 | 0.1731 | 0.0564 | 1.6019 | 0.1085 | 0.1731 | 0.0303 | 1.6175 | 0.1303 | 0.0109 | 0.0356 | 1.5767 | 0.1137 | 0.0145 | 0.0318 |
35 | 18 | 2.2362 | 0.0048 | 0.0278 | 0.0825 | 2.2236 | 0.0067 | 0.0332 | 0.0841 | 2.2270 | 0.0061 | 0.0317 | 0.0269 | 2.2213 | 0.0071 | 0.0342 | 0.0416 | |||
35 | 18 | 0.8644 | 0.0041 | 0.0524 | 0.8615 | 0.0045 | 0.0551 | 0.8609 | 0.0046 | 0.0434 | 0.8559 | 0.0056 | 0.0491 | |||||||
35 | 18 | 0.2016 | 0.0022 | 0.0768 | 0.1993 | 0.0017 | 0.0750 | 0.1957 | 0.0016 | 0.0216 | 0.1863 | 0.0010 | 0.0685 | |||||||
0.55 | 1.6899 | 0.1523 | 0.1532 | 0.0403 | 1.6366 | 0.0997 | 0.1532 | 0.0277 | 1.6488 | 0.1222 | 0.0305 | 0.0332 | 1.6177 | 0.1049 | 0.0111 | 0.0292 | ||||
2.2373 | 0.0047 | 0.0273 | 0.0727 | 2.2248 | 0.0064 | 0.0327 | 0.0745 | 2.2282 | 0.0059 | 0.0312 | 0.0304 | 2.2227 | 0.0068 | 0.0336 | 0.0383 | |||||
0.8710 | 0.0032 | 0.0467 | 0.8685 | 0.0035 | 0.0488 | 0.8682 | 0.0035 | 0.0354 | 0.8643 | 0.0042 | 0.0397 | |||||||||
0.2006 | 0.0012 | 0.0638 | 0.1986 | 0.0010 | 0.0634 | 0.1951 | 0.0010 | 0.0246 | 0.1862 | 0.0009 | 0.0688 | |||||||||
0.95 | 1.6470 | 0.1052 | 0.1398 | 0.0284 | 1.5957 | 0.0697 | 0.1398 | 0.0200 | 1.6053 | 0.0744 | 0.0033 | 0.0211 | 1.5757 | 0.0717 | 0.0152 | 0.0208 | ||||
2.2387 | 0.0041 | 0.0266 | 0.0689 | 2.2262 | 0.0058 | 0.0321 | 0.0707 | 2.2296 | 0.0053 | 0.0306 | 0.0264 | 2.2240 | 0.0061 | 0.0330 | 0.0424 | |||||
0.8657 | 0.0036 | 0.0495 | 0.8630 | 0.0040 | 0.0520 | 0.8625 | 0.0041 | 0.0417 | 0.8580 | 0.0048 | 0.0466 | |||||||||
0.1997 | 0.0007 | 0.0596 | 0.1976 | 0.0006 | 0.0589 | 0.1940 | 0.0005 | 0.0301 | 0.1850 | 0.0005 | 0.0749 | |||||||||
28 | 0.15 | 1.6453 | 0.0505 | 0.1087 | 0.0144 | 1.6178 | 0.0469 | 0.1087 | 0.0140 | 1.6230 | 0.0480 | 0.0143 | 0.0141 | 1.6058 | 0.0475 | 0.0036 | 0.0143 | |||
28 | 2.2397 | 0.0039 | 0.0262 | 0.0682 | 2.2273 | 0.0056 | 0.0316 | 0.0701 | 2.2307 | 0.0051 | 0.0301 | 0.0312 | 2.2252 | 0.0059 | 0.0325 | 0.0339 | ||||
28 | 0.8757 | 0.0024 | 0.0420 | 0.8742 | 0.0026 | 0.0433 | 0.8740 | 0.0026 | 0.0288 | 0.8720 | 0.0029 | 0.0311 | ||||||||
28 | 0.1915 | 0.0008 | 0.0961 | 0.1910 | 0.0008 | 0.0968 | 0.1897 | 0.0008 | 0.0516 | 0.1863 | 0.0009 | 0.0683 | ||||||||
0.55 | 1.6337 | 0.0485 | 0.1090 | 0.0139 | 1.6079 | 0.0459 | 0.1090 | 0.0137 | 1.6127 | 0.0468 | 0.0079 | 0.0138 | 1.5964 | 0.0467 | 0.0023 | 0.0141 | ||||
2.2386 | 0.0041 | 0.0267 | 0.0676 | 2.2265 | 0.0057 | 0.0320 | 0.0694 | 2.2298 | 0.0052 | 0.0305 | 0.0280 | 2.2244 | 0.0060 | 0.0329 | 0.0319 | |||||
0.8756 | 0.0024 | 0.0417 | 0.8741 | 0.0026 | 0.0431 | 0.8739 | 0.0027 | 0.0290 | 0.8717 | 0.0030 | 0.0315 | |||||||||
0.1929 | 0.0007 | 0.0930 | 0.1923 | 0.0007 | 0.0935 | 0.1911 | 0.0007 | 0.0447 | 0.1878 | 0.0008 | 0.0611 | |||||||||
0.95 | 1.6310 | 0.0469 | 0.1062 | 0.0135 | 1.6051 | 0.0445 | 0.1062 | 0.0134 | 1.6099 | 0.0453 | 0.0062 | 0.0135 | 1.5935 | 0.0454 | 0.0041 | 0.0138 | ||||
2.2386 | 0.0040 | 0.0267 | 0.0675 | 2.2264 | 0.0057 | 0.0320 | 0.0693 | 2.2298 | 0.0052 | 0.0305 | 0.0294 | 2.2243 | 0.0060 | 0.0329 | 0.0342 | |||||
0.8733 | 0.0026 | 0.0439 | 0.8718 | 0.0028 | 0.0453 | 0.8715 | 0.0028 | 0.0316 | 0.8693 | 0.0031 | 0.0341 | |||||||||
0.1921 | 0.0007 | 0.0934 | 0.1915 | 0.0007 | 0.0938 | 0.1902 | 0.0007 | 0.0491 | 0.1869 | 0.0008 | 0.0657 |
n_{1} | w_{1} | Credible interval | ||||||||
. | . | CI | Symmetric | HPD | ||||||
. | . | AIL(\alpha) | AIL(\alpha) | AIL(\alpha) | ||||||
. | . | AIL(\lambda) | AIL(\lambda) | AIL(\lambda) | ||||||
. | . | AIL(\theta) | AIL(\theta) | AIL(\theta) | ||||||
\hbar | N | n_{\hbar} | w_{\hbar} | p | AIL(c) | AIL(c) | AIL(c) | |||
2 | 80 | 40 | 20 | 0.15 | 3.4146 | 3.0673 | 1.6523 | 0.7144 | 1.5547 | 0.6651 |
40 | 20 | 9.3383 | 0.6109 | 0.5824 | ||||||
1.0622 | 0.3577 | 0.3046 | ||||||||
1.5215 | 0.2369 | 0.2186 | ||||||||
0.55 | 3.2483 | 3.0103 | 1.5301 | 0.6792 | 1.4479 | 0.6347 | ||||
9.0989 | 0.6125 | 0.5825 | ||||||||
1.1630 | 0.3370 | 0.2883 | ||||||||
1.5413 | 0.2372 | 0.2199 | ||||||||
0.95 | 3.2459 | 3.009 | 1.5365 | 0.6822 | 1.4581 | 0.6387 | ||||
9.0252 | 0.6136 | 0.5840 | ||||||||
1.1423 | 0.3394 | 0.2911 | ||||||||
1.6318 | 0.2392 | 0.2214 | ||||||||
32 | 0.15 | 2.7872 | 2.7832 | 1.2546 | 0.5955 | 1.1861 | 0.5567 | |||
32 | 8.7304 | 0.6134 | 0.5833 | |||||||
1.0233 | 0.2796 | 0.2412 | ||||||||
1.3750 | 0.2345 | 0.2164 | ||||||||
0.55 | 2.8056 | 2.7834 | 1.2551 | 0.5965 | 1.1847 | 0.5568 | ||||
8.6419 | 0.6141 | 0.5831 | ||||||||
1.0474 | 0.2795 | 0.2408 | ||||||||
1.4224 | 0.2372 | 0.2188 | ||||||||
0.95 | 2.8068 | 2.765 | 1.2616 | 0.5985 | 1.1935 | 0.5598 | ||||
8.5702 | 0.6150 | 0.5858 | ||||||||
1.0997 | 0.2818 | 0.2425 | ||||||||
1.3484 | 0.2357 | 0.2174 | ||||||||
140 | 70 | 35 | 0.15 | 2.8299 | 2.7787 | 1.2480 | 0.5913 | 1.1732 | 0.5512 | |
70 | 35 | 8.8112 | 0.6169 | 0.5869 | ||||||
0.9287 | 0.2635 | 0.2278 | ||||||||
1.3237 | 0.2366 | 0.2171 | ||||||||
0.55 | 2.6886 | 2.7106 | 1.1792 | 0.5731 | 1.1093 | 0.5345 | ||||
8.5189 | 0.6155 | 0.5842 | ||||||||
0.9782 | 0.2632 | 0.2284 | ||||||||
1.3673 | 0.2346 | 0.2160 | ||||||||
0.95 | 2.7318 | 2.7344 | 1.2057 | 0.5815 | 1.1348 | 0.5427 | ||||
8.6176 | 0.6126 | 0.5838 | ||||||||
0.9898 | 0.2684 | 0.2324 | ||||||||
1.3326 | 0.2392 | 0.2198 | ||||||||
56 | 0.15 | 2.4536 | 2.5300 | 0.8753 | 0.4421 | 0.8257 | 0.4148 | |||
56 | 8.1685 | 0.6119 | 0.5802 | |||||||
0.8347 | 0.1885 | 0.1668 | ||||||||
1.1931 | 0.0926 | 0.0864 | ||||||||
0.55 | 2.3747 | 2.5060 | 0.8788 | 0.4427 | 0.8261 | 0.4143 | ||||
8.0651 | 0.6095 | 0.5785 | ||||||||
0.8548 | 0.1898 | 0.1667 | ||||||||
1.2352 | 0.0927 | 0.0858 | ||||||||
0.95 | 2.3900 | 2.5214 | 0.8759 | 0.4417 | 0.8271 | 0.4148 | ||||
8.1405 | 0.6118 | 0.5818 | ||||||||
0.8549 | 0.1877 | 0.1661 | ||||||||
1.2216 | 0.0913 | 0.0844 | ||||||||
4 | 80 | 20 | 10 | 0.15 | 3.5514 | 2.9879 | 1.8141 | 0.7625 | 1.7046 | 0.7083 |
20 | 10 | 9.0555 | 0.6117 | 0.5818 | ||||||
20 | 10 | 0.9926 | 0.3838 | 0.3253 | ||||||
20 | 10 | 1.3399 | 0.2405 | 0.2214 | ||||||
0.55 | 3.3072 | 3.0253 | 1.5485 | 0.6838 | 1.4686 | 0.6395 | ||||
9.1629 | 0.6124 | 0.5826 | ||||||||
1.1114 | 0.3375 | 0.2875 | ||||||||
1.5448 | 0.2368 | 0.2192 | ||||||||
0.95 | 3.3013 | 3.0365 | 1.5334 | 0.6802 | 1.4531 | 0.6366 | ||||
9.2126 | 0.6123 | 0.5846 | ||||||||
1.2443 | 0.3335 | 0.2853 | ||||||||
1.4244 | 0.2416 | 0.2234 | ||||||||
16 | 0.15 | 3.0018 | 2.8487 | 1.2939 | 0.6085 | 1.2209 | 0.5677 | |||
16 | 8.8203 | 0.6151 | 0.5846 | |||||||
16 | 1.0622 | 0.2872 | 0.2464 | |||||||
16 | 1.3593 | 0.2380 | 0.2187 | |||||||
0.55 | 2.7877 | 2.8122 | 1.2833 | 0.6045 | 1.2083 | 0.5638 | ||||
8.8808 | 0.6181 | 0.5875 | ||||||||
1.0125 | 0.2804 | 0.2416 | ||||||||
1.3797 | 0.2362 | 0.2180 | ||||||||
0.95 | 2.6809 | 2.7709 | 1.2539 | 0.5955 | 1.1762 | 0.5541 | ||||
8.6718 | 0.6139 | 0.5841 | ||||||||
1.0059 | 0.2782 | 0.2396 | ||||||||
1.4957 | 0.2359 | 0.2165 | ||||||||
140 | 35 | 18 | 0.15 | 2.9768 | 2.8485 | 1.3322 | 0.6132 | 1.2466 | 0.5702 | |
35 | 18 | 8.9141 | 0.6170 | 0.5870 | ||||||
35 | 18 | 0.9581 | 0.2688 | 0.2318 | ||||||
35 | 18 | 1.3934 | 0.2349 | 0.2153 | ||||||
0.55 | 2.7722 | 2.727 | 1.1958 | 0.5740 | 1.1280 | 0.5365 | ||||
8.5091 | 0.6177 | 0.5868 | ||||||||
0.9857 | 0.2487 | 0.2158 | ||||||||
1.3680 | 0.2336 | 0.2155 | ||||||||
0.95 | 2.7294 | 2.7466 | 1.1760 | 0.5731 | 1.1088 | 0.5356 | ||||
8.7286 | 0.6159 | 0.5867 | ||||||||
0.9769 | 0.2643 | 0.2290 | ||||||||
1.2979 | 0.2362 | 0.2177 | ||||||||
28 | 0.15 | 2.4586 | 2.5493 | 0.9017 | 0.4483 | 0.8475 | 0.4203 | |||
28 | 8.2247 | 0.6144 | 0.5847 | |||||||
28 | 0.8842 | 0.1858 | 0.1640 | |||||||
28 | 1.1792 | 0.0913 | 0.0849 | |||||||
0.55 | 2.3382 | 2.5347 | 0.8769 | 0.4407 | 0.8265 | 0.4129 | ||||
8.2832 | 0.6080 | 0.5768 | ||||||||
0.8100 | 0.1871 | 0.1651 | ||||||||
1.2424 | 0.0907 | 0.0834 | ||||||||
0.95 | 2.3738 | 2.5307 | 0.8765 | 0.4429 | 0.8254 | 0.4150 | ||||
8.1678 | 0.6098 | 0.5791 | ||||||||
0.8702 | 0.1914 | 0.1689 | ||||||||
1.2415 | 0.0937 | 0.0867 |
The total number of observations N is divided first into two groups, \hbar = 2 , and then into four groups, \hbar = 4 , to compare the MLEs, BELs, and BEGs.
● When there are two groups (\hbar = 2) ,
\begin{split} n_{1}& = n_{2} = N/2 \,\,\text{and}\,\, w_{1} = w_{2} = 50\% \,\, \text{and} \,\, 80\% \,\,\text{of the sample size},\\ v_{1}& = 0.5 \,\, \text{and}\,\, v_{2} = 1. \end{split} |
● When there are four groups (\hbar = 4) ,
\begin{split} n_{1}& = n_{2} = n_{3} = n_{4} = N/4 \; \text{and} \; w_{1} = w_{2} = w_{3} = w_{4} = 50\% \,\, \text{and} \,\, 80\% \,\,\text{of the sample size},\\ v_{1}& = 0.5, \, v_{2} = 1.0, \, v_{3} = 1.5, \,\, \text{and}\,\, v_{4} = 2.0. \end{split} |
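For concreteness, the two designs above can be laid out programmatically. The sketch below uses our own function and variable names (not the authors' code); the removal counts w_{i} are taken as the stated percentage of each group's size n_{i} , which is consistent with the tabulated pairs such as n_{1} = 40 , w_{1} = 20 and n_{1} = 40 , w_{1} = 32 .

```python
# Illustrative sketch of the two simulation designs described above.
# Group counts, percentages, and stress levels follow the text;
# all names here are ours.

def make_design(N, hbar, w_fraction):
    """Split N observations into hbar equal groups, set w_i as a
    fraction of each group's size, and use equally spaced stress
    levels v_i = 0.5 * i."""
    n = [N // hbar] * hbar                          # n_1 = ... = n_hbar = N / hbar
    w = [int(round(w_fraction * ni)) for ni in n]   # w_i = 50% or 80% of n_i
    v = [0.5 * (i + 1) for i in range(hbar)]        # v_1 = 0.5, v_2 = 1.0, ...
    return n, w, v

# Two groups, w_i = 50%:  n = [40, 40], w = [20, 20], v = [0.5, 1.0]
n, w, v = make_design(80, 2, 0.50)

# Four groups, w_i = 80%: n = [20]*4, w = [16]*4, v = [0.5, 1.0, 1.5, 2.0]
n4, w4, v4 = make_design(80, 4, 0.80)
```

Varying N (80 or 140), \hbar , and the percentage reproduces every design row appearing in Tables 2–4.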
The MLEs (respectively, BELs and BEGs) of the parameters \alpha , \lambda , \theta , and c , together with their MSEs, RABs, MMSEs, and MRABs, are displayed in Table 2 (respectively, Table 3). The HPD credible intervals, symmetric credible intervals, and CIs, with their AILs and MAILs, are presented in Table 4.
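The accuracy measures reported in these tables follow the standard definitions; the sketch below (helper names are ours) computes the MSE and RAB of each parameter over Monte Carlo replications and averages them across parameters into the MMSE and MRAB.

```python
# Sketch of the accuracy measures used in Tables 2 and 3 (names are ours).
# MSE  = average squared error over replications,
# RAB  = average relative absolute bias,
# MMSE / MRAB = averages of these quantities over all model parameters.

def mse(estimates, true):
    return sum((e - true) ** 2 for e in estimates) / len(estimates)

def rab(estimates, true):
    return sum(abs(e - true) / abs(true) for e in estimates) / len(estimates)

def mmse_mrab(est_by_param, true_by_param):
    """est_by_param: dict of parameter name -> list of replicated estimates."""
    mses = [mse(est_by_param[k], true_by_param[k]) for k in true_by_param]
    rabs = [rab(est_by_param[k], true_by_param[k]) for k in true_by_param]
    return sum(mses) / len(mses), sum(rabs) / len(rabs)
```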
We can observe from Tables 2–4 that:
(1) Based on the MMSEs and MRABs, the BELs and BEGs outperform the MLEs.
(2) Based on the MAILs, the HPD and symmetric credible intervals are superior to the CIs. This confirms that the BELs and BEGs are superior to the MLEs.
(3) Based on the MAILs, the HPD credible intervals are better than the symmetric credible intervals.
(4) For fixed p , \hbar , N , and n_{i} (or w_{i} ), i = 1, \dots, \hbar , the MSEs, RABs, MMSEs, MRABs, AILs, and MAILs decrease as w_{i} (or n_{i} ) increases. This confirms that the more data we collect, the more accurate the results become.
(5) Comparing the MMSEs, the BELs and BEGs perform better at \nu = 0.5 than at \nu = -0.5 .
(6) Comparing the MMSEs, the BELs are better than the BEGs at \nu = 0.5 , while the converse holds at \nu = -0.5 .
(7) Comparing the MMSEs, the MLEs perform better at p = 0.15 than at p = 0.55 and p = 0.95 .
(8) Better results are obtained at \hbar = 2 than at \hbar = 4 , since at \hbar = 4 the number of observations in each subgroup is smaller.
The above results hold except in a few rare cases, which could be attributed to random fluctuations in the data.
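The interval comparison above rests on how the two credible intervals are built from posterior draws. A minimal sketch (helper names are ours): the symmetric interval clips equal tails, the HPD interval is the shortest window containing a fraction 1 - \alpha of the sorted draws, and the AIL is simply the interval's length. By construction the HPD interval is never longer than the symmetric one computed from the same draws, which is consistent with observation (3).

```python
# Sketch of the two credible intervals and the AIL (names are ours),
# computed from a list of MCMC draws of one parameter.

def symmetric_ci(draws, alpha=0.05):
    """Equal-tail (symmetric) credible interval."""
    s = sorted(draws)
    lo = s[int(alpha / 2 * len(s))]
    hi = s[int((1 - alpha / 2) * len(s)) - 1]
    return lo, hi

def hpd_ci(draws, alpha=0.05):
    """Highest posterior density interval: the shortest window of
    sorted draws containing a fraction (1 - alpha) of them."""
    s = sorted(draws)
    k = int((1 - alpha) * len(s))                      # draws inside the window
    widths = [(s[i + k - 1] - s[i], i) for i in range(len(s) - k + 1)]
    w, i = min(widths)                                 # shortest window wins
    return s[i], s[i + k - 1]

def ail(interval):
    """Average interval length contribution of one interval."""
    return interval[1] - interval[0]
```

For a right-skewed posterior the HPD interval shifts toward the mode and is strictly shorter than the equal-tail interval.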
Furthermore, if the hyperparameters are unknown, we can estimate them from past samples following the empirical Bayes approach; see [33]. Alternatively, the hierarchical Bayes technique, which places a suitable prior on the hyperparameters, could be used; see [34].
Due to the progress in manufacturing devices over the last decades, physicists and engineers may face a problem in assessing the lifetime distribution of devices connected in a mixed system such as a series-parallel system. To overcome this problem, we have introduced a new distribution called the PGLD. This distribution can describe the lifetime of a series-parallel system in which the number of series subsystems, as well as the number of their parallel components, are random variables. The PGLD arises by compounding two discrete (truncated Poisson and geometric) distributions with a mixture of continuous distributions, the LD. Some important properties of the PGLD have been investigated, such as the q -th quantile, mode, r -th moment, mean residual lifetime, Bonferroni and Lorenz curves, Rényi and Shannon entropies, and the PDF and CDF of the i -th order statistic. The progressive-stress model has been applied to units connected in a series-parallel structure. The lifetimes of these units under normal stress conditions are assumed to follow the LD, which can be regarded as a mixture of exponential and gamma distributions. The progressive-stress model was used when the stress increases nonlinearly over time, and the inverse power-law model established the relationship between the stress and the scale parameter of the proposed distribution. Based on progressively type-II censored data with binomial removals, two estimation methods were applied to estimate the unknown parameters. Bayesian estimation was performed under two asymmetric loss functions (LINEX and GE). CIs as well as symmetric and HPD credible intervals for the unknown parameters were established. The numerical results showed that the Bayes estimates performed better than the MLEs. An illustrative example, based on two real data sets, demonstrated the superiority of the proposed distribution over four other known distributions.
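As a concrete illustration of the two asymmetric losses mentioned above, the Bayes point estimates under LINEX and general entropy (GE) loss have closed forms in terms of posterior expectations, which can be approximated directly from MCMC draws. The helper names below are ours; the formulas are the standard ones associated with these losses.

```python
import math

# Bayes point estimates from MCMC draws (names are ours):
# LINEX loss, shape nu:  theta_hat = -(1/nu) * ln E[exp(-nu * theta)]
# GE loss, shape nu:     theta_hat = ( E[theta^(-nu)] )^(-1/nu)

def bayes_linex(draws, nu):
    m = sum(math.exp(-nu * t) for t in draws) / len(draws)
    return -math.log(m) / nu

def bayes_ge(draws, nu):
    m = sum(t ** (-nu) for t in draws) / len(draws)
    return m ** (-1.0 / nu)
```

For nu > 0 the LINEX estimate sits below the posterior mean (overestimation is penalized more heavily), and for small |nu| both estimates approach the squared-error (posterior mean) estimate.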
Finally, the features of and motivations for the PGLD can be summarized as follows:
(1) The CDF of the PGLD has a closed form. This feature simplifies its use.
(2) The four parameters included in the CDF of the PGLD give it the flexibility to fit a variety of data sets.
(3) The CDF of the PGLD includes the CDFs of the GLD, PLD, and LD as special cases.
(4) The PGLD can represent the failure time of a series-parallel system. This is a key motivation for introducing the PGLD.
(5) The HRF of the PGLD has a unimodal shape. This feature gives it the flexibility to fit and analyze data arising from both increasing and decreasing hazard rates.
(6) The PGLD can represent non-stationary data. This feature may help the experimenter forecast how some products would behave in different environments.
(7) The PGLD fits the data better than some other distributions such as the PLD, GLD, ELD, and LD.
(8) Other distributions may emerge from Theorem 2.1 by choosing continuous distributions other than the LD.
Based on the above features and motivations, we hope that the PGLD will attract more attention from physicists and engineers in the near future.
The following points may be investigated in future work:
(1) The inference procedure may be implemented based on a general progressive censoring scheme such as hybrid progressive censoring.
(2) Some other estimation methods, such as the method of moments and probability weighted moments, may be discussed.
(3) Some other types of ALTs such as the step-stress ALT may be considered.
(4) Prediction of future order statistics based on the PGLD may be investigated.
The authors would like to thank the Deanship of Scientific Research at Umm Al-Qura University for supporting this work by grant code (19-SCI-1-03-0009). They are also grateful to the editor and reviewers for their valuable comments and suggestions, which improved the paper.
The authors declare that they have no conflicts of interest regarding the publication of this article.
[1] | W. Nelson, Accelerated testing: Statistical models, test plans and data analysis, New York: Wiley, 1990. doi: 10.1002/9780470316795. |
[2] | V. Bagdonavicius, M. Nikulin, Accelerated life models: Modeling and statistical analysis, USA: CRC Press, 2002. doi: 10.1201/9781420035872. |
[3] | X. K. Yin, B. Z. Sheng, Some aspects of accelerated life testing by progressive stress, IEEE Trans. Reliab., 36 (1987), 150–155. doi: 10.1109/TR.1987.5222320. |
[4] | A. H. Abdel-Hamid, E. K. AL-Hussaini, Inference and optimal design based on step-partially accelerated life tests for the generalized Pareto distribution under progressive type-I censoring, Commun. Statist.-Simul. Comput., 44 (2014), 1750–1769. doi: 10.1080/03610918.2013.826363. |
[5] | A. H. Abdel-Hamid, T. A. Abushal, Inference on progressive-stress model for the exponentiated exponential distribution under type-II progressive hybrid censoring, J. Statist. Comput. Simul., 85 (2015), 1165–1186. doi: 10.1080/00949655.2013.868463. |
[6] | E. K. AL-Hussaini, A. H. Abdel-Hamid, A. F. Hashem, One-sample Bayesian prediction intervals based on progressively type-II censored data from the half-logistic distribution under progressive-stress model, Metrika, 78 (2015), 771–783. doi: 10.1007/s00184-014-0526-4. |
[7] | A. H. Abdel-Hamid, A. F. Hashem, A new lifetime distribution for a series-parallel system: Properties, applications and estimations under progressive type-II censoring, J. Statist. Comput. Simul., 87 (2017), 993–1024. doi: 10.1080/00949655.2016.1243683. |
[8] | A. H. Abdel-Hamid, A. F. Hashem, A new compound distribution based on a mixture of distributions and a mixed system, C. R. Acad. Bulg. Sci., 71 (2018), 1439–1450. doi: 10.7546/CRABS.2018.11.01. |
[9] | S. Nadarajah, A. H. Abdel-Hamid, A. F. Hashem, Inference for a geometric-Poisson-Rayleigh distribution under progressive-stress model based on type-I progressive hybrid censoring with binomial removals, Qual. Reliab. Eng. Int., 34 (2018), 649–680. doi: 10.1002/qre.2279. |
[10] | Y. Hu, Y. Ding, F. Wen, L. Liu, Reliability assessment in distributed multi-state series-parallel systems, Energy Procedia, 159 (2019), 104–110. doi: 10.1016/j.egypro.2018.12.026. |
[11] | A. F. Hashem, S. A. Alyami, Inference on a new lifetime distribution under progressive type II censoring for a parallel-series structure, Complexity, 2021 (2021), 6684918. doi: 10.1155/2021/6684918. |
[12] | H. Teicher, On the mixture of distributions, Ann. Math. Statist., 31 (1960), 55–73. doi: 10.1214/aoms/1177705987. |
[13] | R. A. Fisher, The mathematical theory of probabilities and its applications to frequency-curves and statistical methods, New York: The Macmillan Company, 1922. |
[14] | D. M. Titterington, A. F. M. Smith, U. E. Makov, Statistical analysis of finite mixture distributions, New York: Wiley, 1985. |
[15] | G. J. McLachlan, K. E. Basford, Mixture models: Inferences and applications to clustering, New York: Marcel Dekker, 1988. |
[16] | E. K. AL-Hussaini, A. H. Abdel-Hamid, Generation of distribution functions: A survey, J. Statist. Appl., 7 (2018), 91–103. doi: 10.18576/jsap/070109. |
[17] | C. Canuto, M. Y. Hussaini, A. Quarteroni, T. A. Zang, Spectral methods: Fundamentals in single domains, New York: Springer-Verlag, 2006. doi: 10.1007/978-3-540-30726-6. |
[18] | B. C. Arnold, N. Balakrishnan, H. N. Nagaraja, A first course in order statistics, Society of Industrial and Applied Mathematics, 1992. doi: 10.1137/1.9780898719062. |
[19] | H. A. David, H. A. Nagaraja, Order statistics, Hoboken (NJ): John Wiley & Sons, 2003. |
[20] | H. K. Yuen, S. K. Tse, Parameters estimation for Weibull distributed lifetimes under progressive censoring with random removals, J. Stat. Comput. Simul., 55 (1996), 57–71. doi: 10.1080/00949659608811749. |
[21] | H. A. Zeinab, Bayesian inference for the Pareto lifetime model under progressive censoring with binomial removals, J. Appl. Stat., 35 (2008), 1203–1217. doi: 10.1080/09537280802187634. |
[22] | N. Balakrishnan, R. Aggarwala, Progressive censoring: Theory, methods, and applications, Boston: Birkhäuser, 2000. |
[23] | A. Zellner, Bayesian estimation and prediction using asymmetric loss function, J. Am. Stat. Assoc., 81 (1986), 446–451. doi: 10.1080/01621459.1986.10478289. |
[24] | R. Srivastava, V. Tanna, An estimation procedure for error variance incorporating PTS for random effects model under LINEX loss function, Commun. Stat. Theor. Meth., 30 (2001), 2583–2599. doi: 10.1081/STA-100108449. |
[25] | H. R. Varian, Bayesian approach to real estate assessment, In: L. J. Savage, S. E. Fienberg, A. Zellner, Studies in Bayesian econometrics and statistics, North-Holland, Amsterdam, (1975), 195–208. |
[26] | R. Calabria, G. Pulcini, An engineering approach to Bayes estimation for the Weibull distribution, Microelectron. Reliab., 34 (1994), 789–802. doi: 10.1016/0026-2714(94)90004-3. |
[27] | S. K. Upadhyay, A. A. Gupta, Bayes analysis of modified Weibull distribution via Markov chain Monte Carlo simulation, J. Stat. Comput. Simul., 80 (2010), 241–254. doi: 10.1080/00949650802600730. |
[28] | B. Al-Zahrani, H. Sagor, The Poisson-Lomax distribution, Rev. Colomb. Estad., 37 (2014), 223–243. doi: 10.15446/rce.v37n1.44369. |
[29] | I. B. Abdul-Moniem, H. F. Abdel-Hameed, On exponentiated Lomax distribution, Int. J. Math. Educ., 3 (2012), 2144–2150. |
[30] | V. Choulakian, M. A. Stephens, Goodness-of-fit for the generalized Pareto distribution, Technometrics, 43 (2001), 478–484. doi: 10.1198/00401700152672573. |
[31] | R. D. Gupta, D. Kundu, A new class of weighted exponential distributions, Statistics, 43 (2009), 621–634. doi: 10.1080/02331880802605346. |
[32] | N. Balakrishnan, R. A. Sandhu, A simple simulation algorithm for generating progressive type-II censored samples, Am. Stat., 49 (1995), 229–230. |
[33] | J. S. Maritz, T. Lwin, Empirical Bayes methods, 2 Eds, London: Chapman and Hall/CRC, 1989. doi: 10.1201/9781351071666. |
[34] | J. M. Bernardo, A. F. M. Smith, Bayesian theory, New York: John Wiley & Sons, 1994. |
The first data set | |||||||||
Model | \hat{\alpha} | \hat{\gamma} | \hat{\theta} | \hat{\lambda} | K-S | p-value | AIC | CAIC | BIC |
PGLD | 1.40525 | 1.34963 | 0.94075 | 0.76069 | 0.109102 | 0.35816 | 522.134 | 522.731 | 531.241 |
PLD | 1.12565 | 5.44626 | --- | 0.07480 | 0.178738 | 0.02010 | 525.967 | 526.320 | 532.797 |
GLD | 1.00741 | 2.49201 | 0.60639 | --- | 0.111083 | 0.33669 | 525.289 | 525.642 | 532.119 |
ELD | 0.66113 | 1.19104 | --- | 1.59821 | 0.165565 | 0.03861 | 536.672 | 537.025 | 543.502 |
LD | 1.10933 | 4.34354 | --- | --- | 0.233859 | 0.00076 | 529.953 | 530.127 | 534.506 |
The second data set | |||||||||
Model | \hat{\alpha} | \hat{\gamma} | \hat{\theta} | \hat{\lambda} | K-S | p-value | AIC | CAIC | BIC |
PGLD | 2.49409 | 0.63477 | 0.99986 | 0.20566 | 0.05039 | 0.99971 | 402.596 | 403.526 | 410.08 |
PLD | 149.855 | 3788.06 | --- | 0.72799 | 0.11165 | 0.58773 | 412.330 | 412.875 | 417.943 |
GLD | 1.42011 | 5.41074 | 0.84701 | --- | 0.21077 | 0.02811 | 425.597 | 426.143 | 431.211 |
ELD | 0.27741 | 2.35554 | --- | 0.79642 | 0.31822 | 0.00012 | 498.599 | 499.145 | 504.213 |
LD | 22.3092 | 552.211 | --- | --- | 0.19730 | 0.04765 | 413.502 | 413.769 | 417.244 |
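The selection criteria in the table above (smaller values indicate a better fit) follow the standard definitions, with CAIC matching the small-sample corrected AIC. In the sketch below (helper names are ours), plugging in a maximized log-likelihood of about -257.067 with k = 4 parameters and n = 72 (n inferred from the AIC/BIC arithmetic for the first data set) reproduces the PGLD row: AIC \approx 522.134, CAIC \approx 522.731, BIC \approx 531.241.

```python
import math

# Standard information criteria used in the model-comparison table
# (names are ours).  loglik = maximized log-likelihood, k = number of
# estimated parameters, n = sample size; smaller values are better.

def aic(loglik, k):
    return -2 * loglik + 2 * k

def caic(loglik, k, n):
    # small-sample corrected AIC: AIC plus an extra penalty term
    return aic(loglik, k) + 2 * k * (k + 1) / (n - k - 1)

def bic(loglik, k, n):
    return -2 * loglik + k * math.log(n)
```

The K-S statistic and its p-value in the same table compare the fitted CDF against the empirical CDF of each data set.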
n_{1} | w_{1} | p=0.15 | p=0.55 | p=0.95 | |||||||||||
. | . | \overline{\hat{\alpha}} | MSE(\hat{\alpha}) | RAB(\hat{\alpha}) | MMSE | \overline{\hat{\alpha}} | MSE(\hat{\alpha}) | RAB(\hat{\alpha}) | MMSE | \overline{\hat{\alpha}} | MSE(\hat{\alpha}) | RAB(\hat{\alpha}) | MMSE | ||
. | . | \overline{\hat{\lambda}} | MSE(\hat{\lambda}) | RAB(\hat{\lambda}) | MRAB | \overline{\hat{\lambda}} | MSE(\hat{\lambda}) | RAB(\hat{\lambda}) | MRAB | \overline{\hat{\lambda}} | MSE(\hat{\lambda}) | RAB(\hat{\lambda}) | MRAB | ||
. | . | \overline{\hat{\theta}} | MSE(\hat{\theta}) | RAB(\hat{\theta}) | \overline{\hat{\theta}} | MSE(\hat{\theta}) | RAB(\hat{\theta}) | \overline{\hat{\theta}} | MSE(\hat{\theta}) | RAB(\hat{\theta}) | |||||
\hbar | N | n_{\hbar} | w_{\hbar} | \overline{\hat{c}} | MSE(\hat{c}) | RAB(\hat{c}) | \overline{\hat{c}} | MSE(\hat{c}) | RAB(\hat{c}) | \overline{\hat{c}} | MSE(\hat{c}) | RAB(\hat{c}) | |||
2 | 80 | 40 | 20 | 1.7692 | 0.4273 | 0.2736 | 0.3766 | 1.7793 | 0.4334 | 0.2855 | 0.4536 | 1.7900 | 0.4439 | 0.2821 | 0.4166 |
40 | 20 | 2.2500 | 1.3743 | 0.3825 | 0.3156 | 2.5275 | 1.7338 | 0.4473 | 0.3599 | 2.4325 | 1.5187 | 0.4291 | 0.3729 | ||
0.8399 | 0.0247 | 0.1060 | 0.8214 | 0.0323 | 0.1297 | 0.8232 | 0.0338 | 0.1322 | |||||||
0.2503 | 0.0567 | 0.8159 | 0.2491 | 0.0684 | 0.9367 | 0.2703 | 0.0868 | 1.0212 | |||||||
32 | 1.7112 | 0.2631 | 0.2218 | 0.4094 | 1.7118 | 0.2594 | 0.2231 | 0.4408 | 1.7260 | 0.2935 | 0.2333 | 0.4465 | |||
32 | 2.4709 | 1.6977 | 0.4517 | 0.3352 | 2.4682 | 1.8436 | 0.4612 | 0.3533 | 2.4738 | 1.8411 | 0.4750 | 0.3548 | |||
0.8411 | 0.0218 | 0.1107 | 0.8308 | 0.0274 | 0.1234 | 0.8205 | 0.0298 | 0.1328 | |||||||
0.2607 | 0.0643 | 0.8917 | 0.2649 | 0.0735 | 0.9590 | 0.2546 | 0.0682 | 0.9331 | |||||||
140 | 70 | 35 | 1.7235 | 0.2732 | 0.2152 | 0.3939 | 1.7104 | 0.2902 | 0.2283 | 0.4677 | 1.6972 | 0.2575 | 0.2121 | 0.4528 | |
70 | 35 | 2.5368 | 1.6200 | 0.4369 | 0.3218 | 2.5238 | 1.9496 | 0.4756 | 0.3508 | 2.5618 | 1.9192 | 0.4820 | 0.3414 | ||
0.8555 | 0.0153 | 0.0934 | 0.8402 | 0.0231 | 0.1128 | 0.8418 | 0.0224 | 0.1137 | |||||||
0.2588 | 0.0612 | 0.8638 | 0.2685 | 0.0756 | 0.9375 | 0.2636 | 0.0650 | 0.8993 | |||||||
56 | 1.7144 | 0.2323 | 0.2087 | 0.4615 | 1.6803 | 0.2312 | 0.2047 | 0.4594 | 1.6890 | 0.2324 | 0.2054 | 0.5033 | |||
56 | 2.5501 | 1.9964 | 0.4725 | 0.3284 | 2.4981 | 1.9889 | 0.4798 | 0.3312 | 2.5337 | 2.1999 | 0.4905 | 0.3413 | |||
0.8505 | 0.0184 | 0.1038 | 0.8517 | 0.0177 | 0.1013 | 0.8474 | 0.0201 | 0.1077 | |||||||
0.2606 | 0.0606 | 0.8572 | 0.2670 | 0.0593 | 0.8702 | 0.2702 | 0.0641 | 0.9027 | |||||||
4 | 80 | 20 | 10 | 1.7224 | 0.3564 | 0.2653 | 0.3963 | 1.7933 | 0.4206 | 0.2728 | 0.4094 | 1.7689 | 0.4255 | 0.2812 | 0.4622 |
20 | 10 | 2.1620 | 1.5467 | 0.4146 | 0.3086 | 2.4281 | 1.5124 | 0.4312 | 0.3627 | 2.5324 | 1.7754 | 0.4676 | 0.3609 | ||
20 | 10 | 0.8276 | 0.0223 | 0.1074 | 0.8293 | 0.0285 | 0.1225 | 0.8083 | 0.0379 | 0.1416 | |||||
20 | 10 | 0.2309 | 0.0559 | 0.7555 | 0.2638 | 0.0856 | 0.9872 | 0.2405 | 0.0723 | 0.9142 | |||||
16 | 1.7504 | 0.3104 | 0.2411 | 0.3767 | 1.7214 | 0.2853 | 0.2379 | 0.4418 | 1.6927 | 0.2635 | 0.2251 | 0.4439 | |||
16 | 2.4045 | 1.4904 | 0.4204 | 0.3243 | 2.6169 | 1.8321 | 0.4668 | 0.3464 | 2.4949 | 1.8477 | 0.4627 | 0.3608 | |||
16 | 0.8357 | 0.0229 | 0.1121 | 0.8406 | 0.0222 | 0.1121 | 0.8407 | 0.0263 | 0.1165 | ||||||
16 | 0.2460 | 0.0595 | 0.8481 | 0.2565 | 0.0694 | 0.9153 | 0.2862 | 0.0817 | 0.9996 | ||||||
140 | 35 | 18 | 1.6976 | 0.2483 | 0.2171 | 0.3170 | 1.7063 | 0.2604 | 0.2181 | 0.4347 | 1.7152 | 0.2896 | 0.2212 | 0.4394 | |
35 | 18 | 2.3230 | 1.2643 | 0.3762 | 0.2965 | 2.4760 | 1.8169 | 0.4667 | 0.3469 | 2.5477 | 1.8211 | 0.4638 | 0.3350 | ||
35 | 18 | 0.8554 | 0.0137 | 0.0869 | 0.8357 | 0.0240 | 0.1178 | 0.8412 | 0.0219 | 0.1093 | |||||
35 | 18 | 0.2548 | 0.0588 | 0.8022 | 0.2669 | 0.0723 | 0.9320 | 0.2561 | 0.0645 | 0.8805 | |||||
28 | 1.6775 | 0.1812 | 0.1846 | 0.4331 | 1.6684 | 0.1758 | 0.1888 | 0.4781 | 1.6867 | 0.1760 | 0.1864 | 0.4443 | |||
28 | 2.5388 | 1.9152 | 0.4670 | 0.3084 | 2.5493 | 2.1323 | 0.4914 | 0.3330 | 2.4572 | 1.9641 | 0.4774 | 0.3251 | |||
28 | 0.8499 | 0.0179 | 0.0998 | 0.8572 | 0.0176 | 0.0985 | 0.8508 | 0.0179 | 0.1009 | ||||||
28 | 0.2490 | 0.0510 | 0.7907 | 0.2744 | 0.0648 | 0.8863 | 0.2647 | 0.0633 | 0.8609 |
n_{1} | m_{1} | BEL | BEG | |||||||||||||||||
. | . | \nu=-0.5 | \nu=0.5 | \nu=-0.5 | \nu=0.5 | |||||||||||||||
. | . | \overline{\tilde{\alpha}} | MSE(\tilde{\alpha}) | RAB(\tilde{\alpha}) | MMSE | \overline{\tilde{\alpha}} | MSE(\tilde{\alpha}) | RAB(\tilde{\alpha}) | MMSE | \overline{\ddot{\alpha}} | MSE(\ddot{\alpha}) | RAB(\ddot{\alpha}) | MMSE | \overline{\ddot{\alpha}} | MSE(\ddot{\alpha}) | RAB(\ddot{\alpha}) | MMSE | |||
. | . | \overline{\tilde{\lambda}} | MSE(\tilde{\lambda}) | RAB(\tilde{\lambda}) | MRAB | \overline{\tilde{\lambda}} | MSE(\tilde{\lambda}) | RAB(\tilde{\lambda}) | MRAB | \overline{\ddot{\lambda}} | MSE(\ddot{\lambda}) | RAB(\ddot{\lambda}) | MRAB | \overline{\ddot{\lambda}} | MSE(\ddot{\lambda}) | RAB(\ddot{\lambda}) | MRAB | |||
. | . | \overline{\tilde{\theta}} | MSE(\tilde{\theta}) | RAB(\tilde{\theta}) | \overline{\tilde{\theta}} | MSE(\tilde{\theta}) | RAB(\tilde{\theta}) | \overline{\ddot{\theta}} | MSE(\ddot{\theta}) | RAB(\ddot{\theta}) | \overline{\ddot{\theta}} | MSE(\ddot{\theta}) | RAB(\ddot{\theta}) | |||||||
\hbar | N | n_{\hbar} | m_{\hbar} | p | \overline{\tilde{c}} | MSE(\tilde{c}) | RAB(\tilde{c}) | \overline{\tilde{c}} | MSE(\tilde{c}) | RAB(\tilde{c}) | \overline{\ddot{c}} | MSE(\ddot{c}) | RAB(\ddot{c}) | \overline{\ddot{c}} | MSE(\ddot{c}) | RAB(\ddot{c}) | ||||
2 | 80 | 40 | 20 | 0.15 | 1.6520 | 0.1912 | 0.1819 | 0.0505 | 1.5559 | 0.1342 | 0.1819 | 0.0369 | 1.5742 | 0.1662 | 0.0161 | 0.0448 | 1.5138 | 0.1567 | 0.0539 | 0.0433 |
40 | 20 | 2.2385 | 0.0042 | 0.0268 | 0.0790 | 2.2262 | 0.0058 | 0.0321 | 0.0819 | 2.2296 | 0.0053 | 0.0306 | 0.0356 | 2.2241 | 0.0061 | 0.0330 | 0.0593 | |||
0.8493 | 0.0064 | 0.0675 | 0.8444 | 0.0073 | 0.0723 | 0.8432 | 0.0076 | 0.0631 | 0.8334 | 0.0098 | 0.0740 | |||||||||
0.1990 | 0.0002 | 0.0398 | 0.1970 | 0.0002 | 0.0414 | 0.1935 | 0.0003 | 0.0325 | 0.1847 | 0.0004 | 0.0764 | |||||||||
0.55 | 1.6766 | 0.1464 | 0.1786 | 0.0391 | 1.5945 | 0.1086 | 0.1786 | 0.0302 | 1.6098 | 0.1170 | 0.0061 | 0.0323 | 1.5596 | 0.1139 | 0.0253 | 0.0322 | ||||
2.2386 | 0.0040 | 0.0267 | 0.0775 | 2.2264 | 0.0057 | 0.0320 | 0.0801 | 2.2297 | 0.0052 | 0.0306 | 0.0317 | 2.2242 | 0.0060 | 0.0329 | 0.0506 | |||||
0.8556 | 0.0056 | 0.0634 | 0.8511 | 0.0064 | 0.0676 | 0.8502 | 0.0066 | 0.0554 | 0.8418 | 0.0084 | 0.0647 | |||||||||
0.1986 | 0.0003 | 0.0411 | 0.1966 | 0.0002 | 0.0420 | 0.1930 | 0.0002 | 0.0349 | 0.1841 | 0.0004 | 0.0796 | |||||||||
0.95 | 1.6936 | 0.2794 | 0.1926 | 0.0728 | 1.5992 | 0.1265 | 0.1926 | 0.0351 | 1.6212 | 0.1806 | 0.0132 | 0.0486 | 1.5665 | 0.1388 | 0.0210 | 0.0387 | ||||
2.2370 | 0.0045 | 0.0274 | 0.0833 | 2.2247 | 0.0062 | 0.0327 | 0.0856 | 2.2281 | 0.0057 | 0.0313 | 0.0315 | 2.2226 | 0.0065 | 0.0337 | 0.0482 | |||||
0.8545 | 0.0058 | 0.0638 | 0.8501 | 0.0066 | 0.0680 | 0.8491 | 0.0069 | 0.0566 | 0.8406 | 0.0087 | 0.0660 | |||||||||
0.2009 | 0.0015 | 0.0495 | 0.1987 | 0.0012 | 0.0491 | 0.1950 | 0.0011 | 0.0250 | 0.1856 | 0.0007 | 0.0722 | |||||||||
32 | 0.15 | 1.6671 | 0.1794 | 0.1567 | 0.0477 | 1.6099 | 0.1384 | 0.1567 | 0.0379 | 1.6220 | 0.1559 | 0.0137 | 0.0422 | 1.5873 | 0.1452 | 0.0079 | 0.0400 | |||
32 | 2.2369 | 0.0050 | 0.0274 | 0.0744 | 2.2245 | 0.0067 | 0.0328 | 0.0765 | 2.2279 | 0.0062 | 0.0313 | 0.0278 | 2.2224 | 0.0071 | 0.0337 | 0.0395 | ||||
0.8637 | 0.0041 | 0.0533 | 0.8607 | 0.0046 | 0.0561 | 0.8601 | 0.0047 | 0.0443 | 0.8549 | 0.0057 | 0.0501 | |||||||||
0.2012 | 0.0022 | 0.0603 | 0.1992 | 0.0021 | 0.0605 | 0.1956 | 0.0020 | 0.0218 | 0.1867 | 0.0019 | 0.0663 | |||||||||
0.55 | 1.6684 | 0.0972 | 0.1476 | 0.0265 | 1.6131 | 0.0793 | 0.1476 | 0.0225 | 1.6235 | 0.0837 | 0.0147 | 0.0235 | 1.5901 | 0.0810 | 0.0062 | 0.0233 | ||||
2.2386 | 0.0041 | 0.0267 | 0.0697 | 2.2263 | 0.0058 | 0.0321 | 0.0717 | 2.2297 | 0.0053 | 0.0306 | 0.0297 | 2.2241 | 0.0061 | 0.0330 | 0.0409 | |||||
0.8654 | 0.0040 | 0.0523 | 0.8623 | 0.0045 | 0.0551 | 0.8617 | 0.0046 | 0.0425 | 0.8562 | 0.0056 | 0.0486 | |||||||||
0.1995 | 0.0006 | 0.0522 | 0.1974 | 0.0004 | 0.0520 | 0.1938 | 0.0004 | 0.0311 | 0.1848 | 0.0004 | 0.0760 | |||||||||
0.95 | 1.6736 | 0.1876 | 0.1626 | 0.0493 | 1.6072 | 0.1008 | 0.1626 | 0.0281 | 1.6207 | 0.1193 | 0.0129 | 0.0326 | 1.5849 | 0.1060 | 0.0094 | 0.0297 | ||||
2.2375 | 0.0044 | 0.0272 | 0.0756 | 2.2250 | 0.0061 | 0.0326 | 0.0777 | 2.2285 | 0.0056 | 0.0311 | 0.0284 | 2.2229 | 0.0065 | 0.0335 | 0.0409 | |||||
0.8645 | 0.0041 | 0.0546 | 0.8614 | 0.0046 | 0.0576 | 0.8608 | 0.0047 | 0.0436 | 0.8553 | 0.0057 | 0.0497 | |||||||||
0.2004 | 0.0009 | 0.0583 | 0.1984 | 0.0008 | 0.0581 | 0.1948 | 0.0008 | 0.0259 | 0.1858 | 0.0006 | 0.0708 | |||||||||
2 | 140 | 70 | 35 | 0.15 | 1.6590 | 0.1425 | 0.1527 | 0.0379 | 1.5986 | 0.0848 | 0.1527 | 0.0239 | 1.6106 | 0.0964 | 0.0066 | 0.0267 | 1.5760 | 0.0870 | 0.0150 | 0.0248 |
70 | 35 | 2.2388 | 0.0042 | 0.0266 | 0.0732 | 2.2264 | 0.0059 | 0.0320 | 0.0750 | 2.2298 | 0.0054 | 0.0305 | 0.0261 | 2.2242 | 0.0063 | 0.0329 | 0.0411 | |||
0.8654 | 0.0037 | 0.0501 | 0.8627 | 0.0040 | 0.0527 | 0.8622 | 0.0041 | 0.0420 | 0.8575 | 0.0050 | 0.0472 | |||||||||
0.2006 | 0.0011 | 0.0634 | 0.1985 | 0.0010 | 0.0628 | 0.1950 | 0.0010 | 0.0251 | 0.1861 | 0.0009 | 0.0694 | |||||||||
0.55 | 1.6450 | 0.1210 | 0.1422 | 0.0325 | 1.5941 | 0.0912 | 0.1422 | 0.0256 | 1.6039 | 0.0985 | 0.0025 | 0.0273 | 1.5735 | 0.0942 | 0.0166 | 0.0266 | ||||
2.2374 | 0.0046 | 0.0272 | 0.0697 | 2.2249 | 0.0063 | 0.0326 | 0.0717 | 2.2283 | 0.0058 | 0.0312 | 0.0255 | 2.2228 | 0.0066 | 0.0336 | 0.0420 | |||||
0.8664 | 0.0035 | 0.0486 | 0.8637 | 0.0038 | 0.0511 | 0.8632 | 0.0039 | 0.0409 | 0.8586 | 0.0047 | 0.0460 | |||||||||
0.2000 | 0.0012 | 0.0606 | 0.1980 | 0.0010 | 0.0606 | 0.1945 | 0.0010 | 0.0276 | 0.1857 | 0.0010 | 0.0717 | |||||||||
0.95 | 1.6732 | 0.1466 | 0.1516 | 0.0392 | 1.6171 | 0.1051 | 0.1516 | 0.0292 | 1.6285 | 0.1147 | 0.0178 | 0.0315 | 1.5960 | 0.1067 | 0.0025 | 0.0298 | ||||
2.2373 | 0.0045 | 0.0273 | 0.0743 | 2.2249 | 0.0062 | 0.0327 | 0.0758 | 2.2283 | 0.0057 | 0.0312 | 0.0279 | 2.2227 | 0.0065 | 0.0336 | 0.0381 | |||||
0.8660 | 0.0038 | 0.0504 | 0.8631 | 0.0042 | 0.0530 | 0.8626 | 0.0043 | 0.0416 | 0.8577 | 0.0052 | 0.0470 | |||||||||
0.2018 | 0.0018 | 0.0681 | 0.1995 | 0.0013 | 0.0660 | 0.1958 | 0.0012 | 0.0211 | 0.1861 | 0.0007 | 0.0693 | |||||||||
56 | 0.15 | 1.6251 | 0.0461 | 0.1064 | 0.0135 | 1.5991 | 0.0443 | 0.1064 | 0.0135 | 1.6038 | 0.0451 | 0.0024 | 0.0136 | 1.5871 | 0.0459 | 0.0081 | 0.0140 | |||
56 | 2.2382 | 0.0047 | 0.0275 | 0.0676 | 2.2258 | 0.0061 | 0.0327 | 0.0693 | 2.2292 | 0.0057 | 0.0308 | 0.0286 | 2.2237 | 0.0064 | 0.0332 | 0.0354 | ||||
0.8732 | 0.0026 | 0.0437 | 0.8717 | 0.0027 | 0.0451 | 0.8715 | 0.0028 | 0.0317 | 0.8694 | 0.0031 | 0.0340 | |||||||||
0.1920 | 0.0007 | 0.0926 | 0.1914 | 0.0007 | 0.0929 | 0.1901 | 0.0007 | 0.0497 | 0.1867 | 0.0008 | 0.0665 | |||||||||
0.55 | 1.6321 | 0.0503 | 0.1114 | 0.0144 | 1.6059 | 0.0477 | 0.1114 | 0.0143 | 1.6108 | 0.0486 | 0.0067 | 0.0144 | 1.5943 | 0.0486 | 0.0036 | 0.0147 | ||||
2.2383 | 0.0041 | 0.0268 | 0.0688 | 2.2261 | 0.0058 | 0.0321 | 0.0707 | 2.2295 | 0.0053 | 0.0307 | 0.0280 | 2.2240 | 0.0061 | 0.0330 | 0.0325 | |||||
0.8742 | 0.0027 | 0.0441 | 0.8727 | 0.0028 | 0.0455 | 0.8725 | 0.0029 | 0.0305 | 0.8703 | 0.0032 | 0.0330 | |||||||||
0.1931 | 0.0007 | 0.0930 | 0.1925 | 0.0007 | 0.0937 | 0.1912 | 0.0007 | 0.0439 | 0.1879 | 0.0008 | 0.0603 | |||||||||
0.95 | 1.6471 | 0.0524 | 0.1129 | 0.0150 | 1.6211 | 0.0491 | 0.1129 | 0.0146 | 1.6260 | 0.0501 | 0.0163 | 0.0147 | 1.6098 | 0.0497 | 0.0061 | 0.0149 | ||||
2.2388 | 0.0041 | 0.0266 | 0.0692 | 2.2266 | 0.0057 | 0.0319 | 0.0708 | 2.2299 | 0.0052 | 0.0305 | 0.0310 | 2.2245 | 0.0060 | 0.0328 | 0.0339 | |||||
0.8754 | 0.0026 | 0.0438 | 0.8739 | 0.0028 | 0.0451 | 0.8737 | 0.0028 | 0.0292 | 0.8716 | 0.0031 | 0.0316 | |||||||||
0.1924 | 0.0009 | 0.0935 | 0.1917 | 0.0008 | 0.0934 | 0.1904 | 0.0008 | 0.0481 | 0.1870 | 0.0008 | 0.0652 | |||||||||
4 | 80 | 20 | 10 | 0.15 | 1.6816 | 0.2985 | 0.2183 | 0.0779 | 1.5568 | 0.1760 | 0.2183 | 0.0480 | 1.5815 | 0.2294 | 0.0116 | 0.0613 | 1.5056 | 0.2066 | 0.0590 | 0.0566 |
20 | 10 | 2.2379 | 0.0047 | 0.0270 | 0.0922 | 2.2255 | 0.0064 | 0.0324 | 0.0951 | 2.2289 | 0.0059 | 0.0309 | 0.0344 | 2.2234 | 0.0067 | 0.0333 | 0.0617 | |||
20 | 10 | 0.8448 | 0.0075 | 0.0751 | 0.8391 | 0.0087 | 0.0808 | 0.8376 | 0.0091 | 0.0693 | 0.8255 | 0.0122 | 0.0827 | |||||||
20 | 10 | 0.2006 | 0.0010 | 0.0484 | 0.1986 | 0.0009 | 0.0487 | 0.1949 | 0.0009 | 0.0256 | 0.1856 | 0.0008 | 0.0718 | |||||||
0.55 | 1.6668 | 0.1261 | 0.1730 | 0.0340 | 1.5845 | 0.1048 | 0.1730 | 0.0292 | 1.5992 | 0.1108 | 0.0005 | 0.0307 | 1.5476 | 0.1108 | 0.0328 | 0.0314 | ||||
2.2381 | 0.0041 | 0.0269 | 0.0753 | 2.2259 | 0.0057 | 0.0322 | 0.0780 | 2.2292 | 0.0053 | 0.0308 | 0.0303 | 2.2237 | 0.0061 | 0.0332 | 0.0523 | |||||
0.8549 | 0.0055 | 0.0628 | 0.8505 | 0.0063 | 0.0671 | 0.8495 | 0.0065 | 0.0561 | 0.8411 | 0.0083 | 0.0654 | |||||||||
0.1987 | 0.0001 | 0.0385 | 0.1968 | 0.0001 | 0.0399 | 0.1932 | 0.0001 | 0.0340 | 0.1844 | 0.0003 | 0.0779 | |||||||||
0.95 | 1.6906 | 0.2507 | 0.1933 | 0.0656 | 1.5998 | 0.1302 | 0.1933 | 0.0360 | 1.6194 | 0.1676 | 0.0121 | 0.0453 | 1.5667 | 0.1410 | 0.0208 | 0.0393 | ||||
2.2378 | 0.0044 | 0.0271 | 0.0835 | 2.2254 | 0.0061 | 0.0324 | 0.0858 | 2.2288 | 0.0056 | 0.0310 | 0.0302 | 2.2233 | 0.0064 | 0.0334 | 0.0471 | |||||
0.8568 | 0.0059 | 0.0631 | 0.8524 | 0.0066 | 0.0670 | 0.8515 | 0.0069 | 0.0539 | 0.8431 | 0.0087 | 0.0632 | |||||||||
0.2011 | 0.0016 | 0.0507 | 0.1989 | 0.0013 | 0.0504 | 0.1952 | 0.0012 | 0.0238 | 0.1858 | 0.0009 | 0.0709 | |||||||||
16 | 0.15 | 1.6605 | 0.1649 | 0.1614 | 0.0439 | 1.5951 | 0.1106 | 0.1614 | 0.0308 | 1.6083 | 0.1254 | 0.0052 | 0.0344 | 1.5705 | 0.1164 | 0.0185 | 0.0326 | |||
16 | 2.2374 | 0.0048 | 0.0272 | 0.0770 | 2.2249 | 0.0066 | 0.0327 | 0.0790 | 2.2283 | 0.0061 | 0.0312 | 0.0271 | 2.2227 | 0.0069 | 0.0336 | 0.0442 | ||||
16 | 0.8620 | 0.0042 | 0.0546 | 0.8587 | 0.0047 | 0.0577 | 0.8581 | 0.0049 | 0.0466 | 0.8523 | 0.0060 | 0.0530 | ||||||||
16 | 0.2007 | 0.0016 | 0.0648 | 0.1986 | 0.0014 | 0.0643 | 0.1949 | 0.0014 | 0.0255 | 0.1856 | 0.0012 | 0.0718 | ||||||||
0.55 | 1.6862 | 0.2183 | 0.1632 | 0.0570 | 1.6170 | 0.1042 | 0.1632 | 0.0290 | 1.6330 | 0.1422 | 0.0206 | 0.0384 | 1.5955 | 0.1136 | 0.0028 | 0.0316 | ||||
2.2372 | 0.0046 | 0.0273 | 0.0761 | 2.2246 | 0.0063 | 0.0328 | 0.0781 | 2.2280 | 0.0058 | 0.0313 | 0.0315 | 2.2224 | 0.0067 | 0.0337 | 0.0406 | |||||
0.8645 | 0.0040 | 0.0524 | 0.8614 | 0.0045 | 0.0552 | 0.8609 | 0.0046 | 0.0435 | 0.8556 | 0.0056 | 0.0493 | |||||||||
0.1997 | 0.0011 | 0.0616 | 0.1975 | 0.0009 | 0.0612 | 0.1939 | 0.0008 | 0.0306 | 0.1847 | 0.0007 | 0.0765 | |||||||||
0.95 | 1.6709 | 0.1323 | 0.1523 | 0.0353 | 1.6149 | 0.1007 | 0.1523 | 0.0279 | 1.6261 | 0.1143 | 0.0163 | 0.0312 | 1.5927 | 0.1076 | 0.0046 | 0.0300 | ||||
2.2373 | 0.0044 | 0.0273 | 0.0709 | 2.2249 | 0.0061 | 0.0327 | 0.0732 | 2.2283 | 0.0056 | 0.0312 | 0.0314 | 2.2227 | 0.0065 | 0.0336 | 0.0415 | |||||
0.8652 | 0.0040 | 0.0512 | 0.8621 | 0.0045 | 0.0540 | 0.8615 | 0.0046 | 0.0427 | 0.8562 | 0.0056 | 0.0486 | |||||||||
0.1985 | 0.0005 | 0.0529 | 0.1965 | 0.0004 | 0.0537 | 0.1929 | 0.0004 | 0.0353 | 0.1841 | 0.0005 | 0.0794 | |||||||||
4 | 140 | 35 | 18 | 0.15 | 1.6763 | 0.2148 | 0.1731 | 0.0564 | 1.6019 | 0.1085 | 0.1731 | 0.0303 | 1.6175 | 0.1303 | 0.0109 | 0.0356 | 1.5767 | 0.1137 | 0.0145 | 0.0318 |
35 | 18 | 2.2362 | 0.0048 | 0.0278 | 0.0825 | 2.2236 | 0.0067 | 0.0332 | 0.0841 | 2.2270 | 0.0061 | 0.0317 | 0.0269 | 2.2213 | 0.0071 | 0.0342 | 0.0416 | |||
35 | 18 | 0.8644 | 0.0041 | 0.0524 | 0.8615 | 0.0045 | 0.0551 | 0.8609 | 0.0046 | 0.0434 | 0.8559 | 0.0056 | 0.0491 | |||||||
35 | 18 | 0.2016 | 0.0022 | 0.0768 | 0.1993 | 0.0017 | 0.0750 | 0.1957 | 0.0016 | 0.0216 | 0.1863 | 0.0010 | 0.0685 | |||||||
0.55 | 1.6899 | 0.1523 | 0.1532 | 0.0403 | 1.6366 | 0.0997 | 0.1532 | 0.0277 | 1.6488 | 0.1222 | 0.0305 | 0.0332 | 1.6177 | 0.1049 | 0.0111 | 0.0292 | ||||
2.2373 | 0.0047 | 0.0273 | 0.0727 | 2.2248 | 0.0064 | 0.0327 | 0.0745 | 2.2282 | 0.0059 | 0.0312 | 0.0304 | 2.2227 | 0.0068 | 0.0336 | 0.0383 | |||||
0.8710 | 0.0032 | 0.0467 | 0.8685 | 0.0035 | 0.0488 | 0.8682 | 0.0035 | 0.0354 | 0.8643 | 0.0042 | 0.0397 | |||||||||
0.2006 | 0.0012 | 0.0638 | 0.1986 | 0.0010 | 0.0634 | 0.1951 | 0.0010 | 0.0246 | 0.1862 | 0.0009 | 0.0688 | |||||||||
0.95 | 1.6470 | 0.1052 | 0.1398 | 0.0284 | 1.5957 | 0.0697 | 0.1398 | 0.0200 | 1.6053 | 0.0744 | 0.0033 | 0.0211 | 1.5757 | 0.0717 | 0.0152 | 0.0208 | ||||
2.2387 | 0.0041 | 0.0266 | 0.0689 | 2.2262 | 0.0058 | 0.0321 | 0.0707 | 2.2296 | 0.0053 | 0.0306 | 0.0264 | 2.2240 | 0.0061 | 0.0330 | 0.0424 | |||||
0.8657 | 0.0036 | 0.0495 | 0.8630 | 0.0040 | 0.0520 | 0.8625 | 0.0041 | 0.0417 | 0.8580 | 0.0048 | 0.0466 | |||||||||
0.1997 | 0.0007 | 0.0596 | 0.1976 | 0.0006 | 0.0589 | 0.1940 | 0.0005 | 0.0301 | 0.1850 | 0.0005 | 0.0749 | |||||||||
28 | 0.15 | 1.6453 | 0.0505 | 0.1087 | 0.0144 | 1.6178 | 0.0469 | 0.1087 | 0.0140 | 1.6230 | 0.0480 | 0.0143 | 0.0141 | 1.6058 | 0.0475 | 0.0036 | 0.0143 | |||
28 | 2.2397 | 0.0039 | 0.0262 | 0.0682 | 2.2273 | 0.0056 | 0.0316 | 0.0701 | 2.2307 | 0.0051 | 0.0301 | 0.0312 | 2.2252 | 0.0059 | 0.0325 | 0.0339 | ||||
28 | 0.8757 | 0.0024 | 0.0420 | 0.8742 | 0.0026 | 0.0433 | 0.8740 | 0.0026 | 0.0288 | 0.8720 | 0.0029 | 0.0311 | ||||||||
28 | 0.1915 | 0.0008 | 0.0961 | 0.1910 | 0.0008 | 0.0968 | 0.1897 | 0.0008 | 0.0516 | 0.1863 | 0.0009 | 0.0683 | ||||||||
0.55 | 1.6337 | 0.0485 | 0.1090 | 0.0139 | 1.6079 | 0.0459 | 0.1090 | 0.0137 | 1.6127 | 0.0468 | 0.0079 | 0.0138 | 1.5964 | 0.0467 | 0.0023 | 0.0141 | ||||
2.2386 | 0.0041 | 0.0267 | 0.0676 | 2.2265 | 0.0057 | 0.0320 | 0.0694 | 2.2298 | 0.0052 | 0.0305 | 0.0280 | 2.2244 | 0.0060 | 0.0329 | 0.0319 | |||||
0.8756 | 0.0024 | 0.0417 | 0.8741 | 0.0026 | 0.0431 | 0.8739 | 0.0027 | 0.0290 | 0.8717 | 0.0030 | 0.0315 | |||||||||
0.1929 | 0.0007 | 0.0930 | 0.1923 | 0.0007 | 0.0935 | 0.1911 | 0.0007 | 0.0447 | 0.1878 | 0.0008 | 0.0611 | |||||||||
0.95 | 1.6310 | 0.0469 | 0.1062 | 0.0135 | 1.6051 | 0.0445 | 0.1062 | 0.0134 | 1.6099 | 0.0453 | 0.0062 | 0.0135 | 1.5935 | 0.0454 | 0.0041 | 0.0138 | ||||
2.2386 | 0.0040 | 0.0267 | 0.0675 | 2.2264 | 0.0057 | 0.0320 | 0.0693 | 2.2298 | 0.0052 | 0.0305 | 0.0294 | 2.2243 | 0.0060 | 0.0329 | 0.0342 | |||||
0.8733 | 0.0026 | 0.0439 | 0.8718 | 0.0028 | 0.0453 | 0.8715 | 0.0028 | 0.0316 | 0.8693 | 0.0031 | 0.0341 | |||||||||
0.1921 | 0.0007 | 0.0934 | 0.1915 | 0.0007 | 0.0938 | 0.1902 | 0.0007 | 0.0491 | 0.1869 | 0.0008 | 0.0657 |
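The BEL and BEG columns above are Bayes estimates under the LINEX and general entropy loss functions with shape parameter \nu = \pm 0.5. A minimal sketch of how such estimates are typically extracted from posterior MCMC draws follows; the gamma draws below are purely illustrative stand-ins, not the posterior of the PGLD model, and the function names are ours:

```python
import numpy as np

def bayes_linex(samples, nu):
    # LINEX-loss Bayes estimate: -(1/nu) * log E[exp(-nu * theta)]
    return float(-np.log(np.mean(np.exp(-nu * samples))) / nu)

def bayes_gen_entropy(samples, nu):
    # General-entropy-loss Bayes estimate: (E[theta^(-nu)])^(-1/nu)
    return float(np.mean(samples ** (-nu)) ** (-1.0 / nu))

# Illustrative posterior draws only -- NOT the posterior of the PGLD model
rng = np.random.default_rng(0)
draws = rng.gamma(shape=4.0, scale=0.4, size=10_000)

for nu in (-0.5, 0.5):
    print(f"nu={nu:+.1f}  BEL={bayes_linex(draws, nu):.4f}  "
          f"BEG={bayes_gen_entropy(draws, nu):.4f}")
```

As \nu approaches 0 the LINEX estimate tends to the posterior mean, while \nu = -0.5 and \nu = 0.5 penalize under- and over-estimation asymmetrically, which is why the two \nu columns of the table bracket each other consistently.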
n_{1} | w_{1} | Credible interval | ||||||||
. | . | CI | Symmetric | HPD | ||||||
. | . | AIL(\alpha) | AIL(\alpha) | AIL(\alpha) | ||||||
. | . | AIL(\lambda) | AIL(\lambda) | AIL(\lambda) | ||||||
. | . | AIL(\theta) | AIL(\theta) | AIL(\theta) | ||||||
\hbar | N | n_{\hbar} | w_{\hbar} | p | AIL(c) | AIL(c) | AIL(c) | |||
2 | 80 | 40 | 20 | 0.15 | 3.4146 | 3.0673 | 1.6523 | 0.7144 | 1.5547 | 0.6651 |
40 | 20 | 9.3383 | 0.6109 | 0.5824 | ||||||
1.0622 | 0.3577 | 0.3046 | ||||||||
1.5215 | 0.2369 | 0.2186 | ||||||||
0.55 | 3.2483 | 3.0103 | 1.5301 | 0.6792 | 1.4479 | 0.6347 | ||||
9.0989 | 0.6125 | 0.5825 | ||||||||
1.1630 | 0.3370 | 0.2883 | ||||||||
1.5413 | 0.2372 | 0.2199 | ||||||||
0.95 | 3.2459 | 3.0090 | 1.5365 | 0.6822 | 1.4581 | 0.6387 | | | |
9.0252 | 0.6136 | 0.5840 | ||||||||
1.1423 | 0.3394 | 0.2911 | ||||||||
1.6318 | 0.2392 | 0.2214 | ||||||||
32 | 0.15 | 2.7872 | 2.7832 | 1.2546 | 0.5955 | 1.1861 | 0.5567 | |||
32 | 8.7304 | 0.6134 | 0.5833 | |||||||
1.0233 | 0.2796 | 0.2412 | ||||||||
1.3750 | 0.2345 | 0.2164 | ||||||||
0.55 | 2.8056 | 2.7834 | 1.2551 | 0.5965 | 1.1847 | 0.5568 | ||||
8.6419 | 0.6141 | 0.5831 | ||||||||
1.0474 | 0.2795 | 0.2408 | ||||||||
1.4224 | 0.2372 | 0.2188 | ||||||||
0.95 | 2.8068 | 2.7650 | 1.2616 | 0.5985 | 1.1935 | 0.5598 | | | |
8.5702 | 0.6150 | 0.5858 | ||||||||
1.0997 | 0.2818 | 0.2425 | ||||||||
1.3484 | 0.2357 | 0.2174 | ||||||||
2 | 140 | 70 | 35 | 0.15 | 2.8299 | 2.7787 | 1.2480 | 0.5913 | 1.1732 | 0.5512
70 | 35 | 8.8112 | 0.6169 | 0.5869 | ||||||
0.9287 | 0.2635 | 0.2278 | ||||||||
1.3237 | 0.2366 | 0.2171 | ||||||||
0.55 | 2.6886 | 2.7106 | 1.1792 | 0.5731 | 1.1093 | 0.5345 | ||||
8.5189 | 0.6155 | 0.5842 | ||||||||
0.9782 | 0.2632 | 0.2284 | ||||||||
1.3673 | 0.2346 | 0.2160 | ||||||||
0.95 | 2.7318 | 2.7344 | 1.2057 | 0.5815 | 1.1348 | 0.5427 | ||||
8.6176 | 0.6126 | 0.5838 | ||||||||
0.9898 | 0.2684 | 0.2324 | ||||||||
1.3326 | 0.2392 | 0.2198 | ||||||||
56 | 0.15 | 2.4536 | 2.5300 | 0.8753 | 0.4421 | 0.8257 | 0.4148 | |||
56 | 8.1685 | 0.6119 | 0.5802 | |||||||
0.8347 | 0.1885 | 0.1668 | ||||||||
1.1931 | 0.0926 | 0.0864 | ||||||||
0.55 | 2.3747 | 2.5060 | 0.8788 | 0.4427 | 0.8261 | 0.4143 | ||||
8.0651 | 0.6095 | 0.5785 | ||||||||
0.8548 | 0.1898 | 0.1667 | ||||||||
1.2352 | 0.0927 | 0.0858 | ||||||||
0.95 | 2.3900 | 2.5214 | 0.8759 | 0.4417 | 0.8271 | 0.4148 | ||||
8.1405 | 0.6118 | 0.5818 | ||||||||
0.8549 | 0.1877 | 0.1661 | ||||||||
1.2216 | 0.0913 | 0.0844 | ||||||||
4 | 80 | 20 | 10 | 0.15 | 3.5514 | 2.9879 | 1.8141 | 0.7625 | 1.7046 | 0.7083 |
20 | 10 | 9.0555 | 0.6117 | 0.5818 | ||||||
20 | 10 | 0.9926 | 0.3838 | 0.3253 | ||||||
20 | 10 | 1.3399 | 0.2405 | 0.2214 | ||||||
0.55 | 3.3072 | 3.0253 | 1.5485 | 0.6838 | 1.4686 | 0.6395 | ||||
9.1629 | 0.6124 | 0.5826 | ||||||||
1.1114 | 0.3375 | 0.2875 | ||||||||
1.5448 | 0.2368 | 0.2192 | ||||||||
0.95 | 3.3013 | 3.0365 | 1.5334 | 0.6802 | 1.4531 | 0.6366 | ||||
9.2126 | 0.6123 | 0.5846 | ||||||||
1.2443 | 0.3335 | 0.2853 | ||||||||
1.4244 | 0.2416 | 0.2234 | ||||||||
16 | 0.15 | 3.0018 | 2.8487 | 1.2939 | 0.6085 | 1.2209 | 0.5677 | |||
16 | 8.8203 | 0.6151 | 0.5846 | |||||||
16 | 1.0622 | 0.2872 | 0.2464 | |||||||
16 | 1.3593 | 0.2380 | 0.2187 | |||||||
0.55 | 2.7877 | 2.8122 | 1.2833 | 0.6045 | 1.2083 | 0.5638 | ||||
8.8808 | 0.6181 | 0.5875 | ||||||||
1.0125 | 0.2804 | 0.2416 | ||||||||
1.3797 | 0.2362 | 0.2180 | ||||||||
0.95 | 2.6809 | 2.7709 | 1.2539 | 0.5955 | 1.1762 | 0.5541 | ||||
8.6718 | 0.6139 | 0.5841 | ||||||||
1.0059 | 0.2782 | 0.2396 | ||||||||
1.4957 | 0.2359 | 0.2165 | ||||||||
4 | 140 | 35 | 18 | 0.15 | 2.9768 | 2.8485 | 1.3322 | 0.6132 | 1.2466 | 0.5702
35 | 18 | 8.9141 | 0.6170 | 0.5870 | ||||||
35 | 18 | 0.9581 | 0.2688 | 0.2318 | ||||||
35 | 18 | 1.3934 | 0.2349 | 0.2153 | ||||||
0.55 | 2.7722 | 2.7270 | 1.1958 | 0.5740 | 1.1280 | 0.5365 | | | |
8.5091 | 0.6177 | 0.5868 | ||||||||
0.9857 | 0.2487 | 0.2158 | ||||||||
1.3680 | 0.2336 | 0.2155 | ||||||||
0.95 | 2.7294 | 2.7466 | 1.1760 | 0.5731 | 1.1088 | 0.5356 | ||||
8.7286 | 0.6159 | 0.5867 | ||||||||
0.9769 | 0.2643 | 0.2290 | ||||||||
1.2979 | 0.2362 | 0.2177 | ||||||||
28 | 0.15 | 2.4586 | 2.5493 | 0.9017 | 0.4483 | 0.8475 | 0.4203 | |||
28 | 8.2247 | 0.6144 | 0.5847 | |||||||
28 | 0.8842 | 0.1858 | 0.1640 | |||||||
28 | 1.1792 | 0.0913 | 0.0849 | |||||||
0.55 | 2.3382 | 2.5347 | 0.8769 | 0.4407 | 0.8265 | 0.4129 | ||||
8.2832 | 0.6080 | 0.5768 | ||||||||
0.8100 | 0.1871 | 0.1651 | ||||||||
1.2424 | 0.0907 | 0.0834 | ||||||||
0.95 | 2.3738 | 2.5307 | 0.8765 | 0.4429 | 0.8254 | 0.4150 | ||||
8.1678 | 0.6098 | 0.5791 | ||||||||
0.8702 | 0.1914 | 0.1689 | ||||||||
1.2415 | 0.0937 | 0.0867 |
The first data set | |||||||||
Model | \hat{\alpha} | \hat{\gamma} | \hat{\theta} | \hat{\lambda} | K-S | p-value | AIC | CAIC | BIC |
PGLD | 1.40525 | 1.34963 | 0.94075 | 0.76069 | 0.109102 | 0.35816 | 522.134 | 522.731 | 531.241 |
PLD | 1.12565 | 5.44626 | --- | 0.07480 | 0.178738 | 0.02010 | 525.967 | 526.320 | 532.797 |
GLD | 1.00741 | 2.49201 | 0.60639 | --- | 0.111083 | 0.33669 | 525.289 | 525.642 | 532.119 |
ELD | 0.66113 | 1.19104 | --- | 1.59821 | 0.165565 | 0.03861 | 536.672 | 537.025 | 543.502 |
LD | 1.10933 | 4.34354 | --- | --- | 0.233859 | 0.00076 | 529.953 | 530.127 | 534.506 |
The second data set | |||||||||
Model | \hat{\alpha} | \hat{\gamma} | \hat{\theta} | \hat{\lambda} | K-S | p-value | AIC | CAIC | BIC |
PGLD | 2.49409 | 0.63477 | 0.99986 | 0.20566 | 0.05039 | 0.99971 | 402.596 | 403.526 | 410.080
PLD | 149.855 | 3788.06 | --- | 0.72799 | 0.11165 | 0.58773 | 412.330 | 412.875 | 417.943 |
GLD | 1.42011 | 5.41074 | 0.84701 | --- | 0.21077 | 0.02811 | 425.597 | 426.143 | 431.211 |
ELD | 0.27741 | 2.35554 | --- | 0.79642 | 0.31822 | 0.00012 | 498.599 | 499.145 | 504.213 |
LD | 22.3092 | 552.211 | --- | --- | 0.19730 | 0.04765 | 413.502 | 413.769 | 417.244 |
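The goodness-of-fit columns above follow standard definitions: AIC = 2k - 2 ln L, the corrected CAIC = AIC + 2k(k+1)/(n-k-1), BIC = k ln(n) - 2 ln L, and the Kolmogorov-Smirnov distance between the fitted CDF and the empirical CDF. A sketch reproducing the PGLD row of the first data set, where ln L = -257.067 is back-computed from the reported AIC with k = 4 parameters and n = 72 observations (the sample size implied by the reported AIC-BIC gap):

```python
import math
import numpy as np

def ks_statistic(data, cdf):
    # Kolmogorov-Smirnov distance D between a fitted CDF and the ECDF
    x = np.sort(np.asarray(data, dtype=float))
    n = len(x)
    f = np.asarray([cdf(v) for v in x])
    d_plus = np.max(np.arange(1, n + 1) / n - f)   # ECDF above fitted CDF
    d_minus = np.max(f - np.arange(0, n) / n)      # fitted CDF above ECDF
    return float(max(d_plus, d_minus))

def info_criteria(loglik, k, n):
    # AIC = 2k - 2 ln L, CAIC = AIC + 2k(k+1)/(n-k-1), BIC = k ln(n) - 2 ln L
    aic = 2.0 * k - 2.0 * loglik
    caic = aic + 2.0 * k * (k + 1) / (n - k - 1)
    bic = k * math.log(n) - 2.0 * loglik
    return aic, caic, bic

# PGLD row, first data set (ln L back-computed from the reported AIC)
aic, caic, bic = info_criteria(loglik=-257.067, k=4, n=72)
print(round(aic, 3), round(caic, 3), round(bic, 3))  # 522.134 522.731 531.241
```

The same three criteria, together with the K-S distance and its p-value, rank PGLD ahead of PLD, GLD, ELD and LD on both data sets.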
n_{1} | w_{1} | p=0.15 | p=0.55 | p=0.95 | |||||||||||
. | . | \overline{\hat{\alpha}} | MSE(\hat{\alpha}) | RAB(\hat{\alpha}) | MMSE | \overline{\hat{\alpha}} | MSE(\hat{\alpha}) | RAB(\hat{\alpha}) | MMSE | \overline{\hat{\alpha}} | MSE(\hat{\alpha}) | RAB(\hat{\alpha}) | MMSE | ||
. | . | \overline{\hat{\lambda}} | MSE(\hat{\lambda}) | RAB(\hat{\lambda}) | MRAB | \overline{\hat{\lambda}} | MSE(\hat{\lambda}) | RAB(\hat{\lambda}) | MRAB | \overline{\hat{\lambda}} | MSE(\hat{\lambda}) | RAB(\hat{\lambda}) | MRAB | ||
. | . | \overline{\hat{\theta}} | MSE(\hat{\theta}) | RAB(\hat{\theta}) | \overline{\hat{\theta}} | MSE(\hat{\theta}) | RAB(\hat{\theta}) | \overline{\hat{\theta}} | MSE(\hat{\theta}) | RAB(\hat{\theta}) | |||||
\hbar | N | n_{\hbar} | w_{\hbar} | \overline{\hat{c}} | MSE(\hat{c}) | RAB(\hat{c}) | \overline{\hat{c}} | MSE(\hat{c}) | RAB(\hat{c}) | \overline{\hat{c}} | MSE(\hat{c}) | RAB(\hat{c}) | |||
2 | 80 | 40 | 20 | 1.7692 | 0.4273 | 0.2736 | 0.3766 | 1.7793 | 0.4334 | 0.2855 | 0.4536 | 1.7900 | 0.4439 | 0.2821 | 0.4166 |
40 | 20 | 2.2500 | 1.3743 | 0.3825 | 0.3156 | 2.5275 | 1.7338 | 0.4473 | 0.3599 | 2.4325 | 1.5187 | 0.4291 | 0.3729 | ||
0.8399 | 0.0247 | 0.1060 | 0.8214 | 0.0323 | 0.1297 | 0.8232 | 0.0338 | 0.1322 | |||||||
0.2503 | 0.0567 | 0.8159 | 0.2491 | 0.0684 | 0.9367 | 0.2703 | 0.0868 | 1.0212 | |||||||
32 | 1.7112 | 0.2631 | 0.2218 | 0.4094 | 1.7118 | 0.2594 | 0.2231 | 0.4408 | 1.7260 | 0.2935 | 0.2333 | 0.4465 | |||
32 | 2.4709 | 1.6977 | 0.4517 | 0.3352 | 2.4682 | 1.8436 | 0.4612 | 0.3533 | 2.4738 | 1.8411 | 0.4750 | 0.3548 | |||
0.8411 | 0.0218 | 0.1107 | 0.8308 | 0.0274 | 0.1234 | 0.8205 | 0.0298 | 0.1328 | |||||||
0.2607 | 0.0643 | 0.8917 | 0.2649 | 0.0735 | 0.9590 | 0.2546 | 0.0682 | 0.9331 | |||||||
2 | 140 | 70 | 35 | 1.7235 | 0.2732 | 0.2152 | 0.3939 | 1.7104 | 0.2902 | 0.2283 | 0.4677 | 1.6972 | 0.2575 | 0.2121 | 0.4528 |
70 | 35 | 2.5368 | 1.6200 | 0.4369 | 0.3218 | 2.5238 | 1.9496 | 0.4756 | 0.3508 | 2.5618 | 1.9192 | 0.4820 | 0.3414 | ||
0.8555 | 0.0153 | 0.0934 | 0.8402 | 0.0231 | 0.1128 | 0.8418 | 0.0224 | 0.1137 | |||||||
0.2588 | 0.0612 | 0.8638 | 0.2685 | 0.0756 | 0.9375 | 0.2636 | 0.0650 | 0.8993 | |||||||
56 | 1.7144 | 0.2323 | 0.2087 | 0.4615 | 1.6803 | 0.2312 | 0.2047 | 0.4594 | 1.6890 | 0.2324 | 0.2054 | 0.5033 | |||
56 | 2.5501 | 1.9964 | 0.4725 | 0.3284 | 2.4981 | 1.9889 | 0.4798 | 0.3312 | 2.5337 | 2.1999 | 0.4905 | 0.3413 | |||
0.8505 | 0.0184 | 0.1038 | 0.8517 | 0.0177 | 0.1013 | 0.8474 | 0.0201 | 0.1077 | |||||||
0.2606 | 0.0606 | 0.8572 | 0.2670 | 0.0593 | 0.8702 | 0.2702 | 0.0641 | 0.9027 | |||||||
4 | 80 | 20 | 10 | 1.7224 | 0.3564 | 0.2653 | 0.3963 | 1.7933 | 0.4206 | 0.2728 | 0.4094 | 1.7689 | 0.4255 | 0.2812 | 0.4622 |
20 | 10 | 2.1620 | 1.5467 | 0.4146 | 0.3086 | 2.4281 | 1.5124 | 0.4312 | 0.3627 | 2.5324 | 1.7754 | 0.4676 | 0.3609 | ||
20 | 10 | 0.8276 | 0.0223 | 0.1074 | 0.8293 | 0.0285 | 0.1225 | 0.8083 | 0.0379 | 0.1416 | |||||
20 | 10 | 0.2309 | 0.0559 | 0.7555 | 0.2638 | 0.0856 | 0.9872 | 0.2405 | 0.0723 | 0.9142 | |||||
16 | 1.7504 | 0.3104 | 0.2411 | 0.3767 | 1.7214 | 0.2853 | 0.2379 | 0.4418 | 1.6927 | 0.2635 | 0.2251 | 0.4439 | |||
16 | 2.4045 | 1.4904 | 0.4204 | 0.3243 | 2.6169 | 1.8321 | 0.4668 | 0.3464 | 2.4949 | 1.8477 | 0.4627 | 0.3608 | |||
16 | 0.8357 | 0.0229 | 0.1121 | 0.8406 | 0.0222 | 0.1121 | 0.8407 | 0.0263 | 0.1165 | ||||||
16 | 0.2460 | 0.0595 | 0.8481 | 0.2565 | 0.0694 | 0.9153 | 0.2862 | 0.0817 | 0.9996 | ||||||
4 | 140 | 35 | 18 | 1.6976 | 0.2483 | 0.2171 | 0.3170 | 1.7063 | 0.2604 | 0.2181 | 0.4347 | 1.7152 | 0.2896 | 0.2212 | 0.4394 |
35 | 18 | 2.3230 | 1.2643 | 0.3762 | 0.2965 | 2.4760 | 1.8169 | 0.4667 | 0.3469 | 2.5477 | 1.8211 | 0.4638 | 0.3350 | ||
35 | 18 | 0.8554 | 0.0137 | 0.0869 | 0.8357 | 0.0240 | 0.1178 | 0.8412 | 0.0219 | 0.1093 | |||||
35 | 18 | 0.2548 | 0.0588 | 0.8022 | 0.2669 | 0.0723 | 0.9320 | 0.2561 | 0.0645 | 0.8805 | |||||
28 | 1.6775 | 0.1812 | 0.1846 | 0.4331 | 1.6684 | 0.1758 | 0.1888 | 0.4781 | 1.6867 | 0.1760 | 0.1864 | 0.4443 | |||
28 | 2.5388 | 1.9152 | 0.4670 | 0.3084 | 2.5493 | 2.1323 | 0.4914 | 0.3330 | 2.4572 | 1.9641 | 0.4774 | 0.3251 | |||
28 | 0.8499 | 0.0179 | 0.0998 | 0.8572 | 0.0176 | 0.0985 | 0.8508 | 0.0179 | 0.1009 | ||||||
28 | 0.2490 | 0.0510 | 0.7907 | 0.2744 | 0.0648 | 0.8863 | 0.2647 | 0.0633 | 0.8609 |
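Each cell block above summarizes the Monte Carlo replications by the average estimate, MSE and RAB per parameter, with MMSE and MRAB pooling these over all four parameters (\alpha, \lambda, \theta, c). A sketch of these summaries, assuming the common convention RAB = E|est - true| / true (function names are ours):

```python
import numpy as np

def mc_summary(estimates, true_value):
    # Average estimate, mean squared error and relative absolute bias
    # over the Monte Carlo replications (assumed RAB convention:
    # E|estimate - true| / true)
    est = np.asarray(estimates, dtype=float)
    avg = float(est.mean())
    mse = float(np.mean((est - true_value) ** 2))
    rab = float(np.mean(np.abs(est - true_value)) / true_value)
    return avg, mse, rab

def pooled_mmse_mrab(summaries):
    # MMSE / MRAB: the per-parameter MSEs and RABs averaged over all
    # parameters of the model (alpha, lambda, theta, c)
    return (float(np.mean([s[1] for s in summaries])),
            float(np.mean([s[2] for s in summaries])))

avg, mse, rab = mc_summary([1.0, 3.0], true_value=2.0)
print(avg, mse, rab)  # 2.0 1.0 0.5
```

Pooling by MMSE and MRAB is what allows the single-number comparison across censoring schemes in the tables: smaller pooled values indicate a scheme under which all four parameters are estimated more precisely on average.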
0.95 | 1.6470 | 0.1052 | 0.1398 | 0.0284 | 1.5957 | 0.0697 | 0.1398 | 0.0200 | 1.6053 | 0.0744 | 0.0033 | 0.0211 | 1.5757 | 0.0717 | 0.0152 | 0.0208 | ||||
2.2387 | 0.0041 | 0.0266 | 0.0689 | 2.2262 | 0.0058 | 0.0321 | 0.0707 | 2.2296 | 0.0053 | 0.0306 | 0.0264 | 2.2240 | 0.0061 | 0.0330 | 0.0424 | |||||
0.8657 | 0.0036 | 0.0495 | 0.8630 | 0.0040 | 0.0520 | 0.8625 | 0.0041 | 0.0417 | 0.8580 | 0.0048 | 0.0466 | |||||||||
0.1997 | 0.0007 | 0.0596 | 0.1976 | 0.0006 | 0.0589 | 0.1940 | 0.0005 | 0.0301 | 0.1850 | 0.0005 | 0.0749 | |||||||||
28 | 0.15 | 1.6453 | 0.0505 | 0.1087 | 0.0144 | 1.6178 | 0.0469 | 0.1087 | 0.0140 | 1.6230 | 0.0480 | 0.0143 | 0.0141 | 1.6058 | 0.0475 | 0.0036 | 0.0143 | |||
28 | 2.2397 | 0.0039 | 0.0262 | 0.0682 | 2.2273 | 0.0056 | 0.0316 | 0.0701 | 2.2307 | 0.0051 | 0.0301 | 0.0312 | 2.2252 | 0.0059 | 0.0325 | 0.0339 | ||||
28 | 0.8757 | 0.0024 | 0.0420 | 0.8742 | 0.0026 | 0.0433 | 0.8740 | 0.0026 | 0.0288 | 0.8720 | 0.0029 | 0.0311 | ||||||||
28 | 0.1915 | 0.0008 | 0.0961 | 0.1910 | 0.0008 | 0.0968 | 0.1897 | 0.0008 | 0.0516 | 0.1863 | 0.0009 | 0.0683 | ||||||||
0.55 | 1.6337 | 0.0485 | 0.1090 | 0.0139 | 1.6079 | 0.0459 | 0.1090 | 0.0137 | 1.6127 | 0.0468 | 0.0079 | 0.0138 | 1.5964 | 0.0467 | 0.0023 | 0.0141 | ||||
2.2386 | 0.0041 | 0.0267 | 0.0676 | 2.2265 | 0.0057 | 0.0320 | 0.0694 | 2.2298 | 0.0052 | 0.0305 | 0.0280 | 2.2244 | 0.0060 | 0.0329 | 0.0319 | |||||
0.8756 | 0.0024 | 0.0417 | 0.8741 | 0.0026 | 0.0431 | 0.8739 | 0.0027 | 0.0290 | 0.8717 | 0.0030 | 0.0315 | |||||||||
0.1929 | 0.0007 | 0.0930 | 0.1923 | 0.0007 | 0.0935 | 0.1911 | 0.0007 | 0.0447 | 0.1878 | 0.0008 | 0.0611 | |||||||||
0.95 | 1.6310 | 0.0469 | 0.1062 | 0.0135 | 1.6051 | 0.0445 | 0.1062 | 0.0134 | 1.6099 | 0.0453 | 0.0062 | 0.0135 | 1.5935 | 0.0454 | 0.0041 | 0.0138 | ||||
2.2386 | 0.0040 | 0.0267 | 0.0675 | 2.2264 | 0.0057 | 0.0320 | 0.0693 | 2.2298 | 0.0052 | 0.0305 | 0.0294 | 2.2243 | 0.0060 | 0.0329 | 0.0342 | |||||
0.8733 | 0.0026 | 0.0439 | 0.8718 | 0.0028 | 0.0453 | 0.8715 | 0.0028 | 0.0316 | 0.8693 | 0.0031 | 0.0341 | |||||||||
0.1921 | 0.0007 | 0.0934 | 0.1915 | 0.0007 | 0.0938 | 0.1902 | 0.0007 | 0.0491 | 0.1869 | 0.0008 | 0.0657 |
\hbar | N | n_{1},\ldots,n_{\hbar} | w_{1},\ldots,w_{\hbar} | p | CI | Credible interval |
 | | | | | | Symmetric | HPD
 | | | | | AIL(\alpha) | AIL(\alpha) | AIL(\alpha)
 | | | | | AIL(\lambda) | AIL(\lambda) | AIL(\lambda)
 | | | | | AIL(\theta) | AIL(\theta) | AIL(\theta)
 | | | | | AIL(c) | AIL(c) | AIL(c)
2 | 80 | 40 | 20 | 0.15 | 3.4146 | 3.0673 | 1.6523 | 0.7144 | 1.5547 | 0.6651 |
40 | 20 | 9.3383 | 0.6109 | 0.5824 | ||||||
1.0622 | 0.3577 | 0.3046 | ||||||||
1.5215 | 0.2369 | 0.2186 | ||||||||
0.55 | 3.2483 | 3.0103 | 1.5301 | 0.6792 | 1.4479 | 0.6347 | ||||
9.0989 | 0.6125 | 0.5825 | ||||||||
1.1630 | 0.3370 | 0.2883 | ||||||||
1.5413 | 0.2372 | 0.2199 | ||||||||
0.95 | 3.2459 | 3.0090 | 1.5365 | 0.6822 | 1.4581 | 0.6387 | ||||
9.0252 | 0.6136 | 0.5840 | ||||||||
1.1423 | 0.3394 | 0.2911 | ||||||||
1.6318 | 0.2392 | 0.2214 | ||||||||
32 | 0.15 | 2.7872 | 2.7832 | 1.2546 | 0.5955 | 1.1861 | 0.5567 | |||
32 | 8.7304 | 0.6134 | 0.5833 | |||||||
1.0233 | 0.2796 | 0.2412 | ||||||||
1.3750 | 0.2345 | 0.2164 | ||||||||
0.55 | 2.8056 | 2.7834 | 1.2551 | 0.5965 | 1.1847 | 0.5568 | ||||
8.6419 | 0.6141 | 0.5831 | ||||||||
1.0474 | 0.2795 | 0.2408 | ||||||||
1.4224 | 0.2372 | 0.2188 | ||||||||
0.95 | 2.8068 | 2.7650 | 1.2616 | 0.5985 | 1.1935 | 0.5598 | ||||
8.5702 | 0.6150 | 0.5858 | ||||||||
1.0997 | 0.2818 | 0.2425 | ||||||||
1.3484 | 0.2357 | 0.2174 | ||||||||
140 | 70 | 35 | 0.15 | 2.8299 | 2.7787 | 1.2480 | 0.5913 | 1.1732 | 0.5512 | |
70 | 35 | 8.8112 | 0.6169 | 0.5869 | ||||||
0.9287 | 0.2635 | 0.2278 | ||||||||
1.3237 | 0.2366 | 0.2171 | ||||||||
0.55 | 2.6886 | 2.7106 | 1.1792 | 0.5731 | 1.1093 | 0.5345 | ||||
8.5189 | 0.6155 | 0.5842 | ||||||||
0.9782 | 0.2632 | 0.2284 | ||||||||
1.3673 | 0.2346 | 0.2160 | ||||||||
0.95 | 2.7318 | 2.7344 | 1.2057 | 0.5815 | 1.1348 | 0.5427 | ||||
8.6176 | 0.6126 | 0.5838 | ||||||||
0.9898 | 0.2684 | 0.2324 | ||||||||
1.3326 | 0.2392 | 0.2198 | ||||||||
56 | 0.15 | 2.4536 | 2.5300 | 0.8753 | 0.4421 | 0.8257 | 0.4148 | |||
56 | 8.1685 | 0.6119 | 0.5802 | |||||||
0.8347 | 0.1885 | 0.1668 | ||||||||
1.1931 | 0.0926 | 0.0864 | ||||||||
0.55 | 2.3747 | 2.5060 | 0.8788 | 0.4427 | 0.8261 | 0.4143 | ||||
8.0651 | 0.6095 | 0.5785 | ||||||||
0.8548 | 0.1898 | 0.1667 | ||||||||
1.2352 | 0.0927 | 0.0858 | ||||||||
0.95 | 2.3900 | 2.5214 | 0.8759 | 0.4417 | 0.8271 | 0.4148 | ||||
8.1405 | 0.6118 | 0.5818 | ||||||||
0.8549 | 0.1877 | 0.1661 | ||||||||
1.2216 | 0.0913 | 0.0844 | ||||||||
4 | 80 | 20 | 10 | 0.15 | 3.5514 | 2.9879 | 1.8141 | 0.7625 | 1.7046 | 0.7083 |
20 | 10 | 9.0555 | 0.6117 | 0.5818 | ||||||
20 | 10 | 0.9926 | 0.3838 | 0.3253 | ||||||
20 | 10 | 1.3399 | 0.2405 | 0.2214 | ||||||
0.55 | 3.3072 | 3.0253 | 1.5485 | 0.6838 | 1.4686 | 0.6395 | ||||
9.1629 | 0.6124 | 0.5826 | ||||||||
1.1114 | 0.3375 | 0.2875 | ||||||||
1.5448 | 0.2368 | 0.2192 | ||||||||
0.95 | 3.3013 | 3.0365 | 1.5334 | 0.6802 | 1.4531 | 0.6366 | ||||
9.2126 | 0.6123 | 0.5846 | ||||||||
1.2443 | 0.3335 | 0.2853 | ||||||||
1.4244 | 0.2416 | 0.2234 | ||||||||
16 | 0.15 | 3.0018 | 2.8487 | 1.2939 | 0.6085 | 1.2209 | 0.5677 | |||
16 | 8.8203 | 0.6151 | 0.5846 | |||||||
16 | 1.0622 | 0.2872 | 0.2464 | |||||||
16 | 1.3593 | 0.2380 | 0.2187 | |||||||
0.55 | 2.7877 | 2.8122 | 1.2833 | 0.6045 | 1.2083 | 0.5638 | ||||
8.8808 | 0.6181 | 0.5875 | ||||||||
1.0125 | 0.2804 | 0.2416 | ||||||||
1.3797 | 0.2362 | 0.2180 | ||||||||
0.95 | 2.6809 | 2.7709 | 1.2539 | 0.5955 | 1.1762 | 0.5541 | ||||
8.6718 | 0.6139 | 0.5841 | ||||||||
1.0059 | 0.2782 | 0.2396 | ||||||||
1.4957 | 0.2359 | 0.2165 | ||||||||
140 | 35 | 18 | 0.15 | 2.9768 | 2.8485 | 1.3322 | 0.6132 | 1.2466 | 0.5702 | |
35 | 18 | 8.9141 | 0.6170 | 0.5870 | ||||||
35 | 18 | 0.9581 | 0.2688 | 0.2318 | ||||||
35 | 18 | 1.3934 | 0.2349 | 0.2153 | ||||||
0.55 | 2.7722 | 2.7270 | 1.1958 | 0.5740 | 1.1280 | 0.5365 | ||||
8.5091 | 0.6177 | 0.5868 | ||||||||
0.9857 | 0.2487 | 0.2158 | ||||||||
1.3680 | 0.2336 | 0.2155 | ||||||||
0.95 | 2.7294 | 2.7466 | 1.1760 | 0.5731 | 1.1088 | 0.5356 | ||||
8.7286 | 0.6159 | 0.5867 | ||||||||
0.9769 | 0.2643 | 0.2290 | ||||||||
1.2979 | 0.2362 | 0.2177 | ||||||||
28 | 0.15 | 2.4586 | 2.5493 | 0.9017 | 0.4483 | 0.8475 | 0.4203 | |||
28 | 8.2247 | 0.6144 | 0.5847 | |||||||
28 | 0.8842 | 0.1858 | 0.1640 | |||||||
28 | 1.1792 | 0.0913 | 0.0849 | |||||||
0.55 | 2.3382 | 2.5347 | 0.8769 | 0.4407 | 0.8265 | 0.4129 | ||||
8.2832 | 0.6080 | 0.5768 | ||||||||
0.8100 | 0.1871 | 0.1651 | ||||||||
1.2424 | 0.0907 | 0.0834 | ||||||||
0.95 | 2.3738 | 2.5307 | 0.8765 | 0.4429 | 0.8254 | 0.4150 | ||||
8.1678 | 0.6098 | 0.5791 | ||||||||
0.8702 | 0.1914 | 0.1689 | ||||||||
1.2415 | 0.0937 | 0.0867 |