The moving average processes $X_k=\sum_{i=-\infty}^{\infty}a_{i+k}Y_i$ are studied, where $\{Y_i,-\infty<i<\infty\}$ is a doubly infinite sequence of negatively dependent random variables under sub-linear expectations, and $\{a_i,-\infty<i<\infty\}$ is an absolutely summable sequence of real numbers. We establish the complete moment convergence of such moving average processes under suitable conditions, extending the corresponding results from classical probability spaces to sub-linear expectation spaces.
Citation: Mingzhou Xu. On the complete moment convergence of moving average processes generated by negatively dependent random variables under sub-linear expectations[J]. AIMS Mathematics, 2024, 9(2): 3369-3385. doi: 10.3934/math.2024165
Since Peng [1,2] initiated the concept of sub-linear expectation spaces to study uncertainty in probability, many scholars have investigated limit theorems under sub-linear expectations. Zhang [3,4,5] studied exponential inequalities, Rosenthal's inequalities, and Donsker's invariance principle under sub-linear expectations. Chen and Wu [6] investigated complete convergence theorems for moving average processes generated by independent random variables under sub-linear expectations. Xu et al. [7] and Xu and Kong [8] obtained complete convergence and complete moment convergence of weighted sums of negatively dependent random variables under sub-linear expectations. For more limit theorems under sub-linear expectations, the reader may refer to Zhang [9], Xu and Zhang [10,11], Wu and Jiang [12], Zhang and Lin [13], Zhong and Wu [14], Hu et al. [15], Gao and Xu [16], Kuczmaszewska [17], Zhang [5], Chen [18], Zhang [19], Chen and Wu [20], Xu and Cheng [21,22], Xu et al. [23], Xu [24,25], and the references therein.
Guo et al. [26] studied the complete moment convergence of moving average processes under negative association assumptions. For more results on the complete moment convergence of moving average processes, the interested reader may refer to Hosseini and Nezakati [27] and the references therein. Motivated by the work of Guo et al. [26], Chen and Wu [6], and Xu et al. [23], we prove the complete moment convergence of moving average processes generated by negatively dependent random variables under sub-linear expectations, complementing the corresponding results of Guo et al. [26]. The difference between the works of Xu et al. [7], Xu and Kong [8], Xu [24,25] and the present article is the following: under sub-linear expectations, the complete convergence of weighted sums of negatively dependent or extended negatively dependent random variables is studied in Xu et al. [7], Xu and Kong [8], and Xu [25]; the complete convergence of moving average processes produced by negatively dependent random variables is studied in Xu [24]; and the complete moment convergence of moving average processes generated by negatively dependent random variables is investigated here. The novelty is that the results of this paper imply, in some sense, those of Xu and Kong [8] and Xu [24], and they extend the corresponding results in probability spaces.
The rest of this paper is organized as follows. In the next section we present the necessary basic notions, concepts, and corresponding properties, and state the required lemmas under sub-linear expectations. In Section 3 we present our results, Theorems 3.1–3.3, whose proofs are given in Section 4.
Hereafter, we use notation similar to that in the works of Peng [2] and Zhang [4]. Assume that $(\Omega,\mathcal{F})$ is a given measurable space. Suppose that $\mathcal{H}$ is a set of random variables on $(\Omega,\mathcal{F})$ such that $\varphi(X_1,\cdots,X_n)\in\mathcal{H}$ for $X_1,\cdots,X_n\in\mathcal{H}$ and each $\varphi\in C_{l,Lip}(\mathbb{R}^{n})$, where $C_{l,Lip}(\mathbb{R}^{n})$ is the set of functions $\varphi$ satisfying
$$|\varphi(x)-\varphi(y)|\le C(1+|x|^{m}+|y|^{m})|x-y|,\quad\forall x,y\in\mathbb{R}^{n},$$
for some $C>0$ and $m\in\mathbb{N}$ depending on $\varphi$.
Definition 2.1. A sub-linear expectation $\mathbb{E}$ on $\mathcal{H}$ is a functional $\mathbb{E}:\mathcal{H}\mapsto\bar{\mathbb{R}}:=[-\infty,\infty]$ fulfilling the following: for every $X,Y\in\mathcal{H}$,
(a) $X\ge Y$ implies $\mathbb{E}[X]\ge\mathbb{E}[Y]$;
(b) $\mathbb{E}[c]=c$, $\forall c\in\mathbb{R}$;
(c) $\mathbb{E}[\lambda X]=\lambda\mathbb{E}[X]$, $\forall\lambda\ge0$;
(d) $\mathbb{E}[X+Y]\le\mathbb{E}[X]+\mathbb{E}[Y]$ whenever $\mathbb{E}[X]+\mathbb{E}[Y]$ is not of the form $\infty-\infty$ or $-\infty+\infty$.
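A standard example (cf. Peng [1,2]), recalled here only for illustration: if $\mathcal{P}$ is a family of probability measures on $(\Omega,\mathcal{F})$, the upper expectation
$$\mathbb{E}[X]:=\sup_{P\in\mathcal{P}}E_{P}[X],\quad X\in\mathcal{H},$$
satisfies (a)–(d); for instance, (d) follows from $\sup_{P}E_{P}[X+Y]\le\sup_{P}E_{P}[X]+\sup_{P}E_{P}[Y]$. When $\mathcal{P}$ consists of a single measure, $\mathbb{E}$ reduces to an ordinary linear expectation.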
Definition 2.2. We say that $\{X_n;n\ge1\}$ is stochastically dominated by a random variable $X$ in $(\Omega,\mathcal{H},\mathbb{E})$ if there exists a constant $C$ such that, for all $n\ge1$ and all non-negative $h\in C_{l,Lip}(\mathbb{R})$, $\mathbb{E}(h(X_n))\le C\mathbb{E}(h(X))$.
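For instance (an elementary observation, not needed later in this exact form), if $\{X_n;n\ge1\}$ is identically distributed as some $X\in\mathcal{H}$, then $\mathbb{E}(h(X_n))=\mathbb{E}(h(X))$ for every non-negative $h\in C_{l,Lip}(\mathbb{R})$, so the sequence is stochastically dominated by $X$ with $C=1$; Definition 2.2 relaxes this equality to an inequality up to a constant.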
A function $V:\mathcal{F}\mapsto[0,1]$ is called a capacity if
(a) $V(\emptyset)=0$, $V(\Omega)=1$;
(b) $V(A)\le V(B)$ whenever $A\subset B$, $A,B\in\mathcal{F}$.
Furthermore, $V$ is said to be continuous if
(c) $A_n\uparrow A$ implies $V(A_n)\uparrow V(A)$;
(d) $A_n\downarrow A$ implies $V(A_n)\downarrow V(A)$.
$V$ is said to be sub-additive if $V(A\cup B)\le V(A)+V(B)$ for all $A,B\in\mathcal{F}$.
In $(\Omega,\mathcal{H},\mathbb{E})$, set $\mathbb{V}(A):=\inf\{\mathbb{E}[\xi]:I_A\le\xi,\ \xi\in\mathcal{H}\}$, $\forall A\in\mathcal{F}$ (cf. Zhang [3]). Then $\mathbb{V}$ is a sub-additive capacity. Write
$$C_{\mathbb{V}}(X):=\int_{0}^{\infty}\mathbb{V}(X>x)\,dx+\int_{-\infty}^{0}(\mathbb{V}(X>x)-1)\,dx.$$
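As a quick sanity check (an elementary observation, not part of the cited results): when $\mathbb{V}$ is an ordinary probability measure $P$, the functional $C_{\mathbb{V}}$ is just the usual expectation, since for any random variable $X$ with $E_{P}|X|<\infty$,
$$\int_{0}^{\infty}P(X>x)\,dx+\int_{-\infty}^{0}(P(X>x)-1)\,dx=E_{P}[X^{+}]-E_{P}[X^{-}]=E_{P}[X].$$
In general, $C_{\mathbb{V}}$ is a Choquet-type integral with respect to the capacity $\mathbb{V}$.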
As in 4.3 of Zhang [3], throughout this paper we define an extension of $\mathbb{E}$ to the space of all random variables by
$$\mathbb{E}^{*}(X)=\inf\{\mathbb{E}[Y]:X\le Y,\ Y\in\mathcal{H}\}.$$
Then $\mathbb{E}^{*}$ is a sub-linear expectation on the space of all random variables, $\mathbb{E}[X]=\mathbb{E}^{*}[X]$ for all $X\in\mathcal{H}$, and $\mathbb{V}(A)=\mathbb{E}^{*}(I_A)$ for all $A\in\mathcal{F}$.
Suppose that $X=(X_1,\cdots,X_m)$, $X_i\in\mathcal{H}$, and $Y=(Y_1,\cdots,Y_n)$, $Y_i\in\mathcal{H}$, are two random vectors on $(\Omega,\mathcal{H},\mathbb{E})$. $Y$ is said to be negatively dependent on $X$ if, for $\psi_1$ on $C_{l,Lip}(\mathbb{R}^{m})$ and $\psi_2$ on $C_{l,Lip}(\mathbb{R}^{n})$, we have $\mathbb{E}[\psi_1(X)\psi_2(Y)]\le\mathbb{E}[\psi_1(X)]\mathbb{E}[\psi_2(Y)]$ whenever $\psi_1(X)\ge0$, $\mathbb{E}[\psi_2(Y)]\ge0$, $\mathbb{E}[|\psi_1(X)\psi_2(Y)|]<\infty$, $\mathbb{E}[|\psi_1(X)|]<\infty$, $\mathbb{E}[|\psi_2(Y)|]<\infty$, and either $\psi_1$ and $\psi_2$ are both coordinatewise nondecreasing or both coordinatewise nonincreasing (see Definition 2.3 of Zhang [3] and Definition 1.5 of Zhang [4]). A sequence $\{X_n\}_{n=-\infty}^{\infty}$ is said to be negatively dependent if $X_{n+l}$ is negatively dependent on $(X_{l},X_{l+1},\cdots,X_{l+n-1})$ for each $n\ge1$ and $-\infty<l<\infty$.
Suppose that $X_1$ and $X_2$ are two $n$-dimensional random vectors defined in sub-linear expectation spaces $(\Omega_1,\mathcal{H}_1,\mathbb{E}_1)$ and $(\Omega_2,\mathcal{H}_2,\mathbb{E}_2)$, respectively. They are said to be identically distributed if for every $\psi\in C_{l,Lip}(\mathbb{R}^{n})$,
$$\mathbb{E}_1[\psi(X_1)]=\mathbb{E}_2[\psi(X_2)].$$
$\{X_n;n\ge1\}$ is said to be identically distributed if, for every $i\ge1$, $X_i$ and $X_1$ are identically distributed.
Throughout this paper, we suppose that $\mathbb{E}$ is countably sub-additive, i.e., $\mathbb{E}(X)\le\sum_{n=1}^{\infty}\mathbb{E}(X_n)$ whenever $X\le\sum_{n=1}^{\infty}X_n$, $X,X_n\in\mathcal{H}$, and $X\ge0$, $X_n\ge0$, $n=1,2,\ldots$. Therefore $\mathbb{E}^{*}$ is also countably sub-additive, and so is $\mathbb{V}$ (cf. Zhang [3]). Let $C$ denote a positive constant which may change from line to line. $I(A)$ or $I_A$ denotes the indicator function of $A$. The notation $a_x\approx b_x$ means that there exist two positive constants $C_1,C_2$ such that $C_1|b_x|\le|a_x|\le C_2|b_x|$; $x^{+}$ stands for $\max\{x,0\}$ and $x^{-}=(-x)^{+}$ for $x\in\mathbb{R}$; $a\vee b=\max\{a,b\}$ for $a,b\in\mathbb{R}$.
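A minimal sketch (using only the countable sub-additivity just assumed and the identity $\mathbb{V}(A)=\mathbb{E}^{*}(I_A)$) of why $\mathbb{V}$ is countably sub-additive: if $A\subset\bigcup_{n=1}^{\infty}A_n$ with $A,A_n\in\mathcal{F}$, then $I_A\le\sum_{n=1}^{\infty}I_{A_n}$, and hence
$$\mathbb{V}(A)=\mathbb{E}^{*}(I_A)\le\sum_{n=1}^{\infty}\mathbb{E}^{*}(I_{A_n})=\sum_{n=1}^{\infty}\mathbb{V}(A_n).$$
This is the form in which countable sub-additivity is used in Section 4, for example in the dyadic decomposition leading to (4.11).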
As shown in Zhang [4], if $X_1,X_2,\ldots,X_n$ are negatively dependent random variables and $f_1(x),f_2(x),\ldots,f_n(x)\in C_{l,Lip}(\mathbb{R})$ are all non-increasing (or all non-decreasing) functions, then $f_1(X_1),f_2(X_2),\ldots,f_n(X_n)$ are negatively dependent random variables.
We cite the following lemmas under sub-linear expectations.
Lemma 2.1. (cf. Lemma 4.5 (iii) of Zhang [3]) If $\mathbb{E}$ is countably sub-additive under $(\Omega,\mathcal{H},\mathbb{E})$, then for $X\in\mathcal{H}$,
$$\mathbb{E}|X|\le C_{\mathbb{V}}(|X|).$$
Lemma 2.2. (cf. Theorem 2.1 of Zhang [4]) Write $S_k=Y_1+\cdots+Y_k$, $S_0=0$. Suppose that $Y_{k+1}$ is negatively dependent on $(Y_1,\ldots,Y_k)$ for $k=1,2,\ldots,n-1$, or that $Y_k$ is negatively dependent on $(Y_{k+1},\ldots,Y_n)$ for $k=0,\ldots,n-1$, in a sub-linear expectation space $(\Omega,\mathcal{H},\mathbb{E})$. Then for $p\ge2$,
$$\mathbb{E}\Big[\max_{k\le n}|S_k|^{p}\Big]\le C_p\Big\{\sum_{k=1}^{n}\mathbb{E}[|Y_k|^{p}]+\Big(\sum_{k=1}^{n}\mathbb{E}[|Y_k|^{2}]\Big)^{p/2}+\Big(\sum_{k=1}^{n}\big[|\mathbb{E}(Y_k)|+|\mathbb{E}(-Y_k)|\big]\Big)^{p}\Big\}.\tag{2.1}$$
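For orientation, a direct specialization of (2.1) (not a separate result): if, as in our theorems, $\mathbb{E}(Y_k)=\mathbb{E}(-Y_k)=0$ for all $k$, the third term on the right-hand side vanishes, and taking $p=2$ gives the Kolmogorov-type maximal inequality
$$\mathbb{E}\Big[\max_{k\le n}|S_k|^{2}\Big]\le C\sum_{k=1}^{n}\mathbb{E}[|Y_k|^{2}].$$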
By Lemma 2.2 of Zhong and Wu [14], the following lemma holds.
Lemma 2.3. Suppose that $Y\in\mathcal{H}$, $r>0$, $p>0$, and $l(x)$ is a slowly varying function.
(i) For any $c>0$,
$$C_{\mathbb{V}}\{|Y|^{r}l(|Y|^{p})\}<\infty\Longleftrightarrow\sum_{n=1}^{\infty}n^{r/p-1}l(n)\mathbb{V}(|Y|>cn^{1/p})<\infty.$$
(ii) Suppose $C_{\mathbb{V}}\{|Y|^{r}l(|Y|^{p})\}<\infty$. Then for any $\theta>1$ and $c>0$,
$$\sum_{k=1}^{\infty}\theta^{kr/p}l(\theta^{k})\mathbb{V}(|Y|>c\theta^{k/p})<\infty.$$
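For example (a special case, stated only for orientation), taking $l(x)\equiv1$ and $c=1$ in part (i) gives
$$C_{\mathbb{V}}\{|Y|^{r}\}<\infty\Longleftrightarrow\sum_{n=1}^{\infty}n^{r/p-1}\mathbb{V}(|Y|>n^{1/p})<\infty,$$
the sub-linear analogue of the classical equivalence between $E|Y|^{r}<\infty$ and $\sum_{n}n^{r/p-1}P(|Y|>n^{1/p})<\infty$.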
Theorem 3.1. Assume that $X_n=\sum_{i=-\infty}^{\infty}a_{i+n}Y_i$, $n\ge1$, where $\{a_i,-\infty<i<\infty\}$ is a sequence of real numbers satisfying $\sum_{i=-\infty}^{\infty}|a_i|<\infty$, and $\{Y_i,-\infty<i<\infty\}$ is a sequence of negatively dependent random variables that is stochastically dominated by $Y$ in a sub-linear expectation space $(\Omega,\mathcal{H},\mathbb{E})$. Let $l(x)$ be a slowly varying function, $1\le p<2$, and $r\ge1+p/2$. Suppose that $\mathbb{E}(Y_i)=\mathbb{E}(-Y_i)=0$ for all $-\infty<i<\infty$ and $C_{\mathbb{V}}(|Y|^{r}(1\vee l(|Y|^{p})))<\infty$. Then
$$\sum_{n=1}^{\infty}n^{r/p-2-1/(pt)}l(n)\,C_{\mathbb{V}}\Big\{\Big[\max_{1\le k\le n}\Big|\sum_{i=1}^{k}X_i\Big|^{1/t}-\epsilon n^{1/(pt)}\Big]^{+}\Big\}<\infty,\quad\text{for all }\epsilon>0\text{ and }t>\frac{1}{r},\tag{3.1}$$
and
$$\sum_{n=1}^{\infty}n^{r/p-2}l(n)\,C_{\mathbb{V}}\Big\{\Big[\sup_{k\ge n}\Big|k^{-1/p}\sum_{i=1}^{k}X_i\Big|^{1/t}-\epsilon\Big]^{+}\Big\}<\infty,\quad\text{for all }\epsilon>0\text{ and }t>\frac{1}{r}.\tag{3.2}$$
Theorem 3.2. Suppose that $X_n=\sum_{i=-\infty}^{\infty}a_{i+n}Y_i$, $n\ge1$, where $\{a_i,-\infty<i<\infty\}$ is a sequence of real numbers satisfying $\sum_{i=-\infty}^{\infty}|a_i|<\infty$, and $\{Y_i,-\infty<i<\infty\}$ is a sequence of negatively dependent random variables that is stochastically dominated by $Y$ in a sub-linear expectation space $(\Omega,\mathcal{H},\mathbb{E})$. Let $l(x)$ be a non-decreasing slowly varying function, $1\le p<2$, and $r>1+p/2$. Suppose that $\mathbb{E}(Y_i)=\mathbb{E}(-Y_i)=0$ for all $-\infty<i<\infty$ and $C_{\mathbb{V}}(|Y|^{1/t}(1\vee l(|Y|^{p})))<\infty$. Then
$$\sum_{n=1}^{\infty}n^{r/p-2-1/(pt)}l(n)\,C_{\mathbb{V}}\Big\{\Big[\max_{1\le k\le n}\Big|\sum_{i=1}^{k}X_i\Big|^{1/t}-\epsilon n^{1/(pt)}\Big]^{+}\Big\}<\infty,\quad\text{for all }\epsilon>0\text{ and }t<\frac{1}{r},\tag{3.3}$$
and
$$\sum_{n=1}^{\infty}n^{r/p-2}l(n)\,C_{\mathbb{V}}\Big\{\Big[\sup_{k\ge n}\Big|k^{-1/p}\sum_{i=1}^{k}X_i\Big|^{1/t}-\epsilon\Big]^{+}\Big\}<\infty,\quad\text{for all }\epsilon>0\text{ and }t<\frac{1}{r}.\tag{3.4}$$
Theorem 3.3. Assume that $X_n=\sum_{i=-\infty}^{\infty}a_{i+n}Y_i$, $n\ge1$, where $\{a_i,-\infty<i<\infty\}$ is a sequence of real numbers satisfying $\sum_{i=-\infty}^{\infty}|a_i|<\infty$, and $\{Y_i,-\infty<i<\infty\}$ is a sequence of negatively dependent random variables that is stochastically dominated by $Y$ in a sub-linear expectation space $(\Omega,\mathcal{H},\mathbb{E})$. Assume that $l(x)$ is a slowly varying function and $1<p<2$. Suppose that $\mathbb{E}(Y_i)=\mathbb{E}(-Y_i)=0$ for all $-\infty<i<\infty$ and $C_{\mathbb{V}}(|Y|^{p}(1\vee l(|Y|^{p})))<\infty$. Then
$$\sum_{n=1}^{\infty}n^{-1-1/(pt)}l(n)\,C_{\mathbb{V}}\Big\{\Big[\max_{1\le k\le n}\Big|\sum_{i=1}^{k}X_i\Big|^{1/t}-\epsilon n^{1/(pt)}\Big]^{+}\Big\}<\infty,\quad\text{for all }\epsilon>0\text{ and }t>\frac{1}{p}.\tag{3.5}$$
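To make the statements concrete, consider the special case $l(x)\equiv1$ and $t=1$, which is admissible in Theorem 3.1 since $r\ge1+p/2>1$ implies $1>1/r$. Then (3.1) reads
$$\sum_{n=1}^{\infty}n^{r/p-2-1/p}\,C_{\mathbb{V}}\Big\{\Big[\max_{1\le k\le n}\Big|\sum_{i=1}^{k}X_i\Big|-\epsilon n^{1/p}\Big]^{+}\Big\}<\infty,\quad\forall\epsilon>0,$$
which is the usual form of complete moment convergence for the maximal partial sums, now stated with the Choquet integral $C_{\mathbb{V}}$ in place of a classical expectation.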
As in Remark 2.3 of Guo et al. [26] and Remark 1.2 of Li and Zhang [28], Theorems 3.1 and 3.3 yield the following corollaries.
Corollary 3.1. Under the assumptions of Theorem 3.1, with $C_{\mathbb{V}}(|Y|^{r}(1\vee l(|Y|^{p})))<\infty$, we have
$$\sum_{n=1}^{\infty}n^{r/p-2}l(n)\,\mathbb{V}\Big\{\max_{1\le k\le n}\Big|\sum_{i=1}^{k}X_i\Big|>\epsilon n^{1/p}\Big\}<\infty\quad\text{for all }\epsilon>0;$$
$$\sum_{n=1}^{\infty}n^{r/p-2}l(n)\,\mathbb{V}\Big\{\sup_{k\ge n}\Big|k^{-1/p}\sum_{i=1}^{k}X_i\Big|>\epsilon\Big\}<\infty\quad\text{for all }\epsilon>0.$$
Corollary 3.2. Under the assumptions of Theorem 3.3, with $C_{\mathbb{V}}(|Y|^{p}(1\vee l(|Y|^{p})))<\infty$, we have
$$\sum_{n=1}^{\infty}n^{-1}l(n)\,\mathbb{V}\Big\{\max_{1\le k\le n}\Big|\sum_{i=1}^{k}X_i\Big|>\epsilon n^{1/p}\Big\}<\infty\quad\text{for all }\epsilon>0.$$
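A minimal sketch of how the first bound of Corollary 3.1 follows from (3.1) with $t=1$ (this is the standard argument behind Remark 2.3 of Guo et al. [26]): for any $\epsilon>0$,
$$C_{\mathbb{V}}\Big\{\Big[\max_{1\le k\le n}\Big|\sum_{i=1}^{k}X_i\Big|-\frac{\epsilon}{2}n^{1/p}\Big]^{+}\Big\}\ge\int_{\epsilon n^{1/p}/2}^{\epsilon n^{1/p}}\mathbb{V}\Big\{\max_{1\le k\le n}\Big|\sum_{i=1}^{k}X_i\Big|>x\Big\}\,dx\ge\frac{\epsilon}{2}n^{1/p}\,\mathbb{V}\Big\{\max_{1\le k\le n}\Big|\sum_{i=1}^{k}X_i\Big|>\epsilon n^{1/p}\Big\},$$
so multiplying by $n^{r/p-2-1/p}l(n)$ and summing over $n$ turns (3.1) into the first complete convergence statement of Corollary 3.1; the remaining statements are obtained analogously.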
Remark 3.1. In Theorems 3.1–3.3 and Corollaries 3.1 and 3.2 we assume that $\mathbb{E}(Y_j)=\mathbb{E}(-Y_j)=0$, $\forall j\ge1$. Readers may wonder what the intrinsic difference is between a sub-linear expectation and a linear expectation in a probability space. The following example heuristically illustrates the difference to some extent. Suppose that $Y_1$ is $G$-normally distributed, i.e., for $a,b>0$, $aY_1+b\bar{Y}_1$ and $\sqrt{a^{2}+b^{2}}\,Y_1$ are identically distributed, where $\bar{Y}_1$ and $Y_1$ are independent and identically distributed (cf. Definition 2.2.8 and Remark 2.2.9 of Peng [2]). We know that $\mathbb{E}(Y_1)=\mathbb{E}(-Y_1)=0$ (cf. Remark 2.2.5 of Peng [2]). Assume that $\mathbb{E}(Y_1^{2})=1>-\mathbb{E}(-Y_1^{2})>0$. Then, by Remarks 3 and 14 of Hu [29], we know that $\mathbb{E}(Y_1^{2n+1})=\mathbb{E}(-Y_1^{2n+1})>0$, $\forall n\ge1$. Hence, for any $n\ge2$, $\mathbb{E}(Y_1^{n})\neq-\mathbb{E}(-Y_1^{n})$ (cf. Proposition 2.2.15 of Peng [2]).
Hereafter, as in Chen and Wu [6], we define some useful functions. Let $2^{-1/p}<\mu<1$, and let $g(y)\in C_{l,Lip}(\mathbb{R})$ be a decreasing function for $y\ge0$ with $0\le g(y)\le1$ for all $y$, $g(y)=1$ if $|y|\le\mu$, and $g(y)=0$ if $|y|>1$. Then
$$I(|y|\le\mu)\le g(|y|)\le I(|y|\le1),\qquad I(|y|>1)\le1-g(|y|)\le I(|y|>\mu).\tag{4.1}$$
Define $g_j(y)\in C_{l,Lip}(\mathbb{R})$, $j\ge1$, such that $0\le g_j(y)\le1$ for all $y$, $g_j(|y|/2^{j/p})=1$ if $2^{(j-1)/p}<|y|\le2^{j/p}$, and $g_j(|y|/2^{j/p})=0$ if $|y|\le\mu2^{(j-1)/p}$ or $|y|>(1+\mu)2^{j/p}$. Then
$$I(2^{(j-1)/p}<|Y|\le2^{j/p})\le g_j\big(|Y|/2^{j/p}\big)\le I(\mu2^{(j-1)/p}<|Y|\le(1+\mu)2^{j/p}),\tag{4.2}$$
$$|Y|^{\alpha}g\big(|Y|/2^{k/p}\big)\le1+\sum_{j=1}^{k}|Y|^{\alpha}g_j\big(|Y|/2^{j/p}\big),\quad\forall\alpha>0,\tag{4.3}$$
$$|Y|^{\alpha}\big(1-g\big(|Y|/2^{k/p}\big)\big)\le\sum_{j=k}^{\infty}|Y|^{\alpha}g_j\big(|Y|/2^{j/p}\big),\quad\forall\alpha>0.\tag{4.4}$$
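One admissible choice of $g$, given only as an illustration (the particular smooth truncation functions used in Chen and Wu [6] are not reproduced here), is the piecewise linear function
$$g(y)=\min\Big\{1,\max\Big\{0,\frac{1-|y|}{1-\mu}\Big\}\Big\},$$
which is Lipschitz (hence belongs to $C_{l,Lip}(\mathbb{R})$), is decreasing in $|y|$ on $[\mu,1]$, equals $1$ for $|y|\le\mu$, vanishes for $|y|\ge1$, and therefore satisfies (4.1).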
Proof of Theorem 3.1. Here we adopt some ideas from the proof of Theorem 2.1 of Guo et al. [26]. For any $x\ge0$ and $-\infty<i<\infty$, write $Y^{(1)}_{xi}=Y_iI(|Y_i|<x)-xI(Y_i\le-x)+xI(Y_i\ge x)$, $Y^{(2)}_{xi}=Y_i-Y^{(1)}_{xi}$, $Y^{(1)}_{x}=YI(|Y|<x)-xI(Y\le-x)+xI(Y\ge x)$, and $Y^{(2)}_{x}=Y-Y^{(1)}_{x}$. Note that
$$\sum_{k=1}^{n}X_k=\sum_{k=1}^{n}\sum_{i=-\infty}^{\infty}a_{i+k}Y_i=\sum_{i=-\infty}^{\infty}a_i\sum_{k=1}^{n}Y_{i-k}=\sum_{i=-\infty}^{\infty}a_i\sum_{j=i-n}^{i-1}Y_j.$$
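A brief justification of this rearrangement (routine, recorded here for completeness): since $\sum_{i}|a_i|<\infty$, the double series may be summed in either order, and substituting $m=i+k$ and then $j=m-k$ gives
$$\sum_{k=1}^{n}\sum_{i=-\infty}^{\infty}a_{i+k}Y_i=\sum_{m=-\infty}^{\infty}a_{m}\sum_{k=1}^{n}Y_{m-k}=\sum_{m=-\infty}^{\infty}a_{m}\sum_{j=m-n}^{m-1}Y_{j},$$
which is the right-hand side above with $m$ relabelled as $i$.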
We see that
$$\begin{aligned}
&\sum_{n=1}^{\infty}n^{r/p-2-1/(pt)}l(n)\,C_{\mathbb{V}}\Big\{\Big[\max_{1\le k\le n}\Big|\sum_{i=1}^{k}X_i\Big|^{1/t}-\epsilon n^{1/(pt)}\Big]^{+}\Big\}\\
&=\sum_{n=1}^{\infty}n^{r/p-2-1/(pt)}l(n)\int_{\epsilon n^{1/(pt)}}^{\infty}\mathbb{V}\Big\{\max_{1\le k\le n}\Big|\sum_{i=1}^{k}X_i\Big|>x^{t}\Big\}\,dx\quad(\text{letting }y=(x/\epsilon)^{t})\\
&=\sum_{n=1}^{\infty}n^{r/p-2-1/(pt)}l(n)\int_{n^{1/p}}^{\infty}\mathbb{V}\Big\{\max_{1\le k\le n}\Big|\sum_{i=1}^{k}X_i\Big|>\epsilon^{t}y\Big\}\frac{\epsilon}{t}\,y^{\frac{1}{t}-1}\,dy\\
&\le C\sum_{n=1}^{\infty}n^{r/p-2-1/(pt)}l(n)\int_{n^{1/p}}^{\infty}x^{\frac{1}{t}-1}\mathbb{V}\Big\{\max_{1\le k\le n}\Big|\sum_{i=-\infty}^{\infty}a_i\sum_{j=i-k}^{i-1}Y^{(2)}_{xj}\Big|\ge\frac{x\epsilon^{t}}{2}\Big\}\,dx\\
&\quad+C\sum_{n=1}^{\infty}n^{r/p-2-1/(pt)}l(n)\int_{n^{1/p}}^{\infty}x^{\frac{1}{t}-1}\mathbb{V}\Big\{\max_{1\le k\le n}\Big|\sum_{i=-\infty}^{\infty}a_i\sum_{j=i-k}^{i-1}Y^{(1)}_{xj}\Big|\ge\frac{x\epsilon^{t}}{2}\Big\}\,dx\\
&:=I_1+I_2.
\end{aligned}\tag{4.5}$$
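The splitting into $I_1$ and $I_2$ rests on the decomposition $Y_j=Y^{(1)}_{xj}+Y^{(2)}_{xj}$ and the sub-additivity of $\mathbb{V}$: since $\sum_{i=1}^{k}X_i=\sum_{i=-\infty}^{\infty}a_i\sum_{j=i-k}^{i-1}Y^{(1)}_{xj}+\sum_{i=-\infty}^{\infty}a_i\sum_{j=i-k}^{i-1}Y^{(2)}_{xj}$, we have
$$\Big\{\max_{1\le k\le n}\Big|\sum_{i=1}^{k}X_i\Big|>\epsilon^{t}x\Big\}\subset\Big\{\max_{1\le k\le n}\Big|\sum_{i=-\infty}^{\infty}a_i\sum_{j=i-k}^{i-1}Y^{(1)}_{xj}\Big|\ge\frac{\epsilon^{t}x}{2}\Big\}\cup\Big\{\max_{1\le k\le n}\Big|\sum_{i=-\infty}^{\infty}a_i\sum_{j=i-k}^{i-1}Y^{(2)}_{xj}\Big|\ge\frac{\epsilon^{t}x}{2}\Big\},$$
and the capacity of the left-hand event is at most the sum of the capacities of the two events on the right.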
For $I_1$, observe that $r/p-1-1/(pt)>-1$ and $C_{\mathbb{V}}(|Y|^{r}l(|Y|^{p}))<\infty$; by Lemmas 2.2 and 2.3, Markov's inequality under sub-linear expectations, (4.1), and (4.4), we get
$$\begin{aligned}
I_1&\le C\sum_{n=1}^{\infty}n^{r/p-2-1/(pt)}l(n)\int_{n^{1/p}}^{\infty}x^{\frac{1}{t}-2}\mathbb{E}^{*}\Big[\max_{1\le k\le n}\Big|\sum_{i=-\infty}^{\infty}a_i\sum_{j=i-k}^{i-1}Y^{(2)}_{xj}\Big|\Big]dx\\
&\le C\sum_{n=1}^{\infty}n^{r/p-2-1/(pt)}l(n)\int_{n^{1/p}}^{\infty}x^{\frac{1}{t}-2}\max_{-\infty<i<\infty}\mathbb{E}^{*}\Big[\Big|\sum_{j=i-n}^{i-1}|Y_j|(1-g(|Y_j|/x))\Big|\Big]dx\\
&=C\sum_{n=1}^{\infty}n^{r/p-2-1/(pt)}l(n)\int_{n^{1/p}}^{\infty}x^{\frac{1}{t}-2}\max_{-\infty<i<\infty}\mathbb{E}\Big[\Big|\sum_{j=i-n}^{i-1}|Y_j|(1-g(|Y_j|/x))\Big|\Big]dx\\
&\le C\sum_{n=1}^{\infty}n^{r/p-1-1/(pt)}l(n)\sum_{k=n}^{\infty}\int_{k^{1/p}}^{(k+1)^{1/p}}x^{\frac{1}{t}-2}\mathbb{E}\big(|Y|(1-g(|Y|/x))\big)dx\\
&\le C\sum_{n=1}^{\infty}n^{r/p-1-1/(pt)}l(n)\sum_{k=n}^{\infty}k^{1/(pt)-1/p-1}\mathbb{E}\big(|Y|(1-g(|Y|/k^{1/p}))\big)\\
&\le C\sum_{k=1}^{\infty}k^{1/(pt)-1/p-1}\mathbb{E}\big(|Y|(1-g(|Y|/k^{1/p}))\big)\sum_{n=1}^{k}n^{r/p-1-1/(pt)}l(n)\\
&\le C\sum_{k=1}^{\infty}k^{r/p-1-1/p}l(k)\mathbb{E}\big(|Y|(1-g(|Y|/k^{1/p}))\big)=C\sum_{n=0}^{\infty}\sum_{k=2^{n}}^{2^{n+1}-1}k^{r/p-1-1/p}l(k)\mathbb{E}\big(|Y|(1-g(|Y|/k^{1/p}))\big)\\
&\le C\sum_{n=1}^{\infty}2^{n(r/p-1/p)}l(2^{n})\mathbb{E}\big(|Y|(1-g(|Y|/2^{n/p}))\big)\le C\sum_{n=1}^{\infty}2^{n(r/p-1/p)}l(2^{n})\mathbb{E}^{*}\Big(\sum_{j=n}^{\infty}|Y|g_j(|Y|/2^{j/p})\Big)\\
&\le C\sum_{n=1}^{\infty}2^{n(r/p-1/p)}l(2^{n})\sum_{j=n}^{\infty}\mathbb{E}^{*}\big(|Y|g_j(|Y|/2^{j/p})\big)=C\sum_{j=1}^{\infty}\mathbb{E}\big(|Y|g_j(|Y|/2^{j/p})\big)\sum_{n=1}^{j}2^{n(r/p-1/p)}l(2^{n})\\
&\le C\sum_{j=1}^{\infty}2^{jr/p}l(2^{j})\mathbb{V}\{|Y|>\mu2^{(j-1)/p}\}<\infty.
\end{aligned}\tag{4.6}$$
Next we bound $I_2$. By Lemma 2.2, Markov's inequality under sub-linear expectations, and Hölder's inequality, we see that for $q\ge2$,
$$\begin{aligned}
I_2&\le C\sum_{n=1}^{\infty}n^{r/p-2-1/(pt)}l(n)\int_{n^{1/p}}^{\infty}x^{\frac{1}{t}-1}x^{-q}\mathbb{E}^{*}\Big[\max_{1\le k\le n}\Big|\sum_{i=-\infty}^{\infty}a_i\sum_{j=i-k}^{i-1}Y^{(1)}_{xj}\Big|^{q}\Big]dx\\
&\le C\sum_{n=1}^{\infty}n^{r/p-2-1/(pt)}l(n)\int_{n^{1/p}}^{\infty}x^{\frac{1}{t}-1-q}\mathbb{E}^{*}\Big[\max_{1\le k\le n}\Big(\sum_{i=-\infty}^{\infty}\big(|a_i|^{1-1/q}\big)\big(|a_i|^{1/q}\big)\Big|\sum_{j=i-k}^{i-1}Y^{(1)}_{xj}\Big|\Big)^{q}\Big]dx\\
&\le C\sum_{n=1}^{\infty}n^{r/p-2-1/(pt)}l(n)\int_{n^{1/p}}^{\infty}x^{\frac{1}{t}-1-q}\Big(\sum_{i=-\infty}^{\infty}|a_i|\Big)^{q-1}\Big(\sum_{i=-\infty}^{\infty}|a_i|\,\mathbb{E}^{*}\Big(\max_{1\le k\le n}\Big|\sum_{j=i-k}^{i-1}Y^{(1)}_{xj}\Big|^{q}\Big)\Big)dx\\
&\le C\sum_{n=1}^{\infty}n^{r/p-2-1/(pt)}l(n)\int_{n^{1/p}}^{\infty}x^{\frac{1}{t}-1-q}\max_{-\infty<i<\infty}\mathbb{E}\Big(\max_{1\le k\le n}\Big|\sum_{j=i-k}^{i-1}Y^{(1)}_{xj}\Big|^{q}\Big)dx\\
&\le C\sum_{n=1}^{\infty}n^{r/p-2-1/(pt)}l(n)\int_{n^{1/p}}^{\infty}x^{\frac{1}{t}-1-q}\max_{-\infty<i<\infty}\Big(\sum_{j=i-n}^{i-1}\mathbb{E}|Y^{(1)}_{xj}|^{q}\Big)dx\\
&\quad+C\sum_{n=1}^{\infty}n^{r/p-2-1/(pt)}l(n)\int_{n^{1/p}}^{\infty}x^{\frac{1}{t}-1-q}\max_{-\infty<i<\infty}\Big(\sum_{j=i-n}^{i-1}\mathbb{E}(|Y^{(1)}_{xj}|^{2})\Big)^{q/2}dx\\
&\quad+C\sum_{n=1}^{\infty}n^{r/p-2-1/(pt)}l(n)\int_{n^{1/p}}^{\infty}x^{\frac{1}{t}-1-q}\max_{-\infty<i<\infty}\Big(\sum_{j=i-n}^{i-1}\big[|\mathbb{E}(Y^{(1)}_{xj})|+|\mathbb{E}(-Y^{(1)}_{xj})|\big]\Big)^{q}dx\\
&:=I_{21}+I_{22}+I_{23}.
\end{aligned}$$
For $I_{21}$, take $q>\max\{r,2\}$. By Lemma 2.3, (4.1), (4.2), and (4.3), and since for every $x>0$ the function $f(\cdot):=|\cdot|^{q}I(|\cdot|\le x)+x^{q}I(|\cdot|>x)$ belongs to $C_{l,Lip}(\mathbb{R})$, we see that
$$\begin{aligned}
I_{21}&\le C\sum_{n=1}^{\infty}n^{r/p-2-1/(pt)}l(n)\int_{n^{1/p}}^{\infty}x^{\frac{1}{t}-1-q}\big(n\mathbb{E}|Y^{(1)}_{x}|^{q}\big)dx\\
&\le C\sum_{n=1}^{\infty}n^{r/p-1-1/(pt)}l(n)\int_{n^{1/p}}^{\infty}x^{\frac{1}{t}-1-q}\big[x^{q}\mathbb{E}(1-g(|Y|/x))+\mathbb{E}(|Y|^{q}g(\mu|Y|/x))\big]dx\\
&=C\sum_{n=1}^{\infty}n^{r/p-1-1/(pt)}l(n)\sum_{m=n}^{\infty}\int_{m^{1/p}}^{(m+1)^{1/p}}x^{\frac{1}{t}-1}\mathbb{E}(1-g(|Y|/x))dx\\
&\quad+C\sum_{n=1}^{\infty}n^{r/p-1-1/(pt)}l(n)\sum_{m=n}^{\infty}\int_{m^{1/p}}^{(m+1)^{1/p}}x^{\frac{1}{t}-1-q}\mathbb{E}(|Y|^{q}g(\mu|Y|/x))dx\\
&\le C\sum_{n=1}^{\infty}n^{r/p-1-1/(pt)}l(n)\sum_{m=n}^{\infty}m^{\frac{1}{tp}-1}\mathbb{V}\{|Y|>\mu m^{1/p}\}\\
&\quad+C\sum_{n=1}^{\infty}n^{r/p-1-1/(pt)}l(n)\sum_{m=n}^{\infty}m^{\frac{1}{tp}-1-q/p}\mathbb{E}(|Y|^{q}g(\mu|Y|/(m+1)^{1/p}))\\
&\le C\sum_{m=1}^{\infty}m^{\frac{r}{p}-1}l(m)\mathbb{V}\{|Y|>\mu m^{1/p}\}+C\sum_{m=1}^{\infty}m^{\frac{1}{tp}-1-q/p}\mathbb{E}(|Y|^{q}g(\mu|Y|/(m+1)^{1/p}))\sum_{n=1}^{m}n^{r/p-1-1/(pt)}l(n)\\
&\le C+C\sum_{k=0}^{\infty}\sum_{m=2^{k}}^{2^{k+1}-1}m^{\frac{r}{p}-1-q/p}l(m)\mathbb{E}(|Y|^{q}g(\mu|Y|/(m+1)^{1/p}))\\
&\le C+C\sum_{k=1}^{\infty}2^{k(r/p-q/p)}l(2^{k})\mathbb{E}(|Y|^{q}g(\mu|Y|/2^{(k+1)/p}))\\
&\le C+C\sum_{k=1}^{\infty}2^{k(r/p-q/p)}l(2^{k})\mathbb{E}\Big(1+\sum_{j=1}^{k}|Y|^{q}g_j(\mu|Y|/2^{(j+1)/p})\Big)\\
&\le C+C\sum_{k=1}^{\infty}2^{k(r/p-q/p)}l(2^{k})+C\sum_{k=1}^{\infty}2^{k(r/p-q/p)}l(2^{k})\sum_{j=1}^{k}\mathbb{E}(|Y|^{q}g_j(\mu|Y|/2^{(j+1)/p}))\\
&\le C+C\sum_{j=1}^{\infty}2^{jq/p}\mathbb{V}\{|Y|>2^{j/p}\}\sum_{k=j}^{\infty}2^{k(r/p-q/p)}l(2^{k})\le C+C\sum_{j=1}^{\infty}2^{jr/p}l(2^{j})\mathbb{V}\{|Y|>2^{j/p}\}<\infty,
\end{aligned}\tag{4.7}$$
where $C\sum_{m=1}^{\infty}m^{r/p-1}l(m)\mathbb{V}\{|Y|>\mu m^{1/p}\}$ and $C\sum_{k=1}^{\infty}2^{k(r/p-q/p)}l(2^{k})$ are finite by Lemma 2.3 (i) and by $q>r$, respectively, and have been absorbed into the constant $C$.
For $I_{22}$, we consider the following two cases. If $r\le2$, take $q>2$. Note that $r/p-(r/p-1)q/2<1$ and $r/p-2-1/(pt)+q/2>-1$. We get
$$\begin{aligned}
I_{22}&\le C\sum_{n=1}^{\infty}n^{r/p-2-1/(pt)+q/2}l(n)\int_{n^{1/p}}^{\infty}x^{\frac{1}{t}-1-q}\big(\mathbb{E}|Y^{(1)}_{x}|^{2}\big)^{q/2}dx\\
&\le C\sum_{n=1}^{\infty}n^{r/p-2-1/(pt)+q/2}l(n)\int_{n^{1/p}}^{\infty}x^{\frac{1}{t}-1-q}x^{(2-r)q/2}\big(\mathbb{E}|Y^{(1)}_{x}|^{r}\big)^{q/2}dx\\
&\le C\sum_{n=1}^{\infty}n^{r/p-(r/p-1)q/2-2}\big(\mathbb{E}|Y|^{r}\big)^{q/2}\le C\sum_{n=1}^{\infty}n^{r/p-(r/p-1)q/2-2}\big(C_{\mathbb{V}}(|Y|^{r})\big)^{q/2}<\infty.
\end{aligned}\tag{4.8}$$
If $r>2$, take $q>\max\{2p(r/p-1)/(2-p),t^{-1}\}$; then $r/p-q/p+q/2<1$. Note that $\mathbb{E}(Y^{2})\le C_{\mathbb{V}}(Y^{2})\le CC_{\mathbb{V}}(|Y|^{r}l(|Y|^{p}))<\infty$ in this case. Therefore, we get
$$I_{22}\le C\sum_{n=1}^{\infty}n^{r/p-2-1/(pt)+q/2}l(n)\int_{n^{1/p}}^{\infty}x^{\frac{1}{t}-1-q}dx\le C\sum_{n=1}^{\infty}n^{r/p-2-q/p+q/2}l(n)<\infty.\tag{4.9}$$
Combining (4.8) and (4.9) results in $I_{22}<\infty$.
For $I_{23}$, take $q>2$. Observe that $r\ge1+p/2>p$. By $\mathbb{E}(Y_i)=\mathbb{E}(-Y_i)=0$, Proposition 1.3.7 of Peng [2], and Lemma 2.1, we see that
$$\begin{aligned}
I_{23}&\le C\sum_{n=1}^{\infty}n^{r/p-2-1/(pt)}l(n)\sum_{k=n}^{\infty}\int_{k^{1/p}}^{(k+1)^{1/p}}x^{\frac{1}{t}-1-q}\max_{-\infty<i<\infty}\Big(\sum_{j=i-n}^{i-1}\big[\mathbb{E}|Y^{(1)}_{xj}-Y_j|+\mathbb{E}|-Y^{(1)}_{xj}+Y_j|\big]\Big)^{q}dx\\
&\le C\sum_{n=1}^{\infty}n^{r/p-2-1/(pt)}l(n)\sum_{k=n}^{\infty}\int_{k^{1/p}}^{(k+1)^{1/p}}x^{\frac{1}{t}-1-q}\max_{-\infty<i<\infty}\Big(\sum_{j=i-n}^{i-1}\mathbb{E}|Y^{(1)}_{xj}-Y_j|\Big)^{q}dx\\
&\le C\sum_{n=1}^{\infty}n^{r/p-2-1/(pt)+q}l(n)\sum_{k=n}^{\infty}\int_{k^{1/p}}^{(k+1)^{1/p}}x^{\frac{1}{t}-1-q}\big(\mathbb{E}|Y|(1-g(|Y|/x))\big)^{q}dx\\
&\le C\sum_{k=1}^{\infty}k^{\frac{1}{tp}-1-q/p}\big(\mathbb{E}|Y|(1-g(|Y|/k^{1/p}))\big)^{q}\sum_{n=1}^{k}n^{r/p-2-1/(pt)+q}l(n)\\
&\le C\sum_{k=1}^{\infty}k^{1/(pt)-1-q/p}\Big(\frac{\mathbb{E}|Y|^{r}l(|Y|^{p})}{k^{(r-1)/p}l(k)}\Big)^{q}k^{r/p-1-1/(pt)+q}l(k)\\
&\le C\sum_{k=1}^{\infty}\frac{k^{-(r/p-1)(q-1)-1}}{l(k)^{q-1}}\big(C_{\mathbb{V}}\{|Y|^{r}l(|Y|^{p})\}\big)^{q}<\infty.
\end{aligned}\tag{4.10}$$
Hence, by (4.5) and (4.6)–(4.10), we establish (3.1).
Now we prove (3.2). By $r/p>1$ and the countable sub-additivity of $\mathbb{V}$, we obtain
$$\begin{aligned}
&\sum_{n=1}^{\infty}n^{r/p-2}l(n)\,C_{\mathbb{V}}\Big\{\Big[\sup_{k\ge n}\Big|k^{-1/p}\sum_{i=1}^{k}X_i\Big|^{1/t}-\epsilon\Big]^{+}\Big\}=\sum_{n=1}^{\infty}n^{r/p-2}l(n)\int_{\epsilon}^{\infty}\mathbb{V}\Big\{\sup_{k\ge n}\Big|k^{-1/p}\sum_{i=1}^{k}X_i\Big|^{1/t}>x\Big\}dx\\
&=\sum_{j=0}^{\infty}\sum_{n=2^{j}}^{2^{j+1}-1}n^{r/p-2}l(n)\int_{\epsilon}^{\infty}\mathbb{V}\Big\{\sup_{k\ge n}\Big|k^{-1/p}\sum_{i=1}^{k}X_i\Big|^{1/t}>x\Big\}dx\\
&\le C\sum_{j=0}^{\infty}2^{j(r/p-1)}l(2^{j})\int_{\epsilon}^{\infty}\mathbb{V}\Big\{\sup_{k\ge2^{j}}\Big|k^{-1/p}\sum_{i=1}^{k}X_i\Big|^{1/t}>x\Big\}dx\\
&\le C\sum_{j=0}^{\infty}2^{j(r/p-1)}l(2^{j})\sum_{\ell=j}^{\infty}\int_{\epsilon}^{\infty}\mathbb{V}\Big\{\sup_{2^{\ell}\le k\le2^{\ell+1}}\Big|\sum_{i=1}^{k}X_i\Big|^{1/t}>x2^{\ell/(pt)}\Big\}dx\\
&\le C\sum_{\ell=0}^{\infty}2^{\ell(r/p-1)}l(2^{\ell})\int_{\epsilon}^{\infty}\mathbb{V}\Big\{\sup_{2^{\ell}\le k\le2^{\ell+1}}\Big|\sum_{i=1}^{k}X_i\Big|^{1/t}>x2^{\ell/(pt)}\Big\}dx\\
&\le C\sum_{n=1}^{\infty}n^{r/p-2}l(n)\int_{\epsilon'}^{\infty}\mathbb{V}\Big\{\sup_{1\le k\le n}\Big|\sum_{i=1}^{k}X_i\Big|^{1/t}>xn^{1/(pt)}\Big\}dx\quad(\text{letting }\epsilon'=\epsilon2^{-1/(pt)})\\
&\le C\sum_{n=1}^{\infty}n^{r/p-2-1/(pt)}l(n)\int_{\epsilon'n^{1/(pt)}}^{\infty}\mathbb{V}\Big\{\sup_{1\le k\le n}\Big|\sum_{i=1}^{k}X_i\Big|^{1/t}>x\Big\}dx\\
&\le C\sum_{n=1}^{\infty}n^{r/p-2-1/(pt)}l(n)\,C_{\mathbb{V}}\Big\{\Big[\max_{1\le k\le n}\Big|\sum_{i=1}^{k}X_i\Big|^{1/t}-\epsilon'n^{1/(pt)}\Big]^{+}\Big\}<\infty.
\end{aligned}\tag{4.11}$$
Hence (3.2) is proved.
Proof of Theorem 3.2. As in the proof of Theorem 3.1, it suffices to prove that $I_1<\infty$, $I_{21}<\infty$, $I_{22}<\infty$, and $I_{23}<\infty$. Indeed, observe that $r/p-1-1/(pt)<-1$ yields $\sum_{n=1}^{\infty}n^{r/p-1-1/(pt)}l(n)<\infty$. Therefore, by the proof of (4.6), by (4.4), and by Lemma 2.3, we get
$$\begin{aligned}
I_1&\le C\sum_{k=1}^{\infty}k^{1/(pt)-1/p-1}\mathbb{E}\big[|Y|(1-g(|Y|/k^{1/p}))\big]\sum_{n=1}^{k}n^{r/p-1-1/(pt)}l(n)\\
&\le C\sum_{k=1}^{\infty}k^{1/(pt)-1/p-1}\mathbb{E}\big[|Y|(1-g(|Y|/k^{1/p}))\big]=C\sum_{n=0}^{\infty}\sum_{k=2^{n}}^{2^{n+1}-1}k^{1/(pt)-1/p-1}\mathbb{E}\big[|Y|(1-g(|Y|/k^{1/p}))\big]\\
&\le C\sum_{n=1}^{\infty}2^{n(1/(pt)-1/p)}\mathbb{E}\big[|Y|(1-g(|Y|/2^{n/p}))\big]\le C\sum_{n=1}^{\infty}2^{n(1/(pt)-1/p)}\mathbb{E}^{*}\Big[\sum_{j=n}^{\infty}|Y|g_j(|Y|/2^{j/p})\Big]\\
&\le C\sum_{j=1}^{\infty}\mathbb{E}^{*}\big[|Y|g_j(|Y|/2^{j/p})\big]\sum_{n=1}^{j}2^{n(1/(pt)-1/p)}\le C\sum_{j=1}^{\infty}2^{j/(pt)}\mathbb{E}\big[g_j(|Y|/2^{j/p})\big]\\
&\le C\sum_{j=1}^{\infty}2^{j/(pt)}\mathbb{V}\{|Y|>\mu2^{(j-1)/p}\}<\infty.
\end{aligned}$$
For $I_{22}$ and $I_{23}$, take $q>\max\{t^{-1},2(r-p)/(2-p),2+2(1/t-r)/p\}$. By the proofs of (4.8), (4.9), and (4.10), we obtain $I_{22}<\infty$ and $I_{23}<\infty$.
For $I_{21}$, take $q>\max\{2,t^{-1}\}$. By the proof of (4.7) and by (4.3), we see that
$$\begin{aligned}
I_{21}&\le C\sum_{m=1}^{\infty}m^{1/(pt)-q/p-1}\mathbb{E}\big[|Y|^{q}g(\mu|Y|/(m+1)^{1/p})\big]\sum_{n=1}^{m}n^{r/p-1-1/(pt)}l(n)\\
&\le C\sum_{k=0}^{\infty}\sum_{m=2^{k}}^{2^{k+1}-1}m^{1/(pt)-q/p-1}\mathbb{E}\big[|Y|^{q}g(\mu|Y|/(m+1)^{1/p})\big]\le C\sum_{k=1}^{\infty}2^{k(1/(pt)-q/p)}\mathbb{E}\big[|Y|^{q}g(\mu|Y|/2^{(k+1)/p})\big]\\
&\le C\sum_{k=1}^{\infty}2^{k(1/(pt)-q/p)}\mathbb{E}\Big[1+\sum_{j=1}^{k}|Y|^{q}g_j(\mu|Y|/2^{(j+1)/p})\Big]\\
&\le C\sum_{k=1}^{\infty}2^{k(1/(pt)-q/p)}+C\sum_{k=1}^{\infty}2^{k(1/(pt)-q/p)}\sum_{j=1}^{k}\mathbb{E}\big[|Y|^{q}g_j(\mu|Y|/2^{(j+1)/p})\big]\\
&\le C+C\sum_{j=1}^{\infty}2^{jq/p}\mathbb{V}\{|Y|>2^{j/p}\}\sum_{k=j}^{\infty}2^{k(1/(pt)-q/p)}\le C+C\sum_{j=1}^{\infty}2^{j/(tp)}\mathbb{V}\{|Y|>2^{j/p}\}<\infty,
\end{aligned}$$
where $\sum_{k=1}^{\infty}2^{k(1/(pt)-q/p)}<\infty$ because $q>t^{-1}$ implies $1/(pt)-q/p<0$.
Proof of Theorem 3.3. By the proof of (4.5), we get
$$\begin{aligned}
&\sum_{n=1}^{\infty}n^{-1-1/(pt)}l(n)\,C_{\mathbb{V}}\Big\{\Big[\max_{1\le k\le n}\Big|\sum_{i=1}^{k}X_i\Big|^{1/t}-\epsilon n^{1/(pt)}\Big]^{+}\Big\}\\
&\le C\sum_{n=1}^{\infty}n^{-1-1/(pt)}l(n)\int_{n^{1/p}}^{\infty}x^{\frac{1}{t}-1}\mathbb{V}\Big\{\max_{1\le k\le n}\Big|\sum_{i=-\infty}^{\infty}a_i\sum_{j=i-k}^{i-1}Y^{(2)}_{xj}\Big|\ge\frac{x\epsilon^{t}}{2}\Big\}dx\\
&\quad+C\sum_{n=1}^{\infty}n^{-1-1/(pt)}l(n)\int_{n^{1/p}}^{\infty}x^{\frac{1}{t}-1}\mathbb{V}\Big\{\max_{1\le k\le n}\Big|\sum_{i=-\infty}^{\infty}a_i\sum_{j=i-k}^{i-1}Y^{(1)}_{xj}\Big|\ge\frac{x\epsilon^{t}}{2}\Big\}dx\\
&:=J_1+J_2.
\end{aligned}\tag{4.12}$$
Observe that $pt>1$ and $C_{\mathbb{V}}(|Y|^{1/t}l(|Y|^{p}))<\infty$. By Markov's inequality under sub-linear expectations, Lemmas 2.2 and 2.3, and (4.4), we have
$$\begin{aligned}
J_1&\le C\sum_{n=1}^{\infty}n^{-1-1/(pt)}l(n)\int_{n^{1/p}}^{\infty}x^{\frac{1}{t}-2}\mathbb{E}^{*}\max_{1\le k\le n}\Big|\sum_{i=-\infty}^{\infty}a_i\sum_{j=i-k}^{i-1}Y^{(2)}_{xj}\Big|dx\\
&\le C\sum_{n=1}^{\infty}n^{-1/(pt)}l(n)\sum_{k=n}^{\infty}\int_{k^{1/p}}^{(k+1)^{1/p}}x^{\frac{1}{t}-2}\mathbb{E}(|Y^{(2)}_{x}|)dx\\
&\le C\sum_{n=1}^{\infty}n^{-1/(pt)}l(n)\sum_{k=n}^{\infty}\int_{k^{1/p}}^{(k+1)^{1/p}}x^{\frac{1}{t}-2}\mathbb{E}\big(|Y|(1-g(|Y|/x))\big)dx\\
&\le C\sum_{k=1}^{\infty}k^{\frac{1}{tp}-\frac{1}{p}-1}\mathbb{E}\big(|Y|(1-g(|Y|/k^{1/p}))\big)\sum_{n=1}^{k}n^{-1/(pt)}l(n)\\
&\le C\sum_{k=1}^{\infty}k^{-1/p}l(k)\mathbb{E}\big(|Y|(1-g(|Y|/k^{1/p}))\big)=C\sum_{n=0}^{\infty}\sum_{k=2^{n}}^{2^{n+1}-1}k^{-1/p}l(k)\mathbb{E}\big(|Y|(1-g(|Y|/k^{1/p}))\big)\\
&\le C\sum_{n=1}^{\infty}2^{(1-1/p)n}l(2^{n})\mathbb{E}\big(|Y|(1-g(|Y|/2^{n/p}))\big)\le C\sum_{n=1}^{\infty}2^{(1-1/p)n}l(2^{n})\mathbb{E}^{*}\Big(\sum_{j=n}^{\infty}|Y|g_j(|Y|/2^{j/p})\Big)\\
&\le C\sum_{j=1}^{\infty}\mathbb{E}^{*}\big(|Y|g_j(|Y|/2^{j/p})\big)\sum_{n=1}^{j}2^{(1-1/p)n}l(2^{n})\le C\sum_{j=1}^{\infty}2^{j}l(2^{j})\mathbb{V}\{|Y|>\mu2^{(j-1)/p}\}<\infty.
\end{aligned}\tag{4.13}$$
For $J_2$, as in the treatment of $I_2$, choose $q=2$; by (2.1), we get
$$\begin{aligned}
J_2&\le C\sum_{n=1}^{\infty}n^{-1-1/(pt)}l(n)\int_{n^{1/p}}^{\infty}x^{\frac{1}{t}-3}\max_{-\infty<i<\infty}\Big(\sum_{j=i-n}^{i-1}\mathbb{E}(|Y^{(1)}_{xj}|^{2})\Big)dx\\
&\quad+C\sum_{n=1}^{\infty}n^{-1-1/(pt)}l(n)\int_{n^{1/p}}^{\infty}x^{\frac{1}{t}-3}\max_{-\infty<i<\infty}\Big(\sum_{j=i-n}^{i-1}\big[|\mathbb{E}Y^{(1)}_{xj}|+|\mathbb{E}(-Y^{(1)}_{xj})|\big]\Big)^{2}dx\\
&=:J_{21}+J_{22}.
\end{aligned}$$
By Lemma 2.3, (4.1), (4.3), we conclude that
$$\begin{aligned}
J_{21}&\le C\sum_{n=1}^{\infty}n^{-1/(pt)}l(n)\int_{n^{1/p}}^{\infty}x^{\frac{1}{t}-3}\mathbb{E}(|Y^{(1)}_{x}|^{2})dx\\
&\le C\sum_{n=1}^{\infty}n^{-1/(pt)}l(n)\int_{n^{1/p}}^{\infty}x^{\frac{1}{t}-3}\big[x^{2}\mathbb{E}(1-g(|Y|/x))+\mathbb{E}(|Y|^{2}g(\mu|Y|/x))\big]dx\\
&=C\sum_{n=1}^{\infty}n^{-1/(pt)}l(n)\sum_{m=n}^{\infty}\int_{m^{1/p}}^{(m+1)^{1/p}}x^{\frac{1}{t}-1}\mathbb{E}(1-g(|Y|/x))dx\\
&\quad+C\sum_{n=1}^{\infty}n^{-1/(pt)}l(n)\sum_{m=n}^{\infty}\int_{m^{1/p}}^{(m+1)^{1/p}}x^{\frac{1}{t}-3}\mathbb{E}(|Y|^{2}g(\mu|Y|/x))dx\\
&\le C\sum_{m=1}^{\infty}m^{\frac{1}{tp}-1}\mathbb{E}(1-g(|Y|/m^{1/p}))\sum_{n=1}^{m}n^{-1/(pt)}l(n)+C\sum_{m=1}^{\infty}m^{\frac{1}{tp}-\frac{2}{p}-1}\mathbb{E}(|Y|^{2}g(\mu|Y|/(m+1)^{1/p}))\sum_{n=1}^{m}n^{-1/(pt)}l(n)\\
&\le C\sum_{m=1}^{\infty}l(m)\mathbb{V}\{|Y|>\mu m^{1/p}\}+C\sum_{m=1}^{\infty}m^{-\frac{2}{p}}l(m)\mathbb{E}(|Y|^{2}g(\mu|Y|/(m+1)^{1/p}))\\
&\le C+C\sum_{n=0}^{\infty}\sum_{m=2^{n}}^{2^{n+1}-1}m^{-\frac{2}{p}}l(m)\mathbb{E}(|Y|^{2}g(\mu|Y|/(m+1)^{1/p}))\le C+C\sum_{n=1}^{\infty}2^{(1-2/p)n}l(2^{n})\mathbb{E}(|Y|^{2}g(\mu|Y|/2^{(n+1)/p}))\\
&\le C+C\sum_{n=1}^{\infty}2^{(1-2/p)n}l(2^{n})\mathbb{E}\Big[1+\sum_{j=1}^{n}|Y|^{2}g_j(\mu|Y|/2^{(j+1)/p})\Big]\\
&\le C+C\sum_{n=1}^{\infty}2^{(1-2/p)n}l(2^{n})+C\sum_{n=1}^{\infty}2^{(1-2/p)n}l(2^{n})\sum_{j=1}^{n}\mathbb{E}\big[|Y|^{2}g_j(\mu|Y|/2^{(j+1)/p})\big]\\
&\le C+C\sum_{j=1}^{\infty}2^{2j/p}\mathbb{V}\{|Y|>2^{j/p}\}\sum_{n=j}^{\infty}2^{(1-2/p)n}l(2^{n})\le C+C\sum_{j=1}^{\infty}2^{j}l(2^{j})\mathbb{V}\{|Y|>2^{j/p}\}<\infty,
\end{aligned}$$
where $\sum_{m=1}^{\infty}l(m)\mathbb{V}\{|Y|>\mu m^{1/p}\}<\infty$ by Lemma 2.3 (i) and $\sum_{n=1}^{\infty}2^{(1-2/p)n}l(2^{n})<\infty$ because $p<2$; both terms have been absorbed into the constant $C$.
By $\mathbb{E}(-Y_i)=\mathbb{E}(Y_i)=0$, Proposition 1.3.7 of Peng [2], (4.1), and Lemma 2.1, we see that
$$\begin{aligned}
J_{22}&\le C\sum_{n=1}^{\infty}n^{-1-1/(pt)}l(n)\int_{n^{1/p}}^{\infty}x^{\frac{1}{t}-3}\big[n\mathbb{E}|Y|(1-g(|Y|/x))\big]^{2}dx\\
&\le C\sum_{n=1}^{\infty}n^{1-1/(pt)}l(n)\sum_{k=n}^{\infty}\int_{k^{1/p}}^{(k+1)^{1/p}}x^{\frac{1}{t}-3}\big[\mathbb{E}|Y|(1-g(|Y|/k^{1/p}))\big]^{2}dx\\
&\le C\sum_{k=1}^{\infty}k^{1-2/p}l(k)\big[\mathbb{E}|Y|(1-g(|Y|/k^{1/p}))\big]^{2}\le C\sum_{k=1}^{\infty}k^{1-2/p}l(k)\big[C_{\mathbb{V}}\big(|Y|(1-g(|Y|/k^{1/p}))\big)\big]^{2}\\
&\le C\sum_{k=1}^{\infty}k^{1-2/p}l(k)\big[C_{\mathbb{V}}\big(|Y|I(|Y|>\mu k^{1/p})\big)\big]^{2}\\
&\le C\sum_{k=1}^{\infty}k^{1-2/p}l(k)\Big[\int_{0}^{\mu k^{1/p}}\mathbb{V}(|Y|>\mu k^{1/p})dy+\int_{\mu k^{1/p}}^{\infty}\mathbb{V}(|Y|>y)dy\Big]^{2}\\
&\le C\sum_{k=1}^{\infty}kl(k)\big[\mathbb{V}\{|Y|>\mu k^{1/p}\}\big]^{2}+C\sum_{k=1}^{\infty}k^{1-2/p}l(k)\Big[\int_{\mu k^{1/p}}^{\infty}\mathbb{V}\{|Y|>y\}dy\Big]^{2}\\
&\le C\int_{1}^{\infty}xl(x)\mathbb{V}^{2}\{|Y|>\mu x^{1/p}\}dx+C\int_{1}^{\infty}x^{1-2/p}l(x)dx\int_{\mu x^{1/p}}^{\infty}\mathbb{V}\{|Y|>y\}dy\int_{\mu x^{1/p}}^{y}\mathbb{V}\{|Y|>z\}dz\\
&\le C\int_{1}^{\infty}\big(xl(x)\mathbb{V}\{|Y|^{p}l(|Y|^{p})>Cxl(x)\}\big)\mathbb{V}\{|Y|^{p}>Cx\}dx+C\int_{\mu}^{\infty}\mathbb{V}\{|Y|>y\}dy\int_{\mu}^{y}\mathbb{V}\{|Y|>z\}dz\int_{1}^{(z/\mu)^{p}}x^{1-2/p}l(x)dx\\
&\le C\int_{1}^{\infty}\mathbb{V}\{|Y|^{p}>Cx\}dx+C\int_{\mu}^{\infty}\mathbb{V}\{|Y|>y\}dy\int_{\mu}^{y}\mathbb{V}\{|Y|>z\}z^{2p-2}l(z^{p})dz\\
&\le CC_{\mathbb{V}}\{|Y|^{p}\}+C\int_{\mu}^{\infty}\mathbb{V}\{|Y|>y\}dy\int_{\mu}^{y}\frac{\mathbb{E}(|Y|^{p})}{z^{p}}z^{2p-2}l(z^{p})dz\\
&\le C+C\int_{\mu}^{\infty}\mathbb{V}\{|Y|>y\}\,C_{\mathbb{V}}\{|Y|^{p}\}\,y^{p-1}l(y^{p})dy\le CC_{\mathbb{V}}\{|Y|^{p}l(|Y|^{p})\}<\infty.
\end{aligned}$$
Hence, (3.5) is proved.
We have obtained new results on the complete moment convergence of maximal partial sums of moving average processes produced by negatively dependent random variables under sub-linear expectations. The results obtained in this article generalize those for negatively dependent random variables in probability spaces, and Theorems 3.1–3.3 complement the results of Xu et al. [7,23], Xu and Kong [8], and Xu [24] in some sense.
This study was supported by Science and Technology Research Project of Jiangxi Provincial Department of Education of China (No. GJJ2201041), Doctoral Scientific Research Starting Foundation of Jingdezhen Ceramic University (No. 102/01003002031), Academic Achievement Re-cultivation Project of Jingdezhen Ceramic University (Grant No. 215/20506277).
Artificial Intelligence tools were not used.
The author declares that there are no conflicts of interest.
[1] | S. G. Peng, G-expectation, G-Brownian motion and related stochastic calculus of Itô type, In: Stochastic Analysis and Applications, Berlin, Heidelberg: Springer, 2007,541–561. https://doi.org/10.1007/978-3-540-70847-6_25 |
[2] | S. G. Peng, Nonlinear expectations and stochastic calculus under uncertainty, Berlin: Springer, 2019. https://doi.org/10.1007/978-3-662-59903-7 |
[3] | L. X. Zhang, Exponential inequalities under the sub-linear expectations with applications to laws of the iterated logarithm, Sci. China Math., 59 (2016), 2503–2526. https://doi.org/10.1007/s11425-016-0079-1 |
[4] | L. X. Zhang, Rosenthal's inequalities for independent and negatively dependent random variables under sub-linear expectations with applications, Sci. China Math., 59 (2016), 751–768. https://doi.org/10.1007/s11425-015-5105-2 |
[5] | L. X. Zhang, Strong limit theorems for extended independent and extended negatively dependent random variables under sub-linear expectations, Acta Math. Sci., 42 (2022), 467–490. https://doi.org/10.1007/s10473-022-0203-z |
[6] | X. C. Chen, Q. Y. Wu, Complete convergence theorems for moving average process generated by independent random variables under sub-linear expectations, Commun. Stat.-Theory Methods, 2023. https://doi.org/10.1080/03610926.2023.2220449 |
[7] | M. Z. Xu, K. Cheng, W. K. Yu, Complete convergence for weighted sums of negatively dependent random variables under sub-linear expectations, AIMS Mathematics, 7 (2022), 19998–20019. https://doi.org/10.3934/math.20221094 |
[8] | M. Z. Xu, X. H. Kong, Note on complete convergence and complete moment convergence for negatively dependent random variables under sub-linear expectations, AIMS Mathematics, 8 (2023), 8504–8521. https://doi.org/10.3934/math.2023428 |
[9] | L. X. Zhang, Donsker's invariance principle under the sub-linear expectation with an application to Chung's law of the iterated logarithm, Commun. Math. Stat., 3 (2015), 187–214. https://doi.org/10.1007/s40304-015-0055-0 |
[10] | J. P. Xu, L. X. Zhang, Three series theorem for independent random variables under sub-linear expectations with applications, Acta Math. Sin., English Ser., 35 (2019), 172–184. https://doi.org/10.1007/s10114-018-7508-9 |
[11] | J. P. Xu, L. X. Zhang, The law of logarithm for arrays of random variables under sub-linear expectations, Acta Math. Appl. Sin. Engl. Ser., 36 (2020), 670–688. https://doi.org/10.1007/s10255-020-0958-8 |
[12] | Q. Y. Wu, Y. Y. Jiang, Strong law of large numbers and Chover's law of the iterated logarithm under sub-linear expectations, J. Math. Anal. Appl., 460 (2018), 252–270. https://doi.org/10.1016/j.jmaa.2017.11.053 |
[13] | L. X. Zhang, J. H. Lin, Marcinkiewicz's strong law of large numbers for nonlinear expectations, Stat. Probab. Lett., 137 (2018), 269–276. https://doi.org/10.1016/j.spl.2018.01.022 |
[14] | H. Y. Zhong, Q. Y. Wu, Complete convergence and complete moment convergence for weighted sums of extended negatively dependent random variables under sub-linear expectation, J. Inequal. Appl., 2017 (2017), 261. https://doi.org/10.1186/s13660-017-1538-1 |
[15] | F. Hu, Z. J. Chen, D. F. Zhang, How big are the increments of G-Brownian motion, Sci. China Math., 57 (2014), 1687–1700. https://doi.org/10.1007/s11425-014-4816-0 |
[16] | F. Q. Gao, M. Z. Xu, Large deviations and moderate deviations for independent random variables under sublinear expectations, Sci. China Math., 41 (2011), 337–352. https://doi.org/10.1360/012009-879 |
[17] | A. Kuczmaszewska, Complete convergence for widely acceptable random variables under the sublinear expectations, J. Math. Anal. Appl., 484 (2020), 123662. https://doi.org/10.1016/j.jmaa.2019.123662 |
[18] | Z. J. Chen, Strong laws of large numbers for sub-linear expectations, Sci. China Math., 59 (2016), 945–954. https://doi.org/10.1007/s11425-015-5095-0 |
[19] | L. X. Zhang, On the laws of the iterated logarithm under sub-linear expectations, PUQR, 6 (2021), 409–460. https://doi.org/10.3934/puqr.2021020 |
[20] | X. C. Chen, Q. Y. Wu, Complete convergence and complete integral convergence of partial sums for moving average process under sub-linear expectations, AIMS Mathematics, 7 (2022), 9694–9715. https://doi.org/10.3934/math.2022540 |
[21] | M. Z. Xu, K. Cheng, Convergence for sums of iid random variables under sublinear expectations, J. Inequal. Appl., 2021 (2021), 157. https://doi.org/10.1186/s13660-021-02692-x |
[22] | M. Z. Xu, K. Cheng, How small are the increments of G-Brownian motion, Stat. Probab. Lett., 186 (2022), 109464. https://doi.org/10.1016/j.spl.2022.109464 |
[23] | M. Z. Xu, K. Cheng, W. K. Yu, Convergence of linear processes generated by negatively dependent random variables under sub-linear expectations, J. Inequal. Appl., 2023 (2023), 77. https://doi.org/10.1186/s13660-023-02990-6 |
[24] | M. Z. Xu, Complete convergence of moving average processes produced by negatively dependent random variables under sub-linear expectations, AIMS Mathematics, 8 (2023), 17067–17080. https://doi.org/10.3934/math.2023871 |
[25] | M. Z. Xu, Complete convergence and complete moment convergence for maximal weighted sums of extended negatively dependent random variables under sub-linear expectations, AIMS Mathematics, 8 (2023), 19442–19460. https://doi.org/10.3934/math.2023992 |
[26] | M. L. Guo, J. J. Dai, D. J. Zhu, Complete moment convergence of moving average processes under negative association assumptions, Math. Appl. (Wuhan), 25 (2012), 118–125. |
[27] | S. M. Hosseini, A. Nezakati, Complete moment convergence for the dependent linear processes with random coefficients, Acta Math. Sin., English Ser., 35 (2019), 1321–1333. https://doi.org/10.1007/s10114-019-8205-z |
[28] | Y. X. Li, L. X. Zhang, Complete moment convergence of moving-average processes under dependence assumptions, Stat. Probab. Lett., 70 (2004), 191–197. https://doi.org/10.1016/j.spl.2004.10.003 |
[29] | M. S. Hu, Explicit solutions of the G-heat equation for a class of initial conditions, Nonlinear Anal.: Theory, Methods Appl., 75 (2012), 6588–6595. https://doi.org/10.1016/j.na.2012.08.002 |