
Modeling and analysis of COVID-19 based on a time delay dynamic model

  • Received: 30 August 2020 Accepted: 27 October 2020 Published: 24 November 2020
  • Coronavirus disease 2019 (COVID-19) is caused by the novel coronavirus that emerged at the end of 2019. Because of its strong contagiousness, rapid spread and great harm, it has already had serious effects on countries around the world. So far there is no specific drug with clear efficacy. Scientifically grasping the laws governing the development of epidemics is therefore extremely important for their prevention and control. Since latently infected individuals are also highly contagious in this epidemic, traditional infectious disease models cannot accurately describe its transmission dynamics. Based on the traditional infectious disease model, an infectious disease model with a time delay is proposed, where the time delay characterizes the cycle of viral infection and the treatment time. Using the epidemic data released in real time, we first obtain the parameters with minimum error through numerical parameter inversion; we then simulate the development trend of the epidemic according to the dynamical system; finally, we compare and analyze the effectiveness of isolation measures. This article simulates COVID-19 and analyzes the development of the epidemic in Beijing and Wuhan. Comparison of the severity of the epidemic in the two regions shows that early detection and isolation remain the top priority of epidemic prevention and control.

    Citation: Cong Yang, Yali Yang, Zhiwei Li, Lisheng Zhang. Modeling and analysis of COVID-19 based on a time delay dynamic model[J]. Mathematical Biosciences and Engineering, 2021, 18(1): 154-165. doi: 10.3934/mbe.2021008



    Existing methods and algorithms in the literature often assume that the variables involved are independent, but this assumption is frequently implausible: in many stochastic models and statistical applications, the variables are dependent. Hence, it is important and meaningful to extend results for independent variables to dependent cases. One such dependence structure is weak dependence (i.e., $ {{\rho }^{*}} $-mixing or $ \tilde{\rho} $-mixing), which has attracted the attention of many researchers.

    Definition 1.1. Let $ \left\{ {{X}_{n}}; n\ge 1 \right\} $ be a sequence of random variables defined on a probability space $ \left(\Omega, \mathcal{F}, P \right) $. For any $ S\subset \text{N} = \left\{ 1, 2, \ldots \right\} $, define $ {{\mathcal{F}}_{S}} = \sigma \left({{X}_{i}}, i\in S \right) $. Let $ {{L}_{2}}\left({{\mathcal{F}}_{S}} \right) $ denote the class of all $ {{\mathcal{F}}_{S}} $-measurable random variables with finite second moments. For an integer $ s\ge 1 $, define the mixing coefficient

    $ \begin{equation} {{\rho }^{*}}\left( s \right) = \sup \left\{ \rho \left( {{\mathcal{F}}_{S}}, {{\mathcal{F}}_{T}} \right):S, T\subset \text{N}, \text{dist}\left( S, T \right)\ge s \right\}, \end{equation} $ (1.1)

    where

    $ \begin{equation} \rho \left( {{\mathcal{F}}_{S}}, {{\mathcal{F}}_{T}} \right) = \sup \left\{ \frac{\left| EXY-EXEY \right|}{\sqrt{\operatorname{Var}X}\cdot \sqrt{\operatorname{Var}Y}}:X\in {{L}_{2}}\left( {{\mathcal{F}}_{S}} \right), Y\in {{L}_{2}}\left( {{\mathcal{F}}_{T}} \right) \right\}. \end{equation} $ (1.2)

    Here $ \text{dist}\left(S, T \right)\ge s $ means that $ \text{dist}\left(S, T \right) = \inf \left\{ \left| i-j \right|:i\in S, j\in T \right\}\ge s $. Obviously, $ 0\le {{\rho }^{*}}\left(s+1 \right)\le {{\rho }^{*}}\left(s \right)\le 1 $ and $ {{\rho }^{*}}\left(0 \right) = 1 $. The sequence $ \left\{ {{X}_{n}}; n\ge 1 \right\} $ is called $ {{\rho }^{*}} $-mixing if there exists $ s\in \text{N} $ such that $ {{\rho }^{*}}\left(s \right) < 1 $. Clearly, if $ \left\{ {{X}_{n}}; n\ge 1 \right\} $ is a sequence of independent random variables, then $ {{\rho }^{*}}\left(s \right) = 0 $ for all $ s\ge 1 $.

    $ {{\rho }^{*}} $-mixing appears similar to another dependence structure, $ \rho $-mixing, but the two are in fact quite different. $ {{\rho }^{*}} $-mixing is also a wide class of dependence structures, which was first introduced into limit theorems by Bradley [4]. Since then, many scholars have investigated the limit theory for $ {{\rho }^{*}} $-mixing random variables, and a number of important applications of $ {{\rho }^{*}} $-mixing have been established. For more details, we refer to [12,16,18,19,21,23,24] among others.

    The concept of complete convergence was first given by Hsu and Robbins [9] as follows: a sequence of random variables $ \left\{ {{X}_{n}}; n\ge 1 \right\} $ converges completely to a constant $ \lambda $ if $ \sum\limits_{n = 1}^{\infty }{P\left(\left| {{X}_{n}}-\lambda \right| > \varepsilon \right)} < \infty $ for all $ \varepsilon > 0 $. By the Borel-Cantelli lemma, this implies that $ {{X}_{n}}\to \lambda $ almost surely (a.s.). Thus, complete convergence plays a crucial role in investigating the limit theory for sums of random variables as well as for weighted sums.
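    The Borel-Cantelli step just mentioned can be written out explicitly; the following is a standard one-line sketch:

```latex
\sum\limits_{n = 1}^{\infty } P\left( \left| X_n - \lambda \right| > \varepsilon \right) < \infty
\;\; \Longrightarrow \;\;
P\left( \left| X_n - \lambda \right| > \varepsilon \ \text{infinitely often} \right) = 0
\quad \text{for all } \varepsilon > 0,
```

    and taking a union over $ \varepsilon = 1/k $, $ k\ge 1 $, yields $ {{X}_{n}}\to \lambda $ a.s.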

    Chow [8] introduced the following notion of complete moment convergence: let $ \left\{ {{Z}_{n}}; n\ge 1 \right\} $ be a sequence of random variables, and let $ {{a}_{n}} > 0 $, $ {{b}_{n}} > 0 $, $ q > 0 $. If $ \sum\limits_{n = 1}^{\infty }{{{a}_{n}}E\left(b_{n}^{-1}\left| {{Z}_{n}} \right|-\varepsilon \right)_{+}^{q}} < \infty $ for all $ \varepsilon \ge 0 $, then the sequence $ \left\{ {{Z}_{n}}; n\ge 1 \right\} $ is said to satisfy complete $ q $-th moment convergence. It will be shown that complete moment convergence is a more general version of complete convergence, and is also much stronger than the latter (see Remark 2.1).
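    To indicate why complete $ q $-th moment convergence is stronger than complete convergence, here is a short sketch of the standard comparison: for any $ \varepsilon > 0 $, on the event $ \left\{ \left| {{Z}_{n}} \right| > 2\varepsilon {{b}_{n}} \right\} $ one has $ b_{n}^{-1}\left| {{Z}_{n}} \right|-\varepsilon > \varepsilon $, so

```latex
\left( b_n^{-1} \left| Z_n \right| - \varepsilon \right)_{+}^{q}
\ge \varepsilon^{q} \, I\left( \left| Z_n \right| > 2\varepsilon b_n \right),
\qquad \text{hence} \qquad
\sum\limits_{n = 1}^{\infty } a_n P\left( \left| Z_n \right| > 2\varepsilon b_n \right)
\le \varepsilon^{-q} \sum\limits_{n = 1}^{\infty } a_n
E\left( b_n^{-1} \left| Z_n \right| - \varepsilon \right)_{+}^{q} < \infty .
```

    That is, complete $ q $-th moment convergence of $ b_{n}^{-1}{{Z}_{n}} $ yields complete convergence of the same normalized sequence.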

    According to the related statements of Rosalsky and Thành [14] as well as those of Thành [17], we recall the definition of stochastic domination as follows.

    Definition 1.2. A sequence of random variables $ \left\{ {{X}_{n}}; n\ge 1 \right\} $ is said to be stochastically dominated by a random variable $ X $ if for all $ x\ge 0 $,

    $ \begin{equation*} {\mathop {\sup }\limits_{n \ge 1} }\, P\left( \left| {{X}_{n}} \right|\ge x \right)\le P\left( \left| X \right|\ge x \right). \end{equation*} $

    The concept of stochastic domination is a slight generalization of identical distribution. It is clear that stochastic domination of $ \left\{ {{X}_{n}}; n\ge 1 \right\} $ by the random variable $ X $ implies $ E{{\left| {{X}_{n}} \right|}^{p}}\le E{{\left| X \right|}^{p}} $ whenever the $ p $-th moment of $ \left| X \right| $ exists, i.e., $ E{{\left| X \right|}^{p}} < \infty $.

    As is well known, weighted sums of random variables are widely used in important linear statistics (such as least squares estimators, nonparametric regression function estimators and jackknife estimates). For this reason, many researchers in probability and statistics have devoted themselves to investigating the limiting behavior of weighted sums of random variables; see, for example, Bai and Cheng [3], Cai [5], Chen and Sung [6], Cheng et al. [7], Lang et al. [11], Peng et al. [13], Sung [15,16] and Wu [20], among others.

    Recently, Li et al. [12] extended the corresponding result of Chen and Sung [6] from negatively associated random variables to the $ {{\rho }^{*}} $-mixing case by a totally different method, and obtained the following theorem.

    Theorem A. Let $ \left\{ X, {{X}_{n}}; n\ge 1 \right\} $ be a sequence of identically distributed $ {{\rho }^{*}} $-mixing random variables with $ E{{X}_{n}} = 0 $, and let $ \left\{ {{a}_{ni}}; 1\le i\le n, n\ge 1 \right\} $ be an array of real constants such that $ \sum\limits_{i = 1}^{n}{{{\left| {{a}_{ni}} \right|}^{\alpha }}} = O\left(n \right) $ for some $ 1 < \alpha \le 2 $. Set $ {{b}_{n}} = {{n}^{1/\alpha }}{{\left(\log n \right)}^{1/\gamma }} $ for $ 0 < \gamma < \alpha $. If $ E{{{\left| X \right|}^{\alpha }}}/{{{\left(\log \left(1+\left| X \right| \right) \right)}^{\alpha /\gamma -1}}}\; < \infty $, then

    $ \begin{equation} \sum\limits_{n = 1}^{\infty }{\frac{1}{n}}P\left( {\mathop {\max }\limits_{1 \le j \le n} }\, \left| \sum\limits_{i = 1}^{j}{{{a}_{ni}}{{X}_{i}}} \right| > \varepsilon {{b}_{n}} \right) < \infty \quad \text{for all } \varepsilon > 0. \end{equation} $ (1.3)

    In addition, Huang et al.[10] proved the following complete $ \alpha $-th moment convergence theorem for weighted sums of $ {{\rho }^{*}} $-mixing random variables under some moment conditions.

    Theorem B. Let $ \left\{ {{X}_{n}}; n\ge 1 \right\} $ be a sequence of $ {{\rho }^{*}} $-mixing random variables, which is stochastically dominated by a random variable $ X $, let $ \left\{ {{a}_{ni}}; 1\le i\le n, n\ge 1 \right\} $ be an array of real constants such that $ \sum\limits_{i = 1}^{n}{{{\left| {{a}_{ni}} \right|}^{\alpha }}} = O\left(n \right) $ for some $ 0 < \alpha \le 2 $. Set $ {{b}_{n}} = {{n}^{1/\alpha }}{{\left(\log n \right)}^{1/\gamma }} $ for some $ \gamma > 0 $. Assume further that $ E{{X}_{n}} = 0 $ when $ 1 < \alpha \le 2 $. If

    $ \begin{equation} \begin{array}{ll} E{{|X|}^{\alpha }} < \infty, &\;{\rm{for}}\;\quad\alpha > \gamma, \\ E|X|^{\alpha}\log (1+|X|) < \infty, &\;{\rm{for}}\;\quad \alpha = \gamma, \\ E|X|^{\gamma} < \infty, &\;{\rm{for}}\;\quad \alpha < \gamma, \\ \end{array} \end{equation} $ (1.4)

    then

    $ \begin{equation} \sum\limits_{n = 1}^{\infty }{\frac{1}{n}}E\left( \frac{1}{{{b}_{n}}}{\mathop {\max }\limits_{1 \le j \le n} }\, \left| \sum\limits_{i = 1}^{j}{{{a}_{ni}}{{X}_{i}}} \right|-\varepsilon \right)_{+}^{\alpha } < \infty \quad \text{for all } \varepsilon > 0. \end{equation} $ (1.5)

    It is interesting to find the optimal moment conditions for (1.5). Huang et al. [10] also posed a problem worth pondering: does (1.5) hold for the case $ \alpha > \gamma $ under the almost optimal moment condition $ E{{{\left| X \right|}^{\alpha }}}/{{{\left(\log \left(1+\left| X \right| \right) \right)}^{\alpha /\gamma -1}}}\; < \infty $?

    Mainly inspired by the related results of Li et al. [12], Chen and Sung [6] and Huang et al. [10], we further study the convergence rate for weighted sums of $ {{\rho }^{*}} $-mixing random variables without the assumption of identical distribution. Under the almost optimal moment condition $ E{{{\left| X \right|}^{\alpha }}}/{{{\left(\log \left(1+\left| X \right| \right) \right)}^{\alpha /\gamma -1}}}\; < \infty $ for $ 0 < \gamma < \alpha $ with $ 1 < \alpha \le 2 $, a version of the complete $ \alpha $-th moment convergence theorem for weighted sums of $ {{\rho }^{*}} $-mixing random variables is established. The main result not only improves the corresponding results of Li et al. [12] and Chen and Sung [6], but also partially settles the open problem posed by Huang et al. [10].

    Now, we state the main result as follows. Some important auxiliary lemmas and the proof of the theorem will be detailed in the next section.

    Theorem 1.1. Let $ \left\{ {{X}_{n}}; n\ge 1 \right\} $ be a sequence of $ {{\rho }^{*}} $-mixing random variables with $ E{{X}_{n}} = 0 $, which is stochastically dominated by a random variable $ X $, let $ \left\{ {{a}_{ni}}; 1\le i\le n, n\ge 1 \right\} $ be an array of real constants such that $ \sum\limits_{i = 1}^{n}{{{\left| {{a}_{ni}} \right|}^{\alpha }}} = O\left(n \right) $ for some $ 0 < \alpha \le 2 $. Set $ {{b}_{n}} = {{n}^{1/\alpha }}{{\left(\log n \right)}^{1/\gamma }} $ for $ \gamma > 0 $. If $ E{{{\left| X \right|}^{\alpha }}}/{{{\left(\log \left(1+\left| X \right| \right) \right)}^{\alpha /\gamma -1}}}\; < \infty $ for $ \alpha > \gamma $ with $ 1 < \alpha \le 2 $, then (1.5) holds.

    Throughout this paper, let $ I\left(A \right) $ be the indicator function of the event $ A $ and $ I(A, B) = I(A\bigcap B) $. The symbol $ C $ always denotes a positive constant, which may differ from place to place, and $ {{a}_{n}} = O\left({{b}_{n}} \right) $ stands for $ {{a}_{n}}\le C{{b}_{n}} $.

    To prove our main result of this paper, we need the following important lemmas.

    Lemma 2.1. (Utev and Peligrad[18]) Let $ p\ge 2 $, $ \left\{ {{X}_{n}}; n\ge 1 \right\} $ be a sequence of $ {{\rho }^{*}} $-mixing random variables with $ E{{X}_{n}} = 0 $ and $ E{{\left| {{X}_{n}} \right|}^{p}} < \infty $ for all $ n\ge 1 $. Then there exists a positive constant $ C $ depending only on $ p $, $ s $ and $ {{\rho }^{*}}\left(s \right) $ such that

    $ \begin{equation} E\left( {\mathop {\max }\limits_{1 \le j \le n} }\, {{\left| \sum\limits_{i = 1}^{j}{{{X}_{i}}} \right|}^{p}} \right)\le C\left( \sum\limits_{i = 1}^{n}{E{{\left| {{X}_{i}} \right|}^{p}}}+{{\left( \sum\limits_{i = 1}^{n}{EX_{i}^{2}} \right)}^{p/2}} \right). \end{equation} $ (2.1)

    In particular, if $ p = 2 $,

    $ \begin{equation} E\left( {\mathop {\max }\limits_{1 \le j \le n} }\, {{\left| \sum\limits_{i = 1}^{j}{{{X}_{i}}} \right|}^{2}} \right)\le C\sum\limits_{i = 1}^{n}{EX_{i}^{2}}. \end{equation} $ (2.2)

    The following is a basic property of stochastic domination. For details, we refer the reader to Adler and Rosalsky [1], Adler et al. [2] or Wu [22]. In fact, the constant $ C $ appearing in those references can be removed, since it was proved in Reference [[14], Theorem 2.4] (or [[17], Corollary 2.3]) that the definition with a general constant is equivalent to that with $ C = 1 $.

    Lemma 2.2. Let $ \left\{ {{X}_{n}}, n\ge 1 \right\} $ be a sequence of random variables which is stochastically dominated by a random variable $ X $. For all $ \beta > 0 $ and $ b > 0 $, the following statements hold:

    $ \begin{equation} E{{\left| {{X}_{n}} \right|}^{\beta }}I\left( \left| {{X}_{n}} \right|\le b \right)\le \left( E{{\left| X \right|}^{\beta }}I\left( \left| X \right|\le b \right)+{{b}^{\beta }}P\left( \left| X \right| > b \right) \right), \end{equation} $ (2.3)
    $ \begin{equation} E{{\left| {{X}_{n}} \right|}^{\beta }}I\left( \left| {{X}_{n}} \right| > b \right)\le E{{\left| X \right|}^{\beta }}I\left( \left| X \right| > b \right). \end{equation} $ (2.4)

    Consequently, $ E{{\left| {{X}_{n}} \right|}^{\beta }}\le E{{\left| X \right|}^{\beta }} $.
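    The last assertion follows in one line from the tail-integral formula for moments together with the domination inequality of Definition 1.2 (a standard computation):

```latex
E\left| X_n \right|^{\beta }
= \int_{0}^{\infty } \beta x^{\beta -1} P\left( \left| X_n \right| > x \right) dx
\le \int_{0}^{\infty } \beta x^{\beta -1} P\left( \left| X \right| > x \right) dx
= E\left| X \right|^{\beta }.
```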

    Lemma 2.3. Under the conditions of Theorem 1.1, if $ E{{{\left| X \right|}^{\alpha }}}/{{{\left(\log \left(1+\left| X \right| \right) \right)}^{\alpha /\gamma -1}}}\; < \infty $ for $ 0 < \gamma < \alpha $ with $ 0 < \alpha \le 2 $, then

    $ \begin{equation} \sum\limits_{n = 1}^{\infty }{\frac{1}{n}\int_{1}^{\infty }{\sum\limits_{i = 1}^{n}{P\left( \left| {{a}_{ni}}{{X}_{i}} \right| > {{b}_{n}}{{t}^{1/\alpha }} \right)}dt}} < \infty. \end{equation} $ (2.5)

    Proof. By Definition 1.2, we note that

    $ \begin{eqnarray} \sum\limits_{n = 1}^{\infty }{\frac{1}{n}\int_{1}^{\infty }{\sum\limits_{i = 1}^{n}{P\left( \left| {{a}_{ni}}{{X}_{i}} \right| > {{b}_{n}}{{t}^{1/\alpha }} \right)}dt}}&\le& \sum\limits_{n = 1}^{\infty }{\frac{1}{n}\int_{1}^{\infty }{\sum\limits_{i = 1}^{n}{P\left( \left| {{a}_{ni}}X \right| > {{b}_{n}}{{t}^{1/\alpha }} \right)}dt}} \\ &\le& \sum\limits_{n = 1}^{\infty }{\frac{1}{n}\int_{0}^{\infty }{\sum\limits_{i = 1}^{n}{P\left( \frac{{{\left| {{a}_{ni}}X \right|}^{\alpha }}}{b_{n}^{\alpha }} > t \right)}dt}} \\ &\le& \sum\limits_{n = 1}^{\infty }{{{n}^{-1}}b_{n}^{-\alpha }\sum\limits_{i = 1}^{n}{E{{\left| {{a}_{ni}}X \right|}^{\alpha }}}}. \end{eqnarray} $ (2.6)

    It is easy to show that

    $ \begin{eqnarray} \sum\limits_{n = 1}^{\infty }{{{n}^{-1}}b_{n}^{-\alpha }\sum\limits_{i = 1}^{n}{{{\left| {{a}_{ni}} \right|}^{\alpha }}E{{\left| X \right|}^{\alpha }}I\left( \left| X \right|\le {{b}_{n}} \right)}}&\le& C\sum\limits_{n = 1}^{\infty }{b_{n}^{-\alpha }E{{\left| X \right|}^{\alpha }}I\left( \left| X \right|\le {{b}_{n}} \right)} \\ &\le& C\sum\limits_{n = 1}^{\infty }{b_{n}^{-\alpha }\sum\limits_{k = 1}^{n}{E{{\left| X \right|}^{\alpha }}I\left( {{b}_{k}} < \left| X \right|\le {{b}_{k+1}} \right)}} \\ &\le& C\sum\limits_{k = 1}^{\infty }{E{{\left| X \right|}^{\alpha }}I\left( {{b}_{k}} < \left| X \right|\le {{b}_{k+1}} \right){{\left( \log k \right)}^{1-\left( \alpha /\gamma \right)}}} \\ &\le& CE{{{\left| X \right|}^{\alpha }}}/{{{\left( \log \left( 1+\left| X \right| \right) \right)}^{\left( \alpha /\gamma \right)-1}}}\; < \infty, \end{eqnarray} $ (2.7)

    and

    $ \begin{eqnarray} \sum\limits_{n = 1}^{\infty }{{{n}^{-1}}b_{n}^{-\alpha }\sum\limits_{i = 1}^{n}{{{\left| {{a}_{ni}} \right|}^{\alpha }}E{{\left| X \right|}^{\alpha }}I\left( \left| X \right| > {{b}_{n}} \right)}}&\le& C\sum\limits_{n = 1}^{\infty }{b_{n}^{-\alpha }E{{\left| X \right|}^{\alpha }}I\left( \left| X \right| > {{b}_{n}} \right)} \\ & = &C\sum\limits_{n = 1}^{\infty }{b_{n}^{-\alpha }\sum\limits_{j = n}^{\infty }{E{{\left| X \right|}^{\alpha }}I\left( {{b}_{j}} < \left| X \right|\le {{b}_{j+1}} \right)}} \\ & = &C\sum\limits_{j = 1}^{\infty }{E{{\left| X \right|}^{\alpha }}I\left( {{b}_{j}} < \left| X \right|\le {{b}_{j+1}} \right)\sum\limits_{n = 1}^{j}{{{n}^{-1}}{{\left( \log n \right)}^{-\alpha /\gamma }}}} \\ &\le& C\sum\limits_{j = 1}^{\infty }{{{\left( \log j \right)}^{1-\left( \alpha /\gamma \right)}}E{{\left| X \right|}^{\alpha }}I\left( {{b}_{j}} < \left| X \right|\le {{b}_{j+1}} \right)} \\ &\le& CE{{{\left| X \right|}^{\alpha }}}/{{{\left( \log \left( 1+\left| X \right| \right) \right)}^{\left( \alpha /\gamma \right)-1}}}\; < \infty. \end{eqnarray} $ (2.8)

    Hence, (2.5) holds by (2.6)–(2.8).

    Proof of Theorem 1.1. For any given $ \varepsilon > 0 $, observing that

    $ \begin{eqnarray} \sum\limits_{n = 1}^{\infty }{\frac{1}{n}}E\left( \frac{1}{{{b}_{n}}}{\mathop {\max }\limits_{1 \le j \le n} }\, \left| \sum\limits_{i = 1}^{j}{{{a}_{ni}}{{X}_{i}}} \right|-\varepsilon \right)_{+}^{\alpha} & = & \sum\limits_{n = 1}^{\infty }{\frac{1}{n}\int_{0}^{\infty }{P\left( \frac{1}{{{b}_{n}}}{\mathop {\max }\limits_{1 \le j \le n} }\, \left| \sum\limits_{i = 1}^{j}{{{a}_{ni}}{{X}_{i}}} \right|-\varepsilon > {{t}^{1/\alpha}} \right)dt}} \\ & = & \sum\limits_{n = 1}^{\infty }{\frac{1}{n}\int_{0}^{1}{P\left( \frac{1}{{{b}_{n}}}{\mathop {\max }\limits_{1 \le j \le n} }\, \left| \sum\limits_{i = 1}^{j}{{{a}_{ni}}{{X}_{i}}} \right| > \varepsilon +{{t}^{1/\alpha}} \right)dt}} \\ &&+ \sum\limits_{n = 1}^{\infty }{\frac{1}{n}\int_{1}^{\infty }{P\left( \frac{1}{{{b}_{n}}}{\mathop {\max }\limits_{1 \le j \le n} }\, \left| \sum\limits_{i = 1}^{j}{{{a}_{ni}}{{X}_{i}}} \right| > \varepsilon +{{t}^{1/\alpha}} \right)dt}} \\ &\le& \sum\limits_{n = 1}^{\infty }{\frac{1}{n}P\left( {\mathop {\max }\limits_{1 \le j \le n} }\, \left| \sum\limits_{i = 1}^{j}{{{a}_{ni}}{{X}_{i}}} \right| > \varepsilon {{b}_{n}} \right)} \\ &&+ \sum\limits_{n = 1}^{\infty }{\frac{1}{n}\int_{1}^{\infty }{P\left( {\mathop {\max }\limits_{1 \le j \le n} }\, \left| \sum\limits_{i = 1}^{j}{{{a}_{ni}}{{X}_{i}}} \right| > {{b}_{n}}{{t}^{1/\alpha}} \right)dt}} \\ &\triangleq& I+J. \end{eqnarray} $ (2.9)

    By Theorem A of Li et al. [12] stated in the first section, we directly get $ I < \infty $. In order to prove (1.5), it suffices to show that $ J < \infty $.

    Without loss of generality, assume that $ {{a}_{ni}}\ge 0 $. For all $ t\ge 1 $ and $ 1\le i\le n $, $ n\in \text{N} $, define

    $ \begin{equation*} {{Y}_{i}} = {{a}_{ni}}{{X}_{i}}I\left( \left| {{a}_{ni}}{{X}_{i}} \right|\le {{b}_{n}}{{t}^{1/\alpha }} \right). \end{equation*} $

    It is easy to check that

    $ \begin{equation*} \left( {\mathop {\max }\limits_{1 \le j \le n} }\, \left| \sum\limits_{i = 1}^{j}{{{a}_{ni}}{{X}_{i}}} \right| > {{b}_{n}}{{t}^{1/\alpha }} \right)\subset \left( {\mathop {\max }\limits_{1 \le j \le n} }\, \left| \sum\limits_{i = 1}^{j}{{{Y}_{i}}} \right| > {{b}_{n}}{{t}^{1/\alpha }} \right)\bigcup \left( \bigcup\limits_{i = 1}^{n}{\left( \left| {{a}_{ni}}{{X}_{i}} \right| > {{b}_{n}}{{t}^{1/\alpha }} \right)} \right), \end{equation*} $

    which implies

    $ \begin{eqnarray} P\left( {\mathop {\max }\limits_{1 \le j \le n} }\, \left| \sum\limits_{i = 1}^{j}{{{a}_{ni}}{{X}_{i}}} \right| > {{b}_{n}}{{t}^{1/\alpha }} \right)&\le& P\left( {\mathop {\max }\limits_{1 \le j \le n} }\, \left| \sum\limits_{i = 1}^{j}{{{Y}_{i}}} \right| > {{b}_{n}}{{t}^{1/\alpha }} \right) \\ && +P\left( \bigcup\limits_{i = 1}^{n}{\left( \left| {{a}_{ni}}{{X}_{i}} \right| > {{b}_{n}}{{t}^{1/\alpha }} \right)} \right). \end{eqnarray} $ (2.10)

    To prove $ J < \infty $, we need only show that

    $ \begin{equation*} {{J}_{1}} = \sum\limits_{n = 1}^{\infty }{\frac{1}{n}\int_{1}^{\infty }{P\left( {\mathop {\max }\limits_{1 \le j \le n} }\, \left| \sum\limits_{i = 1}^{j}{{{Y}_{i}}} \right| > {{b}_{n}}{{t}^{1/\alpha }} \right)dt}} < \infty, \end{equation*} $
    $ \begin{equation*} {{J}_{2}} = \sum\limits_{n = 1}^{\infty }{\frac{1}{n}\int_{1}^{\infty }{P\left( \bigcup\limits_{i = 1}^{n}{\left( \left| {{a}_{ni}}{{X}_{i}} \right| > {{b}_{n}}{{t}^{1/\alpha }} \right)} \right)dt}} < \infty. \end{equation*} $

    Since

    $ P\left( \bigcup\limits_{i = 1}^{n}{\left( \left| {{a}_{ni}}{{X}_{i}} \right| > {{b}_{n}}{{t}^{1/\alpha }} \right)} \right)\le \sum\limits_{i = 1}^{n}{P\left( \left| {{a}_{ni}}{{X}_{i}} \right| > {{b}_{n}}{{t}^{1/\alpha }} \right)}, $

    it follows from Lemma 2.3 that

    $ \begin{equation*} {{J}_{2}}\le \sum\limits_{n = 1}^{\infty }{\frac{1}{n}\int_{1}^{\infty }{\sum\limits_{i = 1}^{n}{P\left( \left| {{a}_{ni}}{{X}_{i}} \right| > {{b}_{n}}{{t}^{1/\alpha }} \right)}dt}} < \infty. \end{equation*} $

    Next, we prove that

    $ \begin{equation} {\mathop {\sup }\limits_{t \ge 1} }\, \frac{1}{{{b}_{n}}{{t}^{1/\alpha }}}{\mathop {\max }\limits_{1 \le j \le n} }\, \left| \sum\limits_{i = 1}^{j}{E{{Y}_{i}}} \right|\to 0. \end{equation} $ (2.11)

    By $ E{{X}_{n}} = 0 $ and (2.4) of Lemma 2.2, it follows that

    $ \begin{array}{l} {\mathop {\sup }\limits_{t \ge 1} }\, \frac{1}{{{b}_{n}}{{t}^{1/\alpha }}}{\mathop {\max }\limits_{1 \le j \le n} }\, \left| \sum\limits_{i = 1}^{j}{E{{Y}_{i}}} \right| = {\mathop {\sup }\limits_{t \ge 1} }\, \frac{1}{{{b}_{n}}{{t}^{1/\alpha }}}{\mathop {\max }\limits_{1 \le j \le n} }\, \left| \sum\limits_{i = 1}^{j}{E{{a}_{ni}}{{X}_{i}}I\left( \left| {{a}_{ni}}{{X}_{i}} \right|\le {{b}_{n}}{{t}^{1/\alpha }} \right)} \right|\\ = {\mathop {\sup }\limits_{t \ge 1} }\, \frac{1}{{{b}_{n}}{{t}^{1/\alpha }}}{\mathop {\max }\limits_{1 \le j \le n} }\, \left| \sum\limits_{i = 1}^{j}{E{{a}_{ni}}{{X}_{i}}I\left( \left| {{a}_{ni}}{{X}_{i}} \right| > {{b}_{n}}{{t}^{1/\alpha }} \right)} \right|\\ \le C{\mathop {\sup }\limits_{t \ge 1} }\, \frac{1}{{{b}_{n}}{{t}^{1/\alpha }}}\sum\limits_{i = 1}^{n}{E\left| {{a}_{ni}}X \right|I\left( \left| {{a}_{ni}}X \right| > {{b}_{n}}{{t}^{1/\alpha }} \right)}. \end{array} $

    Observe that,

    $ \begin{eqnarray} E\left| {{a}_{ni}}X \right|I\left( \left| {{a}_{ni}}X \right| > {{b}_{n}}{{t}^{1/\alpha }} \right)& = &E\left| {{a}_{ni}}X \right|I\left( \left| {{a}_{ni}}X \right| > {{b}_{n}}{{t}^{1/\alpha }}, \left| X \right|\le {{b}_{n}} \right) \\ &&+E\left| {{a}_{ni}}X \right|I\left( \left| {{a}_{ni}}X \right| > {{b}_{n}}{{t}^{1/\alpha }}, \left| X \right| > {{b}_{n}} \right). \end{eqnarray} $ (2.12)

    For $ 0 < \gamma < \alpha $ and $ 1 < \alpha \le 2 $, it is clearly shown that

    $ \begin{align} & E\left| {{a}_{ni}}X \right|I\left( \left| {{a}_{ni}}X \right| > {{b}_{n}}{{t}^{1/\alpha }}, \left| X \right|\le {{b}_{n}} \right) \le {{C}}b_{n}^{1-\alpha }{{t}^{\left( 1/\alpha \right)-1}}{{\left| {{a}_{ni}} \right|}^{\alpha }}E{{\left| X \right|}^{\alpha }}I\left( \left| X \right|\le {{b}_{n}} \right) \\ & \le {{C}}b_{n}^{1-\alpha }{{t}^{\left( 1/\alpha \right)-1}}{{\left| {{a}_{ni}} \right|}^{\alpha }}E\left( \frac{{{\left| X \right|}^{\alpha }}}{{{\left( \log \left( 1+\left| X \right| \right) \right)}^{\alpha /\gamma -1}}}{{\left( \log \left( 1+\left| X \right| \right) \right)}^{\alpha /\gamma -1}} \right)I\left( \left| X \right|\le {{b}_{n}} \right) \\ & \le {{C}}{{t}^{\left( 1/\alpha \right)-1}}{{n}^{-1+\left( 1/\alpha \right)}}{{\left| {{a}_{ni}} \right|}^{\alpha }}{{(\log n)}^{\left( 1/\gamma \right)-1}}, \end{align} $ (2.13)

    and

    $ \begin{eqnarray} E\left| {{a}_{ni}}X \right|I\left( \left| {{a}_{ni}}X \right| > {{b}_{n}}{{t}^{1/\alpha }}, \left| X \right| > {{b}_{n}} \right)&\le& {{C}}\left| {{a}_{ni}} \right|E\left| X \right|I\left( \left| X \right| > {{b}_{n}} \right) \\ &\le& {{C}}b_{n}^{1-\alpha }{{\left( \log \left( 1+{{b}_{n}} \right) \right)}^{\left( \alpha /\gamma \right)-1}}\left| {{a}_{ni}} \right| \\ &\le& {{C}}{{n}^{-1+\left( 1/\alpha \right)}}{{(\log n)}^{-1+\left( 1/\gamma \right)}}\left| {{a}_{ni}} \right|. \end{eqnarray} $ (2.14)

    Thus,

    $ \begin{eqnarray} {\mathop {\sup }\limits_{t \ge 1} }\, \frac{1}{{{b}_{n}}{{t}^{1/\alpha }}}\sum\limits_{i = 1}^{n}{E\left| {{a}_{ni}}X \right|I\left( \left| {{a}_{ni}}X \right| > {{b}_{n}}{{t}^{1/\alpha }}, \left| X \right|\le {{b}_{n}} \right)}&\le& Cb_{n}^{-1}{{n}^{-1+\left( 1/\alpha \right)}}{{(\log n)}^{\left( 1/\gamma \right)-1}}\sum\limits_{i = 1}^{n}{{{\left| {{a}_{ni}} \right|}^{\alpha }}} \\ &\le& C{{(\log n)}^{-1}}\to 0, \end{eqnarray} $ (2.15)

    and

    $ \begin{eqnarray} {\mathop {\sup }\limits_{t \ge 1} }\, \frac{1}{{{b}_{n}}{{t}^{1/\alpha }}}\sum\limits_{i = 1}^{n}{E\left| {{a}_{ni}}X \right|I\left( \left| {{a}_{ni}}X \right| > {{b}_{n}}{{t}^{1/\alpha }}, \left| X \right| > {{b}_{n}} \right)}&\le& Cb_{n}^{-1}{{n}^{-1+\left( 1/\alpha \right)}}{{(\log n)}^{-1+\left( 1/\gamma \right)}}\sum\limits_{i = 1}^{n}{\left| {{a}_{ni}} \right|} \\ &\le& C{{(\log n)}^{-1}}\to 0. \end{eqnarray} $ (2.16)

    Then, (2.11) holds by (2.12)–(2.16).

    Hence, for $ n $ sufficiently large, we have that $ {\mathop {\max }\limits_{1 \le j \le n} }\, \left| \sum\limits_{i = 1}^{j}{E{{Y}_{i}}} \right|\le \frac{{{b}_{n}}{{t}^{1/\alpha }}}{2} $ holds uniformly for all $ t\ge 1 $. Therefore,

    $ \begin{equation} {{J}_{1}} = \sum\limits_{n = 1}^{\infty }{\frac{1}{n}\int_{1}^{\infty }{P\left( {\mathop {\max }\limits_{1 \le j \le n} }\, \left| \sum\limits_{i = 1}^{j}{\left( {{Y}_{i}}-E{{Y}_{i}} \right)} \right| > \frac{{{b}_{n}}{{t}^{1/\alpha }}}{2} \right)dt}}. \end{equation} $ (2.17)

    By Markov's inequality, (2.2) of Lemma 2.1 and (2.3) of Lemma 2.2, we get that

    $ \begin{eqnarray} {{J}_{1}}&\le& C\sum\limits_{n = 1}^{\infty }{\frac{1}{n}\int_{1}^{\infty }{\frac{1}{b_{n}^{2}{{t}^{2/\alpha }}}E\left( {\mathop {\max }\limits_{1 \le j \le n} }\, {{\left| \sum\limits_{i = 1}^{j}{\left( {{Y}_{i}}-E{{Y}_{i}} \right)} \right|}^{2}} \right)dt}} \\ &\le& C\sum\limits_{n = 1}^{\infty }{\frac{1}{n}\int_{1}^{\infty }{\frac{1}{b_{n}^{2}{{t}^{2/\alpha }}}\left( \sum\limits_{i = 1}^{n}{E{{\left| {{Y}_{i}}-E{{Y}_{i}} \right|}^{2}}} \right)dt}} \\ &\le& C\sum\limits_{n = 1}^{\infty }{\frac{1}{n}\int_{1}^{\infty }{\frac{1}{b_{n}^{2}{{t}^{2/\alpha }}}\left( \sum\limits_{i = 1}^{n}{E{{\left| {{a}_{ni}}{{X}_{i}} \right|}^{2}}I\left( \left| {{a}_{ni}}{{X}_{i}} \right|\le {{b}_{n}}{{t}^{1/\alpha }} \right)} \right)dt}} \\ &\le& C\sum\limits_{n = 1}^{\infty }{\frac{1}{n}\int_{1}^{\infty }{\frac{1}{b_{n}^{2}{{t}^{2/\alpha }}}\left( \sum\limits_{i = 1}^{n}{E{{\left| {{a}_{ni}}X \right|}^{2}}I\left( \left| {{a}_{ni}}X \right|\le {{b}_{n}}{{t}^{1/\alpha }} \right)} \right)dt}} \\ &&+C\sum\limits_{n = 1}^{\infty }{\frac{1}{n}\int_{1}^{\infty }{\sum\limits_{i = 1}^{n}{P\left( \left| {{a}_{ni}}X \right| > {{b}_{n}}{{t}^{1/\alpha }} \right)}dt}} \\ &\le& C\sum\limits_{n = 1}^{\infty }{\frac{1}{n}\int_{1}^{\infty }{\frac{1}{b_{n}^{2}{{t}^{2/\alpha }}}\left( \sum\limits_{i = 1}^{n}{E{{\left| {{a}_{ni}}X \right|}^{2}}I\left( \left| {{a}_{ni}}X \right|\le {{b}_{n}} \right)} \right)dt}} \\ &&+C\sum\limits_{n = 1}^{\infty }{\frac{1}{n}\int_{1}^{\infty }{\frac{1}{b_{n}^{2}{{t}^{2/\alpha }}}\left( \sum\limits_{i = 1}^{n}{E{{\left| {{a}_{ni}}X \right|}^{2}}I\left( {{b}_{n}} < \left| {{a}_{ni}}X \right|\le {{b}_{n}}{{t}^{1/\alpha }} \right)} \right)dt}} \\ &&+C\sum\limits_{n = 1}^{\infty }{\frac{1}{n}\int_{1}^{\infty }{\sum\limits_{i = 1}^{n}{P\left( \left| {{a}_{ni}}X \right| > {{b}_{n}}{{t}^{1/\alpha }} \right)}dt}} \\ & = &{{J}_{11}}+{{J}_{12}}+{{J}_{13}}. \end{eqnarray} $ (2.18)

    Based on formula (2.2) of Lemma 2.2 in Huang et al. [10], we get that

    $ \begin{eqnarray} {{J}_{11}}& = &\sum\limits_{n = 1}^{\infty }{\frac{1}{n}\int_{1}^{\infty }{\frac{1}{b_{n}^{2}{{t}^{2/\alpha }}}\left( \sum\limits_{i = 1}^{n}{E{{\left| {{a}_{ni}}X \right|}^{2}}I\left( \left| {{a}_{ni}}X \right|\le {{b}_{n}} \right)} \right)dt}} \\ &\le& \sum\limits_{n = 1}^{\infty }{\frac{1}{n}\frac{1}{b_{n}^{\alpha }}\left( \sum\limits_{i = 1}^{n}{E{{\left| {{a}_{ni}}X \right|}^{\alpha }}I\left( \left| {{a}_{ni}}X \right|\le {{b}_{n}} \right)} \right)} < \infty. \end{eqnarray} $ (2.19)

    Substituting $ t = {{x}^{\alpha }} $ and using (2.3) of Lemma 2.2, Markov's inequality and Lemma 2.3, we also get that

    $ \begin{eqnarray} {{J}_{12}}& = &\sum\limits_{n = 1}^{\infty }{\frac{1}{n}\int_{1}^{\infty }{\frac{1}{b_{n}^{2}{{t}^{2/\alpha }}}\left( \sum\limits_{i = 1}^{n}{E{{\left| {{a}_{ni}}X \right|}^{2}}I\left( {{b}_{n}} < \left| {{a}_{ni}}X \right|\le {{b}_{n}}{{t}^{1/\alpha }} \right)} \right)dt}} \\ &\le& C\sum\limits_{n = 1}^{\infty }{\frac{1}{nb_{n}^{2}}\int_{1}^{\infty }{{{x}^{\alpha -3}}\sum\limits_{i = 1}^{n}{E{{\left| {{a}_{ni}}X \right|}^{2}}I\left( {{b}_{n}} < \left| {{a}_{ni}}X \right|\le {{b}_{n}}x \right)}dx}} \\ &\le& C\sum\limits_{n = 1}^{\infty }{\frac{1}{nb_{n}^{2}}\sum\limits_{m = 1}^{\infty }{\int_{m}^{m+1}{{{x}^{\alpha -3}}\sum\limits_{i = 1}^{n}{E{{\left| {{a}_{ni}}X \right|}^{2}}I\left( {{b}_{n}} < \left| {{a}_{ni}}X \right|\le {{b}_{n}}x \right)}dx}}} \\ &\le& C\sum\limits_{n = 1}^{\infty }{\frac{1}{nb_{n}^{2}}\sum\limits_{m = 1}^{\infty }{{{m}^{\alpha -3}}\sum\limits_{i = 1}^{n}{E{{\left| {{a}_{ni}}X \right|}^{2}}I\left( {{b}_{n}} < \left| {{a}_{ni}}X \right|\le {{b}_{n}}\left( m+1 \right) \right)}}} \\ & = &C\sum\limits_{n = 1}^{\infty }{\frac{1}{nb_{n}^{2}}\sum\limits_{i = 1}^{n}{\sum\limits_{m = 1}^{\infty }{\sum\limits_{s = 1}^{m}{{{m}^{\alpha -3}}E{{\left| {{a}_{ni}}X \right|}^{2}}I\left( {{b}_{n}}s < \left| {{a}_{ni}}X \right|\le {{b}_{n}}\left( s+1 \right) \right)}}}} \\ & = &C\sum\limits_{n = 1}^{\infty }{\frac{1}{nb_{n}^{2}}\sum\limits_{i = 1}^{n}{\sum\limits_{s = 1}^{\infty }{E{{\left| {{a}_{ni}}X \right|}^{2}}I\left( {{b}_{n}}s < \left| {{a}_{ni}}X \right|\le {{b}_{n}}\left( s+1 \right) \right)\sum\limits_{m = s}^{\infty }{{{m}^{\alpha -3}}}}}} \\ &\le& C\sum\limits_{n = 1}^{\infty }{\frac{1}{nb_{n}^{2}}\sum\limits_{i = 1}^{n}{\sum\limits_{s = 1}^{\infty }{E{{\left| {{a}_{ni}}X \right|}^{2}}I\left( {{b}_{n}}s < \left| {{a}_{ni}}X \right|\le {{b}_{n}}\left( s+1 \right) \right){{s}^{\alpha -2}}}}} \\ &\le& C\sum\limits_{n = 1}^{\infty }{\frac{1}{nb_{n}^{\alpha }}\sum\limits_{i = 1}^{n}{E{{\left| {{a}_{ni}}X \right|}^{\alpha }}I\left( \left| {{a}_{ni}}X \right| > {{b}_{n}} \right)}} \\ &\le& CE{{{\left| X \right|}^{\alpha }}}/{{{\left( \log \left( 1+\left| X \right| \right) \right)}^{\alpha /\gamma -1}}}\; < \infty. \end{eqnarray} $ (2.20)
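Two elementary facts are used repeatedly in (2.20); for completeness we record them here (in the second bound we assume $ \alpha < 2 $, so that $ \alpha -3 < -1 $; the boundary case $ \alpha = 2 $ requires a separate treatment). First, the change of variables $ t = {{x}^{\alpha }} $, $ dt = \alpha {{x}^{\alpha -1}}dx $, gives for any nonnegative $ f $

$ \begin{equation} \int_{1}^{\infty }{\frac{1}{{{t}^{2/\alpha }}}f\left( {{t}^{1/\alpha }} \right)dt} = \alpha \int_{1}^{\infty }{{{x}^{\alpha -3}}f\left( x \right)dx}, \end{equation} $

which produces the factor $ {{x}^{\alpha -3}} $ in the second line. Second, comparing the tail sum with an integral,

$ \begin{equation} \sum\limits_{m = s}^{\infty }{{{m}^{\alpha -3}}}\le {{s}^{\alpha -3}}+\int_{s}^{\infty }{{{u}^{\alpha -3}}du} = {{s}^{\alpha -3}}+\frac{{{s}^{\alpha -2}}}{2-\alpha }\le C{{s}^{\alpha -2}}, \end{equation} $

which justifies the penultimate inequality.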

    By an argument analogous to the proof of Lemma 2.3, it is easy to show that

    $ \begin{equation} {{J}_{13}} = \sum\limits_{n = 1}^{\infty }{\frac{1}{n}\int_{1}^{\infty }{\sum\limits_{i = 1}^{n}{P\left( \left| {{a}_{ni}}X \right| > {{b}_{n}}{{t}^{1/\alpha }} \right)}dt}}\le CE{{{\left| X \right|}^{\alpha }}}/{{{\left( \log \left( 1+\left| X \right| \right) \right)}^{\alpha /\gamma -1}}}\; < \infty. \end{equation} $ (2.21)
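A sketch of the computation behind (2.21), using the same change of variables $ t = {{x}^{\alpha }} $ as above (this outlines the standard argument only; the full details follow the proof of Lemma 2.3):

$ \begin{equation} \int_{1}^{\infty }{P\left( \left| {{a}_{ni}}X \right| > {{b}_{n}}{{t}^{1/\alpha }} \right)dt} = \alpha \int_{1}^{\infty }{{{x}^{\alpha -1}}P\left( \left| {{a}_{ni}}X \right| > {{b}_{n}}x \right)dx}\le \frac{C}{b_{n}^{\alpha }}E{{\left| {{a}_{ni}}X \right|}^{\alpha }}I\left( \left| {{a}_{ni}}X \right| > {{b}_{n}} \right), \end{equation} $

after which summing over $ i $ and $ n $ and applying Lemma 2.3 yields the stated bound.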

    Hence the desired result $ {{J}_{1}} < \infty $ follows from the estimates above. The proof of Theorem 1.1 is complete.

    Remark 2.1. Under the conditions of Theorem 1.1, note that

    $ \begin{eqnarray} \infty & > & \sum\limits_{n = 1}^{\infty }{\frac{1}{n}}E\left( \frac{1}{{{b}_{n}}}{\mathop {\max }\limits_{1 \le j \le n} }\, \left| \sum\limits_{i = 1}^{j}{{{a}_{ni}}{{X}_{i}}} \right|-\varepsilon \right)_{+}^{\alpha} \\ & = & \sum\limits_{n = 1}^{\infty }{\frac{1}{n}}\int_{0}^{\infty }{P\left( \frac{1}{{{b}_{n}}}{\mathop {\max }\limits_{1 \le j \le n} }\, \left| \sum\limits_{i = 1}^{j}{{{a}_{ni}}{{X}_{i}}} \right|-\varepsilon > {{t}^{1/\alpha}} \right)d}t \\ &\ge& C\sum\limits_{n = 1}^{\infty }{\frac{1}{n}\int_{0}^{{{\varepsilon }^{\alpha }}}{P\left( \frac{1}{{{b}_{n}}}{\mathop {\max }\limits_{1 \le j \le n} }\, \left| \sum\limits_{i = 1}^{j}{{{a}_{ni}}{{X}_{i}}} \right| > \varepsilon +{{t}^{1/\alpha }} \right)}dt} \\ &\ge& C\sum\limits_{n = 1}^{\infty }{\frac{1}{n}}P\left( {\mathop {\max }\limits_{1 \le j \le n} }\, \left| \sum\limits_{i = 1}^{j}{{{a}_{ni}}{{X}_{i}}} \right| > 2\varepsilon{{b}_{n}} \right)\quad \text{for any } \varepsilon > 0. \end{eqnarray} $ (2.22)
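The last inequality in (2.22) holds because on the range of integration $ t\in \left( 0, {{\varepsilon }^{\alpha }} \right) $ we have $ {{t}^{1/\alpha }} < \varepsilon $, hence

$ \begin{equation} P\left( \frac{1}{{{b}_{n}}}{\mathop {\max }\limits_{1 \le j \le n} }\, \left| \sum\limits_{i = 1}^{j}{{{a}_{ni}}{{X}_{i}}} \right| > \varepsilon +{{t}^{1/\alpha }} \right)\ge P\left( {\mathop {\max }\limits_{1 \le j \le n} }\, \left| \sum\limits_{i = 1}^{j}{{{a}_{ni}}{{X}_{i}}} \right| > 2\varepsilon {{b}_{n}} \right), \end{equation} $

and the length $ {{\varepsilon }^{\alpha }} $ of the integration interval is absorbed into the constant $ C $ (which may depend on the fixed $ \varepsilon $).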

    Since $ \varepsilon > 0 $ is arbitrary, it follows from (2.22) that complete moment convergence is strictly stronger than complete convergence. Compared with the corresponding results of Li et al. [12] and Chen and Sung [6], it is worth pointing out that Theorem 1.1 of this paper extends and improves those results under the same moment condition. In addition, the main result partially settles the open problem posed by Huang et al. [10] for the case $ 0 < \gamma < \alpha $ with $ 1 < \alpha \le 2 $.

    In this work, we consider the problem of complete moment convergence for weighted sums of weakly dependent (or $ {{\rho }^{*}} $-mixing) random variables. The main results are presented as the main theorem and a remark, together with Lemma 2.3, which plays a vital role in the proof of the main theorem. The main theorem improves and generalizes the corresponding complete convergence results of Li et al. [12] and Chen and Sung [6].

    The authors are most grateful to the Editor and the anonymous referees for carefully reading the manuscript and offering valuable suggestions and comments, which greatly helped to improve this paper. This work was supported by the Doctor and Professor Natural Science Foundation of Guilin University of Aerospace Technology.

    All authors declare no conflicts of interest in this paper.


