Research article

Survival prediction model for right-censored data based on improved composite quantile regression neural network


  • With the development of the field of survival analysis, statistical inference for right-censored data is of great importance for medical diagnosis research. In this study, a survival prediction model for right-censored data based on an improved composite quantile regression neural network framework, called rcICQRNN, is proposed. It incorporates composite quantile regression into the loss function of a multi-hidden-layer feedforward neural network, combined with an inverse probability weighting method for survival prediction. Meanwhile, the hyperparameters of the neural network are tuned by the whale optimization algorithm (WOA), integer encoding and One-Hot encoding are implemented to encode the categorical features, and a BWOA variable selection method for high-dimensional data is proposed. The rcICQRNN algorithm was tested on a simulated dataset and two real breast cancer datasets, and the performance of the model was evaluated by three evaluation metrics. The results show that the rcICQRNN-5 model is more suitable for analyzing simulated datasets. The One-Hot encoding of the WOA-rcICQRNN-30 model is more applicable to the NKI70 data. The model results are optimal for k=15 after feature selection for the METABRIC dataset. Finally, we implemented the method for cross-dataset validation. On the whole, the C-index results using One-Hot encoded data are more stable, making the proposed rcICQRNN prediction model flexible enough to assist in medical decision making. It has practical applications in areas such as biomedicine, actuarial insurance and financial economics.

    Citation: Xiwen Qin, Dongmei Yin, Xiaogang Dong, Dongxue Chen, Shuang Zhang. Survival prediction model for right-censored data based on improved composite quantile regression neural network[J]. Mathematical Biosciences and Engineering, 2022, 19(8): 7521-7542. doi: 10.3934/mbe.2022354




    Let $\mathbb{F}_q$ be the finite field of $q$ elements with characteristic $p$, where $q=p^r$ and $p$ is a prime number. Let $\mathbb{F}_q^{*}=\mathbb{F}_q\setminus\{0\}$ and let $\mathbb{Z}^{+}$ denote the set of positive integers. Let $s\in\mathbb{Z}^{+}$ and $b\in\mathbb{F}_q$. Let $f(x_1,\ldots,x_s)$ be a diagonal polynomial over $\mathbb{F}_q$ of the following form

    $$f(x_1,\ldots,x_s)=a_1x_1^{m_1}+a_2x_2^{m_2}+\cdots+a_sx_s^{m_s},$$

    where $a_i\in\mathbb{F}_q^{*}$, $m_i\in\mathbb{Z}^{+}$, $i=1,\ldots,s$. Denote by $N_q(f=b)$ the number of $\mathbb{F}_q$-rational points on the affine hypersurface $f=b$, namely,

    $$N_q(f=b)=\#\{(x_1,\ldots,x_s)\in\mathbb{A}^s(\mathbb{F}_q)\mid f(x_1,\ldots,x_s)=b\}.$$

    In 1949, Hua and Vandiver [1] and Weil [2] independently obtained a formula for $N_q(f=b)$ in terms of character sums as follows:

    $$N_q(f=b)=q^{s-1}+\sum\psi_1(a_1^{-1})\cdots\psi_s(a_s^{-1})J_q^{0}(\psi_1,\ldots,\psi_s), \tag{1.1}$$

    where the sum is taken over all $s$-tuples of multiplicative characters $(\psi_1,\ldots,\psi_s)$ of $\mathbb{F}_q$ that satisfy $\psi_i^{m_i}=\varepsilon$, $\psi_i\neq\varepsilon$, $i=1,\ldots,s$, and $\psi_1\cdots\psi_s=\varepsilon$. Here $\varepsilon$ is the trivial multiplicative character of $\mathbb{F}_q$, and $J_q^{0}(\psi_1,\ldots,\psi_s)$ is the Jacobi sum over $\mathbb{F}_q$ defined by

    $$J_q^{0}(\psi_1,\ldots,\psi_s)=\sum_{\substack{c_1+\cdots+c_s=0\\ c_i\in\mathbb{F}_q}}\psi_1(c_1)\cdots\psi_s(c_s).$$

    Though explicit formulas for $N_q(f=b)$ are difficult to obtain in general, the problem has been studied extensively because of its theoretical importance as well as its applications in cryptology and coding theory; see [3,4,5,6,7,8,9]. In this paper, we use Jacobi sums, Gauss sums and results on quadratic forms to deduce a formula for the number of $\mathbb{F}_{q^2}$-rational points on a class of hypersurfaces over $\mathbb{F}_{q^2}$ under certain conditions. The main result of this paper can be stated as follows.

    Theorem 1.1. Let $q=2^r$ with $r\in\mathbb{Z}^{+}$ and let $\mathbb{F}_{q^2}$ be the finite field of $q^2$ elements. Let $f(X)=a_1x_1^{m_1}+a_2x_2^{m_2}+\cdots+a_sx_s^{m_s}$, $g(Y)=y_1y_2+y_3y_4+\cdots+y_{n-1}y_n+y_{n-2t-1}^2+\cdots+y_{n-3}^2+y_{n-1}^2+b_ty_{n-2t}^2+\cdots+b_1y_{n-2}^2+b_0y_n^2$, and $l(X,Y)=f(X)+g(Y)$, where $a_i,b_j\in\mathbb{F}_{q^2}^{*}$, $m_i\neq 1$, $(m_i,m_k)=1$ for $i\neq k$, $m_i\mid(q+1)$, $m_i\in\mathbb{Z}^{+}$, $2\mid n$, $n>2$, $0\le t\le\frac{n}{2}-2$, $\mathrm{Tr}_{\mathbb{F}_{q^2}/\mathbb{F}_2}(b_j)=1$ for $i,k=1,\ldots,s$ and $j=0,1,\ldots,t$. For $h\in\mathbb{F}_{q^2}$, we have:

    (1) If $h=0$, then

    $$N_{q^2}(l(X,Y)=0)=q^{2(s+n-1)}+\sum_{\gamma\in\mathbb{F}_{q^2}^{*}}\left(\prod_{i=1}^{s}\left(\left(\frac{\gamma}{a_i}\right)_{m_i}m_i-1\right)\right)\left(q^{s+2n-3}+(-1)^{t}q^{s+n-3}\right).$$

    (2) If $h\in\mathbb{F}_{q^2}^{*}$, then

    $$N_{q^2}(l(X,Y)=h)=q^{2(s+n-1)}+\left(q^{s+2n-3}+(-1)^{t+1}(q^2-1)q^{s+n-3}\right)\prod_{i=1}^{s}\left(\left(\frac{h}{a_i}\right)_{m_i}m_i-1\right)+\sum_{\gamma\in\mathbb{F}_{q^2}^{*}\setminus\{h\}}\left[\prod_{i=1}^{s}\left(\left(\frac{\gamma}{a_i}\right)_{m_i}m_i-1\right)\left(q^{2n+s-3}+(-1)^{t}q^{n+s-3}\right)\right].$$

    Here,

    $$\left(\frac{\gamma}{a_i}\right)_{m_i}=\begin{cases}1,&\text{if }\gamma/a_i\text{ is a residue of order }m_i,\\[2pt]0,&\text{otherwise.}\end{cases}$$

    To prove Theorem 1.1, we need the following lemmas, which concern Jacobi sums, Gauss sums and quadratic forms.

    Definition 2.1. Let $\chi$ be an additive character and $\psi$ a multiplicative character of $\mathbb{F}_q$. The Gauss sum $G_q(\psi,\chi)$ in $\mathbb{F}_q$ is defined by

    $$G_q(\psi,\chi)=\sum_{x\in\mathbb{F}_q^{*}}\psi(x)\chi(x).$$

    In particular, if $\chi$ is the canonical additive character, i.e., $\chi(x)=e^{2\pi i\,\mathrm{Tr}_{\mathbb{F}_q/\mathbb{F}_p}(x)/p}$, where $\mathrm{Tr}_{\mathbb{F}_q/\mathbb{F}_p}(y)=y+y^{p}+\cdots+y^{p^{r-1}}$ is the absolute trace of $y$ from $\mathbb{F}_q$ to $\mathbb{F}_p$, we simply write $G_q(\psi):=G_q(\psi,\chi)$.
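
    For illustration, the absolute trace and the canonical additive character can be computed directly. The following minimal Python sketch assumes characteristic $p=2$ and represents the elements of $\mathbb{F}_{2^r}$ as bit-polynomials reduced modulo a chosen irreducible polynomial (here $x^4+x+1$, so $q=16$); it checks that $\chi(x)=(-1)^{\mathrm{Tr}_{\mathbb{F}_q/\mathbb{F}_2}(x)}$ is additive.

```python
# Minimal sketch (assumption: F_{2^r} elements are integers whose bits are the
# coefficients of a polynomial, reduced modulo the irreducible polynomial MOD).
R = 4                      # extension degree r, so q = 2^r = 16
MOD = 0b10011              # x^4 + x + 1, irreducible over F_2

def gf_mul(a, b):
    """Multiplication in F_{2^r}: carry-less multiply, then reduce modulo MOD."""
    res = 0
    while b:
        if b & 1:
            res ^= a
        b >>= 1
        a <<= 1
        if (a >> R) & 1:
            a ^= MOD
    return res

def trace(x):
    """Absolute trace Tr(x) = x + x^2 + ... + x^(2^(r-1)), an element of F_2."""
    t, y = 0, x
    for _ in range(R):
        t ^= y
        y = gf_mul(y, y)   # Frobenius y -> y^2
    return t

def chi(x):
    """Canonical additive character for p = 2: chi(x) = (-1)^Tr(x)."""
    return -1 if trace(x) else 1

# chi is additive: chi(x + y) = chi(x) * chi(y)  (field addition is XOR here)
assert all(chi(x ^ y) == chi(x) * chi(y) for x in range(16) for y in range(16))
```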

    Let $\psi$ be a multiplicative character of $\mathbb{F}_q$, which is defined for all nonzero elements of $\mathbb{F}_q$. We extend the definition of $\psi$ by setting $\psi(0)=0$ if $\psi\neq\varepsilon$ and $\varepsilon(0)=1$.

    Definition 2.2. Let $\psi_1,\ldots,\psi_s$ be $s$ multiplicative characters of $\mathbb{F}_q$. Then the Jacobi sum $J_q(\psi_1,\ldots,\psi_s)$ over $\mathbb{F}_q$ is defined by

    $$J_q(\psi_1,\ldots,\psi_s)=\sum_{\substack{c_1+\cdots+c_s=1\\ c_i\in\mathbb{F}_q}}\psi_1(c_1)\cdots\psi_s(c_s).$$

    The Jacobi sums $J_q(\psi_1,\ldots,\psi_s)$, as well as the sums $J_q^{0}(\psi_1,\ldots,\psi_s)$, can be evaluated easily in case some of the multiplicative characters $\psi_i$ are trivial.

    Lemma 2.3. ([10, Theorem 5.19, p. 206]) If all the multiplicative characters $\psi_1,\ldots,\psi_s$ of $\mathbb{F}_q$ are trivial, then

    $$J_q(\psi_1,\ldots,\psi_s)=J_q^{0}(\psi_1,\ldots,\psi_s)=q^{s-1}.$$

    If some, but not all, of the $\psi_i$ are trivial, then

    $$J_q(\psi_1,\ldots,\psi_s)=J_q^{0}(\psi_1,\ldots,\psi_s)=0.$$

    Lemma 2.4. ([10, Theorem 5.20, p. 206]) If $\psi_1,\ldots,\psi_s$ are multiplicative characters of $\mathbb{F}_q$ with $\psi_s$ nontrivial, then

    $$J_q^{0}(\psi_1,\ldots,\psi_s)=0$$

    if $\psi_1\cdots\psi_s$ is nontrivial, and

    $$J_q^{0}(\psi_1,\ldots,\psi_s)=\psi_s(-1)(q-1)J_q(\psi_1,\ldots,\psi_{s-1})$$

    if $\psi_1\cdots\psi_s$ is trivial.

    If all $\psi_i$ are nontrivial, there exists an important connection between Jacobi sums and Gauss sums.

    Lemma 2.5. ([10, Theorem 5.21, p. 207]) If $\psi_1,\ldots,\psi_s$ are nontrivial multiplicative characters of $\mathbb{F}_q$ and $\chi$ is a nontrivial additive character of $\mathbb{F}_q$, then

    $$J_q(\psi_1,\ldots,\psi_s)=\frac{G_q(\psi_1,\chi)\cdots G_q(\psi_s,\chi)}{G_q(\psi_1\cdots\psi_s,\chi)}$$

    if $\psi_1\cdots\psi_s$ is nontrivial, and

    $$J_q(\psi_1,\ldots,\psi_s)=-\psi_s(-1)J_q(\psi_1,\ldots,\psi_{s-1})=-\frac{1}{q}G_q(\psi_1,\chi)\cdots G_q(\psi_s,\chi)$$

    if $\psi_1\cdots\psi_s$ is trivial.
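
    The first relation in Lemma 2.5 is easy to confirm numerically over a prime field, where multiplicative characters can be written down through a primitive root. The following sketch assumes $q=p=13$ with primitive root $2$ and two illustrative characters of orders $4$ and $3$.

```python
# Sketch over the prime field F_13 (assumptions: p = 13, primitive root g = 2,
# character exponents 3 and 4 chosen so that psi_1 psi_2 is nontrivial).
import cmath
from itertools import product

p, g = 13, 2
log = {pow(g, k, p): k for k in range(p - 1)}       # discrete logarithm table

def psi(j):
    """Multiplicative character psi_j(g^k) = exp(2*pi*i*j*k/(p-1)), with psi_j(0) = 0."""
    return lambda x: 0 if x % p == 0 else cmath.exp(2j * cmath.pi * j * log[x % p] / (p - 1))

def chi(x):
    """Canonical additive character of F_p."""
    return cmath.exp(2j * cmath.pi * (x % p) / p)

def gauss(ps):
    return sum(ps(x) * chi(x) for x in range(1, p))

def jacobi(chars):
    """J(psi_1,...,psi_s): sum over c_1 + ... + c_s = 1 of psi_1(c_1)...psi_s(c_s)."""
    total = 0
    for cs in product(range(p), repeat=len(chars) - 1):
        last = (1 - sum(cs)) % p
        term = chars[-1](last)
        for ch, c in zip(chars, cs):
            term *= ch(c)
        total += term
    return total

psi1, psi2, psi12 = psi(3), psi(4), psi(7)          # psi12 = psi1 * psi2, still nontrivial
lhs = jacobi([psi1, psi2])
rhs = gauss(psi1) * gauss(psi2) / gauss(psi12)
assert abs(lhs - rhs) < 1e-8                        # Lemma 2.5, first case
```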

    We turn to another special formula for Gauss sums which applies to a wider range of multiplicative characters but needs a restriction on the underlying field.

    Lemma 2.6. ([10, Theorem 5.16, p. 202]) Let $q$ be a prime power and let $\psi$ be a nontrivial multiplicative character of $\mathbb{F}_{q^2}$ of order $m$ dividing $q+1$. Then

    $$G_{q^2}(\psi)=\begin{cases}q,&\text{if }m\text{ is odd or }\frac{q+1}{m}\text{ is even},\\[2pt]-q,&\text{if }m\text{ is even and }\frac{q+1}{m}\text{ is odd}.\end{cases}$$
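
    As a quick numerical illustration of Lemma 2.6, consider the smallest case of the kind used later: $\mathbb{F}_{q^2}=\mathbb{F}_{16}$ with $q=4$ and a character of order $m=5$ dividing $q+1$; since $m$ is odd, the Gauss sum should equal $q=4$. The sketch below assumes the representation $\mathbb{F}_{16}=\mathbb{F}_2[x]/(x^4+x+1)$, whose class of $x$ is a primitive element.

```python
# Sketch for F_{q^2} = F_16, q = 4 (assumptions: representation F_2[x]/(x^4 + x + 1),
# whose class alpha of x is a primitive element; character of order m = 5).
import cmath

R, MOD, Q2, q = 4, 0b10011, 16, 4

def gf_mul(a, b):
    res = 0
    while b:
        if b & 1:
            res ^= a
        b >>= 1
        a <<= 1
        if (a >> R) & 1:
            a ^= MOD
    return res

def trace(x):                              # absolute trace from F_16 down to F_2
    t, y = 0, x
    for _ in range(R):
        t ^= y
        y = gf_mul(y, y)
    return t

log, elt = {}, 1                           # discrete logs with respect to alpha = x
for k in range(Q2 - 1):
    log[elt] = k
    elt = gf_mul(elt, 0b10)

m = 5
psi = lambda x: cmath.exp(2j * cmath.pi * (log[x] % m) / m)     # order-5 character
G = sum(psi(x) * (-1) ** trace(x) for x in range(1, Q2))        # Gauss sum, canonical chi
assert abs(G - q) < 1e-9                   # m odd, so Lemma 2.6 predicts G = q
```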

    For $h\in\mathbb{F}_{q^2}$, define $v(h)=-1$ if $h\in\mathbb{F}_{q^2}^{*}$ and $v(0)=q^2-1$. Properties of the function $v(h)$ will be used in the later proofs.

    Lemma 2.7. ([10, Lemma 6.23, p. 281]) For any finite field $\mathbb{F}_q$, we have

    $$\sum_{c\in\mathbb{F}_q}v(c)=0,$$

    and for any $b\in\mathbb{F}_q$,

    $$\sum_{c_1+\cdots+c_m=b}v(c_1)\cdots v(c_k)=\begin{cases}0,&1\le k<m,\\[2pt]v(b)q^{m-1},&k=m,\end{cases}$$

    where the sum is over all $c_1,\ldots,c_m\in\mathbb{F}_q$ with $c_1+\cdots+c_m=b$.
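
    Both identities of Lemma 2.7 can be verified by brute force. The sketch below does so over $\mathbb{F}_8$, using the fact that in characteristic $2$ field addition is bitwise XOR of the coordinate representation; the choices $m=3$ and $b=5$ are arbitrary.

```python
# Sketch over F_8 (assumption: characteristic 2, so field addition is bitwise XOR;
# the parameters m = 3 and b = 5 are arbitrary).
from itertools import product

q = 8
v = lambda c: q - 1 if c == 0 else -1

assert sum(v(c) for c in range(q)) == 0              # first identity of Lemma 2.7

m, b = 3, 5
for k in (1, 2, 3):
    total = 0
    for cs in product(range(q), repeat=m - 1):       # c_1,...,c_{m-1} free
        last = b
        for c in cs:
            last ^= c                                # c_m is forced by c_1+...+c_m = b
        factors = (list(cs) + [last])[:k]
        prod_v = 1
        for c in factors:
            prod_v *= v(c)
        total += prod_v
    assert total == (0 if k < m else v(b) * q ** (m - 1))   # second identity
```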

    Quadratic forms have been studied intensively. For any finite field $\mathbb{F}_q$, two quadratic forms $f$ and $g$ over $\mathbb{F}_q$ are called equivalent if $f$ can be transformed into $g$ by means of a nonsingular linear substitution of indeterminates. A quadratic form $f$ in $n$ indeterminates is called nondegenerate if $f$ is not equivalent to a quadratic form in fewer than $n$ indeterminates.

    Lemma 2.8. ([10, Theorem 6.30, p. 287]) Let $f\in\mathbb{F}_q[x_1,\ldots,x_n]$, $q$ even, be a nondegenerate quadratic form. If $n$ is even, then $f$ is either equivalent to

    $$x_1x_2+x_3x_4+\cdots+x_{n-1}x_n$$

    or to a quadratic form of the type

    $$x_1x_2+x_3x_4+\cdots+x_{n-1}x_n+x_{n-1}^2+ax_n^2,$$

    where $a\in\mathbb{F}_q$ satisfies $\mathrm{Tr}_{\mathbb{F}_q/\mathbb{F}_p}(a)=1$.

    Lemma 2.9. ([10, Corollary 3.79, p. 127]) Let $a\in\mathbb{F}_q$ and let $p$ be the characteristic of $\mathbb{F}_q$. Then the trinomial $x^p-x-a$ is irreducible in $\mathbb{F}_q[x]$ if and only if $\mathrm{Tr}_{\mathbb{F}_q/\mathbb{F}_p}(a)\neq 0$.

    Lemma 2.10. ([10, Lemma 6.31, p. 288]) For even $q$, let $a\in\mathbb{F}_q$ with $\mathrm{Tr}_{\mathbb{F}_q/\mathbb{F}_p}(a)=1$ and let $b\in\mathbb{F}_q$. Then

    $$N_q(x_1^2+x_1x_2+ax_2^2=b)=q-v(b).$$

    Lemma 2.11. ([10, Theorem 6.32, p. 288]) Let $\mathbb{F}_q$ be a finite field with $q$ even and let $b\in\mathbb{F}_q$. Then for even $n$, the number of solutions of the equation

    $$x_1x_2+x_3x_4+\cdots+x_{n-1}x_n=b$$

    in $\mathbb{F}_q^n$ is $q^{n-1}+v(b)q^{(n-2)/2}$. For even $n$ and $a\in\mathbb{F}_q$ with $\mathrm{Tr}_{\mathbb{F}_q/\mathbb{F}_p}(a)=1$, the number of solutions of the equation

    $$x_1x_2+x_3x_4+\cdots+x_{n-1}x_n+x_{n-1}^2+ax_n^2=b$$

    in $\mathbb{F}_q^n$ is $q^{n-1}-v(b)q^{(n-2)/2}$.
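
    The two counts in Lemma 2.11 can be checked exhaustively over a small field. The sketch below assumes $n=4$, $\mathbb{F}_4=\mathbb{F}_2[x]/(x^2+x+1)$ and $a=\alpha$ (the class of $x$), which satisfies $\mathrm{Tr}_{\mathbb{F}_4/\mathbb{F}_2}(\alpha)=1$.

```python
# Sketch over F_4 = F_2[x]/(x^2 + x + 1) with n = 4 (assumption: a = alpha = 0b10,
# which has Tr_{F_4/F_2}(alpha) = 1).
from itertools import product

R, MOD, q, n = 2, 0b111, 4, 4

def gf_mul(a, b):
    res = 0
    while b:
        if b & 1:
            res ^= a
        b >>= 1
        a <<= 1
        if (a >> R) & 1:
            a ^= MOD
    return res

v = lambda c: q - 1 if c == 0 else -1
a = 0b10                                   # alpha, with absolute trace 1

for b in range(q):
    hyper = sum(1 for x in product(range(q), repeat=n)
                if gf_mul(x[0], x[1]) ^ gf_mul(x[2], x[3]) == b)
    ellip = sum(1 for x in product(range(q), repeat=n)
                if gf_mul(x[0], x[1]) ^ gf_mul(x[2], x[3])
                   ^ gf_mul(x[2], x[2]) ^ gf_mul(a, gf_mul(x[3], x[3])) == b)
    assert hyper == q ** (n - 1) + v(b) * q ** ((n - 2) // 2)
    assert ellip == q ** (n - 1) - v(b) * q ** ((n - 2) // 2)
```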

    Lemma 2.12. Let $q=2^r$ and $h\in\mathbb{F}_{q^2}$. Let $g(Y)\in\mathbb{F}_{q^2}[y_1,y_2,\ldots,y_n]$ be a polynomial of the form

    $$g(Y)=y_1y_2+y_3y_4+\cdots+y_{n-1}y_n+y_{n-2t-1}^2+\cdots+y_{n-3}^2+y_{n-1}^2+b_ty_{n-2t}^2+\cdots+b_1y_{n-2}^2+b_0y_n^2,$$

    where $b_j\in\mathbb{F}_{q^2}^{*}$, $2\mid n$, $n>2$, $0\le t\le\frac{n}{2}-2$, $\mathrm{Tr}_{\mathbb{F}_{q^2}/\mathbb{F}_2}(b_j)=1$, $j=0,1,\ldots,t$. Then

    $$N_{q^2}(g(Y)=h)=q^{2(n-1)}+(-1)^{t+1}q^{n-2}v(h). \tag{2.1}$$

    Proof. We provide two proofs here. The first proof is as follows. Write $q_1=q^2$. Then, by Lemmas 2.7, 2.10 and 2.11, the number of solutions of $g(Y)=h$ in $\mathbb{F}_{q^2}^{n}$ can be deduced as

    $$\begin{aligned}
    N_{q^2}(g(Y)=h)&=\sum_{c_1+c_2+\cdots+c_{t+2}=h}N_{q^2}(y_1y_2+y_3y_4+\cdots+y_{n-2t-3}y_{n-2t-2}=c_1)\,N_{q^2}(y_{n-2t-1}y_{n-2t}+y_{n-2t-1}^2+b_ty_{n-2t}^2=c_2)\cdots N_{q^2}(y_{n-1}y_n+y_{n-1}^2+b_0y_n^2=c_{t+2})\\
    &=\sum_{c_1+c_2+\cdots+c_{t+2}=h}\left(q_1^{n-2t-3}+v(c_1)q_1^{(n-2t-4)/2}\right)\left(q_1-v(c_2)\right)\cdots\left(q_1-v(c_{t+2})\right)\\
    &=\sum_{c_1+c_2+\cdots+c_{t+2}=h}\left(q_1^{n-2t-2}+v(c_1)q_1^{(n-2t-2)/2}-v(c_2)q_1^{n-2t-3}-v(c_1)v(c_2)q_1^{(n-2t-4)/2}\right)\left(q_1-v(c_3)\right)\cdots\left(q_1-v(c_{t+2})\right)\\
    &=\sum_{c_1+c_2+\cdots+c_{t+2}=h}\left(q_1^{n-t-2}+v(c_1)q_1^{(n-2)/2}-v(c_2)q_1^{n-t-3}+\cdots+(-1)^{t+1}v(c_1)v(c_2)\cdots v(c_{t+2})q_1^{(n-2t-4)/2}\right)\\
    &=q_1^{n-1}+q_1^{(n-2)/2}\sum_{c_1\in\mathbb{F}_{q^2}}v(c_1)+\cdots+(-1)^{t+1}q_1^{(n-2t-4)/2}\sum_{c_1+c_2+\cdots+c_{t+2}=h}v(c_1)v(c_2)\cdots v(c_{t+2}).
    \end{aligned} \tag{2.2}$$

    By Lemma 2.7 and (2.2), we have

    $$N_{q^2}(g(Y)=h)=q_1^{n-1}+(-1)^{t+1}v(h)q_1^{(n-2)/2}=q^{2(n-1)}+(-1)^{t+1}v(h)q^{n-2}.$$

    Next we give the second proof. Note that if $f$ and $g$ are equivalent, then for any $b\in\mathbb{F}_{q^2}$ the equations $f(x_1,\ldots,x_n)=b$ and $g(x_1,\ldots,x_n)=b$ have the same number of solutions in $\mathbb{F}_{q^2}^{n}$. So we can get the number of solutions of $g(Y)=h$ for $h\in\mathbb{F}_{q^2}$ by means of a nonsingular linear substitution of indeterminates.

    Let $k(X)\in\mathbb{F}_{q^2}[x_1,x_2,x_3,x_4]$ with $k(X)=x_1x_2+x_1^2+Ax_2^2+x_3x_4+x_3^2+Bx_4^2$, where $\mathrm{Tr}_{\mathbb{F}_{q^2}/\mathbb{F}_2}(A)=\mathrm{Tr}_{\mathbb{F}_{q^2}/\mathbb{F}_2}(B)=1$. We first show that $k(X)$ is equivalent to $x_1x_2+x_3x_4$.

    Let $x_3=y_1+y_3$ and $x_i=y_i$ for $i\neq 3$; then $k(X)$ is equivalent to $y_1y_2+y_1y_4+y_3y_4+Ay_2^2+y_3^2+By_4^2$.

    Let $y_2=z_2+z_4$ and $y_i=z_i$ for $i\neq 2$; then $k(X)$ is equivalent to $z_1z_2+z_3z_4+Az_2^2+z_3^2+Az_4^2+Bz_4^2$.

    Let $z_1=\alpha_1+A\alpha_2$ and $z_i=\alpha_i$ for $i\neq 1$; then $k(X)$ is equivalent to $\alpha_1\alpha_2+\alpha_3^2+\alpha_3\alpha_4+(A+B)\alpha_4^2$.

    Since $\mathrm{Tr}_{\mathbb{F}_{q^2}/\mathbb{F}_2}(A+B)=0$, the form $\alpha_3^2+\alpha_3\alpha_4+(A+B)\alpha_4^2$ is reducible by Lemma 2.9, so $k(X)$ is equivalent to $x_1x_2+x_3x_4$. Now $g(Y)$ consists of the hyperbolic part $y_1y_2+\cdots+y_{n-2t-3}y_{n-2t-2}$ together with $t+1$ blocks of the form $y_ky_{k+1}+y_k^2+b_jy_{k+1}^2$, and any two such blocks can be paired and replaced by $x_1x_2+x_3x_4$ as above. It follows that if $t$ is odd, then $g(Y)$ is equivalent to $x_1x_2+x_3x_4+\cdots+x_{n-1}x_n$, and if $t$ is even, then $g(Y)$ is equivalent to $x_1x_2+x_3x_4+\cdots+x_{n-1}x_n+x_{n-1}^2+ax_n^2$ with $\mathrm{Tr}_{\mathbb{F}_{q^2}/\mathbb{F}_2}(a)=1$. By Lemma 2.11, we get the desired result.
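
    Lemma 2.12 can likewise be verified exhaustively in the smallest admissible case $q=4$ (so $q^2=16$), $n=4$, $t=0$, where $g(Y)=y_1y_2+y_3y_4+y_3^2+b_0y_4^2$. The sketch below assumes the representation $\mathbb{F}_{16}=\mathbb{F}_2[x]/(x^4+x+1)$ and picks an arbitrary $b_0$ with absolute trace $1$.

```python
# Sketch for q = 4, q^2 = 16, n = 4, t = 0 (assumptions: F_16 = F_2[x]/(x^4 + x + 1),
# and b_0 is any element with absolute trace 1).
from itertools import product

R, MOD, Q2, q, n, t = 4, 0b10011, 16, 4, 4, 0

def gf_mul(a, b):
    res = 0
    while b:
        if b & 1:
            res ^= a
        b >>= 1
        a <<= 1
        if (a >> R) & 1:
            a ^= MOD
    return res

def trace(x):
    s, y = 0, x
    for _ in range(R):
        s ^= y
        y = gf_mul(y, y)
    return s

v = lambda h: Q2 - 1 if h == 0 else -1
b0 = next(x for x in range(Q2) if trace(x) == 1)

def g(y):
    # g(Y) = y1 y2 + y3 y4 + y3^2 + b0 y4^2
    return (gf_mul(y[0], y[1]) ^ gf_mul(y[2], y[3])
            ^ gf_mul(y[2], y[2]) ^ gf_mul(b0, gf_mul(y[3], y[3])))

counts = [0] * Q2
for y in product(range(Q2), repeat=n):
    counts[g(y)] += 1

for h in range(Q2):
    assert counts[h] == q ** (2 * (n - 1)) + (-1) ** (t + 1) * q ** (n - 2) * v(h)  # (2.1)
```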

    From (1.1), we know that the formula for the number of solutions of $f(X)=0$ over $\mathbb{F}_{q^2}$ is

    $$N_{q^2}(f(X)=0)=q^{2(s-1)}+\sum_{j_1=1}^{d_1-1}\cdots\sum_{j_s=1}^{d_s-1}\overline{\psi}_1^{\,j_1}(a_1)\cdots\overline{\psi}_s^{\,j_s}(a_s)\,J_{q^2}^{0}(\psi_1^{j_1},\ldots,\psi_s^{j_s}),$$

    where $d_i=(m_i,q^2-1)$ and $\psi_i$ is a multiplicative character of $\mathbb{F}_{q^2}$ of order $d_i$. Since $m_i\mid q+1$, we have $d_i=m_i$. Let $H=\{(j_1,\ldots,j_s)\mid 1\le j_i<m_i,\ 1\le i\le s\}$. It follows that $\psi_1^{j_1}\cdots\psi_s^{j_s}$ is nontrivial for any $(j_1,\ldots,j_s)\in H$, since $(m_i,m_k)=1$ for $i\neq k$. By Lemma 2.4, we have $J_{q^2}^{0}(\psi_1^{j_1},\ldots,\psi_s^{j_s})=0$ and hence $N_{q^2}(f(X)=0)=q^{2(s-1)}$.

    Let $N_{q^2}(f(X)=c)$ denote the number of solutions of the equation $f(X)=c$ over $\mathbb{F}_{q^2}$ with $c\in\mathbb{F}_{q^2}^{*}$. Let $V=\{(j_1,\ldots,j_s)\mid 0\le j_i<m_i,\ 1\le i\le s\}$. Then

    $$N_{q^2}(f(X)=c)=\sum_{\gamma_1+\cdots+\gamma_s=c}N_{q^2}(a_1x_1^{m_1}=\gamma_1)\cdots N_{q^2}(a_sx_s^{m_s}=\gamma_s)=\sum_{\gamma_1+\cdots+\gamma_s=c}\ \sum_{j_1=0}^{m_1-1}\psi_1^{j_1}\!\left(\frac{\gamma_1}{a_1}\right)\cdots\sum_{j_s=0}^{m_s-1}\psi_s^{j_s}\!\left(\frac{\gamma_s}{a_s}\right).$$

    Since $\psi_i$ is a multiplicative character of $\mathbb{F}_{q^2}$ of order $m_i$, we have

    $$\begin{aligned}
    N_{q^2}(f(X)=c)&=\sum_{\frac{\gamma_1}{c}+\cdots+\frac{\gamma_s}{c}=1}\ \sum_{(j_1,\ldots,j_s)\in V}\psi_1^{j_1}\!\left(\frac{\gamma_1}{c}\right)\psi_1^{j_1}\!\left(\frac{c}{a_1}\right)\cdots\psi_s^{j_s}\!\left(\frac{\gamma_s}{c}\right)\psi_s^{j_s}\!\left(\frac{c}{a_s}\right)\\
    &=\sum_{(j_1,\ldots,j_s)\in V}\psi_1^{j_1}\!\left(\frac{c}{a_1}\right)\cdots\psi_s^{j_s}\!\left(\frac{c}{a_s}\right)\sum_{\frac{\gamma_1}{c}+\cdots+\frac{\gamma_s}{c}=1}\psi_1^{j_1}\!\left(\frac{\gamma_1}{c}\right)\cdots\psi_s^{j_s}\!\left(\frac{\gamma_s}{c}\right)\\
    &=\sum_{(j_1,\ldots,j_s)\in V}\psi_1^{j_1}\!\left(\frac{c}{a_1}\right)\cdots\psi_s^{j_s}\!\left(\frac{c}{a_s}\right)J_{q^2}(\psi_1^{j_1},\ldots,\psi_s^{j_s}).
    \end{aligned}$$

    By Lemma 2.3,

    $$N_{q^2}(f(X)=c)=q^{2(s-1)}+\sum_{(j_1,\ldots,j_s)\in H}\psi_1^{j_1}\!\left(\frac{c}{a_1}\right)\cdots\psi_s^{j_s}\!\left(\frac{c}{a_s}\right)J_{q^2}(\psi_1^{j_1},\ldots,\psi_s^{j_s}).$$

    By Lemma 2.5,

    $$J_{q^2}(\psi_1^{j_1},\ldots,\psi_s^{j_s})=\frac{G_{q^2}(\psi_1^{j_1})\cdots G_{q^2}(\psi_s^{j_s})}{G_{q^2}(\psi_1^{j_1}\cdots\psi_s^{j_s})}.$$

    Since $m_i\mid q+1$ and $2\nmid m_i$, by Lemma 2.6 we have

    $$G_{q^2}(\psi_1^{j_1})=\cdots=G_{q^2}(\psi_s^{j_s})=G_{q^2}(\psi_1^{j_1}\cdots\psi_s^{j_s})=q.$$

    Then

    $$N_{q^2}(f(X)=c)=q^{2(s-1)}+q^{s-1}\sum_{j_1=1}^{m_1-1}\psi_1^{j_1}\!\left(\frac{c}{a_1}\right)\cdots\sum_{j_s=1}^{m_s-1}\psi_s^{j_s}\!\left(\frac{c}{a_s}\right)=q^{2(s-1)}+q^{s-1}\left(\sum_{j_1=0}^{m_1-1}\psi_1^{j_1}\!\left(\frac{c}{a_1}\right)-1\right)\cdots\left(\sum_{j_s=0}^{m_s-1}\psi_s^{j_s}\!\left(\frac{c}{a_s}\right)-1\right).$$

    It follows that

    $$N_{q^2}(f(X)=c)=q^{2(s-1)}+q^{s-1}\prod_{i=1}^{s}\left(\left(\frac{c}{a_i}\right)_{m_i}m_i-1\right), \tag{3.1}$$

    where

    $$\left(\frac{c}{a_i}\right)_{m_i}=\begin{cases}1,&\text{if }c/a_i\text{ is a residue of order }m_i,\\[2pt]0,&\text{otherwise.}\end{cases}$$
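
    Formula (3.1) can also be tested numerically. The sketch below assumes the parameters of Example 4.2 below ($\mathbb{F}_{q^2}=\mathbb{F}_{1024}$ represented as $\mathbb{F}_2[x]/(x^{10}+x^3+1)$, $q=32$, $s=2$, $m_1=3$, $m_2=11$) with, for simplicity, $a_1=a_2=1$; it also confirms $N_{q^2}(f(X)=0)=q^{2(s-1)}$ as shown above.

```python
# Sketch for F_{q^2} = F_1024, q = 32, m_1 = 3, m_2 = 11, a_1 = a_2 = 1
# (assumption: the representation F_2[x]/(x^10 + x^3 + 1) of Example 4.2).
R, MOD, Q2, q, s = 10, 0b10000001001, 1024, 32, 2

def gf_mul(a, b):
    res = 0
    while b:
        if b & 1:
            res ^= a
        b >>= 1
        a <<= 1
        if (a >> R) & 1:
            a ^= MOD
    return res

def gf_pow(a, e):
    res = 1
    for _ in range(e):
        res = gf_mul(res, a)
    return res

p3 = [gf_pow(x, 3) for x in range(Q2)]            # x -> x^3
p11 = [gf_pow(x, 11) for x in range(Q2)]          # x -> x^11
cubes = set(p3[1:])                               # nonzero cubes (residues of order 3)
elevenths = set(p11[1:])                          # nonzero residues of order 11

counts = [0] * Q2                                 # brute-force N(x1^3 + x2^11 = c)
for x1 in range(Q2):
    f1 = p3[x1]
    for x2 in range(Q2):
        counts[f1 ^ p11[x2]] += 1

assert counts[0] == q ** (2 * (s - 1))            # N(f(X) = 0) = q^{2(s-1)}
for c in range(1, Q2):
    ind1 = 3 * (c in cubes) - 1
    ind2 = 11 * (c in elevenths) - 1
    assert counts[c] == q ** (2 * (s - 1)) + q ** (s - 1) * ind1 * ind2   # (3.1)
```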

    For a given $h\in\mathbb{F}_{q^2}$, we discuss two cases according to whether $h$ is zero or not.

    Case 1: $h=0$. If $f(X)=0$, then $g(Y)=0$; if $f(X)\neq 0$, then $g(Y)\neq 0$. Then

    $$\begin{aligned}
    N_{q^2}(l(X,Y)=0)&=\sum_{c_1+c_2=0}N_{q^2}(f(X)=c_1)N_{q^2}(g(Y)=c_2)\\
    &=q^{2(s-1)}\left(q^{2(n-1)}+(-1)^{t+1}(q^2-1)q^{n-2}\right)+\sum_{\substack{c_1+c_2=0\\ c_1,c_2\in\mathbb{F}_{q^2}^{*}}}N_{q^2}(f(X)=c_1)N_{q^2}(g(Y)=c_2).
    \end{aligned} \tag{3.2}$$

    By Lemma 2.12, (3.1) and (3.2), we have

    $$\begin{aligned}
    N_{q^2}(l(X,Y)=0)&=q^{2(s+n-2)}+(-1)^{t+1}q^{2(s-1)+n}-(-1)^{t+1}q^{2(s-2)+n}\\
    &\quad+\sum_{c_1\in\mathbb{F}_{q^2}^{*}}\left[q^{2(s+n-2)}-(-1)^{t+1}q^{2(s-2)+n}+\prod_{i=1}^{s}\left(\left(\frac{c_1}{a_i}\right)_{m_i}m_i-1\right)\left(q^{2n+s-3}-(-1)^{t+1}q^{n+s-3}\right)\right]\\
    &=q^{2(s+n-2)}+(-1)^{t+1}q^{2(s-1)+n}-(-1)^{t+1}q^{2(s-2)+n}+q^{2(s+n-1)}-(-1)^{t+1}q^{2(s-1)+n}-q^{2(s+n-2)}+(-1)^{t+1}q^{2(s-2)+n}\\
    &\quad+\sum_{c_1\in\mathbb{F}_{q^2}^{*}}\left[\prod_{i=1}^{s}\left(\left(\frac{c_1}{a_i}\right)_{m_i}m_i-1\right)\left(q^{2n+s-3}-(-1)^{t+1}q^{n+s-3}\right)\right]\\
    &=q^{2(s+n-1)}+\sum_{c_1\in\mathbb{F}_{q^2}^{*}}\left[\prod_{i=1}^{s}\left(\left(\frac{c_1}{a_i}\right)_{m_i}m_i-1\right)\left(q^{2n+s-3}-(-1)^{t+1}q^{n+s-3}\right)\right].
    \end{aligned} \tag{3.3}$$

    Case 2: $h\in\mathbb{F}_{q^2}^{*}$. If $f(X)=h$, then $g(Y)=0$; if $f(X)=0$, then $g(Y)=h$; if $f(X)\notin\{0,h\}$, then $g(Y)\notin\{0,h\}$. So we have

    $$\begin{aligned}
    N_{q^2}(l(X,Y)=h)&=\sum_{c_1+c_2=h}N_{q^2}(f(X)=c_1)N_{q^2}(g(Y)=c_2)\\
    &=N_{q^2}(f(X)=0)N_{q^2}(g(Y)=h)+N_{q^2}(f(X)=h)N_{q^2}(g(Y)=0)\\
    &\quad+\sum_{\substack{c_1+c_2=h\\ c_1,c_2\in\mathbb{F}_{q^2}^{*}\setminus\{h\}}}N_{q^2}(f(X)=c_1)N_{q^2}(g(Y)=c_2).
    \end{aligned} \tag{3.4}$$

    By Lemma 2.12, (3.1) and (3.4),

    $$\begin{aligned}
    N_{q^2}(l(X,Y)=h)&=2q^{2(s+n-2)}+(-1)^{t+1}q^{2s+n-2}-2(-1)^{t+1}q^{2s+n-4}+\left(q^{s+2n-3}+(-1)^{t+1}(q^2-1)q^{s+n-3}\right)\prod_{i=1}^{s}\left(\left(\frac{h}{a_i}\right)_{m_i}m_i-1\right)\\
    &\quad+\sum_{c_1\in\mathbb{F}_{q^2}^{*}\setminus\{h\}}\left[q^{2(s+n-2)}-(-1)^{t+1}q^{2s+n-4}+\prod_{i=1}^{s}\left(\left(\frac{c_1}{a_i}\right)_{m_i}m_i-1\right)\left(q^{2n+s-3}-(-1)^{t+1}q^{n+s-3}\right)\right].
    \end{aligned}$$

    It follows that

    $$N_{q^2}(l(X,Y)=h)=q^{2(s+n-1)}+\left(q^{s+2n-3}+(-1)^{t+1}(q^2-1)q^{s+n-3}\right)\prod_{i=1}^{s}\left(\left(\frac{h}{a_i}\right)_{m_i}m_i-1\right)+\sum_{c_1\in\mathbb{F}_{q^2}^{*}\setminus\{h\}}\left[\prod_{i=1}^{s}\left(\left(\frac{c_1}{a_i}\right)_{m_i}m_i-1\right)\left(q^{2n+s-3}+(-1)^{t}q^{n+s-3}\right)\right]. \tag{3.5}$$

    By (3.3) and (3.5), we get the desired result. The proof of Theorem 1.1 is complete.

    The following is a direct corollary of Theorem 1.1; we omit its proof.

    Corollary 4.1. Under the conditions of Theorem 1.1, if $a_1=\cdots=a_s=h\in\mathbb{F}_{q^2}^{*}$, then we have

    $$N_{q^2}(l(X,Y)=h)=q^{2(s+n-1)}+\left(q^{s+2n-3}+(-1)^{t+1}(q^2-1)q^{s+n-3}\right)\prod_{i=1}^{s}(m_i-1)+\sum_{\gamma\in\mathbb{F}_{q^2}^{*}\setminus\{h\}}\left[\prod_{i=1}^{s}\left(\left(\frac{\gamma}{h}\right)_{m_i}m_i-1\right)\left(q^{2n+s-3}+(-1)^{t}q^{n+s-3}\right)\right],$$

    where

    $$\left(\frac{\gamma}{h}\right)_{m_i}=\begin{cases}1,&\text{if }\gamma/h\text{ is a residue of order }m_i,\\[2pt]0,&\text{otherwise.}\end{cases}$$

    Finally, we give two examples to conclude the paper.

    Example 4.2. Let $\mathbb{F}_{2^{10}}=\mathbb{F}_2(\alpha)=\mathbb{F}_2[x]/(x^{10}+x^3+1)$, where $\alpha$ is a root of $x^{10}+x^3+1$. Suppose $l(X,Y)=\alpha^{33}x_1^{3}+x_2^{11}+y_3^{2}+\alpha^{10}y_4^{2}+y_1y_2+y_3y_4$. Clearly, $\mathrm{Tr}_{\mathbb{F}_{2^{10}}/\mathbb{F}_2}(\alpha^{10})=1$, $m_1=3$, $m_2=11$, $s=2$, $n=4$, $t=0$, $a_1=\alpha^{33}$, $a_2=1$. By Theorem 1.1, we have

    $$N_{2^{10}}(l(X,Y)=0)=1024^5+(32^7+32^3)\times 20=1126587102265344.$$

    Example 4.3. Let $\mathbb{F}_{2^{12}}=\mathbb{F}_2(\beta)=\mathbb{F}_2[x]/(x^{12}+x^6+x^4+x+1)$, where $\beta$ is a root of $x^{12}+x^6+x^4+x+1$. Suppose $l(X,Y)=x_1^{5}+x_2^{13}+y_3^{2}+\beta^{10}y_4^{2}+y_1y_2+y_3y_4$. Clearly, $\mathrm{Tr}_{\mathbb{F}_{2^{12}}/\mathbb{F}_2}(\beta^{10})=1$, $m_1=5$, $m_2=13$, $s=2$, $n=4$, $t=0$, $a_1=a_2=1$. By Corollary 4.1, we have

    $$N_{2^{12}}(l(X,Y)=1)=2^{5\times 12}+(64^7-64^3\times 4095)\times 48=1153132559312355328.$$

    This work was jointly supported by the Natural Science Foundation of Fujian Province, China under Grant No. 2022J02046, Fujian Key Laboratory of Granular Computing and Applications (Minnan Normal University), Institute of Meteorological Big Data-Digital Fujian and Fujian Key Laboratory of Data Science and Statistics.

    The authors declare there is no conflict of interest.



