The development of new drugs is a long and costly process. Computer-aided drug design reduces development costs and computationally shortens the development cycle; within it, DTA (drug-target binding affinity) prediction is a key step for screening out potential drugs. With the development of deep learning, various types of deep learning models have achieved notable performance in a wide range of fields. Most related studies focus on extracting the sequence features of molecules while ignoring the valuable structural information; they employ sequence data that represent only the elemental composition of molecules, without considering the molecular structure maps that contain structural information. In this paper, we use graph neural networks to predict DTA based on corresponding graph data of drugs and proteins, and we achieve competitive performance on two benchmark datasets, Davis and KIBA. In particular, an MSE of 0.227 and a CI of 0.895 were obtained on Davis, and an MSE of 0.127 and a CI of 0.903 were obtained on KIBA.
Citation: Dong Ma, Shuang Li, Zhihua Chen. Drug-target binding affinity prediction method based on a deep graph neural network[J]. Mathematical Biosciences and Engineering, 2023, 20(1): 269-282. doi: 10.3934/mbe.2023012
[1] Yan Xie, Zhijun Liu, Ke Qi, Dongchen Shangguan, Qinglong Wang. A stochastic mussel-algae model under regime switching. Mathematical Biosciences and Engineering, 2022, 19(5): 4794-4811. doi: 10.3934/mbe.2022224
[2] Yansong Pei, Bing Liu, Haokun Qi. Extinction and stationary distribution of stochastic predator-prey model with group defense behavior. Mathematical Biosciences and Engineering, 2022, 19(12): 13062-13078. doi: 10.3934/mbe.2022610
[3] Lin Li, Wencai Zhao. Deterministic and stochastic dynamics of a modified Leslie-Gower prey-predator system with simplified Holling-type Ⅳ scheme. Mathematical Biosciences and Engineering, 2021, 18(3): 2813-2831. doi: 10.3934/mbe.2021143
[4] Sanling Yuan, Xuehui Ji, Huaiping Zhu. Asymptotic behavior of a delayed stochastic logistic model with impulsive perturbations. Mathematical Biosciences and Engineering, 2017, 14(5&6): 1477-1498. doi: 10.3934/mbe.2017077
[5] Chun Lu, Bing Li, Limei Zhou, Liwei Zhang. Survival analysis of an impulsive stochastic delay logistic model with Lévy jumps. Mathematical Biosciences and Engineering, 2019, 16(5): 3251-3271. doi: 10.3934/mbe.2019162
[6] Zhiwei Huang, Gang Huang. Mathematical analysis on deterministic and stochastic lake ecosystem models. Mathematical Biosciences and Engineering, 2019, 16(5): 4723-4740. doi: 10.3934/mbe.2019237
[7] Yan Zhang, Shujing Gao, Shihua Chen. Modelling and analysis of a stochastic nonautonomous predator-prey model with impulsive effects and nonlinear functional response. Mathematical Biosciences and Engineering, 2021, 18(2): 1485-1512. doi: 10.3934/mbe.2021077
[8] Xueqing He, Ming Liu, Xiaofeng Xu. Analysis of stochastic disease including predator-prey model with fear factor and Lévy jump. Mathematical Biosciences and Engineering, 2023, 20(2): 1750-1773. doi: 10.3934/mbe.2023080
[9] Yan Wang, Tingting Zhao, Jun Liu. Viral dynamics of an HIV stochastic model with cell-to-cell infection, CTL immune response and distributed delays. Mathematical Biosciences and Engineering, 2019, 16(6): 7126-7154. doi: 10.3934/mbe.2019358
[10] H. J. Alsakaji, F. A. Rihan, K. Udhayakumar, F. El Ktaibi. Stochastic tumor-immune interaction model with external treatments and time delays: An optimal control problem. Mathematical Biosciences and Engineering, 2023, 20(11): 19270-19299. doi: 10.3934/mbe.2023852
Cell-based in vitro assays [27] are efficient methods to study the effect of industrial chemicals on the environment or human health. Our work is based on the cytotoxicity profiling project carried out by the Alberta Centre for Toxicology, in which 63 chemicals were initially investigated using the xCELLigence Real-Time Cell Analysis High Throughput (RTCA HT) assay [26]. We consider a mathematical model represented by stochastic differential equations to study cytotoxicity, i.e., the effect of toxicants on human cells, such as the killing of cells or cellular pathological changes.
The cells were seeded into the wells of micro-electronic plates (E-Plates), and the test substances with 11 concentrations (1:3 serial dilution from the stock solution) were dissolved in the cell culture medium [20]. The microelectrode electronic impedance value was converted by software to the Cell Index (
The success of clustering and classification methods depends on providing TCRCs (time-dependent cellular response curves) that illustrate the cell population's evolution from persistence to extinction. In [1] we considered a model represented by a system of ordinary differential equations to determine an appropriate range for the initial concentration of the toxicant. The model's parameters were estimated based on the data included in the TCRCs [1].
Let
$\frac{dn(t)}{dt}=\beta n(t)-\gamma n^2(t)-\alpha C_o(t)n(t),$  (1)
$\frac{dC_o(t)}{dt}=\lambda_{21}C_e(t)-\eta_{21}C_o(t),$  (2)
$\frac{dC_e(t)}{dt}=\lambda_{22}C_o(t)n(t)-\eta_{22}C_e(t)n(t)$  (3)
Here
The deterministic model (1)-(3) is a special case of the class of models proposed in [5], and it is related to the models considered in [7, 11, 15]. However, since we consider an acute dose of toxicant instead of a chronic one, the analysis of the survival/death of the cell population is different from the one done in the previously mentioned papers.
We have noticed that, for the toxicants considered here, the estimated values of the parameters
1. If
$\lim_{t\to\infty}n(t)=K,\qquad \lim_{t\to\infty}C_o(t)=\lim_{t\to\infty}C_e(t)=0.$
2. If
$\lim_{t\to\infty}n(t)=0,\qquad \lim_{t\to\infty}C_o(t)=\frac{\lambda_{21}C_e^{*}}{\eta_{21}},\qquad \lim_{t\to\infty}C_e(t)=C_e^{*}>\frac{\beta\eta_{21}}{\alpha\lambda_{21}},$
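The dichotomy above is easy to explore numerically. The sketch below integrates (1)-(3) with a plain explicit Euler scheme for illustrative parameter values (not the estimates from [1]); with $C_e(0)$ below the threshold $\beta\eta_{21}/(\alpha\lambda_{21})$, the population should approach the carrying capacity $K=\beta/\gamma$ while both toxicant concentrations vanish.

```python
# Illustrative (not estimated) parameter values for the deterministic model (1)-(3)
beta, gamma, alpha = 0.4, 0.2, 0.3              # growth, crowding and killing rates
lam21, eta21, lam22, eta22 = 0.5, 0.6, 0.1, 0.2

# Survival threshold for C_e(0) from the dichotomy above: beta*eta21/(alpha*lam21)
threshold = beta * eta21 / (alpha * lam21)      # = 1.6 for these values

dt, steps = 1e-3, 200_000                       # explicit Euler up to t = 200
n, co, ce = 0.1, 0.0, 0.5                       # C_e(0) = 0.5 < threshold: survival case
for _ in range(steps):
    dn = (beta * n - gamma * n**2 - alpha * co * n) * dt   # equation (1)
    dco = (lam21 * ce - eta21 * co) * dt                   # equation (2)
    dce = (lam22 * co * n - eta22 * ce * n) * dt           # equation (3)
    n, co, ce = n + dn, co + dco, ce + dce
```

With these values, n(t) settles near K = β/γ = 2 and both concentrations decay toward 0, in agreement with case 1; raising C_e(0) above the threshold instead drives n(t) to 0, as in case 2.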
In practice we usually estimate a parameter by an average value plus an error term. To keep the stochastic model as simple as possible, we ignore the relationship between the parameters
$\tilde{\beta}=\beta+\mathrm{error}_1,\qquad \tilde{\gamma}=\gamma+\mathrm{error}_2$  (4)
By the central limit theorem, the error terms may be approximated by a normal distribution with zero mean. Thus we replace equation (1) by a stochastic differential equation and, together with equations (2) and (3), we get the stochastic model
$dn(t)=n(t)\left(\beta-\gamma n(t)-\alpha C_o(t)\right)dt+\sigma_1 n(t)\,dB_1(t)-\sigma_2 n^2(t)\,dB_2(t),$  (5)
$dC_o(t)=\left(\lambda_{21}C_e(t)-\eta_{21}C_o(t)\right)dt,$  (6)
$dC_e(t)=\left(\lambda_{22}C_o(t)n(t)-\eta_{22}C_e(t)n(t)\right)dt,$  (7)
Here
Several versions of a stochastic logistic equation similar to (5) were considered in [18], [19], [8], [9], [10] and [21]. The system of stochastic differential equations (5)-(7) is closely related to the stochastic models in a polluted environment considered in [15], [16], and [24]. However, for the models considered in these papers, instead of the equations (6) and (7),
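Sample paths of (5)-(7) can be generated with a minimal Euler–Maruyama discretization, sketched below. The parameter values are illustrative (not the monastrol estimates from [1]), and since the explicit scheme does not preserve positivity exactly, the cell population is clipped at a small floor.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters (not the estimates from [1])
beta, gamma, alpha = 0.4, 0.2, 0.3
lam21, eta21, lam22, eta22 = 0.5, 0.6, 0.1, 0.2
sigma1, sigma2 = 0.1, 0.05

dt, steps = 1e-3, 50_000
n, co, ce = 0.1, 0.0, 0.5
path = np.empty(steps)
for k in range(steps):
    dB1, dB2 = rng.normal(0.0, np.sqrt(dt), size=2)   # Brownian increments
    dn = n * (beta - gamma * n - alpha * co) * dt \
        + sigma1 * n * dB1 - sigma2 * n**2 * dB2      # equation (5)
    dco = (lam21 * ce - eta21 * co) * dt              # equation (6)
    dce = (lam22 * co * n - eta22 * ce * n) * dt      # equation (7)
    n = max(n + dn, 1e-12)   # crude positivity floor: Euler-Maruyama can overshoot below 0
    co, ce = co + dco, ce + dce
    path[k] = n
```

For small noise intensities such a path fluctuates around the deterministic carrying capacity, which is the persistent regime studied below.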
In this paper we extend the methods applied in [15] and [16] to find conditions for extinction, weak persistence, and weak stochastic permanence for the model (5)-(7). In addition, we focus on the ergodic properties when the cell population is strongly persistent. The main contribution of this paper is the proof that
In the next section we prove that there is a unique non-negative solution of system (5)-(7) for any non-negative initial value. In section 3 we investigate the asymptotic behavior, and in section 4 we study the weak convergence of
For the stochastic model to be appropriate, we have to show that system (5)-(7) has a unique global positive solution. Let
Since equations (6) and (7) are linear in
$C_o(t)=C_o(0)e^{-\eta_{21}t}+\lambda_{21}e^{-\eta_{21}t}\int_0^t C_e(s)e^{\eta_{21}s}\,ds$  (8)
$C_e(t)=C_e(0)\exp\left(-\eta_{22}\int_0^t n(s)\,ds\right)+\lambda_{22}\exp\left(-\eta_{22}\int_0^t n(s)\,ds\right)\int_0^t C_o(s)n(s)\exp\left(\eta_{22}\int_0^s n(l)\,dl\right)ds,\quad t\ge 0.$  (9)
Let's define the differential operator
$\mathcal{L}=\frac{\partial}{\partial t}+\left(\beta n-\gamma n^2-\alpha C_o n\right)\frac{\partial}{\partial n}+\left(\lambda_{21}C_e-\eta_{21}C_o\right)\frac{\partial}{\partial C_o}+\left(\lambda_{22}C_o n-\eta_{22}C_e n\right)\frac{\partial}{\partial C_e}+\frac{1}{2}\left(\sigma_1^2 n^2+\sigma_2^2 n^4\right)\frac{\partial^2}{\partial n^2}$
For any function
$dV(x(t),t)=\mathcal{L}V(x(t),t)\,dt+\frac{\partial V(x(t),t)}{\partial n}\left(\sigma_1 n(t)\,dB_1(t)-\sigma_2 n^2(t)\,dB_2(t)\right),$  (10)
where
Theorem 2.1. Let
Proof. The proof is similar to the proof of Theorem 3.1 in [29]. Since the coefficients are locally Lipschitz continuous functions, there exists a unique solution on
$\tau_m=\inf\left\{t\in[0,\tau_e):\ \min\{n(t),C_e(t)\}\le m^{-1}\ \text{or}\ \max\{n(t),C_o(t),C_e(t)\}\ge m\right\},$  (11)
where
We show that
We define the
$V(x)=C_o+\frac{\alpha}{4\lambda_{22}}\left(C_e-\log C_e-1\right)+\frac{\alpha C_e}{4\lambda_{22}}+\left(\sqrt{n}-\log\sqrt{n}-1\right)+n.$
We get
$\mathcal{L}V(x)=\left(\lambda_{21}C_e-\eta_{21}C_o\right)+\frac{\alpha}{4\lambda_{22}}\left(1-\frac{1}{C_e}\right)\left(\lambda_{22}C_o n-\eta_{22}C_e n\right)+\frac{\alpha}{4\lambda_{22}}\left(\lambda_{22}C_o n-\eta_{22}C_e n\right)+\left(\beta n-\gamma n^2-\alpha C_o n\right)\left(\frac{1}{2\sqrt{n}}-\frac{1}{2n}\right)+\frac{1}{2}\left(\sigma_1^2 n^2+\sigma_2^2 n^4\right)\left(-\frac{1}{4n\sqrt{n}}+\frac{1}{2n^2}\right)+\left(\beta n-\gamma n^2-\alpha C_o n\right)$
Omitting some of the negative terms, for any
$\mathcal{L}V(x)\le \lambda_{21}C_e+\frac{\alpha C_o n}{4}+\frac{\alpha C_o n}{4}+\frac{\alpha C_o}{2}-\alpha C_o n+f(n)\le \lambda_{21}C_e+\frac{\alpha C_o}{2}+f(n),$
where
$f(n)=-\frac{\sigma_2^2 n^2\sqrt{n}}{8}+\frac{\alpha\eta_{22}}{4\lambda_{22}}n+\frac{\beta\sqrt{n}}{2}+\frac{\gamma n}{2}+\frac{\sigma_1^2}{4}+\frac{\sigma_2^2 n^2}{4}+\beta n$
Since
Let's define
$\mathcal{L}\tilde{V}(x,t)=-Ce^{-Ct}\left(1+V(x)\right)+e^{-Ct}\mathcal{L}V(x)\le 0.$
Using Itô's formula (10) for
$E\left[\tilde{V}(x(t\wedge\tau_m),t\wedge\tau_m)\right]=\tilde{V}(x(0),0)+E\left[\int_0^{t\wedge\tau_m}\mathcal{L}\tilde{V}(x(u\wedge\tau_m),u\wedge\tau_m)\,du\right]\le \tilde{V}(x(0),0).$
Notice that for any
$E\left[V(x(\tau_m,\omega))I_{\Theta_m}(\omega)\right]\ge P(\Theta_m)b_m\ge \epsilon b_m\to\infty$
as
Here we focus on the case when
Lemma 2.2. If
Theorem 2.3. If
$\lim_{t\to\infty}C_o(t)=\frac{\lambda_{21}}{\eta_{21}}\lim_{t\to\infty}C_e(t).$
In this section we assume that
Definition 3.1. The population
Definition 3.2. The population
Definition 3.3. The population
Definition 3.4. The population
Theorem 3.5. a. If
b. If
Proof. The proof is similar to the proof of Theorem 6 in [16]. We start with some preliminary results. By Itô's formula in (5) we have
$d\ln n(t)=\left(\beta-\gamma n(t)-\alpha C_o(t)-\frac{\sigma_1^2+\sigma_2^2 n^2(t)}{2}\right)dt+\sigma_1\,dB_1(t)-\sigma_2 n(t)\,dB_2(t).$
This means that we have
$\ln n(t)-\ln n(0)=\left(\beta-\frac{\sigma_1^2}{2}\right)t-\gamma\int_0^t n(s)\,ds-\alpha\int_0^t C_o(s)\,ds-\frac{\sigma_2^2}{2}\int_0^t n^2(s)\,ds+\sigma_1 B_1(t)-\sigma_2\int_0^t n(s)\,dB_2(s),$  (12)
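The drift in (12) follows from Itô's formula applied to ln n(t). The following sympy sketch (with C_o frozen as a symbol, since (6)-(7) carry no noise) verifies the computation.

```python
import sympy as sp

n, beta, gamma, alpha, Co, s1, s2 = sp.symbols(
    'n beta gamma alpha C_o sigma_1 sigma_2', positive=True)

mu = n * (beta - gamma * n - alpha * Co)       # drift of equation (5)
diff2 = s1**2 * n**2 + s2**2 * n**4            # squared diffusion coefficient of (5)

f = sp.log(n)
# Ito's formula: the drift of d(ln n) is mu*f'(n) + (1/2)*diff2*f''(n)
ito_drift = mu * sp.diff(f, n) + sp.Rational(1, 2) * diff2 * sp.diff(f, n, 2)
claimed = beta - gamma * n - alpha * Co - (s1**2 + s2**2 * n**2) / 2
assert sp.simplify(ito_drift - claimed) == 0
```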
Notice that the quadratic variation [17] of
$\langle M(t),M(t)\rangle=\sigma_2^2\int_0^t n^2(s)\,ds.$
Now we prove part a. Using the exponential martingale inequality (Theorem 7.4 in [17]) and the Borel-Cantelli lemma ([22], pp. 102), and proceeding as in the proof of Theorem 6 in [16], we can show that for almost all
$\sup_{0\le t\le n}\left(M(t)-\frac{1}{2}\langle M(t),M(t)\rangle\right)\le 2\ln n.$
Hence, for all
$-\frac{\sigma_2^2}{2}\int_0^t n^2(s)\,ds-\sigma_2\int_0^t n(s)\,dB_2(s)\le 2\ln n\quad \text{a.s.}$
Substituting the above inequality in (12) we get
$\frac{\ln n(t)-\ln n(0)}{t}\le \beta-\frac{\sigma_1^2}{2}-\frac{\alpha\int_0^t C_o(s)\,ds}{t}+\frac{\sigma_1 B_1(t)}{t}+\frac{2\ln n}{n-1}\quad \text{a.s.},$
for all
$\limsup_{t\to\infty}\frac{\ln n(t)}{t}\le \beta-\frac{\sigma_1^2}{2}-\alpha\liminf_{t\to\infty}\frac{\int_0^t C_o(s)\,ds}{t}<0\quad \text{a.s.}$
Next we prove part b. Suppose that
$\limsup_{t\to\infty}\frac{\ln n(t,\omega)}{t}\le 0$  (13)
Moreover, from the law of large numbers for local martingales (Theorem 3.4 in [17]) there exists a set
$\lim_{t\to\infty}\frac{M(t,\omega)}{t}=\lim_{t\to\infty}\frac{B_1(t,\omega)}{t}=0.$
From (12) we get:
$\frac{\ln n(t)}{t}=\frac{\ln n(0)}{t}+\left(\beta-\frac{\sigma_1^2}{2}\right)-\frac{\alpha\int_0^t C_o(s)\,ds}{t}-\frac{\int_0^t\left(\gamma n(s)+\frac{\sigma_2^2}{2}n^2(s)\right)ds}{t}+\frac{\sigma_1 B_1(t)}{t}+\frac{M(t,\omega)}{t}$
Hence, for any
$\limsup_{t\to\infty}\frac{\ln n(t,\omega)}{t}=\left(\beta-\frac{\sigma_1^2}{2}\right)-\alpha\liminf_{t\to\infty}\frac{\int_0^t C_o(s,\omega)\,ds}{t}$
Since we know that
We have the following result regarding the expectation of
Lemma 3.6. There exists a constant
Proof. Using Itô's formula in (5) we get:
$d(e^t n(t))=n(t)e^t\left(1+\beta-\alpha C_o(t)-\gamma n(t)\right)dt+\sigma_1 n(t)e^t\,dB_1(t)-\sigma_2 n^2(t)e^t\,dB_2(t)\le n(t)e^t\left(1+\beta-\gamma n(t)\right)dt+\sigma_1 n(t)e^t\,dB_1(t)-\sigma_2 n^2(t)e^t\,dB_2(t)\le e^t\frac{(1+\beta)^2}{4\gamma}\,dt+\sigma_1 n(t)e^t\,dB_1(t)-\sigma_2 n^2(t)e^t\,dB_2(t)$  (14)
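The last step in (14) bounds the concave quadratic n(1+β−γn) by its maximum (1+β)²/(4γ), attained at n = (1+β)/(2γ); a quick symbolic check:

```python
import sympy as sp

n, beta, gamma = sp.symbols('n beta gamma', positive=True)
q = n * (1 + beta - gamma * n)             # concave quadratic appearing in (14)
n_star = sp.solve(sp.diff(q, n), n)[0]     # critical point of q
assert sp.simplify(n_star - (1 + beta) / (2 * gamma)) == 0
assert sp.simplify(q.subs(n, n_star) - (1 + beta)**2 / (4 * gamma)) == 0
```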
Let
$\eta_m=\inf\left\{t\ge 0:\ n(t)\notin(1/m,m)\right\},$  (15)
for any
$E\left[e^{t\wedge\eta_m}n(t\wedge\eta_m)\right]\le n(0)+E\left[\int_0^{t\wedge\eta_m}e^s\frac{(1+\beta)^2}{4\gamma}\,ds\right]\le n(0)+\frac{(1+\beta)^2}{4\gamma}\left(e^t-1\right).$
Letting
$E[n(t)]\le \frac{n(0)}{e^t}+\frac{(1+\beta)^2}{4\gamma}\left(1-e^{-t}\right).$
Thus, there exists a constant
Corollary 1. For any
Proof. For any
$P(n(t)>c_1(\epsilon))\le \frac{E[n(t)]}{c_1(\epsilon)}.$
Hence, from Lemma 3.6 we get
$\limsup_{t\to\infty}P(n(t)>c_1(\epsilon))\le \limsup_{t\to\infty}\frac{E[n(t)]}{c_1(\epsilon)}\le \epsilon.$
Theorem 3.7. If
Proof. First we show that
By Itô's formula in (5) we get for any real constant c:
$d\left(\frac{e^{ct}}{n(t)}\right)=e^{ct}\left(\frac{1}{n(t)}\left(c-\beta+\sigma_1^2+\alpha C_o(t)\right)+\gamma+\sigma_2^2 n(t)\right)dt-\sigma_1 e^{ct}\frac{1}{n(t)}\,dB_1(t)+\sigma_2 e^{ct}\,dB_2(t)$
Since
$d\left(\frac{e^{ct}}{n(t)}\right)\le e^{ct}\left(\gamma+\sigma_2^2 n(t)\right)dt-\sigma_1 e^{ct}\frac{1}{n(t)}\,dB_1(t)+\sigma_2 e^{ct}\,dB_2(t)$  (16)
Taking expectation in (16) and using Lemma 3.6 we get:
$E\left[\frac{e^{c(t\wedge\eta_m)}}{n(t\wedge\eta_m)}\right]\le \frac{1}{n(0)}+E\left[\int_0^{t\wedge\eta_m}e^{cs}\left(\gamma+\sigma_2^2 n(s)\right)ds\right]\le \frac{1}{n(0)}+\frac{\left(\gamma+\sigma_2^2 K_1\right)\left(e^{ct}-1\right)}{c},$
where
$E\left[\frac{1}{n(t)}\right]\le \frac{1}{n(0)e^{ct}}+\frac{\gamma+\sigma_2^2 K_1}{c}\left(1-e^{-ct}\right),$
so
Next we show that for any
For any
$P(n(t)<c_2(\epsilon))=P\left(\frac{1}{n(t)}>\frac{1}{c_2(\epsilon)}\right)\le c_2(\epsilon)\,E\left[\frac{1}{n(t)}\right]$
Hence
$\limsup_{t\to\infty}P(n(t)<c_2(\epsilon))\le \epsilon\limsup_{t\to\infty}\frac{E[1/n(t)]}{M_2}\le \epsilon.$
Thus
The deterministic system (1)-(3) has a maximum capacity equilibrium point
For stochastic differential equations, invariant and stationary distributions play the same role as fixed points for deterministic differential equations. In general, let
$dX(t)=b(X(t))\,dt+\sum_{r=1}^{d}\sigma_r(X(t))\,dB_r(t),$  (17)
where
$\mathcal{L}=\sum_{i=1}^{l}b_i(x)\frac{\partial}{\partial x_i}+\frac{1}{2}\sum_{i,j=1}^{l}A_{i,j}(x)\frac{\partial^2}{\partial x_i\partial x_j},\qquad A_{i,j}(x)=\sum_{r=1}^{d}\sigma_{r,i}(x)\sigma_{r,j}(x).$
Let
Definition 4.1. A stationary distribution [6] for
$\int_E P(t,x,A)\,\mu(dx)=\mu(A),\quad\text{for any }t\ge 0\text{ and any }A\in\mathcal{B}(E).$
Definition 4.2. The Markov process
It is clear that the stability in distribution implies the existence of a unique stationary measure, but the converse is not always true [2]. We have the following result (see lemma 2.2 in [29] and the references therein).
Lemma 4.3. Suppose that there exists a bounded domain
$\lim_{t\to\infty}P(t,x,B)=\mu(B),\qquad P_x\left\{\lim_{T\to\infty}\frac{1}{T}\int_0^T f(X(t))\,dt=\int_E f(x)\,\mu(dx)\right\}=1,$
for all
We now study the stochastic system (5)-(7) when
$dX(t)=\left(\beta X(t)-\gamma X^2(t)\right)dt+\sigma_1 X(t)\,dB_1(t)-\sigma_2 X^2(t)\,dB_2(t),$  (18)
$dX_\epsilon(t)=\left(\beta X_\epsilon(t)-\gamma X_\epsilon^2(t)-\alpha\epsilon X_\epsilon(t)\right)dt+\sigma_1 X_\epsilon(t)\,dB_1(t)-\sigma_2 X_\epsilon^2(t)\,dB_2(t),$  (19)
Lemma 4.4. a. For any given initial value
b. For any
c. There exists a constant
Proof. The proofs of a. and b. can be done similarly to the proof of Theorem 2.1, using the
Let
Theorem 4.5. If
$\lim_{t\to\infty}P_X(t,x,B)=\mu_1(B),\qquad P_x\left\{\lim_{T\to\infty}\frac{1}{T}\int_0^T f(X(t))\,dt=\int_E f(x)\,\mu_1(dx)\right\}=1,$
for all
Proof. We consider the
$\mathcal{L}V(x)=-\frac{\sigma_2^2}{8}x^{5/2}+\frac{\sigma_2^2}{4}x^2-\frac{\gamma}{2}x^{3/2}+\frac{\gamma}{2}x+\left(\frac{\beta}{2}-\frac{\sigma_1^2}{8}\right)x^{1/2}+\left(\frac{\sigma_1^2}{4}-\frac{\beta}{2}\right).$
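Assuming the Lyapunov function used here is V(x) = √x − log √x (the n-dependent part of the function in the proof of Theorem 2.1), the displayed expression for LV can be verified symbolically against the generator of equation (18):

```python
import sympy as sp

x, beta, gamma, s1, s2 = sp.symbols('x beta gamma sigma_1 sigma_2', positive=True)

V = sp.sqrt(x) - sp.log(sp.sqrt(x))            # assumed Lyapunov function
mu = beta * x - gamma * x**2                   # drift of equation (18)
diff2 = s1**2 * x**2 + s2**2 * x**4            # squared diffusion of equation (18)
LV = mu * sp.diff(V, x) + sp.Rational(1, 2) * diff2 * sp.diff(V, x, 2)

claimed = (-s2**2 / 8 * x**sp.Rational(5, 2) + s2**2 / 4 * x**2
           - gamma / 2 * x**sp.Rational(3, 2) + gamma / 2 * x
           + (beta / 2 - s1**2 / 8) * sp.sqrt(x) + (s1**2 / 4 - beta / 2))
assert sp.simplify(LV - claimed) == 0
```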
Since
Let
Let us define the processes
$dN(t)=\left(N(t)\left(\sigma_1^2-\beta\right)+\alpha N(t)C_o(t)+\gamma+\frac{\sigma_2^2}{N(t)}\right)dt-\sigma_1 N(t)\,dB_1(t)+\sigma_2\,dB_2(t)\quad \text{a.s.},$  (20)
$dY(t)=\left(Y(t)\left(\sigma_1^2-\beta\right)+\gamma+\frac{\sigma_2^2}{Y(t)}\right)dt-\sigma_1 Y(t)\,dB_1(t)+\sigma_2\,dB_2(t)\quad \text{a.s.},$  (21)
$dY_\epsilon(t)=\left(Y_\epsilon(t)\left(\sigma_1^2-\beta+\alpha\epsilon\right)+\gamma+\frac{\sigma_2^2}{Y_\epsilon(t)}\right)dt-\sigma_1 Y_\epsilon(t)\,dB_1(t)+\sigma_2\,dB_2(t)\quad \text{a.s.}$  (22)
From the proof of Theorem 3.7 we know that if
Lemma 4.6. If
Proof. The proof is based on the results in Lemma 4.4 and it is similar with the first part of the proof of Theorem 3.7. For completeness we have included it in Appendix C.
We use the processes
Theorem 4.7. Let
Proof. We follow the same idea as in the proof of Theorem 2.4 in [28]. From Theorem 4.5 we know that
Firstly, let's notice that
$Y(t)\le N(t)\ \text{and}\ Y(t)\le Y_\epsilon(t)\quad\text{for any }t\ge 0\ \text{a.s.}$  (23)
Indeed, if we denote
$d\xi(t)=\left(\xi(t)\left(\sigma_1^2-\beta-\frac{\sigma_2^2}{N(t)Y(t)}\right)+\alpha N(t)C_o(t)\right)dt-\sigma_1\xi(t)\,dB_1(t)\quad \text{a.s.}$
The solution of the previous linear equation is given by (see chapter 3, [17])
$\xi(t)=\Phi(t)\int_0^t\frac{\alpha N(s)C_o(s)}{\Phi(s)}\,ds\quad \text{a.s.},$
where
$\Phi(t)=\exp\left\{-t\left(\beta-\frac{\sigma_1^2}{2}\right)-\int_0^t\frac{\sigma_2^2}{N(s)Y(s)}\,ds-\sigma_1 B_1(t)\right\}>0$
Obviously
Secondly we show that for any
$\liminf_{t\to\infty}\left(Y_\epsilon(t)-N(t)\right)\ge 0\quad \text{a.s.}$  (24)
From equations (20) and (22) we get
$d\left(Y_\epsilon(t)-N(t)\right)=\left(\left(Y_\epsilon(t)-N(t)\right)\left(\sigma_1^2+\alpha\epsilon-\beta-\frac{\sigma_2^2}{N(t)Y_\epsilon(t)}\right)+\alpha N(t)\left(\epsilon-C_o(t)\right)\right)dt-\sigma_1\left(Y_\epsilon(t)-N(t)\right)dB_1(t)\quad \text{a.s.}$
The solution of the linear equation is given by
$Y_\epsilon(t)-N(t)=\Phi_1(t)\int_0^t\frac{\alpha N(s)\left(\epsilon-C_o(s)\right)}{\Phi_1(s)}\,ds\quad \text{a.s.},$
where
$0<\Phi_1(t)=\exp\left\{-t\left(\beta-\alpha\epsilon-\frac{\sigma_1^2}{2}\right)-\int_0^t\frac{\sigma_2^2}{N(s)Y_\epsilon(s)}\,ds-\sigma_1 B_1(t)\right\}\le \exp\left\{-t\left(\beta-\alpha\epsilon-\frac{\sigma_1^2}{2}+\frac{\sigma_1 B_1(t)}{t}\right)\right\}$
Since
$Y_\epsilon(t)-N(t)=\Phi_1(t)\left(\int_0^T\frac{\alpha N(s)\left(\epsilon-C_o(s)\right)}{\Phi_1(s)}\,ds+\int_T^t\frac{\alpha N(s)\left(\epsilon-C_o(s)\right)}{\Phi_1(s)}\,ds\right)\ge \Phi_1(t)\int_0^T\frac{\alpha N(s)\left(\epsilon-C_o(s)\right)}{\Phi_1(s)}\,ds$
Therefore for any
$\liminf_{t\to\infty}\left(Y_\epsilon(t)-N(t)\right)\ge \lim_{t\to\infty}\Phi_1(t)\int_0^T\frac{\alpha N(s)\left(\epsilon-C_o(s)\right)}{\Phi_1(s)}\,ds=0\quad \text{a.s.}$
Thirdly we prove that
$\lim_{\epsilon\to 0}\lim_{t\to\infty}E\left[Y_\epsilon(t)-Y(t)\right]=0.$  (25)
We know from (23) that
$d\left(Y_\epsilon(t)-Y(t)\right)=\left(\left(Y_\epsilon(t)-Y(t)\right)\left(\sigma_1^2+\alpha\epsilon-\beta-\frac{\sigma_2^2}{Y(t)Y_\epsilon(t)}\right)+\alpha\epsilon Y(t)\right)dt-\sigma_1\left(Y_\epsilon(t)-Y(t)\right)dB_1(t)\le \left(\left(Y_\epsilon(t)-Y(t)\right)\left(\sigma_1^2+\alpha\epsilon-\beta\right)+\alpha\epsilon Y(t)\right)dt-\sigma_1\left(Y_\epsilon(t)-Y(t)\right)dB_1(t)\quad \text{a.s.}$
From Lemma 4.6 we know that
$E\left[Y_\epsilon(t)-Y(t)\right]\le \int_0^t\left(E\left[Y_\epsilon(s)-Y(s)\right]\left(\sigma_1^2+\alpha\epsilon-\beta\right)+\alpha\epsilon E[Y(s)]\right)ds\le \int_0^t E\left[Y_\epsilon(s)-Y(s)\right]\left(\sigma_1^2+\alpha\epsilon-\beta\right)ds+t\alpha\epsilon\sup_{t\ge 0}E[Y(t)]$
For any
$0\le E\left[Y_\epsilon(t)-Y(t)\right]\le \frac{\alpha\epsilon\sup_{t\ge 0}E[Y(t)]}{\beta-\sigma_1^2-\alpha\epsilon}\left(1-\exp\left(-t\left(\beta-\sigma_1^2-\alpha\epsilon\right)\right)\right)$
Taking limits in the previous inequality we get equation (25).
Finally, using (23), (24), and (25) we obtain that
Corollary 2. Let
a. If
$p(x)=\frac{1}{G_1 x^4}\exp\left(-\frac{\beta}{\sigma_2^2}\left(\frac{1}{x}-\frac{\gamma}{\beta}\right)^2\right),\quad x>0$  (26)
$G_1=\frac{\sigma_2}{2\beta^{5/2}}\left(\Psi\left(\frac{\gamma\sqrt{2\beta}}{\beta\sigma_2}\right)\sqrt{\pi}\left(\sigma_2^2\beta+2\gamma^2\right)+\gamma\sigma_2\beta^{1/2}\exp\left(-\frac{\gamma^2}{\sigma_2^2\beta}\right)\right)$  (27)
where
b. If
Proof. We know that
a. If
$dY(t)=\left(-Y(t)\beta+\gamma+\frac{\sigma_2^2}{Y(t)}\right)dt+\sigma_2\,dB_2(t)\quad \text{a.s.},$  (28)
Let us define
$q(y)=\exp\left(-\frac{2}{\sigma_2^2}\int_1^y\left(-\beta u+\frac{\sigma_2^2}{u}+\gamma\right)du\right)=\frac{1}{y^2}\exp\left(-\frac{\beta}{\sigma_2^2}\left(1-\frac{\gamma}{\beta}\right)^2\right)\exp\left(\frac{\beta}{\sigma_2^2}\left(y-\frac{\gamma}{\beta}\right)^2\right)$
It can be easily shown that
$\int_0^1 q(y)\,dy=\infty,\qquad \int_1^\infty q(y)\,dy=\infty,\qquad \int_0^\infty\frac{1}{\sigma_2^2 q(y)}\,dy=\frac{G_1}{\sigma_2^2}\exp\left(\frac{\beta}{\sigma_2^2}\left(1-\frac{\gamma}{\beta}\right)^2\right),$
where
$p_1(x)=\frac{\dfrac{1}{\sigma_2^2 q(x)}}{\displaystyle\int_0^\infty\frac{1}{\sigma_2^2 q(y)}\,dy}=\frac{x^2\exp\left(-\frac{\beta}{\sigma_2^2}\left(x-\frac{\gamma}{\beta}\right)^2\right)}{G_1}$
Thus, by Theorem 4.5,
$\lim_{t\to\infty}\frac{1}{t}\int_0^t X(u)\,du=\int_0^\infty x\,p(x)\,dx=\frac{\sigma_2}{2\beta^{3/2}G_1}\left(\sigma_2\sqrt{\beta}\exp\left(-\frac{\gamma^2}{\sigma_2^2\beta}\right)+2\gamma\sqrt{\pi}\,\Psi\left(\frac{\gamma\sqrt{2\beta}}{\beta\sigma_2}\right)\right)\quad \text{a.s.}$
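Interpreting Ψ as the standard normal cumulative distribution function, the constant (27) and the ergodic mean above can be cross-checked by quadrature in the variable y = 1/x, since G_1 = ∫₀^∞ y² exp(−β/σ₂² (y − γ/β)²) dy and the mean equals (1/G_1) ∫₀^∞ y exp(−β/σ₂² (y − γ/β)²) dy. The parameter values below are arbitrary illustrations.

```python
import math
import numpy as np

beta, gamma, sigma2 = 0.5, 0.3, 0.4          # illustrative values
a, c = beta / sigma2**2, gamma / beta

def Psi(z):
    # assumed meaning of Psi: standard normal cumulative distribution function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

z = gamma * math.sqrt(2 * beta) / (beta * sigma2)
G1 = sigma2 / (2 * beta**2.5) * (
    math.sqrt(math.pi) * (sigma2**2 * beta + 2 * gamma**2) * Psi(z)
    + gamma * sigma2 * math.sqrt(beta) * math.exp(-gamma**2 / (sigma2**2 * beta)))
mean = sigma2 / (2 * beta**1.5 * G1) * (
    sigma2 * math.sqrt(beta) * math.exp(-gamma**2 / (sigma2**2 * beta))
    + 2 * gamma * math.sqrt(math.pi) * Psi(z))

# Composite trapezoid rule for the same quantities in y = 1/x
def trap(f, x):
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * (x[1:] - x[:-1])))

y = np.linspace(1e-9, c + 12 / math.sqrt(a), 400_000)
w = np.exp(-a * (y - c)**2)
G1_num = trap(y**2 * w, y)
mean_num = trap(y * w, y) / G1_num
```

Both the closed forms and the quadrature values agree to high accuracy for these parameters.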
b. If
$dY(t)=\left(\gamma-Y(t)\left(\beta-\sigma_1^2\right)\right)dt-\sigma_1 Y(t)\,dB_1(t)\quad \text{a.s.}$
Proceeding similarly to part a., we can show that
$\lim_{t\to\infty}\frac{1}{t}\int_0^t X(u)\,du=\left(\frac{2\left(\beta-\sigma_1^2\right)}{\sigma_1^2}+1\right)\frac{\sigma_1^2}{2\gamma}\quad \text{a.s.}$
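For σ₂ = 0 the limit above simplifies to (2β − σ₁²)/(2γ). Since the stationary law of the auxiliary process Y = 1/X is then inverse gamma, this value can be checked against a quadrature of the (unnormalized) speed density of the equation for Y; the parameter values below are illustrative.

```python
import numpy as np

beta, gamma, sigma1 = 0.5, 0.3, 0.2     # illustrative values with beta > sigma1^2
b = beta - sigma1**2

# Unnormalized speed density of dY = (gamma - b*Y)dt - sigma1*Y*dB1 (inverse gamma law):
# m(y) ∝ y^(-2 - 2b/sigma1^2) * exp(-2*gamma / (sigma1^2 * y))
y = np.linspace(1e-4, 50.0, 1_000_000)
m = y ** (-2.0 - 2.0 * b / sigma1**2) * np.exp(-2.0 * gamma / (sigma1**2 * y))

def trap(f, x):
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * (x[1:] - x[:-1])))

mean_X = trap(m / y, y) / trap(m, y)    # E[X] = E[1/Y] under the stationary law
closed = (2 * b / sigma1**2 + 1) * sigma1**2 / (2 * gamma)
```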
Notice that if
On the other hand, if
First we illustrate numerically the results obtained in section 3 regarding survival analysis. We consider a cell population exposed to the toxicant monastrol, as in the experiments described in [1]. The parameter values for this toxicant are estimated in [1]:
One application of the mathematical model is finding the threshold value for
We illustrate this for the model with initial values
Notice also that the results displayed in Figs. 2 and 3 agree with the conclusion of Theorem 2.3. For the stochastic model with
$\liminf_{t\to\infty}n(t,\omega)>0,\qquad \lim_{t\to\infty}C_o(t,\omega)=\lim_{t\to\infty}C_e(t,\omega)=0$
Next we use the same parameter values as stated at the beginning of this section and the initial values
When
We present a stochastic model to study the effect of toxicants on human cells. To account for parameter uncertainties, the model is expressed as a system of coupled stochastic differential equations. The variables are the cell index
We first prove the positivity of the solutions. Then we investigate the influence of noise on the cell population survival. When the noise variances
Moreover, we prove that when the noise variance
Here we illustrate our results for the toxicant monastrol. We have also considered other toxicants from the experiments described in [1], classified in various clusters [30]. We have noticed that the cluster type neither changes the type of stationary distribution nor affects the behavior of the distributions in response to increased noise variances.
Proof. The proof is similar to the proof of Lemma 3.1 in [1]. We define the stopping time
From (8) with
$0\le C_o(t,\omega)=\lambda_{21}e^{-\eta_{21}t}\int_0^t C_e(s,\omega)e^{\eta_{21}s}\,ds\le \lambda_{21}e^{-\eta_{21}t}C_e(0)\int_0^t e^{\eta_{21}s}\,ds=\frac{\lambda_{21}C_e(0)}{\eta_{21}}\left(1-e^{-\eta_{21}t}\right)\le \frac{\lambda_{21}C_e(0)}{\eta_{21}}$
Moreover, on
$\left.\frac{dC_e}{dt}\right|_{t=\tau}=\lambda_{22}C_o(\tau)n(\tau)-\eta_{22}C_e(\tau)n(\tau)\le C_e(0)n(\tau)\left(\frac{\lambda_{21}\lambda_{22}}{\eta_{21}}-\eta_{22}\right)<0$
Thus we have a contradiction with the definition of
Proof. The proof is similar to the proof of Theorem 3.2 in [1]. Let
If
$\int_0^t C_o(s,\omega)n(s,\omega)\exp\left(\eta_{22}\int_0^s n(l,\omega)\,dl\right)ds\le \frac{\lambda_{21}C_e(0)}{\eta_{21}}\exp\left(\eta_{22}|n|_1(\omega)\right)|n|_1(\omega).$
Thus
$\lim_{t\to\infty}C_e(t,\omega)=C_e(0)\exp\left(-\eta_{22}|n|_1(\omega)\right)+\lambda_{22}M(\omega)\exp\left(-\eta_{22}|n|_1(\omega)\right)<\infty.$
Consequently, there exists
$\int_0^t C_e(s,\omega)e^{\eta_{21}s}\,ds\ge \int_{T_1(\omega)}^t C_e(s,\omega)e^{\eta_{21}s}\,ds\ge \frac{C_e(0)\exp\left(-\eta_{22}|n|_1(\omega)\right)}{2}\int_{T_1(\omega)}^t e^{\eta_{21}s}\,ds.$
So we can apply L'Hospital's rule in (8), and we get
$\lim_{t\to\infty}C_o(t,\omega)=\frac{\lambda_{21}}{\eta_{21}}\lim_{t\to\infty}C_e(t,\omega)>0.$
Thus, on
Next, if
$0\le \frac{\lambda_{22}}{\eta_{22}}\liminf_{t\to\infty}C_o(t,\omega)\le \liminf_{t\to\infty}C_e(t,\omega)\le \limsup_{t\to\infty}C_e(t,\omega)\le \frac{\lambda_{22}}{\eta_{22}}\limsup_{t\to\infty}C_o(t,\omega)$
Similarly, from (8) we either get that
$0\le \frac{\lambda_{21}}{\eta_{21}}\liminf_{t\to\infty}C_e(t,\omega)\le \liminf_{t\to\infty}C_o(t,\omega)\le \limsup_{t\to\infty}C_o(t,\omega)\le \frac{\lambda_{21}}{\eta_{21}}\limsup_{t\to\infty}C_e(t,\omega),$
(if
$\lim_{t\to\infty}C_o(t,\omega)=\lim_{t\to\infty}C_e(t,\omega)=0,$
because
In conclusion, on
Proof. We choose any
$d\left(e^{ct}Y(t)\right)=e^{ct}\left(Y(t)\left(c+\sigma_1^2-\beta\right)+\gamma+\sigma_2^2 X(t)\right)dt-\sigma_1 e^{ct}Y(t)\,dB_1(t)+\sigma_2 e^{ct}\,dB_2(t)\le e^{ct}\left(\gamma+\sigma_2^2 X(t)\right)dt-\sigma_1 e^{ct}Y(t)\,dB_1(t)+\sigma_2 e^{ct}\,dB_2(t)$  (29)
Let
$E\left[e^{c(t\wedge\tau_m)}Y(t\wedge\tau_m)\right]\le \frac{1}{n(0)}+E\left[\int_0^{t\wedge\tau_m}e^{cs}\left(\gamma+\sigma_2^2 X(s)\right)ds\right]\le \frac{1}{n(0)}+\frac{\left(\gamma+\sigma_2^2 C_1\right)\left(e^{ct}-1\right)}{c}.$
Letting
$E[Y(t)]\le \frac{1}{n(0)e^{ct}}+\frac{\gamma+\sigma_2^2 C_1}{c}\left(1-e^{-ct}\right).$
Thus, there exists a constant
1. Chaoqun Xu, Sanling Yuan. Richards Growth Model Driven by Multiplicative and Additive Colored Noises: Steady-State Analysis. 2020, 19, 0219-4775, 2050032. doi: 10.1142/S0219477520500327
2. Tiantian Ma, Dan Richard, Yongqing Betty Yang, Adam B Kashlak, Cristina Anton. Functional non-parametric mixed effects models for cytotoxicity assessment and clustering. 2023, 13, 2045-2322. doi: 10.1038/s41598-023-31011-1