Let A be an associative algebra. For A,B∈A, denote by [A,B]=AB−BA the Lie product of A and B. An additive (a linear) map δ:A→A is called a global Lie triple derivation if δ([[A,B],C])=[[δ(A),B],C]+[[A,δ(B)],C]+[[A,B],δ(C)] for all A,B,C∈A. The study of global Lie triple derivations on various algebras has attracted several authors' attention, see for example [2,11,16,17,20]. Next, let δ:A→A be a map (without the additivity (linearity) assumption). δ is called a global nonlinear Lie triple derivation if δ satisfies δ([[A,B],C])=[[δ(A),B],C]+[[A,δ(B)],C]+[[A,B],δ(C)] for all A,B,C∈A. Ji, Liu and Zhao [4] gave the concrete form of global nonlinear Lie triple derivations on triangular algebras. Chen and Xiao [3] investigated global nonlinear Lie triple derivations on parabolic subalgebras of finite-dimensional simple Lie algebras. Very recently, Zhao and Hao [21] paid attention to non-global nonlinear Lie triple derivations. Let F:A×A×A→A be a map and Q be a proper subset of A. δ is called a non-global nonlinear Lie triple derivation if δ satisfies δ([[A,B],C])=[[δ(A),B],C]+[[A,δ(B)],C]+[[A,B],δ(C)] for any A,B,C∈A with F(A,B,C)∈Q. Let M be a finite von Neumann algebra with no central summands of type I1. Zhao and Hao [21] proved that if δ:M→M satisfies δ([[A,B],C])=[[δ(A),B],C]+[[A,δ(B)],C]+[[A,B],δ(C)] for any A,B,C∈M with ABC=0, then δ=d+τ, where d is a derivation from M into itself and τ is a nonlinear map from M into its center such that τ([[A,B],C])=0 with ABC=0.
Let A be an associative ∗-algebra. For A,B∈A, denote by [A,B]∗=AB−BA∗ the skew Lie product of A and B. The skew Lie product arose in representability of quadratic functionals by sesquilinear functionals [12,13]. In recent years, the study related to skew Lie product has attracted some authors' attention, see for example [1,5,6,7,8,9,10,14,15,18,19,22] and references therein. A map δ:A→A (without the additivity (linearity) assumption) is called a global nonlinear skew Lie triple derivation if δ([[A,B]∗,C]∗)=[[δ(A),B]∗,C]∗+[[A,δ(B)]∗,C]∗+[[A,B]∗,δ(C)]∗ for all A,B,C∈A. A map δ:A→A is called an additive ∗-derivation if it is an additive derivation and satisfies δ(A∗)=δ(A)∗ for all A∈A. Li, Zhao and Chen [5] proved that every global nonlinear skew Lie triple derivation on factor von Neumann algebras is an additive ∗-derivation. Taghavi, Nouri and Darvish [15] proved that every global nonlinear skew Lie triple derivation on prime ∗-algebras is additive. Similarly, let F:A×A×A→A be a map and Q be a proper subset of A. If δ satisfies δ([[A,B]∗,C]∗)=[[δ(A),B]∗,C]∗+[[A,δ(B)]∗,C]∗+[[A,B]∗,δ(C)]∗ for any A,B,C∈A with F(A,B,C)∈Q, then δ is called a non-global nonlinear skew Lie triple derivation.
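As a concrete illustration of these definitions (a numerical sanity check only, not part of any argument in this paper), the following sketch verifies on random complex matrices that the inner map δ(X)=XT−TX with a skew-adjoint T is a ∗-derivation and satisfies the global skew Lie triple derivation identity; the matrix size, the random seed, and the helper names rand_matrix, adj, skew, and delta are our own choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4                                   # matrix size, chosen arbitrarily

def rand_matrix():
    # random element of M_n(C)
    return rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

def adj(X):
    # conjugate transpose, playing the role of the * operation
    return X.conj().T

def skew(A, B):
    # skew Lie product [A, B]_* = AB - BA*
    return A @ B - B @ adj(A)

T = rand_matrix()
T = T - adj(T)                          # make T skew-adjoint: T* = -T

def delta(X):
    # inner map X -> XT - TX; with T skew-adjoint this is a *-derivation
    return X @ T - T @ X

A, B, C = rand_matrix(), rand_matrix(), rand_matrix()

# *-compatibility: delta(A*) = delta(A)*
assert np.allclose(delta(adj(A)), adj(delta(A)))

# global skew Lie triple derivation identity
lhs = delta(skew(skew(A, B), C))
rhs = (skew(skew(delta(A), B), C)
       + skew(skew(A, delta(B)), C)
       + skew(skew(A, B), delta(C)))
assert np.allclose(lhs, rhs)
print("inner *-derivation satisfies the skew Lie triple derivation identity")
```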
Motivated by the works mentioned above, we concentrate on characterizing a class of non-global nonlinear skew Lie triple derivations δ on factor von Neumann algebras, namely the maps δ satisfying δ([[A,B]∗,C]∗)=[[δ(A),B]∗,C]∗+[[A,δ(B)]∗,C]∗+[[A,B]∗,δ(C)]∗ for any A,B,C∈A with A∗B∗C=0.
As usual, C denotes the complex number field. Let H be a complex Hilbert space and B(H) be the algebra of all bounded linear operators on H. Let A⊆B(H) be a factor von Neumann algebra (i.e., the center of A is CI, where I is the identity of A). Recall that every factor von Neumann algebra A is prime, that is, for A,B∈A, AXB=0 for all X∈A implies A=0 or B=0.
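As a finite-dimensional toy illustration of the factor condition (again only a numerical sketch, with an arbitrarily chosen size n), one can check that the space of matrices commuting with all matrix units of M_n(C)≅B(C^n) is one-dimensional, i.e., the center is spanned by the identity.

```python
import numpy as np

n = 3                                    # arbitrary size

def unit(i, j):
    # matrix unit E_ij of M_n
    E = np.zeros((n, n))
    E[i, j] = 1.0
    return E

# For vec(X) stacked by columns, vec(GX - XG) = (kron(I, G) - kron(G.T, I)) vec(X),
# so the commutant of all matrix units is the null space of the stacked map below.
I = np.eye(n)
rows = [np.kron(I, unit(i, j)) - np.kron(unit(i, j).T, I)
        for i in range(n) for j in range(n)]
M = np.vstack(rows)

center_dim = n * n - np.linalg.matrix_rank(M)
print("dimension of the commutant of the matrix units in M_%d:" % n, center_dim)  # prints 1
```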
The main result is the following theorem.
Theorem 2.1. Let A be a factor von Neumann algebra acting on a complex Hilbert space H with dimA>1. If a map δ:A→A satisfies
δ([[A,B]∗,C]∗)=[[δ(A),B]∗,C]∗+[[A,δ(B)]∗,C]∗+[[A,B]∗,δ(C)]∗ |
for any A,B,C∈A with A∗B∗C=0, then δ is an additive ∗-derivation.
Let P1∈A be a nontrivial projection. Write P2=I−P1 and Aij=PiAPj (i,j=1,2). Then A=A11+A12+A21+A22. Accordingly, every A∈A decomposes as A=A11+A12+A21+A22 with Aij=PiAPj∈Aij (i,j=1,2).
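The corner decomposition is easy to visualize in M_n(C); the sketch below (sizes, seed, and variable names are arbitrary choices, and this is only a reader's aid) checks that a matrix is the sum of its four corners PiAPj and that a corner product AijBkl vanishes whenever j≠k.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 5, 2                              # ambient size and rank of P1, chosen arbitrarily

P = {1: np.zeros((n, n))}
P[1][:k, :k] = np.eye(k)                 # P1: a nontrivial orthogonal projection
P[2] = np.eye(n) - P[1]                  # P2 = I - P1

def rand_matrix():
    return rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

A, B = rand_matrix(), rand_matrix()
A_c = {(i, j): P[i] @ A @ P[j] for i in (1, 2) for j in (1, 2)}   # corners A_ij = P_i A P_j
B_c = {(i, j): P[i] @ B @ P[j] for i in (1, 2) for j in (1, 2)}

# A is the sum of its four corners
assert np.allclose(A, sum(A_c.values()))

# corner products behave like matrix units: A_ij B_kl = 0 whenever j != k
for (i, j) in A_c:
    for (k2, l) in B_c:
        if j != k2:
            assert np.allclose(A_c[(i, j)] @ B_c[(k2, l)], 0.0)
print("Peirce decomposition and corner multiplication rules hold")
```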
Lemma 2.1. (a) δ(Pi)∗=δ(Pi) (i=1,2);
(b) Piδ(Pi)Pj=−Piδ(Pj)Pj (1≤i≠j≤2).
Proof. (a) It is clear that δ(0)=0. For any X21∈A21, it follows from P∗1P∗1X21=0 and [[P1,P1]∗,X21]∗=0 that
0=δ([[P1,P1]∗,X21]∗)=[[δ(P1),P1]∗,X21]∗+[[P1,δ(P1)]∗,X21]∗+[[P1,P1]∗,δ(X21)]∗=−P1δ(P1)∗X21−X21δ(P1)∗+X21δ(P1)P1+P1δ(P1)X21−X21δ(P1)∗P1+X21δ(P1)∗. | (2.1) |
Multiplying (2.1) by P2 from the left and by P1 from the right, we have X21(δ(P1)P1−δ(P1)∗P1)=0. Then by the primeness of A, we get
P1δ(P1)∗P1=P1δ(P1)P1. | (2.2) |
By P∗1P∗2P2=0 and [[P1,P2]∗,P2]∗=0, we have
0=δ([[P1,P2]∗,P2]∗)=[[δ(P1),P2]∗,P2]∗+[[P1,δ(P2)]∗,P2]∗+[[P1,P2]∗,δ(P2)]∗=δ(P1)P2−P2δ(P1)∗P2−P2δ(P1)∗+P2δ(P1)P2+P1δ(P2)P2−P2δ(P2)∗P1. | (2.3) |
Multiplying (2.3) by P2 from both sides, we see that
P2δ(P1)∗P2=P2δ(P1)P2. | (2.4) |
From P∗1P∗1P2=0 and [[P1,P1]∗,P2]∗=0, we have
0=δ([[P1,P1]∗,P2]∗)=[[δ(P1),P1]∗,P2]∗+[[P1,δ(P1)]∗,P2]∗+[[P1,P1]∗,δ(P2)]∗=−P1δ(P1)∗P2+P2δ(P1)P1+P1δ(P1)P2−P2δ(P1)∗P1. | (2.5) |
Multiplying (2.5) by P1 from the left and by P2 from the right, then
P1δ(P1)∗P2=P1δ(P1)P2. | (2.6) |
Multiplying (2.5) by P2 from the left and by P1 from the right, then
P2δ(P1)∗P1=P2δ(P1)P1. | (2.7) |
It follows from (2.2), (2.4), (2.6) and (2.7) that δ(P1)∗=δ(P1). Similarly, we can obtain that δ(P2)∗=δ(P2).
(b) From P∗2P∗1P2=0 and [[P2,P1]∗,P2]∗=0, we have
0=δ([[P2,P1]∗,P2]∗)=[[δ(P2),P1]∗,P2]∗+[[P2,δ(P1)]∗,P2]∗+[[P2,P1]∗,δ(P2)]∗=−P1δ(P2)∗P2+P2δ(P2)P1+P2δ(P1)P2−δ(P1)P2−P2δ(P1)∗P2+P2δ(P1)∗. | (2.8) |
Multiplying (2.8) by P1 from the left and by P2 from the right, we have P1δ(P1)P2=−P1δ(P2)∗P2. Then P1δ(P1)P2=−P1δ(P2)P2 by (a). Similarly, we can obtain that P2δ(P2)P1=−P2δ(P1)P1.
Lemma 2.2. For any Aij∈Aij (1≤i≠j≤2), we have
Pjδ(Aij)Pi=0. |
Proof. Let A12∈A12. For any X12∈A12, since A∗12X∗12P2=0 and [[A12,X12]∗,P2]∗=0, we have
0=δ([[A12,X12]∗,P2]∗)=[[δ(A12),X12]∗,P2]∗+[[A12,δ(X12)]∗,P2]∗+[[A12,X12]∗,δ(P2)]∗=δ(A12)X12−X12δ(A12)∗P2−X∗12δ(A12)∗+P2δ(A12)X∗12+A12δ(X12)P2−P2δ(X12)∗A∗12−X12A∗12δ(P2)+δ(P2)A12X∗12. | (2.9) |
Multiplying (2.9) by P2 from both sides, we have
0=P2δ(A12)X12−X∗12δ(A12)∗P2. | (2.10) |
Replacing X12 with iX12 in (2.10) yields that
0=P2δ(A12)X12+X∗12δ(A12)∗P2. | (2.11) |
Combining (2.10) and (2.11), we see that P2δ(A12)X12=0. Then P2δ(A12)P1=0 by the primeness of A. Similarly, we can obtain that P1δ(A21)P2=0.
Lemma 2.3. For any A12∈A12,B21∈A21, there exist GA12,B21∈A11,KA12,B21∈A22 such that
δ(A12+B21)=δ(A12)+δ(B21)+GA12,B21+KA12,B21. |
Proof. Let T=δ(A12+B21)−δ(A12)−δ(B21). From P∗2(A12+B21)∗P2=P∗2A∗12P2=P∗2B∗21P2=0 and [[P2,B21]∗,P2]∗=0, we have
[[δ(P2),A12+B21]∗,P2]∗+[[P2,δ(A12+B21)]∗,P2]∗+[[P2,A12+B21]∗,δ(P2)]∗=δ([[P2,A12+B21]∗,P2]∗)=δ([[P2,A12]∗,P2]∗)+δ([[P2,B21]∗,P2]∗)=[[δ(P2),A12+B21]∗,P2]∗+[[P2,δ(A12)+δ(B21)]∗,P2]∗+[[P2,A12+B21]∗,δ(P2)]∗, |
which implies
[[P2,T]∗,P2]∗=0. | (2.12) |
Multiplying (2.12) by P1 from the left, we get T12=0. Similarly, T21=0. Let
GA12,B21=T11,KA12,B21=T22. |
Then GA12,B21∈A11,KA12,B21∈A22, and so δ(A12+B21)=δ(A12)+δ(B21)+GA12,B21+KA12,B21.
Lemma 2.4. (a) Pjδ(Pi)Pj=0 (1≤i≠j≤2);
(b) Piδ(Pi)Pi=0 (i=1,2).
Proof. (a) For any X12∈A12, since P∗1X∗12P1=0 and [[P1,X12]∗,P1]∗=0, we have
0=δ([[P1,X12]∗,P1]∗)=[[δ(P1),X12]∗,P1]∗+[[P1,δ(X12)]∗,P1]∗+[[P1,X12]∗,δ(P1)]∗=−X12δ(P1)P1+P1δ(P1)X∗12+P1δ(X12)P1−δ(X12)P1−P1δ(X12)∗P1+P1δ(X12)∗+X12δ(P1)−δ(P1)X∗12. | (2.13) |
Multiplying (2.13) by P1 from the left and by P2 from the right, we have
P1δ(X12)∗P2+X12δ(P1)P2=0. |
It follows from Lemma 2.2 that X12δ(P1)P2=−(P2δ(X12)P1)∗=0. Then P2δ(P1)P2=0. Similarly, P1δ(P2)P1=0.
(b) For any X21∈A21, from (iX21)∗P∗1P1=0, [[iX21,P1]∗,P1]∗=iX∗21+iX21, Lemma 2.1(a) and Lemma 2.3, there exist GiX∗21,iX21∈A11,KiX∗21,iX21∈A22 such that
δ(iX∗21)+δ(iX21)+GiX∗21,iX21+KiX∗21,iX21=δ([[iX21,P1]∗,P1]∗)=[[δ(iX21),P1]∗,P1]∗+[[iX21,δ(P1)]∗,P1]∗+[[iX21,P1]∗,δ(P1)]∗=δ(iX21)P1−P1δ(iX21)∗P1−P1δ(iX21)∗+P1δ(iX21)P1+iX21δ(P1)P1+iP1δ(P1)X∗21+iX21δ(P1)+iX∗21δ(P1)+iδ(P1)X∗21+iδ(P1)X21. | (2.14) |
Multiplying (2.14) by P2 from the left and by P1 from the right, we have
P2δ(iX∗21)P1=2iX21δ(P1)P1+iP2δ(P1)X21. | (2.15) |
By (2.15), Lemma 2.2 and the fact that P2δ(P1)P2=0, we obtain X21δ(P1)P1=0. Then P1δ(P1)P1=0. Similarly, P2δ(P2)P2=0.
Remark 2.1. Let S=P1δ(P1)P2−P2δ(P1)P1. Then S∗=−S by Lemma 2.1. We define a map Δ:A→A by
Δ(X)=δ(X)−[X,S] |
for any X∈A. Since S∗=−S, the map X↦[X,S] is an additive ∗-derivation of A and thus satisfies the identity below for all A,B,C∈A; consequently, Δ is a map satisfying
Δ([[A,B]∗,C]∗)=[[Δ(A),B]∗,C]∗+[[A,Δ(B)]∗,C]∗+[[A,B]∗,Δ(C)]∗ |
for any A,B,C∈A with A∗B∗C=0. By Lemmas 2.1–2.4, it follows that
(a) Δ(Pi)=0 (i=1,2);
(b) For any Aij∈Aij (1≤i≠j≤2), we have PjΔ(Aij)Pi=0;
(c) For any A12∈A12,B21∈A21, there exist UA12,B21∈A11,VA12,B21∈A22 such that
Δ(A12+B21)=Δ(A12)+Δ(B21)+UA12,B21+VA12,B21. |
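For item (a), the computation behind this remark can be spelled out as a reader's aid (we show the case i=1; the case i=2 is analogous). Using the definition of S,

```latex
\begin{aligned}
P_1S - SP_1 &= P_1\bigl(P_1\delta(P_1)P_2 - P_2\delta(P_1)P_1\bigr)
               - \bigl(P_1\delta(P_1)P_2 - P_2\delta(P_1)P_1\bigr)P_1
             = P_1\delta(P_1)P_2 + P_2\delta(P_1)P_1, \\
\Delta(P_1) &= \delta(P_1) - [P_1,S]
             = \sum_{i,j=1}^{2} P_i\delta(P_1)P_j - P_1\delta(P_1)P_2 - P_2\delta(P_1)P_1
             = P_1\delta(P_1)P_1 + P_2\delta(P_1)P_2 = 0,
\end{aligned}
```

where the last equality is exactly Lemma 2.4.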
Lemma 2.5. Δ(Aii)⊆Aii (i=1,2).
Proof. Let A11∈A11. From A∗11P∗2P2=0, [[A11,P2]∗,P2]∗=0 and Δ(P2)=0, we have
0=Δ([[A11,P2]∗,P2]∗)=[[Δ(A11),P2]∗,P2]∗=Δ(A11)P2−P2Δ(A11)∗P2−P2Δ(A11)∗+P2Δ(A11)P2. | (2.16) |
Multiplying (2.16) by P1 from the left, we get P1Δ(A11)P2=0. Since P∗2A∗11P1=0, [[P2,A11]∗,P1]∗=0 and Δ(P1)=Δ(P2)=0, we have
0=Δ([[P2,A11]∗,P1]∗)=[[P2,Δ(A11)]∗,P1]∗=P2Δ(A11)P1−P1Δ(A11)∗P2. | (2.17) |
Multiplying (2.17) by P2 from the left, we get P2Δ(A11)P1=0. For any X12∈A12, from X∗12A∗11P2=0, [[X12,A11]∗,P2]∗=0 and Δ(P2)=0, we have
0=Δ([[X12,A11]∗,P2]∗)=[[Δ(X12),A11]∗,P2]∗+[[X12,Δ(A11)]∗,P2]∗=−A11Δ(X12)∗P2+P2Δ(X12)A∗11+X12Δ(A11)P2−P2Δ(A11)∗X∗12. | (2.18) |
Multiplying (2.18) by P1 from the left, we get −A11Δ(X12)∗P2+X12Δ(A11)P2=0. It follows from Remark 2.1(b) that X12Δ(A11)P2=A11(P2Δ(X12)P1)∗=0. Then P2Δ(A11)P2=0. Hence Δ(A11)⊆A11. Similarly, Δ(A22)⊆A22.
Lemma 2.6. Δ(Aij)⊆Aij (1≤i≠j≤2).
Proof. Let A12∈A12. Then P2Δ(A12)P1=0 by Remark 2.1(b). For any X12∈A12, from X∗12A∗12P1=0 and Δ(P1)=0, we have
Δ(−A12X∗12+X12A∗12)=Δ([[X12,A12]∗,P1]∗)=[[Δ(X12),A12]∗,P1]∗+[[X12,Δ(A12)]∗,P1]∗=−A12Δ(X12)∗P1+P1Δ(X12)A∗12+X12Δ(A12)P1−Δ(A12)X∗12−P1Δ(A12)∗X∗12+X12Δ(A12)∗. | (2.19) |
Multiplying (2.19) by P2 from the left and by P1 from the right, then by Lemma 2.5, we get P2Δ(A12)X∗12=0. Hence P2Δ(A12)P2=0. Since A∗12X∗12P2=0, [[A12,X12]∗,P2]∗=0 and Δ(P2)=0, we have
0=Δ([[A12,X12]∗,P2]∗)=[[Δ(A12),X12]∗,P2]∗+[[A12,Δ(X12)]∗,P2]∗=Δ(A12)X12−X12Δ(A12)∗P2−X∗12Δ(A12)∗+P2Δ(A12)X∗12+A12Δ(X12)P2−P2Δ(X12)∗A∗12. | (2.20) |
Multiplying (2.20) by P1 from the left and by P2 from the right, then by P2Δ(A12)P2=P2Δ(X12)P2=0, we have P1Δ(A12)X12=0. It follows that P1Δ(A12)P1=0. Therefore Δ(A12)⊆A12. Similarly, Δ(A21)⊆A21.
Lemma 2.7. For any Aii∈Aii,Bij∈Aij,Bji∈Aji (1≤i≠j≤2), we have
(a) Δ(Aii+Bij)=Δ(Aii)+Δ(Bij);
(b) Δ(Aii+Bji)=Δ(Aii)+Δ(Bji).
Proof. (a) Let T=Δ(Aii+Bij)−Δ(Aii)−Δ(Bij). Since (iPj)∗I∗(Aii+Bij)=(iPj)∗I∗Aii=(iPj)∗I∗Bij=0 and [[iPj,I]∗,Aii]∗=0, we have
[[Δ(iPj),I]∗,Aii+Bij]∗+[[iPj,Δ(I)]∗,Aii+Bij]∗+[[iPj,I]∗,Δ(Aii+Bij)]∗=Δ([[iPj,I]∗,Aii+Bij]∗)=Δ([[iPj,I]∗,Aii]∗)+Δ([[iPj,I]∗,Bij]∗)=[[Δ(iPj),I]∗,Aii+Bij]∗+[[iPj,Δ(I)]∗,Aii+Bij]∗+[[iPj,I]∗,Δ(Aii)+Δ(Bij)]∗, |
which implies
[[iPj,I]∗,T]∗=0. | (2.21) |
Multiplying (2.21) by Pi from the left, by Pi from the right, by Pj from both sides, respectively, we get Tij=Tji=Tjj=0. Hence
Δ(Aii+Bij)=Δ(Aii)+Δ(Bij)+Tii. | (2.22) |
For any Xij∈Aij, from (Aii+Bij)∗X∗ijPj=A∗iiX∗ijPj=B∗ijX∗ijPj=0, [[Bij,Xij]∗,Pj]∗=0 and (2.22), we have
[[Δ(Aii)+Δ(Bij)+Tii,Xij]∗,Pj]∗+[[Aii+Bij,Δ(Xij)]∗,Pj]∗+[[Aii+Bij,Xij]∗,Δ(Pj)]∗=[[Δ(Aii+Bij),Xij]∗,Pj]∗+[[Aii+Bij,Δ(Xij)]∗,Pj]∗+[[Aii+Bij,Xij]∗,Δ(Pj)]∗=Δ([[Aii+Bij,Xij]∗,Pj]∗)=Δ([[Aii,Xij]∗,Pj]∗)+Δ([[Bij,Xij]∗,Pj]∗)=[[Δ(Aii)+Δ(Bij),Xij]∗,Pj]∗+[[Aii+Bij,Δ(Xij)]∗,Pj]∗+[[Aii+Bij,Xij]∗,Δ(Pj)]∗. |
This implies
[[Tii,Xij]∗,Pj]∗=0. | (2.23) |
Multiplying (2.23) by Pj from the right, we see that TiiXij=0. Hence Tii=0, and so we obtain (a).
Similarly, we can show that (b) holds.
Lemma 2.8. For any Aij,Bij∈Aij (1≤i≠j≤2), we have
Δ(Aij+Bij)=Δ(Aij)+Δ(Bij). |
Proof. For any A12,B12∈A12, it follows that
[[P1+A12,P2+B12]∗,P2]∗=A12+B12−A∗12−B∗12. | (2.24) |
Then by (2.24) and Remark 2.1(c), there exist UA12+B12,−A∗12−B∗12∈A11, VA12+B12,−A∗12−B∗12 ∈A22 such that
Δ([[P1+A12,P2+B12]∗,P2]∗)=Δ(A12+B12)+Δ(−A∗12−B∗12)+UA12+B12,−A∗12−B∗12+VA12+B12,−A∗12−B∗12. | (2.25) |
From (P1+A12)∗(P2+B12)∗P2=0, Δ(P1)=Δ(P2)=0, (2.25), Lemmas 2.6 and 2.7, we have
Δ(A12+B12)+Δ(−A∗12−B∗12)+UA12+B12,−A∗12−B∗12+VA12+B12,−A∗12−B∗12=Δ([[P1+A12,P2+B12]∗,P2]∗)=[[Δ(A12),P2+B12]∗,P2]∗+[[P1+A12,Δ(B12)]∗,P2]∗=Δ(A12)+Δ(B12)−Δ(A12)∗−Δ(B12)∗. | (2.26) |
Multiplying (2.26) by P1 from the left and by P2 from the right, then by Lemma 2.6 and the fact that UA12+B12,−A∗12−B∗12∈A11,VA12+B12,−A∗12−B∗12∈A22, we see that Δ(A12+B12)=Δ(A12)+Δ(B12). Similarly, we can show that Δ(A21+B21)=Δ(A21)+Δ(B21).
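The identity (2.24) that drives this proof can also be spot-checked numerically. The sketch below is only an illustrative verification on a random instance; the block sizes, the seed, and the helper names adj, skew, and rand12 are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 6, 3                                   # arbitrary sizes; P1 has rank k

P1 = np.zeros((n, n)); P1[:k, :k] = np.eye(k)
P2 = np.eye(n) - P1
adj = lambda X: X.conj().T
skew = lambda A, B: A @ B - B @ adj(A)        # [A, B]_* = AB - BA*

def rand12():
    # random element of A_12 = P1 A P2
    X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return P1 @ X @ P2

A12, B12 = rand12(), rand12()

lhs = skew(skew(P1 + A12, P2 + B12), P2)
rhs = A12 + B12 - adj(A12) - adj(B12)
assert np.allclose(lhs, rhs)                  # identity (2.24)
print("identity (2.24) verified on a random instance")
```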
Lemma 2.9. For any Aii,Bii∈Aii (i=1,2), we have
Δ(Aii+Bii)=Δ(Aii)+Δ(Bii). |
Proof. For any A11,B11∈A11,B12∈A12, from A∗11B∗12P2=0, Δ(P2)=0, [[A11,B12]∗,P2]∗=A11B12−B∗12A∗11, Lemmas 2.5, 2.6 and 2.8, we have
Δ(A11B12)+Δ(−B∗12A∗11)=Δ([[A11,B12]∗,P2]∗)=[[Δ(A11),B12]∗,P2]∗+[[A11,Δ(B12)]∗,P2]∗=Δ(A11)B12+A11Δ(B12)−B∗12Δ(A11)∗−Δ(B12)∗A∗11. | (2.27) |
Multiplying (2.27) by P1 from the left and by P2 from the right, we have
Δ(A11B12)=Δ(A11)B12+A11Δ(B12). | (2.28) |
Similarly, we can show that
Δ(A22B21)=Δ(A22)B21+A22Δ(B21). | (2.29) |
For any X12∈A12, it follows from Lemma 2.8 and (2.28) that
Δ(A11+B11)X12+(A11+B11)Δ(X12)=Δ((A11+B11)X12)=Δ(A11X12)+Δ(B11X12)=Δ(A11)X12+A11Δ(X12)+Δ(B11)X12+B11Δ(X12). |
It follows that (Δ(A11+B11)−Δ(A11)−Δ(B11))X12=0. Then Δ(A11+B11)=Δ(A11)+Δ(B11). Similarly, we can show that Δ(A22+B22)=Δ(A22)+Δ(B22).
Lemma 2.10. For any A12∈A12,B21∈A21, we have
Δ(A12+B21)=Δ(A12)+Δ(B21). |
Proof. For any X12∈A12, by X∗12(A12+B21)∗P1=X∗12A∗12P1=X∗12B∗21P1=0,
[[X12,A12+B21]∗,P1]∗=[[X12,A12]∗,P1]∗+[[X12,B21]∗,P1]∗∈A11, |
Remark 2.1(c) and Lemma 2.9, there exist UA12,B21∈A11,VA12,B21∈A22 such that
[[Δ(X12),A12+B21]∗,P1]∗+[[X12,Δ(A12)+Δ(B21)+UA12,B21+VA12,B21]∗,P1]∗+[[X12,A12+B21]∗,Δ(P1)]∗=Δ([[X12,A12+B21]∗,P1]∗)=Δ([[X12,A12]∗,P1]∗)+Δ([[X12,B21]∗,P1]∗)=[[Δ(X12),A12+B21]∗,P1]∗+[[X12,Δ(A12)+Δ(B21)]∗,P1]∗+[[X12,A12+B21]∗,Δ(P1)]∗. |
Then
0=[[X12,UA12,B21+VA12,B21]∗,P1]∗=−VA12,B21X∗12+X12V∗A12,B21. | (2.30) |
Multiplying (2.30) by P1 from the right, we get VA12,B21X∗12=0. Hence VA12,B21=0. Then by Remark 2.1(c), we get
Δ(A12+B21)=Δ(A12)+Δ(B21)+UA12,B21. | (2.31) |
For any X21∈A21, from X∗21(A12+B21)∗P2=X∗21A∗12P2=X∗21B∗21P2=0,
[[X21,A12+B21]∗,P2]∗=[[X21,A12]∗,P2]∗+[[X21,B21]∗,P2]∗∈A22, |
Lemma 2.9 and (2.31), we have
[[Δ(X21),A12+B21]∗,P2]∗+[[X21,Δ(A12)+Δ(B21)+UA12,B21]∗,P2]∗+[[X21,A12+B21]∗,Δ(P2)]∗=Δ([[X21,A12+B21]∗,P2]∗)=Δ([[X21,A12]∗,P2]∗)+Δ([[X21,B21]∗,P2]∗)=[[Δ(X21),A12+B21]∗,P2]∗+[[X21,Δ(A12)+Δ(B21)]∗,P2]∗+[[X21,A12+B21]∗,Δ(P2)]∗,
which implies
0=[[X21,UA12,B21]∗,P2]∗=−UA12,B21X∗21+X21U∗A12,B21. | (2.32) |
Multiplying (2.32) by P2 from the right, we obtain UA12,B21X∗21=0. Then UA12,B21=0. Hence we obtain the desired result.
Lemma 2.11. For any A11∈A11,B12∈A12,C21∈A21,D22∈A22, we have
(a) Δ(A11+B12+C21)=Δ(A11)+Δ(B12)+Δ(C21);
(b) Δ(B12+C21+D22)=Δ(B12)+Δ(C21)+Δ(D22).
Proof. (a) Let T=Δ(A11+B12+C21)−Δ(A11)−Δ(B12)−Δ(C21). From P∗2(A11+B12+C21)∗P2=P∗2A∗11P2=P∗2B∗12P2=P∗2C∗21P2=0 and [[P2,A11]∗,P2]∗=[[P2,C21]∗,P2]∗=0, we have
[[Δ(P2),A11+B12+C21]∗,P2]∗+[[P2,Δ(A11+B12+C21)]∗,P2]∗+[[P2,A11+B12+C21]∗,Δ(P2)]∗=Δ([[P2,A11+B12+C21]∗,P2]∗)=Δ([[P2,A11]∗,P2]∗)+Δ([[P2,B12]∗,P2]∗)+Δ([[P2,C21]∗,P2]∗)=[[Δ(P2),A11+B12+C21]∗,P2]∗+[[P2,Δ(A11)+Δ(B12)+Δ(C21)]∗,P2]∗+[[P2,A11+B12+C21]∗,Δ(P2)]∗. |
This implies
[[P2,T]∗,P2]∗=0. | (2.33) |
Multiplying (2.33) by P1 from the left, we have T12=0. For any X12∈A12, from P∗1X∗12(A11+B12+C21)=P∗1X∗12A11=P∗1X∗12B12=P∗1X∗12C21=0, [[P1,X12]∗,A11+B12+C21]∗=X12C21−B12X∗12 and Lemma 2.9, we have
[[Δ(P1),X12]∗,A11+B12+C21]∗+[[P1,Δ(X12)]∗,A11+B12+C21]∗+[[P1,X12]∗,Δ(A11+B12+C21)]∗=Δ([[P1,X12]∗,A11+B12+C21]∗)=Δ(X12C21)+Δ(−B12X∗12)=Δ([[P1,X12]∗,A11]∗)+Δ([[P1,X12]∗,B12]∗)+Δ([[P1,X12]∗,C21]∗)=[[Δ(P1),X12]∗,A11+B12+C21]∗+[[P1,Δ(X12)]∗,A11+B12+C21]∗+[[P1,X12]∗,Δ(A11)+Δ(B12)+Δ(C21)]∗, |
which implies
[[P1,X12]∗,T]∗=0. | (2.34) |
Multiplying (2.34) by P1 from both sides, we obtain X12TP1−P1TX∗12=0. Then X12TP1=0 by T12=0. Hence T21=0. Multiplying (2.34) by P2 from the right, we have X12TP2=0 and so T22=0. Let SA11,B12,C21=T11. Then SA11,B12,C21∈A11 and
Δ(A11+B12+C21)=Δ(A11)+Δ(B12)+Δ(C21)+SA11,B12,C21. |
Similarly, there exists a RB12,C21,D22∈A22 such that
Δ(B12+C21+D22)=Δ(B12)+Δ(C21)+Δ(D22)+RB12,C21,D22. | (2.35) |
For any X21∈A21, by [[P2,X21]∗,A11+B12+C21]∗=−A11X∗21+X21A11+X21B12−C21X∗21 and (2.35), there exists a R−A11X∗21,X21A11,X21B12−C21X∗21∈A22 such that
Δ([[P2,X21]∗,A11+B12+C21]∗)=Δ(−A11X∗21)+Δ(X21A11)+Δ(X21B12−C21X∗21)+R−A11X∗21,X21A11,X21B12−C21X∗21. | (2.36) |
From P∗2X∗21(A11+B12+C21)=P∗2X∗21A11=P∗2X∗21B12=P∗2X∗21C21=0, (2.36), Lemmas 2.9 and 2.10, we have
[[Δ(P2),X21]∗,A11+B12+C21]∗+[[P2,Δ(X21)]∗,A11+B12+C21]∗+[[P2,X21]∗,Δ(A11+B12+C21)]∗=Δ([[P2,X21]∗,A11+B12+C21]∗)=Δ(−A11X∗21)+Δ(X21A11)+Δ(X21B12−C21X∗21)+R−A11X∗21,X21A11,X21B12−C21X∗21=Δ(−A11X∗21+X21A11)+Δ(X21B12)+Δ(−C21X∗21)+R−A11X∗21,X21A11,X21B12−C21X∗21=Δ([[P2,X21]∗,A11]∗)+Δ([[P2,X21]∗,B12]∗)+Δ([[P2,X21]∗,C21]∗)+R−A11X∗21,X21A11,X21B12−C21X∗21=[[Δ(P2),X21]∗,A11+B12+C21]∗+[[P2,Δ(X21)]∗,A11+B12+C21]∗+[[P2,X21]∗,Δ(A11)+Δ(B12)+Δ(C21)]∗+R−A11X∗21,X21A11,X21B12−C21X∗21. |
It follows that
[[P2,X21]∗,T]∗=R−A11X∗21,X21A11,X21B12−C21X∗21. | (2.37) |
Multiplying (2.37) by P1 from the right, then by R−A11X∗21,X21A11,X21B12−C21X∗21∈A22, we obtain X21TP1=0. Hence SA11,B12,C21=T11=0, and so Δ(A11+B12+C21)=Δ(A11)+Δ(B12)+Δ(C21).
Similarly, we can show that (b) holds.
Lemma 2.12. For any A11∈A11,B12∈A12,C21∈A21,D22∈A22, we have
Δ(A11+B12+C21+D22)=Δ(A11)+Δ(B12)+Δ(C21)+Δ(D22). |
Proof. Let T=Δ(A11+B12+C21+D22)−Δ(A11)−Δ(B12)−Δ(C21)−Δ(D22). From (A11+B12+C21+D22)∗P∗1P2=A∗11P∗1P2=B∗12P∗1P2=C∗21P∗1P2=D∗22P∗1P2=0 and [[A11+B12+C21+D22,P1]∗,P2]∗=−C∗21+C21, we have
[[Δ(A11+B12+C21+D22),P1]∗,P2]∗+[[A11+B12+C21+D22,Δ(P1)]∗,P2]∗+[[A11+B12+C21+D22,P1]∗,Δ(P2)]∗=Δ([[A11+B12+C21+D22,P1]∗,P2]∗)=Δ([[A11,P1]∗,P2]∗)+Δ([[B12,P1]∗,P2]∗)+Δ([[C21,P1]∗,P2]∗)+Δ([[D22,P1]∗,P2]∗)=[[Δ(A11)+Δ(B12)+Δ(C21)+Δ(D22),P1]∗,P2]∗+[[A11+B12+C21+D22,Δ(P1)]∗,P2]∗+[[A11+B12+C21+D22,P1]∗,Δ(P2)]∗. |
This implies
[[T,P1]∗,P2]∗=0. | (2.38) |
Multiplying (2.38) by P2 from the left, we have T21=0. Similarly, T12=0. For any X12∈A12, from P∗1X∗12(A11+B12+C21+D22)=P∗1X∗12A11=P∗1X∗12B12=P∗1X∗12C21=P∗1X∗12D22=0, [[P1,X12]∗,A11+B12+C21+D22]∗=X12C21−B12X∗12+X12D22−D22X∗12, Lemmas 2.9–2.11, we have
[[Δ(P1),X12]∗,A11+B12+C21+D22]∗+[[P1,Δ(X12)]∗,A11+B12+C21+D22]∗+[[P1,X12]∗,Δ(A11+B12+C21+D22)]∗=Δ([[P1,X12]∗,A11+B12+C21+D22]∗)=Δ(X12C21)+Δ(−B12X∗12)+Δ(X12D22−D22X∗12)=Δ([[P1,X12]∗,A11]∗)+Δ([[P1,X12]∗,B12]∗)+Δ([[P1,X12]∗,C21]∗)+Δ([[P1,X12]∗,D22]∗)=[[Δ(P1),X12]∗,A11+B12+C21+D22]∗+[[P1,Δ(X12)]∗,A11+B12+C21+D22]∗+[[P1,X12]∗,Δ(A11)+Δ(B12)+Δ(C21)+Δ(D22)]∗. |
This implies
[[P1,X12]∗,T]∗=0. | (2.39) |
Multiplying (2.39) by P2 from the right, we obtain X12TP2=0. Then T22=0. Similarly, T11=0. Hence we obtain the desired result.
Lemma 2.13. For any Aii,Bii∈Aii,Aij,Bij∈Aij,Bji∈Aji,Bjj∈Ajj (1≤i≠j≤2), we have
(a) Δ(AiiBij)=Δ(Aii)Bij+AiiΔ(Bij);
(b) Δ(AijBjj)=Δ(Aij)Bjj+AijΔ(Bjj);
(c) Δ(AiiBii)=Δ(Aii)Bii+AiiΔ(Bii);
(d) Δ(AijBji)=Δ(Aij)Bji+AijΔ(Bji).
Proof. (a) It follows from (2.28) and (2.29) that (a) holds.
(b) Let A12∈A12,B22∈A22. From A∗12B∗22P2=0, Δ(P2)=0, [[A12,B22]∗,P2]∗=A12B22−B∗22A∗12, Lemmas 2.5, 2.6 and 2.12, we have
Δ(A12B22)+Δ(−B∗22A∗12)=Δ([[A12,B22]∗,P2]∗)=[[Δ(A12),B22]∗,P2]∗+[[A12,Δ(B22)]∗,P2]∗=Δ(A12)B22+A12Δ(B22)−B∗22Δ(A12)∗−Δ(B22)∗A∗12. | (2.40) |
Multiplying (2.40) by P1 from the left and by P2 from the right, we have Δ(A12B22)=Δ(A12)B22+A12Δ(B22). Similarly, Δ(A21B11)=Δ(A21)B11+A21Δ(B11).
(c) Let A11,B11∈A11,X12∈A12. It follows from (a) that
Δ(A11B11)X12+A11B11Δ(X12)=Δ(A11B11X12)=Δ(A11)B11X12+A11Δ(B11X12)=Δ(A11)B11X12+A11Δ(B11)X12+A11B11Δ(X12). |
It follows that (Δ(A11B11)−Δ(A11)B11−A11Δ(B11))X12=0. Hence Δ(A11B11)=Δ(A11)B11+A11Δ(B11). Similarly, Δ(A22B22)=Δ(A22)B22+A22Δ(B22).
(d) Let A12∈A12,B21∈A21. From B∗21P∗1A12=0, Δ(P1)=0, [[B21,P1]∗,A12]∗=B21A12+A12B21, Lemmas 2.6 and 2.12, we have
Δ(B21A12)+Δ(A12B21)=Δ([[B21,P1]∗,A12]∗)=[[Δ(B21),P1]∗,A12]∗+[[B21,P1]∗,Δ(A12)]∗=Δ(B21)A12+A12Δ(B21)+B21Δ(A12)+Δ(A12)B21. | (2.41) |
Multiplying (2.41) by P1 from both sides, we have Δ(A12B21)=Δ(A12)B21+A12Δ(B21). Similarly, Δ(A21B12)=Δ(A21)B12+A21Δ(B12).
Now, we give the proof of Theorem 2.1 in the following.
Proof of Theorem 2.1. By Lemmas 2.5, 2.6, 2.8, 2.9, 2.12 and 2.13, it is easy to verify that Δ is an additive derivation on A. Let Aij∈Aij (1≤i≠j≤2). By A∗ijP∗jPj=0, Δ(Pj)=0 and Lemma 2.6, we have
Δ(Aij)−Δ(A∗ij)=Δ([[Aij,Pj]∗,Pj]∗)=[[Δ(Aij),Pj]∗,Pj]∗=Δ(Aij)−Δ(Aij)∗. |
It follows that
Δ(A∗ij)=Δ(Aij)∗. | (2.42) |
Let Aii∈Aii, Xji∈Aji (1≤i≠j≤2). Since A∗iiP∗iXji=0, Δ(Pi)=0, [[Aii,Pi]∗,Xji]∗=XjiAii−XjiA∗ii, Lemmas 2.5, 2.6 and 2.13(b), we have
Δ(Xji)Aii+XjiΔ(Aii)−Δ(Xji)A∗ii−XjiΔ(A∗ii)=Δ(XjiAii)−Δ(XjiA∗ii)=Δ([[Aii,Pi]∗,Xji]∗)=[[Δ(Aii),Pi]∗,Xji]∗+[[Aii,Pi]∗,Δ(Xji)]∗=Δ(Xji)Aii+XjiΔ(Aii)−Δ(Xji)A∗ii−XjiΔ(Aii)∗.
It follows that Xji(Δ(A∗ii)−Δ(Aii)∗)=0. Then
Δ(A∗ii)=Δ(Aii)∗. | (2.43) |
For any A∈A, we have A=A11+A12+A21+A22 with Aij=PiAPj. By (2.42), (2.43) and the additivity of Δ on A, it follows that
Δ(A∗)=Δ(A∗11)+Δ(A∗12)+Δ(A∗21)+Δ(A∗22)=Δ(A11)∗+Δ(A12)∗+Δ(A21)∗+Δ(A22)∗=Δ(A)∗.
Hence Δ is an additive ∗-derivation. Therefore, δ is an additive ∗-derivation on A by Remark 2.1.
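A remark complementing Theorem 2.1 (immediate from the definitions, though not stated explicitly above): the converse also holds, since every additive ∗-derivation d of A satisfies the displayed identity for all A,B,C∈A, in particular whenever A∗B∗C=0:

```latex
\begin{aligned}
d([A,B]_*) &= d(AB - BA^*) = d(A)B + Ad(B) - d(B)A^* - Bd(A)^*
            = [d(A),B]_* + [A,d(B)]_*, \\
d\bigl([[A,B]_*,C]_*\bigr) &= [d([A,B]_*),C]_* + [[A,B]_*,d(C)]_*
            = [[d(A),B]_*,C]_* + [[A,d(B)]_*,C]_* + [[A,B]_*,d(C)]_*.
\end{aligned}
```

Hence Theorem 2.1 describes these maps exactly.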
In this paper, we characterized a class of non-global nonlinear skew Lie triple derivations on factor von Neumann algebras and showed that every such map is an additive ∗-derivation.
This research was supported by the Scientific Research Project of Shangluo University (21SKY104).
The authors declare that there are no conflicts of interest.