In this paper, we consider the following bound constrained nonlinear system of equations:

    $$ F(x)=0, \quad \text{s.t. } x\in\Omega, \tag{1.1} $$

    where $F(x)=(F_1(x),F_2(x),\ldots,F_n(x))^T$ and each $F_i:\mathbb{R}^n\to\mathbb{R}$ is a nonlinear continuously differentiable function whose gradient is available. We denote by $F'(x)=(\nabla F_1(x),\nabla F_2(x),\ldots,\nabla F_n(x))^T$ the Jacobian matrix of $F$ at a given point $x$. The set $\Omega\subseteq\mathbb{R}^n$ is defined as

    $$ \Omega:=\{x\in\mathbb{R}^n \mid l_i\le x_i\le u_i,\ i=1,2,\ldots,n\} $$

    for some given lower and upper bounds satisfying $-\infty\le l_i<u_i\le+\infty$ for all $i=1,2,\ldots,n$.

Bound constrained nonlinear equations of the type (1.1) arise in many practical problems. Several different mathematical programming problems, such as Karush-Kuhn-Tucker systems and complementarity problems, can be reformulated as problem (1.1); see [1,2,3,4,5,6,7,8,9]. On the other hand, in many cases the function $F_i(x)$ is not defined on the whole space $\mathbb{R}^n$, and one usually puts suitable bounds on some or all of the variables.

The Newton-type method is one of the most important numerical methods for problem (1.1), and many researchers have studied it [7,10,11,12,13,14]. Given a current iterate $x_k\in\Omega$, the Newton method considers the least-squares solution $d_k$ of the following constrained subproblem:

    $$ \min\ \frac{1}{2}\|F(x_k)+F'(x_k)d\|^2 \quad \text{s.t. } x_k+d\in\Omega. \tag{1.2} $$

    We set the next iterate to be $x_{k+1}=x_k+d_k$ and call (1.2) the constrained Gauss-Newton method.

Another natural possibility is to consider solving the basic unconstrained Newton equation:

    $$ F(x_k)+F'(x_k)d=0. \tag{1.3} $$

    Denote the solution of (1.3) by $d_k^N$ if it exists, and then define $x_{k+1}$ as the projection of $x_k+d_k^N$ onto $\Omega$. This scheme can be called the projected Newton method.
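As a one-dimensional illustration, the projected Newton scheme — solve (1.3), then project $x_k+d_k^N$ back onto $\Omega$ — can be sketched as follows. The function $F(x)=e^x-1$ and the interval are hypothetical illustration choices, not taken from the paper's experiments.

```python
import math

# Projected Newton method in one dimension: take the unconstrained Newton
# step from (1.3), then clamp the trial point back into Omega = [lo, hi].
def projected_newton_1d(F, dF, x, lo, hi, tol=1e-10, max_iter=50):
    for _ in range(max_iter):
        if abs(F(x)) <= tol:
            break
        d = -F(x) / dF(x)              # unconstrained Newton step (1.3)
        x = min(max(x + d, lo), hi)    # projection onto [lo, hi]
    return x

F = lambda t: math.exp(t) - 1.0        # hypothetical test function, root at 0
dF = lambda t: math.exp(t)
root = projected_newton_1d(F, dF, x=1.5, lo=-1.0, hi=2.0)
print(root)  # close to the solution x = 0
```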

On the other hand, there are many variants of the Newton-type method, such as the constrained Levenberg-Marquardt method [6,15,16], which solves the following subproblem:

    $$ \min\ \frac{1}{2}\|F(x_k)+F'(x_k)d\|^2+\sigma\|d\|^2 \quad \text{s.t. } x_k+d\in\Omega, \tag{1.4} $$

    where $\sigma$ is a positive constant.

Along with constrained versions of the methods in question, one can also consider their projected variants. The projected Levenberg-Marquardt method has been proposed in [15]; its iteration consists of finding the solution $d_k^{LM}$ of the unconstrained subproblem

    $$ \min\ \frac{1}{2}\|F(x_k)+F'(x_k)d\|^2+\sigma\|d\|^2, \tag{1.5} $$

    and then defining the next iterate $x_{k+1}$ as the projection of $x_k+d_k^{LM}$ onto $\Omega$.

The Newton iteration can be costly, since partial derivatives must be computed and the linear system (1.3) must be solved at every iteration. This fact motivates the development of quasi-Newton methods [10,14,17], which are defined by the generalization of (1.3) given by

    $$ F(x_k)+B_k d=0. \tag{1.6} $$

    In quasi-Newton methods, the matrices $B_k$ are intended to be approximations of $F'(x_k)$ and are updated by quasi-Newton formulas. Another well-known class of algorithms is the trust-region-type algorithms; see, for example, [3,4,9,18,19,20,21].

Whether with the Newton method or a quasi-Newton method, one has to solve a linear system of full dimension, which is expensive for large-scale problems. To overcome this drawback, active set methods have been developed by many authors [7,12,13,22]. Since only a linear system of reduced dimension has to be dealt with at each iteration, active set Newton methods are more efficient than the full Newton method, especially for large-scale problems.

To prove global convergence of the methods outlined above, one often assumes that the iteration sequence is contained in a bounded set. If $l_i$ and $u_i$ are bounded and the algorithm generates a feasible sequence, this assumption holds automatically. Otherwise, one often assumes that the level set is bounded. For unconstrained systems of nonlinear equations, Solodov [23] designed a Newton method with a projection technique; the method generates a bounded iteration sequence without additional assumptions, and global convergence is obtained. Motivated by the idea of Solodov [23], in this paper we extend the method to the constrained equations (1.1). By using the active set strategy, we only need to solve a linear system of reduced dimension at each iteration. The algorithm generates a bounded sequence automatically even if $l_i$ and $u_i$ are infinite. We obtain global convergence and give numerical tests to show the efficiency of the proposed algorithm.

The paper is organized as follows: In Section 2, we describe our algorithm in detail. In Section 3, we prove the global convergence of the proposed algorithm. Some numerical tests are shown in Section 4, and a conclusion is given in Section 5. Throughout this paper, we use $\|\cdot\|$ to denote the 2-norm, and $E$ denotes the identity matrix.

We now describe our active set quasi-Newton method with projection technique in detail. To this end, we introduce the projection operator, defined as a mapping from $\mathbb{R}^n$ onto a nonempty closed convex subset $\Omega$:

    $$ P_\Omega(x)=\arg\min\{\|y-x\| \mid y\in\Omega\}, \quad \forall x\in\mathbb{R}^n. \tag{2.1} $$

    A well-known property of this operator is that it is nonexpansive, namely,

    $$ \|P_\Omega(x)-P_\Omega(y)\|\le\|x-y\|, \quad \forall x,y\in\mathbb{R}^n. \tag{2.2} $$
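For a box $\Omega$ as in (1.1), the projection (2.1) reduces to a componentwise clamp. The following sketch (plain Python, illustrative values only) also checks the nonexpansiveness property (2.2) on one pair of points:

```python
# Projection onto the box Omega = {x : l_i <= x_i <= u_i}: clamp each component.
def project_box(x, l, u):
    return [min(max(xi, li), ui) for xi, li, ui in zip(x, l, u)]

def norm2(v):
    return sum(vi * vi for vi in v) ** 0.5

l, u = [0.0, 0.0, 0.0], [1.0, 1.0, 1.0]
x, y = [2.0, -0.5, 0.3], [0.8, 3.0, -1.0]
px, py = project_box(x, l, u), project_box(y, l, u)

lhs = norm2([a - b for a, b in zip(px, py)])   # ||P(x) - P(y)||
rhs = norm2([a - b for a, b in zip(x, y)])     # ||x - y||
print(px, lhs <= rhs)  # [1.0, 0.0, 0.3] True
```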

Given a current iterate $x_k$, let

    $$ \delta_k:=\min\{\delta,\ c\|F(x_k)\|\}, $$

    where $\delta$ and $c$ are positive constants such that

    $$ \delta\le\frac{1}{2}\min_{i=1,2,\ldots,n}|u_i-l_i|, $$

    and define the index sets

    $$ A_k:=\{i\in\{1,2,\ldots,n\} \mid x_k^i-l_i\le\delta_k \ \text{or}\ u_i-x_k^i\le\delta_k\}, $$
    $$ I_k:=\{1,2,\ldots,n\}\setminus A_k=\{i \mid l_i+\delta_k<x_k^i<u_i-\delta_k\}. $$
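The quantities $\delta_k$, $A_k$, and $I_k$ can be computed directly from these definitions; a minimal sketch with hypothetical data ($\delta$ and $c$ as chosen in the numerical section):

```python
# Compute delta_k and the active / inactive index sets A_k, I_k for a box [l, u].
def index_sets(xk, Fk_norm, l, u, delta=1e-3, c=1.0):
    delta_k = min(delta, c * Fk_norm)
    A = [i for i in range(len(xk))
         if xk[i] - l[i] <= delta_k or u[i] - xk[i] <= delta_k]
    I = [i for i in range(len(xk)) if i not in A]
    return delta_k, A, I

l, u = [0.0, 0.0, 0.0], [1.0, 1.0, 1.0]
xk = [0.0005, 0.5, 0.9999]    # first and last components lie within delta_k of a bound
delta_k, A, I = index_sets(xk, Fk_norm=2.0, l=l, u=u)
print(delta_k, A, I)  # 0.001 [0, 2] [1]
```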

    The precise statement of our algorithm is as follows:

    Algorithm 2.1: (Active Set-type Quasi-Newton Method)

(S.0) Choose a positive definite matrix $B_0$, a starting point $x_0\in[l,u]$, and parameters $\beta\in(0,1)$, $\lambda\in(0,1)$, $\delta>0$, $c>0$, $\varepsilon>0$, $\mu_k>0$, and $\rho_k\in[0,1)$; set $k:=0$.

(S.1) If $\|F(x_k)\|\le\varepsilon$, stop.

(S.2) Try to compute a vector $d_k\in\mathbb{R}^n$ in the following way:

    For $i\in A_k$, set

    $$ d_k^i=-F_i(x_k)/\big((1-\rho_k)\mu_k\big). \tag{2.3} $$

    For $i\in I_k$, solve the linear system

    $$ (B_k+\mu_k E)_{I_kI_k}\,d_k^{I_k}=-F_{I_k}(x_k)+e_k, \tag{2.4} $$

    where

    $$ \|e_k\|\le\mu_k\rho_k\|d_k^{I_k}\|. $$

(S.3) Find $z_k=x_k+\alpha_k d_k$, where $\alpha_k=\beta^{m_k}$ with $m_k$ being the smallest nonnegative integer $m$ such that

    $$ -\langle F(x_k+\beta^m d_k),\,d_k\rangle\ge\lambda(1-\rho_k)\mu_k\|d_k\|^2. \tag{2.5} $$

(S.4) Compute

    $$ x_{k+1}=P_\Omega\!\left[x_k-\frac{\langle F(z_k),\,x_k-z_k\rangle}{\|F(z_k)\|^2}\,F(z_k)\right]. \tag{2.6} $$

(S.5) Update $B_{k+1}$, set $k:=k+1$, and go to (S.1).
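The steps above can be sketched as a short runnable program under simplifying assumptions: $B_k$ is fixed to the identity (so the linear system (2.4) becomes diagonal), $e_k=0$, and $\mu_k$, $\rho_k$ are constants. This is not the BFGS variant tested later, only an illustration of steps (S.1)-(S.4); the box and starting point are hypothetical.

```python
import math

def aqn_sketch(F, x, l, u, beta=0.5, lam=0.6, delta=1e-3, c=1.0,
               mu=0.5, rho=0.3, eps=1e-6, max_iter=2000):
    n = len(x)
    proj = lambda v: [min(max(v[i], l[i]), u[i]) for i in range(n)]
    nrm = lambda v: math.sqrt(sum(vi * vi for vi in v))
    for _ in range(max_iter):
        Fx = F(x)
        if nrm(Fx) <= eps:                           # (S.1): stopping test
            break
        delta_k = min(delta, c * nrm(Fx))
        d = [0.0] * n                                # (S.2): search direction
        for i in range(n):
            if x[i] - l[i] <= delta_k or u[i] - x[i] <= delta_k:
                d[i] = -Fx[i] / ((1.0 - rho) * mu)   # i in A_k, cf. (2.3)
            else:
                d[i] = -Fx[i] / (1.0 + mu)           # i in I_k, (2.4) with B_k = E
        alpha = 1.0
        thresh = lam * (1.0 - rho) * mu * nrm(d) ** 2
        for _ in range(60):                          # (S.3): line search (2.5)
            Ft = F([x[i] + alpha * d[i] for i in range(n)])
            if -sum(Ft[i] * d[i] for i in range(n)) >= thresh:
                break
            alpha *= beta
        z = [x[i] + alpha * d[i] for i in range(n)]
        Fz = F(z)
        if nrm(Fz) <= eps:                           # z itself solves the system
            return z
        t = sum(Fz[i] * (x[i] - z[i]) for i in range(n)) / nrm(Fz) ** 2
        x = proj([x[i] - t * Fz[i] for i in range(n)])   # (S.4): step (2.6)
    return x

# Illustration on Problem-1-type data: F_i(x) = e^{x_i} - 1 on the box [0, 10]^3.
F = lambda v: [math.exp(vi) - 1.0 for vi in v]
sol = aqn_sketch(F, [0.5, 1.0, 2.0], l=[0.0] * 3, u=[10.0] * 3)
print(max(abs(v) for v in sol))  # near zero
```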

Just as mentioned in [13], throughout this paper we assume that the parameter $\delta>0$ is chosen sufficiently small such that

    $$ \delta\le\frac{1}{2}\min_{i=1,2,\ldots,n}|u_i-l_i|. $$

    This implies that we cannot have $x_k^i-l_i\le\delta_k$ and $u_i-x_k^i\le\delta_k$ for the same index $i\in A_k$.

Our algorithm is somewhat different from the traditional active set Newton method described in [13], where the search direction $d_k$ in (S.2) is computed by the following formulas:

For $i\in A_k$, set

    $$ d_k^i=\begin{cases} l_i-x_k^i & \text{if } x_k^i-l_i\le\delta_k,\\ u_i-x_k^i & \text{if } u_i-x_k^i\le\delta_k. \end{cases} \tag{2.7} $$

For $i\in I_k$, solve the linear system

    $$ F'(x_k)_{I_kI_k}\,d^{I_k}=-F_{I_k}(x_k)-F'(x_k)_{I_kA_k}\,d^{A_k}. \tag{2.8} $$

As described in [13], in order to understand the formula for the computation of the components $d_k^i$ for $i\in I_k$, note that, after a possible permutation of the rows and columns, the standard (unconstrained) Newton equation $F'(x_k)d=-F(x_k)$ can be rewritten as

    $$ \begin{pmatrix} F'(x_k)_{I_kI_k} & F'(x_k)_{I_kA_k} \\ F'(x_k)_{A_kI_k} & F'(x_k)_{A_kA_k} \end{pmatrix} \begin{pmatrix} d^{I_k} \\ d^{A_k} \end{pmatrix} = -\begin{pmatrix} F_{I_k}(x_k) \\ F_{A_k}(x_k) \end{pmatrix}. \tag{2.9} $$

Here we replace (2.7) by (2.3) and (2.8) by (2.4); the main purpose is to guarantee that inequality (2.5) holds. On the other hand, we compute $d^{I_k}$ by (2.4) instead of (2.8), which can be seen as an inexact Newton step.

The matrix $B_k$ is updated by the well-known BFGS rank-two secant formula

    $$ B_{k+1}=B_k-\frac{B_k s_k s_k^T B_k}{s_k^T B_k s_k}+\frac{y_k y_k^T}{y_k^T s_k}, \tag{2.10} $$

    where $y_k=F(x_{k+1})-F(x_k)$ and $s_k=x_{k+1}-x_k$.
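The update (2.10) is easy to sketch on small dense matrices. The curvature safeguard below (skip the update when $y_k^Ts_k$ is not sufficiently positive) is a standard practical addition assumed here; the text itself only states formula (2.10).

```python
# BFGS update (2.10) on plain nested lists, for small dense B_k.
def bfgs_update(B, s, y, tol=1e-12):
    n = len(B)
    Bs = [sum(B[i][j] * s[j] for j in range(n)) for i in range(n)]  # B_k s_k
    sBs = sum(s[i] * Bs[i] for i in range(n))                       # s_k^T B_k s_k
    ys = sum(y[i] * s[i] for i in range(n))                         # y_k^T s_k
    if ys <= tol or sBs <= tol:
        return B            # skip the update to preserve positive definiteness
    return [[B[i][j] - Bs[i] * Bs[j] / sBs + y[i] * y[j] / ys
             for j in range(n)] for i in range(n)]

B0 = [[1.0, 0.0], [0.0, 1.0]]         # B_0 = E
s = [1.0, 0.0]                        # s_k = x_{k+1} - x_k
y = [2.0, 0.0]                        # y_k = F(x_{k+1}) - F(x_k)
B1 = bfgs_update(B0, s, y)
print(B1)  # [[2.0, 0.0], [0.0, 1.0]]
```

Note that the result satisfies the secant condition $B_{k+1}s_k=y_k$, as expected of (2.10).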

In this section, we prove the global convergence of Algorithm 2.1. To this end, we make the following assumptions.

    Assumptions:

(A1) The function $F(x)$ is Lipschitz continuous and monotone, i.e., there exists a positive constant $L$ such that

    $$ \|F(x)-F(y)\|\le L\|x-y\| \tag{3.1} $$

    and

    $$ \langle F(x)-F(y),\,x-y\rangle\ge 0, \quad \forall x,y\in\Omega. \tag{3.2} $$

(A2) The sequence of matrices $\{B_k\}$ is positive definite and bounded, i.e., there exists a positive constant $\kappa$ such that $\|B_k\|\le\kappa$ for all $k$.

We first show that the algorithm is well defined, i.e., that there exists a nonnegative integer $m$ such that (2.5) holds.

Lemma 3.1. Algorithm 2.1 is well defined.

Proof. We prove that inequality (2.5) holds for some nonnegative integer $m$. Suppose that for some index $k$ this is not the case, which means that for all integers $m$ we have

    $$ -\langle F(x_k+\beta^m d_k),\,d_k\rangle<\lambda(1-\rho_k)\mu_k\|d_k\|^2. \tag{3.3} $$

On the other hand,

    $$ \begin{aligned} \lim_{m\to\infty}-\langle F(x_k+\beta^m d_k),\,d_k\rangle &=-\langle F(x_k),\,d_k\rangle=-\langle F_{A_k},\,d_k^{A_k}\rangle-\langle F_{I_k},\,d_k^{I_k}\rangle\\ &=\|F_{A_k}\|^2/\big((1-\rho_k)\mu_k\big)+\big\langle (B_k+\mu_k E)_{I_kI_k}d_k^{I_k}-e_k,\,d_k^{I_k}\big\rangle\\ &\ge(1-\rho_k)\mu_k\|d_k^{A_k}\|^2+\mu_k\|d_k^{I_k}\|^2-\|e_k\|\,\|d_k^{I_k}\|\\ &\ge(1-\rho_k)\mu_k\|d_k^{A_k}\|^2+(1-\rho_k)\mu_k\|d_k^{I_k}\|^2\\ &=(1-\rho_k)\mu_k\|d_k\|^2. \end{aligned} \tag{3.4} $$

    Taking the limit as $m\to\infty$ in (3.3) and comparing with (3.4) yields $\lambda\ge 1$, which contradicts the choice $\lambda\in(0,1)$. Hence inequality (2.5) holds for some nonnegative integer $m$, and the whole algorithm is well defined.

In what follows, we assume that the algorithm generates an infinite iteration sequence. The following result shows that the algorithm generates a bounded sequence automatically; the proof is similar to that of Lemma 3.2 in [24], so we omit it here.

Theorem 3.2. Suppose Assumptions (A1) and (A2) hold, and let the sequences $\{x_k\}$ and $\{z_k\}$ be generated by Algorithm 2.1. Then $\{x_k\}$ and $\{z_k\}$ are both bounded. Furthermore, for any $\bar{x}$ such that $F(\bar{x})=0$, it holds that

    $$ \|x_{k+1}-\bar{x}\|^2\le\|x_k-\bar{x}\|^2-\|x_{k+1}-x_k\|^2, \tag{3.5} $$
    $$ \lim_{k\to\infty}\|x_k-z_k\|=0, \tag{3.6} $$

    and

    $$ \lim_{k\to\infty}\|x_{k+1}-x_k\|=0. \tag{3.7} $$

Now we give the global convergence result for Algorithm 2.1.

    Lemma 3.3. Let $\{x_k\}$ be generated by Algorithm 2.1, suppose Assumptions (A1) and (A2) hold, and assume there exist constants $0<\underline{\rho}<\bar{\rho}<1$ and $0<\underline{\mu}<\bar{\mu}$ such that $\underline{\rho}\le\rho_k\le\bar{\rho}$ and $\underline{\mu}\le\mu_k\le\bar{\mu}$. Then $\{x_k\}$ converges to some $x^*$ such that $F(x^*)=0$.

Proof. By inequality (2.5), we have

    $$ \langle F(z_k),\,x_k-z_k\rangle=-\alpha_k\langle F(z_k),\,d_k\rangle\ge\lambda(1-\rho_k)\mu_k\alpha_k\|d_k\|^2. \tag{3.8} $$

By the definition of $d_k$, we have

    $$ \|d_k^{A_k}\|=\|F_{A_k}\|/\big((1-\rho_k)\mu_k\big)\le\|F_{A_k}\|/\big((1-\bar{\rho})\underline{\mu}\big) \tag{3.9} $$

    and

    $$ \|F_{I_k}\|=\|(B_k+\mu_k E)_{I_kI_k}d_k^{I_k}-e_k\|\ge(1-\rho_k)\mu_k\|d_k^{I_k}\|\ge(1-\bar{\rho})\underline{\mu}\|d_k^{I_k}\|. \tag{3.10} $$

Combining (3.9) and (3.10), we conclude that there exists a positive constant $c_1$ such that

    $$ \|F(x_k)\|\ge c_1\|d_k\|. \tag{3.11} $$

On the other hand, the definition of $d_k$ also gives

    $$ \|F_{A_k}\|=(1-\rho_k)\mu_k\|d_k^{A_k}\|\le(1-\underline{\rho})\bar{\mu}\|d_k^{A_k}\|. \tag{3.12} $$

    From (2.4) and Assumption (A2), we have

    $$ \|F_{I_k}\|\le\|(B_k+\mu_k E)_{I_kI_k}d_k^{I_k}\|+\|e_k\|\le(\kappa+\mu_k+\rho_k\mu_k)\|d_k^{I_k}\|\le\big[\kappa+(1+\bar{\rho})\bar{\mu}\big]\|d_k^{I_k}\|. \tag{3.13} $$

Combining (3.12) and (3.13), we conclude that there exists a positive constant $c_2$ such that

    $$ \|F(x_k)\|\le c_2\|d_k\|. \tag{3.14} $$

Now by (3.8), we obtain

    $$ \|F(z_k)\|\,\|x_k-z_k\|\ge\langle F(z_k),\,x_k-z_k\rangle\ge\lambda(1-\bar{\rho})\underline{\mu}\,\alpha_k\|d_k\|^2. \tag{3.15} $$

    By the continuity of $F(x)$, the boundedness of the sequence $\{z_k\}$, and (3.6), we have

    $$ \lim_{k\to\infty}\alpha_k\|d_k\|^2=0. \tag{3.16} $$

We consider the two possible cases:

    $$ \liminf_{k\to\infty}\|F(x_k)\|=0 \quad \text{or} \quad \liminf_{k\to\infty}\|F(x_k)\|>0. \tag{3.17} $$

In the first case, the continuity of $F$ and the boundedness of $\{x_k\}$ imply that the sequence $\{x_k\}$ has some accumulation point $x^*$ such that $F(x^*)=0$. Since $\bar{x}$ was an arbitrary solution, we can choose $\bar{x}=x^*$ in (3.5). Then the sequence $\{\|x_k-x^*\|\}$ converges, and since $x^*$ is an accumulation point of $\{x_k\}$, it must be the case that $\{x_k\}$ converges to $x^*$.

Now consider the second case. From (3.14), we have

    $$ \liminf_{k\to\infty}\|d_k\|>0. $$

    Hence by (3.16), we have

    $$ \liminf_{k\to\infty}\alpha_k=0. $$

(The following proof is very similar to the last part of the proof of Theorem 2.1 in [23]; for completeness, we include it here.) By the stepsize rule, inequality (2.5) is not valid for the value $\beta^{m_k-1}$, i.e.,

    $$ -\langle F(x_k+\beta^{m_k-1}d_k),\,d_k\rangle<\lambda(1-\rho_k)\mu_k\|d_k\|^2. \tag{3.18} $$

Letting $k\to\infty$ along an appropriate subsequence, we get

    $$ -\langle F(x^*),\,d^*\rangle\le\lambda(1-\rho^*)\mu^*\|d^*\|^2, \tag{3.19} $$

    where $x^*$, $d^*$, $\rho^*$, $\mu^*$ denote the limits of the corresponding (sub)sequences. On the other hand, by (3.4), we get

    $$ -\langle F(x^*),\,d^*\rangle\ge(1-\rho^*)\mu^*\|d^*\|^2, \tag{3.20} $$

    which contradicts the choice $\lambda\in(0,1)$. Hence the case $\liminf_{k\to\infty}\|F(x_k)\|>0$ is impossible.

    This completes the proof.

In this section, we demonstrate the numerical performance of Algorithm 2.1 (AQN) and its computational advantage by comparing it with the modified active set Newton (ACTN) method of Kanzow [13] (denoted AKP) and the classical quasi-Newton method with projection (denoted CQN). All presented codes are written in MATLAB R2019 and run on a PC with a 3.30 GHz CPU, 4.0 GB of memory, and the Windows 8 operating system.

We consider ten problems with dimensions n = 1000, 5000, 10000. We use six different starting points, namely:

    $$ x_1=(0.1,0.1,\ldots,0.1)^T, \quad x_2=\Big(\tfrac{1}{2},\tfrac{1}{2^2},\ldots,\tfrac{1}{2^n}\Big)^T, \quad x_3=(2,2,\ldots,2)^T, $$
    $$ x_4=\Big(1,\tfrac{1}{2},\ldots,\tfrac{1}{n}\Big)^T, \quad x_5=\Big(1,1-\tfrac{1}{2},\ldots,1-\tfrac{1}{n}\Big)^T, \quad x_6=\mathrm{rand}(0,1). $$
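These starting points are easy to generate for any dimension; a minimal sketch (the random seed is an assumption for reproducibility, not from the experiments):

```python
import random

# The six starting points for a given dimension n; x6 is drawn componentwise
# from the uniform distribution on (0, 1).
def starting_points(n, seed=0):
    rng = random.Random(seed)
    return {
        "x1": [0.1] * n,
        "x2": [1.0 / 2 ** (i + 1) for i in range(n)],            # (1/2, ..., 1/2^n)
        "x3": [2.0] * n,
        "x4": [1.0 / (i + 1) for i in range(n)],                 # (1, 1/2, ..., 1/n)
        "x5": [1.0] + [1.0 - 1.0 / i for i in range(2, n + 1)],  # (1, 1-1/2, ..., 1-1/n)
        "x6": [rng.random() for _ in range(n)],
    }

pts = starting_points(4)
print(pts["x2"])  # [0.5, 0.25, 0.125, 0.0625]
```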

After several parameter-selection experiments, we selected initial parameters that give all three algorithms good performance:

    $$ \beta=0.5, \quad \lambda=0.6, \quad \delta=0.001, \quad c=1, \quad \mu_k=0.5, \quad \varepsilon=10^{-6}, \quad \rho_k=0.3. $$

    The terminating criterion for the iteration process is $\|F(x_k)\|\le 10^{-6}$. The test problems are listed as follows.

    Problem 1. [25]

$$ F_i(x)=e^{x_i}-1, \quad i=1,2,\ldots,n, \tag{4.1} $$

    where $\Omega=\mathbb{R}^n_+$.

    Problem 2. [25]

$$ F_1(x)=e^{x_1}-1, \quad F_i(x)=e^{x_i}+x_{i-1}-1, \quad i=2,\ldots,n, \tag{4.2} $$

    where $\Omega=\mathbb{R}^n_+$.

    Problem 3. [25]

$$ F_1(x)=2x_1-x_2+e^{x_1}-1, \quad F_i(x)=-x_{i-1}+2x_i-x_{i+1}+e^{x_i}-1, \ i=2,\ldots,n-1, \quad F_n(x)=-x_{n-1}+2x_n+e^{x_n}-1, \tag{4.3} $$

    where $\Omega=\mathbb{R}^n_+$.

    Problem 4. [25]

$$ F_1(x)=\tfrac{5}{2}x_1+x_2-1, \quad F_i(x)=x_{i-1}+\tfrac{5}{2}x_i+x_{i+1}-1, \ i=2,\ldots,n-1, \quad F_n(x)=x_{n-1}+\tfrac{5}{2}x_n-1, \tag{4.4} $$

    where $\Omega=\mathbb{R}^n_+$.

    Problem 5. [25]

$$ F_i(x)=e^{x_i}+\tfrac{3}{2}\sin(2x_i)-1, \quad i=1,2,\ldots,n, \tag{4.5} $$

    where $\Omega=\mathbb{R}^n_+$.

    Problem 6. [25]

$$ F_1(x)=x_1-e^{\cos(h(x_1+x_2))}, \quad F_i(x)=x_i-e^{\cos(h(x_{i-1}+x_i+x_{i+1}))}, \ i=2,\ldots,n-1, \quad F_n(x)=x_n-e^{\cos(h(x_{n-1}+x_n))}, \tag{4.6} $$

    where $h=\frac{1}{n+1}$ and $\Omega=\mathbb{R}^n_+$.

    Problem 7. [25]

$$ F_i(x)=2x_i-\sin|x_i|, \quad i=1,2,\ldots,n, \tag{4.7} $$

    where $\Omega=\mathbb{R}^n_+$.

    Problem 8. [26]

$$ F_i(x)=2^{2x_i}-1, \quad i=1,2,\ldots,n, \tag{4.8} $$

    where $\Omega=\mathbb{R}^n_+$.

    Problem 9. [26]

$$ F_i(x)=e^{x_i^2}+3\sin x_i\cos x_i-1, \quad i=1,2,\ldots,n, \tag{4.9} $$

    where $\Omega=\mathbb{R}^n_+$.

    Problem 10. [24]

$$ F_i(x)=x_i-\sin(|x_i-1|), \quad i=1,2,\ldots,n, \tag{4.10} $$

    where $\Omega=\mathbb{R}^n_+$.
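For reference, a few of the test problems can be written as Python callables mapping a point $x$ to the vector $F(x)$; a sketch (the numbering follows the list above, with $\Omega=\mathbb{R}^n_+$ left to the solver's projection):

```python
import math

def problem1(x):                      # (4.1): F_i = e^{x_i} - 1
    return [math.exp(xi) - 1.0 for xi in x]

def problem4(x):                      # (4.4): tridiagonal linear problem
    n = len(x)
    return [(x[i - 1] if i > 0 else 0.0) + 2.5 * x[i]
            + (x[i + 1] if i < n - 1 else 0.0) - 1.0 for i in range(n)]

def problem7(x):                      # (4.7): F_i = 2 x_i - sin|x_i|
    return [2.0 * xi - math.sin(abs(xi)) for xi in x]

x1 = [0.1] * 5                        # starting point x1 with n = 5
print([round(v, 4) for v in problem7(x1)])  # [0.1002, 0.1002, 0.1002, 0.1002, 0.1002]
```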

Comprehensive results of our numerical experiments are presented in Tables 1-10. The columns of the presented tables have the following meanings:

    Table 1.  Numerical results for Problem 1.
    IP DIM AQN CQN AKP
    NI NF CPU NORM NI NF CPU NORM NI NF CPU NORM
    X1 1000 30 61 3.656 8.63E-07 22 45 9.419 6.82E-07 22 45 1.365 7.86E-07
    5000 32 65 123.832 7.97E-07 23 47 252.343 7.62E-07 23 47 94.233 8.78E-07
    10000 33 67 707.123 7.25E-07 24 49 1204.500 5.39E-07 24 49 801.382 6.21E-07
    X2 1000 28 57 5.164 9.12E-07 19 39 8.252 7.03E-07 18 37 1.256 8.22E-07
    5000 28 57 143.052 9.12E-07 19 39 199.614 7.03E-07 18 37 83.818 8.22E-07
    10000 28 57 779.363 9.12E-07 19 39 913.557 7.03E-07 18 37 609.232 8.22E-07
    X3 1000 34 69 5.087 7.49E-07 26 53 10.410 5.92E-07 27 55 2.115 5.01E-07
    5000 36 73 161.879 6.92E-07 27 55 288.600 6.62E-07 28 57 128.120 5.60E-07
    10000 36 73 853.569 9.79E-07 27 55 1328.800 9.36E-07 28 57 920.577 7.92E-07
    X4 1000 34 69 7.029 8.51E-07 20 41 7.855 9.44E-07 22 45 2.102 5.97E-07
    5000 35 71 208.186 6.74E-07 20 41 210.414 9.44E-07 22 45 100.805 5.97E-07
    10000 35 71 1078.500 9.74E-07 20 41 966.249 9.44E-07 22 45 721.224 5.97E-07
    X5 1000 34 69 5.434 9.47E-07 24 49 9.660 8.28E-07 25 51 1.683 8.35E-07
    5000 36 73 169.626 8.76E-07 25 51 264.494 9.26E-07 26 53 118.489 9.34E-07
    10000 37 75 907.287 7.96E-07 26 53 1266.100 6.55E-07 27 55 910.632 6.61E-07
    X6 1000 34 69 5.624 9.32E-07 24 49 9.475 8.07E-07 25 51 1.702 8.48E-07
    5000 36 73 169.128 8.70E-07 25 51 264.354 9.25E-07 26 53 119.666 9.25E-07
    10000 37 75 912.983 7.91E-07 26 53 1283.200 6.54E-07 27 55 916.142 6.61E-07

    Table 2.  Numerical results for Problem 2.
    IP DIM AQN CQN AKP
    NI NF CPU NORM NI NF CPU NORM NI NF CPU NORM
    X1 1000 60 121 10.393 9.69E-07 40 81 16.089 9.12E-07 45 91 8.107 7.83E-07
    5000 59 119 304.457 9.61E-07 40 81 427.665 8.75E-07 44 89 198.837 9.90E-07
    10000 59 119 1494.000 8.76E-07 40 81 2015.480 8.62E-07 44 89 1590.250 9.57E-07
    X2 1000 72 145 15.477 8.99E-07 46 93 18.481 9.61E-07 47 95 3.669 9.48E-07
    5000 72 145 439.560 8.99E-07 46 93 503.471 9.61E-07 47 95 216.065 9.48E-07
    10000 72 145 2181.400 8.99E-07 46 93 2308.030 9.61E-07 47 95 1613.963 9.48E-07
    X3 1000 76 153 16.627 8.33E-07 48 97 19.281 7.93E-07 40 81 2.786 8.10E-07
    5000 74 149 451.509 9.91E-07 48 97 519.860 7.65E-07 38 77 176.960 9.97E-07
    10000 74 149 2251.200 9.34E-07 48 97 2434.370 7.55E-07 38 77 1279.398 9.46E-07
    X4 1000 75 151 16.529 9.82E-07 48 97 19.329 8.67E-07 50 101 3.162 9.43E-07
    5000 75 151 472.387 9.78E-07 48 97 519.555 8.67E-07 50 101 229.550 9.43E-07
    10000 75 151 2346.100 9.76E-07 48 97 2408.623 8.67E-07 50 101 1657.715 9.43E-07
    X5 1000 74 149 15.728 9.03E-07 47 95 18.880 8.61E-07 50 101 3.553 8.40E-07
    5000 73 147 448.860 9.98E-07 47 95 509.295 8.29E-07 48 97 218.293 7.60E-07
    10000 73 147 2209.300 8.51E-07 47 95 2358.238 8.17E-07 48 97 1607.768 9.27E-07
    X6 1000 90 181 19.039 8.23E-07 56 113 22.527 8.90E-07 57 115 3.537 9.40E-07
    5000 94 189 563.484 8.70E-07 59 119 636.860 8.24E-07 62 125 285.941 7.93E-07
    10000 95 191 2835.600 1.00E-06 60 121 3044.774 8.69E-07 62 125 2105.600 9.51E-07

    Table 3.  Numerical results for Problem 3.
    IP DIM AQN CQN AKP
    NI NF CPU NORM NI NF CPU NORM NI NF CPU NORM
    X1 1000 93 187 19.750 9.30E-07 81 163 33.507 8.00E-07 44 89 3.338 8.25E-07
    5000 92 185 562.208 9.53E-07 84 169 918.799 8.97E-07 56 93 214.676 8.47E-07
    10000 99 199 3150.498 9.36E-07 87 175 4584.800 7.85e-07 48 97 1611.711 9.30E-07
    X2 1000 76 153 16.189 9.89E-07 61 123 25.470 8.65E-07 36 73 2.281 8.46E-07
    5000 76 153 463.761 9.89E-07 61 123 667.653 8.65E-07 36 73 164.605 8.46E-07
    10000 76 153 2325.900 9.89E-07 61 123 3055.546 8.65E-07 36 73 1190.508 8.46E-07
    X3 1000 121 243 31.734 9.47E-07 94 189 38.858 9.03E-07 57 115 3.676 6.93E-07
    5000 106 213 697.003 8.87E-07 100 201 1093.100 8.52E-07 63 127 288.233 6.82E-07
    10000 112 225 3761.469 9.95E-07 101 203 5055.167 8.92E-07 - - - -
    X4 1000 87 175 19.633 9.94E-07 72 145 29.727 8.19E-07 41 83 3.892 7.88E-07
    5000 88 177 569.847 9.97E-07 72 145 791.423 8.20E-07 41 83 186.841 7.94E-07
    10000 88 177 2817.300 9.81E-07 72 145 3597.463 8.20E-07 41 83 1368.882 7.94E-07
    X5 1000 109 219 25.967 9.32E-07 89 179 37.222 8.30E-07 53 107 3.714 9.84E-07
    5000 116 233 788.855 8.80E-07 93 187 1035.100 8.54E-07 57 115 263.580 8.93E-07
    10000 117 235 3934.500 9.97E-07 95 191 4746.173 7.82E-07 - - - -
    X6 1000 107 215 25.975 9.97E-07 95 191 39.617 7.52E-07 53 107 3.527 9.08E-07
    5000 116 233 783.719 8.42E-07 98 197 1090.700 9.45E-07 55 111 255.271 9.85E-07
    10000 120 241 4148.100 9.15E-07 101 203 5045.600 7.46E-07 58 117 1955.841 7.26E-07

    Table 4.  Numerical results for Problem 4.
    IP DIM AQN CQN AKP
    NI NF CPU NORM NI NF CPU NORM NI NF CPU NORM
    X1 1000 96 193 39.937 9.05E-07 96 193 43.074 9.05E-07 62 125 6.113 8.47E-07
    5000 97 195 1080.500 7.73E-07 97 195 1085.200 7.73E-07 64 129 292.586 8.46E-07
    10000 96 193 4915.800 7.99E-07 96 193 4809.600 7.99E-07 66 133 2193.345 9.51E-07
    X2 1000 96 193 39.898 7.99E-07 94 189 38.846 8.52E-07 76 153 4.878 8.00E-07
    5000 94 189 1049.800 8.95E-07 93 187 1015.018 9.44E-07 - - - -
    10000 94 189 4742.816 9.23E-07 92 185 4602.733 9.80E-07 - - - -
    X3 1000 75 151 31.182 8.40E-07 75 151 30.961 8.40E-07 66 133 4.018 9.87E-07
    5000 73 147 802.568 9.18E-07 73 147 796.533 9.18E-07 73 147 337.588 6.90E-07
    10000 75 151 3808.900 7.44E-07 75 151 3748.947 7.44E-07 75 151 2478.365 8.36E-07
    X4 1000 90 181 37.247 9.77E-07 93 187 38.923 8.42E-07 66 133 4.171 9.87E-07
    5000 93 187 1025.700 8.81E-07 92 185 1005.456 9.24E-07 76 153 348.842 9.56E-07
    10000 93 187 4705.600 9.01E-07 92 185 4622.813 9.17E-07 79 159 2634.719 8.62E-07
    X5 1000 91 183 37.749 8.51E-07 93 187 38.567 8.93E-07 70 141 4.794 9.67E-07
    5000 91 183 1002.900 8.29E-07 92 185 1004.607 9.91E-07 75 151 348.323 9.69E-07
    10000 110 221 5628.200 7.64E-07 94 189 4710.946 7.83E-07 75 151 2492.402 9.22E-07
    X6 1000 142 285 58.851 8.81E-07 143 287 60.753 9.66E-07 79 159 7.068 9.92E-07
    5000 151 303 1669.800 8.98E-07 151 303 1653.700 8.47E-07 85 171 393.990 7.41E-07
    10000 154 309 7856.500 9.98E-07 154 309 7653.971 9.64E-07 86 173 2869.170 8.31E-07

    Table 5.  Numerical results for Problem 5.
    IP DIM AQN CQN AKP
    NI NF CPU NORM NI NF CPU NORM NI NF CPU NORM
    X1 1000 11 23 2.351 7.06E-07 11 23 5.609 7.06E-07 30 61 7.252 7.27E-07
    5000 12 25 69.706 1.55E-07 12 25 67.909 1.55E-07 30 61 138.354 9.03E-07
    10000 12 25 352.637 2.19E-07 12 25 348.586 2.19E-07 35 71 1172.793 6.11E-07
    X2 1000 12 25 2.984 9.73E-07 12 25 2.678 6.04E-07 61 123 4.507 7.65E-07
    5000 13 27 80.610 2.13E-07 13 27 79.346 1.33E-07 - - - -
    10000 13 27 405.754 3.02E-07 13 27 398.015 1.88E-07 - - - -
    X3 1000 12 25 2.017 4.61E-07 12 25 1.936 4.61E-07 - - - -
    5000 13 27 61.161 1.82E-07 13 27 60.556 1.82E-07 - - - -
    10000 13 27 323.124 2.58E-07 13 27 319.935 2.58E-07 - - - -
    X4 1000 16 33 4.417 1.21E-07 13 27 3.303 3.32E-07 49 99 3.172 9.84E-07
    5000 14 29 93.207 1.19E-07 13 27 88.261 7.42E-07 65 131 298.947 9.85E-07
    10000 14 29 454.358 1.67E-07 13 27 397.148 1.87E-07 63 127 2097.393 7.91E-07
    X5 1000 17 35 4.812 2.09E-07 13 27 2.720 1.52E-07 59 119 4.057 7.74E-07
    5000 17 35 133.735 4.68E-07 13 27 79.443 3.40E-07 60 121 277.731 9.35E-07
    10000 17 35 659.338 6.62E-07 13 27 396.954 4.81E-07 62 125 2093.674 7.82E-07
    X6 1000 13 27 2.824 1.63E-07 13 27 2.754 1.51E-07 59 119 4.104 8.52E-07
    5000 17 35 134.016 3.29E-07 13 27 79.329 3.39E-07 60 121 297.163 8.30E-07
    10000 17 35 651.481 6.60E-07 13 27 396.469 4.82E-07 59 119 1976.682 8.91E-07

    Table 6.  Numerical results for Problem 6.
    IP DIM AQN CQN AKP
    NI NF CPU NORM NI NF CPU NORM NI NF CPU NORM
    X1 1000 49 99 20.419 7.13E-07 42 85 19.977 8.74E-07 28 57 7.443 9.06E-07
    5000 50 101 549.836 9.05E-07 43 87 484.930 9.65E-07 30 61 136.093 6.79E-07
    10000 51 103 2623.200 8.85E-07 44 89 2252.977 8.71E-07 30 61 1008.218 9.29E-07
    X2 1000 48 97 20.223 6.99E-07 42 85 17.454 9.11E-07 29 59 1.842 9.19E-07
    5000 51 103 560.601 7.23E-07 44 89 491.333 6.51E-07 32 65 149.412 5.94E-07
    10000 51 103 2607.600 8.64E-07 44 89 2251.287 9.07E-07 35 71 1172.464 5.23E-07
    X3 1000 39 79 16.445 7.27E-07 39 79 16.669 7.27E-07 27 55 1.868 7.22E-07
    5000 40 81 440.412 7.64E-07 40 81 439.550 7.64E-07 27 55 123.541 8.95E-07
    10000 41 83 2082.900 7.31E-07 41 83 2039.955 7.31E-07 28 57 934.110 5.24E-07
    X4 1000 48 97 20.053 7.21E-07 42 85 17.886 9.11E-07 28 57 3.648 9.14E-07
    5000 49 99 540.145 7.74E-07 44 89 490.692 8.50E-07 31 63 148.501 6.77E-07
    10000 51 103 2582.600 7.23E-07 44 89 2264.235 6.61E-07 31 63 1213.002 7.78E-07
    X5 1000 43 87 18.038 7.87E-07 42 85 17.649 9.23E-07 28 57 3.153 5.72E-07
    5000 44 89 483.664 9.58E-07 43 87 489.753 9.44E-07 29 59 151.122 8.58E-07
    10000 45 91 2279.000 9.44E-07 45 91 2273.992 5.95E-07 30 61 1025.754 5.67E-07
    X6 1000 44 89 18.394 9.10E-07 42 85 17.362 9.04E-07 28 57 2.144 9.01E-07
    5000 50 101 550.710 8.75E-07 43 87 491.334 9.04E-07 29 59 132.839 8.87E-07
    10000 51 103 2599.800 8.75E-07 44 89 2265.719 9.95E-07 30 61 1022.843 7.92E-07

    Table 7.  Numerical results for Problem 7.
    IP DIM AQN CQN AKP
    NI NF CPU NORM NI NF CPU NORM NI NF CPU NORM
    X1 1000 30 61 3.984 9.52E-07 22 45 9.309 7.52E-07 22 45 10.004 7.55E-07
    5000 32 65 124.936 8.79E-07 23 47 257.722 8.41E-07 23 47 103.523 8.44E-07
    10000 33 67 696.347 7.99E-07 24 49 1205.700 5.95E-07 24 49 850.861 5.97E-07
    X2 1000 30 61 8.726 8.29E-07 20 41 10.117 5.27E-07 20 41 4.586 5.67E-07
    5000 30 61 180.592 8.29E-07 20 41 225.843 5.27E-07 20 41 95.335 5.67E-07
    10000 30 61 886.356 8.29E-07 20 41 1006.400 5.27E-07 20 41 663.478 5.67E-07
    X3 1000 35 71 5.263 6.87E-07 26 53 10.435 8.45E-07 25 51 1.791 9.09E-07
    5000 36 73 160.033 9.88E-07 27 55 286.293 9.45E-07 27 55 123.108 5.08E-07
    10000 37 75 862.174 8.98E-07 28 57 1366.000 6.68E-07 27 55 894.793 7.19E-07
    X4 1000 34 69 7.383 8.53E-07 21 43 8.777 5.35E-07 21 43 1.328 6.68E-07
    5000 36 73 224.577 7.58E-07 21 43 230.923 5.35E-07 21 43 96.340 6.68E-07
    10000 35 71 886.356 8.29E-07 21 43 1051.442 5.35E-07 21 43 693.062 6.68E-07
    X5 1000 35 71 5.908 7.47E-07 24 49 10.020 9.58E-07 25 51 1.524 5.81E-07
    5000 37 75 180.603 6.89E-07 26 53 284.448 5.36E-07 26 53 119.156 6.50E-07
    10000 37 75 944.657 9.75E-07 26 53 1318.300 7.58E-07 26 53 861.209 9.19E-07
    X6 1000 35 71 5.919 7.63E-07 24 49 10.713 9.75E-07 25 51 6.592 5.83E-07
    5000 37 75 180.124 6.98E-07 26 53 284.553 5.40E-07 26 53 119.424 6.50E-07
    10000 37 75 934.966 9.82E-07 26 53 1310.800 7.60E-07 26 53 913.455 9.17E-07

    Table 8.  Numerical results for Problem 8.
    IP DIM AQN CQN AKP
    NI NF CPU NORM NI NF CPU NORM NI NF CPU NORM
    X1 1000 14 29 5.802 7.75E-07 14 29 5.782 7.75E-07 24 49 1.915 7.92E-07
    5000 15 31 165.321 5.08E-07 15 31 164.148 5.08E-07 27 55 124.365 8.29E-07
    10000 15 31 756.748 7.18E-07 15 31 751.927 7.18E-07 28 57 933.741 6.67E-07
    X2 1000 15 31 7.203 8.97E-07 15 31 6.145 3.16E-07 40 81 2.560 7.75E-07
    5000 16 33 176.418 5.87E-07 15 31 164.099 7.08E-07 - - - -
    10000 16 33 816.974 8.31E-07 16 33 802.254 2.93E-07 - - - -
    X3 1000 16 33 6.622 4.32E-07 16 33 6.548 4.32E-07 27 55 1.773 8.73E-07
    5000 16 33 177.685 9.66E-07 16 33 174.795 9.66E-07 31 63 143.292 6.08E-07
    10000 17 35 862.303 4.00E-07 17 35 851.777 4.00E-07 31 63 1043.391 8.67E-07
    X4 1000 18 37 7.461 3.10E-07 15 31 6.472 3.12E-07 28 57 1.923 8.04E-07
    5000 16 33 178.233 6.05E-07 15 31 163.715 7.05E-07 41 83 189.831 9.66E-07
    10000 16 33 814.770 8.43E-07 15 31 750.481 9.99E-07 42 85 1412.292 7.38E-07
    X5 1000 18 37 7.470 2.97E-07 14 29 5.710 9.89E-07 28 57 2.999 7.11E-07
    5000 18 37 197.939 6.65E-07 15 31 163.855 6.48E-07 29 59 132.820 7.97E-07
    10000 18 37 913.455 9.40E-07 15 31 751.494 9.17E-07 30 61 1030.916 5.65E-07
    X6 1000 18 37 7.583 2.97E-07 15 31 6.453 2.96E-07 28 57 1.812 7.13E-07
    5000 18 37 199.224 4.30E-07 15 31 163.758 6.42E-07 29 59 132.158 8.65E-07
    10000 19 39 962.107 2.93E-07 15 31 756.324 9.11E-07 29 59 966.698 6.08E-07

    Table 9.  Numerical results for Problem 9.
    IP DIM AQN CQN AKP
    NI NF CPU NORM NI NF CPU NORM NI NF CPU NORM
    X1 1000 38 77 3.055 8.24E-07 12 25 4.959 5.05E-07 23 47 3.828 5.40E-07
    5000 40 81 113.409 9.88E-07 13 27 143.222 2.82E-07 24 49 109.381 6.04E-07
    10000 42 85 712.896 7.49E-07 13 27 651.289 3.99E-07 24 49 802.195 8.54E-07
    X2 1000 37 75 4.855 8.16E-07 11 23 4.708 3.37E-07 21 43 1.338 7.99E-07
    5000 37 75 153.719 8.16E-07 11 23 120.945 3.37E-07 21 43 94.816 7.99E-07
    10000 37 75 835.797 8.16E-07 11 23 555.333 3.37E-07 21 43 699.885 7.99E-07
    X4 1000 44 89 6.384 8.49E-07 12 25 6.552 5.24E-07 25 51 1.755 5.99E-07
    5000 45 91 187.263 9.32E-07 12 25 132.641 5.24E-07 25 51 114.659 6.45E-07
    10000 47 95 1195.000 9.22E-07 12 25 607.4983 5.24E-07 25 51 834.238 6.56E-07
    X5 1000 43 87 5.098 9.20E-07 14 29 6.816 3.27E-07 31 63 3.127 6.74E-07
    5000 46 93 171.163 8.08E-07 14 29 156.394 7.32E-07 32 65 146.725 7.64E-07
    10000 47 95 967.804 8.34E-07 15 31 764.984 2.59E-07 33 67 1127.503 5.43E-07
    X6 1000 43 87 5.095 9.54E-07 14 29 7.562 3.29E-07 31 63 2.035 6.66E-07
    5000 46 93 170.719 8.49E-07 14 29 157.247 7.28E-07 32 65 152.282 7.68E-07
    10000 47 95 924.105 8.18E-07 15 31 763.091 2.57E-06 33 67 1103.404 5.36E-07

    Table 10.  Numerical results for Problem 10.
    IP DIM AQN CQN AKP
    NI NF CPU NORM NI NF CPU NORM NI NF CPU NORM
    X1 1000 28 57 11.625 5.43E-07 28 57 11.710 5.43E-07 25 51 2.226 6.68E-07
    5000 29 59 323.043 6.46E-07 29 59 316.952 6.46E-07 26 53 118.148 7.47E-07
    10000 29 59 1476.900 9.13E-07 29 59 1452.863 9.13E-07 27 55 895.844 5.28E-07
    X2 1000 25 51 10.099 8.50E-07 25 51 10.236 8.50E-07 30 61 1.998 5.30E-07
    5000 27 55 290.044 5.38E-07 27 55 285.598 5.38E-07 - - - -
    10000 27 55 1328.900 7.61E-07 27 55 1312.528 7.61E-07 - - - -
    X3 1000 28 57 11.242 7.69E-07 28 57 11.411 7.69E-07 26 53 1.740 9.50E-07
    5000 29 59 311.758 9.14E-07 29 59 310.210 9.14E-07 28 57 126.392 5.31E-07
    10000 30 61 1482.600 6.88E-07 30 61 1469.212 6.88E-07 28 57 931.480 7.51E-07
    X4 1000 25 51 10.069 8.57E-07 25 51 10.064 8.57E-07 28 57 1.864 5.98E-07
    5000 27 55 290.624 5.38E-07 27 55 287.511 5.38E-07 29 59 131.109 7.84E-07
    10000 27 55 1331.600 7.61E-07 27 55 1316.705 7.61E-07 30 61 999.476 6.92E-07
    X5 1000 27 55 11.246 6.81E-07 27 55 11.420 6.81E-07 27 55 1.838 9.05E-07
    5000 28 57 309.842 8.10E-07 28 57 309.207 8.10E-07 29 59 135.589 8.78E-07
    10000 29 59 1472.000 6.10E-07 29 59 1471.858 6.10E-07 30 61 1043.962 6.19E-07
    X6 1000 27 55 11.266 6.71E-07 27 55 11.374 6.79E-07 27 55 2.818 6.93E-07
    5000 28 57 310.428 8.08E-07 28 57 308.040 8.07E-07 29 59 147.317 8.16E-07
    10000 29 59 1460.100 6.10E-07 29 59 1459.546 6.08E-07 30 61 1177.273 5.73E-07


    IP: the initial point.

    DIM: the dimension of the problem.

    NI: the number of iterations.

    NF: the number of function evaluations.

    CPU: the CPU time in seconds when the algorithm terminates.

    NORM: the final norm of the residual of the equation at termination.

    We denote the result by '-' whenever the number of iterations exceeds 500 or the terminating criterion has not been satisfied. Among these results, none of the three methods was able to solve Problem 9 when the initial point is x3 = (2, 2, ..., 2)^T; therefore, Table 9 does not include the case of initial point x3. Meanwhile, in the drawing process, when the result is denoted by '-', its NI, NF, CPU, and NORM are counted as ∞.

    The performance of the three methods was evaluated using the performance profile proposed by Dolan and Moré [27]. In each experiment we compare the three methods on the same problem, dimension, and initial point, recording information of interest such as NI, NF, CPU, and NORM.

    We denote the set of problems as P and the set of methods as M. For example, for each problem p and method m, we define

    t_{p,m} = CPU time required to solve problem p by method m. (4.11)

    We compare the performance of method m on problem p with the best performance by any method on this problem; that is, we use the performance ratio

    r_{p,m} = t_{p,m} / min{ t_{p,m} : m ∈ M }. (4.12)

    We assume that a parameter R ≥ r_{p,m} for all p, m is chosen, and set r_{p,m} = R if and only if method m does not solve problem p. Counting the problems that each method solves successfully, we obtain an overall assessment of the relative performance of the methods. It can be described as follows:

    ρ_m(τ) := (1/n_p) size{ p ∈ P : r_{p,m} ≤ τ },

    where n_p represents the number of elements in set P. Then ρ_m(τ) is the probability for method m ∈ M that the performance ratio r_{p,m} is within a factor τ ∈ R of the best possible ratio. The function ρ_m is the (cumulative) distribution function for the performance ratio.

    The performance profile ρ_m : R → [0, 1] for a method is a nondecreasing, piecewise constant function, continuous from the right at each breakpoint. Since we are interested in methods with a high probability of success, we need only compare the values of ρ_m(τ) across methods and choose the method with the largest value; that is, we look for the method whose function ρ_m first reaches the line ρ_m(τ) = 1. In the same way, we obtain the performance profiles with respect to NI, NF, and NORM.
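    As a concrete illustration (not the authors' code), the performance-profile computation above can be sketched as follows. Failed runs are recorded as infinity, so their ratios exceed every finite τ; the timing data here are hypothetical:

```python
import numpy as np

def performance_profile(times, taus):
    """Dolan-More performance profile.

    times: (n_problems, n_methods) array of CPU times;
           np.inf marks a failed solve (result '-').
    taus:  1-D array of ratio thresholds tau >= 1.
    Returns rho of shape (len(taus), n_methods), where
    rho[i, m] is the fraction of problems that method m
    solves within a factor taus[i] of the best method.
    """
    times = np.asarray(times, dtype=float)
    best = times.min(axis=1, keepdims=True)   # best time per problem
    ratios = times / best                     # r_{p,m} from (4.12)
    n_p = times.shape[0]
    return np.array([(ratios <= t).sum(axis=0) / n_p for t in taus])

# Toy data: 3 problems x 2 methods; method 1 fails on problem 1.
t = np.array([[1.0, 2.0],
              [3.0, np.inf],
              [2.0, 2.0]])
rho = performance_profile(t, taus=np.array([1.0, 2.0]))
# rho[0] gives rho_m(1): the fraction of problems each method wins.
```

    Plotting the columns of rho against taus reproduces curves such as those in Figures 1–4.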

    As can be seen from Tables 1–10, AQN has a more stable solving performance and can solve more problems; for example, it solves cases that AKP cannot: x3 and x5 of Problem 3 when n = 10000; x2 of Problem 3 when n = 5000, 10000; x2 of Problem 5 when n = 1000, 5000, 10000; x3 of Problem 5 when n = 5000, 10000; x2 of Problem 8 when n = 5000, 10000; and x2 of Problem 10 when n = 5000, 10000. Compared with CQN, Figures 1 and 2 show that AQN reaches the line ρ_m(τ) = 1 before CQN, which demonstrates that AQN has a faster solution time (CPU) and a more accurate final residual norm (NORM). Although Figures 3 and 4 show that CQN needs fewer iterations than AQN, in practice we pay more attention to solution time. To sum up, AQN has a more stable and faster solving performance.

    Figure 1.  Performance profiles based on CPU time.
    Figure 2.  Performance profiles based on the final norm equation.
    Figure 3.  Performance profiles based on number of iterations.
    Figure 4.  Performance profiles based on number of function evaluation.

    In this paper, we propose an active set quasi-Newton method for the solution of optimization problems with bound constraints. The method uses the quasi-Newton step as a trial step and the projection step as a correction step. By using the active set technique, we only need to solve a reduced-dimension linear system at each iteration to generate the search direction. We prove that the generated sequence is automatically bounded and establish the global convergence of the proposed algorithm. Meanwhile, compared with the other algorithms, our method has the most stable performance. Some questions remain for future study. First, it may be possible to obtain the global convergence of the proposed algorithm without assuming the positive definiteness of the matrix B_k. Second, how to obtain the local convergence of the proposed algorithm, especially under weak conditions such as the local error bound condition, needs further study.
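    The trial-step/correction-step idea can be illustrated with a minimal sketch. This is not the authors' exact algorithm: the quasi-Newton matrix B is held fixed, and the active-set estimate and stopping test are simplified for brevity.

```python
import numpy as np

def project(x, lo, hi):
    """Projection onto the box [lo, hi] (the correction step)."""
    return np.clip(x, lo, hi)

def active_set(x, g, lo, hi, eps=1e-8):
    """Bounds that are (nearly) binding and whose gradient
    component pushes further outward."""
    return ((x <= lo + eps) & (g > 0)) | ((x >= hi - eps) & (g < 0))

def projected_quasi_newton(grad, B, x, lo, hi, tol=1e-8, max_iter=100):
    """Sketch: solve the quasi-Newton system only in the free
    (inactive) variables -- a reduced-dimension linear system --
    then project the trial point back onto the box."""
    for _ in range(max_iter):
        g = grad(x)
        A = active_set(x, g, lo, hi)
        F = ~A                                  # free variables
        d = np.zeros_like(x)
        if F.any():
            # reduced system B_FF d_F = -g_F
            d[F] = np.linalg.solve(B[np.ix_(F, F)], -g[F])
        x = project(x + d, lo, hi)              # trial step + correction
        # projected-gradient residual as stopping test
        if np.linalg.norm(project(x - grad(x), lo, hi) - x) <= tol:
            break
    return x

# Example: minimize 0.5*||x - c||^2 subject to 0 <= x <= 1,
# where B is the exact (identity) Hessian.
c = np.array([1.5, -0.3, 0.4])
grad = lambda x: x - c
x = projected_quasi_newton(grad, np.eye(3), np.zeros(3),
                           np.zeros(3), np.ones(3))
```

    For this separable quadratic, the minimizer is simply the projection of c onto the box, which the sketch recovers in one iteration.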

    All authors declare no conflicts of interest in this paper.



    [1] Tehrany MS, Pradhan B, Jebur MN (2015) Flood susceptibility analysis and its verification using a novel ensemble support vector machine and frequency ratio method. Stoch Environ Res Risk Assess 29: 1149–1165. https://doi.org/10.1007/s00477-015-1021-9 doi: 10.1007/s00477-015-1021-9
    [2] Lin Q, Leandro J, Gerber S, et al. (2020) Multistep flood inundation forecasts with resilient backpropagation neural networks: Kulmbach case study. Water 12. https://doi.org/10.3390/w12123568 doi: 10.3390/w12123568
    [3] Ritter J, Berenguer M, Corral C, et al. (2020) ReAFFIRM: Real-time Assessment of Flash Flood Impacts – a Regional high-resolution Method. Environ Int 136: 105375. https://doi.org/10.1016/j.envint.2019.105375 doi: 10.1016/j.envint.2019.105375
    [4] Abdelhady AU, Xu D, Ouyang Z, et al. (2022) A framework for estimating water ingress due to hurricane rainfall. J Wind Eng Ind Aerodyn 221: 104891. https://doi.org/10.1016/j.jweia.2021.104891 doi: 10.1016/j.jweia.2021.104891
    [5] Sankaranarayanan S, Prabhakar M, Satish S, et al. (2020) Flood prediction based on weather parameters using deep learning. J Water Clim Change 11: 1766–1783. https://doi.org/10.2166/wcc.2019.321 doi: 10.2166/wcc.2019.321
    [6] Kolen B, Slomp R, Jonkman SN (2013) The impacts of storm Xynthia February 27-28, 2010 in France: Lessons for flood risk management. J Flood Risk Manag 6: 261–278. https://doi.org/10.1111/jfr3.12011 doi: 10.1111/jfr3.12011
    [7] Berndtsson R, Becker P, Persson A, et al. (2019) Drivers of changing urban flood risk: A framework for action. J Environ Manag 240: 47–56. https://doi.org/10.1016/j.jenvman.2019.03.094 doi: 10.1016/j.jenvman.2019.03.094
    [8] Kwon SH, Kim JH (2021) Machine learning and urban drainage systems: State-of-the-art review. Water (Switzerland) 13: 1–14. https://doi.org/10.3390/w13243545 doi: 10.3390/w13243545
    [9] Jain SK, Mani P, Jain SK, et al. (2018) A Brief review of flood forecasting techniques and their applications. Int J River Basin Manag 16: 329–344. https://doi.org/10.1080/15715124.2017.1411920 doi: 10.1080/15715124.2017.1411920
    [10] Moore RJ, Bell VA, Jones DA (2005) Forecasting for flood warning. C R Geosci 337: 203–217. https://doi.org/10.1016/j.crte.2004.10.017 doi: 10.1016/j.crte.2004.10.017
    [11] Difrancesco KN, Tullos DD (2014) Flexibility in Water Resour Manag: Review of Concepts and Development of Assessment Measures for Flood Management Systems. J Am Water Resour Assoc 50: 1527–1539. https://doi.org/10.1111/jawr.12214 doi: 10.1111/jawr.12214
    [12] Zounemat-Kermani M, Matta E, Cominola A, et al. (2020) Neurocomputing in surface water hydrology and hydraulics: A review of two decades retrospective, current status and future prospects. J Hydrol 588: 125085. https://doi.org/10.1016/j.jhydrol.2020.125085 doi: 10.1016/j.jhydrol.2020.125085
    [13] Dazzi S, Vacondio R, Mignosa P (2021) Flood stage forecasting using machine-learning methods: A case study on the parma river (italy). Water 13. https://doi.org/10.3390/w13121612 doi: 10.3390/w13121612
    [14] Kratzert F, Klotz D, Brenner C, et al. (2018) Rainfall–runoff modelling using Long Short-Term Memory (LSTM) networks. Hydrol Earth Syst Sci 22: 6005–6022. https://doi.org/10.5194/hess-22-6005-2018 doi: 10.5194/hess-22-6005-2018
    [15] Mosavi A, Ozturk P, Chau K (2018) Flood Prediction Using Machine Learning Models: Literature Review. Water 10: 1536. https://doi.org/10.3390/w10111536 doi: 10.3390/w10111536
    [16] Han S, Coulibaly P (2017) Bayesian flood forecasting methods: A review. J Hydrol 551: 340–351. https://doi.org/10.1016/j.jhydrol.2017.06.004 doi: 10.1016/j.jhydrol.2017.06.004
    [17] Badjana HM, Fink M, Helmschrot J, et al. (2017) Hydrological system analysis and modelling of the Kara River basin (West Africa) using a lumped metric conceptual model. Hydrol Sci J 62: 1094–1113. https://doi.org/10.1080/02626667.2017.1307571 doi: 10.1080/02626667.2017.1307571
    [18] Dal Molin M, Schirmer M, Zappa M, et al. (2020) Understanding dominant controls on streamflow spatial variability to set up a semi-distributed hydrological model: The case study of the Thur catchment. Hydrol Earth Syst Sci 24: 1319–1345. https://doi.org/10.5194/hess-24-1319-2020 doi: 10.5194/hess-24-1319-2020
    [19] Wang J, Shi P, Jiang P, et al. (2017) Application of BP neural network algorithm in traditional hydrological model for flood forecasting. Water 9: 1–16. https://doi.org/10.3390/w9010048 doi: 10.3390/w9010048
    [20] Sarker IH (2021) Machine Learning: Algorithms, Real-World Applications and Research Directions. SN Computer Science 2: 1–21. https://doi.org/10.1007/s42979-021-00592-x doi: 10.1007/s42979-021-00592-x
    [21] Liakos KG, Busato P, Moshou D, et al. (2018) Machine learning in agriculture: A review. Sensors 18: 1–29. https://doi.org/10.3390/s18082674 doi: 10.3390/s18082674
    [22] Sene K (2016) Hydrometeorology, Cham, Springer International Publishing. https://doi.org/10.1007/978-3-319-23546-2
    [23] Mosavi A, Ozturk P, Chau KW (2018) Flood prediction using machine learning models: Literature review. Water (Switzerland) 10: 1–40. https://doi.org/10.3390/w10111536 doi: 10.3390/w10111536
    [24] Ighile EH, Shirakawa H, Tanikawa H (2022) A Study on the Application of GIS and Machine Learning to Predict Flood Areas in Nigeria. Sustainability 14. https://doi.org/10.3390/su14095039 doi: 10.3390/su14095039
    [25] Nayak M, Das S, Senapati MR (2022) Improving Flood Prediction with Deep Learning Methods. J Inst Eng India Ser B. https://doi.org/10.1007/s40031-022-00720-y doi: 10.1007/s40031-022-00720-y
    [26] Sankaranarayanan S, Prabhakar M, Satish S, et al. (2020) Flood prediction based on weather parameters using deep learning. J Water Clim Change 11: 1766–1783. https://doi.org/10.2166/wcc.2019.321 doi: 10.2166/wcc.2019.321
    [27] Jabbari A, Bae DH (2018) Application of Artificial Neural Networks for accuracy enhancements of real-time flood forecasting in the Imjin basin. Water 10. https://doi.org/10.3390/w10111626 doi: 10.3390/w10111626
    [28] Elsafi SH (2014) Artificial Neural Networks (ANNs) for flood forecasting at Dongola Station in the River Nile, Sudan. Alex Eng J 53: 655–662. https://doi.org/10.1016/j.aej.2014.06.010 doi: 10.1016/j.aej.2014.06.010
    [29] Chen JC, Ning SK, Chen HW, et al. (2008) Flooding probability of urban area estimated by decision tree and artificial neural networks. J Hydroinform 10: 57–67. https://doi.org/10.2166/hydro.2008.009 doi: 10.2166/hydro.2008.009
    [30] Tehrany MS, Pradhan B, Jebur MN (2013) Spatial prediction of flood susceptible areas using rule based decision tree (DT) and a novel ensemble bivariate and multivariate statistical models in GIS. J Hydrol 504: 69–79. https://doi.org/10.1016/j.jhydrol.2013.09.034 doi: 10.1016/j.jhydrol.2013.09.034
    [31] Tehrany MS, Pradhan B, Mansor S, et al. (2015) Flood susceptibility assessment using GIS-based support vector machine model with different kernel types. Catena 125: 91–101. https://doi.org/10.1016/j.catena.2014.10.017 doi: 10.1016/j.catena.2014.10.017
    [32] Liu M, Huang Y, Li Z, et al. (2020) The applicability of lstm-knn model for real-time flood forecasting in different climate zones in China. Water 12: 1–21. https://doi.org/10.3390/w12020440 doi: 10.3390/w12020440
    [33] Boateng EY, Otoo J, Abaye DA (2020) Basic Tenets of Classification Algorithms K-Nearest-Neighbor, Support Vector Machine, Random Forest and Neural Network: A Review. J Data Anal Inf Process 08: 341–357. https://doi.org/10.4236/jdaip.2020.84020 doi: 10.4236/jdaip.2020.84020
    [34] Modaresi F, Araghinejad S, Ebrahimi K (2018) A Comparative Assessment of Artificial Neural Network, Generalized Regression Neural Network, Least-Square Support Vector Regression, and K-Nearest Neighbor Regression for Monthly Streamflow Forecasting in Linear and Nonlinear Conditions. Water Resour Manag 32: 243–258. https://doi.org/10.1007/s11269-017-1807-2 doi: 10.1007/s11269-017-1807-2
    [35] Ghorbani MA, Zadeh HA, Isazadeh M, et al. (2016) A comparative study of artificial neural network (MLP, RBF) and support vector machine models for river flow prediction. Environ Earth Sci 75: 1–14. https://doi.org/10.1007/s12665-015-5096-x doi: 10.1007/s12665-015-5096-x
    [36] Sarker IH, Kayes ASM, Badsha S, et al. (2020) Cybersecurity data science: an overview from machine learning perspective. J Big Data 7. https://doi.org/10.1186/s40537-020-00318-5 doi: 10.1186/s40537-020-00318-5
    [37] Janiesch C, Zschech P, Heinrich K (2021) Machine learning and deep learning. Electron Mark 31: 685–695. https://doi.org/10.1007/s12525-021-00475-2 doi: 10.1007/s12525-021-00475-2
    [38] Xu T, Liang F (2021) Machine learning for hydrologic sciences: An introductory overview. Wiley Interdisciplinary Reviews: Water 8: 1–29. https://doi.org/10.1002/wat2.1533
    [39] Mohammed M, Khan MB, Bashier EBM (2016) Machine Learning algorithms and applications, Dordrecht, CRC Press. https://doi.org/10.1201/9781315371658
    [40] Shen C (2018) A Transdisciplinary Review of Deep Learning Research and Its Relevance for Water Resources Scientists. Water Resour Res 54: 8558–8593. https://doi.org/10.1029/2018WR022643 doi: 10.1029/2018WR022643
    [41] Gnecco G, Morisi R, Roth G, et al. (2017) Supervised and semi-supervised classifiers for the detection of flood-prone areas. Soft Comput 21: 3673–3685. https://doi.org/10.1007/s00500-015-1983-z doi: 10.1007/s00500-015-1983-z
    [42] Wagenaar D, De Jong J, Bouwer LM (2017) Multi-variable flood damage modelling with limited data using supervised learning approaches. Nat Hazards Earth Syst Sci 17: 1683–1696. https://doi.org/10.5194/nhess-17-1683-2017 doi: 10.5194/nhess-17-1683-2017
    [43] Oppel H, Fischer S (2020) A New Unsupervised Learning Method to Assess Clusters of Temporal Distribution of Rainfall and Their Coherence with Flood Types. Water Resour Res 56. https://doi.org/10.1029/2019WR026511 doi: 10.1029/2019WR026511
    [44] Gentleman R, Carey VJ (2008) Unsupervised Machine Learning, Bioconductor Case Studies, New York, NY, Springer, 137–157. https://doi.org/10.1007/978-0-387-77240-0_10
    [45] Usama M, Qadir J, Raza A, et al. (2019) Unsupervised Machine Learning for Networking: Techniques, Applications and Research Challenges. IEEE Access 7: 65579–65615. https://doi.org/10.1109/ACCESS.2019.2916648 doi: 10.1109/ACCESS.2019.2916648
    [46] Vamplew P, Dazeley R, Berry A, et al. (2011) Empirical evaluation methods for multiobjective reinforcement learning algorithms. Mach Learn 84: 51–80. https://doi.org/10.1007/s10994-010-5232-5 doi: 10.1007/s10994-010-5232-5
    [47] Santiago Júnior VA de, Özcan E, Carvalho VR de (2020) Hyper-Heuristics based on Reinforcement Learning, Balanced Heuristic Selection and Group Decision Acceptance. Appl Soft Comput 97: 106760. https://doi.org/10.1016/j.asoc.2020.106760 doi: 10.1016/j.asoc.2020.106760
    [48] Babbar-sebens M, Mukhopadhyay S (2009) Reinforcement Learning for Human-Machine Collaborative Optimization.  Appl Ground Water Monit October 3563–3568.
    [49] Jain SK, Mani P, Jain SK, et al. (2018) A Brief review of flood forecasting techniques and their applications. Int J River Basin Manag 16: 329–344. https://doi.org/10.1080/15715124.2017.1411920 doi: 10.1080/15715124.2017.1411920
    [50] Rözer V, Müller M, Bubeck P, et al. (2016) Coping with pluvial floods by private households. Water 8. https://doi.org/10.3390/W8070304 doi: 10.3390/W8070304
    [51] Nachappa T, Meena SR (2020) A novel per pixel and object-based ensemble approach for flood susceptibility mapping. Geom Nat Hazards Risk 11: 2147–2175. https://doi.org/10.1080/19475705.2020.1833990 doi: 10.1080/19475705.2020.1833990
    [52] Wu J, Liu H, Wei G, et al. (2019) Flash flood forecasting using support vector regression model in a small mountainous catchment. Water 11. https://doi.org/10.3390/w11071327 doi: 10.3390/w11071327
    [53] Yariyan P, Janizadeh S, Van Phong T, et al. (2020) Improvement of Best First Decision Trees Using Bagging and Dagging Ensembles for Flood Probability Mapping. Water Resour Manag 34: 3037–3053. https://doi.org/10.1007/s11269-020-02603-7 doi: 10.1007/s11269-020-02603-7
    [54] Landuyt L, Verhoest NEC, Van Coillie FMB (2020) Flood mapping in vegetated areas using an unsupervised clustering approach on sentinel-1 and-2 imagery. Remote Sens 12: 1–20. https://doi.org/10.3390/rs12213611 doi: 10.3390/rs12213611
    [55] Li W, Kiaghadi A, Dawson C (2021) Exploring the best sequence LSTM modeling architecture for flood prediction. Neural Comput Appl 33: 5571–5580. https://doi.org/10.1007/s00521-020-05334-3 doi: 10.1007/s00521-020-05334-3
    [56] Ahmed AN, Van Lam T, Hung ND, et al. (2021) A comprehensive comparison of recent developed meta-heuristic algorithms for streamflow time series forecasting problem. Appl Soft Comput 105: 107282. https://doi.org/10.1016/j.asoc.2021.107282 doi: 10.1016/j.asoc.2021.107282
    [57] Liu K, Li Z, Yao C, et al. (2016) Coupling the k-nearest neighbor procedure with the Kalman filter for real-time updating of the hydraulic model in flood forecasting. Int J Sediment Res 31: 149–158. https://doi.org/10.1016/j.ijsrc.2016.02.002 doi: 10.1016/j.ijsrc.2016.02.002
    [58] Kabir S, Patidar S, Xia X, et al. (2020) A deep convolutional neural network model for rapid prediction of fluvial flood inundation. J Hydrol 590: 125481. https://doi.org/10.1016/j.jhydrol.2020.125481 doi: 10.1016/j.jhydrol.2020.125481
    [59] Costache R, Arabameri A, Blaschke T, et al. (2021) Flash-flood potential mapping using deep learning, alternating decision trees and data provided by remote sensing sensors. Sensors 21: 1–21. https://doi.org/10.3390/s21010280 doi: 10.3390/s21010280
    [60] Ateeq-ur-Rauf, Ghumman AR, Ahmad S, et al. (2018) Performance assessment of artificial neural networks and support vector regression models for stream flow predictions. Environ Monit Assess 190. https://doi.org/10.1007/s10661-018-7012-9 doi: 10.1007/s10661-018-7012-9
    [61] Al-Fawa'reh M, Hawamdeh A, Alrawashdeh R, et al. (2021) Intelligent Methods for flood forecasting in Wadi al Wala, Jordan. International Congress of Advanced Technology and Engineering, ICOTEN. https://doi.org/10.1109/ICOTEN52080.2021.9493425
    [62] Parizi E, Bagheri-Gavkosh M, Hosseini SM, et al. (2021) Linkage of geographically weighted regression with spatial cluster analyses for regionalization of flood peak discharges drivers: Case studies across Iran. J Clean Prod 310. https://doi.org/10.1016/j.jclepro.2021.127526 doi: 10.1016/j.jclepro.2021.127526
    [63] Tsakiri K, Marsellos A, Kapetanakis S (2018) Artificial neural network and multiple linear regression for flood prediction in Mohawk River, New York. Water 10. https://doi.org/10.3390/w10091158 doi: 10.3390/w10091158
    [64] Wu CL, Chau KW (2010) Data-driven models for monthly streamflow time series prediction. Eng Appl Artif Intell 23: 1350–1367. https://doi.org/10.1016/j.engappai.2010.04.003 doi: 10.1016/j.engappai.2010.04.003
    [65] Zounemat-Kermani M, Kisi O, Rajaee T (2013) Performance of radial basis and LM-feed forward artificial neural networks for predicting daily watershed runoff. Appl Soft Comput 13: 4633–4644. https://doi.org/10.1016/j.asoc.2013.07.007 doi: 10.1016/j.asoc.2013.07.007
    [66] Sulaiman J, Wahab SH (2018) Heavy Rainfall Forecasting Model Using Artificial Neural Network for Flood Prone Area, In: Kim KJ, Kim H, Baek N (Eds.), IT Convergence and Security. Singap, Springer, 68–76. https://doi.org/10.1007/978-981-10-6451-7_9
    [67] Jain A, Indurthy SKVP (2003) Comparative Analysis of Event-based Rainfall-runoff Modeling Techniques—Deterministic, Statistical, and Artificial Neural Networks. J Hydrol Eng 8: 93–98. https://doi.org/10.1061/(asce)1084-0699(2003)8:2(93) doi: 10.1061/(asce)1084-0699(2003)8:2(93)
    [68] Cruz FRG, Binag MG, Ga MRG, et al. (2019) Flood Prediction Using Multi-Layer Artificial Neural Network in Monitoring System with Rain Gauge, Water Level, Soil Moisture Sensors. IEEE Region 10 Annual International Conference, Proceedings/TENCON 2499–2503. https://doi.org/10.1109/TENCON.2018.8650387
    [69] Kim H Il, Han KY (2020) Urban flood prediction using deep neural network with data augmentation. Water 12. https://doi.org/10.3390/w12030899 doi: 10.3390/w12030899
    [70] Ni JR, Xue A (2003) Application of artificial neural network to the rapid feedback of potential ecological risk in flood diversion zone. Eng Appl Artif Intell 16: 105–119. https://doi.org/10.1016/S0952-1976(03)00059-9 doi: 10.1016/S0952-1976(03)00059-9
    [71] Wu CL, Chau KW, Fan C (2010) Prediction of rainfall time series using modular artificial neural networks coupled with data-preprocessing techniques. J Hydrol 389: 146–167. https://doi.org/10.1016/j.jhydrol.2010.05.040 doi: 10.1016/j.jhydrol.2010.05.040
    [72] Feng LH, Lu J (2010) The practical research on flood forecasting based on artificial neural networks. Expert Syst Appl 37: 2974–2977. https://doi.org/10.1016/j.eswa.2009.09.037 doi: 10.1016/j.eswa.2009.09.037
    [73] Dtissibe FY, Ari AAA, Titouna C, et al. (2020) Flood forecasting based on an artificial neural network scheme. Natural Hazards 104: 1211–1237. https://doi.org/10.1007/s11069-020-04211-5 doi: 10.1007/s11069-020-04211-5
    [74] Deo RC, Şahin M (2015) Application of the Artificial Neural Network model for prediction of monthly Standardized Precipitation and Evapotranspiration Index using hydrometeorological parameters and climate indices in eastern Australia. Atmos Res 161–162: 65–81. https://doi.org/10.1016/j.atmosres.2015.03.018 doi: 10.1016/j.atmosres.2015.03.018
    [75] Sahoo A, Samantaray S, Ghose DK (2022) Multilayer perceptron and support vector machine trained with grey wolf optimiser for predicting floods in Barak river, India. J Earth Syst Sci 131. https://doi.org/10.1007/s12040-022-01815-2 doi: 10.1007/s12040-022-01815-2
    [76] Linh NTT, Ruigar H, Golian S, et al. (2021) Flood prediction based on climatic signals using wavelet neural network. Acta Geophys 69: 1413–1426. https://doi.org/10.1007/s11600-021-00620-7 doi: 10.1007/s11600-021-00620-7
    [77] Panahi M, Jaafari A, Shirzadi A, et al. (2021) Deep learning neural networks for spatially explicit prediction of flash flood probability. Geosci Front 12: 101076. https://doi.org/10.1016/j.gsf.2020.09.007 doi: 10.1016/j.gsf.2020.09.007
    [78] Wang JH, Lin GF, Chang MJ, et al. (2019) Real-Time Water-Level Forecasting Using Dilated Causal Convolutional Neural Networks. Water Resour Manag 33: 3759–3780. https://doi.org/10.1007/s11269-019-02342-4 doi: 10.1007/s11269-019-02342-4
    [79] Song T, Ding W, Wu J, et al. (2020) Flash flood forecasting based on long short-term memory networks. Water 12. https://doi.org/10.3390/w12010109 doi: 10.3390/w12010109
    [80] Wang HW, Lin GF, Hsu CT, et al. (2022) Long-Term Temporal Flood Predictions Made Using Convolutional Neural Networks. Water 14. https://doi.org/10.3390/w14244134 doi: 10.3390/w14244134
    [81] Cho M, Kim C, Jung K, et al. (2022) Water Level Prediction Model Applying a Long Short-Term Memory (LSTM)–Gated Recurrent Unit (GRU) Method for Flood Prediction. Water 14: 2221. https://doi.org/10.3390/w14142221 doi: 10.3390/w14142221
    [82] De Vos NJ (2013) Echo state networks as an alternative to traditional artificial neural networks in rainfall-runoff modelling. Hydrol Earth Syst Sci 17: 253–267. https://doi.org/10.5194/hess-17-253-2013 doi: 10.5194/hess-17-253-2013
    [83] Raghavendra S, Deka PC (2014) Support vector machine applications in the field of hydrology: A review. Appl Soft Comput 19: 372–386. https://doi.org/10.1016/j.asoc.2014.02.002 doi: 10.1016/j.asoc.2014.02.002
    [84] Xiang Y, Gou L, He L, et al. (2018) A SVR–ANN combined model based on ensemble EMD for rainfall prediction. Appl Soft Comput 73: 874–883. https://doi.org/10.1016/j.asoc.2018.09.018 doi: 10.1016/j.asoc.2018.09.018
    [85] Üstün B, Melssen WJ, Buydens LMC (2007) Visualisation and interpretation of Support Vector Regression models. Anal Chim Acta 595: 299–309. https://doi.org/10.1016/j.aca.2007.03.023 doi: 10.1016/j.aca.2007.03.023
    [86] Mosavi A, Rabczuk T, Varkonyi-Koczy AR (2018) Reviewing the novel machine learning tools for materials design. Adv Intell Syst Comput 660: 50–58. https://doi.org/10.1007/978-3-319-67459-9_7 doi: 10.1007/978-3-319-67459-9_7
    [87] Choubin B, Moradi E, Golshan M, et al. (2019) An ensemble prediction of flood susceptibility using multivariate discriminant analysis, classification and regression trees, and support vector machines. Sci Total Environ 651: 2087–2096. https://doi.org/10.1016/j.scitotenv.2018.10.064 doi: 10.1016/j.scitotenv.2018.10.064
    [88] Panahi M, Dodangeh E, Rezaie F, et al. (2021) Flood spatial prediction modeling using a hybrid of meta-optimization and support vector regression modeling. Catena 199: 105114. https://doi.org/10.1016/j.catena.2020.105114 doi: 10.1016/j.catena.2020.105114
    [89] Liu Y, Pender G (2015) A flood inundation modelling using v-support vector machine regression model. Eng Appl Artif Intell 46: 223–231. https://doi.org/10.1016/j.engappai.2015.09.014 doi: 10.1016/j.engappai.2015.09.014
    [90] Shirzadi A, Asadi S, Shahabi H, et al. (2020) A novel ensemble learning based on Bayesian Belief Network coupled with an extreme learning machine for flash flood susceptibility mapping. Eng Appl Artif Intell 96: 103971. https://doi.org/10.1016/j.engappai.2020.103971 doi: 10.1016/j.engappai.2020.103971
    [91] Young CC, Liu WC, Wu MC (2017) A physically based and machine learning hybrid approach for accurate rainfall-runoff modeling during extreme typhoon events. Appl Soft Comput 53: 205–216. https://doi.org/10.1016/j.asoc.2016.12.052 doi: 10.1016/j.asoc.2016.12.052
    [92] Yan J, Jin J, Chen F, et al. (2018) Urban flash flood forecast using support vector machine and numerical simulation. J Hydroinform 20: 232–245. https://doi.org/10.2166/hydro.2017.175 doi: 10.2166/hydro.2017.175
    [93] Bermúdez M, Cea L, Puertas J (2019) A rapid flood inundation model for hazard mapping based on least squares support vector machine regression. J Flood Risk Manag 12: 1–14. https://doi.org/10.1111/jfr3.12522 doi: 10.1111/jfr3.12522
    [94] Cervantes J, Garcia-Lamont F, Rodríguez-Mazahua L, et al. (2020) A comprehensive survey on support vector machine classification: Applications, challenges and trends. Neurocomputing. https://doi.org/10.1016/j.neucom.2019.10.118 doi: 10.1016/j.neucom.2019.10.118
    [95] Li PH, Kwon HH, Sun L, et al. (2010) A modified support vector machine based prediction model on streamflow at the Shihmen Reservoir, Taiwan. Int J Climatol 30: 1256–1268. https://doi.org/10.1002/joc.1954 doi: 10.1002/joc.1954
    [96] Sahoo A, Samantaray S, Ghose DK (2021) Prediction of Flood in Barak River using Hybrid Machine Learning Approaches: A Case Study. J Geol Soc India 97: 186–198. https://doi.org/10.1007/s12594-021-1650-1 doi: 10.1007/s12594-021-1650-1
    [97] Costache R (2019) Flash-flood Potential Index mapping using weights of evidence, decision Trees models and their novel hybrid integration. Stoch Environ Res Risk Assess 33: 1375–1402. https://doi.org/10.1007/s00477-019-01689-9 doi: 10.1007/s00477-019-01689-9
    [98] Lawal ZK, Yassin H, Zakari RY (2021) Flood Prediction Using Machine Learning Models: A Case Study of Kebbi State Nigeria. IEEE Asia-Pacific Conference on Computer Science and Data Engineering, CSDE. https://doi.org/10.1109/CSDE53843.2021.9718497
    [99] De'Ath G, Fabricius KE (2000) Classification and regression trees: A powerful yet simple technique for ecological data analysis. Ecology 81: 3178–3192. https://doi.org/10.1890/0012-9658(2000)081[3178:CARTAP]2.0.CO;2 doi: 10.1890/0012-9658(2000)081[3178:CARTAP]2.0.CO;2
    [100] Chen W, Li Y, Xue W, et al. (2020) Modeling flood susceptibility using data-driven approaches of naïve Bayes tree, alternating decision tree, and random forest methods. Sci Total Environ 701: 134979. https://doi.org/10.1016/j.scitotenv.2019.134979 doi: 10.1016/j.scitotenv.2019.134979
    [101] Yariyan P, Janizadeh S, Van Phong T, et al. (2020) Improvement of Best First Decision Trees Using Bagging and Dagging Ensembles for Flood Probability Mapping. Water Resour Manag 34: 3037–3053. https://doi.org/10.1007/s11269-020-02603-7 doi: 10.1007/s11269-020-02603-7
    [102] Zahiri A, Azamathulla HM (2014) Comparison between linear genetic programming and M5 tree models to predict flow discharge in compound channels. Neural Comput Appl 24: 413–420. https://doi.org/10.1007/s00521-012-1247-0 doi: 10.1007/s00521-012-1247-0
    [103] Singh KK, Pal M, Singh VP (2010) Estimation of mean annual flood in indian catchments using backpropagation neural network and M5 model tree. Water Resour Manag 24: 2007–2019. https://doi.org/10.1007/s11269-009-9535-x doi: 10.1007/s11269-009-9535-x
    [104] Nguyen DT, Chen ST (2020) Real-time probabilistic flood forecasting using multiple machine learning methods. Water 12: 1–13. https://doi.org/10.3390/w12030787 doi: 10.3390/w12030787
    [105] Alizadeh Z, Yazdi J, Kim JH, et al. (2018) Assessment of machine learning techniques for monthly flow prediction. Water 10: 1–24. https://doi.org/10.3390/w10111676 doi: 10.3390/w10111676
    [106] Hou J, Zhou N, Chen G, et al. (2021) Rapid forecasting of urban flood inundation using multiple machine learning models. Nat Hazards 108: 2335–2356. https://doi.org/10.1007/s11069-021-04782-x doi: 10.1007/s11069-021-04782-x
    [107] Sankaranarayanan S, Prabhakar M, Satish S, et al. (2020) Flood prediction based on weather parameters using deep learning. J Water Clim Change 11: 1766–1783. https://doi.org/10.2166/wcc.2019.321
    [108] El-Magd SAA, Pradhan B, Alamri A (2021) Machine learning algorithm for flash flood prediction mapping in Wadi El-Laqeita and surroundings, Central Eastern Desert, Egypt. Arab J Geosci 14. https://doi.org/10.1007/s12517-021-06466-z
    [109] Huang M, Lin R, Huang S, et al. (2017) A novel approach for precipitation forecast via improved K-nearest neighbor algorithm. Adv Eng Inform 33: 89–95. https://doi.org/10.1016/j.aei.2017.05.003
    [110] Cuomo S, Di Cola VS, Giampaolo F, et al. (2022) Scientific Machine Learning Through Physics-Informed Neural Networks: Where we are and What's Next. J Sci Comput 92: 88. https://doi.org/10.1007/s10915-022-01939-z
    [111] Michele A, Colin V, Santika DD (2019) MobileNet Convolutional Neural Networks and Support Vector Machines for Palmprint Recognition. Procedia Comput Sci 157: 110–117. https://doi.org/10.1016/j.procs.2019.08.147
    [112] Gao X, Shan C, Hu C, et al. (2019) An Adaptive Ensemble Machine Learning Model for Intrusion Detection. IEEE Access 7: 82512–82521. https://doi.org/10.1109/ACCESS.2019.2923640
    [113] Shanmugasundar G, Vanitha M, Čep R, et al. (2021) A Comparative Study of Linear, Random Forest and AdaBoost Regressions for Modeling Non-Traditional Machining. Processes 9: 2015. https://doi.org/10.3390/pr9112015
    [114] Triguero I, García-Gil D, Maillo J, et al. (2019) Transforming big data into smart data: An insight on the use of the k-nearest neighbors algorithm to obtain quality data. WIREs Data Min Knowl Discov. https://doi.org/10.1002/widm.1289
    [115] Senthilnath J, Shreyas PB, Rajendra R, et al. (2019) Hierarchical clustering approaches for flood assessment using multi-sensor satellite images. Int J Image Data Fusion 10: 28–44. https://doi.org/10.1080/19479832.2018.1513956
    [116] Rahman AS, Rahman A (2020) Application of principal component analysis and cluster analysis in regional flood frequency analysis: A case study in New South Wales, Australia. Water 12: 1–26. https://doi.org/10.3390/w12030781
    [117] Engelen JE, Hoos HH (2020) A survey on semi-supervised learning. Mach Learn 109: 373–440. https://doi.org/10.1007/s10994-019-05855-6
    [118] Inyang UG, Akpan EE, Akinyokun OC (2020) A Hybrid Machine Learning Approach for Flood Risk Assessment and Classification. Int J Comput Intell Appl 19: 1–20. https://doi.org/10.1142/S1469026820500121
    [119] Devi G, Sharma M, Sarma P, et al. (2022) Flood Frequency Modeling and Prediction of Beki and Pagladia Rivers Using Deep Learning Approach. Neural Process Lett. https://doi.org/10.1007/s11063-022-10773-1
    [120] He W, Jiang Z (2022) Semi-Supervised Learning With the EM Algorithm: A Comparative Study Between Unstructured and Structured Prediction. IEEE Trans Knowl Data Eng 34: 2912–2920. https://doi.org/10.1109/TKDE.2020.3019038
    [121] Zhao G, Pang B, Xu Z, et al. (2019) Assessment of urban flood susceptibility using semi-supervised machine learning model. Sci Total Environ 659: 940–949. https://doi.org/10.1016/j.scitotenv.2018.12.217
    [122] Silver D, Singh S, Precup D, et al. (2021) Reward is enough. Artificial Intelligence 299: 103535. https://doi.org/10.1016/j.artint.2021.103535
    [123] Serrano W (2022) Deep Reinforcement Learning with the Random Neural Network. Eng Appl Artif Intell 110: 104751. https://doi.org/10.1016/j.engappai.2022.104751
    [124] Bowes BD, Tavakoli A, Wang C, et al. (2021) Flood mitigation in coastal urban catchments using real-time stormwater infrastructure control and reinforcement learning. J Hydroinform 23: 529–547. https://doi.org/10.2166/HYDRO.2020.080
    [125] Baldazo D, Parras J, Zazo S (2019) Decentralized multi-agent deep reinforcement learning in swarms of drones for flood monitoring. Eur Signal Process Conf. https://doi.org/10.23919/EUSIPCO.2019.8903068
    [126] Hapuarachchi HAP, Wang QJ, Pagano TC (2011) A review of advances in flash flood forecasting. Hydrol Process 25: 2771–2784. https://doi.org/10.1002/hyp.8040
    [127] Sood A, Smakhtin V (2015) Global hydrological models: a review. Hydrol Sci J 60: 549–565. https://doi.org/10.1080/02626667.2014.950580
    [128] Grimaldi S, Li Y, Pauwels VRN, et al. (2016) Remote Sensing-Derived Water Extent and Level to Constrain Hydraulic Flood Forecasting Models: Opportunities and Challenges. Surv Geophys 37: 977–1034. https://doi.org/10.1007/s10712-016-9378-y
    [129] Munawar HS, Hammad AWA, Waller ST (2022) Remote Sensing Methods for Flood Prediction: A Review. Sensors 22. https://doi.org/10.3390/s22030960
    [130] Yuan Q, Shen H, Li T, et al. (2020) Deep learning in environmental remote sensing: Achievements and challenges. Remote Sens Environ 241: 111716. https://doi.org/10.1016/j.rse.2020.111716
    [131] Maggioni V, Massari C (2018) On the performance of satellite precipitation products in riverine flood modeling: A review. J Hydrol 558: 214–224. https://doi.org/10.1016/j.jhydrol.2018.01.039
    [132] Olson D, Anderson J (2021) Review on unmanned aerial vehicles, remote sensors, imagery processing, and their applications in agriculture. Agron J 113: 971–992. https://doi.org/10.1002/agj2.20595
    [133] Justice CO, Townshend JRG, Vermote EF, et al. (2002) An overview of MODIS Land data processing and product status. Remote Sens Environ 83: 3–15. https://doi.org/10.1016/S0034-4257(02)00084-6
    [134] Qi W, Ma C, Xu H, et al. (2021) A review on applications of urban flood models in flood mitigation strategies. Nat Hazards 108: 31–62. https://doi.org/10.1007/s11069-021-04715-8
    [135] Tien Bui D, Hoang ND, Martínez-Álvarez F, et al. (2020) A novel deep learning neural network approach for predicting flash flood susceptibility: A case study at a high frequency tropical storm area. Sci Total Environ 701: 134413. https://doi.org/10.1016/j.scitotenv.2019.134413
    [136] Xu T, Liang F (2021) Machine learning for hydrologic sciences: An introductory overview. Wiley Interdisciplinary Reviews: Water 8: 1–29. https://doi.org/10.1002/wat2.1533
    [137] Shen C, Lawson K (2021) Applications of Deep Learning in Hydrology. Deep Learn Earth Sci, Wiley 283–297. https://doi.org/10.1002/9781119646181.ch19
    [138] Sit M, Demiray BZ, Xiang Z, et al. (2020) A comprehensive review of deep learning applications in hydrology and water resources. Water Sci Technol 82: 2635–2670. https://doi.org/10.2166/wst.2020.369
    [139] Zounemat-Kermani M, Batelaan O, Fadaee M, et al. (2021) Ensemble machine learning paradigms in hydrology: A review. J Hydrol 598: 126266. https://doi.org/10.1016/j.jhydrol.2021.126266
    [140] Merz B, Kreibich H, Schwarze R, et al. (2010) Review article: Assessment of economic flood damage. Nat Hazards Earth Syst Sci 10: 1697–1724. https://doi.org/10.5194/nhess-10-1697-2010
    [141] Tarasova L, Merz R, Kiss A, et al. (2019) Causative classification of river flood events. WIREs Water 6. https://doi.org/10.1002/wat2.1353
    [142] Mahmoodi N, Wagner PD, Kiesel J, et al. (2021) Modeling the impact of climate change on streamflow and major hydrological components of an Iranian Wadi system. J Water Clim Change 12: 1598–1613. https://doi.org/10.2166/wcc.2020.098
    [143] Mangukiya NK, Sharma A (2022) Flood risk mapping for the lower Narmada basin in India: a machine learning and IoT-based framework. Nat Hazards 113: 1285–1304. https://doi.org/10.1007/s11069-022-05347-2
    [144] Alipour MH (2015) Risk-Informed Decision Making Framework for Operating a Multi-Purpose Hydropower Reservoir During Flooding and High Inflow Events, Case Study: Cheakamus River System. Water Resour Manag 29: 801–815. https://doi.org/10.1007/s11269-014-0844-3
    [145] Coates G, Li C, Ahilan S, et al. (2019) Agent-based modeling and simulation to assess flood preparedness and recovery of manufacturing small and medium-sized enterprises. Eng Appl Artif Intell 78: 195–217. https://doi.org/10.1016/j.engappai.2018.11.010
    [146] Kienzler S, Pech I, Kreibich H, et al. (2015) After the extreme flood in 2002: changes in preparedness, response and recovery of flood-affected residents in Germany between 2005 and 2011. Nat Hazards Earth Syst Sci 15: 505–526. https://doi.org/10.5194/nhess-15-505-2015
    [147] Packer C, Gao K, Kos J, et al. (2018) Assessing Generalization in Deep Reinforcement Learning.
    [148] Ma K, Feng D, Lawson K, et al. (2021) Transferring Hydrologic Data Across Continents – Leveraging Data-Rich Regions to Improve Hydrologic Prediction in Data-Sparse Regions. Water Resour Res 57: e2020WR028600. https://doi.org/10.1029/2020WR028600
    [149] Weiss K, Khoshgoftaar TM, Wang D (2016) A survey of transfer learning. J Big Data 3: 9. https://doi.org/10.1186/s40537-016-0043-6
    [150] Boelee L, Lumbroso DM, Samuels PG, et al. (2019) Estimation of uncertainty in flood forecasts: A comparison of methods. J Flood Risk Manag 12. https://doi.org/10.1111/jfr3.12516
    [151] Nevo S, Anisimov V, Elidan G, et al. (2019) ML for Flood Forecasting at Scale.
    [152] Hardy J, Gourley JJ, Kirstetter P-E, et al. (2016) A method for probabilistic flash flood forecasting. J Hydrol 541: 480–494. https://doi.org/10.1016/j.jhydrol.2016.04.007
    [153] Han S, Coulibaly P (2019) Probabilistic Flood Forecasting Using Hydrologic Uncertainty Processor with Ensemble Weather Forecasts. J Hydrometeorol 20: 1379–1398. https://doi.org/10.1175/JHM-D-18-0251.1
    [154] Zhan X, Qin H, Liu Y, et al. (2020) Variational Bayesian Neural Network for Ensemble Flood Forecasting. Water 12: 2740. https://doi.org/10.3390/w12102740
    [155] Ivanov VY, Xu D, Dwelle MC, et al. (2021) Breaking Down the Computational Barriers to Real-Time Urban Flood Forecasting. Geophys Res Lett 48. https://doi.org/10.1029/2021GL093585
    [156] Wang H, Chen Y (2019) Identifying Key Hydrological Processes in Highly Urbanized Watersheds for Flood Forecasting with a Distributed Hydrological Model. Water 11: 1641. https://doi.org/10.3390/w11081641
    [157] Liu Z, Felton T, Mostafavi A (2024) Interpretable machine learning for predicting urban flash flood hotspots using intertwined land and built-environment features. Comput Environ Urban Syst 110: 102096. https://doi.org/10.1016/j.compenvurbsys.2024.102096
    [158] Ding Y, Zhu Y, Feng J, et al. (2020) Interpretable spatio-temporal attention LSTM model for flood forecasting. Neurocomputing 403: 348–359. https://doi.org/10.1016/j.neucom.2020.04.110
    [159] Vollert S, Atzmueller M, Theissler A (2021) Interpretable Machine Learning: A brief survey from the predictive maintenance perspective, 26th IEEE Int Conf Emerg Technol Fact Autom (ETFA) 01–08. https://doi.org/10.1109/ETFA45728.2021.9613467
    [160] Motta M, de Castro Neto M, Sarmento P (2021) A mixed approach for urban flood prediction using Machine Learning and GIS. Int J Disaster Risk Reduct 56: 102154. https://doi.org/10.1016/j.ijdrr.2021.102154
    [161] Qiao L, Livsey D, Wise J, et al. (2024) Predicting flood stages in watersheds with different scales using hourly rainfall dataset: A high-volume rainfall features empowered machine learning approach. Sci Total Environ 950: 175231. https://doi.org/10.1016/j.scitotenv.2024.175231
    [162] Khaniya B, Gunathilake MB, Rathnayake U (2021) Ecosystem-Based adaptation for the impact of climate change and variation in the water management sector of Sri Lanka. Math Probl Eng 2021: 1–10. https://doi.org/10.1155/2021/8821329
    [163] Islam ARMT, Talukdar S, Mahato S, et al. (2021) Flood susceptibility modelling using advanced ensemble machine learning models. Geosci Front 12: 101075. https://doi.org/10.1016/j.gsf.2020.09.006
  • © 2025 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)