Citation: Hyeontae Jo, Hwijae Son, Hyung Ju Hwang, Eun Heui Kim. Deep neural network approach to forward-inverse problems. Networks and Heterogeneous Media, 2020, 15(2): 247-259. doi: 10.3934/nhm.2020011
In this paper, we construct approximate solutions of differential equations (DEs) using deep neural networks (DNNs). Furthermore, we present an architecture that incorporates the inverse problem, that is, the process of estimating model parameters from experimental data. In other words, we provide a unified DNN framework that approximates an analytic solution and its model parameters simultaneously. The architecture consists of a feed-forward DNN with non-linear activation functions chosen according to the DEs, together with automatic differentiation [2].
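To make this description concrete, below is a minimal sketch (not the authors' implementation) of such a unified forward-inverse setup in PyTorch, applied to the toy problem of estimating the decay rate a in u'(t) = -a u(t), u(0) = 1. The network size, optimizer settings, collocation points, and synthetic noisy observations are all illustrative assumptions; the loss combines the DE residual obtained by automatic differentiation, the initial condition, and an observation-fitting term, while the unknown parameter a is trained jointly with the network weights.

```python
# Minimal PyTorch sketch (illustrative only, not the authors' code):
# jointly train a DNN solution u_theta(t) and an unknown decay rate `a`
# for the toy ODE u'(t) = -a * u(t), u(0) = 1.
import torch
import torch.nn as nn

torch.manual_seed(0)

class SolutionNet(nn.Module):
    """Feed-forward DNN approximating the solution u(t)."""
    def __init__(self, width=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, 1),
        )

    def forward(self, t):
        return self.net(t)

u_net = SolutionNet()
a = nn.Parameter(torch.tensor(0.5))   # unknown model parameter (true value 2.0 below)
optimizer = torch.optim.Adam(list(u_net.parameters()) + [a], lr=1e-3)

# Collocation points for the DE residual and synthetic noisy observations
t_col = torch.linspace(0.0, 1.0, 100).reshape(-1, 1).requires_grad_(True)
t_obs = torch.linspace(0.0, 1.0, 20).reshape(-1, 1)
u_obs = torch.exp(-2.0 * t_obs) + 0.01 * torch.randn_like(t_obs)  # data generated with a = 2

for step in range(5000):
    optimizer.zero_grad()

    # DE residual u'(t) + a*u(t), with u'(t) from automatic differentiation
    u = u_net(t_col)
    du_dt = torch.autograd.grad(u, t_col, torch.ones_like(u), create_graph=True)[0]
    loss_de = ((du_dt + a * u) ** 2).mean()

    # Initial condition u(0) = 1
    loss_ic = (u_net(torch.zeros(1, 1)) - 1.0).pow(2).mean()

    # Observation (inverse-problem) loss
    loss_obs = ((u_net(t_obs) - u_obs) ** 2).mean()

    loss = loss_de + loss_ic + loss_obs
    loss.backward()
    optimizer.step()

print(f"estimated a = {a.item():.3f}")   # should move toward the true value 2.0
```

The design choice mirrored here is that the model parameter is treated as just another trainable variable, so a single gradient-based optimization handles the forward approximation and the inverse estimation at once.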
[1] A parameter estimation method for stiff ordinary differential equations using particle swarm optimisation. Int. J. Comput. Sci. Math. (2018) 9: 419-432.
[2] A. G. Baydin, B. A. Pearlmutter, A. A. Radul and J. M. Siskind, Automatic differentiation in machine learning: A survey, J. Mach. Learn. Res., 18 (2017), 43pp.
[3] J. Berg and K. Nyström, Neural network augmented inverse problems for PDEs, preprint, arXiv: 1712.09685.
[4] A unified deep artificial neural network approach to partial differential equations in complex geometries. Neurocomputing (2018) 317: 28-41.
[5] G. Chavent, Nonlinear Least Squares for Inverse Problems. Theoretical Foundations and Step-By-Step Guide for Applications, Scientific Computation, Springer, New York, 2009. doi: 10.1007/978-90-481-2785-6
[6] The Stone-Weierstrass theorem and its application to neural networks. IEEE Trans. Neural Networks (1990) 1: 290-295.
[7] On the partial difference equations of mathematical physics. IBM J. Res. Develop. (1967) 11: 215-234.
[8] Approximation by superpositions of a sigmoidal function. Math. Control Signals Systems (1989) 2: 303-314.
[9] L. C. Evans, Partial Differential Equations, Graduate Studies in Mathematics, 19, American Mathematical Society, Providence, RI, 2010. doi: 10.1090/gsm/019
[10] Solving partial differential equations by collocation with radial basis functions. Proceedings of Chamonix (1996) 1997: 1-8.
[11] Multilayer feedforward networks are universal approximators. Neural Networks (1989) 2: 359-366.
[12] D. P. Kingma and J. Ba, Adam: A method for stochastic optimization, preprint, arXiv: 1412.6980.
[13] Artificial neural networks for solving ordinary and partial differential equations. IEEE Trans. Neural Networks (1998) 9: 987-1000.
[14] Neural-network methods for boundary value problems with irregular boundaries. IEEE Trans. Neural Networks (2000) 11: 1041-1049.
[15] A method for the solution of certain non-linear problems in least squares. Quart. Appl. Math. (1944) 2: 164-168.
[16] Numerical solution of elliptic partial differential equation using radial basis function neural networks. Neural Networks (2003) 16: 729-734.
[17] J. Li and X. Li, Particle swarm optimization iterative identification algorithm and gradient iterative identification algorithm for Wiener systems with colored noise, Complexity, 2018 (2018), 8pp. doi: 10.1155/2018/7353171
[18] Simultaneous approximations of multivariate functions and their derivatives by neural networks with one hidden layer. Neurocomputing (1996) 12: 327-343.
[19] An algorithm for least-squares estimation of nonlinear parameters. J. Soc. Indust. Appl. Math. (1963) 11: 431-441.
[20] A logical calculus of the ideas immanent in nervous activity. Bull. Math. Biophys. (1943) 5: 115-133.
[21] A. Paszke, et al., Automatic differentiation in PyTorch, 2017.
[22] Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations. J. Comput. Phys. (2019) 378: 686-707.
[23] S. J. Reddi, S. Kale and S. Kumar, On the convergence of ADAM and beyond, preprint, arXiv: 1904.09237.
[24] Adaptive radial basis function methods for time dependent partial differential equations. Appl. Numer. Math. (2005) 54: 79-94.
[25] P. Tsilifis, I. Bilionis, I. Katsounaros and N. Zabaras, Computationally efficient variational approximations for Bayesian inverse problems, J. Verif. Valid. Uncert., 1 (2016), 13pp. doi: 10.1115/1.4034102
[26] F. Yaman, V. G. Yakhno and R. Potthast, A survey on inverse problems for applied sciences, Math. Probl. Eng., 2013 (2013), 19pp. doi: 10.1155/2013/976837