Purdue University Graduate School

LEAST-SQUARES RELU NEURAL NETWORK METHOD FOR SCALAR HYPERBOLIC CONSERVATION LAW

Thesis, posted on 2021-04-27, 16:25, authored by Jingshuang Chen

This thesis introduces a least-squares ReLU neural network method for solving scalar hyperbolic conservation laws with discontinuous solutions. The method discretizes an equivalent least-squares (LS) formulation over the set of neural network functions with the ReLU activation function. The LS functional is evaluated using numerical integration together with a suitable finite difference/volume scheme.


We show, both theoretically and numerically, that the least-squares ReLU neural network automatically approximates the discontinuous interface of the underlying problem through the free hyper-planes of the ReLU network and hence outperforms traditional mesh-based numerical methods in terms of the number of degrees of freedom. Numerical results on benchmark test problems for linear advection-reaction equations as well as nonlinear equations show that the method not only approximates the solution accurately with very few parameters, but also avoids the Gibbs phenomenon commonly observed along the discontinuous interface.
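To make the ingredients above concrete, here is a minimal Python sketch (an assumption of this summary, not code from the thesis; the names `relu_net` and `ls_functional` are hypothetical). It evaluates a least-squares functional for the linear advection equation u_t + a u_x = 0 on the unit space-time square, where the PDE residual is discretized by a first-order upwind scheme and an initial-condition penalty is added. The candidate solution is a one-hidden-layer ReLU network; note that a single ReLU neuron, u(x, t) = ReLU(x - t), represents the exact solution for a = 1 with initial data u0(x) = ReLU(x), its kink aligned with the characteristic line, which illustrates how the network's free hyper-planes can capture an interface.

```python
import numpy as np

def relu_net(params, x, t):
    """One-hidden-layer ReLU network u(x, t; theta).
    params = (W, b, c): W has shape (n, 2) (weights for x and t),
    b has shape (n,) (biases), c has shape (n,) (output weights)."""
    W, b, c = params
    z = np.maximum(W[:, 0] * x[..., None] + W[:, 1] * t[..., None] + b, 0.0)
    return z @ c

def ls_functional(params, a=1.0, nx=64, nt=64,
                  u0=lambda x: np.maximum(x, 0.0)):
    """Illustrative least-squares functional for u_t + a*u_x = 0:
    mean squared residual of the first-order upwind discretization,
    plus a penalty enforcing the initial condition u(x, 0) = u0(x).
    (A sketch of the general idea, not the thesis's exact formulation.)"""
    dx, dt = 1.0 / nx, 1.0 / nt
    x = np.linspace(dx, 1.0, nx)        # interior points; upwind uses x - dx
    t = np.linspace(0.0, 1.0 - dt, nt)  # time levels; residual uses t + dt
    X, T = np.meshgrid(x, t, indexing="ij")

    u      = relu_net(params, X, T)
    u_next = relu_net(params, X, T + dt)
    u_up   = relu_net(params, X - dx, T)
    resid = (u_next - u) / dt + a * (u - u_up) / dx  # upwind PDE residual

    ic = relu_net(params, x, np.zeros_like(x)) - u0(x)
    return np.mean(resid**2) + np.mean(ic**2)

if __name__ == "__main__":
    # One neuron encoding u(x, t) = ReLU(x - t): W = [[1, -1]], b = 0, c = 1.
    exact = (np.array([[1.0, -1.0]]), np.zeros(1), np.ones(1))
    print("LS functional at the exact solution:", ls_functional(exact))
```

In an actual solve, the network parameters would be trained by minimizing `ls_functional` with a gradient-based optimizer; the sketch above only evaluates the functional, which vanishes at the exact solution (with a = 1 and dx = dt, the upwind scheme propagates the kink exactly).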

History

Degree Type

  • Doctor of Philosophy

Department

  • Mathematics

Campus location

  • West Lafayette

Advisor/Supervisor/Committee Chair

Zhiqiang Cai

Additional Committee Member 2

Jie Shen

Additional Committee Member 3

Xiangxiong Zhang

Additional Committee Member 4

Min Liu
