Purdue University Graduate School


Embargo: 2 month(s), 27 day(s) until file(s) become available. Reason: the work is in the process of publication.

NUMERICAL METHOD BASED NEURAL NETWORK AND ITS APPLICATION IN SCIENTIFIC COMPUTING, OPERATOR LEARNING AND OPTIMIZATION PROBLEM

thesis
posted on 2022-07-22, 17:52 authored by Jiahao Zhang

In this work, we develop several specialized neural-network architectures that build on existing approaches such as the auto-encoder and DeepONet. Combined with classical numerical methods from scientific computing, namely finite differences and the scalar auxiliary variable (SAV) method, our models solve operator-learning tasks for partial differential equations accurately in both data-driven and non-data-driven settings. In the usual setting of neural-network training, high-dimensional problems require a large number of training samples. The proposed model, equipped with an auto-encoder, reduces the dimension of the input operator and discovers its intrinsic hidden features, thereby reducing the number of samples needed for training. In addition, nonlinear bases of the hidden variables are constructed for both the operator variable and the solution of the equation, leading to a concise representation of the solution. In the non-data-driven setting, our method derives the solution of the equation from only the initial and boundary conditions, which a standard network cannot do, with the assistance of the SAV method. It also preserves the advantages of DeepONet: it performs operator learning across various initial conditions and parametric equations. A modified energy is defined to estimate the true energy of the system; it decreases monotonically and serves as an indicator of a suitable time step, allowing the model to adapt the step size. Finally, optimization is a key procedure of network training. We propose a new optimization method based on SAV. It permits a much larger learning rate than SGD and Adam, the most popular methods in use today, and it supports an adaptive learning rate for faster convergence to a critical point.
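The dimension-reduction idea described above can be illustrated with a toy sketch: discretized input functions that live in a high-dimensional grid space but have only a few intrinsic degrees of freedom are compressed by an auto-encoder into a low-dimensional latent code. This is not the thesis's architecture; the one-hidden-layer network, the toy sine data, and all names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: samples of u(x) = a*sin(pi*x) + b*sin(2*pi*x) on m grid points.
# Each sample lives in R^m, but the data has only two intrinsic degrees of
# freedom (a, b) -- the "hidden features" the encoder should discover.
m, n_samples, latent = 64, 256, 2
x = np.linspace(0.0, 1.0, m)
coeffs = rng.uniform(-1.0, 1.0, size=(n_samples, 2))
U = coeffs[:, :1] * np.sin(np.pi * x) + coeffs[:, 1:] * np.sin(2 * np.pi * x)

# Minimal auto-encoder: tanh encoder R^m -> R^latent, linear decoder back.
W_e = rng.normal(scale=0.1, size=(m, latent))
W_d = rng.normal(scale=0.1, size=(latent, m))

def forward(U):
    z = np.tanh(U @ W_e)      # low-dimensional latent code
    return z, z @ W_d         # reconstruction from the code

lr, losses = 0.05, []
for _ in range(2000):
    z, U_hat = forward(U)
    err = U_hat - U
    losses.append(np.mean(err ** 2))
    # gradient descent on the mean-squared reconstruction error
    g_Wd = z.T @ err / n_samples
    g_z = (err @ W_d.T) * (1.0 - z ** 2)   # tanh'(s) = 1 - tanh(s)^2
    g_We = U.T @ g_z / n_samples
    W_d -= lr * g_Wd
    W_e -= lr * g_We
```

After training, the 2-dimensional code `z` is a concise stand-in for the 64-dimensional samples, which is the mechanism by which fewer training samples can suffice.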
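The SAV-based optimization described above can be sketched as follows: training is viewed as the gradient flow θ' = -∇L(θ), and a scalar auxiliary variable r ≈ √(L + C) is introduced so that the modified energy r² decreases monotonically for any step size, which is what permits learning rates far larger than SGD or Adam would tolerate. This is a minimal sketch under that standard SAV construction, not the thesis's exact scheme; the function names and the quadratic test problem are illustrative.

```python
import numpy as np

def sav_optimize(loss, grad, theta0, lr=1.0, C=1.0, steps=100):
    """SAV-style gradient-flow optimizer (illustrative sketch).

    Discretizes  theta' = -(r / sqrt(L + C)) * grad L,
                 r'     =  grad L . theta' / (2 sqrt(L + C)),
    updating r first in closed form, then theta explicitly."""
    theta = np.asarray(theta0, dtype=float)
    r = np.sqrt(loss(theta) + C)   # auxiliary variable tracks the energy
    history = []
    for _ in range(steps):
        g = grad(theta)
        denom = loss(theta) + C
        # closed-form r update: divisor >= 1, so r (and the modified
        # energy r^2) decreases monotonically for ANY learning rate lr
        r = r / (1.0 + lr * np.dot(g, g) / (2.0 * denom))
        # parameter step, scaled by the auxiliary variable
        theta = theta - lr * (r / np.sqrt(denom)) * g
        history.append(loss(theta))
    return theta, history

# usage: a simple quadratic stays stable even with lr = 1.0
L = lambda v: 0.5 * np.dot(v, v)
dL = lambda v: v
theta, hist = sav_optimize(L, dL, theta0=[3.0, -4.0], lr=1.0)
```

Because the r update has a divisor of at least one, the stability of the scheme never depends on `lr`; an adaptive learning rate can then be layered on top purely to accelerate convergence, as the abstract describes.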

History

Degree Type

  • Doctor of Philosophy

Department

  • Mathematics

Campus location

  • West Lafayette

Advisor/Supervisor/Committee Chair

Guang Lin

Additional Committee Member 2

Jie Shen

Additional Committee Member 3

Aaron Nung Kwan Yip

Additional Committee Member 4

Xiangxiong Zhang
