Purdue University Graduate School

Advances in Numerical Methods for Partial Differential Equations and Optimization

Thesis, posted on 2024-07-10, 19:17, authored by Xinyu Liu

This thesis presents advances in numerical methods for partial differential equations (PDEs) and optimization problems, with a focus on improving efficiency, stability, and accuracy across a range of applications. We begin with 3D Poisson-type equations, developing a GPU-accelerated spectral-element method that exploits the tensor-product structure of the discretization to achieve extremely fast performance: problems with over one billion degrees of freedom can be solved in less than one second on modern GPUs, with applications to the Schrödinger and Cahn-Hilliard equations demonstrated. Next, we focus on parabolic PDEs, specifically the Cahn-Hilliard equation with dynamical boundary conditions. We propose an efficient, energy-stable numerical scheme within a unified framework that handles both Allen-Cahn and Cahn-Hilliard type boundary conditions; the scheme employs a scalar auxiliary variable (SAV) approach and is linear, second-order accurate, and unconditionally energy stable. Shifting to a machine learning perspective on PDEs, we introduce an unsupervised learning-based numerical method for elliptic problems, in which deep neural networks approximate the PDE solution and least-squares functionals, in particular first-order system least-squares formulations, serve as loss functions. In the realm of optimization, we present an efficient and robust SAV-based algorithm for discrete gradient systems; it modifies the standard SAV approach and incorporates relaxation and adaptive strategies to achieve fast convergence for minimization problems while maintaining unconditional energy stability. Finally, we address optimization in machine learning by developing a structure-guided Gauss-Newton method for shallow ReLU neural network optimization, which exploits both the least-squares structure and the neural network structure to build an efficient iterative solver, demonstrating superior performance on challenging function approximation problems. Throughout the thesis, we provide theoretical analysis, efficient numerical implementations, and extensive computational experiments to validate the proposed methods.
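As a rough illustration of the tensor-product idea behind fast 3D Poisson-type solves, the sketch below uses the classical matrix-diagonalization approach in NumPy. It is not the thesis implementation: simple second-order finite-difference stiffness and mass matrices stand in for the spectral-element matrices, homogeneous Dirichlet boundary conditions and the grid size n are arbitrary choices, and the variable names are hypothetical. The point is only that, once the 1D generalized eigenproblem is solved, the 3D solve reduces to tensor contractions along each direction plus a pointwise division.

```python
# Minimal sketch (not the thesis code) of a tensor-product
# "matrix diagonalization" Poisson solve on a cube.
# Assumption: finite-difference 1D matrices stand in for the
# spectral-element stiffness/mass matrices used in the thesis.
import numpy as np
from scipy.linalg import eigh

n = 32                        # interior points per direction (arbitrary)
h = 1.0 / (n + 1)
x = np.linspace(h, 1 - h, n)  # interior grid points

# 1D stiffness (-d^2/dx^2) and mass matrices, homogeneous Dirichlet BCs
S = (np.diag(2.0 * np.ones(n))
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2
M = np.eye(n)                 # trivial mass matrix for the FD stand-in

# Generalized eigen-decomposition: S V = M V diag(lam), with V^T M V = I
lam, V = eigh(S, M)

# Right-hand side f = 3*pi^2 * sin(pi x) sin(pi y) sin(pi z)
X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
f = 3 * np.pi**2 * np.sin(np.pi * X) * np.sin(np.pi * Y) * np.sin(np.pi * Z)

# Forward transform along each direction: fhat = (V^T x V^T x V^T) f
fhat = np.einsum("ai,bj,ck,abc->ijk", V, V, V, f, optimize=True)

# Pointwise division by the separable eigenvalue sum, then transform back
denom = lam[:, None, None] + lam[None, :, None] + lam[None, None, :]
uhat = fhat / denom
u = np.einsum("ai,bj,ck,ijk->abc", V, V, V, uhat, optimize=True)

# Compare against the exact solution sin(pi x) sin(pi y) sin(pi z)
u_exact = np.sin(np.pi * X) * np.sin(np.pi * Y) * np.sin(np.pi * Z)
print("max error:", np.abs(u - u_exact).max())
```

The directional contractions above are just dense matrix-matrix products applied along each axis, which is the kind of operation that maps well onto GPUs; the thesis develops this idea for spectral-element discretizations rather than the finite-difference stand-in shown here.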

History

Degree Type

  • Doctor of Philosophy

Department

  • Mathematics

Campus location

  • West Lafayette

Advisor/Supervisor/Committee Chair

Jie Shen

Additional Committee Member 2

Zhiqiang Cai

Additional Committee Member 3

Jianlin Xia

Additional Committee Member 4

Xiangxiong Zhang