Purdue University Graduate School
UNCERTAINTY QUANTIFICATION IN MACHINE LEARNING: DEEP REINFORCEMENT LEARNING AND PHYSICS-INFORMED NEURAL NETWORKS

thesis
posted on 2025-07-27, 18:18, authored by Frank Shih
<p dir="ltr">Deep learning models have demonstrated remarkable capabilities in addressing complex real-world applications, owing to the universal approximation power of deep neural networks. However, as the architectures and training procedures of these models become more complicated, it remains a significant challenge to quantify uncertainty in a statistically rigorous manner. A common strategy is to apply Bayesian methods to deep neural networks. Yet the complexity of modern architectures and the intractability of their posterior distributions often prevent these approaches from delivering reliable uncertainty estimates.</p>
<p dir="ltr">In this dissertation, we tackle the problem of uncertainty quantification (UQ) in three modern deep learning paradigms: value-based reinforcement learning (RL), actor-critic RL, and physics-informed neural networks (PINNs). We develop novel algorithms and theoretical frameworks to enable principled and scalable UQ, improving the robustness and interpretability of deep models.</p>
<p dir="ltr">In the first part, we reformulate the Deep Q-Network (DQN) algorithm as a state-space model and introduce a Langevinized Kalman Temporal Difference (LKTD) method. This method combines Langevinized Ensemble Kalman Filtering (LEnKF) with Kalman Temporal Difference learning, yielding a sampling algorithm grounded in stochastic gradient Markov chain Monte Carlo (SGMCMC). We prove convergence to a stationary distribution under mild conditions, enabling the quantification and monitoring of uncertainty in the value function and model parameters throughout the RL training process.</p>
<p dir="ltr">The second part addresses uncertainty quantification in deep actor-critic RL, where the alternating optimization of actor and critic networks makes value estimation particularly unstable. We propose a Latent Trajectory Framework (LTF) that treats transition trajectories as latent variables, allowing for trajectory-independent training. Within this framework, we design an adaptive SGMCMC algorithm that implicitly integrates the latent dynamics during optimization. Our approach achieves both theoretical convergence and empirical gains in robustness and performance. Notably, it allows for accurate uncertainty estimation of the value function parameters.</p>
<p dir="ltr">The final part focuses on uncertainty quantification in scientific machine learning, specifically in PINNs. Existing Bayesian and dropout-based UQ methods often rely on unverifiable prior assumptions or tuning parameters. We propose a new approach based on extended fiducial inference (EFI), utilizing a narrow-neck hypernetwork to parameterize the PINN and estimate uncertainty directly from data. This method enables the construction of statistically valid confidence sets without the need for prior distributions or dropout calibration. Furthermore, we establish a general EFI framework suitable for high-dimensional, over-parameterized models, thereby enhancing both the theoretical grounding and the practical reliability of UQ in scientific deep learning.</p>
<p dir="ltr">Together, these contributions provide statistically rigorous solutions to uncertainty quantification across multiple deep learning paradigms. By grounding our methods in principles from state-space modeling, stochastic gradient MCMC, and fiducial inference, we establish a solid theoretical foundation for understanding and estimating uncertainty in complex neural architectures. This dissertation bridges the gap between modern statistical theory and deep learning practice, offering principled frameworks that enhance the interpretability, reliability, and robustness of deep models in both decision-making and scientific domains. Our work underscores the importance of integrating statistical reasoning into deep learning, paving the way for more trustworthy and theoretically grounded AI systems.</p>
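The SGMCMC machinery underlying these methods can be illustrated with stochastic gradient Langevin dynamics (SGLD), the simplest member of that algorithm family: parameters are updated with a noisy stochastic gradient step so that the iterates sample from the posterior rather than collapse to a point estimate, and the spread of the samples yields the uncertainty estimate. The sketch below is purely illustrative — a one-dimensional Gaussian toy target with hand-picked step and batch sizes — and is not the LKTD or LTF algorithm developed in the dissertation.

```python
# Minimal SGLD sketch: posterior sampling for the mean of Gaussian data
# (known unit variance, flat prior). Illustrative only.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=1000)  # toy data set
N, m = len(data), 100                             # data size, minibatch size

def grad_log_post(theta, batch):
    # Unbiased stochastic gradient of log posterior:
    # grad log p(theta | data) ~= (N/m) * sum(batch - theta)
    return (N / m) * np.sum(batch - theta)

theta, eps = 0.0, 1e-4    # initial value, step size
samples = []
for t in range(5000):
    batch = rng.choice(data, size=m)
    noise = rng.normal(scale=np.sqrt(eps))        # injected Langevin noise
    theta += 0.5 * eps * grad_log_post(theta, batch) + noise
    if t >= 1000:                                 # discard burn-in
        samples.append(theta)

post_mean = np.mean(samples)   # close to the data mean (~2.0)
post_sd = np.std(samples)      # spread = the uncertainty estimate
```

The same principle — replace a deterministic optimizer update with a properly scaled noisy one so that the training trajectory itself carries posterior uncertainty — is what lets the LKTD and adaptive SGMCMC algorithms monitor uncertainty during RL training, though their updates operate on full neural network parameters within a state-space formulation.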

History

Degree Type

  • Doctor of Philosophy

Department

  • Statistics

Campus location

  • West Lafayette

Advisor/Supervisor/Committee Chair

Faming Liang

Additional Committee Member 2

Qifan Song

Additional Committee Member 3

Antik Chakraborty

Additional Committee Member 4

Nianqiao Ju
