Purdue University Graduate School

A SYSTEMATIC STUDY OF SPARSE DEEP LEARNING WITH DIFFERENT PENALTIES

Thesis, posted on 2023-04-25 19:08, authored by Xinlin Tao
Deep learning has been the driving force behind many successful data science achievements. However, the deep neural network (DNN) that forms the basis of deep learning is often over-parameterized, leading to challenges in training, prediction, and interpretation. To address this issue, it is common practice to apply an appropriate penalty to each connection weight, limiting its magnitude. From a Bayesian perspective, this approach is equivalent to imposing a prior distribution on each connection weight. This project offers a systematic investigation into the selection of the penalty function or prior distribution. Specifically, under the general theoretical framework of posterior consistency, we prove that consistent sparse deep learning can be achieved with a variety of penalty functions or prior distributions. Examples include amenable regularization penalties (such as MCP and SCAD), spike-and-slab priors (such as the mixture Gaussian distribution and the mixture Laplace distribution), and polynomially decayed priors (such as the Student-t distribution). Our theory is supported by numerical results.
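To make the penalty/prior correspondence concrete, here is a minimal Python sketch evaluating two of the regularizers named above on a vector of connection weights: the MCP penalty and the negative log-density of a mixture-Gaussian (spike-and-slab) prior. The hyperparameter values (lam, gamma, p, sigma0, sigma1) are illustrative placeholders, not the settings used in the thesis.

```python
import numpy as np

def mcp_penalty(w, lam=0.1, gamma=3.0):
    """Minimax concave penalty (MCP), evaluated elementwise.

    MCP(w) = lam*|w| - w^2/(2*gamma)   if |w| <= gamma*lam
           = gamma*lam^2 / 2           otherwise
    lam and gamma are illustrative values only.
    """
    a = np.abs(w)
    return np.where(a <= gamma * lam,
                    lam * a - a**2 / (2.0 * gamma),
                    0.5 * gamma * lam**2)

def mixture_gaussian_neg_log_prior(w, p=0.01, sigma0=1e-3, sigma1=1.0):
    """Negative log-density of a spike-and-slab Gaussian mixture prior:
    w ~ (1 - p) * N(0, sigma0^2) + p * N(0, sigma1^2),
    where the narrow "spike" component shrinks most weights toward zero.
    p, sigma0, sigma1 are placeholder hyperparameters for illustration.
    """
    def normal_pdf(x, sigma):
        return np.exp(-0.5 * (x / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
    density = (1.0 - p) * normal_pdf(w, sigma0) + p * normal_pdf(w, sigma1)
    return -np.log(density)

# A penalized training objective adds the summed penalty (equivalently,
# the negative log-prior) over all connection weights to the data loss.
weights = np.array([0.0, 0.05, 0.5, 2.0])
print(mcp_penalty(weights))
print(mixture_gaussian_neg_log_prior(weights))
```

Note that minimizing the data loss plus the negative log-prior is exactly maximum a posteriori estimation, which is why each penalty choice corresponds to a prior choice in the Bayesian view described above.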

History

Degree Type

  • Doctor of Philosophy

Department

  • Statistics

Campus location

  • West Lafayette

Advisor/Supervisor/Committee Chair

Faming Liang

Additional Committee Member 2

Xiao Wang

Additional Committee Member 3

Vinayak Rao

Additional Committee Member 4

Bruno Ribeiro
