Purdue University Graduate School

Faster Matrix Algorithms via Randomized Sketching & Preconditioning

Thesis posted on 2021-12-08, 21:34, authored by Agniva Chowdhury
In statistics and machine learning, Randomized Numerical Linear Algebra (RandNLA) has recently emerged not only as a vital tool for designing fast and efficient algorithms, but also as a sound statistical foundation for modern large-scale data analysis. In this dissertation, we study the application of matrix sketching and sampling algorithms to four problems in statistics and machine learning:

1. Ridge regression: We consider ridge regression, a variant of regularized least-squares regression that is particularly suitable in settings where the number of predictor variables greatly exceeds the number of observations. We present a simple, iterative, sketching-based algorithm for ridge regression that guarantees high-quality approximations to the optimal solution vector. Our analysis builds upon two simple structural results that boil down to randomized matrix multiplication, a fundamental and well-understood primitive of randomized linear algebra. An important contribution of our work is the analysis of the behavior of sub-sampled ridge regression problems when ridge leverage scores and random projections are used: we prove that accurate approximations can be achieved by a sample whose size depends on the degrees of freedom of the ridge regression problem rather than the dimensions of the design matrix.
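To convey the flavor of such sketching-based iterative schemes, the following Python snippet is a minimal, illustrative sketch rather than the dissertation's exact algorithm: it solves an under-constrained ridge problem by iterative refinement, with a preconditioner built from a Gaussian sketch of the design matrix. The function name, sketch size, and iteration count are arbitrary choices made for illustration.

    import numpy as np

    def sketched_ridge(A, b, lam, sketch_size, n_iter=20, seed=0):
        """Illustrative sketch-based iterative solver for ridge regression
        in the n << d regime; a simplified stand-in, not the exact
        algorithm analyzed in the dissertation."""
        rng = np.random.default_rng(seed)
        n, d = A.shape
        # Gaussian sketch of the wide design matrix; the analysis also
        # covers ridge leverage-score sampling and other constructions.
        S = rng.standard_normal((d, sketch_size)) / np.sqrt(sketch_size)
        AS = A @ S
        M = AS @ AS.T + lam * np.eye(n)   # sketched surrogate for A A^T + lam I
        y = np.zeros(n)
        for _ in range(n_iter):
            # Preconditioned Richardson refinement on (A A^T + lam I) y = b.
            r = b - A @ (A.T @ y) - lam * y
            y += np.linalg.solve(M, r)
        return A.T @ y                    # dual-to-primal map: x = A^T y

    # Toy usage: 50 observations, 2,000 predictors.
    rng = np.random.default_rng(1)
    A, b = rng.standard_normal((50, 2000)), rng.standard_normal(50)
    x = sketched_ridge(A, b, lam=1.0, sketch_size=400)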

2. Fisher discriminant analysis: We develop faster algorithms for regularized Fisher discriminant analysis (RFDA), a widely used method for classification and dimensionality reduction. More precisely, we present an iterative algorithm for massively under-constrained RFDA based on randomized matrix sketching. Our algorithm comes with provable accuracy guarantees when compared to the conventional approach. We analyze the behavior of RFDA when leverage scores, ridge leverage scores, and random projection-based constructions are used for the dimensionality reduction, and prove that accurate approximations can be achieved by a sample whose size depends on the effective degrees of freedom of the RFDA problem. Our results yield significant improvements over existing approaches, and our empirical evaluations support our theoretical analyses.
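As a concrete illustration of the sampling primitive named above, the following minimal Python sketch computes ridge leverage scores for the columns of a wide data matrix and draws a rescaled column sample. This is a simplified stand-in under illustrative parameter choices, not the dissertation's exact construction.

    import numpy as np

    def ridge_leverage_scores(X, lam):
        # Ridge leverage score of column j: x_j^T (X X^T + lam I)^{-1} x_j,
        # for a wide matrix X (n x d with n << d).
        K_inv = np.linalg.inv(X @ X.T + lam * np.eye(X.shape[0]))
        return np.einsum('ij,ik,kj->j', X, K_inv, X)

    def sample_columns(X, lam, s, seed=0):
        # Draw s columns with probabilities proportional to the ridge
        # leverage scores, rescaled so the sketch is unbiased.
        rng = np.random.default_rng(seed)
        ell = ridge_leverage_scores(X, lam)
        p = ell / ell.sum()
        idx = rng.choice(X.shape[1], size=s, p=p)
        return X[:, idx] / np.sqrt(s * p[idx])

The under-constrained RFDA problem would then be solved on the resulting n x s sketch instead of the full n x d data matrix.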

3. Linear programming: We further extend the RandNLA framework to analyze and speed up linear programming (LP), an extremely useful tool that has been successfully applied to a wide variety of problems, including machine learning applications such as L1-regularized SVMs, basis pursuit, and nonnegative matrix factorization. Interior Point Methods (IPMs) are among the most popular methods for solving LPs, both in theory and in practice, and their complexity is dominated by the cost of solving a system of linear equations at each iteration. Using tools from randomized linear algebra, we present a preconditioning technique that, when combined with iterative solvers such as Conjugate Gradient or Chebyshev Iteration, provably guarantees that IPM algorithms (suitably modified to account for the error incurred by the approximate solver) converge to a feasible, approximately optimal solution without increasing their iteration complexity.
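The core linear-algebraic step can be sketched as follows: a minimal illustration using SciPy's conjugate gradient, where AD stands for the scaled constraint matrix A D at the current IPM iterate; the sketch size and small regularization constant are illustrative choices, not the dissertation's exact preconditioner.

    import numpy as np
    from scipy.sparse.linalg import cg, LinearOperator

    def sketch_preconditioned_cg(AD, r, sketch_size, seed=0):
        # Solve the IPM normal equations (A D)(A D)^T y = r with CG,
        # preconditioned by the Gram matrix of a sketch of A D.
        rng = np.random.default_rng(seed)
        m, n = AD.shape                          # m constraints, n variables
        S = rng.standard_normal((n, sketch_size)) / np.sqrt(sketch_size)
        W = AD @ S                               # m x s sketch of A D
        L = np.linalg.cholesky(W @ W.T + 1e-10 * np.eye(m))
        precond = LinearOperator(
            (m, m),
            matvec=lambda v: np.linalg.solve(L.T, np.linalg.solve(L, v)),
        )
        normal_eq = LinearOperator((m, m), matvec=lambda v: AD @ (AD.T @ v))
        y, info = cg(normal_eq, r, M=precond, atol=1e-10)
        return y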

4. Cost-preserving projections: In the context of constrained low-rank approximation problems, we study the notion of projection-cost preservation, which guarantees that the cost of any rank-k projection is approximately preserved by a smaller sketch of the original data matrix. Projection-cost preservation has recently emerged as a fundamental principle in the design and analysis of sketching-based algorithms for common matrix operations that are critical in data mining and machine learning. In this work, we present a general structural result outlining four sufficient conditions for achieving projection-cost preservation. These conditions boil down to sketching-based matrix multiplication and can be satisfied using tools from randomized linear algebra. Our work unifies and generalizes prior constructions of projection-cost preserving sketches based on standard leverage scores, ridge leverage scores, and other constructions.
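A quick numerical illustration of the idea, using a plain Gaussian column sketch with arbitrary dimensions (one of the simplest constructions, offered here only as a hedged demonstration of the principle):

    import numpy as np

    rng = np.random.default_rng(2)
    n, d, k, s = 100, 3000, 5, 600

    A = rng.standard_normal((n, d))
    S = rng.standard_normal((d, s)) / np.sqrt(s)   # Gaussian column sketch
    A_sk = A @ S

    # The cost of any rank-k projection P, ||A - P A||_F^2, should be
    # approximately preserved when A is replaced by its sketch A_sk.
    for _ in range(3):
        Q, _ = np.linalg.qr(rng.standard_normal((n, k)))
        P = Q @ Q.T                                # random rank-k projector
        cost_full = np.linalg.norm(A - P @ A, 'fro') ** 2
        cost_sketch = np.linalg.norm(A_sk - P @ A_sk, 'fro') ** 2
        print(f'full: {cost_full:.1f}   sketch: {cost_sketch:.1f}')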

Funding

BIGDATA: F: DKA: Collaborative Research: Randomized Numerical Linear Algebra (RandNLA) for multi-linear and non-linear data

Directorate for Computer & Information Science & Engineering

III: Small: Fast and Efficient Algorithms for Matrix Decompositions and Applications to Human Genetics

Directorate for Computer & Information Science & Engineering

CCF-BSF: AF: Small: Collaborative Research: Practice-Friendly Theory and Algorithms for Linear Regression Problems

Directorate for Computer & Information Science & Engineering

FRG: Collaborative Research: Randomization as a Resource for Rapid Prototyping

Directorate for Mathematical & Physical Sciences

History

Degree Type

  • Doctor of Philosophy

Department

  • Statistics

Campus location

  • West Lafayette

Advisor/Supervisor/Committee Chair

Hao Zhang

Advisor/Supervisor/Committee Co-chair

Petros Drineas

Additional Committee Member 2

Guang Cheng

Additional Committee Member 3

Anindya Bhadra