Purdue University Graduate School

DIFFERENTIALLY PRIVATE SUBLINEAR ALGORITHMS

thesis
posted on 2023-06-07, 18:54, authored by Tamalika Mukherjee

Collecting user data is crucial for advancing machine learning, social science, and government policy, but the privacy of the users whose data is collected is a growing concern. Differential Privacy (DP) has emerged as the de facto standard notion of privacy protection, with robust mathematical guarantees. Analyzing such massive amounts of data in a privacy-preserving manner motivates the study of differentially private algorithms that are also highly efficient, in particular, algorithms that run in sublinear time or use sublinear space.


This thesis initiates a systematic study of differentially private sublinear-time and sublinear-space algorithms. The contributions of this thesis are two-fold. First, we design some of the first differentially private sublinear algorithms for many fundamental problems. Second, we develop general techniques for designing differentially private sublinear algorithms.


We give the first DP sublinear algorithm for clustering by generalizing a subsampling framework from the non-DP sublinear-time literature. We give the first DP sublinear algorithm for estimating the size of a maximum matching. Our DP sublinear algorithm for estimating the average degree of a graph achieves a better approximation guarantee than prior work. We give the first DP algorithm for releasing $L_2$-heavy hitters in the sliding window model, as well as a pure DP $L_1$-heavy hitters algorithm in the same model that improves upon prior work.
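
To make the subsample-then-privatize pattern behind these results concrete, the following is a minimal illustrative sketch, not the algorithm from this thesis: it estimates the average degree of a graph from a small uniform vertex sample and then adds Laplace noise. The naive estimator and the crude sensitivity bound shown here do not give the approximation guarantees claimed above; in particular, treating the random sample as fixed when bounding sensitivity is exactly the kind of gap that motivates the Coupled Global Sensitivity notion discussed below.

    # Illustrative sketch only (assumptions: dict-of-sets adjacency,
    # edge-level neighboring graphs, non-empty graph, and a fixed sample
    # when bounding sensitivity).
    import random

    def dp_average_degree_estimate(adjacency, sample_size, epsilon):
        """Crude sublinear, noise-added estimate of the average degree.

        adjacency: dict mapping each vertex to the set of its neighbors.
        """
        vertices = list(adjacency)
        sample = random.sample(vertices, min(sample_size, len(vertices)))

        # Non-private sublinear estimate: average degree over the sample.
        estimate = sum(len(adjacency[v]) for v in sample) / len(sample)

        # Adding or removing one edge changes the degrees of its two
        # endpoints by one each, so (for a fixed sample) the sampled
        # average moves by at most 2 / |sample|.
        sensitivity = 2.0 / len(sample)

        # Laplace(0, sensitivity / epsilon) noise, sampled as the scaled
        # difference of two independent Exp(1) variables.
        noise = (random.expovariate(1.0) - random.expovariate(1.0)) * sensitivity / epsilon
        return estimate + noise

For example, dp_average_degree_estimate({0: {1}, 1: {0, 2}, 2: {1}}, sample_size=2, epsilon=1.0) returns a noisy estimate while reading only the sampled adjacency lists.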


We develop general techniques that address the challenges of designing sublinear DP algorithms. First, we introduce the notion of Coupled Global Sensitivity (CGS). Intuitively, the CGS of a randomized algorithm generalizes the classical notion of global sensitivity of a function by considering a coupling of the random coins of the algorithm when it is run on neighboring inputs. We show that one can achieve pure DP by adding Laplace noise proportional to the CGS of an algorithm. Second, we give a black-box DP transformation for a specific class of approximation algorithms. We show that such algorithms can be made differentially private without sacrificing accuracy, as long as the function being approximated has small global sensitivity. In particular, this transformation yields sublinear DP algorithms for many problems, including triangle counting, estimating the weight of a minimum spanning tree, and norm estimation.
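
As a minimal sketch of the noise-addition step, assuming the analyst has already derived a sensitivity bound (either the classical global sensitivity of a function, or a CGS bound for a randomized algorithm), the following adds Laplace noise scaled to that bound over the privacy parameter epsilon to release a single numeric output with pure DP. The function names and the usage example are hypothetical.

    import random

    def laplace_noise(scale):
        """Sample Laplace(0, scale): the difference of two independent
        Exp(1) variables is Laplace(0, 1), then scale it."""
        return scale * (random.expovariate(1.0) - random.expovariate(1.0))

    def privatize(value, sensitivity_bound, epsilon):
        """Pure epsilon-DP release of a single numeric output.

        sensitivity_bound: an upper bound on how much the output can change
        between neighboring inputs -- global sensitivity for a deterministic
        function, or a Coupled Global Sensitivity bound (under a suitable
        coupling of the random coins) for a randomized approximation
        algorithm. Deriving this bound is the problem-specific part and is
        assumed here.
        """
        return value + laplace_noise(sensitivity_bound / epsilon)

    # Hypothetical usage: privatize the output of some approximation algorithm.
    # approx = my_approximation_algorithm(data)   # assumed, not defined here
    # release = privatize(approx, sensitivity_bound=1.0, epsilon=0.5)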

Funding

Bilsland Dissertation Fellowship

History

Degree Type

  • Doctor of Philosophy

Department

  • Computer Science

Campus location

  • West Lafayette

Advisor/Supervisor/Committee Chair

Jeremiah Blocki

Advisor/Supervisor/Committee co-chair

Elena Grigorescu

Additional Committee Member 2

Mikhail Atallah

Additional Committee Member 3

Christopher Clifton

Additional Committee Member 4

Ninghui Li
