Purdue University Graduate School

ANALYSIS OF LATENT SPACE REPRESENTATIONS FOR OBJECT DETECTION

thesis
posted on 2024-09-03, 12:29, authored by Ashley S. Dale

Deep Neural Networks (DNNs) successfully perform object detection tasks, and the Convolutional Neural Network (CNN) backbone is a commonly used feature extractor for secondary tasks such as detection, classification, or segmentation. In a DNN model, the relationship between the features learned from the training data and the features leveraged during test and deployment has motivated the area of feature-interpretability studies. The work presented here applies equally to white-box and black-box models and to any DNN architecture. The metrics developed require no information beyond the feature vector generated by the feature-extraction backbone. They are therefore the first methods capable of estimating black-box model robustness in terms of latent space complexity, and the first capable of examining feature representations in the latent space of black-box models.
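To make "no information beyond the feature vector" concrete, a minimal sketch of a latent-space comparison that operates only on backbone outputs — cosine similarity between two feature vectors. This is an illustrative example, not one of the thesis's metrics; the vectors and values are hypothetical.

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two latent feature vectors.

    Needs only the vectors themselves -- no weights, gradients, or
    architecture details -- so it works for black-box models.
    """
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical feature vectors produced by a backbone for two inputs.
f1 = [0.2, 0.9, 0.1, 0.4]
f2 = [0.1, 0.8, 0.2, 0.5]
print(round(cosine_similarity(f1, f2), 3))
```

Any statistic built this way (similarities, distances, manifold estimates) can be applied to a deployed model through its feature outputs alone, which is what makes the black-box analyses below possible.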

This work contributes the following four novel methodologies and results. First, a method for quantifying the invariance and/or equivariance of a model using the training data shows that the representation of a feature in the model impacts model performance. Second, a method for quantifying an observed domain gap in a dataset using the latent feature vectors of an object detection model is paired with pixel-level augmentation techniques to close the gap between real and synthetic data; this improves the model's F1 score on a test set of outliers from 0.5 to 0.9. Third, a method for visualizing and quantifying similarities between the latent manifolds of two black-box models is used to correlate similar feature representations with increased success in transferring gradient-based attacks. Finally, a method for examining the global complexity of decision boundaries in black-box models is presented, where more complex decision boundaries are shown to correlate with increased model robustness to gradient-based and random attacks.
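The second contribution, quantifying a real-versus-synthetic domain gap from latent feature vectors, can be sketched with a simple proxy: the distance between the mean latent vectors of the two domains. This is a stand-in to illustrate the idea, not the thesis's actual metric; all names and values are hypothetical.

```python
import math

def latent_mean(vectors):
    """Per-dimension mean of a set of latent feature vectors."""
    n = len(vectors)
    dim = len(vectors[0])
    return [sum(v[i] for v in vectors) / n for i in range(dim)]

def domain_gap(real_feats, synth_feats):
    """Euclidean distance between the mean latent vectors of two domains.

    A simple proxy for a domain gap: zero when the domains share a latent
    centroid, growing as the real and synthetic clusters drift apart.
    """
    mean_real = latent_mean(real_feats)
    mean_synth = latent_mean(synth_feats)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(mean_real, mean_synth)))

# Hypothetical 2-D latent vectors for real and synthetic images.
real = [[1.0, 0.0], [0.8, 0.2]]
synth = [[0.2, 0.9], [0.0, 1.1]]
print(round(domain_gap(real, synth), 3))
```

In this framing, augmentation that closes the gap should drive the statistic toward zero as the synthetic latent distribution moves onto the real one.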

Funding

SAIC subcontract # P010238559

Naval Surface Warfare Center, Broad Agency Announcement for FY18, Solicitation No. DOTC INIT018

ONR CDEW: Contract/Grant/MIPR # N00014-20-1-2674

US Department of Defense (Contract W52P1J2093009)

NSWC-CRANE N00164-24-1-1001

US Army Research Lab Congressional Add-on (CACI International W15P7T-19-D-0157)

History

Degree Type

  • Doctor of Philosophy

Department

  • Electrical and Computer Engineering

Campus location

  • Indianapolis

Advisor/Supervisor/Committee Chair

Dr. Lauren Christopher

Additional Committee Member 2

Dr. Brian King

Additional Committee Member 3

Dr. Paul Salama

Additional Committee Member 4

Dr. Maher Rizkalla