Purdue University Graduate School
Physical Perception Attacks and Defenses in Autonomous Systems

thesis
posted on 2025-06-10, 12:28, authored by Raymond Wijaya Muller

This dissertation investigates security vulnerabilities and defense mechanisms in the visual perception pipelines of autonomous systems, such as self-driving cars and surveillance robots. It first introduces ATTRACKZONE, a physical attack that manipulates object trackers by projecting subtle perturbations onto the environment, causing tracked objects to appear to move or disappear and thereby inducing potentially dangerous system behavior. As a countermeasure, the dissertation presents VOGUES, a defense framework that mimics human-like reasoning by verifying the consistency between a detected object and its constituent parts over time, thereby detecting tracker hijacking and other attacks. The research further explores availability attacks with DETSTORM, a method that disrupts the timely operation of perception systems by creating adversarial objects that inflate computational load and induce significant processing delays. Collectively, these studies advance the understanding of security risks in modern perception systems by systematically evaluating both physical-world attacks and holistic defenses, contributing new methodologies for securing autonomous technologies against emerging threats.

History

Degree Type

  • Doctor of Philosophy

Department

  • Computer Science

Campus location

  • West Lafayette

Advisor/Supervisor/Committee Chair

Z. Berkay Celik

Additional Committee Member 2

Antonio Bianchi

Additional Committee Member 3

Dongyan Xu

Additional Committee Member 4

Tianyi Zhang