Evaluating Meaningful Human Control with General Systems Performance Theory
The rapid advancement and deployment of Lethal Autonomous Weapon Systems (LAWS) present significant ethical, legal, and operational challenges, particularly regarding the concept of Meaningful Human Control (MHC). While international discussions continue to define MHC, a systematic framework for evaluating human performance in maintaining control over these autonomous systems is still lacking. This study proposes applying General Systems Performance Theory (GSPT) and Nonlinear Causal Resource Analysis (NCRA) to quantitatively assess a human operator's ability to meet the resource demands of the high-level tasks necessary for MHC. Through a structured methodology, this research will conduct a literature review of automation bias in safety-critical sectors, examine the Department of Defense's (DoD) evolving guidance on autonomous weapons, and analyze historical case studies that highlight the risks of diminished human oversight. By leveraging GSPT's hierarchical performance modeling, the study will identify the critical performance resources that enable or hinder an operator's ability to exert MHC over LAWS. The findings will provide a novel approach for assessing and enhancing operator training, system design, and policy development to ensure that human oversight remains feasible and effective in autonomous warfare.

Keywords: Artificial Intelligence, Lethal Autonomous Weapon Systems, LAWS, Meaningful Human Control, MHC, General Systems Performance Theory, GSPT, Nonlinear Causal Resource Analysis, Automation Bias.
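The core GSPT/NCRA idea described above — that task success requires every operator performance resource to meet or exceed the task's demand, with the shortest resource acting as the limiting factor — can be sketched in a few lines of Python. The resource names and numeric values below are hypothetical illustrations, not measured quantities from the study:

```python
# Illustrative sketch of a resource-sufficiency check in the spirit of
# Nonlinear Causal Resource Analysis (NCRA): a task is achievable only
# if every performance resource the operator possesses meets or exceeds
# the task's demand for that resource. All names/values are hypothetical.

def limiting_resources(availability, demand):
    """Return each resource whose availability falls short of task
    demand, mapped to the size of the shortfall."""
    return {
        r: demand[r] - availability.get(r, 0.0)
        for r in demand
        if availability.get(r, 0.0) < demand[r]
    }

# Hypothetical normalized operator capacities and task demands (0..1).
operator = {"visual_attention": 0.8, "reaction_speed": 0.6, "working_memory": 0.9}
task     = {"visual_attention": 0.7, "reaction_speed": 0.75, "working_memory": 0.5}

shortfalls = limiting_resources(operator, task)
# "reaction_speed" is the limiting resource here: demand 0.75 exceeds
# the operator's available 0.6, so MHC over this task is not assured.
```

In an NCRA-style analysis the demands themselves would be estimated empirically from performance data rather than assumed, but the sufficiency test above captures the framework's limiting-resource logic.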
Degree Type
- Doctor of Technology
Department
- Technology
Campus location
- West Lafayette