Machine Learning and the Unknown Unknowns
Abstract: One of the important certification objectives for airborne software is demonstrating the absence of unintended behavior. In current software development processes, unintended behavior is associated with some identifiable structural feature, such as specific lines of code or a model element. However, in machine learning approaches to system development, unintended behavior may emerge from the data used to train the system. New inputs not encountered during training may result in novel activations in a neural network, leading to unexpected (and potentially dangerous) outputs. In this talk we will review the rationale and methods for detecting unintended behavior in current airborne software systems, including the use of model-based development techniques and formal methods for software verification. We will then consider the challenges posed by machine learning and examine new techniques being developed to address them, as well as how these techniques may shape new certification guidance. We will also present results from a recent flight demonstration in which run-time assurance techniques were used to guarantee the absence of unintended behavior in a neural network-based aircraft collision avoidance system.
Darren Cofer is a Fellow in the Trusted Systems group at Collins Aerospace. He earned his PhD in Electrical and Computer Engineering from The University of Texas at Austin.
His principal area of expertise is developing and applying advanced analysis methods and tools for verification and certification of high-integrity systems. His background includes work with formal methods for system and software analysis, the design of real-time embedded systems for safety-critical applications, and the development of nuclear propulsion systems in the U.S. Navy.
He has served as principal investigator on government-sponsored research programs with NASA, NSA, AFRL, and DARPA, developing and using formal methods for verification of safety and security properties. He is currently the principal investigator for Collins teams working on DARPA's Cyber Assured Systems Engineering (CASE) and Assured Autonomy programs.
Dr. Cofer served on RTCA committee SC-205 developing new certification guidance for airborne software (DO-178C) and was one of the developers of the Formal Methods Supplement (DO-333). He is a member of SAE committee G-34 on Artificial Intelligence in Aviation, the Aerospace Control and Guidance Systems Committee (ACGSC), and a senior member of the IEEE.