Run-Time Assurance Architecture for Learning-Enabled Systems
Presented as part of the 2020 HCSS conference.
There has been much publicity surrounding the use of machine learning technologies in self-driving cars and the challenges this presents for guaranteeing safety. These technologies are also being investigated for use in manned and unmanned aircraft. However, systems including "learning-enabled components" (LECs) and their software implementations are not amenable to verification and certification using current methods. This limits the functionality that can realistically be fielded, and essentially precludes use of these technologies in safety-critical aerospace applications. Our team is developing new technologies for analysis, testing, and architectural mitigation, with the goal of enabling autonomous systems containing LECs to be safely deployed in critical environments.
We have produced a demonstration of a run-time assurance architecture based on a neural network aircraft taxiing application that shows how several advanced technologies could be used to ensure safe operation. The demonstration system includes:
- Safety architecture based on the ASTM F3269-17 standard for bounded behavior of complex systems
- AADL architecture model with verification using AGREE
- Architecture-based assurance case using Resolute
- Diverse run-time monitors of system safety
- Formal synthesis of critical high-assurance components
The enhanced system demonstrates the ability of the run-time assurance architecture to maintain system safety in the presence of defects in the underlying LEC.
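The core of such a run-time assurance architecture is the pattern standardized in ASTM F3269-17: a trusted safety monitor bounds an unverified complex function (here, the neural network taxiing controller) and switches control to a simple, verified backup when the monitored state leaves a safe envelope. As a rough illustration only, the sketch below shows this switching logic in Python; the controllers, state variables, and the envelope threshold are all hypothetical stand-ins, not the actual demonstration system.

```python
# Hedged sketch of the ASTM F3269-17 run-time assurance (bounded behavior)
# pattern. All names, gains, and limits here are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class State:
    cte: float          # cross-track error from runway centerline (m)
    heading_err: float  # heading error (deg)


CTE_LIMIT = 8.0         # hypothetical safe lateral envelope (m)


def lec_controller(state: State) -> float:
    """Complex, unverified controller (stand-in for the neural network LEC)."""
    return -0.5 * state.cte - 0.1 * state.heading_err


def backup_controller(state: State) -> float:
    """Simple high-assurance controller: conservative correction to centerline."""
    return -0.2 * state.cte


def monitor_ok(state: State) -> bool:
    """Run-time monitor: is the aircraft within the safe taxiing envelope?"""
    return abs(state.cte) < CTE_LIMIT


def rta_switch(state: State) -> tuple[str, float]:
    """Pass through the LEC output only while the monitor deems the state safe;
    otherwise fall back to the verified backup controller."""
    if monitor_ok(state):
        return "lec", lec_controller(state)
    return "backup", backup_controller(state)
```

For example, a state well inside the envelope selects the LEC, while a large cross-track error (as might result from an LEC defect) triggers the switch to the backup controller. In the actual demonstration, the monitors are diverse and the switching components are formally synthesized, which this sketch does not attempt to capture.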
Darren Cofer is a Fellow in the Trusted Systems group at Collins Aerospace. He earned his PhD in Electrical and Computer Engineering from The University of Texas at Austin.
His principal area of expertise is developing and applying advanced analysis methods and tools for verification and certification of high-integrity systems. His background includes work with formal methods for system and software analysis, the design of real-time embedded systems for safety-critical applications, and the development of nuclear propulsion systems in the U.S. Navy.
He has served as principal investigator on government-sponsored research programs with NASA, NSA, AFRL, and DARPA, developing and using formal methods for verification of safety and security properties. He is currently the principal investigator for Collins teams working on DARPA's Cyber Assured Systems Engineering (CASE) and Assured Autonomy programs.
Dr. Cofer served on RTCA committee SC-205 developing new certification guidance for airborne software (DO-178C) and was one of the developers of the Formal Methods Supplement (DO-333). He is a member of the RTCA Forum for Aeronautical Software, the Aerospace Control and Guidance Systems Committee (ACGSC), and a senior member of the IEEE.