A Method for Formal Verification of a Neural Network Collision Avoidance Controller


ABSTRACT

We present an approach for verifying the safety of an autonomous system controller, such as a collision avoidance controller, implemented by a neural network (NN). The approach applies to NN controllers with piecewise affine activation units, a class that includes many common NN architectures. It relies on a combination of reachability analysis and an algorithm for identifying the critical state space regions where the controller's choice of action determines future safety. We show that by evaluating the safety of the controller on this critical set, its safety can be guaranteed for a large subset of the state space. On the critical set, the controller's safety can be exhaustively checked using a Satisfiability Modulo Theories (SMT) solver to evaluate a conservative safety condition over a discretization partition. We demonstrate the proposed approach on a simplified 2D collision avoidance NN controller in an adversarial environment simulating vertical aircraft collision avoidance.
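
To illustrate the kind of per-cell check the abstract describes, the following is a minimal sketch, not the authors' implementation, of encoding a piecewise affine (ReLU) controller in the Z3 SMT solver and exhaustively checking a conservative safety condition on one cell of a state-space partition. The network weights, the cell bounds, and the "must not climb" safety condition are illustrative placeholder assumptions.

# Minimal sketch (assumed setup, not the paper's code): check one partition cell
# of a 2D state space for a tiny ReLU controller using the Z3 SMT solver.
from z3 import Real, Solver, If, And, sat

def relu(x):
    # ReLU is piecewise affine, so it maps directly to an SMT If-term.
    return If(x > 0, x, 0)

# State variables: relative altitude h and closure rate hdot (2D toy example).
h, hdot = Real("h"), Real("hdot")

# Hypothetical 2-2-1 ReLU controller; output > 0 is interpreted as "climb".
z1 = relu(1.0 * h - 0.5 * hdot + 0.1)
z2 = relu(-0.8 * h + 1.2 * hdot)
advisory = 0.7 * z1 - 0.9 * z2 + 0.05

# One cell of the discretization partition (placeholder bounds).
cell = And(h >= -100, h <= 0, hdot >= 0, hdot <= 10)

# Conservative safety condition assumed for this cell: with the intruder
# above and closing, a "climb" advisory is treated as unsafe.
unsafe = advisory > 0

s = Solver()
s.add(cell, unsafe)
if s.check() == sat:
    print("Counterexample state:", s.model())  # controller can act unsafely here
else:
    print("Cell verified: no unsafe action possible in this region.")

Repeating this query over every cell of the critical set yields the exhaustive check described above; an unsatisfiable result on every cell certifies the (conservative) safety condition there.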
 

Dr. Daniel Genin is a Senior Staff Scientist at JHU/APL. He received his
bachelor's and doctoral degrees in mathematics from the Pennsylvania State University. Daniel's current interests lie in developing and applying methods for building provably correct cyber-physical systems. At APL he has contributed to a range of projects involving verification of safety- and mission-critical systems, such as the ACAS X airborne collision avoidance system, run-time assurance for UAV geofencing, and collision avoidance for UGVs.
