Modern product development workflows rely on integrated tools for requirements specification, system design, implementation, verification, and overall lifecycle management. The broad applicability, interoperability, and automated nature of these tools make them appealing for rapid development cycles, but they can also result in the loss of process or workflow transparency if their use is not well understood. This effect will be further exacerbated by the adoption of generative AI development assistants. In the context of high-assurance systems development, such loss of transparency can lead to violation of critical system properties and catastrophic failure. Remote attestation is an established technique for gathering, appraising, and reporting evidence of system state at boot and run time. In this work we propose lifecycle attestation as a new attestation modality that targets design-time integrity of artifacts in model-based software development workflows. Lifecycle attestation takes measurements over a diverse collection of artifacts, including system models and requirements, build pipeline tools and their dependencies, tool outputs (test results, code contracts), and system executables. A primary contribution of this work is to develop complementary theories of Appraisal, Sufficiency, and Artifact Repair that share a common formal semantics for attestation evidence. Using formally verified attestation tools allows us to ground and bootstrap our statements of artifact integrity, independent of the hallucinatory environment of LLM-assisted development and repair. To develop and validate these capabilities, we outline two proof-of-concept implementations. The first is a high-assurance build pipeline that leverages HAMR code generation and Verus contract checking to produce Rust executables that conform to formal properties specified by input SysMLv2 models. The second is a framework we call HEAL that supports LLM-guided repair and appraisal of standalone Rust/Verus source files. We conclude with initial steps toward security and performance evaluation, including a candidate threat model and profiling of attestation overhead. Finally, we show the integration of lifecycle attestation capabilities into the ongoing DARPA PROVERS effort as part of its open platform demonstration system and assurance dashboard.
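
To make the "Verus contract checking" step of the pipeline concrete, the following is a minimal, self-contained sketch (a hypothetical example, not drawn from the actual PROVERS artifacts): a Rust function annotated with a `requires`/`ensures` contract that Verus verifies statically before the executable is built.

```rust
// Hypothetical illustration of a Verus-checked Rust function; the
// function name and property are invented for this sketch.
use vstd::prelude::*;

verus! {

// Precondition: the mathematical sum fits in a u32 (spec arithmetic
// is unbounded, so `a + b` in the contract cannot overflow).
// Postcondition: the returned value equals that sum.
fn checked_add(a: u32, b: u32) -> (sum: u32)
    requires
        a + b <= u32::MAX,
    ensures
        sum == a + b,
{
    a + b
}

fn main() {
    let s = checked_add(2, 40);
    assert(s == 42); // discharged statically by the verifier, not at run time
}

} // verus!
```

In the pipeline outlined above, such contracts would originate from properties specified in the input SysMLv2 model and be carried into generated code by HAMR, and the verification result would itself become one of the tool outputs measured by lifecycle attestation.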
Adam Petz is a Research Scientist in Computer Science at the University of Kansas and the Institute for Information Sciences (I2S). His research interests lie at the intersection of formal methods, programming language semantics, trusted computing, and systems security. During his time as a student and professional researcher, he has contributed to a number of successful federally funded research efforts, including the DARPA Cyber Assured Systems Engineering (CASE) program, the NSA Science of Security Lablet, and, currently, DARPA PROVERS. Adam completed his PhD in Computer Science at the University of Kansas in June 2022, and prior to that earned two undergraduate degrees in CS and Mathematics at Emporia State University.