Developing Security Metrics


The presentation from the NC State Lablet and its collaborators on security metrics offered an overview and a look at three research projects. The first talk described the overall security metrics project; the specific projects that followed covered vulnerability and resilience prediction metrics and models, attack surface metrics, and the use of stack traces to approximate attack surfaces.

Developing Security Metrics (Overview)

Andy Meneely, Rochester Institute of Technology; Laurie Williams, Mladen Vouk, Huaiyu Dai, North Carolina State University

Metrics are an essential part of the sciences. Scientists use measurements of many kinds and have been at the forefront of developing new measuring tools and standards for a variety of scientific inquiries. Adding sound metrics to SoS is an important part of its development as a true science.

This project is intended to enable better allocation of resources for engineering secure software by contributing evidence-based knowledge, systematizing research on Intrusion Detection Systems (IDS), analyzing the vulnerability-proneness and resilience of an overall system, and measuring the attack surface to assess risk. The NCSU Lablet’s approach is to systematize this knowledge through metrics for IDS evaluation and to use those metrics internally. They seek to systematize IDS knowledge by classifying approaches and methods, evaluating and benchmarking performance, and identifying and measuring inherent limitations. To date, they have conducted a systematic literature review, collected over 300 papers, and classified and narrowed them down.

Vulnerability and Resilience Prediction Metrics and Models

Mladen A. Vouk, Laurie Williams, Anoosha Vangaveeti, Da Young Lee, Shweta Subramani, NC State University

The goal of the Vulnerability and Resilience Prediction Metrics and Models Project is to develop a science-based understanding of which security metrics can be used to accurately predict a system's field resilience and vulnerability-proneness. The hypotheses tested are: (1) measurable properties of a system and its associated software development processes are indicative of the presence of vulnerabilities in released software, and (2) statistical models based upon current (classical) reliability and availability prediction models and attack profiles can accurately predict the resilience of a system.

Their preliminary results indicate that the “steady-state” security-problem discovery rate in the field for Fedora and Windows is on the order of a few per week (a rate in the 10e-5 to 10e-7 range per in-service week). A large fraction (in the 30+% range) of the problems reported weekly for stable field versions of Fedora and Windows are security problems. A very large fraction (65% and above) of the security problems detected in the field for open-source Fedora (across many different releases) belong to the epistemic category (flawed process, knowledge, model, etc.). Classical reliability models appear to describe and predict field discovery of security problems for open-source Fedora well.

They add that there are two implications. First, once software and its operational profile stabilize, the remaining anomalies are the result of sampling low- to very-low-probability input vectors. Second, if a “white list” filter “closes” at that point, the software may become “immune” to further attacks (at the expense of some functional loss). A rate of 10e-5 to 10e-7 security problems per in-service week may be the best we can do given current OTS software development processes and usage patterns.
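As a hedged illustration (not the project's actual model or data), classical software reliability growth models of the kind referenced above, such as the Goel-Okumoto model mu(t) = a * (1 - exp(-b*t)), can be fitted to cumulative security-problem discovery counts. A minimal pure-Python fit on synthetic data might look like:

```python
import math

def goel_okumoto(t, a, b):
    """Expected cumulative discoveries by time t: mu(t) = a * (1 - exp(-b*t))."""
    return a * (1.0 - math.exp(-b * t))

# Synthetic weekly cumulative discovery counts (hypothetical, not project data).
weeks = list(range(1, 31))
true_a, true_b = 50.0, 0.1
observed = [goel_okumoto(t, true_a, true_b) for t in weeks]

def fit(weeks, observed):
    """Crude grid search minimizing the sum of squared errors."""
    best = None
    for a in [20.0 + i for i in range(81)]:          # a in [20, 100]
        for b in [0.01 * j for j in range(1, 51)]:   # b in [0.01, 0.5]
            sse = sum((goel_okumoto(t, a, b) - y) ** 2
                      for t, y in zip(weeks, observed))
            if best is None or sse < best[0]:
                best = (sse, a, b)
    return best[1], best[2]

a_hat, b_hat = fit(weeks, observed)
print(a_hat, round(b_hat, 2))  # recovers values near the true (a, b)
```

In practice one would fit such a model to real discovery counts (e.g., weekly security bug reports) and use the fitted curve to predict the residual discovery rate in the field.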

Their current conclusion is that progress is being made toward a sound, science-based understanding of which security metrics of a system can be used to accurately predict its field resilience and vulnerability-proneness.

Future hypotheses to be tested are:

  • Measurable properties of a system and associated software development processes are indicative of the presence of vulnerabilities in released software.
  • Statistical models based upon current (classical) reliability and availability prediction models and attack profiles can accurately predict the resilience of a system.

Attack Surface Metrics

Laurie Williams, Andy Meneely, Christopher Theisen, Nuthan Munaiah, NC State University

The goal of the attack surface metrics project is to assess the risk of a software system by way of its input and output space. To do this, the team will measure how the attack surface evolves over time, on the assumption that more inputs and entry points, and more outputs and exit points, mean more risk. The objective is to provide an early-alert system for developers.

Describing the approaches used so far, they enumerate entry/exit points as functions that call input/output functions and measure ease of attack based on configurations. This method will allow them to address three research questions:

  • Do vulnerabilities reside near the attack surface historically?
  • Do severe vulnerabilities appear in areas of high reachability?
  • Does applying asset weights and designed defenses improve our measurements?
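The entry/exit point enumeration described above can be sketched as a simple source scan. This is a hedged, minimal sketch, not the project's actual tooling: the lists of input/output API names, the regexes, and the C-like demo code are all assumptions for illustration.

```python
import re

# Hypothetical input/output API lists (an assumption, not the project's list).
INPUT_FUNCS = {"recv", "read", "fgets", "scanf", "getenv"}
OUTPUT_FUNCS = {"send", "write", "printf", "fputs"}

# Naive patterns for C-like function definitions and call sites.
FUNC_DEF = re.compile(r"^\w[\w\s\*]*\s(\w+)\s*\([^)]*\)\s*\{", re.M)
CALL = re.compile(r"\b(\w+)\s*\(")

def classify_functions(source):
    """Mark each function as an entry point (calls input APIs) and/or
    an exit point (calls output APIs)."""
    results = {}
    defs = list(FUNC_DEF.finditer(source))
    for i, m in enumerate(defs):
        # Function body: everything up to the next definition (naive split).
        body_end = defs[i + 1].start() if i + 1 < len(defs) else len(source)
        body = source[m.end():body_end]
        calls = set(CALL.findall(body))
        results[m.group(1)] = {
            "entry": bool(calls & INPUT_FUNCS),
            "exit": bool(calls & OUTPUT_FUNCS),
        }
    return results

demo = """
int handle_request(int fd) {
    char buf[256];
    read(fd, buf, sizeof(buf));
    return 0;
}
int log_status(int code) {
    printf("status %d", code);
    return code;
}
"""
print(classify_functions(demo))
```

Here `handle_request` would be flagged as an entry point (it calls `read`) and `log_status` as an exit point (it calls `printf`); counting such functions over successive versions gives one crude way to track attack surface evolution.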

Attack Surface Approximation via Stacktraces

Laurie Williams, Christopher Theisen, NC State University

The goal of the attack surface approximation via stacktraces project is to aid software engineers in prioritizing security efforts by approximating the attack surface of a system via stack trace analysis. This research will address the following questions:

  • How effectively can stack traces be used to approximate the attack surface of a system?
  • Can the performance of vulnerability prediction be improved by limiting the prediction space to the approximated attack surface?
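The core idea behind the questions above can be sketched simply: treat every function that appears on a crash stack trace as reachable from the attack surface. This is a hedged illustration, not the project's implementation; the `module!function+offset` frame format is a simplifying assumption.

```python
def functions_on_attack_surface(stack_traces):
    """Approximate the attack surface as the set of functions that
    appear on any crash stack trace."""
    surface = set()
    for trace in stack_traces:
        for frame in trace:
            # Assume frames look like "module!function+offset".
            func = frame.split("!")[1].split("+")[0]
            surface.add(func)
    return surface

# Hypothetical crash stack traces, innermost frame first.
traces = [
    ["net!parse_header+0x1a", "net!recv_loop+0x40", "core!main+0x12"],
    ["fs!read_config+0x08", "core!main+0x12"],
]
print(sorted(functions_on_attack_surface(traces)))
```

Restricting a vulnerability prediction model to the files containing these functions shrinks the prediction space, which is one plausible route to the performance improvement the second research question asks about.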

A PowerPoint version of the presentation can be found at: http://cps-vo.org/node/15735

The NC State SoS Lablet can be found at: http://research.csc.ncsu.edu/security/lablet/

(ID#:2837)

Note:

Articles listed on these pages have been found on publicly available internet pages and are cited with links to those pages. Some of the information included herein has been reprinted with permission from the authors or data repositories. Direct any requests via Email to SoS.Project (at) SecureDataBank.net for removal of the links or modifications to specific citations. Please include the ID# of the specific citation in your correspondence.