Is there progress in the Science of Security?
Lablet panel discusses progress in the Science of Security at quarterly meeting
The fall 2016 quarterly Science of Security (SoS) Lablet meeting was hosted, for the first time, by NSA in the Emerson V Auditorium on November 2-3, 2016. The agenda included a panel on progress in the Science of Security. Moderated by Brad Martin, the panel consisted of the four Lablet Principal Investigators: Laurie Williams, NCSU; David Nicol, UIUC; Michel Cukier, UMD; and Bill Scherlis, CMU. Topics of discussion ranged from modeling schemes, privacy, and “science envy” to supply chain problems.
Each Lablet introduced its structure and progress to date. NC State’s Laurie Williams described their structure as 15 faculty members from computer science, electrical engineering, psychology, and computer engineering, plus six external collaborators. Their work addresses verification and revision of normative specifications of privacy, enumeration of potential misuse cases from requirements, attack surfaces, risk-based attack surface approximation, literature reviews and updates on intrusion detection, automated synthesis of resilient configurations, information flow control in Android, SDN optimization, natural interactions for bot detection, and phishing patterns and users’ personalities.
Michel Cukier, UMD, described 20 faculty members from computer science and electrical and computer engineering, along with five external researchers, focused on human behavior issues: understanding how users process security advice, and whether they trust the source or the context. They are applying experience in criminology to cybersecurity with regard to routine activities, rational choice, and deterrence theory. Finally, they are looking at empirical models of vulnerabilities and attack surfaces.
David Nicol identified 20 faculty members from 9 universities as part of the UIUC Lablet. Their projects include hypothesis testing for network security, data-driven diversity models and model-based decision making, human circumvention of security, static-dynamic analysis of security metrics for CPS, and anonymous messaging. For large-scale problems, they are using factor graphs to deal with complexity and to develop predictive metrics. They have developed a testbed and a library of attacks for resilient architectures, focusing on attack detection that uses rule-based machine learning for anomaly detection.
Carnegie Mellon has 15 faculty and 6 external collaborators according to Bill Scherlis. Their project matrix shows emphasis on scalability and composability and human factors research, as well as touching on the other hard problems. Their projects are looking at cognitive and psycho-social factors, as well as technologies, to determine factors that influence susceptibility to phishing. Their Security Behavior Observatory provides a way to observe actual security behavior by users.
Following these brief descriptions of each Lablet’s research focus and performance, the discussion became enthusiastically interactive with the audience. Panelists and the audience offered a range of comments about how to produce research that disrupts the adversary’s ability to get in, stay in, and act within our systems and networks. The NSA panel moderator began the interactive session by asking about the impact of “science envy” and the areas in which the Lablets have improved knowledge. Bill Scherlis offered the view that cybersecurity is so broad that it can’t advance by advancing only one piece, that it requires a multi-disciplinary approach, and that “this war is being waged on a synthetic landscape.” Science is growing and must continue to grow to deal with the complexity. David Nicol compared cybersecurity to biology and said our [cybersecurity] “laws are not so immutable.” As we learn, our “laws” change, and we need validation of relationships. Michel Cukier suggested the need for a systems approach with the spotlight on metrics and the human side. Laurie Williams directly addressed the concept of “science envy” by defining it as the impulse to apply the right approach and model to a myriad of problems. The SoS Community is making progress in the Science of Security specifically because we are pushing the science model into cybersecurity research, and our work is improving as a result. We are gaining systematization of knowledge, doing better with predictability, and getting better over time. Bill Scherlis rejoined with an analogy to being “under the streetlamp”: we academics are seeing what we see under the direct light and now need to move into the shadows to make greater gains.
An audience member asked whether people would be willing to deal better with risk if they were aware of the risk. Nicol stated that most users don’t know what risk is. Scherlis said that work at CMU’s Security Behavior Observatory confirms Nicol’s statement. Cukier said it is a complex problem to understand how people understand security online and cited patches as an example of lack of knowledge and understanding. Scherlis concluded that risk in cybersecurity is no longer actuarially tabulated.
The NSA Director of Trusted Research asked whether the five hard problems identified in the Science of Security project cover everything or whether there is a sixth. Nicol identified uncertainty, and our ability to understand and quantify it, as the sixth. Williams said that cryptography and related areas are a concern that needs scientific rigor; Scherlis demurred.
Other issues raised by audience questions included a discussion of human factors. One audience member observed that human behavior is challenging to study because when one measures human behaviors, people change those behaviors. In physics, by contrast, new theoretical models have historically predicted new areas and approaches. Scherlis said that model building is an important part of the Science of Security, and that designing new systems is an aspirational goal to think about. Nicol concurred.
The panel moderator asked about other hard problem areas, including the supply chain, which the panel agreed is an important area of concern. Laurie Williams concluded the discussion with the observation that science heightens the need for accurate metrics. The SoS Community has a need for better security metrics, as well as performance metrics.