Spotlight on Lablet Research #24 - Development of Methodology Guidelines for Security Research

Lablet: North Carolina State University
Sub-Lablet: University of Alabama

The goal of this project is to aid the security research community in conducting and reporting methodologically sound science through the development, refinement, and use of community-based security research guidelines. Led by Principal Investigator (PI) Jeff Carver, the research team proposed characterizing the security literature based on those guidelines.

This project aims to support researchers interested in the quality of scientific reporting in the cybersecurity community by developing guidelines that provide insight into the scientific rigor of the information included in a research report (i.e., a journal or conference paper). By helping researchers report the information most relevant to scientific rigor, the project will make it easier for other researchers to replicate published cybersecurity research. It will also help readers of these publications better assess their importance and usefulness in the current environment. Finally, by helping to ensure that all crucial information is present in papers, the project will support theory building that can provide a foundation for future research.

To create these guidelines, the team interviewed experts from the cybersecurity community, both from within the Lablets and from outside. The goal of these interviews was to determine what type of information is most important for judging scientific rigor in different portions of the cybersecurity community. Because the guidelines must be widely usable and accepted, it is important to include experts from different parts of the cybersecurity landscape who can offer different perspectives. Interviewing researchers outside the Lablets, in addition to those connected to them, not only accommodates the diversity of the field but also gives a larger portion of the security research community input into the contents of the guidelines. By gathering input from a wide swath of researchers representing different perspectives, the guidelines should gain broader acceptance in the larger cybersecurity research community.

Interviews with experts have yielded valuable insights into what makes cybersecurity research scientific. The team broadened the scope of expertise by including experts who focus on industry-led research, the philosophy of ethics in cybersecurity, reverse engineering and hardware assurance, security in distributed web systems, and research ethics. This diverse representation of topics within and around cybersecurity research allowed the researchers to capture perspectives and data points that would otherwise have been missed in traditional cybersecurity settings. To this end, the team continues to work toward a better and more versatile rubric and set of guidelines as it interviews additional experts. The interviews reveal the scope of information that the Science of Security and the Paper Review guidelines will have to contain to address different types of cybersecurity research papers. The key finding from the interviews is that guidelines addressing a wide variety of cybersecurity topics will be complex and potentially large. Despite the challenges of the pandemic, the team's engagement with the community has been frequent and largely positive with regard to the project goals.

The researchers have made significant progress on their "Good Examples" paper, which presents examples of good practices in scientific reporting from papers published at IEEE S&P and ACM CCS. The knowledge gained from analyzing these publications will help validate the interview findings and increase acceptance of the resulting conclusions in the larger community. The team plans to develop an initial draft of the guidelines based on the interview data and the "Good Examples" paper and to gather feedback on this draft, either through a workshop or through other forms of interaction. The feedback will address how well the guidelines are organized, how usable they are, and what is missing or needs modification. In parallel with this first draft of the guidelines, the team will begin identifying cybersecurity experts outside the Lablets who can help validate the conclusions and provide any perspectives that may have been missed.

The researchers conducted a study that supplemented the interview data by analyzing the comments reviewers left on submissions to the 2020 HotSoS Symposium, and they are now integrating the results of this analysis into the guideline development process. Because HotSoS focuses on the science of security, the content of these reviews helps the team better understand the types of information that science of security experts (the HotSoS reviewers) find important when reviewing cybersecurity research. This data will be used to further verify and support the guidelines created from the interview data. For the guidelines on scientific reporting in cybersecurity, the team has refined the thematic groups that categorize the data gathered from the interviews. These thematic groups are integral to finalizing the first version of the guidelines for improving scientific rigor and validity in cybersecurity reporting. The researchers plan to release version 1 of the guidelines by the end of 2021.

Background on the project can be found here.