Spotlight on Lablet Research #28 - Designing for Privacy


Lablet: International Computer Science Institute (ICSI)

Design interventions for privacy can occur at many stages and levels, and the goal of this project is to develop a new toolbox of techniques and to help designers understand when each tool is best applied.

The project focuses on designing for privacy holistically: from "privacy by design" to "privacy with design," i.e., designing with privacy throughout the whole life cycle. Privacy is defined in contextual, situational, and relational ways, and its dimensions are theory, protection, harm, provision, and scope. Principal Investigator (PI) Serge Egelman and his team plan to hold privacy design workshops to address engineering practices, methods, and tools, bringing together practitioners, researchers, and policy-makers. One goal for this series of workshops is to examine how current approaches to privacy engineering (e.g., applying Privacy by Design principles) are actually being applied in practice; that is, are there human limitations that are preventing these recommended practices from being used? Another goal is to examine how privacy engineering practices can be improved via policy, at both the organizational and governmental levels.

The research team recently conducted a narrative literature review to understand the factors that affect developers' adoption of privacy and security practices in product design and software development. On the basis of this review, they developed a model that categorizes the factors affecting developers' decision-making: from the environmental level (context outside the organization or team), through the organizational, product, and development-process levels, to the personal level.

With students at UC Berkeley, the researchers evaluated smartphone users' familiarity with privacy and security settings, their expectations about their ability to configure those settings, their understanding of the privacy and security threats the settings are supposed to protect against, and their expectations about the settings' effectiveness. In an online survey of 178 users with diverse backgrounds and demographics, they found that many people were unaware of smartphone privacy/security settings and their defaults and had not configured them in the past, though they expressed willingness to do so in the future. Some participants perceived low self-efficacy and expected difficulties and usability issues in configuring those settings. The researchers compared findings across socio-economic groups of participants to draw conclusions about which groups are especially vulnerable to the identified issues. The findings showed that, compared to so-called "average users," certain user groups, such as older adults, racial/ethnic minorities, and women, were less concerned about online privacy and security, engaged less in configuring smartphone privacy and security settings, and expected more difficulty configuring them and more negative impacts on user experience. But even the "average users" in the survey expressed low levels of awareness and engagement, along with some concern about the difficulties they expected in configuring smartphone privacy/security settings. Building on these findings, the research team is conducting a new study to investigate whether those self-reported expectations are confirmed by behavioral observations.

The goals of the new study are to better understand what difficulties users actually face when configuring smartphone privacy settings (specifically, finding and enabling them), where negative user experiences may stem from, what users understand about the implications of changing those settings, and how changes to the design of settings interfaces could make configuration easier. As with the survey, the researchers are particularly interested in examining potential differences among users with different socio-economic backgrounds and the implications of those differences for design.

In another study, the researchers conducted cognitive walkthrough interviews with a demographically diverse sample of iOS and Android users. They asked the participants to talk through configuring three smartphone privacy settings, then asked follow-up questions about difficulty and clarity. The data from the walkthroughs and follow-up interviews are currently being analyzed.

Researchers are continuing to analyze and thematically code interviews with nannies and au pairs about their experiences with, and views of, working in homes with cameras and smart home devices. The analysis focuses particularly on identifying the most feasible points of intervention for improving domestic employees' control over the privacy effects of such devices, whether through technical controls or through negotiations between nannies and employers, supported by guidelines and education. Building on the studies with nannies, the team is working with colleagues at the University of Oxford to help them develop workshop content for events that inform domestic workers about their privacy rights and options.

Background on this project can be found here.
