Privacy through Accountability

Presented as part of the 2015 HCSS conference.

Abstract:

Privacy through accountability refers to the principle that entities that hold personal information about individuals are accountable for adopting measures that protect the privacy of the data subjects. In this talk, I will cover computational treatments of this principle. This emerging research area, which my research group has played a pivotal role in developing, has produced precise definitions of privacy properties and computational accountability mechanisms to aid in their enforcement. After providing an overview of the research area, I will focus on two of our recent results in Web privacy.

First, I will present our joint work with Microsoft Research on building and operating a system that automates privacy policy compliance checking in Bing. Central to the design of the system are (a) LEGALEASE -- a language for specifying privacy policies that restrict how user data is handled; and (b) GROK -- a data inventory for Map-Reduce-like big data systems that tracks how user data flows among programs. GROK maps code-level schema elements to datatypes in LEGALEASE, in essence annotating existing programs with information flow types with minimal human input. Compliance checking is thus reduced to information flow analysis of big data systems. Bootstrapped by a small team, the system checks millions of lines of ever-changing source code in Bing's data analytics pipeline, written by several thousand developers, for compliance every day.
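
To make the LEGALEASE/GROK pairing concrete, here is a minimal sketch, in Python, of how a LEGALEASE-style clause might be evaluated against a data flow once the flow has been labeled with a datatype and a purpose. The lattice, the clause encoding, and the example policy below are illustrative assumptions for this sketch, not Bing's actual policy or the deployed system's implementation.

```python
# Illustrative sketch of LEGALEASE-style policy checking.
# The lattice and policy below are hypothetical examples.

# Attribute lattice: child -> parent ("IPAddress:Truncated" refines "IPAddress").
PARENT = {
    "IPAddress:Truncated": "IPAddress",
    "IPAddress": "UserData",
    "SearchQuery": "UserData",
    "Advertising": "Purpose",
    "AbuseDetection": "Purpose",
}

def ancestors(label):
    """Yield a label and all of its ancestors in the lattice."""
    while label is not None:
        yield label
        label = PARENT.get(label)

def matches(clause_attrs, flow):
    """A clause applies to a flow if each clause attribute covers the flow's value."""
    return all(clause_attrs[k] in ancestors(flow[k]) for k in clause_attrs)

def allowed(policy, flow):
    """Evaluate nested ALLOW/DENY ... EXCEPT clauses; the innermost match wins.
    Default-allow at the top level is a simplifying assumption of this sketch."""
    decision = True
    clauses = policy
    while clauses:
        applicable = [c for c in clauses if matches(c["attrs"], flow)]
        if not applicable:
            break
        clause = applicable[0]
        decision = clause["kind"] == "ALLOW"
        clauses = clause.get("except", [])
    return decision

# "DENY DataType IPAddress UseForPurpose Advertising
#    EXCEPT ALLOW DataType IPAddress:Truncated"
policy = [{
    "kind": "DENY",
    "attrs": {"DataType": "IPAddress", "UseForPurpose": "Advertising"},
    "except": [{"kind": "ALLOW", "attrs": {"DataType": "IPAddress:Truncated"}}],
}]

print(allowed(policy, {"DataType": "IPAddress",
                       "UseForPurpose": "Advertising"}))            # False
print(allowed(policy, {"DataType": "IPAddress:Truncated",
                       "UseForPurpose": "Advertising"}))            # True
```

The point the sketch preserves is that policy labels form a lattice: a clause about IPAddress automatically covers its refinement IPAddress:Truncated unless an EXCEPT sub-clause overrides it, which is what lets concise policies govern large, changing codebases.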

Second, I will describe the problem of detecting personal data usage by websites when the analyst has access neither to the system's code nor full control over its inputs, and cannot observe all of its outputs. A concrete example of this setting is one in which a privacy advocacy group, a government regulator, or a Web user wants to check whether a particular website uses certain types of personal information for advertising. I will present a methodology for information flow experiments, grounded in experimental science and statistical analysis, that addresses this problem; our tool AdFisher, which incorporates this methodology; and findings of opacity, choice, and discrimination from our experiments with Google's ad ecosystem.
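
As a sketch of the statistical core of such information flow experiments, the Python snippet below runs a nonparametric permutation test over two randomly assigned groups of browser agents. The ad counts and the simple difference-in-means test statistic are hypothetical stand-ins; the actual AdFisher pipeline derives its test statistic from a machine-learning classifier evaluated on held-out data.

```python
# Sketch of a permutation test for detecting a treatment effect on ads.
# Data and test statistic are hypothetical stand-ins for the real pipeline.
import random

def test_statistic(treatment, control):
    """Difference in mean counts of a target ad between the two groups."""
    return abs(sum(treatment) / len(treatment) - sum(control) / len(control))

def permutation_test(treatment, control, trials=10000, seed=0):
    """Estimate how often random relabelings of the agents yield a statistic
    at least as extreme as the observed one (an empirical p-value)."""
    rng = random.Random(seed)
    observed = test_statistic(treatment, control)
    pooled = treatment + control
    n = len(treatment)
    hits = 0
    for _ in range(trials):
        rng.shuffle(pooled)
        if test_statistic(pooled[:n], pooled[n:]) >= observed:
            hits += 1
    return hits / trials

# Hypothetical counts of a target ad shown to each of 10 agents per group.
treatment = [9, 7, 8, 10, 6, 9, 8, 7, 9, 8]
control   = [3, 4, 2, 5, 3, 4, 3, 2, 4, 3]
print(permutation_test(treatment, control))  # small p-value: groups differ
```

Because the only assumption is the random assignment of agents to groups, a small p-value licenses a causal conclusion that the treatment (e.g., a visited site or a profile setting) influenced the ads served, without any access to the ad system's internals.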

Biography:

Anupam Datta is an Associate Professor at Carnegie Mellon University, where he holds a joint appointment in the Computer Science and Electrical and Computer Engineering Departments. His research focuses on the scientific foundations of security and privacy. Datta's work has helped develop the research areas of Compositional Security and Privacy through Accountability. He serves as an Editor-in-Chief of Foundations and Trends in Privacy and Security and as an Associate Editor of the Journal of Computer Security and the Journal of Computer and System Sciences, and he served as Program Co-Chair of the IEEE Computer Security Foundations Symposium in 2013 and 2014. Datta obtained Ph.D. (2005) and M.S. (2002) degrees from Stanford University and a B.Tech. (2000) from IIT Kharagpur, all in Computer Science.
