Policy Impacts of Statistical Uncertainty and Privacy


ABSTRACT

Differential privacy (1) is an increasingly popular tool for preserving individuals’ privacy by adding statistical uncertainty when sharing sensitive data. Its introduction into US Census Bureau operations (2), however, has been controversial. Scholars, politicians, and activists have raised concerns about the integrity of census-guided democratic processes, from redistricting to voting rights. The debate raises important issues, yet most analyses of trade-offs around differential privacy overlook deeper uncertainties in census data (3). To illustrate, we examine how education policies that leverage census data misallocate funding because of statistical uncertainty, comparing the impacts of quantified data error and of a possible differentially private mechanism. We find that misallocations due to our differentially private mechanism occur on the margin of much larger misallocations due to existing data error that particularly disadvantage marginalized groups. But we also find that policy reforms can reduce the disparate impacts of both data error and privacy mechanisms.
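For readers unfamiliar with how a differentially private mechanism introduces statistical uncertainty, the sketch below illustrates the standard Laplace mechanism applied to a single count. It is only an illustration of the general technique, not the mechanism analyzed in the talk; the count, the epsilon values, and the function name are hypothetical.

    import numpy as np

    def laplace_count(true_count: int, epsilon: float, rng: np.random.Generator) -> float:
        """Release a count with Laplace noise calibrated to sensitivity 1.

        Adding or removing one person changes a simple count by at most 1,
        so noise drawn from Laplace(scale=1/epsilon) satisfies
        epsilon-differential privacy for that count.
        """
        return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

    rng = np.random.default_rng(0)
    # Hypothetical example: a small district's count of children in poverty.
    true_count = 137
    for eps in (0.1, 1.0, 10.0):
        print(f"epsilon={eps}: noisy count = {laplace_count(true_count, eps, rng):.1f}")

Smaller values of epsilon add more noise and give stronger privacy guarantees; the added uncertainty is what can shift funding allocations for small populations.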

BIO

Ryan Steed is a PhD student at Carnegie Mellon's Heinz College of Information Systems and Public Policy. His research leverages empirical methods to examine privacy and equity in algorithmic systems, especially in relation to tech policy and governance. His current work examines the practical applications and impacts of algorithmic techniques for privacy-preserving analytics. Previously, he studied Computational Economics at the George Washington University.

License: CC-2.5
Submitted by Alessandro Acquisti on