"How Companies Should Think About Compensating Users for Private Data"

As data-hungry Artificial Intelligence (AI) and Machine Learning (ML) technologies become more efficient, the key question, according to Ali Makhdoumi, an associate professor of decision sciences at Duke University's Fuqua School of Business, is how to incentivize data sharing while protecting users' privacy. In a new paper, Makhdoumi and co-authors Alireza Fallah of the University of California, Berkeley, Azarakhsh Malekian of the University of Toronto, and Asuman Ozdaglar of the Massachusetts Institute of Technology argue that the solution may lie in a mechanism that measures users' privacy sensitivity and compensates them for disclosing personal information. The researchers built on differential privacy, a privacy-protection framework widely adopted in the technology industry that limits how much any single user's data can influence a published result, typically by adding calibrated statistical noise. Using it, the team designed a new data acquisition mechanism that accounts for users' privacy sensitivity, assigns a value to it, and determines the best incentive mechanism for data-dependent platforms. This article continues to discuss the data acquisition mechanism that maximizes platforms' utility while compensating privacy-sensitive users.
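The paper's mechanism itself is not reproduced in this summary, but the differential privacy building block it relies on can be illustrated. Below is a minimal Python sketch of the Laplace mechanism, one standard way to satisfy differential privacy; the function name, data bounds, and epsilon value are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Release a differentially private estimate of true_value.

    sensitivity: max change in the statistic caused by one user's data
    epsilon: privacy-loss parameter (smaller epsilon = stronger privacy)
    """
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_value + noise

# Example: privately release the mean of users' data.
rng = np.random.default_rng(0)
# Illustrative assumption: each user's value is clipped to [0, 100],
# so one user can shift the mean by at most (100 - 0) / n.
data = np.clip(rng.normal(loc=50.0, scale=10.0, size=1000), 0.0, 100.0)
true_mean = data.mean()
sensitivity = 100.0 / len(data)
private_mean = laplace_mechanism(true_mean, sensitivity, epsilon=0.5)
print(f"true mean: {true_mean:.2f}, private mean: {private_mean:.2f}")
```

In the setting the summary describes, a platform would presumably tailor the privacy level (the epsilon) to each user's measured privacy sensitivity and scale compensation with the resulting privacy loss; the sketch fixes a single epsilon for simplicity.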

Duke University's Fuqua School of Business reports "How Companies Should Think About Compensating Users for Private Data"

Submitted by grigby1