Private Disclosure of Information


Presented as part of the 2015 HCSS conference.

Abstract:

The healthcare domain is a prime setting for investigating privacy: any information about a subject's physical, cognitive and/or mental state is private. A medical technology receiving increasing attention is health tele-monitoring, in which technology is used to collect health-related data about patients and submit them to medical staff for monitoring. The data are then used to assess the patients' health status and to provide them with feedback and/or intervention.

We are surrounded by a growing variety of body sensors that can monitor the physical and physiological states of a subject. These sensors can report to the subject who wears them, but they can also wirelessly report the data to a healthcare station in a tele-monitoring fashion. This configuration makes health tele-monitoring a privacy-sensitive cyber-physical system. The communication phase carries a serious privacy risk, since an adversary who can intercept the communication can also misuse the data. To address this privacy risk, we devised a technical framework, called Private Disclosure of Information (PDI), that aims to protect the privacy of individuals during communication.

PDI aims to prevent an adversary from inferring certain sensitive information about subjects from the data they disclose during communication with an intended recipient. PDI is an information-theoretic, statistical framework that brings the interpretation of data, through inference, into the picture when maintaining privacy. PDI considers the case where each subject belongs to a private class (such as a health condition or a disease). The subject's class membership can be statistically inferred from the data that the subject discloses during communication (vital signs, symptoms, sensory data, etc.). PDI considers cases, such as tele-monitoring, where this class membership is known to both the subject herself and the intended recipient, but not to the adversary. PDI leverages this advantage in knowledge to minimize the amount of information that the communication leaks to an adversary who is trying to infer the class membership of subjects from the communicated data.
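To make this setup concrete, one can formalize the adversary's inference as follows (the notation below is ours, introduced for illustration only): let $C$ denote the subject's private class and $Y$ the data the adversary observes on the channel. The adversary updates its belief about the class via Bayes' rule,

\[ p(c \mid y) \;\propto\; p(y \mid c)\, p(c), \]

where $p(c)$ is the adversary's prior over classes (possibly shaped by auxiliary knowledge) and $p(y \mid c)$ is the class-conditional distribution of the observed data. Privacy is threatened whenever this posterior differs substantially from the prior.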

For the communication, we leverage encoding schemes in which the encoding function for messages differs per class. Among all such schemes, PDI uses the one that minimizes the amount of information that the transmitted messages carry toward inferring the private class. PDI adopts the mutual information measure to model this quantity. The framework results in a learning problem for the encoding scheme, which we implemented as a MATLAB toolbox.
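As a minimal illustration of the quantity PDI minimizes, the sketch below computes the mutual information $I(C;Y) = \sum_{c,y} p(c,y) \log \frac{p(c,y)}{p(c)\,p(y)}$ between the private class $C$ and the transmitted message $Y$ for two hypothetical per-class encoding schemes. The distributions used are illustrative placeholders of our own, not part of the PDI toolbox or the talk's data.

```python
# Hypothetical sketch of the leakage measure PDI minimizes: the mutual
# information I(C; Y) between the private class C and the transmitted
# (encoded) message Y.  All distributions below are illustrative.
import numpy as np

def mutual_information(p_c, p_y_given_c):
    """I(C; Y) in bits, given a class prior p_c (shape [K]) and a
    class-conditional message distribution p_y_given_c (shape [K, M])."""
    p_cy = p_c[:, None] * p_y_given_c          # joint p(c, y)
    p_y = p_cy.sum(axis=0)                     # marginal p(y)
    prod = p_c[:, None] * p_y[None, :]         # product of marginals p(c) p(y)
    mask = p_cy > 0
    return float(np.sum(p_cy[mask] * np.log2(p_cy[mask] / prod[mask])))

# Two private classes, three possible encoded messages.
p_c = np.array([0.5, 0.5])

# Scheme A: each class tends to emit a different message -> leaks information.
scheme_a = np.array([[0.8, 0.1, 0.1],
                     [0.1, 0.8, 0.1]])

# Scheme B: the message distribution is identical across classes -> leaks nothing.
scheme_b = np.array([[0.4, 0.3, 0.3],
                     [0.4, 0.3, 0.3]])

print("I(C;Y) under scheme A:", mutual_information(p_c, scheme_a))  # > 0 bits
print("I(C;Y) under scheme B:", mutual_information(p_c, scheme_b))  # 0 bits
```

Under the first scheme the two classes tend to emit different messages, so the transmission leaks information about the class; under the second scheme the message distribution is identical across classes, so the leakage is zero.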

In this talk, we present the technical details of the PDI framework, along with technical results on its privacy guarantees. We present the learning problem that results from the framework. We also present a sufficient condition that guarantees perfect privacy, regardless of the adversary's auxiliary knowledge, while preserving full utility of the information to the intended recipient. We illustrate this result with specific examples and demonstrate the applicability of PDI on a real-world data set that simulates a health tele-monitoring scenario.
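As a rough sketch of how such a guarantee can be read off the mutual-information formulation above (our illustration, not necessarily the exact condition presented in the talk): if the per-class encoders are chosen so that the class-conditional distributions of the transmitted message coincide,

\[ p(y \mid c) = p(y \mid c') \quad \text{for all classes } c, c', \]

then every observed message leaves the adversary's posterior over the class equal to its prior, i.e. $I(C;Y) = 0$ whatever prior the adversary starts from; and if, in addition, each per-class encoder is invertible, the intended recipient, who knows the class, can recover the original data exactly.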

Biography:

Daniel Aranki received the B.Sc. degree in computer engineering from the Department of Electrical Engineering at the Technion - Israel Institute of Technology, Haifa, Israel, in 2011. Between 2007 and 2011, he worked in the Mobile Wireless Group at Intel Corporation, Haifa, Israel, where he worked on WiFi receiver design, automation of design and verification flows, and WiFi system architecture design. In 2011, he joined the Teleimmersion Laboratory in the Department of Electrical Engineering and Computer Sciences at the University of California, Berkeley, for doctoral studies. His research interests include machine learning, privacy, information disclosure and health tele-monitoring.

License: CC-2.5