"A New Era of Data Privacy Choices"

Many organizations use Machine Learning (ML) and Artificial Intelligence (AI) algorithms to analyze the massive amounts of browsing, social network, location, voice, and contact data that people share through their devices. From this data they draw conclusions about users' personalities, preferences, moods, and beliefs, conclusions that influence advertising, marketing, hiring, and more.

Certain types of data are more privacy-sensitive than others, but the traditional division of data into sensitive and non-sensitive categories is being challenged by advances in ML and predictive analytics: refinements in prediction and inference allow potentially sensitive information to be inferred from otherwise innocuous data.

In a project called "Digital Hermits," Jeanine Miklos-Thal, a professor in the Economics & Management and Marketing groups at the University of Rochester's Simon Business School, in collaboration with researchers from the University of Toronto, the University of Rochester, and MIT, is using a theoretical model to explore how data sharing will change in the future. In the model, each user's data includes various personal characteristics, some of which the user might prefer to keep private. Users can choose to share some, all, or none of their data with a firm, and firms use ML and accumulated data to discover the relationships among the various personal characteristics.

The model demonstrates that a firm's ability to gather more data and predict more accurately has a polarizing effect on users, who eventually divide into two distinct categories: full data sharers and digital hermits. As firms get better at using partial data to predict what is withheld, sharing only partial data stops being beneficial. Users who share at all might as well share everything, while the digital hermits, the most privacy-sensitive users, choose not to share any data at all, including data they do not consider private, even when firms offer greater incentives for sharing.
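The polarization dynamic described above can be illustrated with a minimal toy simulation. This is not the researchers' actual model: the two-item data setup, the payoff values, and the uniform distribution of privacy sensitivity are all illustrative assumptions. Each simulated user holds one non-sensitive and one sensitive item, earns a benefit per item shared, and pays a privacy cost if the sensitive item is revealed or inferred; the firm's prediction accuracy is the probability that a partial sharer's sensitive item is inferred anyway.

```python
import random

def choose(s, b, p):
    """Return a user's best strategy given privacy sensitivity s,
    per-item sharing benefit b, and firm prediction accuracy p.
    (Illustrative payoffs, not the published model.)"""
    none = 0.0                 # share nothing: no benefit, no exposure
    partial = b - p * s        # share the non-sensitive item; sensitive inferred w.p. p
    full = 2 * b - s           # share both items; sensitive item exposed for sure
    return max([("none", none), ("partial", partial), ("full", full)],
               key=lambda kv: kv[1])[0]

def population_shares(p, b=1.0, n=100_000, seed=0):
    """Fraction of a simulated population picking each strategy."""
    rng = random.Random(seed)
    counts = {"none": 0, "partial": 0, "full": 0}
    for _ in range(n):
        s = rng.uniform(0, 10)   # privacy sensitivity varies across users
        counts[choose(s, b, p)] += 1
    return {k: v / n for k, v in counts.items()}

for p in (0.1, 0.4, 0.7, 0.95):
    print(f"accuracy={p}:", population_shares(p))
```

In this toy setup, partial sharing is only optimal when prediction accuracy is low; once accuracy is high, partial sharing is dominated and the population splits into full sharers and "hermits" who share nothing, echoing the paper's qualitative finding.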
According to the researchers, avoiding all data sharing would require a user to delete all social media accounts, disable all smart speakers and listening devices in the home, and trade a smartphone for a dumbphone. This is a theoretical model, and real-world users are not always aware of how their data is being used: large-scale data breaches make headlines from time to time, but users do not always connect these events to their own lives unless they see direct evidence of data collection. This article continues to discuss findings from the joint "Digital Hermits" research project.

Simon Business School reports "A New Era of Data Privacy Choices"
