TEE-based decentralized recommender systems: The raw data sharing redemption
Author
Abstract

Recommenders are central to many applications today. The most effective recommendation schemes, such as those based on collaborative filtering (CF), exploit similarities between user profiles to make recommendations, but potentially expose private data. Federated learning and decentralized learning systems address this by keeping the data on users' machines to preserve privacy: each user performs the training on local data and only the model parameters are shared. However, sharing the model parameters across the network may still yield privacy breaches. In this paper, we present Rex, the first enclave-based decentralized CF recommender. Rex exploits trusted execution environments (TEEs), such as Intel Software Guard Extensions (SGX), which provide shielded environments within the processor, to improve convergence while preserving privacy. First, Rex enables raw data sharing, which ultimately speeds up convergence and reduces the network load. Second, Rex fully preserves privacy. We analyze the impact of raw data sharing in both deep neural network (DNN) and matrix factorization (MF) recommenders and showcase the benefits of trusted environments in a full-fledged implementation of Rex. Our experimental results demonstrate that, through raw data sharing, Rex decreases training time by 18.3× and the network load by two orders of magnitude over standard decentralized approaches that share only parameters, while fully protecting privacy by leveraging trusted hardware enclaves with very little overhead.
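
For readers unfamiliar with the matrix factorization (MF) recommenders mentioned in the abstract, the sketch below illustrates the kind of local training step a peer could run on its own ratings, and what "parameters" versus "raw data" mean in this setting. It is a minimal illustration only: the toy ratings, latent dimension, and hyperparameters are assumptions for exposition and do not come from the paper or the Rex implementation.

```python
import numpy as np

# Toy local profile of one peer: (user, item, rating) triples (illustrative only).
ratings = [(0, 1, 4.0), (0, 3, 2.0), (1, 1, 5.0), (2, 0, 3.0)]
n_users, n_items, k = 3, 4, 8        # k: latent dimension (assumed value)
lr, reg, epochs = 0.01, 0.1, 100     # learning rate, L2 regularization, passes

rng = np.random.default_rng(0)
U = rng.normal(scale=0.1, size=(n_users, k))   # user factor matrix
V = rng.normal(scale=0.1, size=(n_items, k))   # item factor matrix

# Local MF training by stochastic gradient descent on the peer's own ratings.
for _ in range(epochs):
    for u, i, r in ratings:
        err = r - U[u] @ V[i]                     # prediction error for one rating
        u_row = U[u].copy()                       # keep old value for the V update
        U[u] += lr * (err * V[i] - reg * U[u])    # gradient step on user factors
        V[i] += lr * (err * u_row - reg * V[i])   # gradient step on item factors

# In a parameter-sharing decentralized scheme, peers would exchange (parts of) U and V
# and combine them with neighbors' copies; raw data sharing instead exchanges the
# rating triples themselves, which Rex confines to hardware enclaves to keep private.
print("predicted rating for user 0, item 1:", U[0] @ V[1])
```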

Year of Publication
2022
Conference Name
2022 IEEE International Parallel and Distributed Processing Symposium (IPDPS)