"Google Is Open-Sourcing a Tool for Data Scientists to Help Protect Private Information"
Google is releasing an open-source version of its differential privacy library, which will allow organizations to study their data without compromising the privacy of users' information. Differential privacy is described as a cryptographic approach to data science in which user data is mixed with random noise, so that individuals cannot be identified from the results of an analysis. The approach can be applied in a number of fields, including healthcare and sociology. Earlier this year, Google also released an open-source tool, called TensorFlow Privacy, to maintain the anonymity of user data when training AI algorithms. This article continues to discuss Google's open-source differential privacy tool, the concept of differential privacy, and the adoption of this approach by different sectors.
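To illustrate the noise-adding idea behind differential privacy, the sketch below applies the Laplace mechanism to a simple count query. This is a minimal, hypothetical example of the general technique, not Google's library or its API; the function name, dataset, and epsilon value are assumptions for illustration only.

```python
import numpy as np

def private_count(values, predicate, epsilon=1.0):
    """Return a differentially private count of records matching `predicate`.

    Illustrative sketch: adds Laplace noise scaled to the query's
    sensitivity (1 for a count), so the released result reveals little
    about the presence or absence of any single record.
    """
    true_count = sum(1 for v in values if predicate(v))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Hypothetical usage: a private count of patients over 65 in a toy dataset.
ages = [34, 71, 52, 68, 45, 80, 29, 66]
print(private_count(ages, lambda age: age > 65, epsilon=0.5))
```

Smaller epsilon values add more noise and give stronger privacy at the cost of accuracy; repeated runs return slightly different counts, which is what prevents an analyst from pinning down any individual's data.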