"NIST Offers Draft Guidance on Evaluating a Privacy Protection Technique for the AI Era"

A new publication from the National Institute of Standards and Technology (NIST) offers guidance on using a type of mathematical algorithm known as differential privacy to help data-centric organizations strike a balance between privacy and accuracy. With differential privacy, data can be published without revealing the identities of the individuals in the dataset. Differential privacy is one of the more mature Privacy-Enhancing Technologies (PETs) used in data analytics, but its implementation can be difficult due to a lack of standards, potentially creating a barrier for users. This work moves NIST closer to completing one of the tasks outlined in the recently issued Executive Order on Artificial Intelligence (AI), which is to advance research into PETs such as differential privacy. This article continues to discuss the "Draft NIST Special Publication (SP) 800-226, Guidelines for Evaluating Differential Privacy Guarantees."
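As an illustration of the kind of guarantee the draft guidance evaluates (this sketch is not from the NIST publication), the classic Laplace mechanism adds calibrated noise to a query result so that adding or removing any one person changes the output distribution only slightly. The function names and the choice of a counting query below are illustrative assumptions:

```python
import math
import random


def laplace_noise(scale: float) -> float:
    # Sample from Laplace(0, scale) via the inverse-CDF method.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))


def dp_count(records, predicate, epsilon: float) -> float:
    """Differentially private count of records matching a predicate.

    A counting query has sensitivity 1 (one person's presence changes
    the count by at most 1), so Laplace noise with scale 1/epsilon
    yields epsilon-differential privacy for the released count.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)
```

Smaller values of epsilon mean stronger privacy but noisier (less accurate) answers, which is exactly the privacy-accuracy trade-off the article describes.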

NIST reports "NIST Offers Draft Guidance on Evaluating a Privacy Protection Technique for the AI Era"

Submitted by grigby1
