"Counterfit: Open-Source Tool for Testing the Security of AI Systems"

Microsoft has open-sourced a tool it developed to test the security of its own Artificial Intelligence (AI) systems and assess them for vulnerabilities. The tool, named Counterfit, helps organizations verify the robustness, reliability, and trustworthiness of the AI algorithms they use. Counterfit started out as a set of attack scripts written to target individual models; Microsoft then transformed it into an automation tool that can attack multiple AI systems at scale. Counterfit is a command-line tool that organizations can install and use locally or in the cloud. Security professionals can use it to perform penetration testing and red teaming operations against AI systems, scan those systems for vulnerabilities, and log attacks against a target model. The tool is also model-agnostic, working with AI models that use different types of data, including text, images, and generic input. This article continues to discuss the purpose, capabilities, and applications of the Counterfit tool.
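
As a rough illustration of the kind of black-box probing that a tool like Counterfit automates, the sketch below perturbs inputs to a stand-in model and checks whether small changes flip its prediction. This is a generic example, not Counterfit's actual command set or API; the model, function names, and perturbation strategy are hypothetical placeholders for whatever target and attack a security team would actually configure in the tool.

```python
import numpy as np

# Hypothetical stand-in for a deployed model's scoring endpoint.
# A real assessment would wrap an actual target (text, image, or
# tabular model) rather than this toy threshold classifier.
def predict(x: np.ndarray) -> int:
    """Toy classifier: label 1 if the feature sum exceeds a threshold."""
    return int(x.sum() > 5.0)

def random_noise_probe(x: np.ndarray, budget: float, trials: int, seed: int = 0) -> bool:
    """Return True if any small random perturbation flips the prediction.

    This is a generic robustness smoke test, not a specific attack
    algorithm taken from Counterfit or any adversarial-ML library.
    """
    rng = np.random.default_rng(seed)
    original = predict(x)
    for _ in range(trials):
        noise = rng.uniform(-budget, budget, size=x.shape)
        if predict(x + noise) != original:
            return True  # found a perturbation that changes the output
    return False

if __name__ == "__main__":
    sample = np.array([1.0, 2.0, 2.2])  # sits near the decision boundary
    flipped = random_noise_probe(sample, budget=0.5, trials=100)
    print("Prediction flipped under small noise:", flipped)
```

In Counterfit itself, probes of this sort are driven from the command line against registered target models and can be run at scale, with attack results logged so that teams can track and report the vulnerabilities they uncover.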

Help Net Security reports "Counterfit: Open-Source Tool for Testing the Security of AI Systems"

Submitted by Anonymous on