"Software Detects Backdoor Attacks on Facial Recognition"
The growing use of facial and object recognition by the US Army to train artificial intelligence (AI) systems to identify threats calls for increased efforts toward bolstering the security of such technology against attacks. Researchers at Duke University have made a significant advancement in an Army project aimed at improving mitigation of backdoor attacks on facial and object recognition systems. Backdoor attacks are executed by poisoning the data fed to a machine learning model so that the model produces incorrect outputs or predictions. This article continues to discuss the importance of safeguarding the recognition systems used by the Army, the concept of backdoor attacks, and the success of software developed by researchers to detect such attacks.
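The data-poisoning mechanism behind a backdoor attack can be illustrated with a minimal sketch. All names and values below are hypothetical for illustration; this is not the Army project's actual setup or the Duke detection method. The idea is that an attacker stamps a small "trigger" pattern onto a fraction of training images and relabels them as a chosen target class, so a model trained on the poisoned data learns to associate the trigger with that class while behaving normally on clean inputs.

```python
# Sketch of trigger-based data poisoning (hypothetical example; not the
# actual attack studied or the Duke detection software).

def stamp_trigger(image, patch_value=255, patch_size=2):
    """Return a copy of a 2D grayscale image with a small bright
    patch (the 'trigger') stamped in the bottom-right corner."""
    poisoned = [row[:] for row in image]
    height, width = len(poisoned), len(poisoned[0])
    for r in range(height - patch_size, height):
        for c in range(width - patch_size, width):
            poisoned[r][c] = patch_value
    return poisoned

def poison_dataset(dataset, target_label, rate=0.1):
    """Poison a fraction of (image, label) pairs: stamp the trigger
    and relabel them as the attacker's chosen target class. A model
    trained on this data can learn the trigger -> target_label
    shortcut while still classifying clean images correctly."""
    n_poison = max(1, int(len(dataset) * rate))
    poisoned = []
    for i, (img, label) in enumerate(dataset):
        if i < n_poison:
            poisoned.append((stamp_trigger(img), target_label))
        else:
            poisoned.append((img, label))
    return poisoned

# Toy dataset: ten blank 4x4 "images", all with true label 0.
clean = [([[0] * 4 for _ in range(4)], 0) for _ in range(10)]
dirty = poison_dataset(clean, target_label=1, rate=0.2)
```

Detection approaches such as the one described in the article work against exactly this kind of tampering: the poisoned samples look almost identical to clean ones apart from the small trigger, which is what makes backdoor attacks hard to spot by inspection alone.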
The United States Army reports "Software Detects Backdoor Attacks on Facial Recognition"