"New Tools Use AI 'Fingerprints' to Detect Altered Photos, Videos"

Digitally manipulated "deepfake" photos and videos are becoming increasingly difficult to spot as Artificial Intelligence (AI) networks improve and become more accessible. New research led by Binghamton University breaks down images using frequency domain analysis techniques and identifies anomalies that indicate AI generation. To compare real and fake images, the researchers created thousands of images with Adobe Firefly, PIXLR, DALL-E, and other generative AI tools, then analyzed them with signal processing to understand their frequency domain features. This article continues to discuss the study "Generative Adversarial Networks-Based AI-Generated Imagery Authentication Using Frequency Domain Analysis."
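The study itself details the analysis pipeline; as a minimal illustrative sketch (not the authors' method), the following Python snippet shows the basic idea of moving an image into the frequency domain with a 2-D FFT and taking its log-magnitude spectrum, the kind of representation in which generative-model artifacts can appear as unusual periodic patterns. The file name "photo.png" is a placeholder.

```python
# Sketch only: compute an image's log-magnitude frequency spectrum.
import numpy as np
from PIL import Image

def log_magnitude_spectrum(path):
    # Load the image and convert it to a grayscale float array.
    img = np.asarray(Image.open(path).convert("L"), dtype=np.float64)
    # 2-D FFT, shifted so the zero-frequency component sits at the center.
    spectrum = np.fft.fftshift(np.fft.fft2(img))
    # Log scaling compresses the dynamic range for visual inspection
    # or for use as a feature map in a downstream classifier.
    return np.log1p(np.abs(spectrum))

if __name__ == "__main__":
    features = log_magnitude_spectrum("photo.png")  # placeholder path
    print(features.shape, features.mean())
```

In practice, such frequency-domain features would be inspected or fed to a trained classifier rather than interpreted from summary statistics alone.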

Binghamton University reports "New Tools Use AI 'Fingerprints' to Detect Altered Photos, Videos"

Submitted by grigby1