"Four Ways Criminals Could Use AI to Target More Victims"

Daniel Prince, a cybersecurity professor at Lancaster University, explores how criminals could use Artificial Intelligence (AI) to target victims. AI is a tool for improving productivity, processing and organizing large volumes of data, and offloading decision-making. However, AI tools are accessible to anyone, including criminals, and observing how criminals have adopted technological advances in the past offers insight into how they may use AI. AI tools such as ChatGPT and Google's Bard provide writing assistance, enabling inexperienced writers, for example, to compose effective marketing messages. The same capability could help criminals sound more credible when contacting potential victims through phishing emails and text messages. The technique known as "brute forcing," in which large numbers of character and symbol combinations are tried to see whether any of them matches a password, could also benefit from AI. This article continues to discuss the different ways in which criminals could use AI to target victims.
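The summary describes brute forcing only at a high level; in its simplest form it is an exhaustive search over candidate strings. The sketch below is a minimal, hypothetical Python illustration of that idea (the function name, the lowercase-only alphabet, and the four-character limit are illustrative assumptions, not details from the article); how AI might make such attacks more effective is discussed in the full piece.

    import itertools
    import string

    def brute_force(target_password, alphabet=string.ascii_lowercase, max_length=4):
        """Naive exhaustive search: try every combination of characters up to
        max_length until one matches the target.

        Real attacks typically run against password hashes and use much larger
        alphabets and lengths; this is only a toy illustration of the idea.
        """
        for length in range(1, max_length + 1):
            for candidate in itertools.product(alphabet, repeat=length):
                guess = "".join(candidate)
                if guess == target_password:
                    return guess
        return None

    if __name__ == "__main__":
        # Hypothetical example: a short, lowercase-only password.
        print(brute_force("dog"))  # prints "dog" after exhausting shorter guesses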

The Conversation reports "Four Ways Criminals Could Use AI to Target More Victims"
