"Apple to Scan iPhones For Child Sex Abuse Images"

In new versions of iOS and iPadOS coming this year, the technology will check each image against known Child Sexual Abuse Material (CSAM) before it is stored in iCloud Photos. Apple stated that if a match is found, a human reviewer will assess it and report the user to law enforcement.

The system works by comparing pictures to a database of known child sexual abuse images compiled by the US National Center for Missing and Exploited Children (NCMEC) and other child safety organizations. Those images are translated into "hashes", numerical codes that can be "matched" to an image on an Apple device. Apple stated that the technology will also catch edited but visually similar versions of the original images. The company claimed the system has an extremely high level of accuracy, with less than a one-in-one-trillion chance per year of incorrectly flagging a given account.

The company says the new technology offers "significant" privacy benefits over existing techniques, because Apple only learns about users' photos if they have a collection of known CSAM in their iCloud Photos account. However, some privacy experts have voiced concerns that the technology could be expanded to scan phones for prohibited content or even political speech, and that authoritarian governments could use it to spy on their citizens.
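As a rough illustration of the hash-matching idea described above (this is not Apple's actual NeuralHash system; the hash function, distance threshold, and database format below are assumptions made for the sketch), a perceptual hash reduces an image to a short bit string in which visually similar images yield similar bits, so a match can be declared when two hashes differ in only a few positions:

```python
# Minimal sketch of perceptual-hash matching, the general technique the article
# describes. NOT Apple's implementation: the "average hash", the Hamming-distance
# tolerance, and the plain list of known hashes are illustrative assumptions.

def average_hash(pixels):
    """Compute a 64-bit 'average hash' from an 8x8 grayscale image.

    `pixels` is an 8x8 list of lists of brightness values (0-255). Each bit
    records whether a pixel is brighter than the image's mean, so small edits
    (resizing, mild recompression) tend to leave most bits unchanged.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for value in flat:
        bits = (bits << 1) | (1 if value > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Count the number of differing bits between two hashes."""
    return bin(h1 ^ h2).count("1")

def matches_known_database(photo_hash, known_hashes, max_distance=5):
    """Flag a photo whose hash is close to any hash in the known database.

    `max_distance` is an assumed tolerance; allowing a few bits of difference
    is what lets a system like this catch edited but visually similar copies
    of a known image rather than only exact duplicates.
    """
    return any(hamming_distance(photo_hash, known) <= max_distance
               for known in known_hashes)
```

In a deployment like the one the article describes, the known hashes would be derived from the NCMEC database and the comparison would run on the device before upload; the tolerance threshold governs how much editing a match survives.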


BBC reports: "Apple to Scan iPhones For Child Sex Abuse Images"
