"New Cyber Software Can Verify How Much Knowledge AI Really Knows"

Amid growing global interest in generative Artificial Intelligence (AI) systems, University of Surrey researchers have developed software that can verify how much information an AI has gathered from an organization's digital database. As part of a company's online security protocol, Surrey's verification software can be used to determine whether an AI has learned too much or accessed sensitive data. The software can also determine whether an AI has identified, and could exploit, flaws in software code. For example, in the context of online gaming, it could identify whether an AI has learned to always win at online poker by exploiting a coding bug. The verification software can also infer how much AI agents can learn from their interactions, whether they have enough knowledge to cooperate successfully, and whether they know too much, such that their knowledge would compromise privacy. This article continues to discuss the new cyber software developed to verify how much knowledge AI really has.
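The announcement does not describe the underlying verification method, but the basic idea of a knowledge check can be illustrated with a toy sketch. The Python snippet below is a minimal illustration under assumed semantics, not the Surrey tool: the names derive_knowledge, required_facts, and sensitive_facts are hypothetical. It treats an agent's knowledge as the set of facts it could have picked up from logged interactions, then tests whether that set is sufficient for the agent's task and whether it overlaps data the agent should not hold.

```python
# Conceptual sketch only; not the University of Surrey verification software.
# All names and the knowledge model are illustrative assumptions.

from typing import Dict, Iterable, Set


def derive_knowledge(observations: Iterable[Set[str]]) -> Set[str]:
    """Toy knowledge inference: assume the agent learns the union of all
    facts exposed across its observed interactions."""
    known: Set[str] = set()
    for facts in observations:
        known |= facts
    return known


def verify_agent(
    observations: Iterable[Set[str]],
    required_facts: Set[str],
    sensitive_facts: Set[str],
) -> Dict[str, object]:
    """Check two properties of the inferred knowledge set:
    - sufficiency: does the agent know enough to complete its task?
    - privacy: has the agent learned anything it should not know?"""
    known = derive_knowledge(observations)
    return {
        "sufficient_for_task": required_facts <= known,
        "privacy_violation": sorted(known & sensitive_facts),
    }


if __name__ == "__main__":
    # Hypothetical interaction log: each entry is the set of facts exposed
    # to the agent during one exchange with the organization's systems.
    log = [
        {"inventory_levels", "shipping_schedule"},
        {"customer_email:alice@example.com"},  # sensitive item
    ]
    report = verify_agent(
        log,
        required_facts={"inventory_levels", "shipping_schedule"},
        sensitive_facts={"customer_email:alice@example.com"},
    )
    print(report)
    # {'sufficient_for_task': True,
    #  'privacy_violation': ['customer_email:alice@example.com']}
```

In this toy model, the check is a simple set comparison; a real verifier would have to reason about what an AI can infer indirectly from its interactions, not just what it observed directly.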

The University of Surrey reports "New Cyber Software Can Verify How Much Knowledge AI Really Knows"
