"This New Chip Could Lead To Faster, More Secure AI"

A team of researchers led by University of Pittsburgh Assistant Professor Rajkumar Kubendran in the Swanson School of Engineering has contributed to the development of a new type of computer chip that could run Artificial Intelligence (AI) programs locally instead of relying on the cloud. The compute-in-memory (CIM) chip is a step toward AI that is more secure, faster, less expensive, and more environmentally friendly. Kubendran explained that the processor could go into phones and other mobile devices, where it would extend battery life when running AI apps and make them easier to use on any device. According to the researchers, the advancement has the potential to be useful in almost any device. Most AI applications are currently hosted in the cloud, and one of the main bottlenecks for traditional AI hardware is the time and energy required to move data over long distances between servers and other devices. With CIM technology, AI can be offloaded from the cloud directly onto the device, eliminating the power-hungry movement of data between separate parts of the computer. Kubendran notes that previous CIM chips required too much power to run complex AI applications directly on the devices that used them, but the team's new design is efficient enough for battery-powered devices such as smart wearables and drones. This article continues to discuss the CIM chip aimed at making AI faster and more secure.
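To make the data-movement point concrete, the toy Python sketch below contrasts a conventional accelerator, which must fetch its weights from separate memory for every inference, with an in-memory array that keeps weights stationary and only moves inputs and outputs. The class names, matrix sizes, and transfer-counting model are illustrative assumptions for this sketch, not a description of the team's actual chip.

```python
import numpy as np

# Hypothetical illustration of why compute-in-memory (CIM) reduces data movement.
# Names and sizes are assumptions for the sketch, not the Pitt team's design.

class ConventionalAccelerator:
    """Weights live in separate memory and must be fetched for every inference."""
    def __init__(self, weights):
        self.off_chip_weights = weights
        self.words_moved = 0  # words transferred between memory and processor

    def infer(self, x):
        # Fetch the whole weight matrix and the input before computing.
        self.words_moved += self.off_chip_weights.size + x.size
        return self.off_chip_weights @ x


class ComputeInMemoryArray:
    """Weights stay inside the memory array; multiply-accumulate happens in place."""
    def __init__(self, weights):
        self.in_memory_weights = weights
        self.words_moved = 0

    def infer(self, x):
        # Only the input and output vectors cross the array boundary.
        y = self.in_memory_weights @ x
        self.words_moved += x.size + y.size
        return y


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W = rng.standard_normal((512, 512))   # one layer's weights
    x = rng.standard_normal(512)          # one input vector

    conv = ConventionalAccelerator(W)
    cim = ComputeInMemoryArray(W)
    for _ in range(100):                  # 100 inferences
        conv.infer(x)
        cim.infer(x)

    print("conventional words moved:     ", conv.words_moved)
    print("compute-in-memory words moved:", cim.words_moved)
```

Under this simplified model, the conventional path re-fetches the weight matrix on every inference, while the in-memory array's traffic scales only with the input and output vectors, which is the architectural difference the article attributes to CIM.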

Pittwire reports "This New Chip Could Lead To Faster, More Secure AI"
