"SpikeGPT: Researcher Releases Code for Largest-Ever Spiking Neural Network for Language Generation"

Language generators such as ChatGPT are growing in popularity for their ability to transform how humans engage with Artificial Intelligence (AI) and search engines. However, these algorithms are computationally expensive to run and depend on a small number of organizations for maintenance and uptime. Jason Eshraghian, an assistant professor of electrical and computer engineering at UC Santa Cruz, has developed a new language generation model that addresses both of these problems. Whereas mainstream language models are built on conventional deep learning neural networks, Eshraghian powers his model with an alternative architecture called a spiking neural network (SNN). He and two students recently released the open-source code for SpikeGPT, the largest language-generating SNN to date, which uses 22 times less energy than a comparable deep learning model. Beyond energy efficiency, SNN-based language generation offers benefits for accessibility, data security, and privacy: because the generator can run on a local device, data entered into the system is more secure and better protected from potential data-harvesting enterprises. This article continues to discuss the research, development, and benefits of SpikeGPT.
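The energy savings of spiking networks come from neurons that communicate with sparse binary spikes rather than continuous activations. As a rough illustration only, the sketch below shows a generic leaky integrate-and-fire (LIF) neuron, the basic building block of SNNs; the decay factor, threshold, and input values are illustrative assumptions and are not taken from SpikeGPT's actual implementation.

```python
# Illustrative sketch of a leaky integrate-and-fire (LIF) neuron,
# the basic unit of a spiking neural network. The parameter values
# (beta, threshold, input_current) are assumptions for demonstration,
# not SpikeGPT's actual configuration.

def lif_step(membrane, input_current, beta=0.9, threshold=1.0):
    """One timestep of an LIF neuron.

    The membrane potential decays by `beta`, integrates the input,
    and emits a binary spike (1) when it crosses `threshold`,
    after which the threshold is subtracted (soft reset).
    """
    membrane = beta * membrane + input_current
    spike = 1 if membrane >= threshold else 0
    membrane -= spike * threshold  # soft reset after a spike
    return membrane, spike

# Drive the neuron with a constant input and record its spike train.
membrane = 0.0
spikes = []
for _ in range(10):
    membrane, spike = lif_step(membrane, input_current=0.3)
    spikes.append(spike)
print(spikes)  # sparse binary output, e.g. mostly zeros with occasional spikes
```

Because downstream computation only happens when a spike (a 1) occurs, most timesteps require no work, which is the intuition behind the reported energy reduction.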

The University of California, Santa Cruz reports "SpikeGPT: Researcher Releases Code for Largest-Ever Spiking Neural Network for Language Generation"
