AI Computing in Large-Scale Era: Pre-trillion-scale Neural Network Models and Exa-scale Supercomputing
Abstract
The development of AI computing has reached a critical inflection point. The parameter counts of large AI neural network models have grown rapidly to the “pre-trillion-scale” level, and the computing required to train these models has reached the “exa-scale” level. In addition, the AI foundation model also affects the correctness of AI applications and is becoming a new information security issue. Future AI development will be driven by progress in computing power (supercomputers), algorithms (neural network models and parameter scale), and applications (foundation models and downstream fine-tuning). In particular, the computational efficiency of AI will be a key factor in the commercialization and popularization of AI applications.
Year of Publication
2023
Date Published
April 2023
URL
https://ieeexplore.ieee.org/document/10134466
DOI
10.1109/VLSI-TSA/VLSI-DAT57221.2023.10134466