While most of us associate Nvidia with best-in-class Graphics Processing Units (GPUs), its product portfolio is much broader. That perception was reinforced by Nvidia’s roll-out of the Grace CPU.
Grace is the company’s first CPU purpose-built for the data center. Based on Arm processor cores, it promises, according to Nvidia, ten times the performance of today’s fastest servers.
The Grace CPU is designed specifically for training giant AI models and other high-performance computing (HPC) workloads. Today’s largest AI models comprise over one billion parameters.
With the sheer volume of data growing exponentially, the number of parameters in AI models is expected to double roughly every two and a half months. Two institutions have already lined up to adopt Grace following its launch.
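To put that growth rate in perspective, here is a quick back-of-envelope calculation. The starting and target sizes (one billion and one trillion parameters) are illustrative assumptions, not figures from Nvidia’s announcement; only the 2.5-month doubling period comes from the trend cited above.

```python
import math

# Assumed figures for illustration: a 1B-parameter model today,
# a 1T-parameter target, doubling every 2.5 months.
start_params = 1e9
target_params = 1e12
months_per_doubling = 2.5

# Number of doublings needed to go from 1B to 1T parameters
doublings = math.log2(target_params / start_params)   # ~10 doublings
months = doublings * months_per_doubling              # ~25 months

print(f"{doublings:.1f} doublings -> ~{months:.0f} months")
```

At that pace, model sizes would grow a thousand-fold in only about two years, which is why memory capacity and bandwidth dominate the design of chips like Grace.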
The Swiss National Supercomputing Centre (CSCS) and the US Department of Energy’s Los Alamos National Laboratory will be the first two institutions to build Grace-powered supercomputers.
Nvidia estimates that nearly 10,000 engineering years of work went into the design of the Grace CPU. It is expected to power some of the world’s most advanced applications, including natural language processing (NLP), recommender systems, and AI supercomputing.
The CPU can ingest and analyze massive datasets thanks to ultra-fast compute and immense memory: Grace pairs Arm’s energy-efficient CPU cores with a low-power memory subsystem.
Nvidia estimates that Grace will be capable of training NLP models with over one trillion parameters. The CPU is named after Grace Hopper, the pioneering US computer programmer.
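For a sense of scale, a rough sketch of the memory a trillion-parameter model needs just to store its weights (assuming 2 bytes per parameter in FP16, a common training precision; optimizer state and activations would multiply this figure several times over):

```python
# Back-of-envelope memory footprint of model weights alone.
# 2 bytes/parameter assumes FP16 storage; training also needs
# optimizer state and activations on top of this.
params = 1e12            # one trillion parameters
bytes_per_param = 2      # FP16

weight_bytes = params * bytes_per_param
terabytes = weight_bytes / 1e12

print(f"Weights alone: {terabytes:.0f} TB in FP16")  # 2 TB
```

Numbers like these, far beyond the capacity of any single accelerator, are what make a data-center CPU with huge, fast memory attractive for giant-model training.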
This is yet another milestone for artificial intelligence (AI) and big-data processing. Once fully deployed, Nvidia’s powerful CPU could underpin many technological breakthroughs in the years to come.