On April 14, chip giant Nvidia announced that, for the first time, it plans to produce its next generation of artificial intelligence chips (AI chips) and supercomputers entirely in the United States. The AI chips in question are Nvidia's recently launched Blackwell chips, which will be produced and tested in Arizona. The supercomputers will be manufactured in Texas, and the two supercomputer plants are reported to reach mass production within the next 12 to 15 months.
What is an AI chip? What is a supercomputer?
An AI chip (Artificial Intelligence Chip) is a processor designed specifically to run AI workloads, especially computationally intensive tasks such as deep learning and machine learning. Its goal is to execute AI algorithms more efficiently than general-purpose processors. Traditional CPUs can run AI algorithms, but they do so slowly and with poor efficiency. The main categories of AI chips are summarized in the table below, and a short sketch after the table illustrates the performance gap.
| Type | Full name | Description |
| --- | --- | --- |
| GPU | Graphics Processing Unit | The chip most commonly used for AI training (e.g., NVIDIA's chips), with strong parallel computing capabilities. |
| TPU | Tensor Processing Unit | Google's custom chip for AI training and inference. |
| ASIC | Application-Specific Integrated Circuit | Custom chips designed from the ground up for specific AI functions; extremely efficient. |
| FPGA | Field-Programmable Gate Array | Highly flexible; its logic can be reconfigured as needed. |
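To make the efficiency gap described above concrete, here is a minimal sketch (assuming PyTorch is installed and an NVIDIA CUDA-capable GPU is available) that times the same dense matrix multiplication, the kind of operation deep learning is built on, on a CPU and on a GPU. The function name and matrix size are illustrative choices, not anything from Nvidia's announcement.

```python
import time
import torch

# Minimal sketch: run the same matrix multiplication on a CPU and on an
# NVIDIA GPU via CUDA. Assumes PyTorch is installed; if no GPU is present,
# only the CPU timing is printed.

def time_matmul(device: torch.device, size: int = 4096) -> float:
    a = torch.randn(size, size, device=device)
    b = torch.randn(size, size, device=device)
    if device.type == "cuda":
        torch.cuda.synchronize()      # make sure setup work has finished
    start = time.perf_counter()
    _ = a @ b                         # the dense math deep learning relies on
    if device.type == "cuda":
        torch.cuda.synchronize()      # wait for the GPU kernel to complete
    return time.perf_counter() - start

print(f"CPU: {time_matmul(torch.device('cpu')):.3f} s")
if torch.cuda.is_available():
    print(f"GPU: {time_matmul(torch.device('cuda')):.3f} s")
```

On typical hardware the GPU finishes the same workload many times faster, which is the basic reason specialized AI chips displace general-purpose CPUs for training and inference.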
Supercomputers, as Nvidia describes them, are high-performance computing clusters built from hundreds or thousands of AI chips (such as GPUs) and dedicated to AI training and inference. They are designed specifically for training and running large AI models.
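The basic software pattern behind such clusters is data-parallel training: each process drives one accelerator, works on its own slice of the batch, and gradients are averaged across all processes. The sketch below shows this pattern at toy scale using PyTorch's DistributedDataParallel with the CPU-friendly "gloo" backend; it is an illustrative assumption of the general technique, not Nvidia's actual software stack, and a real supercomputer would use the "nccl" backend across many GPU nodes.

```python
import os
import torch
import torch.distributed as dist
import torch.multiprocessing as mp
from torch.nn.parallel import DistributedDataParallel as DDP

# Toy data-parallel training loop: two processes stand in for the hundreds
# or thousands of accelerators in an AI supercomputer.

def worker(rank: int, world_size: int) -> None:
    os.environ["MASTER_ADDR"] = "127.0.0.1"
    os.environ["MASTER_PORT"] = "29500"
    dist.init_process_group("gloo", rank=rank, world_size=world_size)

    model = DDP(torch.nn.Linear(1024, 1024))           # same model replicated in every process
    optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

    for _ in range(3):                                 # toy training steps
        x = torch.randn(32, 1024)                      # this process's shard of the batch
        loss = model(x).pow(2).mean()
        loss.backward()                                # DDP averages gradients across processes here
        optimizer.step()
        optimizer.zero_grad()

    dist.destroy_process_group()

if __name__ == "__main__":
    mp.spawn(worker, args=(2,), nprocs=2)              # launch two worker processes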
In its announcement, Nvidia referred to these two concepts collectively as AI Supercomputers.
AI supercomputers are expected to become indispensable infrastructure in the age of artificial intelligence, the computing-power backbone behind AI models such as GPT-4, Sora, and Midjourney.
In the era of AI industrialization and "AI as infrastructure", ultra-high-performance computing infrastructure may become a strategic resource that countries compete for, much like new energy and chips.
In its announcement, Nvidia described AI supercomputers as "AI factories": the core powerhouses of a new generation of data centers built to handle artificial intelligence workloads. "They form the infrastructure of the emerging AI industry ... expected to create hundreds of thousands of jobs (in the United States) and deliver trillions of dollars in economic security benefits over the coming decades."
This emerging AI industry infrastructure is being built by NVIDIA in the U.S. According to its latest announcement, NVIDIA plans to produce up to $500 billion of AI infrastructure in the United States over the next four years, working with its manufacturing partners.
However, as the concept of "Web3+AI" gains popularity, it remains unclear whether Internet platforms in the AI era will keep relying on chip giants the way they did under the traditional Internet model. The idea of decentralized AI is to return data sovereignty, privacy, and security to users in the coming AI and metaverse era, and to let users decide.
What seems certain is that, driven by Web3, an AI industry built on today's centralized Internet development model will not remain mainstream in the future.
PowerBeats believes that in a decentralized Internet era where users are in charge, AI infrastructure must also be decentralized. The AI infrastructure NVIDIA is building may serve as the underlying driver that empowers distributed on-chain projects and platforms. For example, a DeCloud (decentralized cloud computing power) platform could achieve higher performance, lower latency, and cheaper computing resources by building on powerful AI infrastructure. The decentralized cloud computing platform PowerVerse is reportedly using NVIDIA chips to build its distributed cloud-computing nodes; a hypothetical sketch of such a node follows.
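PowerVerse's actual protocol and APIs are not public, so the following is purely a hypothetical sketch of how a decentralized compute node might describe its NVIDIA hardware before registering with a DeCloud-style marketplace. The "capability record" format and the idea of submitting it on-chain are illustrative assumptions, not a real interface.

```python
import json
import socket
import torch

# Hypothetical sketch: a compute node inspects its local NVIDIA GPUs and
# builds a capability record it could advertise to a decentralized scheduler.
# The record format is an assumption for illustration only.

def describe_node() -> dict:
    gpus = []
    if torch.cuda.is_available():
        for i in range(torch.cuda.device_count()):
            props = torch.cuda.get_device_properties(i)
            gpus.append({
                "name": props.name,
                "memory_gb": round(props.total_memory / 2**30, 1),
                "compute_capability": f"{props.major}.{props.minor}",
            })
    return {"host": socket.gethostname(), "gpus": gpus}

# A real node would sign this record and submit it on-chain or to a scheduler;
# here we just print it.
print(json.dumps(describe_node(), indent=2))
```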
According to Nvidia's announcement, it will use advanced AI, robotics and digital twin technologies to design and operate the above-mentioned AI factories, including creating digital twin models of the factories and building automated manufacturing robots.
In this era of emerging technologies, we remain concerned about the privacy and security of the data used in scenarios such as digital twin creation, robot training, and agent-based AI task execution. We also look forward to AI and the metaverse developing on the underlying operating logic of Web3.
Reference:
https://blogs.nvidia.com/blog/nvidia-manufacture-american-made-ai-supercomputers-us/