
What does Nvidia's $10,000 chip look like? It will become the "workhorse" of the AI industry

Cailian Press, February 24 (editor Zhou Ziyi) Artificial intelligence (AI) is setting off a gold rush in the technology industry.

While big tech companies like Microsoft and Google are looking to integrate cutting-edge AI into their search engines and are racing to develop AI software, Nvidia is forging its own path in the field it knows best: chips.

AI software cannot run without chips, and NVIDIA produces a high-performance chip that has become central to the AI race: the NVIDIA A100.

Priced at about $10,000 apiece, the chip is designed to power AI software and has become one of the most critical tools in the AI industry.

The "workhorse chip" in the field of AI

The A100 is well suited to machine-learning models like ChatGPT, Bing AI, and Stable Diffusion. It can perform many simple calculations simultaneously, which is essential for training and running neural-network models.
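To see why "many simple calculations simultaneously" matters, consider that a neural-network layer is essentially a matrix multiplication, in which every output cell is an independent dot product. The toy sketch below (pure Python, running sequentially; not NVIDIA's implementation) makes that structure visible: on a GPU like the A100, the thousands of independent cells would be computed in parallel.

```python
# Toy illustration: a matrix multiply is many independent multiply-adds.
# This runs sequentially in Python; a GPU computes the cells in parallel.

def matmul(a, b):
    """Multiply two matrices given as lists of rows."""
    rows, inner, cols = len(a), len(b), len(b[0])
    # Each output cell a[i]·b[:,j] depends on no other cell,
    # so all rows * cols cells can be computed simultaneously.
    return [[sum(a[i][k] * b[k][j] for k in range(inner)) for j in range(cols)]
            for i in range(rows)]

print(matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```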

The technology behind the A100 was originally used in gaming to render complex 3D graphics, which is why chips of this kind are commonly called graphics processing units, or GPUs.

Recently, as the AI industry has boomed, NVIDIA has reconfigured and targeted the A100 primarily at machine-learning workloads running in data centers.

Nathan Benaich, a well-known investor in the field of AI, praised the A100 as a "workhorse" in the artificial intelligence industry.

Race to buy the A100

Large companies or startups that develop software such as AI chatbots and image generators often need hundreds or thousands of NVIDIA chips. These chips must be powerful enough to quickly process terabytes of data to recognize language patterns.

These AI companies need hundreds of GPUs to train AI models, and then to use those models to generate text, make predictions, or recognize objects in photos.

This means that AI companies need to use a large number of A100s. Some entrepreneurs even see the number of A100s they own as a sign of the company's progress.

Emad Mostaque, CEO of Stability AI, tweeted in January: "A year ago, we had 32 A100 chips. Dream big, keep buying GPUs."

Last fall, Stability AI released Stable Diffusion, an AI image-generation tool that attracted a wave of attention. The company is now reportedly valued at more than $1 billion.

NVIDIA is riding the AI wave

Nvidia stands to benefit from this AI hype cycle. On Wednesday (the 22nd), NVIDIA CEO Jensen Huang hinted on a conference call with investors that the recent boom in artificial intelligence is central to the company's strategy.

The fourth-quarter earnings the company reported on Wednesday showed continued growth in its AI-chip (data center) business, with sales up 11 percent for the quarter even as total quarterly revenue fell 21 percent. The company also said it is optimistic about its revenue outlook for the current quarter.

Nvidia stressed that its push into AI processors will help offset weak demand for PC chips. By Thursday's close, its shares had risen about 14 percent.

So far this year, Nvidia's stock is up 65 percent, outperforming the S&P 500 and other chip stocks.

The price is high

Compared with other types of software, machine-learning tasks can consume nearly all of a computer's processing power, and a single A100 is rarely enough on its own.

So, while a single A100 can be plugged into an existing server, many data centers instead deploy a system of eight A100 GPUs working together.

That system is NVIDIA's DGX A100, with a suggested price close to $200,000. On Wednesday, Nvidia also said it would begin selling cloud access to DGX systems directly.

New Street Research estimates that the ChatGPT-based AI model in Microsoft's Bing search could require eight GPUs to respond to a question in under a second.

At this rate, Microsoft would need more than 20,000 DGX systems to deploy the Bing model to every user, which could cost Microsoft $4 billion in infrastructure spending on this feature.

Antoine Chkaiban, a technology analyst at New Street Research, added: "If you want to scale to Google, which serves 8-9 billion queries a day, the actual cost of the DGX systems would be $80 billion."
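The two estimates above are consistent back-of-envelope arithmetic, which can be checked directly. The only inputs are figures already quoted in the article: a roughly $200,000 DGX A100 list price and the 20,000-plus systems estimated for Bing; the 400,000-system figure for Google scale is simply what an $80 billion total implies at that price.

```python
# Back-of-envelope check of the New Street Research estimates quoted above.
# All inputs are the article's figures, not independent data.

DGX_PRICE = 200_000  # approx. suggested price of one 8-GPU DGX A100, in USD

# Microsoft / Bing estimate: 20,000+ DGX systems
microsoft_cost = 20_000 * DGX_PRICE
print(f"Bing deployment: ${microsoft_cost / 1e9:.0f} billion")  # $4 billion

# Google-scale estimate: an $80 billion total implies this many systems
google_systems = 80_000_000_000 // DGX_PRICE
print(f"Systems implied at Google scale: {google_systems:,}")  # 400,000
```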

However, Huang argued in an interview that the company's GPUs are actually inexpensive relative to the amount of computation such models require.

New competition

Nvidia isn't the only company making graphics processors for artificial intelligence: AMD and Intel sell competing products, and large cloud computing companies such as Google and Amazon are developing and deploying their own AI chips.

Nevertheless, at present, NVIDIA still occupies a large share of the AI chip market.

According to New Street Research, NVIDIA has a 95% share of the graphics processor market that can be used for machine learning.

But the A100's biggest competitor may come from NVIDIA itself. Launched in 2020, the A100 was itself a breakthrough for the chip industry.

Last year, Nvidia unveiled a successor GPU called the H100, which has now entered mass production. According to the company, the H100 packs 80 billion transistors and several new features, and the company claims that just 20 such GPUs could sustain the equivalent of the entire world's internet traffic.

(Cailian Press, Zhou Ziyi)
