
NVIDIA soars, its market value up 65% this year

With the rise of the AI boom, Nvidia is poised to be the biggest winner among chipmakers — though not the only one, according to Reuters.

AI has become a bright spot for investment in the tech sector, where slowing growth has led to massive layoffs and reduced experimental investment.

But the surge in interest helped Nvidia report better-than-expected quarterly earnings on Wednesday and forecast sales above Wall Street estimates, in stark contrast to rival Intel Corp., which has forecast a loss and cut its dividend.

Nvidia's market capitalization dwarfs Intel and AMD

Nvidia shares rose nearly 14 percent to $236.70 on Thursday. Since the beginning of the year they have gained more than 65%, almost three times the rise of the Philadelphia Semiconductor Index. The company got its start making graphics chips that helped PC video games look more realistic, then rode the cryptocurrency wave as its chips were used for mining. Now the next impetus comes from generative AI.

Nvidia's surge on Thursday added more than $70 billion to its market value, giving it a market capitalization of more than $580 billion, about five times Intel's, and making it the seventh-largest public company in the United States. The key to the company's success is that it controls about 80 percent of the market for graphics processing units (GPUs), the specialized chips that give services like OpenAI's popular ChatGPT chatbot the computing power they need.

The king of the GPU market

Graphics processing units are designed to handle the specific kinds of math involved in AI computing very efficiently, whereas Intel's general-purpose central processing units (CPUs) handle a wider range of computing tasks with lower efficiency. But AI is taking over the tech industry: according to research firm Gartner, the share of specialized chips such as GPUs in data centers is expected to rise to more than 15% by 2026, up from less than 3% in 2020.
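
To make that efficiency gap concrete, here is a minimal sketch in Python (using PyTorch; the matrix size and repeat count are arbitrary choices, not figures from the article). It times the same dense matrix multiply, the operation that dominates neural-network workloads, on a CPU and, when one is available, on an Nvidia GPU:

```python
# Minimal CPU-vs-GPU timing sketch (PyTorch; sizes are arbitrary).
import time

import torch

def time_matmul(device: str, n: int = 4096, repeats: int = 10) -> float:
    """Average seconds for one n x n matrix multiply on `device`."""
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    torch.matmul(a, b)  # warm-up so one-time setup doesn't skew timing
    if device == "cuda":
        torch.cuda.synchronize()  # GPU kernels launch asynchronously
    start = time.perf_counter()
    for _ in range(repeats):
        torch.matmul(a, b)
    if device == "cuda":
        torch.cuda.synchronize()
    return (time.perf_counter() - start) / repeats

print(f"CPU: {time_matmul('cpu'):.4f} s per multiply")
if torch.cuda.is_available():
    # On a data-center GPU this is typically orders of magnitude faster.
    print(f"GPU: {time_matmul('cuda'):.4f} s per multiply")
```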

AMD, the second-largest player in the GPU industry with a market share of about 20%, also saw its shares rise after Nvidia's earnings report on Wednesday.

"The two companies leading the AI revolution in hardware and processing are Nvidia and AMD, which in our view are far ahead of the rest," said Piper Sandler analyst Harsh Kumar.

AMD, led by Lisa Su, has invested heavily in AI in recent years, including a range of chips designed to compete with Nvidia's fastest products. For comparison, Intel has less than 1% of the market.

"The enthusiasm for ChatGPT and the potential use cases it unlocks could represent an inflection point in AI adoption," said Lei Qiu, portfolio manager at the Alliance Bernstein Technology Fund, which owns 0.54% of NVIDIA. "While it's hard to pinpoint exactly how big AI is a percentage of [Nvidia's] revenue today, it has the potential to grow exponentially as big tech companies race to develop similar types of AI applications," Qiu said.

Nvidia's prowess in AI has also caught the attention of venture capitalists and startups, which are investing billions of dollars in rival chips and promising improvements such as lower electricity consumption. So far, none of them has made a significant dent in Nvidia's business.

All of this is bad news for Intel, which once dominated the data-center and PC CPU markets and is now losing share to AMD. The company risks missing the industry's next growth engine. In recent months it has worked to sharpen its focus on GPUs, including splitting its graphics chip division in two in December: one unit focused on personal computers, the other on data centers and artificial intelligence.

Still, analysts say Intel has a long way to go before it gains a foothold in the market.

"Intel has more designs trying to penetrate the (AI) market, but so far, despite its plethora of solutions, its traction has been disappointing," said Matthew Bryson, an analyst at Wedbush Securities.

ChatGPT helps NVIDIA soar

The Wall Street Journal noted in a recent report that chipmakers are keen on the latest hot spot in tech: AI tools that can generate text from minimal prompts. The tools require enormous computing power to run, promising lucrative new revenue streams for the companies that supply it.

Analysts estimate that for semiconductor manufacturers, the new tool, if widely adopted, could bring in tens of billions of dollars in annual net sales.

Since San Francisco-based OpenAI released its ChatGPT chatbot late last year, market excitement over so-called generative AI has reached fever pitch. The technology has attracted users by generating convincing, humanlike (and sometimes inaccurate) responses, helping OpenAI attract billions of dollars in investment from Microsoft.

Jensen Huang, CEO of Nvidia, even said the technology has reached an inflection point. "The versatility and capability of generative AI has raised a sense of urgency at enterprises around the world to develop and deploy AI strategies," he said Wednesday as the company announced quarterly earnings and unveiled a new cloud computing initiative to capitalize on the business opportunity.

Interest in such AI tools is prompting the company to recalibrate its business expectations, he said. "There's no question that because of the last 60, 90 days, our perception of this year has changed quite dramatically as we enter the new year."

The excitement comes at a difficult moment: with sales of personal computers, smartphones and other electronics weakening, the chip industry is grappling with a sharp downturn. Most chipmakers have reported slowing sales as recession fears lead consumers and businesses to cut spending.

Nvidia is the undisputed market leader in the AI chips that let tools like ChatGPT crunch numbers and produce results in the unglamorous world of data centers. Research firm Omdia estimates its share of such AI processors at around 80% as of 2020.

Still, there is so much money at stake that other chipmakers want in.

Intel Corp. CEO Pat Gelsinger said Wednesday that his company has a broad portfolio of chips to address the generative AI opportunity, including specialized chips for AI computing, graphics chips for data centers, and a new generation of data center CPUs (the digital brains of computers) that he said perform well in AI work.

"As AI is integrated into every future application, the performance we expect will become mainstream in computing," he said.

AMD is tailoring CPUs, graphics chips and other hardware for AI, betting that the large cloud companies will invest heavily in the chips needed to do the heavy computing the technology requires. AMD CEO Lisa Su said late last month that the business should become more meaningful next year.

Bank of America analyst Vivek Arya said generative AI could add $20 billion a year to the overall AI chip market by 2027. Nvidia should be able to maintain at least 65 percent of the AI chip market share, he said.

Internet search giant Google, a subsidiary of Alphabet Inc., this month showcased its homegrown ChatGPT competitor called Bard. China's Baidu is developing a ChatGPT-like AI chatbot called Ernie Bot, which it plans to launch next month. Microsoft already offers users a limited ChatGPT experience in its Bing search engine results.

At least in the short term, Nvidia's dominance in AI leaves it best positioned to profit. The company built its lead by letting software developers, starting about 15 years ago, harness its graphics chips for general computation, at which they proved adept for AI work. UBS analysts said in a note that Nvidia's chips are currently the only viable products for building large-scale AI language systems, estimating that ChatGPT required about 10,000 of them to train.

Huang suggested in his earnings briefing that the company may update its estimate of its addressable market next month; about a year ago he pegged that opportunity at $1 trillion, spanning everything from video game chips to automotive.

"Because of the incredible power and versatility of generative AI, and all the fusion breakthroughs that happened in the middle and end of last year, we may reach that [market size] sooner or later," he said. "Without a doubt, this is a very important moment for the computer industry."

Nvidia is trying to get there faster by starting to offer enterprises cloud computing services for developing generative AI chatbots and other tools using its hardware and software. The service will be delivered through established cloud computing companies and aims to lower the barrier to AI adoption in business.

Nvidia says it is working with all the major cloud providers, including Amazon, Microsoft and Google, on generative AI tools, as well as with consumer internet companies and startups.

The A100, a new weapon costing tens of thousands of dollars

In CNBC's view, in today's red-hot AI chip market, Nvidia's A100, priced in the tens of thousands of dollars, has become the market's new weapon.

The A100 is now the "workhorse" for AI professionals, says investor Nathan Benaich, who publishes a newsletter and report covering the AI industry, including a partial list of supercomputers using the A100. According to New Street Research, Nvidia holds 95% of the market for graphics processors that can be used for machine learning.

In his view, the A100 is ideally suited to the machine learning models behind tools like ChatGPT, Bing AI and Stable Diffusion: it can perform many simple calculations simultaneously, which is what matters for training and using neural network models.

The technology behind the A100 was originally used to render complex 3D graphics in games. The chip is still commonly called a graphics processing unit, or GPU, but today Nvidia's A100 is configured and targeted at machine learning tasks and runs in data centers, not in glowing gaming PCs.

Large companies and startups developing software such as chatbots and image generators need hundreds or thousands of Nvidia's chips, which they either buy outright or access through a cloud provider.

For example, training the large language models behind the currently hot ChatGPT requires hundreds of GPUs, powerful enough to chew through terabytes of data quickly in order to recognize patterns. After that, GPUs like the A100 are needed for "inference": using the trained model to generate text, make predictions or identify objects in photos.
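
As a rough illustration of those two phases, the toy PyTorch sketch below (the model and data are hypothetical stand-ins, nowhere near a real language model) runs a training loop that repeatedly updates weights on the GPU, then a single inference pass with gradients disabled:

```python
# Toy sketch of training vs. inference (PyTorch; model and data are fake).
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"
model = nn.Linear(128, 2).to(device)  # stand-in for a real network
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Training: repeated forward/backward passes keep GPUs saturated,
# which is why large models tie up hundreds of GPUs for days or weeks.
for _ in range(100):
    x = torch.randn(64, 128, device=device)        # a batch of fake inputs
    y = torch.randint(0, 2, (64,), device=device)  # fake labels
    optimizer.zero_grad()
    loss_fn(model(x), y).backward()
    optimizer.step()

# Inference: no gradients, just a forward pass to produce an answer.
with torch.no_grad():
    prediction = model(torch.randn(1, 128, device=device)).argmax(dim=1)
    print(prediction.item())
```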

This means AI companies need access to a lot of A100s, and some entrepreneurs in the field even see the number of A100s they have as a sign of progress.

"A year ago we had 32 A100s," Emad Mostaque, CEO of Stability AI, wrote on Twitter in January. “Dream big and stack moar GPUs kids. Brrr," he continued. Stability AI is the company that helped develop Stable Diffusion, an image generator that gained traction last fall and reportedly valued it at more than $1 billion.

Stability AI now has access to more than 5,400 A100 GPUs, according to one estimate in the State of AI report, which charts and tracks which companies and universities hold the most A100 GPUs — though it doesn't count cloud providers, which don't publish their figures.

Nvidia CEO Jensen Huang talked nonstop about AI on a conference call with analysts on Wednesday, suggesting that the recent AI boom is central to the company's strategy.

"The last 60 days have just exploded around the AI infrastructure we've built, and the inference activity using Hopper and Ampere to influence large language models," Huang said. "There's no question that whatever we think about this year, because the last 60 days, 90 days, as we enter the year, there's been a lot of change."

New possibilities brought about by "cloud services"

Unlike other types of software, such as serving a web page, which uses processing power in occasional microsecond bursts, machine learning tasks can consume an entire computer's processing power, sometimes for hours or days.

This means companies that find themselves with a hit AI product often need to buy more GPUs to handle demand spikes or improve their models, and these GPUs are not cheap. Besides a single A100 on a card that can be slotted into an existing server, many data centers use a system of eight A100 GPUs working together.

Nvidia's own eight-GPU system, the DGX A100, has a suggested price of nearly $200,000, although it does come with the chips needed. On Wednesday, Nvidia said it would sell cloud access to DGX systems directly, which could lower the cost of entry for tinkerers and researchers.

It's easy to see how the cost of A100s can add up.

For example, New Street Research estimates that the OpenAI-based ChatGPT model inside Bing search could require eight GPUs to deliver a response to a question in less than one second. At that rate, Microsoft would need more than 20,000 eight-GPU servers to deploy the model in Bing to everyone, suggesting the feature could require $4 billion in infrastructure spending.

"If you're from Microsoft and you want to scale it, at Bing's scale, that's probably $4 billion. If you want to scale to Google, serving 8 or 9 billion queries a day, you actually need to spend $80 billion on DGX. Antoine Chkaiban, a technical analyst at New Street Research, said. The numbers we came up with were enormous. But they just reflect the fact that every user with such a large language model needs a large supercomputer when using it. ”

According to information posted online by Stability AI, the latest version of the image generator Stable Diffusion was trained on 256 A100 GPUs, or 32 machines with eight A100s each, for a total of 200,000 compute hours.

Stability AI CEO Mostaque said on Twitter that, at market prices, it cost $600,000 just to train the model, a price he suggested was unusually cheap compared with rivals'. That doesn't include the cost of "inference", or deploying the model.
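
Those numbers are easy to sanity-check. The sketch below redoes the arithmetic; the roughly $3-per-A100-hour rental rate is an assumed market price, not a figure from the article:

```python
# Sanity check on the Stable Diffusion training figures quoted above.
GPUS = 256                 # 32 machines x 8 A100s each
TOTAL_GPU_HOURS = 200_000  # reported total compute hours
ASSUMED_RATE = 3.0         # USD per A100-hour (assumed cloud rental price)

hours_per_gpu = TOTAL_GPU_HOURS / GPUS  # ~781 hours, i.e. roughly a month
cost = TOTAL_GPU_HOURS * ASSUMED_RATE   # -> $600,000, matching the tweet
print(f"{hours_per_gpu:.0f} hours per GPU, estimated cost ${cost:,.0f}")
```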

In an interview with CNBC's Katie Tarasov, Nvidia CEO Jensen Huang said the company's products aren't actually expensive given the amount of computation these models require.

"We reduced the data center that was originally worth $1 billion to a $100 million data center running CPUs to a $100 million data center," Huang said. "Now, $100 million, when you put it in the cloud and shared it by 100 companies, it's almost nothing."

Huang said Nvidia's GPUs let startups train models at a far lower cost than using traditional computer processors. "Now you can build something like a large language model, like a GPT, for about $10 million to $20 million," he said. "That's really affordable."

Perhaps, on hearing about this Nvidia "cloud service", many people will see it as a competitor to public clouds like AWS, Azure and Google. But according to The Next Platform, Nvidia is simply putting its own DGX systems, quite literally, into the big clouds so that customers can use exactly the same service in the cloud as they could install in their own data centers.

"It's similar to VMware's attempt to abandon the cloud in the fall of 2016 and partner with Amazon to build VMware Cloud on AWS," nextplatform said.

Regardless, while its competitors are still struggling, Nvidia seems to have officially turned the corner and entered a new era.
