
Ding Qi: 2024 AI Industry Investment Guide

Author: NewEconomist

Ding Qi, chief analyst for cloud infrastructure at CITIC Securities (file photo). This article is an excerpt from Ding Qi's speech at a New Economist Think Tank seminar.


60-Second Speed Read:

The rapid development of AI technology is driving exponential growth in demand for computing power, and large language models are expected to reach hundreds of billions or even trillions of parameters.

NVIDIA, AMD, and Google's TPU are competing fiercely in computing chips: NVIDIA's GPU products dominate the AI market, while AMD and Google's TPU are seeking breakthroughs.

HBM memory technology is expected to be upgraded from HBM2e to HBM3e, increasing memory bandwidth and reducing power consumption to support the growing demands of AI models.

Server manufacturers are introducing higher-performance products to meet the growing demand for processing power and memory for AI models.

Optical transceiver manufacturers are developing next-generation products to meet the demand for higher bandwidth and lower latency for AI models.

The application of AI technology in areas such as image generation, video production, education, and translation is expected to bring significant changes. Domestic substitution of computing power, Huawei's industry chain, and the media industry's urgent demand for AI technology offer investors new opportunities.

Body:

Computing power market outlook for 2024: The demand for computing power is growing exponentially

With the rapid development of AI technology, the demand for computing power is growing at an exponential rate, which is far faster than Moore's Law.

In 2024, large language models, especially models with hundreds of billions or even trillions of parameters, are expected to become the dominant force in the market. The rise of multimodal technology, in which models process multiple types of data (text, images, audio, etc.) at the same time, will further drive demand for computing power.
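To make the scale of this demand concrete, here is a minimal back-of-the-envelope sketch. It assumes the common approximation that training a dense transformer costs roughly 6 x parameters x tokens floating-point operations; the parameter count, token count, and per-GPU throughput below are illustrative assumptions, not figures from the speech.

```python
# Back-of-the-envelope training-compute estimate for a large language model.
# Assumptions (illustrative, not from the speech): dense transformer,
# training FLOPs ~= 6 * parameters * tokens, and a sustained per-GPU
# throughput that already includes utilization losses.

def training_gpu_days(params: float, tokens: float,
                      sustained_flops_per_gpu: float) -> float:
    """Return the number of GPU-days needed to train the model once."""
    total_flops = 6.0 * params * tokens          # widely used approximation
    gpu_seconds = total_flops / sustained_flops_per_gpu
    return gpu_seconds / 86_400                  # seconds per day

if __name__ == "__main__":
    # 1e12 parameters, 2e12 training tokens, ~400 TFLOPs sustained per GPU
    # (roughly an H100-class accelerator at moderate utilization -- assumed).
    days = training_gpu_days(params=1e12, tokens=2e12,
                             sustained_flops_per_gpu=4e14)
    print(f"~{days:,.0f} GPU-days, i.e. ~{days / 10_000:,.0f} days on 10,000 GPUs")
```

Under these assumptions a single training run of a trillion-parameter model already ties up roughly ten thousand accelerators for about a month, which is the sense in which demand is outpacing Moore's Law.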

Innovation in storage technology: HBM2e will be upgraded to HBM3e

In 2024, memory technology will be upgraded from HBM2e to HBM3e, significantly increasing capacity and bandwidth to support the continuous iteration and upgrading of computing power.

This will not only increase the speed of data processing, but will also open the door to new application scenarios and technological innovations. With the maturity and widespread application of HBM3e technology, we have reason to believe that it will become one of the key technologies driving the development of AI and data-intensive applications in the coming years.
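As a rough illustration of what this generational step means, the sketch below compares approximate per-stack bandwidth for HBM2e and HBM3e. The pin speeds are approximate public figures used here as assumptions, not data from the speech.

```python
# Rough per-stack bandwidth comparison of HBM generations.
# The pin speeds below are approximate public figures (assumptions),
# with a 1024-bit interface per stack in both generations.

PIN_SPEED_GBPS = {"HBM2e": 3.6, "HBM3e": 9.6}   # Gbit/s per pin (approx.)
BUS_WIDTH_BITS = 1024                            # bits per stack

for gen, pin_speed in PIN_SPEED_GBPS.items():
    gb_per_s = pin_speed * BUS_WIDTH_BITS / 8    # GB/s per stack
    print(f"{gen}: ~{gb_per_s:,.0f} GB/s per stack")

# For a memory-bound workload such as LLM token generation, time per token
# scales roughly with (bytes read per token / memory bandwidth), so a
# ~2-3x bandwidth jump translates fairly directly into faster decoding.
```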

Competitive landscape for chip giants: NVIDIA will dominate

In computing chips, competition among NVIDIA, AMD, and Google's TPU is intensifying. NVIDIA dominates AI training and inference with its GPU products and is expected to reach $80 billion in GPU revenue in 2024.

AMD has made significant progress in the optimization of large-scale model software with the MI300, and although its market share is small, it is actively expanding its presence in the AI field. Google's TPU focuses on dedicated AI chip design, aiming for low power consumption and high efficiency, providing an alternative for AI applications.

The consensus estimate for NVIDIA's 2024 revenue is close to $60 billion. AMD has shown its ambition with the new MI300, which the company expects to ship 300,000-400,000 units of in 2024, and it will strive to build its own ecosystem and application cycle.

Google focuses on dedicated AI chip design aimed at large models, emphasizing specialization and the needs of a single class of application. For large-model workloads, dedicated AI chips may be the better solution: a purpose-built design meets the specific needs of large models and improves performance and efficiency. As the technology and its applications advance, the large-model field will become more stable and mature.

Dedicated AI chips are not yet the optimal solution for large models, but they may be the ideal choice in the long run. Google has chosen to develop dedicated AI chips, its TPUs, which continue to iterate and improve every year. Domestic manufacturers such as Cambricon and Huawei can learn from Google's route and launch specialized chips optimized for large models, competing for market segments on low power consumption.

Server Market: The AI server market will continue to grow

The server market is expected to continue to grow in 2024, especially the AI server market, driven by the increasing computing demands of AI models. Liquid cooling technology will be widely adopted in 2024 to address high power consumption and heat dissipation and to improve server stability and efficiency.

In 2023, the server market performed strongly, in contrast to the chip market. Globally, companies including Supermicro in the United States, Inventec in Taiwan, Foxconn Industrial Internet, and Inspur have performed well in the capital markets. With technological advances and application development, the field is expected to become more stable and mature. Outlets such as The Next Platform estimated that the AI server market would reach $40 billion in 2023, and the actual market may be even larger. The server market built around NVIDIA chips has huge potential.

The server market as a whole is in a boom phase, and the advent of AI has made it more concentrated. Large models and clusters require heavy capital investment, raising the barrier to entry, so leading companies have benefited most from the AI wave and market concentration is high.

The gross margin of AI servers is better than that of general-purpose servers, and competition among manufacturers is relatively less fierce, making this a track worth continued attention.

In 2024, liquid cooling technology is expected to be deployed at scale in the server field. At present, thousand-card (GPU) clusters consume a great deal of power for heat dissipation and have high failure rates, so liquid cooling has become the solution. According to CCID Consulting data, the liquid cooling market is expected to reach the 100-billion level by 2024, and may further expand to around the 130-billion level by 2025.

The mainstream liquid cooling approach is cold-plate cooling, which carries heat away through metal plates and circulating liquid. Immersion cooling submerges servers in liquid but is currently less widely used. As power consumption and heat dissipation requirements rise, immersion cooling's higher heat dissipation efficiency may win it a growing share.
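To give a sense of why cooling has become a bottleneck, here is a minimal sketch of the heat load of a thousand-card cluster and the facility power under air versus cold-plate liquid cooling. The per-node power and PUE values are illustrative assumptions, not data from the speech.

```python
# Rough heat-load estimate for a thousand-card training cluster, and the
# facility power under air vs cold-plate liquid cooling.
# All figures are illustrative assumptions, not data from the speech.

GPUS = 1_000
GPUS_PER_NODE = 8
NODE_POWER_KW = 10.0          # ~8 accelerators + CPUs, NICs, fans (assumed)
PUE = {"air cooling": 1.5, "cold-plate liquid cooling": 1.2}  # assumed PUEs

nodes = GPUS // GPUS_PER_NODE
it_load_kw = nodes * NODE_POWER_KW
print(f"IT load: {it_load_kw:,.0f} kW for {nodes} nodes")

for method, pue in PUE.items():
    facility_kw = it_load_kw * pue
    print(f"{method}: ~{facility_kw:,.0f} kW total facility power (PUE {pue})")
```

Even under these modest assumptions the IT load alone exceeds a megawatt, which is why heat removal, not just chip supply, shapes cluster economics.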

Evolution of communication networks and optical modules

In 2024, optical modules will continue to iterate, increasing speed and reducing power consumption to support higher data transmission requirements. At the network level, Remote Direct Memory Access (RDMA) will be used to reduce latency and meet the low-latency requirements of the AI era.

AI demand has made optical modules and communication networks among the strongest sectors at present. The goals are high HBM bandwidth and data throughput, less wasted computing capacity, and better transmission efficiency. As computing power increases, throughput must rise at every tier, driving upgrades to storage, boards, and networks.

AI networks require high bandwidth and low latency. Single-card capability and multi-card efficiency are both important: a high-efficiency cluster can significantly improve the acceleration ratio. Meeting the demand for high throughput requires stronger communication networks.
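The link between network quality and the acceleration ratio can be illustrated with a toy Amdahl-style model, in which a fixed share of each training step is communication that does not shrink as more cards are added. The communication fractions below are assumptions for illustration only.

```python
# Toy model of the cluster "acceleration ratio" mentioned above:
# if a fraction of each training step is communication that does not
# speed up with more cards, scaling efficiency falls as the cluster grows.
# The communication fractions below are assumptions for illustration.

def speedup(n_gpus: int, comm_fraction: float) -> float:
    """Amdahl-style speedup with a non-parallelizable communication share."""
    return 1.0 / (comm_fraction + (1.0 - comm_fraction) / n_gpus)

for comm in (0.05, 0.20):                 # 5% vs 20% of step time on comms
    for n in (8, 64, 512):
        s = speedup(n, comm)
        print(f"{n:>4} GPUs, {comm:.0%} comm: speedup {s:6.1f} "
              f"(efficiency {s / n:.0%})")
```

The point of the toy model is simply that shrinking the communication share, through higher bandwidth and lower latency, is what keeps a large cluster's acceleration ratio close to the ideal.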

Optical module throughput has increased, but problems with power consumption and failure rates have followed. This year the trend is to improve optical module reliability by reducing power consumption. The loss between the optical module and the switch chip also needs to be reduced, which gives the switch chip a larger role.

To meet the low-latency requirements of the AI era, the data transmission model has been transformed. A variety of hardware solutions exist, such as NVIDIA's Mellanox-based InfiniBand and Ethernet.

Computing power will see big changes: first chip iterations, followed by server and storage upgrades. Optical modules continue to iterate, with higher speed and lower power consumption as the main directions. At the network level, RDMA is used to reduce latency, and two routes, InfiniBand and Ethernet, compete in the market.
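As a rough illustration of why optical module demand scales with cluster size, the sketch below counts transceivers for a GPU cluster under an assumed non-blocking two-tier leaf/spine fabric with optical links on every tier. The NIC speed, module speed, and topology are illustrative assumptions, not figures from the speech.

```python
# Rough optical-module count for an AI training cluster, assuming a
# non-blocking two-tier leaf/spine fabric with optical links everywhere.
# NIC speed, module speed, and topology are illustrative assumptions.

GPUS = 1_024
NIC_GBPS = 400          # one 400G NIC per GPU (assumed)
MODULE_GBPS = 800       # 800G transceivers on switch-to-switch links (assumed)

# Tier 1: GPU/NIC to leaf switch. One transceiver at each end of the link.
gpu_leaf_modules = GPUS * 2                       # 400G-class modules

# Tier 2: leaf to spine. Non-blocking means uplink bandwidth equals the
# bandwidth coming in from the GPUs, carried on 800G links, two modules each.
leaf_spine_links = GPUS * NIC_GBPS // MODULE_GBPS
leaf_spine_modules = leaf_spine_links * 2         # 800G modules

print(f"{gpu_leaf_modules} x {NIC_GBPS}G modules on the server-to-leaf tier")
print(f"{leaf_spine_modules} x {MODULE_GBPS}G modules on the leaf-to-spine tier")
print(f"~{(gpu_leaf_modules + leaf_spine_modules) / GPUS:.1f} modules per GPU in total")
```

Under these assumptions each GPU pulls roughly three optical modules along with it, which is why optical module demand rises faster than chip shipments alone would suggest.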

Applications: AI image and video companies will be the first to rise

AI applications are attracting intense attention, and everyone is watching which companies and applications will break out first.

AI image and video generation are the first application areas to emerge because they have a high tolerance for errors and can greatly improve efficiency. Office, medical, education, and other fields with high rigor requirements and low error tolerance will follow gradually. Fields with high error tolerance and efficiency gains of dozens of times, such as AI drawing, are the easiest to adopt first. Other opportunities include AI for spoken English conversation and translation in education. AI applications will progress from easy to hard, and we will focus on companies that excel in areas such as images, text, and video.

In the future, labor-intensive industries such as customer service, creative design, and programming will be the first to benefit from AI technology, as these fields can significantly reduce labor costs through AI automation.

Industry trends and investment opportunities

In 2024, AI technology will show its growth potential in several fields, especially in industries such as gaming, education, and healthcare. In gaming, AI will enhance the player experience, creating richer and more immersive games through automated and intelligent methods such as AI-generated characters and environments and intelligent game-design tools. The education industry will also benefit from AI, improving the quality and efficiency of education through personalized learning platforms and intelligent teaching assistants. In the medical field, AI will play an important role in disease diagnosis, patient monitoring, and drug development, improving the accuracy and responsiveness of medical services.

The rapid development of autonomous driving technology will promote the prosperity of related industrial chains. LiDAR, millimeter-wave radar, and algorithm companies will see significant growth due to their use in autonomous driving systems. These technologies are essential for autonomous navigation and safe driving of vehicles, and as autonomous driving technology matures and regulations improve, companies will see more market demand.

The media industry will undergo a revolution brought about by AI technology. The application of AI in content creation, editing, and distribution will greatly reduce production costs and improve the efficiency of content production. Concept art production in the gaming industry and virtual streamers in live streaming will benefit significantly from AI, and these applications will change the traditional modes of content production and consumption.

With the widespread application of AI technology, investment opportunities are also increasing. Domestic substitution of computing power has become a trend, and Huawei and companies in the related industry chains, including those affiliated with the Chinese Academy of Sciences, will be favored by investors, especially in server and storage technology. These companies can provide high-performance computing and storage solutions, and their market prospects are promising as domestic demand for autonomous and controllable technology grows.

Server manufacturers in Huawei's industry chain will also be an investment hot spot. As Huawei expands in the global market, vendors that supply it with servers will gain stable orders and growth opportunities. While meeting Huawei's growing demand for servers, these companies may also improve their competitiveness through technological innovation and market expansion.

Companies in the media sector that have been affected by AI technology are also worth watching. Companies that can use AI to reduce content production costs and improve content quality and distribution efficiency will have an advantage in the market of the future. As AI technology continues to advance, these companies are poised to achieve new breakthroughs in areas such as content creation, editing, and distribution. ■
