The semiconductor market, which had slid into a downturn amid an inventory correction, has recently drawn renewed attention thanks to the popularity of ChatGPT.
Because ChatGPT is a generative-AI application, NVIDIA, widely regarded as the bellwether stock for "AI chips", rose in response. At the close of trading on February 13, NVIDIA's share price had reached $217, up roughly 50% from $143 on January 3 this year. In a speech, NVIDIA founder and CEO Jensen Huang even called ChatGPT "the iPhone moment of artificial intelligence" and "one of the greatest technologies in computing."
In addition, demand for high-performance memory chips from emerging AI products such as ChatGPT is driving shipments at related manufacturers. According to the Korea Economic Daily, Samsung and SK Hynix have received orders for high-bandwidth memory (HBM) thanks to ChatGPT.

In institutions' view, ChatGPT, as an application of AI-generated content (AIGC), will drive overall demand across the chip industry. Guo Junli, research director at IDC China, told a First Financial reporter that although domestic manufacturers have not yet seen a significant pull on shipments, they will catch up as application scenarios land and the supply of foreign high-performance chips remains limited.
How much will the chip industry be affected?
On February 14, the semiconductor sector strengthened: Tongfu Microelectronics (002156) hit its daily limit during the session, Yongsi Electronics (688362) rose more than 9%, and Huatian Technology (002185), JCET (600584), VeriSilicon-U (688521), China Wafer Level CSP (603005) and others followed. As of the close on the 14th, the third-generation semiconductor index (885908) stood at 1,365.239 points, up 0.02%; the index has risen 3.38% since February 1.
Among ETFs, as of the close on the 14th, the Chip Leader ETF (516640) and the Chip 50 ETF (516350) rose more than 2%, while the Chip ETF Fund (516920), Semiconductor Leader ETF (159665), Semiconductor ETF (159813), China-Korea Semiconductor ETF (513310) and Chip Leader ETF (159801) followed suit.
In the industry's view, the popularity of compute-intensive AI applications such as ChatGPT will bring a new round of opportunities to domestic chip manufacturers.
Chai Daixuan, executive director at CIC Consulting, told a First Financial reporter that ChatGPT, as an application of AI-generated content (AIGC), will drive overall demand across the chip industry.
"ChatGPT has a large number of AI models with complex computing requirements, and the computing power consumption is very huge, which requires a powerful AI chip to provide the computing power foundation. AI chips are specialized modules that handle a large number of computing tasks in AI applications, including GPUs (graphics processing units), FPGAs (Field Programmable Gate Arrays), ASICs (Application-Specific Integrated Circuits), and DPUs (Processor Distributed Processing Units). Chai Daixuan told reporters that ChatGPT's high computing power demand will have a substantial impact on these sectors. In addition, high-computing power chips and high-speed memory complement each other, and memory interface chips may also be affected.
Guo Junli holds a similar view. She believes the actual impact of ChatGPT on the chip sector is concentrated in a few areas. "First, ChatGPT is built on Transformer technology; as the model keeps iterating and the number of layers grows, the demand for computing power rises. Second, ChatGPT depends on three things to run: training data, model algorithms and computing power, and it requires large-scale pre-training of the base model. The underlying model has gone through three iterations, with the parameter count growing from 117 million to 175 billion, so the training workload has increased dramatically."
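To make that compute claim concrete, here is a rough back-of-envelope estimate, assuming the widely used approximation that training a Transformer takes about six floating-point operations per parameter per training token, together with the publicly reported GPT-3 figures of 175 billion parameters and roughly 300 billion training tokens (assumptions used for illustration, not figures from this article):

\[
C \approx 6ND \approx 6 \times (1.75 \times 10^{11}) \times (3 \times 10^{11}) \approx 3 \times 10^{23}\ \text{FLOPs}
\]

At that scale, training is only practical on large clusters of accelerators, which is the kind of computing-power demand Guo Junli describes.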
Guo Junli told First Financial that in the new AI era represented by ChatGPT, computing power will become the core competitive strength. The surge in traffic from both new and existing scenarios is driving growth in demand for AI chips. Chips that provide computing power, such as GPUs, CPUs, FPGAs and ASICs, and chips that provide memory, such as HBM and DRAM, will all benefit from the explosion of ChatGPT-style applications; AI industrialization will advance from software to hardware, a "semiconductor + AI" ecosystem will gradually take shape, and AI chip products will reach large-scale deployment.
"GPUs can support the demand for powerful computing power, and GPUs are widely used in acceleration chips due to their parallel computing capabilities and compatibility with training and inference capabilities. GPU suppliers such as NVIDIA, Haiguang Information, Jingjiawei and other domestic and foreign enterprises will become beneficiaries. In addition, ChatGPT performs large computing power while also requiring large memory support, and NVIDIA GPUs are equipped with a large number of DRAM, including high-bandwidth memory (HBM). Samsung Electronics, SK Hynix, Micron, etc. are expected to directly or indirectly benefit from the rapid growth of demand for NVIDIA AI chips. Guo Junli said.
Domestic manufacturers have not yet seen a significant pull on shipments
In the early days of ChatGPT commercialization, NVIDIA has been favored by the capital market. The main reason is that the AI servers behind ChatGPT's natural language processing, machine learning and other functions pair CPUs with acceleration chips; the acceleration chips include GPUs, FPGAs and ASICs, and the CPU-plus-accelerator combination meets the need for high-throughput interconnection, with the GPU playing the key role.
The performance of the GPU is the source of the model's powerful computing capability. In 1999, NVIDIA introduced the GeForce 256 graphics card and for the first time defined the graphics processor as a "GPU", establishing its dominant position in the field. Guo Junli said that according to IDC's estimates, ChatGPT is likely to drive sales of NVIDIA-related products of $3.5 billion to $10 billion within 12 months.
However, GPU chips also have shortcomings, such as weak control-logic capability and high power consumption. Zheshang Securities pointed out that as ChatGPT usage surges, OpenAI needs more computing power to serve millions of users, which has increased demand for GPU chips.
At present, the GPU is also the main battleground for global chip giants, with Intel, AMD, Apple and others all making moves. Apple earlier introduced M2-series chips with built-in AI accelerators (the M2 Pro and M2 Max) and said the two chips would go into its new computers; AMD then announced plans to launch its "Phoenix" series chips, built on TSMC's 4nm process to compete with Apple's M2 series, as well as the "Alveo V70" AI chip built with a chiplet design. Both are scheduled to launch this year, targeting the consumer electronics market and AI inference respectively.
Major domestic GPU manufacturers include Jingjiawei, Loongson Zhongke and Haiguang Information. Judging from the public responses of institutions and companies, however, China's ChatGPT-related chip business is still at an early stage of development.
In its latest response, Jingjiawei (300474) said that its products are not involved in AI training or ChatGPT-related business. "The company's GPUs are mainly used for graphics processing, and ChatGPT requires high-speed computing that we do not currently do."
Loongson Zhongke likewise said that the company currently has no products aimed at ChatGPT.
From the perspective of development trends, however, Chai Daixuan told reporters that ChatGPT's computing power can be supplied either by GPUs or by a CPU + FPGA route, so in addition to the GPU market, the FPGA market is also expected to grow rapidly. "FPGAs offer high flexibility, short development cycles and low latency; compared with CPUs/GPUs/ASICs they deliver higher speed at very low computing energy consumption, and they serve as accelerators for high-computing-power chips. Although Xilinx and Intel currently account for a large share of FPGA output, domestic manufacturers represented by Unigroup Guowei, Fudan Microelectronics and Anlu Technology also have broad room to improve and grow."
In addition, Guo Junli told reporters that CPU-plus-accelerator combinations used for inference can meet the need for high-throughput interconnection, with suppliers including Loongson and China Great Wall, while ASICs deliver extreme performance and power efficiency, with suppliers including Cambricon and Montage Technology.
"However, at present, domestic manufacturers have not yet had a significant shipment pull." Guo Junli believes that in addition to market demand, the development of ChatGPT still needs to pay attention to many aspects, the first is the construction of industrial ecology, such as the integration with Microsoft-related applications is a good idea. Secondly, according to the research report, the cost of a reply to ChatGPT is about 6 times to 28 times the average cost of Google search queries, and if you want to achieve sustainable development, you need to constantly optimize the cost structure and reduce costs. Finally, ChatGPT faces high content risks, such as when it comes to ideology, values, violence, intellectual property and other issues, whether AI can effectively deal with it, and whether the risk control model is intelligent enough are all aspects that need to be considered.