
Generative AI computing chip requirements: Cloud computing is the most important basic platform for ChatGPT large models

ChatGPT is setting off a wave of large AI models around the world. In the United States, start-ups such as OpenAI and Anthropic, together with technology giants represented by Microsoft and Google, are racing ahead on large models, with parameter counts pushed as high as 562 billion. In China, veteran technology figures including Meituan co-founder Wang Huiwen, Alibaba's Jia Yangqing, former Sogou CEO Wang Xiaochuan, and former JD AI head Zhou Bowen have once again stepped back into the arena.

A new computing era is coming, and reserving computing power has become a must

With digital technology woven into every aspect of economic and social development, computing power has become ubiquitous. Since the beginning of this year, ChatGPT has swept the world, and the core engine behind it, computing power, has moved into the spotlight, becoming a focus of heated discussion across industries and one of the themes of the World Artificial Intelligence Conference.


The report points out that computing power is shifting from "passive" development, driven by demand such as terminal computing, to "active" development aimed at training large AI models, realizing general artificial intelligence, and surpassing classical computing. New hardware and new architectures are emerging, existing chips, operating systems, and application software may be overturned and rebuilt, and a new computing era is taking shape.

Before the advent of deep learning, the computing power used for AI training doubled roughly every 20 months, basically in line with Moore's Law. After deep learning emerged, it doubled roughly every 6 months; and since 2012, the compute consumed by the largest (head) AI training runs has accelerated to double roughly every 3 to 4 months, an average annual growth of a staggering 10 times.
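As a rough illustration of how these doubling periods translate into annual growth (a minimal sketch, not from the report; the 3.5-month figure is an assumed midpoint of the "3 to 4 months" range):

```python
# Convert a compute doubling period (in months) into an annual growth multiplier:
# growth_per_year = 2 ** (12 / doubling_months)
for label, months in [("pre-deep-learning (~Moore's Law)", 20),
                      ("deep-learning era", 6),
                      ("head training runs since 2012", 3.5)]:
    annual = 2 ** (12 / months)
    print(f"{label}: doubling every {months} months ≈ {annual:.1f}x per year")
```

Running this gives roughly 1.5x, 4x, and 10.8x per year, consistent with the "about 10 times per year" figure cited above.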

With large-model development now in full swing, demand for training compute is expected to grow to 10 to 100 times its current level, making the exponential growth curve even steeper. As Sam Altman, CEO of OpenAI and "father of ChatGPT", said on social media, a new Moore's Law may soon emerge.


Development boom: from a hundred-model war to a ten-thousand-model dance

Since 2023, the hottest topic has been large AI models. In the wake of ChatGPT, more than 80 large models have been publicly released. At the 2023 Global Digital Economy Conference Artificial Intelligence Summit Forum held a few days ago, Zhou Hongyi, founder of 360, even predicted that the future may bring not a hundred-model war but a dance of ten thousand models.

In the large-model industry, companies such as Baidu, Alibaba, and Huawei laid out their plans early. When ChatGPT set off a wave in 2023, Baidu released its large language model "Wenxin Yiyan" in March, the first ChatGPT-like product in mainland China, and large models from a number of other companies soon followed.

As of July 3, there were more than 80 large models with over 1 billion parameters in mainland China. They include products from Internet giants; from artificial intelligence companies such as Megvii Technology, SenseTime, and iFLYTEK; from start-ups such as Lightyear Away and Baichuan Intelligence; and from research institutes such as the Institute of Automation of the Chinese Academy of Sciences and the Shanghai Artificial Intelligence Laboratory.


As more and more large-model products arrive, the question of commercialization has also been raised; only by achieving commercialization can large-model products develop sustainably. Zhou Bowen, founder of Zhiyuan Technology, pointed out: "Chinese enterprises have not yet broadly achieved large-scale revenue growth and profit contribution through artificial intelligence technology, and we should face this reality."

"The best application scenario of the big model is to help enterprises do multi-link, multi-department, multi-process knowledge management, open up all financial, travel and rank systems, complete various qualification management, market insights and customer interaction, and more pre-sales interaction with customers, and provide employees with productivity tools."

In fact, some have already begun asking whether the "hundred-model war" involves duplicated construction, and predict that only a few strong players will win and survive. However, no matter how many enterprises survive, only those that can truly put large models into application and bring value to society will thrive in the competition.
