
ChatGPT and Smart Pension Finance: A New Perspective for Analyzing Artificial Intelligence Application Models | Fintech

Author: Tsinghua Financial Review

By Liu Xuguang, Deputy Director of the Second Business Department of the China Internet Finance Association, and Li Gen, Research Fellow at the Investment and Financing Research Center, Chinese Academy of Social Sciences

The breakthrough of ChatGPT has charted a new path for the development of the artificial intelligence industry. Following the logic of integrating technology, industry, and finance, this paper argues that both vertical and horizontal applications of artificial intelligence will steadily improve the supply quality of the elderly care industry and its services, empower smart pension finance to accelerate innovation and breakthroughs, and continuously open up new service scenarios and tap new functional value as financial digitalization, pension digitalization, and the integration of industry and technology advance.

Over the decades of artificial intelligence development, technological breakthroughs have carried the field through troughs and peaks of commercial development, leading round after round of flourishing applications. In recent years, the emergence of ChatGPT has set off a new wave of technological innovation and application exploration. According to the "China Artificial Intelligence Large Model Map Research Report", artificial intelligence large models have developed explosively since 2020: more than 200 large models have been released worldwide, and the number in China had risen to 79 by the end of May 2023. The innovative exploration of large models has also injected strong momentum into the growth of the artificial intelligence market, with International Data Corporation (IDC) predicting that the market for China's artificial intelligence software and applications will grow from $5.1 billion in 2021 to $21.1 billion.

A Brief Overview of ChatGPT and Its Main Features

ChatGPT stands for Chat Generative Pre-trained Transformer, a conversational interaction model released by the American artificial intelligence company OpenAI in November 2022. ChatGPT uses reinforcement learning from human feedback to achieve human-computer interaction as smooth as natural dialogue between people. Reportedly, ChatGPT gained 1 million users within a week of its release and has been called the "fastest-growing user application in history"; UBS estimated in a report that its users had exceeded 100 million. ChatGPT's breakthrough has outlined an emerging path for the development of the artificial intelligence industry, and the development model attracting the most concentrated resource investment is the continuous improvement of the technical factors that allow ChatGPT to become mature and robust.
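To make the idea of reinforcement learning from human feedback more concrete, the following is a minimal, purely illustrative Python sketch, not OpenAI's actual pipeline: candidate replies are scored by a stand-in for human preference, and the policy's weights are nudged toward the replies that score well. All names, replies, and scores here are hypothetical.

```python
# Illustrative sketch of the RLHF idea only: a toy policy over canned replies is
# reinforced toward replies that a stand-in "human feedback" function prefers.
import random

# Toy policy: unnormalized preference weights over canned replies (hypothetical data).
policy = {"Hello! How can I help?": 1.0,
          "What?": 1.0,
          "I am a language model, ask me anything.": 1.0}

def sample_reply(policy):
    replies, weights = zip(*policy.items())
    return random.choices(replies, weights=weights, k=1)[0]

def human_feedback(reply):
    # Stand-in for a reward model trained on human preference rankings.
    return 1.0 if "help" in reply or "ask me" in reply else -1.0

learning_rate = 0.1
for _ in range(200):
    reply = sample_reply(policy)
    reward = human_feedback(reply)
    # Policy-gradient-flavored update: reinforce replies that humans prefer.
    policy[reply] = max(0.01, policy[reply] + learning_rate * reward)

print(max(policy, key=policy.get))  # converges toward a preferred reply
```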

In terms of application, ChatGPT at its release mainly supported text-based interaction; after technical iteration and the development of new models, images, audio, and video have also become interactive content. This cluster of technologies supporting multi-type (multimodal) content interaction is generally referred to as generative artificial intelligence (AIGC). Various generative AI systems are being applied vigorously across scenarios, and many application solutions have emerged in office software, social entertainment, business marketing, home assistants, finance, and other settings. Generative AI is often regarded as the technical counterpart of discriminative AI: the latter is good at identifying differences and then classifying, while the former identifies connections and then combines. In essence, the role of discriminative AI can be regarded as the deconstruction of information and that of generative AI as the reconstruction of information, and the relationship between the two should be understood as complementary and unified rather than isolated or opposed. Deconstruction is the premise of reconstruction, and reconstruction is the purpose of deconstruction.
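The contrast between discriminative and generative approaches described above can be illustrated with a deliberately simple Python sketch (the words and labels are invented for illustration and stand in for real training data): the discriminative part assigns a label to an input, while the generative part learns word-to-word links and recombines them into new content.

```python
# Toy contrast: discriminative = classify an input; generative = learn structure
# and produce new content. Data below is hypothetical.
from collections import defaultdict
import random

corpus = "smart pension finance serves the elderly care industry".split()

# Discriminative flavor: map an input word to a label using a simple learned rule.
finance_words = {"pension", "finance", "investment"}
def classify(word):
    return "finance-related" if word in finance_words else "other"

# Generative flavor: learn word-to-word transitions, then generate a new sequence.
transitions = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    transitions[prev].append(nxt)

def generate(start, length=5):
    out = [start]
    for _ in range(length):
        options = transitions.get(out[-1])
        if not options:
            break
        out.append(random.choice(options))
    return " ".join(out)

print(classify("pension"))  # discriminative: deconstructs the input into a label
print(generate("smart"))    # generative: reconstructs new content from learned links
```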

In terms of algorithms, ChatGPT itself has developed from GPT-1, with 110 million parameters, to GPT-4, which is reported to have trillion-scale parameters, and the industry has also developed new models represented by ELMo, BERT, ChatGLM2, Llama2, and PaLM2. Despite differences in technical routes, the parameter scale of these models generally reaches tens of millions or even hundreds of millions, and for this reason the industry collectively calls them large models. The journal Nature Machine Intelligence has offered a definition: a large model is a pre-trained deep learning algorithm whose network parameter scale exceeds 100 million. Classified by content type, large models that are mainly trained on and process text are also collectively referred to as large language models (LLMs). The industry summarizes the characteristics of large models as "three bigs and one fast": their development and operation rely on large computing power, are based on big data, and use large algorithms, and once developed they can be iterated and deployed quickly, giving them the capacity to serve as infrastructure. The counterpart is the small model, which is usually considered small in size, easy to develop and manage, relatively more flexible, and better suited to the application needs of vertical fields or subdivided scenarios.
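As a rough illustration of what "parameter scale" means in practice, the sketch below (assuming PyTorch is installed) counts the trainable weights of a toy network; the architecture is invented purely for illustration, and GPT-class models apply the same counting to vastly larger stacks of transformer blocks.

```python
# Minimal sketch of parameter counting: the total number of trainable weights is
# the figure that separates "large" models from "small" ones.
import torch.nn as nn

# Tiny stand-in network, not a real language model.
toy_model = nn.Sequential(
    nn.Embedding(num_embeddings=50_000, embedding_dim=128),  # vocabulary lookup
    nn.Linear(128, 512),
    nn.ReLU(),
    nn.Linear(512, 50_000),                                   # next-token scores
)

n_params = sum(p.numel() for p in toy_model.parameters())
print(f"{n_params:,} trainable parameters")  # tens of millions here, vs. 1.1e8 for GPT-1
```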

In terms of computing power, ChatGPT's total compute consumption is reported to reach 3,640 PF-days (that is, computing at one quadrillion operations per second for 3,640 days), and supporting such consumption would require seven to eight data centers with investment on the order of 3 billion yuan. According to Huawei's research, demand for artificial intelligence computing power in 2030 will reach 390 times that of 2018. It is foreseeable that the operation and development of future large models will require massive support from chips and equipment, which will drive major technological innovation in chips and computing hardware. The world's major chip companies Nvidia and AMD have both launched super chips designed for large AI models.
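As a back-of-the-envelope check on the 3,640 PF-days figure cited above, the short calculation below converts it into a total number of floating-point operations, taking one PF-day to mean a sustained rate of one quadrillion operations per second for one day.

```python
# Convert PF-days into total floating-point operations.
PFLOPS = 10 ** 15        # one petaflop: a quadrillion operations per second
SECONDS_PER_DAY = 86_400

pf_days = 3640
total_ops = pf_days * PFLOPS * SECONDS_PER_DAY
print(f"{total_ops:.2e} floating-point operations")  # roughly 3.1e23
```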

In terms of training data, earlier versions of ChatGPT used datasets of billions of words and tens of gigabytes, while the training data used by current large models generally reaches hundreds of billions of words and thousands of gigabytes. Beyond scale, data quality also has a key impact on training effectiveness; in fact, the R&D team of MOSS, a domestic ChatGPT-like model, has attributed the model's insufficient Chinese-language capability to excessive advertising and other noise on the Chinese-language internet. Because large models consume data so quickly, data shortage is also considered a main bottleneck limiting their further development, and some have argued that synthetic data may be an effective solution.

The optimization of large models has enhanced the capabilities of artificial intelligence, but widely debated problems such as poor user experience and potential risks suggest that artificial intelligence is still far from perfect. Judging from evaluations of large models carried out by domestic think tanks, MIT Technology Review, International Data Corporation, and other research institutions, different large models have their own strengths, and overall there is still much room for improvement. At the same time, many studies have revealed risks posed by large models, including knowledge infringement, algorithmic black boxes, algorithmic discrimination, and information leakage. People marvel at the great capabilities shown by artificial intelligence, yet remain cautiously vigilant about its shortcomings...


Source: Tsinghua Financial Review, September 2023, Issue 118

Edited by Sun Shixuan
