
Was artificial intelligence "played" into existence?

Author: Vegetable Root Tan Vision

Today I want to talk about a particularly geeky topic: graphics cards and artificial intelligence.

In the past few days, the 2023 World Artificial Intelligence Conference ("WAIC") was held in Shanghai. Musk gave the opening remarks, and AMD's Lisa Su, along with a number of Nobel and Turing Award laureates, kept coming back to one word when discussing artificial intelligence: computing power. And behind computing power stands the graphics card.

As a key piece of hardware infrastructure, the graphics card is tied to the prospects and future of artificial intelligence as a whole.

You may find it strange that the graphics card, a gadget for gamers, climbed step by step to its current status and became the key to artificial intelligence.

Nowadays, when major companies want to train AI models, the first step is to stockpile graphics cards. The Wall Street Journal recently reported that the United States may ask American chip companies to stop selling AI chips to Chinese customers as soon as this month. A GPU is essentially a chip too, so rounding up, we may face a graphics card crisis from the second half of the year.

You may not realize it, but graphics cards really are vital to the future of science and technology. Some even say the key to victory on future battlefields is strongly tied to graphics cards, which sounds a little fantastical.

For readability, this article does not distinguish between the concepts of "graphics card", "GPU", and "GPGPU"; all are referred to uniformly as "graphics card".

In the beginning there were no graphics cards, and of course no video games either. After Spacewar!, one of the first video games, came out of MIT in the early 1960s, no one expected people to like gaming so much. The earliest people obsessed with games were precisely the early computing bigwigs.

For example, everyone knows the Unix operating system, which the legendary Ken Thompson first developed in part so he could play his game Space Travel; it later became the operating system of the US military, universities, and research institutes.

Then there is Musk, who programmed a small game at the age of 12 and, even at his age, is still a die-hard fan of Elden Ring and Cyberpunk 2077.

However, the earliest games did not run on graphics cards; there were no graphics cards at first, and the graphics back then were simple enough not to need one.

By the 1990s, games were growing in scale, graphics were becoming more refined and going 3D, and the demands on the CPU kept climbing, yet the CPU happened to be bad at graphics processing. Many people realized there should be dedicated hardware for this job, to improve efficiency and the experience, and that someone would pay for it.

Driven by this idea, dozens of companies sprang up, and the graphics card market entered an era of warring factions.

Along the way, three young men decided over a meal in an out-of-the-way diner in San Jose to do something big, and founded a small company nobody noticed at the time. It later became famous, and its market capitalization now exceeds a trillion dollars: Nvidia.

Nvidia did not take the lead as soon as it appeared; it struggled to survive, attaching itself to the giants of the day and supplying them with better graphics processing chips.

It had only one goal: to develop products that gamers would pay good money for to improve their experience. So there were no convoluted corporate intrigues; it was purely about continuously hiring top talent, developing stronger graphics cards, and delivering a better gaming experience.

Looking back now, NVIDIA's success echoes Ren Zhengfei's line, "do what the market needs": constantly develop products that gamers are willing to spend money on.

As the running joke goes, men's spending power ranks below that of dogs, but when it comes to graphics cards and games, men have propped up an enormous market.

If NVIDIA made one decisive, correct strategic move, it was that Jensen Huang, after a period of floundering, tightly bound NVIDIA to Microsoft in the late 1990s.

I don't know if you remember, but when we were kids, games were played on a bunch of dedicated machines: cheap plug-and-play consoles, the red-and-white Famicom clones, later the handhelds, and Atari arcade machines and so on. But since around college, gaming has basically meant computers, and mainly computers running Windows.

Gaming on Apple computers is frankly not worth mentioning. As for Linux, it is mainly for development work and running servers; ordinary people basically never see Linux, yet it is almost impossible to get through a day without it, because whenever you open an app on your phone, the server that app fetches data from is almost certainly running Linux.

Without deep cooperation with Microsoft, NVIDIA could not have grown to its current trillion-dollar scale, because Microsoft's user base is enormous. That user base provided enough purchasing power for NVIDIA to sell enough product and pour the proceeds into the next round of R&D, consolidating its position.

Then again, without Nvidia, Microsoft's operating system would not have so wide an audience, because for office productivity Windows really can't compare with a MacBook. So the standard loadout for us veteran geeks is a Mac at work, code compiled in the cloud, and a Windows desktop with an NVIDIA graphics card at home for gaming; for many people, Microsoft's operating system exists to play games.

In other words, Intel, Microsoft, and NVIDIA made each other, a grand slam all around.

The general cycle goes like this: a new graphics card comes out, everyone buys one and plays on it for a couple of years; meanwhile game studios develop for the new hardware, getting more and more indulgent, until games appear on the market that the current generation of graphics cards genuinely cannot drive.

Gamers then start clamoring, constantly urging Old Huang (as fans call Jensen Huang) to hurry up with a higher-performance card. Higher performance generally costs more, but gamers are willing to pay, so Huang dares to borrow from investors to fund R&D; whatever gets developed, someone will buy, and the investment comes back.

It is precisely because gamers have always been willing to spend that NVIDIA has had the funds to develop ever-stronger graphics cards, forming a positive feedback loop that has run for more than twenty years, burned through hundreds of billions in funding, and eliminated countless rivals to reach today's scale, with no sign of stopping. Today the graphics card market is basically just NVIDIA and AMD, and measured by market share, NVIDIA practically stands alone.

Game graphics have also iterated over the decades. Early games looked like this:

[Figure: early game graphics]

And they became like this:

[Figure: modern game graphics]

So the question is: what does a gaming graphics card have to do with artificial intelligence?

First, a story my teacher told many years ago to explain the difference between graphics cards and CPUs.

In 1979, Iranian demonstrators stormed the U.S. Embassy and took 66 embassy employees hostage. A headcount showed a few people missing, but the Americans had fed all the personnel files into shredders, turning them into several sacks of strips as thin as matchsticks.

The Iranians didn't hesitate: they put thousands of schoolchildren to work matching the strips piece by piece, eventually restoring the original documents and opening a new chapter in brute-force codebreaking. The story was later made into the movie Argo (released in China as "Escape from Tehran").

With that story in mind, the difference between a CPU and a graphics card becomes clear. The CPU is like a college student: he can assemble a very complex piece of imported lab equipment and debug it properly, but does he still have an advantage if you ask him to grind through a sackful of addition, subtraction, multiplication, and division?

Obviously not. For that, he is no match for a crowd of elementary school students who have just mastered arithmetic, like the Iranians and their jigsaw puzzle: with brute force on their side, everyone piles in, each takes a few strips, works on their share, and the job gets done fast.

Since the rise of artificial intelligence, today's graphics cards can feel omnipotent, but the underlying principle is simply that they are stuffed with a crowd of "elementary school students" and can perform massively parallel computation.

Let me explain a little, and you will immediately understand what a graphics card is computing. When you play a game, the graphics card is essentially drawing for you nonstop. Take the familiar "frame rate": at 100 frames, the graphics card draws a hundred pictures for you every second, one after another, so the motion feels perfectly smooth.
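
To get a feel for the scale, here is a back-of-the-envelope calculation (my numbers, assuming an ordinary 1920x1080 screen; real GPUs do far more work per pixel than this, so treat it as a lower bound):

```python
# Back-of-the-envelope: how much work is 100 frames per second?
# Assumes a common 1920x1080 display; real GPUs shade each pixel
# many times per frame, so these are lower bounds.
width, height, fps = 1920, 1080, 100

frame_budget_ms = 1000 / fps        # time allowed per frame
pixels_per_frame = width * height   # pixels to color each frame
pixels_per_second = pixels_per_frame * fps

print(f"Per-frame budget: {frame_budget_ms:.1f} ms")   # 10.0 ms
print(f"Pixels per frame: {pixels_per_frame:,}")       # 2,073,600
print(f"Pixels per second: {pixels_per_second:,}")     # 207,360,000
```

Over two hundred million pixel computations per second, each one simple, is exactly the kind of job you hand to a crowd rather than to one genius.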

But the "drawing" we are talking about here is not really painting with a canvas, but a bunch of tedious but not complex graphic operations. Graph operations are basically some matrix operations. It is estimated that everyone is confused when they see the "matrix operation". I found a random picture for everyone, and the following looks like this:

[Figure: an example of a matrix operation]

This thing is "linear algebra", generally science and engineering freshman began to learn, in the eyes of the learned partners, this thing is a primary school operation, it looks complicated, but in fact it is not complicated, it is a bunch of addition and multiplication, very cumbersome. But this kind of operation is very common, such as the GPS that everyone usually uses, in fact, it is also such a matrix operation.

You must have heard that when the mainland built its first atomic bomb, a group of students from Harbin Institute of Technology helped crunch the numbers on abacuses, right? Some professors at my old university took part in those projects. Bluntly, it was solving calculus by hand: first convert the differential equations into matrices like the one above, which leaves only tedious additions and multiplications, and that is where the abacus shines.
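
As a toy illustration of that conversion (my own example, not the actual historical computation): discretize a simple differential equation on a grid and it becomes a small matrix equation, solvable with nothing but additions and multiplications:

```python
import numpy as np

# Toy boundary-value problem: u''(x) = -1 with u(0) = u(1) = 0,
# whose exact solution is u(x) = x(1 - x)/2. Finite differences turn
# the differential equation into a matrix equation A @ u = f.
n = 4                        # interior grid points (tiny, abacus-sized)
h = 1.0 / (n + 1)            # grid spacing

# Second-derivative stencil (u[i-1] - 2*u[i] + u[i+1]) / h**2 as a matrix
A = (np.diag([-2.0] * n) + np.diag([1.0] * (n - 1), 1)
     + np.diag([1.0] * (n - 1), -1)) / h**2
f = -np.ones(n)              # right-hand side of u'' = -1

u = np.linalg.solve(A, f)    # from here on, only adds and multiplies

x = np.linspace(h, 1 - h, n)
print(np.allclose(u, x * (1 - x) / 2))  # True: matches the exact solution
```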

In other words, the graphics card's main advantage is being stuffed with "elementary school students" who excel at multiplication and addition. Through decades of iteration, with players voting with their wallets, graphics cards kept evolving and their "parallel brute force" grew ever more terrifying. At first, though, nobody realized they were good for anything beyond games, until people started mining with them.

Mining, that is, mining Bitcoin, Ethereum, and other cryptocurrencies, essentially means grinding through a cumbersome cryptographic computation over and over, and the graphics card is naturally suited to this kind of "great force works miracles" workload. NVIDIA caught this express train too, and its stock price skyrocketed during the pandemic.
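
For flavor, here is a heavily simplified stand-in for proof-of-work mining (my sketch, not the real Bitcoin protocol): keep hashing the same data with different nonces until the digest starts with enough zeros. Every attempt is independent of every other, which is why miners throw thousands of GPU cores at it in parallel:

```python
import hashlib

def mine(block_data: bytes, difficulty: int) -> int:
    """Brute-force a nonce so the SHA-256 hash starts with `difficulty`
    zero hex digits. A simplified stand-in for Bitcoin's proof-of-work."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + str(nonce).encode()).hexdigest()
        if digest.startswith(target):
            return nonce  # found a valid proof
        nonce += 1        # each attempt is independent: ideal for parallelism

nonce = mine(b"toy block", difficulty=4)
print(nonce, hashlib.sha256(b"toy block" + str(nonce).encode()).hexdigest())
```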

But even NVIDIA's own management did not expect that their graphics cards could do something else entirely.

Around 2011, NVIDIA's management, by then long established as the leader of the gaming industry, suddenly heard something they had not anticipated.

It turned out researchers at MIT were running some of the earliest artificial-intelligence training experiments, trying to get a computer to quickly recognize a picture of a cat.

How do you recognize it? The machine can only process the image pixel by pixel, which involves a huge volume of repetitive, tedious operations. At first the researchers used CPUs, and the results were terrible; as just described, a CPU is like a brilliant doctoral student who can do very complex things but only one at a time, so faced with an ocean of addition, subtraction, multiplication, and division, he is strong in will but weak in power, and the computation crawls.

Later they switched to GPUs, that is, graphics cards. The graphics card is like the Iranians commanding a crowd of schoolchildren at the jigsaw puzzle: this is the "parallel computing" you often hear about, naturally suited to "simple but massive" operations.
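
You can feel the same principle on an ordinary PC (NumPy's batched arithmetic stands in for the GPU here; it is not one, but the point carries): processing a frame pixel by pixel versus handing the whole frame over as one bulk operation:

```python
import time
import numpy as np

# Serial vs. batched processing of a fake 720p grayscale frame.
image = np.random.rand(1280, 720)

# "Doctoral student": one pixel at a time, in a Python loop
start = time.perf_counter()
bright_loop = np.empty_like(image)
for i in range(image.shape[0]):
    for j in range(image.shape[1]):
        bright_loop[i, j] = min(image[i, j] * 1.2, 1.0)
loop_time = time.perf_counter() - start

# "Elementary school students": the whole frame in one batched operation
start = time.perf_counter()
bright_vec = np.minimum(image * 1.2, 1.0)
vec_time = time.perf_counter() - start

print(f"loop: {loop_time:.3f}s, batched: {vec_time:.4f}s")
print(np.allclose(bright_loop, bright_vec))  # True: identical results
```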

In other words, graphics rendering, artificial intelligence, and codebreaking are all, at bottom, "brute-force computation". Now you can see why graphics cards also turn up in aerospace, weather forecasting, energy exploration, and similar fields: they all involve super-large-scale matrix operations, and wherever ultra-large-scale brute-force computation is involved, that is the graphics card's home turf.

This is also why many people say the future battlefield belongs to graphics cards, especially air combat. How fast can a human react? Smart weapons carrying graphics cards can gather information and compute evasion or detonation maneuvers in an instant; it is unnerving just to think about.

So NVIDIA now has another important business, the "data center": build large server farms, sell computing power to others, and collect the fees.

It is precisely the large-scale application of NVIDIA graphics cards in games, data centers, and artificial intelligence, and the promising market outlook, that has made NVIDIA a member of the trillion-dollar club.

Yes, although games still carry a somewhat dubious image in the public mind, the use and evolution of artificial intelligence is closely tied to games; even the first product to overturn public assumptions about artificial intelligence was born in the form of a Go match. Today, graphics cards are an essential foundation for training artificial intelligence.

At this year's WAIC, we also saw research institutions release a report on games and artificial intelligence, analyzing their symbiotic development at three levels: theoretical research, hardware iteration (such as graphics cards), and application scenarios. Xiamen University has also joined with Communication University of China, the Central Academy of Fine Arts, Beijing Institute of Technology, Shanghai Jiao Tong University, and other universities to prepare the "Joint Research Center for Game Artificial Intelligence Universities". Not bad at all.

Some friends may ask: what use is this to us? Can China produce its own quickly?

That's thinking too small. China's biggest advantage is its huge consumer market, and market demand is always pregnant with infinite possibilities. If we cultivate it well, we can surely occupy an important position in the global wave of artificial intelligence. This may also be why Musk, at the WAIC opening ceremony, warmly declared that he "believes China will have very powerful artificial intelligence capabilities in the future".

In closing:

From the rise of artificial intelligence, we can also see a couple of things:

The consumer market is the petri dish of hard technology. The essence of the consumer market is people's yearning for a better life: everyone works to earn money and saves up to buy the things they like and to use smarter tools, mainly because these things bring them joy or solve their problems. That yearning for a better life is the most primal driving force of scientific and technological progress.

We can also see that greatness is often not planned but grown. The people who make Photoshop don't know what their product will be used to create; ASML, which builds the lithography machines, cannot reproduce TSMC's process. And Huang never expected that his gaming cards would one day become the cornerstone of artificial intelligence.

Of course, today we talked about games and graphics cards, games and artificial intelligence, but reality extends far beyond games. Many fields are like this: you cannot predict what something will become, and no one can say which fruit, falling from which branch a few years from now, will smash open a new industry.

That's the full article. Since you've read this far, if you thought it was good, feel free to give it a like.

Reposted from the WeChat public account Jiubian.
