
Artificial intelligence is a big energy consumer, but what about "human intelligence"?

Author: Popular Science China

Not long ago, OpenAI's CEO Sam Altman declared that energy will be the limiting factor in AI's future development, and that nuclear fusion urgently needs to be developed. In fact, back in 2021 he invested $375 million in a nuclear fusion company, which recently said its first power plant is expected to come online in 2028.

This is not the first time that energy consumption has drawn attention. During the boom in blockchain and cryptocurrency, many people worried about blockchain's energy use. Although the hype has cooled somewhat, the energy consumption remains considerable: Bitcoin mining consumes roughly as much electricity per year as the entire Netherlands, and a 2024 report from the US Department of Energy estimates that about 0.6%~2.4% of annual US electricity consumption goes to cryptocurrencies.

At the moment, it seems unlikely that cryptocurrencies will grow indefinitely and swallow the entire power grid. ChatGPT currently serves about 200 million answers per day and consumes about 500,000 kWh of electricity, a small share of the grid, so Altman's prediction smells of hype. But if it comes true, and AI's scale and energy consumption keep growing rapidly over the long term, an energy crisis is indeed possible.
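The per-day figures above imply a modest per-answer cost. A quick back-of-the-envelope check (the daily numbers are the article's estimates, not measurements):

```python
# Rough per-answer energy from the article's figures.
daily_energy_kwh = 500_000        # ~500,000 kWh of electricity per day
daily_answers = 200_000_000       # ~200 million answers per day

# Convert kWh to Wh, then divide by the number of answers.
wh_per_answer = daily_energy_kwh * 1000 / daily_answers
print(f"{wh_per_answer:.2f} Wh per answer")   # ≈ 2.50 Wh
```

About 2.5 watt-hours per answer: a lot compared with a single web search, but a tiny fraction of the grid at today's scale.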


How "efficient" is the human brain?

By contrast, traditional natural intelligence (the human brain) seems far more efficient. In fact, though, no one can say exactly how efficient the human brain is, because it is hard to find a benchmark for comparison.

For example, a common measure of a computer's speed is its clock rate: the number of pulses the processor's clock produces per second, which largely determines how many basic operations the processor can perform per second.

If we judged the human brain by this standard, its "clock rate" would be below 1,000 Hz: neurons can fire at most about 1,000 spikes per second, and synaptic transmission takes at least about a millisecond. By computer standards, 1,000 Hz is pitiful. The first commercial microprocessor of the early 1970s was already some 700 times faster, and today's mainstream processors are millions of times faster.
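The ratios quoted above work out as follows (a crude illustration only; brain spikes and CPU cycles are not really comparable operations, as the next paragraph explains):

```python
# "Clock rate" comparison from the article's figures.
brain_hz = 1_000            # ~1 kHz maximum neuron firing rate
intel_4004_hz = 740_000     # first commercial microprocessor (1971), 740 kHz
modern_cpu_hz = 4e9         # a typical ~4 GHz desktop core (assumed value)

ratio_4004 = intel_4004_hz / brain_hz
ratio_modern = modern_cpu_hz / brain_hz
print(ratio_4004)    # 740.0  -> "some 700 times faster"
print(ratio_modern)  # 4000000.0 -> "millions of times faster"
```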

Does this mean the human brain runs at only a millionth the speed of a modern computer? Clearly not, because the two differ fundamentally in architecture. Each neuron in the human brain typically connects to thousands of other neurons, so a single "basic operation" can involve over a thousand inputs, which is nothing like a transistor with its three terminals. In fact, even within computers, clock rates across different processor architectures cannot be compared casually.

At rest, the human brain accounts for about 2% of body weight but consumes about 19% of the body's energy. That sounds extreme, but it is actually not so special.

The liver and spleen together weigh only a little more than the brain but consume 27% of the body's energy. The two kidneys together weigh less than one-fifth of the brain yet account for 10% of the body's energy consumption, half the brain's share. The heart also weighs less than one-fifth of the brain and accounts for 7%, a third of the brain's share. It is to be expected that a few active organs consume most of the energy, and among these organs the brain is only at a typical level.
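The point can be made sharper by dividing each organ's share of energy by its share of body mass (a sketch using only the article's figures; the kidney and heart mass fractions are derived from the statement that each weighs about one-fifth of the brain, with the brain at 2% of body mass):

```python
# Relative metabolic intensity = (share of energy) / (share of body mass).
# A value of 1.0 would be the whole-body average.
organs = {
    "brain":   {"mass_frac": 0.02,     "energy_frac": 0.19},
    "kidneys": {"mass_frac": 0.02 / 5, "energy_frac": 0.10},  # ~1/5 of brain mass
    "heart":   {"mass_frac": 0.02 / 5, "energy_frac": 0.07},  # ~1/5 of brain mass
}
intensity = {name: o["energy_frac"] / o["mass_frac"] for name, o in organs.items()}
for name, x in intensity.items():
    print(name, round(x, 1))
# brain ~9.5x, kidneys ~25x, heart ~17.5x the body average: per gram,
# the brain is not even the most energy-hungry organ.
```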


Now that neural networks have become the mainstream of artificial intelligence, they also offer a new way to compare: not a pure hardware-level comparison, but a comparison of the human brain with specific neural networks.

Of course, no existing neural network is functionally comparable to the human brain. But in terms of scale alone, if humans one day unlock the brain's secrets, a model on the scale of the human brain might be achieved with about 1,000 agents running on 1,000 GPUs. Each GPU needs about 1 kilowatt, so 1,000 of them require 1 megawatt, roughly 50,000 times the power of the brain being matched. (Incidentally, the world's current floating-point computing capacity could support about 5 million such brain-scale models.) Again, this comparison depends on future theoretical advances; for now, neural networks simply cannot be compared this way.
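The power arithmetic behind that "50,000 times" is simply the following (the 20 W figure for the brain is a commonly cited estimate, not stated in the article):

```python
# The article's hypothetical brain-scale model: ~1,000 GPUs at ~1 kW each.
gpus = 1_000
kw_per_gpu = 1.0
model_power_w = gpus * kw_per_gpu * 1000   # 1 MW in watts

brain_power_w = 20                         # human brain, commonly cited ~20 W
print(model_power_w / brain_power_w)       # 50000.0
```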

In any case, the human brain does seem more efficient than a computer. This is, of course, the product of billions of years of natural selection: the primitive nervous system must have been constrained by energy, and as energy efficiency improved, the emergence of brains became possible. That said, the human brain has not necessarily reached the theoretical limit.

Is the human brain necessarily the "optimal solution"?

A common misconception about evolution is that it necessarily produces optimal solutions. This misconception breaks down in several ways:

First, there is no such thing as an optimal solution in a general sense. Judgments of better or worse only make sense in a given environment, and environments are constantly changing.

Second, even in a stable environment, the global optimum may never be reached. Evolution is gradual and, in the vast majority of cases, short-sighted, and it often gets trapped in local optima, like a climber who insists that every step must go uphill and may end up stuck on a small hill, never reaching the true summit.

Third, the speed of evolution scales with selection pressure. When selection pressure is weak, even a local optimum takes a very long time to reach, and there is no reason to think the human brain has already arrived.

Fourth, evolution involves a great deal of contingency. The importance of this contingency is still debated, but it is probably enough to prevent any truly optimal solution from being realized.

Some AI theorists care a great deal about whether the human brain has reached the energy-efficiency limit of neural networks, because the answer shapes the long-term trajectory of artificial general intelligence (AGI). If the human brain is still far from the theoretical limit, then AGI could one day surpass it, triggering accelerating technological progress and perhaps even a technological singularity.

But if the human brain is at the limit, then AGI will be severely constrained by humanity's current energy output, its takeoff will be slow, and a singularity becomes much less likely. It would also mean that emulating the human brain is the only practical route to AGI.

So far, however, neither side of the debate has substantial evidence. It is even possible that future advances in AI will show the neural network route to be a dead end, and that real AGI will come from some other direction, in which case these discussions will be moot.


In any case, pushing computational efficiency forward is genuine progress for human civilization, because setting aside the limitations of neural networks, computation itself is still far from its limit. In this it differs from almost every force of nature humans have harnessed before.

To lift a given load you must exert a given force; to move a car at a given speed you must supply a given kinetic energy. Manipulating physical matter carries hard minimum energy requirements, with little room for savings: humanity's most efficient engines already reach efficiencies of tens of percent, leaving less than a factor of ten for improvement. That headroom is not unimportant, and it may be enough to help reverse the climate crisis we are living through, but it is nowhere near enough to support an endlessly growing civilization.

But computing manipulates not physical objects but information. There is indeed a theoretical lower bound on the energy a computation must consume, but it is tiny: about 2.9 × 10^-21 joules per bit operation at room temperature. The room for progress here is therefore unprecedented.
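That figure is the Landauer limit, the minimum energy to erase one bit of information, E = k_B · T · ln 2. It can be checked directly:

```python
import math

# Landauer limit: minimum energy to erase one bit, E = k_B * T * ln(2).
k_B = 1.380649e-23      # Boltzmann constant, J/K (exact, by SI definition)
T_room = 300            # room temperature, K

landauer_j = k_B * T_room * math.log(2)
print(f"{landauer_j:.2e} J")   # ≈ 2.87e-21 J, the ~2.9e-21 J figure above
```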

The UNIVAC I of 1951 performed about 0.015 operations per joule; the most energy-efficient supercomputer of 2022, "Henri," performs about 65 billion operations per joule. That is an improvement of more than twelve orders of magnitude in 70 years, yet still about ten orders of magnitude short of the room-temperature limit.
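The two gaps quoted above can be recomputed from the figures in the text (the limit here uses the room-temperature Landauer value and treats one operation as one bit erasure, a simplification):

```python
import math

univac_ops_per_j = 0.015            # UNIVAC I (1951)
henri_ops_per_j = 65e9              # "Henri" (2022), ~65 GFLOPS per watt
landauer_ops_per_j = 1 / 2.87e-21   # room-temperature limit, ops per joule

gap_70_years = math.log10(henri_ops_per_j / univac_ops_per_j)
gap_to_limit = math.log10(landauer_ops_per_j / henri_ops_per_j)
print(round(gap_70_years, 1))   # ~12.6 orders of magnitude gained
print(round(gap_to_limit, 1))   # ~9.7 orders of magnitude still to go
```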

And the room-temperature limit itself is not fixed: the lower bound on energy consumption is proportional to ambient temperature. A highly developed technological civilization would likely do most of its computing in space, where the 2.7 K cosmic microwave background sets the baseline for the lower bound.
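Because the limit scales linearly with temperature, moving from a 300 K room to the 2.7 K microwave background buys a fixed factor:

```python
# Landauer limit scales as E_min = k_B * T * ln(2), i.e. linearly in T,
# so the efficiency gain from cooling is just the temperature ratio.
T_room = 300.0   # K
T_cmb = 2.7      # K, cosmic microwave background

gain = T_room / T_cmb
print(round(gain))   # ~111x more operations per joule
```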

If a civilization lasts long enough, it might even choose to wait for the universe to expand and cool before computing: in 10^12 years, the computational efficiency limit will be some 30 orders of magnitude higher than today's.


In other words, humanity still has a long way to go on the energy cost of computation. Perhaps, over the past few decades, we have been spoiled by Moore's Law, obsessed with ever cheaper and faster chips while neglecting efficiency at many levels, to the point where even a thermostat may run a full Android system.

But Moore's Law is not a law of nature; it is more like an industry KPI, and the actual pace of chip progress has been slipping behind it since around 2010. Neural networks are not yet the main consumer of human energy, but one day computing will get there, and we should be ready for every possibility before that day comes.

Planning and production

Author丨Fan Gang, popular science writer

Review丨Yu Yang, head of Tencent Xuanwu Lab

Planning丨Xu Lai

Responsible editor丨Wang Mengru

Proofreader丨Xu Lai, Lin Lin
