
How did Sony's racing AI, featured on the cover of Nature, beat the top human drivers?

"We pursue artificial intelligence in order to finally better understand human beings."

As players of one of the few simulation racing games of this generation, Gran Turismo Sport's fans may never have imagined that the game they play would one day appear on the cover of Nature, one of the world's top scientific journals.

Yesterday, Sony unveiled an artificial intelligence developed by its AI division, which accordingly became the "cover star" of this week's Nature. The AI's achievement: beating the world's top players of Gran Turismo Sport.


Cover of Nature, issue 7896

Or perhaps "conquest" is the more appropriate word. In a showdown Sony staged between four AI drivers and four professional racing players, the champion AI's best lap time was more than two seconds faster than the best human's. On a 3.5-mile track, that margin is comparable to AlphaGo conquering Go.

This was the goal achieved over five years of R&D by an AI jointly developed by Sony's AI division, SIE, and Polyphony Digital, the studio behind Gran Turismo.

Sony named the AI GT Sophy. "Sophy" is a common personal name, derived from the Greek σοφία (sophia), meaning "knowledge and wisdom."

What makes Sophy different from ordinary game AI?

It is not unusual for AI to defeat humans in games. OpenAI Five beat the TI8 champions OG after "meditating on" thousands of Dota 2 matches, Google's AlphaStar crushed top StarCraft 2 professionals, and every ordinary player has tasted the pain of losing to bots on "Insane" difficulty.


In 2019, OpenAI Five defeated OG, albeit with the hero pool restricted

But these "defeats" are not all the same thing. To understand what Sophy, the AI driver in GT Sport, really means, we first need to distinguish it from the familiar "AI you just can't outrun."

In past racing games, although the AI takes the form of a non-player-controlled "agent," the traditional AI driver is usually just a set of preset behavior scripts, with no intelligence in any real sense.

Traditional AI difficulty is generally achieved through "unfair" means. In racing games, for example, the system weakens or even removes the physics simulation for the AI's car, so that the environment the AI has to cope with is far simpler than the one facing the player.

And to create a harder-to-beat AI opponent, developers resort to outright cheating, much like RTS AIs that secretly grant themselves extra resources: the AI car quietly accelerates in the moments when the player isn't watching.
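This kind of "rubber-banding" cheat can be sketched in a few lines. The function below is purely illustrative: the names, thresholds, and boost values are invented for this example, not taken from any actual game.

```python
def rubber_band_speed(base_speed, gap_to_player_m, player_watching):
    """Illustrative 'rubber-banding' cheat: a scripted AI car gets a
    hidden speed bonus when it falls behind, applied only while the
    player isn't looking at it. All numbers are made up."""
    boost = 1.0
    if gap_to_player_m > 50 and not player_watching:
        # Quietly scale the boost with the deficit, capped at +30%.
        boost += min(0.30, gap_to_player_m / 1000)
    return base_speed * boost

# An AI car 120 m behind, off-screen: it secretly gains ~12% speed.
print(rubber_band_speed(200.0, 120, player_watching=False))
```

The point is that the AI's "skill" lives in a handful of hand-tuned if-statements, not in any model of driving, which is why players can learn nothing from it.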

So for players of any real skill, the traditional AI in racing games offers almost nothing worth studying in its behavioral logic or strategic choices, let alone for professional racing players.

Like AlphaGo, Sophy grows gradually stronger through deep-learning algorithms: learning to drive, adapting to the rules, and defeating opponents.

What this AI gives players is the experience of being beaten on a genuinely level playing field. After losing to Sophy, one human driver offered this assessment: "(Sophy) is fast, of course, but I think this AI goes somewhat beyond what a machine should be... It seems to have a human touch, and it does things human players have never seen before."


This is reminiscent of AlphaGo, which rewrote humanity's understanding of Go.

Compared with Go, a highly abstract, perfect-information game, video games with more gameplay dimensions and far higher computational complexity have struggled to guarantee "fair competition" once deep-learning AI enters the arena.

For example, AlphaStar, which competed in StarCraft 2 in 2019, produced essentially no new tactical ideas; it won by endlessly absorbing human players' tactics and executing them with sophisticated multi-pronged micro. Even with its APM artificially capped, an AI that wastes no actions is more efficient than any human can be.

This is also why, in AlphaStar's matches against human professionals, after the AI beat the Polish Protoss pro MaNa with a superhuman display of multi-pronged Blink Stalker micro, an unconvinced MaNa said in the post-match interview that "this situation could never happen in a human match at the same level."


AlphaStar's Stalkers defeating MaNa's Immortals, reversing the units' usual counter relationship

Gran Turismo, likewise, is a simulation racing game whose complexity is no less than StarCraft 2's.

In the eyes of professional racers, the most basic motorsport elements (line, speed, direction) break down into countless small reactions and sensations: the weight of the vehicle, the slip of the tires, the feedback through the wheel... Every corner has its ideal throttle opening, and only the top drivers can touch that feeling of driving "at the limit."

In a sense, these "limits of control" can all be explained by physics, and a machine's operational ceiling can obviously exceed a human's. So Sophy's reaction speed was deliberately capped at a human level: Sony tested it with reaction times of 100, 200, and 250 milliseconds, while trained human athletes can respond to a specific stimulus in roughly 150 milliseconds.
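A fixed reaction delay like this can be imposed by buffering the agent's actions, so that each decision only reaches the simulation some milliseconds after the observation that triggered it. The sketch below is a minimal illustration under assumed numbers (a toy steering policy and a two-step delay), not Sony's actual mechanism.

```python
from collections import deque

class DelayedPolicy:
    """Wraps a policy so its actions reach the environment only after a
    fixed reaction delay, simulating human-like response times.

    `delay_steps` is the reaction time divided by the control period,
    e.g. 100 ms at a 20 Hz (50 ms) control loop -> 2 steps.
    """
    def __init__(self, policy, delay_steps, neutral_action=0.0):
        self.policy = policy
        # Pre-fill the buffer so the first `delay_steps` outputs are neutral.
        self.buffer = deque([neutral_action] * delay_steps)

    def act(self, observation):
        # The policy sees the current observation immediately...
        self.buffer.append(self.policy(observation))
        # ...but the environment receives the action chosen delay_steps ago.
        return self.buffer.popleft()

# Toy policy: steer proportionally against the track-centre offset.
policy = lambda offset: -0.5 * offset
delayed = DelayedPolicy(policy, delay_steps=2)

outputs = [delayed.act(obs) for obs in [1.0, 2.0, 3.0, 4.0]]
print(outputs)  # [0.0, 0.0, -0.5, -1.0]
```

The first two outputs are neutral, and the reaction to each observation only arrives two control ticks later, which is exactly the handicap the capped reaction times describe.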

Undoubtedly, this was a fairer contest than AlphaStar's.

What Sophy learned

Like its many AI predecessors, Sophy trains its driving skills with deep-learning techniques built on neural networks.

In the training environment, Sophy is rewarded or punished for different behaviors: moving at high speed is good, overtaking the car ahead is better; correspondingly, running off course or hitting the wall in a corner is "bad behavior," for which the AI receives negative feedback.
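The reward-and-penalty scheme described above can be sketched as a simple reward function. The weights and field names below are illustrative assumptions in the spirit of the article, not Sony's actual values.

```python
def lap_reward(state):
    """Hypothetical reward shaping: progress and overtakes are rewarded,
    off-course excursions and wall contact are punished. All weights
    are invented for illustration."""
    reward = 0.0
    reward += 1.0 * state["progress_m"]      # metres advanced along the track
    reward += 5.0 * state["cars_overtaken"]  # passing the car ahead is better
    if state["off_course"]:
        reward -= 10.0                       # leaving track limits
    if state["wall_contact"]:
        reward -= 20.0                       # hitting the wall in a corner
    return reward

# A step with good progress and one overtake, but a wall scrape:
r = lap_reward({"progress_m": 3.2, "cars_overtaken": 1,
                "off_course": False, "wall_contact": True})
print(r)  # progress + overtake bonus - wall penalty, net negative
```

A reinforcement-learning agent trained against such a signal gradually adjusts its policy toward behaviors that accumulate positive reward, which is the sense in which Sophy "learns to drive."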

In a cluster of thousands of networked PS4 consoles, Sophy went through countless simulated driving sessions, updating its understanding of Gran Turismo Sport through the feedback described above. It took Sophy only hours to go from a "newborn" that couldn't drive at all to lapping the track; within a day or two, starting from the basic "out-in-out" racing line, Sophy had learned almost all of the common motorsport techniques and surpassed 95 percent of human players.


Sony's AI division built a "training ground" for Sophy

However, racing is not a one-man game. Even though Sophy could already beat the top human players' lap times by last July, in a real multiplayer race it still needed to learn to contend with opponents and to understand other drivers' behavioral logic.

So researchers in Sony's AI division put Sophy through further training: how to dive into a gap to overtake, how to defend its position against other cars. In the end, Sophy was even "educated" to understand and observe motorsport etiquette, such as yielding when, as a slower car, it was about to be lapped, while avoiding impolite, malicious collisions.

A traditional AI car in a racing game may try to avoid colliding with the player, but the result is only unnatural swerving. The "game understanding" Sophy displays is something script-based racing AI simply cannot achieve.

By October, Sophy was able to beat the top human players in a formal head-to-head race.


Sony invited four human drivers, including Takuma Miyazono, a triple GT Championship winner

Take the first race, on the Dragon Trail circuit, as an example. As the final test of Gran Turismo Sport's driving school (and the site of the "Hamilton Challenge" DLC), it is a track every GTS player should be quite familiar with. After tens of thousands of hours of training, the fastest Sophy already held an absolute lap-time edge here.


And on the second race day, when four Sophy instances faced four human drivers, the AI's advantage widened further, all but crushing the top human players.


If Sophy were merely better than humans at choosing and judging racing lines, accumulating a lap-time advantage through more stable cornering, that might not be such a big deal.

But the researchers note that Sophy hardly relied on its absolute lap-speed advantage (the "hard power" of a non-human machine) to shake off opponents; it also surpassed human players in game understanding, such as predicting an opponent's line.

In a case presented in the Nature paper, two human drivers tried to interfere with the two Sophys' preferred lines through legal blocking, but Sophy managed to find two different trajectories to complete the overtake, leaving the humans' blocking strategy empty-handed. Sophy could even devise effective ways to disrupt an overtaking attempt from the car behind.


Sophy has also been shown to execute a classic high-level maneuver on the simulated Sarthe circuit (the "Le Mans circuit"): tucking into the slipstream of the car ahead to cut drag, then darting out of it at speed to complete the pass.

What surprised the researchers even more is that Sophy also worked out some unconventional behavioral logic, rather like AlphaGo inventing new joseki. Racers are usually taught to "slow in, fast out," braking before the corner so that the load sits on the two front wheels. But Sophy doesn't necessarily do that: it selectively brakes while cornering, putting one of the rear wheels under load as well.

In the real world, only top F1 drivers such as Hamilton and Verstappen experiment with this "three-wheel" corner-entry technique, yet Sophy learned it entirely on its own in the game world.

Takuma Miyazono, a three-time GT Championship world champion, said after losing to the AI: "Sophy took some racing lines that human drivers would never have thought of... I think a lot of textbooks on driving technique will be rewritten."

"To better understand humanity"

Unlike the advanced AIs that have appeared in video games before (such as AlphaStar), Sophy's research clearly has broader and more direct practical significance.

J. Christian Gerdes, a Stanford professor and co-author of the Nature paper, pointed out that Sophy's success suggests neural networks may play a larger role in autonomous-driving software than they do today, and that this Gran Turismo-bred AI could offer more help to the field in the future.

Hiroaki Kitano, CEO of Sony AI, also said in a statement that this research will open up new opportunities for the development of high-speed robotics and autonomous-driving technology.


The Sophy project's introduction on its official website

But turning our gaze back to Gran Turismo itself as a simulation racing game, Sophy's arrival means a great deal to ordinary players and professional drivers alike.

As noted earlier, in most simulation racing games on the market today, "traditional AI" no longer brings players any fun at all. Human-versus-machine contests that rely on unfair conditions run contrary to the driving experience racing developers hope to deliver, and human players can learn nothing from them.

In the documentary released by Sony's AI division, Kazunori Yamauchi, the "father of Gran Turismo," said that developing an unbeatable AI may be a great technical achievement, but it does not straightforwardly translate into fun for the average player.

Accordingly, Yamauchi promised that at some point in the future, Sony will bring Sophy into Gran Turismo 7, which launches in March. Once Sophy can better read conditions on the track and judge other drivers' skill levels, an AI this intelligent and personable may give players something much closer to the real joy of racing against humans.

As simulation racing shrinks into an ever smaller niche, many developers still struggle to offer newcomers a friendly entry experience. Perhaps an AI teacher like this has a chance to bring more of the fun of simulated driving into the virtual world, to, as the Gran Turismo 4 trailer put it, "experience car life."


This is probably the most important thing that a game-based AI can bring to the player — as Kazunori Yamauchi commented on the Sophy project, "We're not making AI to beat humans — we're pursuing AI to finally get to know humans better."

