
Cars are all intelligent now, but the level of AI varies


At a time when every car brand is competing fiercely over smart cockpits, do you remember where the concept of the smart cockpit began?

In fact, the arrival of China's intelligent-car era spans a period rather than a single moment, but a clear starting point is 2016, when the Roewe RX5, equipped with the Zebra system and marketed under the explicit slogan of the "Internet car", went on sale. Five years have passed, and more and more cars now ship with intelligent in-car systems. Objectively speaking, though, the vast majority of these products are "capable but not smart": they offer plenty of functions, yet fall short of real intelligence. Meanwhile, consumer expectations keep rising, and the question has shifted from "does it have the feature" to "is the feature any good".


So, as intelligent in-car systems develop beyond 2022, will the direction become clearer?

The evolution of the in-car system: from imitating functions to stacking hardware, consumer demand has upgraded from 1.0 to 3.0

Looking back at the evolution of in-car systems, "being able to understand the driver" has been the main thread, and the development logic behind it is to overturn the traditional process of human-machine operation.

In the era of traditional fuel vehicles, physical controls such as knobs and buttons were the main way to interact with the car. While driving, a driver who wanted to perform a complex operation often ended up fumbling. That is why the real driving force behind smart cars is speech recognition.

"Computer hearing" began to enter the field of view of consumers in the iPhone 4s era, but it was first focused on English recognition. Chinese speech recognition did not begin to enter the mature stage until 2016, and the accuracy rate of about 95% of machine speech recognition was close to the human level for the first time, and intelligent speech technology also had the basic conditions for entering the car cockpit.

The 1.0 generation of Internet cars dates from that moment. In 2016, SAIC Motor and Alibaba jointly launched the world's first mass-produced Internet car equipped with the Zebra Zhixing system, the Roewe RX5. Sitting in the Roewe's cabin that year, you could control the car by voice, issuing commands such as "Hello, Zebra, open the sunroof". Since then, in-car voice control has spread amid controversy, and even the Japanese and European car companies regarded as conservatives have begun introducing it on their China-market models.


Since then, intelligent and connected technologies have kept upgrading and iterating, driving the rapid expansion of intelligent applications in the car cockpit, and the on-board voice system has gradually entered the 2.0 era. More Internet companies became third-party technology suppliers: first independents such as iFLYTEK and Cerence, later BAT entering the field in force. More mobile-phone applications began to appear on the car system, such as maps, real-time traffic, music players, even crosstalk and reading apps, making in-car entertainment ever richer and voice control smoother and broader in scope.

From 2019 to the present, the concept of the automotive smart cockpit has gradually taken shape, and the Internet-minded car makers represented by the new electric-vehicle forces have even proposed the idea of an "evolving intelligent new species". The central control screen and instrument cluster began to interact far more, product forms expanded to multiple screens and head-up displays (HUD), system UIs and screens became more refined, and voice control evolved from fixed commands to "what you see is what you can say" full-scene voice, making the cockpit experience more immersive.


Realistically speaking, however, more screens did not double the actual experience, and the early automotive chips struggled to drive more screens and more complex animations. Most of the gains in touch and voice control were differences of quantity rather than a genuine system-level difference in experience, so this stage can only be called the 2.0+ era.

Reviewing the three phases of smart cockpit development from 2016 to 2021:

The 1.0 era went from zero to one, moving the in-car system from traditional to intelligent, much as mobile phones went from plain calling to Symbian systems.

The 2.0 era went from having features to having good ones, adding OTA updates, maps, music, and rich entertainment functions, much like the software explosion that followed the arrival of iOS and Android in the phone world.

The 2.0+ era goes from good to refined: at this stage most cockpit systems simply pile more on top of 2.0, such as larger screens, multiple screens, sub-screens, and video software. Every car company is still fighting on hardware.


So what should the smart cockpit 3.0 era look like? The answer still has to start from users' needs.

The one that breaks the involution must understand you better

To define the requirements of the smart cockpit 3.0 era, we must first ask who gets to define them. The answer is easy to see: not car companies, not suppliers, not Internet companies, but users, the drivers themselves.

Just like the trajectory of marketing, the end point of functional experience is also the user. At first, users consumed whatever enterprises produced; later, enterprises manufactured whatever users lacked; now users define their own needs and enterprises go to meet them. That is how corporate marketing moved from its 1.0 era to its 3.0 era.


Therefore, 1.0 solved whether the feature exists, 2.0 solved whether there is enough of it, and the smart cockpit 3.0 era is naturally about whether it is "easy to use". This "easy to use" must be defined by the user; and if thousands of different users all find it "easy to use", the system must be able to meet each person's different needs, which means it must be able to perceive, think, and evolve.

This seemingly boastful capability has, in fact, already crept into everyday life.

If you've ever used an iPhone, you'll have noticed that phones keep getting smarter. When your alarm goes off, the weather forecast is already there; the home screen surfaces the apps you habitually open; and when you pick up the phone and tap the map, the first suggestion is an estimate of when you will arrive at the office.

All of this is the phone predicting from the time and place you typically use a given app, so the more you use it, the better it fits you. The system's self-learning makes the phone feel smoother and smoother to use.
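As a rough sketch of the idea behind this kind of usage-based prediction (a toy illustration in Python, not Apple's actual implementation), imagine counting how often each app is opened in a given context and suggesting the most frequent one:

from collections import Counter, defaultdict

class UsagePredictor:
    """Toy context-based app suggester: counts launches per (hour, place)."""

    def __init__(self):
        self.counts = defaultdict(Counter)

    def record(self, hour, place, app):
        # Remember one launch of `app` in this context.
        self.counts[(hour, place)][app] += 1

    def suggest(self, hour, place):
        # Return the most frequently used app for this context, if any.
        seen = self.counts.get((hour, place))
        return seen.most_common(1)[0][0] if seen else None

predictor = UsagePredictor()
predictor.record(8, "home", "Maps")
predictor.record(8, "home", "Maps")
predictor.record(8, "home", "Music")
print(predictor.suggest(8, "home"))  # prints "Maps"

The more launches the predictor records, the better its suggestions match the user's habits, which is exactly the "the more you use it, the smoother it feels" effect described above.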

The evolution toward Smart Cockpit 3.0 follows the same line of thinking, provided the car has sufficiently smart AI, a fast enough network connection, and enough computing power. Unlike a phone, where interaction is mostly visual flipping through screens, the smart cockpit interacts through vision on one hand and hearing on the other. For voice control to be "intelligent" in the 3.0 era, the system must self-learn, keep upgrading, and even infer the driver's intent from vague, long-form instructions.


In the 2.0 era, although more and more cars offered voice control of vehicle functions, the car-machine dialogue was generally clumsy: it could only be triggered by very precise commands. To compensate, many car brands simply enlarged the command lexicon to make the system seem "smarter". But this brute-force method is like cramming: the user has to remember more and more phrases, the machine has to keep loading these command words, and over time efficiency actually drops.
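A crude sketch of that lexicon approach (an illustration of the general pattern, not any vendor's actual code) shows why it scales poorly: every new phrasing a user might say needs its own entry in the table.

# Toy 2.0-era voice control: exact phrase lookup against a fixed lexicon.
COMMAND_LEXICON = {
    "open the sunroof": "sunroof_open",
    "please open the sunroof": "sunroof_open",
    "open sunroof": "sunroof_open",
    # every additional phrasing needs yet another entry here
}

def parse_command(utterance):
    # Only an exact match works; anything phrased differently is rejected.
    return COMMAND_LEXICON.get(utterance.strip().lower())

print(parse_command("Open the sunroof"))         # matches: "sunroof_open"
print(parse_command("It's stuffy, open it up"))  # no match: None

As the lexicon grows, users must memorize ever more phrasings and the system must carry ever more entries, which is the inefficiency the paragraph above describes.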

The "cultivated AI semantics" of the Roselle intelligent cockpit subverts the above conventional practices and opens the first year of the emotional needs of the car machine 3.0 era. This set of semantic systems jointly created by Zebra Zhixing and Ali Damo Academy is based on the massive semantic library of SAIC and Ali and has a deep language learning model system, for simple problems, users can also understand in seconds, easily identify various personalized instructions. Most importantly, "cultivated AI semantics" is automatically labeled and trained through the cloud, and it can also learn independently and share results, so that it can be understood, learned faster, and will be more proficient, so that the dialogue can evolve.

For example, some owners will shorten a command such as "open the car window" to simply "open the window". In traditional voice control, such a low-probability phrasing is generally not hard-coded into the in-vehicle system. With "cultivated AI semantics", however, the cloud can label the function this phrasing is asking for, and the next time you issue such a command the car responds and completes it more and more smoothly.
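The underlying idea can be pictured as intent classification plus cloud labeling rather than exact phrase matching. The sketch below is a deliberately simplified stand-in (hypothetical intent names, naive keyword scoring), not the Zebra Zhixing implementation, which the article says rests on a deep language-learning model trained on SAIC and Alibaba's semantic corpus.

# Toy intent classifier: score an utterance against intents by shared keywords.
INTENT_KEYWORDS = {
    "window_open": {"open", "window"},
    "sunroof_open": {"open", "sunroof", "roof"},
    "ac_on": {"air", "conditioning", "cool"},
}

def classify_intent(utterance):
    words = set(utterance.lower().split())
    scores = {intent: len(words & kw) for intent, kw in INTENT_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None

def learn_from_cloud(intent, labeled_phrase):
    # Cloud labeling: fold words from a newly labeled utterance back into the
    # intent's keyword set, so future variants of the phrasing are recognized.
    INTENT_KEYWORDS[intent].update(labeled_phrase.lower().split())

print(classify_intent("open the window"))    # "window_open"
learn_from_cloud("window_open", "roll it down a bit")
print(classify_intent("roll it down"))       # now also "window_open"

The point of the sketch is the feedback loop: once the cloud labels an unusual phrasing, the next driver who says something similar is understood without anyone hard-coding a new command.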

A warmer in-car system makes a car that understands you better

In the 3.0 era, the Roselle intelligent cockpit offers a kind of companionship: it is not just a system, but more like a good partner for young drivers.


One reason is that it was not until the end of 2021 that in-vehicle systems got smart chips befitting this era. The automotive industry had previously relied on the Qualcomm Snapdragon 820A, whose consumer counterpart is the Snapdragon 820 launched in 2016; its computing and processing power are simply not enough to support 3.0-level intelligence.


Beyond the chip upgrade, SAIC Roewe has made further enhancements to the voice function to deliver that sense of being understood: the main driver can issue commands directly without a wake word, and multiple microphones enable four-zone voice control inside the car, so front- and rear-seat occupants communicate with the vehicle more naturally and the driver stays focused. In effect, the system is always on standby and can accurately capture each user's instructions.
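One way to picture four-zone voice control (a simplified sketch with assumed zone names, not Roewe's actual software) is as routing each recognized command to the seat whose microphone picked it up, so a zone-relative request like "open my window" only affects that occupant's side of the car.

# Toy four-zone voice routing: each microphone maps to a seat zone, and
# zone-relative commands act only on that zone's equipment.
ZONES = {"front_left", "front_right", "rear_left", "rear_right"}

def handle_command(zone, intent):
    if zone not in ZONES:
        raise ValueError(f"unknown zone: {zone}")
    if intent == "window_open":
        return f"opening the window at {zone}"
    if intent == "ac_cooler":
        return f"lowering the temperature for {zone}"
    return f"unsupported intent '{intent}' from {zone}"

# The rear-right passenger says "open my window": only their window moves.
print(handle_command("rear_right", "window_open"))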

In terms of interaction, the instrument screen and the central entertainment screen are fused into a single left-and-right unit, and the most needed information is intelligently composed and presented according to the driving scene. While navigating, if you swipe the screen to pick music or another function, the map overview flows to the instrument cluster; at complex intersections or when there is collision risk, the instrument automatically switches to AR-Driving and the lane-level VR navigation flows to the central screen.
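That screen-flow behavior reads like a simple scene-to-layout mapping. The snippet below is a hypothetical sketch of the idea (scene and view names are made up for illustration), not the production logic.

# Toy scene-based screen routing: decide what each display shows from the
# current driving scene; unknown scenes fall back to a default layout.
def route_screens(scene):
    layouts = {
        "navigating_with_media": {"instrument": "map_overview", "center": "media"},
        "complex_intersection": {"instrument": "ar_driving", "center": "lane_level_nav"},
        "collision_risk": {"instrument": "ar_driving", "center": "lane_level_nav"},
    }
    return layouts.get(scene, {"instrument": "gauges", "center": "home"})

print(route_screens("navigating_with_media"))
print(route_screens("collision_risk"))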

At the same time, the new cockpit system supports two kinds of home-screen settings: a "map as desktop" mode and a personalized DIY layout. NetEase Cloud Music and the Bilibili (B Station) app have also been added to bring the system closer to users; in-car video existed before, but watching Bilibili and watching iQiyi in the car are, of course, two different experiences.

Driving Pie's summary >>>


We can't help but ask: why is it SAIC that is out in front this time? The answer may lie in its philosophy of development. SAIC Passenger Vehicle has remained active in intelligence and recently became the only independent-brand car company to be named a national-level "Intelligent Manufacturing Benchmark Enterprise" for 2021.

As we all know, the automotive industry is experiencing changes unseen in a century: everything is being innovated, everything is being rebuilt. Any car company caught in this tide must fully recognize how the situation is changing, keep pace with the times, keep breaking through with innovation, and seize the opportunities of the era.

Authors | Liu Xuexiao and Zhang Junling

Images | sourced from the Internet

Cutting-edge information, original perspectives

The most compelling original automotive new-media brand

Sina Weibo: @Driver Pie

Driving Pie is now available on all major media platforms

Average daily views across the whole network exceed 1,000,000
