Is Mobileye really not working?

The domestic chorus declaring Mobileye's decline is growing louder, and there is one core reason: from 2020 to 2021, the leading smart electric vehicle brands almost all chose NVIDIA Orin as their autonomous driving chip.

Before 2019, Mobileye's EyeQ4 dominated the entire industry. By 2020, the only models equipped with the Mobileye EyeQ5 were the Zeekr 001 and the BMW iX; Zeekr has reportedly considered other brands' chips for its next-generation models, and BMW has made clear it will build its autonomous driving on the Qualcomm Snapdragon Ride platform by 2025.

There are two reasons these car companies are turning to NVIDIA. On the one hand, Mobileye uses a black-box solution: leading car companies pursuing breakthroughs have no room to maneuver at the software level and have to switch to the more open NVIDIA or Horizon platforms.

On the other hand, the leading car companies want to reach autonomous driving as soon as possible, so they fit as much perception hardware as they can. Faced with the enormous data these sensors generate, the demand for chip computing power naturally keeps intensifying. Mobileye's latest EyeQ5H delivers only 24 TOPS, which is hard-pressed to meet those needs, so the car companies turn to high-compute chips from NVIDIA or Horizon.

But is everything Mobileye has accumulated since 2007 really so worthless? Can the new autonomous driving chips truly replace it? Is Mobileye really not working?

After watching Mobileye's conference at this year's CES, I think it is too early to draw that conclusion.

Mobileye's lead in accumulated technology is unquestionable, but its commercialization has not kept pace with the times. NVIDIA, by contrast, landed squarely on manufacturers' needs and arrived at just the right moment. Yet for models using NVIDIA chips, algorithm capability depends on the manufacturer itself, and so far no car company has shown real muscle.

So the more interesting question is which will move faster: the adjustment of Mobileye's commercialization strategy, or the manufacturers' self-developed algorithms.

Looking at the broader trend, autonomous driving chips have entered a stage of rapid growth, competitors have multiplied quickly, and competition is indeed fierce. Mobileye has lost the orders of the leading car companies, but facing a much larger market of mid-tier car companies with no algorithm capability of their own, Mobileye remains the best choice.

At CES this year, Mobileye released a total of three chips and revealed the current progress of its three strategic technologies.

After the launch, we also had the opportunity to interview Erez Dagan, Executive Vice President of Products and Strategy at Mobileye and Vice President of Intel Corporation, and put several sharp questions to him, such as:

"Mobileye's black-box technology route has caused doubt and hesitation among domestic manufacturers cooperating with Mobileye. How does Mobileye view this phenomenon?"

"The computing power of the EyeQ Ultra chip is 176 TOPS, which is not high compared with industry competitors such as NVIDIA. How does Mobileye view the relationship between computing power and L4-and-above driverless capability?"

But before we get to these pointed questions, let's first look back: what did Mobileye actually release at this year's CES?

01

How good are the three chips?

At CES this year, Mobileye released three chips: the EyeQ Ultra, EyeQ6L, and EyeQ6H. This was also the most closely watched part of its CES showing. Let's start with the most powerful of them, the EyeQ Ultra.

EyeQ Ultra

The EyeQ Ultra delivers 176 TOPS of computing power, with 12 RISC-V CPU cores (24 threads each) and 64 accelerator cores in total, built on a 5 nm process and consuming less than 100 W.

The accelerators come in four types: the first handles pure deep-learning computation; the second is a coarse-grained reconfigurable (CGRA) accelerator; the third is a VLIW DSP-style accelerator; and the fourth is multithreaded CPU cores, with each type of core responsible for a different workload.

Engineering samples of the EyeQ Ultra are expected in the fourth quarter of 2023, with official mass production in 2025.

Mobileye defines the chip as "a single chip that can support L4 level autonomous driving."

However, after these figures were released, two common doubts emerged. One questions whether 176 TOPS of computing power can support L4 autonomous driving; after all, the NIO ET7 and ET5 already carry chips with more than 1,000 TOPS. The other concerns timing: advanced driver assistance enters a stage of rapid development from 2022, and a Mobileye chip that only reaches mass production in 2025 seems too late.

On timing, the leading car companies are indeed moving faster than Mobileye, but there is no shortage of car companies that are merely talking big. After all, the lidar frenzy began in early 2020, yet by early 2022 only one model, the Xpeng P5, had reached mass production with lidar.

On computing power, the EyeQ Ultra delivered in 2025 has no advantage on paper over the NVIDIA Orin delivered in 2022. But Mobileye CEO Amnon Shashua seemed to have anticipated this objection and addressed it at the press conference:

"Compared to the competitor's numbers, 176 sounds like a small number, about one-fifth of the computing power our competitors claim. But the key is not only computing power but efficiency, which requires a deep understanding of the interaction between software and hardware: understanding what the cores are and which algorithms run on which cores."

Shashua also cited the SuperVision system on the Zeekr 001 as an example: with 11 8-megapixel cameras, it completes the full loop of perception, planning, and execution on just 2 Mobileye EyeQ5 chips, with a total of only 48 TOPS of computing power.

In the follow-up interview, we also raised this point with Erez Dagan, who developed it further.

EyeQ6H

The EyeQ6H is the advanced successor to the EyeQ5H, with 34 TOPS of computing power on a 7 nm process. According to the official introduction, it delivers nearly 3 times the computing power of the EyeQ5 while consuming only 25% more energy.

At this stage, Mobileye implements SuperVision with 2 EyeQ5s; the next generation of SuperVision will run on 2 EyeQ6Hs, or potentially even on a single EyeQ6H.

The EyeQ6H will deliver engineering samples in the fourth quarter of this year and enter mass production in 2024.

EyeQ6L

The EyeQ6L can be understood as the low-end version of the EyeQ6. Also built on a 7 nm process, it offers 5 TOPS of computing power at only 3 watts. Engineering samples were delivered half a year ago, and mass production is expected in 2023.

Reactions to this chip are mixed; many netizens see the modest computing power as a kind of regression.

Against the backdrop of headline-grabbing high-compute chips, a chip released in 2022 with only 5 TOPS does feel underwhelming. Setting aside the argument that computing power is not the only yardstick, a more persuasive data point is that orders for the EyeQ6L have already exceeded 9 million units.

In an interview after the launch, Erez Dagan also said: "The EyeQ6L offers more computing power and lower power consumption to meet the needs of the entry-level ADAS segment, where we need a highly integrated, highly efficient, cost- and power-effective solution, and the EyeQ6L undoubtedly fully meets the market's needs in both respects."

He added: "The product also leaves headroom for additional cameras and functions such as driver monitoring or AEB. The EyeQ6L is the ADAS foundation, built to meet different standards around the world."

02

Compared with the chips, this is Mobileye's real core competitiveness

In addition to the chips above, Mobileye also revealed the progress of its three strategic pillars at the press conference.

REM crowdsourced mapping

At this year's CES, Mobileye announced that its crowdsourcing fleet collected a total of 4 billion kilometers of data during 2021 and can now collect 25 million kilometers per day; at this rate it will collect roughly 9 billion kilometers in 2022.
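The growth projection is simple arithmetic; here is a quick sanity check, a trivial sketch using only the figures quoted above:

```python
# 25 million km collected per day, sustained over a full year
KM_PER_DAY = 25_000_000
km_per_year = KM_PER_DAY * 365
print(km_per_year)  # 9125000000, i.e. ~9.1 billion km
```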

Compared with traditional high-precision maps, REM maps are easier and faster to collect, cheaper, and fresher.

Regrettably, Mobileye did not give detailed figures at the press conference, and because of the legal and regulatory restrictions on mapping Chinese roads, REM has not officially entered China, so domestic consumers have not yet enjoyed the advantages REM brings.

According to Garage 42, owing to the lack of REM, the launch date of the Zeekr 001's navigation-assisted driving function remains undetermined, and the compliance problem can only be solved through Geely.

The most notable progress abroad is that the Volkswagen ID.4 has achieved assisted driving on road sections without lane markings, based on REM maps.

In addition, Ford's next-generation BlueCruise and the Zeekr models equipped with Mobileye SuperVision will also use REM maps for higher-level intelligent driving.

Imaging radar

In most people's minds, Mobileye is on a pure-vision route similar to Tesla's. That is true but not entirely rigorous: Mobileye takes a vision-first route, but it also has achievements of its own in both millimeter-wave radar and lidar.

This millimeter-wave radar is not that millimeter-wave radar

Today's mass-production millimeter-wave radars can only measure distance and speed; they cannot image. Compared with a camera, a radar knows only that something is a certain distance away; it does not know what that thing is, nor can it build a model of the surrounding environment to provide more valuable information for intelligent driving, and in congested environments its abilities are even more limited.
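To make the "distance and speed only" point concrete, here is a minimal sketch of the math behind a traditional automotive radar's two outputs: range from the round-trip delay of the echo, and radial speed from the Doppler shift. The 77 GHz carrier is a generic automotive-radar assumption, not the spec of any particular sensor.

```python
C = 3.0e8          # speed of light, m/s
F_CARRIER = 77e9   # typical automotive radar carrier frequency, Hz

def radar_range(round_trip_delay_s):
    """Range (m) from the echo's round-trip delay: the pulse covers
    twice the distance, so halve the total path."""
    return C * round_trip_delay_s / 2

def radar_speed(doppler_shift_hz):
    """Radial speed (m/s) from the Doppler shift: a target closing at
    v shifts the echo by 2*v/wavelength."""
    wavelength = C / F_CARRIER
    return doppler_shift_hz * wavelength / 2
```

Everything else the article attributes to an imaging radar (what the object is, the shape of the scene) lies outside these two numbers, which is exactly the gap Mobileye's radar aims to close.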

Shashua also said bluntly at the press conference: "In the past, this traditional radar was basically meaningless as a standalone sensor."

The millimeter-wave radar Mobileye defines has imaging capability similar to 4D millimeter-wave radar; information about it was first released at last year's CES.

Shashua said that after processing by deep-learning algorithms, the radar's perception results can approach the effect of lidar.

At this year's CES, the radar's detection results were shown, and judging from them, the accuracy of detection and imaging of the surrounding environment is already very close to lidar. Compared with lidar, millimeter-wave radar performs better in severe weather and costs less.

The figure shows the processed radar imaging results

This lidar is not that lidar

In addition to the imaging millimeter-wave radar above, Mobileye established a division last year to develop FMCW lidar.

Unlike most of the ToF lidars currently on the market, FMCW lidar does not measure distance by the round-trip time of light but by the change in optical frequency. Its advantage is immunity to other light sources, and each point in the cloud carries not only distance but also velocity information, so it is better suited to tracking surrounding traffic participants. The drawbacks are greater technical difficulty and a higher mass-production price.

Mobileye's price target is below $1,000.
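As a rough illustration of the measurement principle described above (a sketch of generic FMCW math under assumed chirp parameters, not Mobileye's design): with a triangular chirp sweeping bandwidth B over T seconds, the beat frequencies measured on the up- and down-ramps jointly encode range and radial velocity.

```python
C = 3.0e8  # speed of light, m/s

def fmcw_range_and_velocity(f_up, f_down, bandwidth, t_chirp, wavelength):
    """Recover range (m) and radial velocity (m/s) from the beat
    frequencies (Hz) on the up- and down-chirp of a triangular
    FMCW waveform."""
    f_range = (f_up + f_down) / 2      # component due to round-trip delay
    f_doppler = (f_down - f_up) / 2    # component due to target motion
    rng = C * t_chirp * f_range / (2 * bandwidth)
    vel = f_doppler * wavelength / 2   # positive = closing target
    return rng, vel
```

Because every return carries its own Doppler component, each point in the cloud comes with a velocity estimate for free, which is exactly the tracking advantage the text mentions.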

With these two radars complementing vision, the autonomous driving system achieves three redundancies. Moreover, unlike the sensor fusion car companies perform with lidar and millimeter-wave radar, Mobileye splits its radar, lidar, and cameras into two independent subsystems that are not interconnected: an end-to-end driving system using only cameras, and an end-to-end driving system using only radar and lidar. Mobileye hopes the complementarity of the two will improve robustness and achieve safety redundancy.

The key to efficient use of computing power: RSS, the Responsibility-Sensitive Safety principle

Both at the CES conference and in the interview afterwards, Mobileye repeatedly stressed that computing power is not the only criterion, and that efficiency and hardware-software co-design matter just as much. To make this more convincing, Shashua also shared some technical details.

Here I will try to summarize what I understood as concisely as possible; corrections are welcome if anything is off.

Achieving autonomous driving takes three steps: perception, decision-making, and execution. Like stuffing an elephant into a refrigerator, it sounds simple but is a complex project.

The most computationally expensive of these are the "perception" and "decision-making" stages.

At the perception level, to improve perception capability, the number of cameras keeps rising and so does their resolution, and the data generated behind them grows exponentially.

To use computing power more efficiently, Mobileye first performs scene segmentation (NSS) on the visual perception information, prioritizing computation on road-surface information rather than blindly processing the whole frame.

At the decision-making level, the most computing power goes to predicting the trajectories of surrounding traffic participants from the perception information and then deciding on a reasonable, safe route. But once prediction is involved, the further into the future the forecast reaches, the more the demand for computing power grows exponentially.

Here Mobileye introduces RSS (Responsibility-Sensitive Safety), a principle that aims to ensure self-driving cars never actively cause accidents by formalizing, on a data-driven basis, several pieces of subjective common sense from human driving, including:

What is a hazardous situation?

What is the correct reaction in a hazardous situation?

Who is responsible for an accident?

What is the safe distance in different driving scenarios?

Mobileye generalizes its calculations from these driving policies, so with RSS the system computes only the plausible futures rather than all futures. This logic helps Mobileye use computing power more efficiently.
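The published RSS work makes the "safe distance" notion precise with a closed-form minimum longitudinal gap. Below is a sketch of that formula; the parameter values (response time, acceleration bounds) are illustrative assumptions of this example, not Mobileye's calibrated numbers.

```python
def rss_min_gap(v_rear, v_front, rho=0.5,
                a_accel=3.0, a_brake_min=4.0, a_brake_max=8.0):
    """Minimum longitudinal gap (m) so the rear car cannot cause a crash:
    in the worst case the front car brakes as hard as a_brake_max while
    the rear car accelerates at a_accel for the response time rho, then
    brakes at only a_brake_min. Speeds in m/s, accelerations in m/s^2."""
    v_rho = v_rear + rho * a_accel             # rear speed after response time
    gap = (v_rear * rho
           + 0.5 * a_accel * rho ** 2
           + v_rho ** 2 / (2 * a_brake_min)
           - v_front ** 2 / (2 * a_brake_max))
    return max(gap, 0.0)
```

At highway speeds (both cars at 30 m/s) this yields a gap of roughly 83 m with these assumed parameters. Because only futures that approach such constraints need to be evaluated, the planner can prune the rest, which is the computational saving the text describes.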

These three technologies will support Mobileye in better capturing L3/L4 scenarios.

According to information released at CES, at the L3 level Mobileye has partnered with Honda and Valeo, with Mobileye responsible for visual perception.

At the L4 level, Mobileye's robotaxis will begin road testing in mid-2022 and, after approval at the end of the year, will conduct driverless testing with no one in the driver's seat in Germany and Israel, achieving true driverlessness. At that point the cost per car is expected to be $150,000.

Unlike robotaxis, consumer L4 can operate over a wider range of areas, but the system relies heavily on REM; it is expected to retail at around $10,000, with the cost projected below $5,000.

In Mobileye's view, robotaxis and consumer L4 are not in conflict: the earlier robotaxi investment can bring Mobileye valuable data, while consumer L4 can cut costs through scale.

03

Written at the end

That is everything Mobileye revealed at this year's CES. It is undeniable that Mobileye retains an absolute lead built from years of visual perception technology: our 42Mark tests show that models equipped with the EyeQ4 chip have solid basic capabilities, and when we talk to the ADAS leads at head car companies, they too acknowledge that the quality of Mobileye's perception output is very high.

But precisely because its perception algorithm is a black box that cannot satisfy car companies' desire to build upward on their own, Mobileye has lost a large number of orders from leading car companies. Some insiders told us that, compared with Mobileye, the complete development tools NVIDIA and Horizon provide are more conducive to car companies' in-house algorithm development, and communication is also more efficient.

Mobileye has also said it will launch an open computing platform with Intel in January. Meanwhile, last October Israel sent technical staff to support the Zeekr project in China, and Intel formed a 50-person team to co-develop Zeekr's SuperVision.

So it is too early to write Mobileye off. What deserves watching is whether Mobileye adjusts its commercialization strategy faster, or the manufacturers develop their own algorithms faster.

What deserves even more consideration is that, with strong demand for autonomous driving chips, more and more suppliers can offer products, and the market has gradually shifted from a seller's market in 2019 to a buyer's market.

Finally, we still pay the highest respect to Mobileye, a company that pioneered the vision-based autonomous driving route and remains committed to advancing the industry's technology.

Author: June 3

Editor: Daji
