
From assisted driving to autonomous driving: Mobileye's today and tomorrow

In the exploration of autonomous driving, Mobileye is undoubtedly one of the pioneers in the industry.

Mobileye has been involved since 2004 and has a huge advantage in areas such as vision, maps and chips.

Mobileye also has a story that the capital market loves: its ADAS systems carry the endorsement of mainstream automakers.

In 2021, Mobileye's chips received orders from 30 automakers totaling 50 million units, to be installed across 188 new models. On the revenue side, it took in more than $1.4 billion in 2021.

By the end of 2021, cumulative sales of Mobileye's EyeQ chips had exceeded 100 million units. Going forward, EyeQ will keep evolving to support semi-autonomous and fully autonomous driving.

If Mobileye's autonomous driving business fully takes off in 2024, its revenue prospects look almost unlimited. Even at its current 40% growth rate, revenue would scale quickly in the coming years:

2022 - $1.8 billion

2023 - $2.5 billion

2024 - $3.6 billion

2025 - $5 billion

2026 - $7 billion

Last December, Intel announced plans to take its self-driving subsidiary Mobileye public in the United States in mid-2022, at a valuation above $50 billion.

What outside observers most want to know: after the IPO, can Mobileye tell a new story and become a high-growth autonomous driving company? And can the driver-assistance giant support a $50 billion valuation?

Perhaps we can find the answer in what Mobileye CEO Amnon Shashua himself shared.

As the soul of Mobileye and its technical evangelist, Professor Shashua turns his annual CES speech into a feast of ideas for autonomous driving. He not only reviews Mobileye's progress over the past year and looks ahead, but also shares his judgments on where the industry is heading.

This article is compiled by Heart of the Car based on Amnon Shashua's speech at CES (slightly abridged). In this keynote, you will learn:

Mobileye's 2021 report card

What is Mobileye's 2025 strategy?

The three segments that Mobileye focuses on

How to judge the merits of a set of autonomous driving solutions?

How is Mobileye deploying in L4 autonomous driving?


01

Mobileye's 2021 Report Card:

Total chip sales exceeded 100 million

2021 was a record year for Mobileye: 41 new design wins across more than 30 automakers.

Those wins cover a record 50 million future vehicles, up from 37 million in 2020.

Revenue grew 40% over 2020, reaching $1.4 billion for full-year 2021, with 188 models on the market carrying Mobileye products.

Sales of Mobileye's flagship EyeQ chip also keep climbing year by year: 19.3 million units in 2020 and 28.1 million in 2021.

Looking back at EyeQ's history, cumulative sales since its introduction in 2007 have topped 100 million units. In other words, Mobileye technology now rides in 100 million cars around the world.

In 2021, Mobileye also created many industry firsts.

For example, Mobileye built the vision system for a Level 3 model launched with Honda in Japan; its 120-degree, 8-megapixel camera was adopted in BMW models; and with Volkswagen it launched a cloud-enhanced driver-assistance system built on Mobileye's REM technology.

Of course, the market mainstream today is still Level 2, which is where Mobileye's flagship SuperVision system comes in. Paired with a customized ECU, two EyeQ5 chips handle the driver-assistance workload of the car's 11 cameras. Thousands of vehicles running it are already on the road in China.

In 2022, Mobileye plans an OTA upgrade that takes these ADAS capabilities up to NOP pilot assist.


Mobileye's other numbers have their highlights too.

First, Mobileye has built a 200 PB super-database, part of it kept on Mobileye's own systems and the rest stored directly in the Amazon AWS cloud. For comparison, parent company Intel's entire existing database is only 238 PB.

In addition, Mobileye has stored 16 million driving clips, equivalent to 25 years of continuous driving. On the compute side, it commands 500,000 CPU cores at peak.

Every month, Mobileye's compute cluster runs 50 million jobs and processes 100 petabytes of data, the equivalent of 500,000 hours of driving.


02

Mobileye Strategy Interpretation

In 2017, Mobileye developed three foundational strategies.

One is the REM map, a crowdsourcing technology built on top of driver-assistance systems. Any vehicle carrying an EyeQ-series chip can transmit useful data to the cloud, where Mobileye's algorithms fuse it into a high-precision map that supports both driver assistance and autonomous driving.
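To make the crowdsourcing mechanics concrete, here is a minimal sketch (my own simplification, not Mobileye's actual pipeline) of how many noisy landmark reports from EyeQ-equipped cars could be fused into a single high-precision map entry:

```python
# Hypothetical simplification of REM-style crowdsourced mapping:
# many noisy landmark observations uploaded by vehicles are
# aggregated in the cloud into one high-precision map entry.
from collections import defaultdict
from statistics import mean

def aggregate_landmarks(observations):
    """observations: list of (landmark_id, x, y) tuples uploaded by vehicles."""
    buckets = defaultdict(list)
    for landmark_id, x, y in observations:
        buckets[landmark_id].append((x, y))
    # Averaging many independent noisy sightings shrinks the position error,
    # which is how a cheap camera fleet can approach survey-grade accuracy.
    return {
        lid: (mean(p[0] for p in pts), mean(p[1] for p in pts))
        for lid, pts in buckets.items()
    }

# Three vehicles report slightly different positions for the same traffic light:
obs = [("light_42", 10.02, 5.01), ("light_42", 9.97, 4.98), ("light_42", 10.01, 5.03)]
print(aggregate_landmarks(obs))  # -> {'light_42': (10.0, 5.0067)}
```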

The second is True Redundancy. Building a self-driving car involves different sensor types, such as cameras, millimeter-wave radar, and lidar. At Mobileye, however, the cameras on one side and the radar/lidar on the other are built into two independent subsystems, each polished into a complete end-to-end driving system in its own right, yielding double redundancy.

The third is Mobileye's safety model, RSS (Responsibility-Sensitive Safety). RSS answers the hard questions: How exactly do you define what constitutes safe driving? Where is the line between reckless and cautious driving? How can a self-driving car merge into traffic without compromising safety?
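The published RSS paper (Shalev-Shwartz, Shammah, and Shashua) makes "safe driving" computable in closed form. The sketch below implements its safe longitudinal following distance; the parameter values here are illustrative, not Mobileye's calibrated ones:

```python
# RSS safe longitudinal distance: the rear car stays safe even if the
# lead car brakes as hard as the model's worst-case assumptions allow.
def rss_safe_longitudinal_distance(v_rear, v_front, rho=0.5,
                                   a_max_accel=3.0, b_min_brake=4.0,
                                   b_max_brake=8.0):
    """Speeds in m/s, accelerations in m/s^2, rho = response time in s.
    Parameter values are illustrative, not Mobileye's calibrated ones."""
    # Worst case: the rear car accelerates for rho seconds before braking
    # gently, while the front car brakes at its assumed maximum.
    v_rear_after_rho = v_rear + rho * a_max_accel
    d = (v_rear * rho
         + 0.5 * a_max_accel * rho**2
         + v_rear_after_rho**2 / (2 * b_min_brake)
         - v_front**2 / (2 * b_max_brake))
    return max(d, 0.0)

# At highway speed (30 m/s for both cars) the rule yields a concrete gap:
print(f"{rss_safe_longitudinal_distance(30.0, 30.0):.1f} m")  # ~83.2 m
```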

From assisted driving to autonomous driving: Mobileye's today and tomorrow

These are the three basic strategies of Mobileye, which drive the two major divisions of Mobileye to continue to move forward.

First, there is the advanced driver-assistance division, i.e., L2+. These systems carry multiple cameras and have a broad operational design domain.

Costs stay relatively low because the system needs only cameras and REM maps. RSS-based driving strategies are also built in, letting Mobileye do lean computing when planning paths.

Second, there is the L3/L4 division, covering conditional and fully autonomous driving. REM maps extend its operating range, RSS safeguards the vehicle, and at this level millimeter-wave radar and lidar join the sensor suite.

After setting the goal in 2017, where is Mobileye now?


On REM maps, Mobileye has assembled the world's largest crowdsourced mapping fleet, currently accumulating 25 million kilometers of road data per day.

Throughout 2021, Mobileye collected 4 billion kilometers of data. At 25 million kilometers per day, 2022 alone can add roughly 9 billion kilometers.


Today's Mobileye not only ships a full-featured L2+ system but also runs autonomous test vehicles in many places: New York, Detroit, Tokyo, Paris, Munich, and Israel.

On True Redundancy, Mobileye has commercialized the camera subsystem; its crystallization is SuperVision.

The product is already shipping on Zeekr vehicles, and through 2022 Mobileye will enrich its functions via a series of OTA upgrades.

Mobileye has also completed the millimeter-wave radar/lidar subsystem, meaning it can now deliver an end-to-end driving experience without cameras.

In addition, in September 2021 Mobileye unveiled its robotaxi at the IAA Mobility show, uniting the camera and radar/lidar subsystems. The robotaxi service is slated to launch officially in mid-2022.

03

From ADAS to Autonomous Driving:

Mobileye today and tomorrow


For cloud-enhanced L2+, REM maps are a powerful addition; the first automaker to benefit is Volkswagen with its Travel Assist 2.5 platform, first installed on the ID.4.

With REM map data, vehicles can hold lane keeping and centering even where road markings are absent.

Mobileye's partnership with Ford has also borne fruit in the new-generation BlueCruise, which REM maps will strongly support. In China, Mobileye partners with Zeekr, whose SuperVision system is likewise aided by REM maps.

At L3, Mobileye is working with Honda and Valeo in Japan, taking on the computer-vision workload.

BMW is also a customer; the 7 Series carrying the L3 system officially launches this year, and Mobileye will keep expanding that system's operational design domain.

At L4, the robotaxi Mobileye unveiled last September integrates its camera and millimeter-wave radar/lidar twin subsystems. It is expected to launch officially in mid-2022 and, by year's end, to win approval from German and Israeli regulators to remove safety drivers entirely.

Mobileye has also booked orders for robotaxis, self-driving cars, autonomous minibuses, and unmanned delivery vehicles from companies such as Sixt, RATP, Transdev, Udelv, and Willer.


Not long ago, Mobileye also landed its first consumer L4 order; the customer is Geely Group's Zeekr brand.

"Consumer" here means self-driving cars that anyone can buy on the market, not a ride-hailing service. This consumer L4 platform is powered by six EyeQ5 chips and will reach start of production (SOP) in early 2024.

So where will these efforts by Mobileye take us?


The answer is three new market segments:

The first is the emerging assisted-driving segment: more advanced ADAS, which we call L2+.

It earns that label because Mobileye has implemented omnidirectional sensing. The Zeekr 001, for instance, carries 11 cameras, 7 of them 8-megapixel long-range units and the other 4 for parking; add REM maps and continuous OTA firmware updates, and the vehicle's performance keeps improving.

The L2+ platform also runs a complete sense-plan-execute cycle with a fairly broad operational design domain. Still, it remains L2+ at heart: the driver must stay attentive.

The second is the geofenced L4 robotaxi. The full system costs tens of thousands of dollars; the robotaxi Mobileye showed at IAA Mobility in September 2021 belongs to this segment.

The third, emerging in 2024 or 2025, is the consumer Level 4 autonomous vehicle. Its unique advantage over the robotaxi is breaking out of the geofence: drive wherever you want.

Mobileye's REM map is the key here. Cost control is also better: the expected market price is about $10,000, with a system cost under $5,000.

Currently, Mobileye is involved in three major market segments.


As for the future direction of autonomous driving, Mobileye believes that there are two:

One is Robotaxi, which can carry both people and goods. In addition to Mobileye, many companies are betting on this track, such as Waymo, Argo, Cruise and Aurora.

The other direction is the consumer-grade autonomous vehicle: buy a car and get Level 4 capability. Mobileye is bullish on this direction, where Tesla and Apple are also important players.

Note, though, that the two markets are not symmetric: a consumer self-driving car can be turned into a robotaxi, but a robotaxi, with its higher cost, does not make a consumer car.

Mobileye also believes consumer self-driving cars must be purpose-built; you cannot force-fit robotaxi technology into them, because its scale and cost simply do not work.

That's what Mobileye is doing, walking on two legs, but prioritizing scale and cost.

Of course, betting on both directions at the same time is not just about hedging risk, there are strong synergies between Robotaxi and consumer-grade autonomous driving systems.

In deploying Robotaxi, Mobileye was able to accumulate a lot of key experience that it could then feed back into the next phase of consumer-grade autonomous vehicles.

2022 or 2023 will be the big year for Robotaxi, and 2025 will be the first year for consumer-grade autonomous vehicles.

04

How to judge the quality of a self-driving solution?

Is there a standard for judging the quality of a set of autonomous driving solutions?


Of course! Mobileye boils it down to three criteria:

The first is capability. The system must have a broad operational design domain, competent on highways, in cities, and on suburban roads alike. Its driving style must also approach human habits: when merging into traffic, for example, be decisive rather than hesitant.

The second is robustness. The system's mean time between failures (MTBF) must be long enough, at minimum significantly better than a human driver's: 10, 100, or even 1,000 times better.

The third is efficiency. Efficiency is cost: without enough of it, costs cannot be controlled, and the system can never leave the geofence.

These are the three criteria that determine the quality of autonomous driving solutions, so how is Mobileye prepared to meet them?

On capability, Mobileye can support the full operational design domain, covering L2 through L4. In Mobileye's view, what separates the levels of autonomy is not the design domain but the mean time between failures.

If the system's MTBF is below the human-driver average, it is a Level 2 system; if it is above the human average, it is a Level 4 system.

On robustness, Mobileye leans on two principles: the RSS model and the True Redundancy concept. Together, these two pillars greatly improve robustness, lengthen the mean time between failures, and make the system easier to validate.
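A back-of-the-envelope illustration of why True Redundancy helps, using assumed numbers of my own rather than Mobileye's:

```python
# If two perception subsystems fail independently, their hourly failure
# probabilities multiply, so the fused MTBF grows multiplicatively.
camera_mtbf_hours = 1e4          # assumed camera-subsystem MTBF
radar_lidar_mtbf_hours = 1e4     # assumed radar/lidar-subsystem MTBF

p_cam = 1 / camera_mtbf_hours    # per-hour failure probability
p_rl = 1 / radar_lidar_mtbf_hours
p_both = p_cam * p_rl            # both fail in the same hour
fused_mtbf = 1 / p_both
print(f"fused MTBF ~ {fused_mtbf:.0e} hours")  # ~1e8 hours

# This is also why validation gets easier: each subsystem only needs to be
# proven to a far lower MTBF than the fused target, cutting test mileage.
```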

As for performance, Mobileye believes there are four key elements.

(1) Customized EyeQ SoC

Mobileye's EyeQ chip is now in its fifth generation, with sales in the millions. Mobileye's project with BMW is built around the EyeQ5, and another German giant, Audi, is an EyeQ5 user as well.

At this year's CES, Mobileye brought three new EyeQ family SoCs:


EyeQ6 Light is built mainly for Level 2 ADAS; it draws ultra-low power, just 3 W, and is small enough to mount behind the windshield.

EyeQ6 High delivers 2 to 3 times the performance of the EyeQ5 and is the designated SoC for the next-generation SuperVision.

EyeQ ULTRA distills Mobileye's experience on the road to autonomy; its new architecture can fully support the heaviest workloads.


Put simply, EyeQ ULTRA is Mobileye's shot at the ultimate chip: despite its monolithic integrated architecture, it can support fully autonomous driving. It carries internal redundancy, and paired with an external MCU it can form a complete ASIL-D system.

Over the years, Mobileye has built four different classes of accelerator. One focuses on deep-learning computation; another resembles an FPGA and is called a CGRA; the third is a SIMD core, similar to a DSP (digital signal processor); the last is a multithreaded CPU.

At work, each class of core takes on the workloads it suits best, and EyeQ ULTRA packs 64 such accelerator cores in total.

On process, EyeQ ULTRA uses the latest 5 nm technology and carries 12 RISC-V cores with 24 threads each. Mobileye also armed it with a GPU and an ISP for visual tasks. At full tilt it reaches 176 TOPS, enough for massive 8-bit deep-learning workloads. Next to the competition, 176 is not a big number, perhaps a tenth of some rivals' figures, because EyeQ ULTRA cares about more than raw computing power (TOPS).

In Mobileye's view, performance comes from a real understanding of the interaction between hardware and software: which cores to use for a job, and which algorithms those cores should run.

Take SuperVision: the whole system runs on two EyeQ5 chips, each rated at 15 TOPS, yet it not only processes the data from 11 8-megapixel cameras but also forms a complete driving strategy, closing the perception-planning-decision loop.

Clearly, Mobileye's appetite for raw compute is lower than expected: with true vertical integration of software and hardware, large performance gains and lean computing are both achievable. EyeQ ULTRA's 176 TOPS is in fact roughly ten times the computing power of one EyeQ5, the strongest weapon Mobileye can field today.
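The arithmetic behind those claims, using only figures quoted in the talk:

```python
# SuperVision runs 11 cameras on two EyeQ5 chips of 15 TOPS each,
# so the per-camera compute budget is remarkably small.
eyeq5_tops = 15
chips = 2
cameras = 11
total_tops = eyeq5_tops * chips     # 30 TOPS for the whole stack
print(total_tops / cameras)         # ~2.7 TOPS per camera stream

# EyeQ ULTRA's quoted 176 TOPS versus a single EyeQ5:
print(176 / eyeq5_tops)             # ~11.7x, which the talk rounds to "ten times"
```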

Today, two EyeQ5s can power Mobileye's computational vision subsystem, SuperVision.

In 2024, Zeekr models will carry six EyeQ5s to achieve L4 autonomy within a limited operational design domain.

Mobileye's robotaxi currently carries eight EyeQ5s; practical experience says a robotaxi needs at least ten.

Clearly, a single EyeQ ULTRA can support a complete L4 system, and it will not end up selling for more than $1,000. On cost, that is a game changer for the industry. Power draw is also very low: the entire system stays under 100 W.


On computing power, EyeQ6 High offers roughly three times the existing EyeQ5, or 34 TOPS, while energy consumption rises only 25%. It carries just 14 accelerator cores, and its process stays at 7 nm like the EyeQ5.

It still carries the GPU and ISP, and it will pair into the next-generation SuperVision (two EyeQ6 Highs).

A single EyeQ6 High can also take on other kinds of work. If all goes well, EyeQ6 High engineering samples arrive by the end of this year, with mass production in 2024.

EyeQ ULTRA engineering samples will be available by the end of 2023 and will be in series production in 2025.


EyeQ6 Light, on a 7 nm process, delivers 5 TOPS of computing power.

Mobileye's current workhorse in the ADAS market is the EyeQ4 Mid; EyeQ6 Light has 5 times its computing power while drawing only 3 W.

Though not yet in mass production, EyeQ6 Light already has orders for 9 million units; its engineering samples were completed half a year ago, and mass production begins in 2023.


(2) Next-Generation Radar – "Software-Defined Image Radar"

Why is Mobileye turning its attention to millimeter-wave radar again?

Consider the two subsystems again: one is the camera-based SuperVision; the other combines ToF lidar with 360-degree millimeter-wave radar coverage.

In operation, the two subsystems perceive independently. From this arrangement Mobileye wants not only more robustness but also lower cost.

Experience says radar costs only one-fifth to one-tenth as much as lidar.

The problem is that today's millimeter-wave radar simply cannot stand on its own: in congested traffic, it cannot tell pedestrians from vehicles.


Fortunately, millimeter-wave radar is evolving fast. Two years ago Mobileye set out to build a high-precision radar it calls "software-defined radar," because the radar can be upgraded in software, even over the air, with signal transmission, reception, and processing all software-configurable.

With matching deep-learning algorithms, Mobileye can turn this radar into a sensor that operates independently. If the targets are met, a 2025 self-driving car would need only a single front-facing lidar, with millimeter-wave radar and cameras shouldering the rest of the surround monitoring.

Software-defined imaging radar thus becomes a cost-cutting killer, and as a standalone sensor it raises system robustness yet again, giving the forward view triple redundancy.

In his 2021 presentation, Shashua has already talked about the performance indicators of this high-precision radar.

Put simply, Mobileye's software-defined radar has 48 transmitters and 48 receivers, a MIMO array that multiplies out to over 2,000 virtual channels (48 × 48 = 2,304), and its dynamic range reaches 100 dB, formidable performance by any measure. Trial production of the radar's full chipset is complete, and assembly testing of the radar is underway.

[Figure: imaging-radar point cloud, color-coded by height and relative motion, alongside the camera view]

As shown in the figure, in the radar rendering on the left, green represents height, blue marks vehicles moving away, and red marks vehicles or objects approaching. Many of the green dots flanking the blue ones are curbs identified by the radar, and once the video plays you can see how busy the road traffic is.

In another example, the vehicle in the video passes under a bridge. In the radar view, even vehicles hidden behind guardrails show up, and vehicles inside the tunnel are clearly visible.

Radar on the market today cannot reach this level of recognition. In the next clip, watch the motorcycle in the radar point cloud on the left: even after it leaves the line of sight, the radar keeps tracking it, its whole trajectory clearly visible, and remember this is a busy arterial road.

With detection this strong, how good is the radar's output data?

Why ask? Because unlike lidar's image-like output, raw millimeter-wave radar data is hard to read. So Mobileye trained two neural networks; the first takes millimeter-wave radar as input and produces lidar-style output.

Mobileye co-registered the radar and lidar, using the network to convert radar input into lidar-style output (rendered in the same form purely to demonstrate the radar's capability). In the final result, the scenes detected by lidar and by radar are nearly identical.

Looking at a few more examples, millimeter-wave radar can already compete with lidar in terms of detection accuracy, and the quality of the data it collects is quite high.

Mobileye ran a second experiment: "training a neural network to convert the output of radar into a camera image." This time radar and camera data were aligned and fed to the network; the result, as the video shows, is slightly distorted but with guardrails and vehicles clearly visible.
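A minimal PyTorch sketch of this cross-modal training idea; the shapes, network, and loss below are hypothetical stand-ins, not Mobileye's models:

```python
# Train a network that takes radar tensors as input and is supervised
# with the co-registered lidar view (radar-to-camera works the same way).
import torch
import torch.nn as nn

class RadarToLidar(nn.Module):
    def __init__(self):
        super().__init__()
        # Treat the radar return as a 2-channel range-azimuth "image" and
        # regress a 1-channel lidar-style depth/intensity map.
        self.net = nn.Sequential(
            nn.Conv2d(2, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1),
        )

    def forward(self, radar):
        return self.net(radar)

model = RadarToLidar()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.L1Loss()

# One synthetic training step; real training would pair time-synchronized
# radar frames with lidar sweeps from the same drive.
radar = torch.randn(8, 2, 128, 128)
lidar_target = torch.randn(8, 1, 128, 128)
loss = loss_fn(model(radar), lidar_target)
opt.zero_grad(); loss.backward(); opt.step()
print(loss.item())
```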

Mobileye's goal is simple: save customers money with a radar that costs a fifth to a tenth as much as lidar, while adding 2 to 3 channels of safety redundancy to the whole system.


On lidar, Mobileye also has a dedicated development team. Its FMCW lidar produces 4D point clouds, adding per-point velocity data.

Mobileye has nearly finished the prototype of the LIPRO SoC, which will handle lidar data processing and support scanning across 90 vertical channels. The FMCW lidar itself will be in mass production by 2024.

Mobileye's most powerful autonomous-driving chip, EyeQ ULTRA, hits the road in 2024 and enters mass production in 2025. Alongside it comes software-defined radar, which can stand alone as a third layer of redundancy.

With all this gear, Mobileye has a self-driving system that costs less than $5,000.

(3) RSS-based lean driving strategy: it is not enough to customize the SoC; the computing that supports Level 4 must itself be "lean" enough.

Lean computing is closely related to the level of system performance.

First comes perception: the vehicle understanding its surroundings in a full 360-degree view, including the positions and dynamics of road users, fixed objects, obstacles, traffic lights, curbs, and road markings.

Then comes planning: the vehicle makes its own decisions on the move, such as when to merge, how to merge, even how to merge less abruptly. Finally comes execution, the actual control of the vehicle. Closing this loop takes enormous computing power; a schematic sketch of the cycle follows.
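In its simplest schematic form (every function below is a hypothetical placeholder, not Mobileye's software):

```python
# The perception-planning-execution cycle, reduced to its skeleton.
def drive_loop(sensors, planner, actuators, running):
    while running():
        world_state = sensors.perceive()        # 360-degree scene: agents, lanes, lights
        trajectory = planner.plan(world_state)  # when/how to merge, lane changes, speed
        actuators.execute(trajectory)           # steering, throttle, brake commands
```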

Why is the compute bill so large, and how does Mobileye trim it through lean computing?

What is a driving strategy? A driving strategy maps the sensed state (everything around the vehicle) to an action, thereby controlling the vehicle's motion.

It sounds simple, but it's a real big problem, why?

First, unlike perception, a driving strategy has no instant ground truth to lean on, and every action has long-term effects: a small move now may cause an accident seconds or minutes later. Every action must therefore account for the chain reactions that follow.

Moreover, this is a closed loop. You are not the only player in the game; you must handle the uncertainty of other road users, who take actions of their own.

Right now, many companies are developing technologies to try to develop driving strategies in new ways.

[Chart: computation cost (vertical axis) versus search quality (horizontal axis) for brute force, MCTS, and MDP; a companion chart ranks how realistic each approach's assumptions are]

In the chart on the left (above), the vertical axis is the amount of computation and the horizontal axis is search quality. "Brute force" is easy to define: to roll out the future, you predict every possible action of every object, then repeat, round after round, like a tree branching without limit.

Brute force is clumsy, but with unlimited computing power it would eventually deduce the optimal solution. Unfortunately, computing power is not unlimited.

Robotics and autonomous driving widely use MCTS, the Monte Carlo tree search algorithm: heuristically explore a few "most likely" moves, then a few "most likely" responses from the other agents, and so on down the tree (a toy skeleton appears after this comparison of approaches). Search quality ultimately depends on how much compute you can spend.

Finally, there is the MDP, the Markov decision process. Its search quality is very high, but you must make strong assumptions about how other road users will move over the next few seconds, tens of seconds, even 100 seconds, without modeling interactions. If you trust those assumptions, MDP is quite efficient.

Now look at the chart on the right (above): how realistic are the assumptions?

The MDP is the least realistic, because it assumes you can predict other road users' actions outright, with no closed-loop reasoning; your trajectory is planned completely independently.

As for MCTS, assuming you can read a driving strategy and weave between other road users accordingly may also fall short of reality.

What about brute force? The fewest assumptions are required.
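To make the trade-off concrete, here is a toy MCTS skeleton (generic textbook MCTS, not Mobileye's planner); `expand` and `rollout` are user-supplied placeholders:

```python
# Simulate only the most promising branches instead of the full
# brute-force tree; search quality scales with the iteration budget.
import math, random

class Node:
    def __init__(self, state, parent=None):
        self.state, self.parent = state, parent
        self.children, self.visits, self.value = [], 0, 0.0

def ucb(node, c=1.4):
    # Balances exploiting good branches against exploring rarely tried ones.
    if node.visits == 0:
        return float("inf")
    return node.value / node.visits + c * math.sqrt(
        math.log(node.parent.visits) / node.visits)

def mcts(root, expand, rollout, iterations=1000):
    for _ in range(iterations):
        node = root
        while node.children:                    # selection
            node = max(node.children, key=ucb)
        for child_state in expand(node.state):  # expansion
            node.children.append(Node(child_state, parent=node))
        leaf = random.choice(node.children) if node.children else node
        reward = rollout(leaf.state)            # simulation: a cheap "hunch"
        while leaf:                             # backpropagation
            leaf.visits += 1
            leaf.value += reward
            leaf = leaf.parent
    return max(root.children, key=lambda n: n.visits)
```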

Clearly, the best solution still costs enormous compute. The RSS paper also discusses driving strategy, though that part is rarely emphasized; people prefer to talk about the regulatory framework RSS enables.

In RSS, we talked about the following things.

First, we define reasonable bounds on the behavior of other road users, because human drivers make assumptions of the same kind.

We then capture those assumptions in formal mathematical form, so industry players and regulators can jointly set their parameters. Once the assumptions are pinned down, you can reason about the worst case without enumerating everything other road users might do.

RSS also develops a formal theory by induction: within an architecture that plans only against the worst case, you can prove you will never cause an accident, which gives us a formal guarantee.

That's why RSS has gained so much favor with so many institutions and companies, and it's also the starting point for solving some key issues.

Of course, what I want to emphasize here is not the regulatory framework of RSS, but how it affects driving strategies.

Using induction and closed-form analysis, RSS folds every reasonable assumption about the future into the problem formulation. That is both more efficient and more realistic, with the highest search quality and interpretable results.

So, when do neural networks step in?

Neural networks enter not at the safety level but at the comfort level. Rather than predicting exactly what other road users will do, the networks recognize intentions: yielding, cutting in, parking, making a U-turn. With those intentions identified, the network can tune the parameters of RSS.
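A hedged sketch of that division of labor: a hypothetical intent label tunes RSS parameters for comfort while the safety math stays intact. The names and values here are illustrative, not Mobileye's:

```python
# Intents only choose assumptions within the agreed range; safety is
# never relaxed below the formal RSS bound.
def rss_params_for_lead_vehicle(intent):
    """intent: output of a neural intent classifier, e.g. 'cutting_in',
    'yielding', 'parking', 'u_turn', 'cruising'."""
    if intent == "cutting_in":
        return {"rho": 0.7, "b_max_brake": 9.0}  # assume more conservatively
    if intent == "yielding":
        return {"rho": 0.4, "b_max_brake": 7.0}  # can close the gap sooner
    return {"rho": 0.5, "b_max_brake": 8.0}      # default assumptions
```

These parameters would feed directly into a safe-distance rule like the one sketched earlier, making the ride more comfortable without touching the safety guarantee.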

Overall, placing the RSS-based driving strategy on that chart, we get MDP-level efficiency under reasonable assumptions, making the driving strategy genuinely lean.

(4) Create REM high-precision maps for ADAS and autonomous driving


On performance, REM maps are another link that cannot be ignored. The ID.4, running Volkswagen's Travel Assist 2.5 system, uses Mobileye's REM map.

Take these scene fragments as an example:

On country lanes, for example, no lane dividers are visible at all, yet with REM maps the vehicle can still engage lane keeping.

The REM map also encodes the relationships among traffic lights, markings, and lanes, supporting key safety functions such as making sure the vehicle never runs a red light.


The REM map already covers a very wide area, 2.5 million kilometers in Europe alone, and the data keeps refreshing: in 2021 alone Mobileye gathered 4 billion kilometers of road data, a number still growing by 25 million kilometers a day.


As for richness, the REM map is first of all impeccable on accuracy, carrying driving routes, road boundaries, traffic lights, sidewalks, drivable lanes, and speed limits.

In 2021, Mobileye added more layers, such as construction sites, marked in red on the map, and turn hints, another ingredient of a good autonomous-driving experience.

Mobileye has also added speed-bump locations and per-section speed limits to the REM map, no easy feat, and urban public-transport routes are included as well; a sketch of these layers follows below.
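As a reading aid, here is a hypothetical schema sketching the semantic layers the talk attributes to REM maps; the field names are illustrative, not Mobileye's actual format:

```python
# Illustrative data structure for the map layers described above.
from dataclasses import dataclass, field

@dataclass
class RemLaneSegment:
    lane_id: str
    drivable_path: list          # centerline points of the driving route
    boundaries: list             # road edges / lane dividers
    speed_limit_kph: float
    traffic_light_ids: list = field(default_factory=list)  # lights governing this lane
    speed_bumps: list = field(default_factory=list)        # positions along the lane
    construction_zone: bool = False                        # flagged red on the map
    bus_route: bool = False                                # public-transport lane
```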

To conclude: first, our work redefines the evolutionary path of ADAS and brings Level 2+ into real-world deployment, with REM maps and lean computing as key ingredients. For the ADAS market, it is the North Star guiding the course.

Second, we understand what the right engineering looks like: true redundancy that keeps raising the mean time between failures (MTBF), the foundation on which MaaS services can truly roll out.

Finally, there are the blockbuster products, EyeQ ULTRA and the software-defined imaging radar, the pillars that will decide whether fully autonomous driving can reach the consumer market.
