
Hikvision, Dahua, and Uniview bet big, five factions assemble: intelligent transportation enters a new "radar-vision fusion" war

Raiding each other's turf and crossing each other's boundaries will be routine moves for these companies for many years to come.

Text | New Intelligent Driving (ID: AI-Drive)

Author | Jieping

"Radar-vision fusion", a technical term that has long existed without drawing much notice, is now beginning to attract real attention.

On one side, video IoT companies such as Uniview have elevated "radar-vision fusion" to the level of company strategy this year; on another, RoboSense (Sagitar Juchuang) has told New Intelligent Driving that it will actively roll out radar-vision fusion products at the roadside; meanwhile intelligent driving companies such as Xidi Intelligent Driving, along with technology companies like Baidu, Alibaba, and Tencent, also hold radar-vision fusion solutions of their own and are waiting at the roadside.

"Radar-vision fusion" is not a new way of solving the problem, and practitioners skilled in it are not rare in various fields. What is new is that several previously unrelated kung fu sects are finally meeting at the same intersection to fight it out.

Internet companies, highway-industry technology suppliers, video IoT enterprises, automotive-grade radar suppliers, intelligent driving companies... All five forces are practicing the same sword discipline called "radar-vision fusion", though a closer look shows that each started at a different time and fights with different moves.

A more tempting carrot is that long-discussed vehicle-road coordination may finally be torn open as a result, moving the industry beyond single-point intelligence into a genuine "smart car" + "smart road" cooperative campaign.

But why now? Why them? When the radar-vision fusion solution is transplanted into the field of transportation, does it really work? And who will win in the end?

Five forces

In the autonomous driving industry, it is widely accepted that multi-sensor fusion is the main trend for the future of autonomous driving perception.

The same story is now repeated in the field of smart transportation.

Induction coils, cross-section radar, and geomagnetic detectors are the main traditional vehicle detectors, but they share a common limitation: they can only capture a vehicle's lane and speed at a single cross-section or a single instant.

The two leading types of new-generation intelligent vehicle detectors are AI cameras and forward-looking traffic millimeter-wave radar, which can capture real-time data across all lanes, along with accurate position and vectorized data for every vehicle.

However, AI cameras and traffic millimeter-wave radar each have their own inherent weaknesses: cameras produce false alarms and missed detections in harsh environments, while millimeter-wave radar works in harsh environments but cannot see a clear image, nor identify a vehicle's license plate, model, logo, or body color.

Hence the integration of camera and radar into a complementary, mutually reinforcing radar-vision fused product has become a dark-horse challenger among traffic perception solutions over the past year.

One representative faction is the video IoT enterprises: Uniview, Hikvision, and Dahua.

They were originally strong in video technology and have cultivated the to-G transportation market for many years. In recent years they have been transforming, building overall scene solutions for transportation with video at the core, fused with radar, big data, artificial intelligence, and other technologies.

For example, Uniview raised the banner of "radar-vision fusion" this year and released radar-vision all-in-one units and related solutions.

Liu Shengning, product line director at Uniview, told New Intelligent Driving that for video AI, the traffic roadside was the first field to land: in urban traffic governance, automated enforcement systems built on checkpoints and electronic police effectively reduce illegal driving, while for static traffic governance such as entrances and exits, roadside parking, and parking-space search and guidance, the video AI foundation effectively improves service efficiency.

"The 'transportation power' national strategy and the 14th Five-Year Plan emphasize further raising the level of traffic informatization and intelligence on top of transportation infrastructure, building an overall transportation network around the three elements of order, efficiency, and safety. A sufficient density of all-weather, high-precision, multi-dimensional roadside perception equipment becomes a key element of that construction, and the fusion of radar and video in particular has been applied more and more widely to the roadside of intelligent transportation over the past year."

Another faction is the automotive-grade lidar suppliers such as RoboSense, Ouster, and Wanji Technology, who are using radar sensors as a blade to carve out market segments like intelligent driving and intelligent transportation.

RoboSense told New Intelligent Driving that as the industry demands ever higher perception precision, lidar deployment in the field of intelligent transportation has also accelerated.

RoboSense began offering fused lidar-camera products in 2018, and its radar-vision fusion roadside projects have already landed on roads in Guangzhou, Shenzhen, Zhejiang, and other provinces and cities.

In addition, intelligent driving companies such as Baidu, Xidi Intelligent Driving, and Huawei have also launched radar-vision fusion programs, and Tencent and Alibaba, which have long targeted smart cities, are of course not absent either.

To summarize: with radar-vision fusion as the guiding thread, five main forces are currently polishing products or solutions on the smart transportation battlefield: video IoT enterprises, roadside millimeter-wave radar suppliers, intelligent driving companies, automotive-grade lidar suppliers, and Internet companies.

Some are deeply rooted in the transportation field, some excel at radar sensors, and some have fought their way from the automotive field all the way to the roadside.

Each with its own killer moves

Different sects make their living with different techniques, and naturally their killer moves have different emphases.

In his book The Nature of Technology, Brian Arthur argues that all technologies are combinations: any specific technology is built or combined from existing components, assemblies, or systems, and every component of a technology is itself a technology in miniature.

Put simply, new technology arises from combining and evolving existing technologies, and so-called "radar-vision fusion" is exactly that: integrating radar, whether millimeter-wave radar or lidar, with video.

Millimeter-wave radar detects and identifies targets using emitted electromagnetic waves of millimeter wavelength. Once a target reflects the waves back, simple formulas yield the detected target's distance and relative speed with respect to the radar's position.
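
The arithmetic behind those formulas is simple enough to sketch. The snippet below is a toy illustration (not any vendor's implementation), assuming an idealized 77 GHz radar and made-up echo values:

```python
# Toy illustration: how a millimeter-wave radar turns a reflected wave
# into range and relative (radial) speed.
C = 3.0e8          # speed of light, m/s

def target_range(round_trip_time_s: float) -> float:
    """Range from the round-trip travel time of the echo: r = c*t / 2."""
    return C * round_trip_time_s / 2.0

def relative_speed(doppler_shift_hz: float, carrier_freq_hz: float) -> float:
    """Radial speed from the Doppler shift: v = f_d * c / (2 * f_c)."""
    return doppler_shift_hz * C / (2.0 * carrier_freq_hz)

# A 77 GHz radar sees an echo after 1 microsecond with a 513 Hz Doppler shift:
print(target_range(1e-6))            # -> 150.0 m away
print(relative_speed(513.0, 77e9))   # -> roughly 1.0 m/s closing speed
```

Real radars estimate these quantities from FMCW beat frequencies rather than a single echo, but the distance and speed relations are the same.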

At present, most millimeter-wave radars output point clouds, which are two-dimensional information without height; the images acquired by the camera are two-dimensional as well.

To achieve fused detection between millimeter-wave radar and video, the two coordinate spaces must be converted into one another, for example by transforming the radar point cloud from the radar coordinate system into the image coordinate system of the camera.

This requires fusing the two sensors' underlying data, which on the one hand yields better detection accuracy, and on the other guarantees time consistency of the key data.
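
As a rough sketch of that coordinate conversion, the snippet below projects a radar point into camera pixels with a pinhole model. The calibration constants (focal lengths, principal point, translation) are hypothetical illustrative values, and the radar-to-camera rotation is taken as identity for brevity; real deployments obtain all of these from a calibration procedure:

```python
# Hypothetical calibration values, for illustration only.
FX, FY = 1000.0, 1000.0      # focal lengths, in pixels
CX, CY = 960.0, 540.0        # principal point (image centre) of a 1080p camera
TX, TY, TZ = 0.0, 0.2, 0.0   # radar -> camera translation, metres

def radar_point_to_pixel(x, y, z):
    """Project a 3-D point in the radar frame onto the camera image."""
    xc, yc, zc = x + TX, y + TY, z + TZ   # radar frame -> camera frame
    u = FX * xc / zc + CX                 # pinhole projection with
    v = FY * yc / zc + CY                 # perspective division by depth
    return u, v

# A reflection 20 m ahead and 2 m to the right lands at this pixel:
print(radar_point_to_pixel(2.0, 0.0, 20.0))   # -> (1060.0, 550.0)
```

With the radar target mapped into pixel coordinates, its position and speed can be overlaid on the video frame or matched against camera detections.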

Liu Shengning said that front-end integration and integrated delivery will be the main forms of radar-vision fusion, for example:

At the feature layer, radar can supply more accurate feature-region information to the video, while video recognition can compensate for the radar's multipath effects;

At the data layer, large volumes of data and scenario-specific algorithms are used to train the radar and video AI models in tandem, adapting them to different business needs.

Lidar, for its part, scans the external environment and builds a three-dimensional spatial map from point clouds.

According to RoboSense, at the sensor hardware level, fusing lidar with a camera mainly means calibrating time and space, that is, spatially matching and temporally synchronizing the lidar's three-dimensional point cloud with the camera's two-dimensional color image.

"This is true for both the vehicle end and the road end; the difference is mainly in deployment location and quantity. Also, compared with the vehicle end, the road end adds multi-base-station fusion, that is, fusing the perception results of multiple base stations along a given stretch of road."
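
The "time calibration" step mentioned above can be sketched as nearest-timestamp pairing of lidar sweeps with camera frames. The function and threshold below are illustrative assumptions, not RoboSense's actual pipeline:

```python
import bisect

def match_frames(lidar_ts, camera_ts, max_gap=0.05):
    """Pair each lidar sweep with the closest camera frame in time.

    Both timestamp lists are in seconds and sorted ascending; pairs whose
    timestamps differ by more than max_gap are discarded as unusable.
    """
    pairs = []
    for t in lidar_ts:
        i = bisect.bisect_left(camera_ts, t)
        # The closest frame is either just before or just after t.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(camera_ts)]
        best = min(candidates, key=lambda j: abs(camera_ts[j] - t))
        if abs(camera_ts[best] - t) <= max_gap:
            pairs.append((t, camera_ts[best]))
    return pairs

lidar = [0.00, 0.10, 0.20]            # 10 Hz lidar sweeps
camera = [0.01, 0.045, 0.12, 0.19]    # camera frames with jitter
print(match_frames(lidar, camera))    # [(0.0, 0.01), (0.1, 0.12), (0.2, 0.19)]
```

Production systems typically use hardware triggering or PTP clock sync instead, but the matching logic when timestamps are all you have looks like this.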

Because the main technical routes differ, the real-world performance of different companies' radar-vision all-in-one units differs as well.

For example, since 2020 the city of Yantai has worked with Huawei, Baidu, and Keda on holographic intersection / smart intersection application trials, comparing the three companies' radar-vision fusion solutions in practice.

According to the introduction, Huawei's holographic intersection fuses radar data with video data to achieve preliminary high-precision, real-time collection of all road traffic elements, faithfully reconstructing the trajectories of pedestrians, non-motorized vehicles, and vehicles.

Speeding up accident compensation, accurately judging driving trajectories with AI algorithms, automatically detecting and warning of accidents to reduce secondary accidents and congestion, and cooperating with the intelligent signal-control back end to optimize signals: these are all functions Huawei's holographic intersections can deliver.

Keda's smart intersection, by contrast, focuses on two applications on the functional side:

First, presenting multiple video fusion methods; for instance, Keda fuses high-vantage video with ground-level video to overlay dynamic AR labels on the high-vantage view;

Second, emphasizing the value of real-time vehicle location data in traffic signal control; for example, Keda feeds real-time vehicle positions into traffic simulation to make short-term predictions of road-network operation and thereby optimize signal timing plans.

Sun Zhenxing, deputy section chief of the science and technology section of the Yantai Public Security Bureau traffic police detachment, said that overall, Huawei's holographic intersection emphasizes applying trajectory heat maps to traffic safety through full data collection; Baidu leans toward vehicle-road coordination and pays more attention to integrating Internet data; and Keda prefers visualization and real-time signal optimization.

The killer moves differ, and in practice so do the metrics that customers care about.

"Price, detection accuracy, and construction difficulty are all metrics users watch, but different customers emphasize different things depending on the actual application scenario; you cannot generalize." RoboSense told New Intelligent Driving that, for example, some customers also focus on the solution's functionality, some pay more attention to the completeness of the perception information, and some to the ability to apply that information.

Blurred boundaries: a long-running struggle

In many cases, for the sake of survival, the boundaries of the battlefield have long since blurred.

Under the smart-city trend, video IoT enterprises cross over into intelligent driving, intelligent driving enterprises move onto the traffic track, and Internet giants attack in every direction across the various segments; none of this is strange anymore.

For example, Hikvision's automotive electronics business focuses on intelligent driving: taking video sensors as the core and combining radar, AI, and perception-data analysis and processing, it aims to become a vehicle safety and intelligence product supplier centered on video technology.

To that end, Hikvision established Hikvision Automotive Technology as early as mid-2016, and by October of that year was already showing dashcams, smart rearview mirrors, in-vehicle surveillance cameras, and related accessories.

In addition, Hikvision invested in Nuctech Automotive Technology, founded Hikvision Automotive Software in mid-2017, and launched advanced driver-assistance systems and the APA+ automatic parking product in early 2018.

Notably, Hikvision has also invested in millimeter-wave radar startup Sensenstack, which was founded by radar expert Qin Yi and currently supplies automakers such as Hongqi, FAW, and South Korea's Hyundai.

Hikvision's own radar sensor R&D projects are also under construction and taken quite seriously.

In March this year, Hikvision broke ground on an 850-million-yuan science and technology park project in Shijiazhuang. According to the plan, Hikvision will build more than 20 specialized millimeter-wave and lidar laboratories in the park, invest an estimated 240 million yuan in production and research equipment, and develop more than 50 high-quality radar products covering unmanned driving, vehicle safety, and intelligent transportation.

Once operational, the Hikvision Shijiazhuang Science and Technology Park is expected to become one of China's largest commercial radar R&D and production bases, with the highest product market share.

Judging by results, Hikvision's automotive business is moving fast: data show that in 2020, its fully automatic parking products, based on fused vision and ultrasonic radar, won new design-in projects from multiple automakers.

In 2020, Hikvision Automotive Electronics' customers included SAIC passenger cars, Geely, Changan, and Great Wall, with more than 50 new mass-production projects and more than 60 new projects over the year, covering more than 40 vehicle models.

In fact, thanks to cloud-edge-end integrated systems, the end-to-end integration of traffic big data with IoT applications, and perception solutions centered on video technology, the video IoT players confidently entering intelligent driving also include Dahua, SenseTime, Orbbec, ArcSoft, Megvii, DeepGlint, Lu Shenshi, and a string of others besides Hikvision.

They have gathered in the field of intelligent driving, hunting from inside the cabin to outside it, from components to whole vehicles, from hardware to software, each playing to its own strengths, encircling and chasing the incumbent players on the intelligent driving track.

Take Leapmotor, incubated by Dahua: it has already launched three mass-production models, and according to its official figures, cumulative orders for 2021 reached 35,662 units as of August 31, 2021. Foreign media also report that Leapmotor is planning a Hong Kong IPO to raise at least $1 billion.

Great rivers to cross

When all these characters meet at the same intersection, it becomes a stormy battlefield. They compete on the same territory, blades drawn to open the way, the columns only just setting out from camp; whether a hundred schools will contend or a few champions will slug it out remains unknown.

Of course, the first beneficiary is vehicle-road collaboration as a whole. After all, the synergy of "smart cars" + "smart roads" is its most basic foundation, and a radar-vision fused roadside perception system can simultaneously support vehicle-road collaboration and highway autonomous driving applications working in concert.

RoboSense told New Intelligent Driving that with the backing of abundant, accurate real-time information, and with single-vehicle intelligence coordinated with roadside intelligence, intelligent transportation can make the global leap from single-vehicle intelligence to intelligent connectivity, toward the holistic vision of "smart car" + "smart road".

This can be seen in the changes in local governments' tender requirements over the past two years.

For example, the traffic signal control system project of the smart traffic initiative released by the Puyang municipal government procurement center this July will purchase 126 networked adaptive signal controllers, along with 120 radar-vision all-in-one units.

For the all-in-one units, the tender explicitly requires products that are primarily radar-based with video as a supplement, with no fewer than 4 million pixels and no fewer than 1,500 TV lines, so that radar target position and speed information can be overlaid on the video image.

Functionally, the units must perceive and compile statistics on traffic flow, lane speed, headway distance and gap, lane time occupancy, lane space occupancy, queue length, traffic state, and vehicle type, and must recognize conditions such as free flow, slow traffic, and congestion.
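
Two of those tendered metrics can be sketched from per-vehicle detection records. The function names and inputs below are illustrative assumptions, not taken from the tender:

```python
def time_occupancy(presence_intervals, period_s):
    """Lane time occupancy: share of the period during which any vehicle
    occupied the detection cross-section. Intervals are (start, end) seconds."""
    occupied = sum(end - start for start, end in presence_intervals)
    return occupied / period_s

def mean_headway(arrival_times):
    """Average front-to-front time gap between successive vehicles,
    from their sorted arrival times (seconds) at the cross-section."""
    gaps = [b - a for a, b in zip(arrival_times, arrival_times[1:])]
    return sum(gaps) / len(gaps)

# Two vehicles dwell on the detector for 1.5 s and 2.0 s of a 60 s period:
print(time_occupancy([(0.0, 1.5), (10.0, 12.0)], 60.0))  # ~0.058 (5.8%)
# Four vehicles arriving at 0, 4, 9, 15 s give a mean headway of 5 s:
print(mean_headway([0.0, 4.0, 9.0, 15.0]))               # 5.0
```

Space occupancy, queue length, and congestion-state classification build on the same per-vehicle position and timing data, which is exactly what the fused sensor is required to supply.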

A concrete application example is the "green wave band".

The so-called green wave coordinates the traffic light signals at successive intersections based on the calculated travel time between them, so that a vehicle keeps meeting green lights as it drives. This requires the traffic back-end system to know recent changes in motor-vehicle flow on the road, as well as real-time vehicle traffic data.
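
The travel-time calculation behind a green wave can be sketched in a few lines. This back-of-envelope model, assuming a uniform progression speed and a fixed common cycle, is an illustrative simplification rather than any vendor's algorithm:

```python
def green_wave_offsets(spacings_m, speed_kmh, cycle_s):
    """Offset (seconds into the common cycle) at which each successive
    intersection should start its green, so a platoon travelling at
    speed_kmh keeps hitting green lights.

    spacings_m: distances between consecutive intersections, in metres.
    """
    speed_ms = speed_kmh / 3.6
    offsets, elapsed = [0.0], 0.0
    for d in spacings_m:
        elapsed += d / speed_ms                      # travel time to next stop line
        offsets.append(round(elapsed % cycle_s, 1))  # wrap into the signal cycle
    return offsets

# Three intersections 500 m apart, 50 km/h progression speed, 90 s cycle:
print(green_wave_offsets([500, 500], 50, 90))   # [0.0, 36.0, 72.0]
```

Real coordination must additionally account for queues at each stop line, which is exactly the pain point described next.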

Mei Yu, chief engineer for Dahua's intelligent transportation algorithms, once told Leifeng that one pain point of two-way green wave optimization is that queue length is hard to determine: queues obstruct the vehicles in the green wave band, and if their impact is ignored, the green wave fails. "The oft-mentioned early-green coordination, mid-green coordination, and end-of-green coordination are all empirical treatments of the queuing problem."

Long-range collection of traffic information such as flow, queue length, and headway is precisely the biggest feature and advantage of radar-vision fusion products.

"Current vehicle-road collaboration technology requires the roadside to provide traffic information over a certain range so the vehicle or driver can make decisions in advance, and the radar-vision fusion solution can provide vehicles with all-weather, high-precision, beyond-line-of-sight, lane-level traffic information, effectively improving traffic safety and driving efficiency. Even single-vehicle autonomous driving becomes safer with the backing of the roadside perception system." So argues Liu Shengning.

For now, however, the radar-vision fusion path of the roadside players is not a smooth one.

The main factors affecting radar-vision fusion are the maturity of sensor technology, product stability, and data-acquisition accuracy, all of which need further improvement.

As Jiang Rongjun, general manager of Huiershi, said in an earlier interview, when a radar-vision all-in-one unit fuses at the pixel level, the radar data and video data run into synchronization problems: the two process data in different ways, which introduces a certain latency between the two streams.

There is also a trade-off in weighing the sensors' data against each other: when one data source goes wrong, which stream should the system choose to believe? Answering that requires improving the data-processing algorithms for the actual scene.

Chen Dong, science and technology section chief of the Haikou traffic police detachment, said that radar-video detection data can replace on-road detectors but cannot replace Internet floating-car data, and how to better fuse the two kinds of data remains an open problem.

Beyond that, how to use the all-in-one units alongside an intersection's original detection equipment, how to cut costs further, how to improve detection of stationary targets, and how to reduce the complexity of installation, commissioning, and later operation and maintenance are all problems awaiting solutions.

The war has only just begun, and great rivers remain to be crossed.

Who wins, who loses?

Here again, each faction has its own signature skills.

Liu Shengning pointed out, for example, that Internet companies have massive consumer-facing portals and relatively strong middle-platform and business-layer capabilities built on big data and cloud computing, but their hardware products and overall solutions still rely on their ecosystems to improve, while Hikvision, Dahua, and Uniview hold strong advantages in front-end data collection and analysis, fusion computing and storage, and scenario-based engineering.

Indeed, many current tenders for traffic-signal optimization services in some cities mention using platforms such as Didi, AutoNavi, Baidu, and smart traffic-management "brains" to monitor and predict traffic operations, so that congestion problems can be detected and resolved in time.

Take the green wave again: Baidu already provided its floating-car trajectory data to local traffic police back in 2018, helping them build a real-time signal monitoring and evaluation platform and then optimize the green wave.

As for radar, Baidu and Alibaba are themselves customers of RoboSense.

In vehicle-road collaborative lidar roadside perception, the advantages of automotive-grade radar suppliers such as RoboSense lie mainly in high-performance lidar hardware, deep technical accumulation in point-cloud perception software, and a large number of deployments across the country.

Uniview, Dahua, and their peers, meanwhile, have deep algorithmic accumulation at the roadside: for green wave optimization, these video IoT companies have long been in the game, and their systems and algorithms have been polished for years.

In fact, implementing the basic functions of a roadside perception solution is not the hard part. The real difficulty lies in a company's understanding of customer habits and in building out the complex ecosystem; the scenario algorithms and documentation accumulated over the years cannot be poured out overnight.

Of course, systematic radar-vision fusion products are still in continuously evolving pilots, and how to serve more intelligent transportation application scenarios still needs further exploration.

On further cutting the cost of the aforementioned all-in-one units, an industry insider told New Intelligent Driving that equipment cost depends on the supply chain and deployment scale; current costs are high mainly because applications have only just begun and lack scale effects. "As user recognition grows and manufacturers invest heavily, product costs should quickly come down to meet users' expectations."

As for installation and deployment, video is very mature and radar is fairly mature, so manufacturers need to focus on factors such as ease of installation and debugging when designing the fused units.

Summary

In fact, there is still a long way to go for radar-vision fusion technology in vehicle-road collaboration, and at some stage of development new fusion technologies may well emerge, such as multi-radar fusion or lidar-video fusion.

But as Liu Shengning put it, just as face recognition found successful application in recent years and ReID-based technology has since greatly improved the efficiency of public-security management, "if you pushed the clock back a few years, these outcomes would have been hard to foresee."

The wind rises at the tips of the green duckweed; the swords long crossed between the factions are now clashing at every turn, ringing out a crisp tremor.

(Cover image source: One Map Network)
