Is Level 4 Most Likely for Self-Driving Trucks?

Autonomous driving has long been a hot topic in the automotive field. How far has it come, and when will the era of fully autonomous driving arrive? This article offers some useful insights.

Over the past decade, technology and automotive experts have predicted the "imminent" arrival of fully autonomous vehicles that can drive on public roads without any active monitoring or input from human drivers. Elon Musk predicted that Tesla would be able to deliver fully autonomous vehicles by the end of 2021, having made similar predictions in 2017, 2019, and 2020. Each prediction failed to come true, largely because of real-world safety concerns, especially how self-driving cars behave under harsh or unusual conditions.

In an article recently published in Communications of the ACM, "Still Waiting for Self-Driving Cars," the authors give an overview of the current state of the various types of autonomous vehicles, the technical challenges autonomous driving faces, the solutions different automakers have adopted, and the changes still needed at the ethical and regulatory levels.

The current state of self-driving cars

Although Tesla released its Full Self-Driving (FSD) beta in October 2021, it must be acknowledged that fully autonomous vehicles have not yet arrived. Instead, most automakers offer systems covering the first three of the six levels of driving automation defined by SAE International (the Society of Automotive Engineers), which range from Level 0 (no driving automation) to Level 5 (full autonomy under all conditions).

Most new cars now ship with some Level 1 driver-assistance technology, including automatic emergency braking, lane-keeping assist, and adaptive cruise control. More advanced systems, such as Tesla's Autopilot or GM's Super Cruise, fall into Level 2, meaning the car can automatically control speed and steering but the driver must stay attentive and take over in an emergency. Other manufacturers, such as Honda and Audi, are focusing on Level 3 systems, which give the car full control, but only in very specific circumstances, such as low speeds, good weather, or pre-approved roads.
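As a rough mental model, the levels and their supervision requirements can be summarized in a few lines of code (a minimal sketch in Python; the one-line descriptions paraphrase SAE J3016 rather than quoting it):

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving-automation levels (paraphrased)."""
    L0 = 0  # No driving automation: the human does everything
    L1 = 1  # Driver assistance: steering OR speed (e.g., adaptive cruise)
    L2 = 2  # Partial automation: steering AND speed, driver must supervise
    L3 = 3  # Conditional automation: system drives in limited conditions,
            # but the driver must take over when requested
    L4 = 4  # High automation: no driver needed within a defined domain
            # (e.g., mapped highways in good weather)
    L5 = 5  # Full automation: drives anywhere, under all conditions

def driver_must_supervise(level: SAELevel) -> bool:
    """Levels 0-2 still require continuous human supervision."""
    return level <= SAELevel.L2
```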

Tesla Autopilot system. Image source: Tesla

Peter Hancock, Pegasus Professor and Provost's Distinguished Research Professor at the University of Central Florida, said many automakers are eager to move past Level 2 and Level 3. The most likely Level 4 application, however, is long-haul self-driving trucks. Hancock believes the development of self-driving trucks could be pushed forward more forcefully by the global shortage of truck drivers, at least on U.S. interstate highways. These roads are built to specific design standards, generally enjoy good traffic conditions, and have physical barriers separating opposing traffic.

In fact, autonomous-driving startup Aurora Innovation has announced that it is building a Level 4 self-driving system and plans to launch a self-driving truck business in 2023 and an autonomous ride-hailing business in 2024. The company told Communications of the ACM that it has partnered with FedEx, Uber, Toyota, and truck OEMs such as Volvo and PACCAR to build a partner ecosystem focused on bringing self-driving technology to market.

Self-driving trucks. Image source: Aurora

The technical challenges facing autonomous driving

There are signs that widespread adoption of autonomous driving is still years away, largely due to the challenges of developing precise sensors and cameras and improving the algorithms that process the data they capture. A self-driving car's onboard cameras and sensors must detect the physical world and the variety of objects the vehicle may encounter, such as road signs, traffic lights, other vehicles, pedestrians, lane markings, potholes, debris, blown-out truck tires, puddles, and more.

Image source: Pexels

Most systems take a bottom-up approach to training the car's navigation system, training it to recognize specific objects or conditions like those above. However, the variety of potential objects is enormous, and the ways objects move or respond to stimuli are nearly infinite: road signs may be misread because of lighting conditions, glare, or shadows, and animals and people react differently when facing an oncoming vehicle. All of this means the training process requires huge amounts of data.
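To make the "bottom-up" idea concrete, here is a deliberately tiny supervised-training sketch in Python/PyTorch. The random tensors stand in for labeled camera crops, and the four classes are placeholders; this is not any automaker's actual pipeline, just the shape of the approach:

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Placeholder data: 64x64 RGB camera crops, each labeled with one of
# four object classes (sign, pedestrian, vehicle, debris). In practice
# this would be millions of frames covering glare, shadow, rain, etc.
images = torch.randn(1024, 3, 64, 64)
labels = torch.randint(0, 4, (1024,))
loader = DataLoader(TensorDataset(images, labels), batch_size=32, shuffle=True)

# A toy classifier standing in for a production detection network.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 4),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(3):  # real training runs far longer, on far more data
    for x, y in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)  # penalize misclassified objects
        loss.backward()
        optimizer.step()
```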

The data is fed into AI training algorithms designed to help the vehicle interpret the objects and actions it encounters so it can safely adjust its speed and position, even on roads it has never driven or in situations it has never seen. However, today's algorithms still have difficulty identifying objects in real-world scenes. For example, in a fatal accident involving a Tesla Model S, the onboard camera failed to distinguish the white side of a tractor trailer against a bright sky.

Different car companies use different solutions

Many self-driving car accidents involve so-called edge cases, such as pedestrians or animals on the road, disruptive behavior by aggressive drivers, or drivers deliberately violating traffic laws. To address these challenges, researchers are working on high-definition (HD) mapping systems, which are far more accurate than GPS. They are also developing communication technologies that let vehicles interact with roadway infrastructure to help self-driving cars stay safe in such situations.
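A toy illustration of why an HD map helps: a raw GPS fix, which can be off by several meters, is snapped to the nearest point on a known lane centerline. The map data and error radius below are invented for illustration; real HD maps also encode lane width, curvature, signs, and more:

```python
import math

# Hypothetical lane centerline from an HD map, as (x, y) points
# in a local metric frame (values invented for this example).
lane_centerline = [(0.0, 0.0), (10.0, 0.2), (20.0, 0.5), (30.0, 1.1)]

def snap_to_lane(gps_xy, centerline, max_error_m=5.0):
    """Replace a noisy GPS fix with the closest centerline point,
    provided it lies within a plausible GPS error radius."""
    nearest = min(centerline, key=lambda p: math.dist(p, gps_xy))
    return nearest if math.dist(nearest, gps_xy) <= max_error_m else gps_xy

print(snap_to_lane((10.8, 1.9), lane_centerline))  # -> (10.0, 0.2)
```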

However, communication networks suffer from latency. The approach taken by the autonomous driving teams at Audi, Honda, Toyota, Volvo, and Aurora Innovation incorporates light detection and ranging technology, commonly referred to as lidar. Aurora says it has designed a proprietary sensor, FirstLight Lidar, that uses FMCW (frequency-modulated continuous wave) lidar to see a quarter of a mile (about 400 meters) ahead and instantly measure the speed of objects around the vehicle. This gives the autonomous driving system more time to brake or maneuver safely, which matters especially for heavy self-driving trucks.
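The speed measurement comes from the Doppler effect: the frequency of the returned light shifts in proportion to the target's radial velocity. The arithmetic below uses the generic FMCW relation and a typical 1550 nm lidar wavelength (an assumption, not Aurora's published specification):

```python
WAVELENGTH_M = 1550e-9  # assumed lidar wavelength; common for FMCW systems

def velocity_from_shift(doppler_shift_hz: float) -> float:
    """Radial speed (m/s) from the measured Doppler shift.
    The factor of 2 accounts for the light's round trip."""
    return doppler_shift_hz * WAVELENGTH_M / 2.0

def shift_from_velocity(radial_velocity_ms: float) -> float:
    """Inverse relation: expected Doppler shift (Hz) for a closing speed."""
    return 2.0 * radial_velocity_ms / WAVELENGTH_M

# A vehicle closing at 30 m/s (~108 km/h) shifts the return by ~38.7 MHz,
# which the sensor reads out directly, without frame-to-frame tracking.
print(f"{shift_from_velocity(30.0) / 1e6:.1f} MHz")
```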

Meanwhile, self-driving startup Waymo is focusing on ride-hailing services. The company says its Waymo Driver technology operates largely as a Level 4 system and relies on refined maps that include lane markings, traffic signs, traffic lights, curbs, and crosswalks. The system is built on more than 20 million miles of real-world driving and more than 20 billion miles of simulated driving, enabling the Waymo Driver to accurately predict what other drivers, pedestrians, or objects might do.

Image source: Waymo

A potential intermediate solution, currently being tested in Germany, is to have remote drivers control the vehicle. Berlin-based startup Vay has been testing a fleet of remotely driven electric vehicles in Berlin and plans to launch mobility services in Europe, and possibly the United States, this year. The service lets customers order a remotely driven car that takes them to their destination; when it arrives, the user gets out, and a human teledriver miles away parks the vehicle or directs it to the next customer. Vay says its systems are designed to meet the latest automotive safety and cybersecurity standards and use redundant hardware components and cellular network connections.

Industry observers are skeptical that such remotely operated vehicles are safe. "Latency and connectivity are a big issue, although they may be improved by new or more advanced communication technologies," said Gokul Krithivasan, global engineering manager for autonomous driving and functional safety at kVA by UL, a technology and management consultancy primarily engaged in autonomous vehicle safety and training, as well as the development of related safety standards.

While Krithivasan did not comment specifically on Vay's model or approach, he said the emergencies drivers face often require decisions within milliseconds, and any latency introduced by the network, even with redundancy, can make it hard for a fully remote driver to respond in time. "In a typical implementation of an SAE L4 autonomous driving application, the remote operator does not need to continuously control the vehicle, but instead performs or triggers the appropriate minimal-risk maneuver already configured in the automatic control logic," Krithivasan explains.
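One way to read Krithivasan's point is as a watchdog pattern: the vehicle never waits on the remote link in an emergency, and instead falls back to a pre-configured minimal-risk maneuver as soon as commands go stale. The thresholds and maneuvers below are illustrative assumptions, not values from any standard:

```python
from enum import Enum, auto

class Maneuver(Enum):
    CONTINUE = auto()
    SLOW_IN_LANE = auto()
    PULL_OVER = auto()  # the pre-configured minimal-risk maneuver

LATENCY_BUDGET_S = 0.15  # illustrative; real budgets are system-specific

def select_maneuver(last_command_age_s: float) -> Maneuver:
    """Fall back locally instead of waiting for the remote operator
    once the teleoperation link exceeds its latency budget."""
    if last_command_age_s <= LATENCY_BUDGET_S:
        return Maneuver.CONTINUE      # link healthy: follow remote input
    if last_command_age_s <= 2.0:
        return Maneuver.SLOW_IN_LANE  # brief dropout: degrade gracefully
    return Maneuver.PULL_OVER         # sustained loss: stop safely

print(select_maneuver(3.0))  # -> Maneuver.PULL_OVER
```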

Training a system to understand human behavior

However, for autonomous driving systems to operate safely in all driving scenarios, algorithms still need to be developed and tested to ensure that vehicle navigation systems can handle the interactions between different actors on the road, such as between pedestrians and drivers, or between drivers. Typically, when a pedestrian is about to cross or is crossing the street, the driver and the pedestrian make eye contact and use nonverbal cues to signal each other's direction and speed. When that eye contact is missing, an autonomous system should recognize that the pedestrian or other driver is not paying attention to the vehicle and take evasive action to avoid or mitigate a collision.
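The "missing eye contact" cue could in principle be encoded as a planning heuristic: a pedestrian whose gaze is not on the vehicle is treated as higher-risk and approached more conservatively. The sketch below is entirely hypothetical; reliably estimating gaze from a moving vehicle is itself an open perception problem:

```python
from dataclasses import dataclass

@dataclass
class Pedestrian:
    distance_m: float
    approaching_road: bool
    gaze_on_vehicle: bool  # output of a (hypothetical) gaze estimator

def target_speed_ms(current_speed_ms: float, ped: Pedestrian) -> float:
    """Slow down more aggressively when nonverbal cues are absent."""
    if not ped.approaching_road or ped.distance_m > 30.0:
        return current_speed_ms               # not a factor in the plan
    if ped.gaze_on_vehicle:
        return current_speed_ms * 0.7         # they see us: mild caution
    return min(current_speed_ms * 0.3, 3.0)   # no eye contact: near-stop

print(target_speed_ms(12.0, Pedestrian(8.0, True, False)))  # -> 3.0
```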

Hancock says we can train a system to recognize these cues, but it takes a great deal of computation and training time, and it can take years to develop a reliable, trustworthy system. A big related issue is trust: there is a large perceptual gap between how people judge humans and how they judge automation. We generally understand human driving accidents but are baffled by autonomous ones. When we see a human driving accident, we say, yes, I can understand how that happened. But when we look at a self-driving accident, we say, well, that's ridiculous; I don't know how the car could have made such a mistake.

Usually, human drivers accumulate enough experience to safely handle unreasonable or unexpected behavior by other drivers, typically by slowing down, pulling over, or simply maintaining their speed and direction so that people, animals, or other vehicles can maneuver around them.

Gustav Markkula, chair in applied behaviour modelling at the University of Leeds, UK, said: "Current autonomous driving algorithms do not have a sophisticated enough implicit understanding of human behavior to efficiently handle interactions in traffic." Humans on the road share that implicit understanding: a driver knows what pedestrians are doing, and pedestrians and drivers interact to keep themselves safe.

Regulatory challenges

The biggest obstacles to the commercialization of fully autonomous vehicles may be ethical and liability issues, such as which party is at fault if a self-driving car causes casualties or property damage. For years, the U.S. government declined to regulate assisted-driving systems such as Tesla's Autopilot and General Motors' Super Cruise.

But that is changing. In June 2021, the U.S. government said all automakers must report crashes involving driver-assistance systems. In August 2021, the National Highway Traffic Safety Administration (NHTSA) opened an investigation into a series of crashes in which Teslas using Autopilot rear-ended emergency vehicles. And in October 2021, the government appointed Missy Cummings, an engineering professor at Duke University who studies autonomous systems, as a senior safety advisor to NHTSA. Cummings has been critical of Tesla and of the federal government's handling of driver-assistance systems such as Autopilot.

While Cummings' appointment is unlikely to prompt immediate rulemaking, NHTSA's five-year guidelines clearly state that the agency has the authority to intervene if a self-driving system shows foreseeable evidence of unsafe use. Such evidence appears regularly in YouTube videos of drivers sleeping in the driver's seat, playing games, or engaging in other distracting activities, despite the warnings in Tesla's owner's manual.

A fully autonomous L5 driving system may take a decade or more to arrive, at least in terms of deployment to private owners and fleet vehicles. Technical issues, regulatory questions, and the persistent chip shortage are all obstacles to fully autonomous systems. Full autonomy is likely to be deployed first on commercial vehicles, including self-driving trucks, ride-hailing services, and shuttles. Besides having the funds to purchase such vehicles, commercial operators are more likely to limit operations to specific, well-known roads and to establish and enforce company-specific safety parameters for their autonomous fleets.

Reproduced from Machine Heart (机器之心). The views expressed are shared for exchange only and do not represent the position of this account.
