
Tesla FSD under review: driving onto tram tracks, missing a stop sign, and nearly hitting a pedestrian

Report by Heart of the Machine

Editors: Zhang Qian, Mayte

Tesla's Full Self-Driving is in trouble again.

For some time now, Tesla's Full Self-Driving (FSD) system seems to have had bugs exposed on a regular basis. The most recent case came during the Spring Festival holiday: Tesla recalled 53,822 vehicles because FSD's "rolling stop" feature violates U.S. traffic regulations. At some intersections marked with an "all-way stop" sign, the software would let cars roll through at low speed instead of coming to a complete stop.

At the time, Tesla said that as of Jan. 27 it had received no warranty claims, crashes, injuries, or fatalities related to the recall. Tesla CEO Elon Musk accordingly said on Twitter that there are "no safety issues" with the feature.

Tesla has since disabled FSD's "rolling stop" function, but the debate over the feature's safety continues, and more questions about FSD have begun to surface.

In a Feb. 10 report, The Washington Post said it had convened a panel of experts to analyze, frame by frame, driving videos uploaded by Tesla owners. The results showed that Tesla's FSD has fundamental weaknesses. Experts say these problems are not easy to solve, and patching one can introduce new complexity.

Some of the failures are almost comical. In one case, while making a right turn, the Tesla drove straight onto the tram tracks.


The videos they analyzed came from YouTube, and the FSD problems they documented include:

Crashing into bike lane bollards at 11 miles per hour;

Failing to stop and yield at a crosswalk that pedestrians were about to cross;

Stopping for pedestrians who were still far away;

Fighting the driver for control of the steering wheel;

Failing to recognize certain traffic signs.

Hitting a bollard


On a clear day, a Tesla running the FSD Beta turned right through an intersection in San Jose at 15 miles per hour. A bike lane runs along the inside of the road, and the car struck one of the bollards protecting it at 11 miles per hour.

"This problem is both a cartographic problem and a perceptual problem. As permanent guard posts rather than temporary cones, they should appear on the map," said Brad Templeton, who has long been working on the development of self-driving cars.

"As for why the FSD didn't perceive these gollies in time, perhaps because the shape and color of the balustrades are less common, the system didn't see them during training," said Templeton, a Tesla owner and fan.

Tesla's ultrasonic sensors are supposed to detect such hazards, but their placement (on the front bumper, among other spots) can be a weakness. Templeton says, "They may not see sparse, thin things like posts."

Almost hitting a pedestrian


The second example also occurred in San Jose. After turning right on a green light as usual, the Tesla nearly hit a pedestrian who was about to step onto a crosswalk. The pedestrian stopped short on seeing the Tesla, and the Tesla also slowed down, but by the time it did, it had already driven through most of the crosswalk.

After analyzing this video and other similar ones, The Washington Post's panel said FSD did not appear to recognize the markings at the crosswalk or to anticipate that a stationary pedestrian might step into the road. Professor Andrew Maynard, director of the Risk Innovation Lab at Arizona State University, said: "It is unclear whether the car is reacting to the presence of the pedestrian, but it is clear that the driver is frightened."

Hod Finkelstein, chief research and development officer at lidar technology company AEye, said he doesn't believe cameras alone are enough to detect pedestrian intent in every situation, because they are poor at measuring the distance of faraway objects and can be "blinded" by car headlights and the sun. Traditional self-driving car developers already combine cameras with lidar, conventional radar, and even ultrasonic sensors.
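To see why multiple modalities help, here is a minimal fusion sketch, a hypothetical illustration rather than any manufacturer's actual pipeline; the function name, inputs, and values are all invented. Each sensor covers another's blind spots, and a conservative planner can act on the most pessimistic reading:

```python
def fused_obstacle_distance_m(camera_m, lidar_m, radar_m):
    """Return the closest obstacle distance reported by any live sensor.
    None means that sensor produced no detection (blinded, out of range)."""
    readings = [d for d in (camera_m, lidar_m, radar_m) if d is not None]
    return min(readings) if readings else float("inf")

# Camera blinded by low sun, but lidar still sees the pedestrian at 8 m:
print(fused_obstacle_distance_m(None, 8.0, 12.5))  # 8.0
```

A camera-only stack has no second opinion to fall back on when glare or darkness defeats it, which is Finkelstein's point.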

Tesla's software combines machine learning with simpler hand-written "rules," such as "stop when you see a stop sign or a red light." But as one researcher pointed out, machine learning algorithms routinely learn things they shouldn't. For example, software told to "never hit a pedestrian" may instead learn that pedestrians who are about to be hit will get out of the way on their own.

Software developers can add a "rule" that the car must slow down or stop for pedestrians, but then the software could be paralyzed in a city full of people. Ultimately, this is a long-tail problem; the sketch below illustrates the trade-off.
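Here is a minimal sketch of that hybrid architecture, assuming nothing about Tesla's actual code: the class, the thresholds, and the placeholder policy are all invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Scene:
    pedestrian_in_path: bool     # any pedestrian in the planned path
    nearest_pedestrian_m: float  # distance to the closest one, meters
    stop_sign_visible: bool

def learned_policy(scene: Scene) -> float:
    """Stand-in for the learned planner: returns a target speed in m/s.
    A real model trained on human driving can pick up unintended
    shortcuts, e.g. that pedestrians usually yield to oncoming cars."""
    return 12.0  # placeholder output: "keep going"

def apply_safety_rules(scene: Scene, target_speed_ms: float) -> float:
    """Hand-written overrides layered on top of the learned output."""
    if scene.stop_sign_visible:
        return 0.0  # rule: stop at stop signs and red lights
    if scene.pedestrian_in_path and scene.nearest_pedestrian_m < 20.0:
        # The long-tail trade-off: fire this rule for every visible
        # pedestrian and the car is paralyzed downtown; tighten the
        # threshold and you get close calls like the San Jose video.
        return min(target_speed_ms, 2.0)
    return target_speed_ms

scene = Scene(pedestrian_in_path=True, nearest_pedestrian_m=15.0,
              stop_sign_visible=False)
print(apply_safety_rules(scene, learned_policy(scene)))  # 2.0
```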

Stopping for distant pedestrians

In another video, recorded by the same driver in early December, the Tesla stopped when a pedestrian crossed the street outside the crosswalk, and it began braking while the pedestrian was still far away. At that point, many human drivers would simply keep driving.


The video suggests that the Tesla may be programmed to slow down whenever a pedestrian is moving toward the roadway. But one expert suggested another possibility: the car might have stopped because of an optical illusion.

In the video, a red sign between the Tesla and the pedestrian lines up with a tree on the sidewalk, forming an image that resembles a stop sign. A video uploaded in February showed the same phenomenon, suggesting that this stop-sign illusion really does deceive the cars.

Fighting for control of the steering wheel

Of course, the Teslas in these videos rarely posed a real danger, because most of the time the owner took over in time when the system made the wrong choice. Tesla's official website states that when using Autopilot and FSD, drivers must "always keep their hands on the steering wheel" and always "maintain control and responsibility for the car."


But the takeover itself can go wrong. In another example, the same Tesla was passing a truck parked on a narrow street with vehicles parked on both sides. Unsure what to do, the software prompted the driver to take over, but at that moment the driver struggled to control the vehicle because the steering wheel was swinging violently from side to side.


In this situation, both the computer and the human were trying to steer the car through a narrow gap with little room to maneuver. In most cases, a driver takes over by jerking the steering wheel in the direction opposite to the one the software is steering. Here, with no sharp turn available, that movement was impossible, so it was unclear whether the car or the person was in control.
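One way to see why this case is tricky: driver-initiated takeover is often detected as steering torque applied against the software's commanded direction. That detection scheme is an assumption for illustration here, not a confirmed detail of Tesla's system, and the names and threshold below are invented.

```python
TAKEOVER_TORQUE_NM = 3.0  # illustrative threshold, not a real Tesla value

def driver_has_taken_over(commanded_dir: int, driver_torque_nm: float) -> bool:
    """commanded_dir: +1 if the software steers right, -1 if left.
    driver_torque_nm: signed torque the driver applies (+ = right)."""
    opposing = driver_torque_nm * commanded_dir < 0
    return opposing and abs(driver_torque_nm) >= TAKEOVER_TORQUE_NM

print(driver_has_taken_over(+1, -4.0))  # True: driver fights the software
print(driver_has_taken_over(+1, +4.0))  # False: same direction, ambiguous
```

Under any scheme like this, a driver who must steer the same way the software is steering has no clean signal to announce "I have the wheel," which is exactly the ambiguity in the video.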

Not recognizing a text-only stop sign

Another serious problem is that Tesla's cars don't respond to "Stop here on red" signs. Chris, a Michigan driver, posted a video in November in which he was forced to hit the brakes himself.


One self-driving researcher said the sheer variety of signs on U.S. roads will trouble Tesla's engineers. Unless the car's cameras can read the letters on a sign, the computer has to look for other clues, such as an arrow sign or a white line painted on the road. But those cues create problems in other situations, causing the car to stop incorrectly when it sees a similar line or arrow.
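A hypothetical sketch of that dilemma; the function, its inputs, and the cue handling are invented for illustration and are not Tesla's actual logic:

```python
def should_stop_here_on_red(sign_text, stop_line_seen, arrow_sign_seen):
    """Return True if the planner should hold at this position on red."""
    # Best case: the camera stack actually read the sign's words.
    if sign_text and "stop here on red" in sign_text.lower():
        return True
    # Fallback cues: painted lines and arrow signs sometimes mark a
    # stop position, but they also appear at crosswalks, merges, and
    # parking lots, so leaning on them causes spurious stops elsewhere.
    return stop_line_seen or arrow_sign_seen

# The Michigan video: no readable text and no usable fallback cue,
# so the car keeps going and the driver has to brake himself.
print(should_stop_here_on_red(None, False, False))  # False
```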

Many of Tesla's competitors use high-definition maps to determine where to stop and turn, but that strategy raises problems of its own, such as whether any map can keep up with an ever-changing road network.

So, after about a year of using FSD, Chris believes the cars are still a decade away from reliable autonomous driving.

Beyond the self-driving software, Tesla also recently recalled nearly 580,000 cars over a separate issue.

Pedestrian warnings drowned out: nearly 580,000 vehicles recalled

On Feb. 10, U.S. regulators said Tesla was recalling 578,607 cars in the U.S. because the cars' "Boombox" function can play music or other sounds so loudly that pedestrians might not hear the required warning sounds as a car approaches.

Tesla has issued 10 recalls in the United States over the past four months, including four in the last two weeks, and the National Highway Traffic Safety Administration (NHTSA) has placed the Texas-based company under increasing scrutiny. Tesla said it has found no crashes or injuries related to the pedestrian-alert problem behind the latest recall.

Tesla launched "Boombox" in December 2020, which allows drivers to change the car horn sound, car owners can customize other sounds to replace the normal car horn sound, and can also play the music of the car through external speakers, such as playing their favorite songs outside the car while driving.

The "Boombox" allows sounds to be played through external speakers while the vehicle is driving, which may mask the required pedestrian warning system sounds. The National Highway Traffic Safety Administration said it did not comply with federal motor vehicle safety standards for minimum volume for electric vehicles. As a result, Tesla is recalling the 2020-2022 Model S, Model X, Model S, and 2017-2022 Model 3 vehicles.

Electric vehicles are often harder to hear at low speeds than gasoline-powered cars. By congressional mandate, automakers must make electric vehicles emit a sound when traveling at up to 18.6 miles per hour (30 kilometers per hour) to prevent injuries to pedestrians, cyclists, and blind people. NHTSA says that at higher speeds, tire noise, wind resistance, and other factors make a separate alert unnecessary.

After the software update, Tesla will disable Boombox whenever the car is in Drive, Neutral, or Reverse. In fact, several of Tesla's recent recalls have been fixed through software, and several came shortly after NHTSA questioned vehicle features or fielded complaints. Regulators are currently investigating Tesla's Autopilot driver-assistance system and its in-car gaming feature.
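As a rough sketch of what the described fix implies, one could model the external speaker as a source selector that never lets Boombox preempt the mandated warning sound while the car is in gear. The names, the speed gate, and the selector logic below are assumptions for illustration, not Tesla's implementation:

```python
AVAS_MAX_SPEED_KMH = 30.0  # warning sound mandated up to ~18.6 mph

def external_speaker_source(gear: str, speed_kmh: float,
                            boombox_requested: bool) -> str:
    """Pick what the external speaker plays: 'avas', 'boombox', or 'silent'."""
    if gear in ("D", "N", "R"):
        # Boombox is blocked outright in these modes so it can never
        # mask the mandated pedestrian warning sound.
        return "avas" if speed_kmh <= AVAS_MAX_SPEED_KMH else "silent"
    return "boombox" if boombox_requested else "silent"

print(external_speaker_source("D", 10.0, True))  # 'avas'
print(external_speaker_source("P", 0.0, True))   # 'boombox': still allowed in Park
```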

Under pressure from NHTSA, Tesla agreed in January 2021 to recall 135,000 vehicles with potentially malfunctioning touchscreens. In this case, NHTSA took the unusual step of formally requesting a recall.

Tesla had tried to address the issue with OTA updates, but NHTSA said in early 2021 that the updates might be "procedurally and substantially inadequate."

According to Tesla, NHTSA sent out a request for information in January 2021, and in the months that followed, the two sides held several online meetings on the issue.

Tesla said that by September 2021, NHTSA had escalated its investigation into the matter. In October, Tesla defended the tests and rationale it had used to determine Boombox's compliance, but after a December meeting it finally agreed to the recall.

Reference Links:

https://www.washingtonpost.com/technology/2022/02/10/video-tesla-full-self-driving-beta/

https://www.reuters.com/business/autos-transportation/tesla-recalls-nearly-579000-us-vehicles-over-pedestrian-warning-risk-sounds-2022-02-10/
