
Tesla FSD suspected of dangerous use; California considers regulating its testing

On January 13, California regulators said the state is evaluating whether Tesla's self-driving tests should be regulated, after videos showed the technology being used dangerously and the federal government opened an investigation into Tesla accidents.

The California Department of Motor Vehicles (DMV) has said Tesla's "Full Self-Driving" (FSD) beta requires human intervention and is therefore not subject to its self-driving car regulations. However, in a letter on Friday to Lena Gonzalez, chair of the state Senate Transportation Committee, the agency wrote that "the DMV is revisiting this decision after a recent software update, video showing the technology being used dangerously, a public investigation by the National Highway Traffic Safety Administration (NHTSA) and hearing from other experts."


A Tesla car drives past its California factory

Tesla has not commented. The company has been expanding its rollout of the FSD "beta," advanced driver-assistance software that in effect lets untrained drivers test how well the technology works on public roads, raising safety concerns.
