
Lidar burned out the phone camera, what happened?


Author | Hong Zexin Editor | Wang Bo

Don't point your phone at a lidar: taking pictures or videos of an active lidar may permanently damage your phone's camera.

At the beginning of this month, a user posted on the social platform Xiaohongshu: "While photographing a NIO ET5 and a Galaxy at night, I noticed two glaring bright spots in the photos that were invisible to the naked eye, and found that the phone camera had been damaged by the lidar."

Earlier, another netizen reported that while filming NIO's new ES7, the camera of a Xiaomi 12S Ultra was damaged, with multiple horizontal green lines appearing in the image. The netizen noted that although the car was parked at the time, its lidar was still operating.

Going back further, at CES 2019, a Sony phone user's camera also showed a green line after filming AEye's lidar.

What all three incidents have in common is that the lidar involved used a 1550nm light source.

So, will 1550nm lidar burn out the phone camera?

ChatGPT-4 offers a conclusion:

[Screenshot: ChatGPT's answer]

But why exactly is this the case? We did some research.

Burned out phone camera

Existing 1550nm lidars have burned out phone cameras more than once.

In addition to the cases above, lidar industry insiders told HiEV that similar incidents have occurred during lidar companies' internal testing of 1550nm products.

But this is a small probability event.

The reason is easy to understand: worldwide, mass production of 1550nm automotive lidar began only recently. At present, Tudatong, which supplies NIO, is the main producer; other lidar companies use 905nm light sources.

Moreover, 1550nm lidar does not inevitably destroy a phone camera.

Before figuring out the whole thing, let's understand two things.

What is a phone camera?

The image sensor in a phone camera is typically a CMOS sensor, which converts optical signals into electrical signals. It is one of the most common sensors in digital imaging devices such as digital cameras, phone cameras, and surveillance cameras.

A CMOS sensor consists of millions of tiny photosensitive units (pixels), each capable of converting an optical signal into an electrical signal.

Below is a simple schematic of a CMOS image sensor:

[Figure: schematic of a CMOS image sensor]

As the figure shows, a CMOS sensor converts optical signals into electrical signals, and each pixel unit consists of three parts: a photosensitive element, a conversion circuit, and a readout circuit.

The photosensitive element is a photodiode or photocapacitor that generates an electric charge when light strikes it.

The conversion circuit converts that charge into a voltage signal, amplifying and shaping it in the process so that it is easier to measure.

The readout circuit then digitizes the voltage, generally reading out each pixel unit row by row and converting it into a digital value.

Finally, the digital signals are processed and encoded to produce a complete digital image.
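The three pixel stages above can be sketched as a toy model. This is purely illustrative, not a real sensor driver; the quantum efficiency, conversion gain, and ADC parameters are assumed round numbers:

```python
QUANTUM_EFFICIENCY = 0.6   # fraction of photons converted to electrons (assumed)
CONVERSION_GAIN = 50e-6    # volts per electron of charge (assumed)
ADC_BITS = 10              # readout resolution (assumed)
V_MAX = 1.0                # full-scale voltage of the readout ADC (assumed)

def photodiode(photons):
    """Photosensitive element: incident photons -> electrons of charge."""
    return photons * QUANTUM_EFFICIENCY

def convert(charge_electrons):
    """Conversion circuit: amplify the charge into a measurable voltage.

    The voltage clips at V_MAX, which is why an over-bright source
    (like a laser) simply saturates the pixel in normal operation.
    """
    return min(charge_electrons * CONVERSION_GAIN, V_MAX)

def readout(voltage):
    """Readout circuit: digitize the voltage into an ADC code."""
    return round(voltage / V_MAX * (2 ** ADC_BITS - 1))

def read_frame(photon_frame):
    """Read the sensor out row by row into a digital image."""
    return [[readout(convert(photodiode(p))) for p in row] for row in photon_frame]

# A tiny 2x2 "exposure": dim, bright, dark, and saturating pixels.
frame = read_frame([[1000, 20000], [0, 40000]])
```

The 40,000-photon pixel clips at the maximum ADC code (1023 here): the electronics saturate gracefully, whereas the laser damage discussed below is physical destruction of the pixel structures themselves.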

What is LiDAR?

Lidar can determine the distance, position, shape, and other information of the target object by emitting a laser pulse and then measuring the echo time and intensity of the pulse.

Lidar usually consists of lasers, scanners, detectors, processors, and other parts.

Below is a simple lidar schematic that can be used to illustrate the basic principles and components of lidar:

[Figure: lidar schematic]

Laser: A light source that emits laser pulses, generally using solid-state lasers, semiconductor lasers, etc.

Scanner: Controls the scanning range and speed of the laser beam, generally using a rotating mirror or microelectromechanical system (MEMS) scanner.

Receiver: Receives the laser signal reflected back from the target object, generally using photodiodes, photomultiplier tubes, etc.

Computer: Processes the received laser signals, measuring distance, position, speed, and other information, and generates a three-dimensional point cloud or image.

The laser emits a high-energy pulsed laser, the scanner controls the laser beam to scan within a certain range, the receiver receives the echo signal and measures the time and intensity of the echo, and the computer processes the data and generates a three-dimensional point cloud or image.

Lidar works because the laser beam travels through air at very nearly the speed of light in a vacuum, so the distance and position of objects can be measured with extremely high accuracy from the pulse's travel time.
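The time-of-flight principle reduces to one line of arithmetic: the measured round-trip delay of the echo, times the speed of light, divided by two (the pulse travels to the target and back):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def range_from_echo(round_trip_seconds):
    """Distance to the target from the measured echo delay."""
    return C * round_trip_seconds / 2.0

# A pulse that echoes back after ~667 nanoseconds came from ~100 m away.
d = range_from_echo(667e-9)
```

This is why lidar needs picosecond-to-nanosecond timing electronics: at the speed of light, one nanosecond of timing error corresponds to about 15 cm of range error.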

The laser wavelengths used in lidar generally range from hundreds of nanometers to several microns; common wavelengths include 905 nanometers and 1550 nanometers, chosen according to the application.

So, when the phone camera captures the lidar, will it really be burned out?

Under certain conditions, it will

Under normal circumstances, the beam emitted by the vehicle-mounted lidar will gradually attenuate during transmission, and the light intensity when it reaches the receiver is already small. In addition, modern lidar systems often employ measures such as filters to reduce interference and damage to photosensitive elements such as CMOS sensors.

Therefore, automotive lidar usually does not burn out CMOS sensors.

So, under what circumstances can lidar cause damage to CMOS sensors?

Both the intensity and the wavelength of a lidar beam affect what it does to an object, so both the beam's power and its wavelength must be considered.

Previous papers have conducted in-depth research on the irradiation damage mechanism and influencing factors of laser on CMOS sensors.

By studying the electrical properties and image quality of CMOS sensors under different irradiation conditions, the paper reached the following conclusions:

The laser wavelength has a significant effect on the irradiation damage of CMOS sensors, among which the damage caused by laser irradiation at 532 nm is the most significant, while the damage caused by laser irradiation at 1550 nm is relatively small.

The laser power has a very significant effect on the degree of irradiation damage of the CMOS sensor, and the higher the power, the more obvious the damage to the sensor. Of all the tests, laser irradiation at 1000 mJ/cm² produced the most severe damage.

The time of irradiation also has an impact on the degree of damage. Shorter periods of laser irradiation can cause temporary damage to the sensor, while longer irradiation can cause more severe and long-lasting damage to the sensor.

Laser irradiation affects the performance indicators such as cell charge transfer rate and charge collection efficiency in CMOS sensors, and also affects image quality indicators such as image noise and dynamic range.

[Figure: CMOS damage progression with increasing laser energy]

As the figure shows, with increasing laser energy, the CMOS exhibits in turn spot damage, line damage, and cross-line surface damage.

Analysis of the CMOS failure mechanism, combined with simulations of damaged CCDs, shows that:

The spot damage of CMOS is mainly caused by the damage of microlenses and filters, accompanied by the peeling and melting of some metal aluminum, and the failure of some unit pixels.

Line damage is mainly due to local circuit short circuit or open circuit resulting in signal transmission interruption;

As the CMOS temperature continues to rise, crystalline silicon melts, silica deforms and ruptures due to thermal stress, metal lines are seriously damaged, cross surface damage occurs, and CMOS fails in a large area.

Back to the 1550nm lidar incidents that opened this article.

One view is that a 1550nm beam striking a silicon pixel is not absorbed but transmitted straight through; that is, 1550nm light cannot burn out the CMOS itself, and a pure-silicon process would be unharmed. However, parts of the underlying CIS (CMOS image sensor) circuitry may be sensitive to 1550nm, producing a short-term thermal effect that burns out that part of the circuit.

Another view is that 1550nm lidar pulses carry higher power. The human eye has a lens containing a lot of water, which absorbs part of the energy and attenuates the light; some phone cameras, however, have no 1550nm filtering, so the light strikes the CMOS directly and causes damage.

A person close to Tudatong told HiEV that the cause has not yet been fully determined, but some influencing factors can already be identified: it is likely related to laser energy and transmittance.

The direction of improvement is basically there.

On the one hand, the energy of the laser pulse can be controlled through special operating modes. Before a thorough study and evaluation is completed, special modes could be developed for scenarios such as close following or standing still, in which the pulse energy is reduced.

On the other hand, the energy reaching the sensor also depends on transmittance, lens aperture size, and the coatings on the lens and filter. These three factors determine how much laser pulse energy reaches the CMOS. Among them, a filter coating is relatively cheap, making it the most practical option.

In general, to be safe, do not photograph a car equipped with a 1550nm lidar as it passes by; doing so may burn out your phone's camera.

Likewise, keep a safe distance of at least 10cm when viewing a lidar up close; any closer risks exceeding the safety threshold for the human eye.

Can lidar harm the human eye?

Lidar may burn out the phone camera, so will it burn out the eyes?

A high-power laser certainly can. However, there are no public reports of vehicle-mounted lidar damaging anyone's eyes.

In terms of power, it has been said that for the same laser type, single-photon detectors for 905nm lasers are more sensitive than those for 1550nm lasers, so 905nm systems can achieve long-distance detection at lower power.

By contrast, single-photon detectors for 1550nm lasers are less sensitive, but the eye-safety threshold at 1550nm is much higher (about 40 times that at 905nm), so 1550nm lidars generally achieve longer detection distances than 905nm systems by increasing the laser power.
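The range payoff of that higher power budget can be sketched with a back-of-envelope calculation. The numbers are illustrative assumptions: the article's ~40x eye-safety headroom, plus the common approximation that echo power from an extended diffuse target falls off roughly as 1/R², so maximum range scales roughly as the square root of transmit power:

```python
import math

def max_range_scale(power_ratio):
    """Relative detection range for a given relative transmit power,
    assuming received echo power falls off as ~1/R^2 (extended target)."""
    return math.sqrt(power_ratio)

# If 1550nm may legally emit ~40x the power of 905nm at the eye-safety
# limit, the range advantage is roughly sqrt(40), i.e. about 6x.
scale = max_range_scale(40)
```

Real systems deviate from this (atmospheric absorption at 1550nm, detector noise, target reflectivity), which is exactly the article's point: the comparison is meaningless without specifying the scenario.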

In specific applications, the power and wavelength of the laser need to be selected and adjusted according to the specific application scenarios and system parameters to achieve the best performance and effect.

Therefore, arguing about which of these two wavelengths uses "more power" without specifying the application scenario and system parameters is meaningless.

Regarding the site of eye damage, it has been said that excessive 905nm light damages the retina, because it penetrates the front of the eye and is absorbed mainly by retinal melanin, while excessive 1550nm light damages the cornea and lens, because it is absorbed by those tissues. In other words, at excessive doses the two wavelengths injure different parts of the eye.

This view is not necessarily correct.

In 2010, the Technical Institute of Physics and Chemistry of the Chinese Academy of Sciences published an article:

Lasers at visible and near-infrared wavelengths are weakly absorbed and highly transmitted by the eye's refractive media, while those media focus light strongly. When intense visible or near-infrared light enters the eye, it passes through the refractive media and is concentrated onto the retina.

At that point, the laser energy density and power density on the retina increase thousands or even tens of thousands of times; a large amount of light energy is concentrated on the retina in an instant, rapidly heating the photoreceptor cell layer so that the photoreceptor cells coagulate, degenerate, and lose their light-sensing function.
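The "thousands to tens of thousands of times" figure follows from simple geometry: the power density gain is the ratio of the pupil's area to the focused retinal spot's area. The pupil and spot diameters below are assumed order-of-magnitude values, not measurements:

```python
PUPIL_DIAMETER_M = 7e-3    # dark-adapted pupil, ~7 mm (assumed)
SPOT_DIAMETER_M = 30e-6    # focused retinal spot, ~30 µm (assumed)

# Power density gain = area ratio = (diameter ratio) squared.
gain = (PUPIL_DIAMETER_M / SPOT_DIAMETER_M) ** 2
```

With these assumptions the gain comes out in the tens of thousands, consistent with the figure quoted above; a camera lens focusing a collimated beam onto a CMOS pixel concentrates energy in the same way.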

Far-infrared laser damage to the eye mainly affects the cornea, because almost all light at these wavelengths is absorbed there. Corneal injury is therefore the most serious, mainly causing keratitis and conjunctivitis; patients experience eye pain, a foreign-body sensation, photophobia, tearing, ocular congestion, vision loss, and so on.

According to ASTM (the American Society for Testing and Materials), visible light spans wavelengths of 380 to 780 nanometers, near-infrared light 780 to 2500 nanometers, and far-infrared light 2500 nanometers to 1 millimeter.

Both 905nm and 1550nm are near-infrared, and no academic papers were found comparing their effects on the eye.

Because application scenarios and needs differ, lidar manufacturers select lasers of different wavelengths as their core technology.

They will exaggerate the merits of their products in publicity and marketing, and even denigrate the products of their competitors.

This is why the internet is full of debates about the advantages and disadvantages of 905nm versus 1550nm lasers, many of which draw conclusions without considering the application scenarios and needs.
