
Xinhua Viewpoint | "Face-swapping" fraud cases reported in many places: a string of realistic videos turn out to be fake...

A video or a voice clip may not have been filmed or recorded by a real person at all. Behind a mobile app you have never heard of, at a payment interface or an access-control gate, someone may be stealing your face... Since last year, "face-swapping" scams have surfaced in many places.

The "Xinhua Viewpoint" reporter's investigation found that with the rapid development of deep synthesis technology and the surge in landing scenes, some lawbreakers took advantage of the opportunity to make profits. The abuse of synthetic technologies such as audio and video poses a challenge to the protection of sensitive personal information such as faces, voiceprints, and fingerprints.


A synthesized dynamic video costs 2 to 10 yuan and can be used to register phone cards and payment accounts

Recently, a Mr. Chen reported to the Xianyan Police Station of the Ouhai Branch of the Wenzhou Municipal Public Security Bureau in Zhejiang Province that he had been defrauded of nearly 50,000 yuan by a "friend". Police verified that scammers had taken facial footage from a video that Mr. Chen's friend A Cheng had previously posted on a social platform and used AI face-swapping technology to impersonate him, thereby defrauding Mr. Chen.

In April 2021, as part of the Ministry of Public Security's "Clean Net 2021" special operation, police in Hefei, Anhui Province, broke up a criminal gang that used artificial intelligence to forge dynamic videos of other people's faces and provided technical support, such as registering mobile phone cards, for the black and gray market industry chain.

When police made the arrests, several suspects were using computers to generate dynamic face videos from still photos. The simulated figures could not only nod and shake their heads but also produce rich expressions such as blinking, opening their mouths, and frowning, with extremely realistic results.

On the suspects' computers, police found more than a dozen gigabytes of citizens' facial data, with face photos and ID photos sorted into folders. "Front and back photos of an ID card, a photo of the person holding the ID card, a selfie and so on make up one set." According to police, a complete set of photos is called "material", and the people who sell them are called "material merchants". These "materials" have changed hands many times online, and their owners are completely unaware.

A suspect surnamed Ma confessed that because production is so simple, a single video sells for only 2 to 10 yuan, but "customers" often buy hundreds of thousands at a time, leaving huge profit margins.

In recent years, similar cases have occurred in Zhejiang, Jiangsu, Henan and elsewhere. A criminal ruling by the Quzhou Intermediate People's Court in Zhejiang Province disclosed that Zhang, Yu and others used technical means to defeat Alipay's facial recognition authentication and registered Alipay accounts with citizens' personal information, illegally profiting by tens of thousands of yuan.

The modus operandi in these cases is much the same: criminals illegally obtain photos of others or buy "materials" such as other people's voice recordings, and from only a small amount of audio and video sample data they can synthesize forgeries that pass for the real person. These forgeries can be used to carry out targeted fraud and endanger others' personal and property safety, or the criminals may sell face-swapped indecent videos or spread them maliciously, damaging the reputation of the people whose likenesses were used.

Network "selling" synthetic software tutorials Behind the risks are technical loopholes and governance shortcomings

According to Wang Xiangrui, a police officer in the Cyber Security Brigade of the Baohe Branch of the Hefei Municipal Public Security Bureau, the eight suspects in the case above were mostly unemployed, and some had not even finished high school. They bought tutorials online, downloaded the software accordingly, and "taught themselves" over a few months.

The reporter contacted a seller offering related tutorials online. According to the seller, a full set of software and tutorials costs either 400 yuan or 800 yuan, the 800-yuan version being the high-end one with "a super-high face-verification success rate". In a demonstration video, the reporter saw that after a photo was uploaded to the software, the positions of the facial features were marked, the script parameters were adjusted, and the face began to move. "The facial-feature parameters come with the tutorial; you can just copy them." Reportedly, these fake videos not only pass verification at a high rate but are also hard for human reviewers to tell from the real thing.
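The "marking of facial feature positions" shown in the demo is, in essence, facial landmark detection, a step that open-source libraries already expose off the shelf. As a rough illustration of how low the entry barrier for this first step is (this is not the seller's software, and the file names are placeholders), the following Python sketch uses Google's MediaPipe Face Mesh to locate and draw facial landmarks on a still photo:

```python
# A minimal sketch of the "mark facial feature positions" step, using the
# open-source MediaPipe Face Mesh model. File names are placeholders; this
# is illustrative only and is not the software described in the article.
import cv2
import mediapipe as mp

image = cv2.imread("portrait.jpg")           # hypothetical input photo
rgb = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)

with mp.solutions.face_mesh.FaceMesh(
        static_image_mode=True,              # single still photo, no tracking
        max_num_faces=1,
        refine_landmarks=True) as face_mesh:
    results = face_mesh.process(rgb)

if results.multi_face_landmarks:
    h, w = image.shape[:2]
    # Draw each detected landmark point (roughly 468 of them) onto the photo.
    for lm in results.multi_face_landmarks[0].landmark:
        cv2.circle(image, (int(lm.x * w), int(lm.y * h)), 1, (0, 255, 0), -1)
    cv2.imwrite("landmarks.jpg", image)
```

Animating the face is then a matter of feeding such landmarks, together with motion parameters like those "sent with the tutorial", into a generative model; the sketch only shows how readily available the building blocks are.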

"At present, the public has been vigilant about the easy tampering of static information such as photos, but there is still a high degree of trust in dynamic information content such as video and sound." Zhu Jun, director of the Basic Theory Research Center of tsinghua University's Institute of Artificial Intelligence, said that the rapid evolution of deep synthesis technology has made it "no longer true to see", and the difficulty of cracking identity verification will become lower and lower, and the time consumption will become shorter and shorter.

Experts worry that although identification technology for deep synthesis keeps iterating and detection methods keep improving, they have not been able to keep pace with upgrades in forgery techniques. Ren Kui, dean of the School of Cyberspace Security at Zhejiang University, said that as the threshold for applying synthesis technology falls further, synthetic content is blurring the line between real and fake.

Tian Tian, executive director of the Security Innovation Center of the Beijing Zhiyuan Artificial Intelligence Research Institute, believes that detecting deepfakes is becoming increasingly difficult: new forgery methods emerge in an endless stream, the network transmission environment is increasingly complex, and detection algorithms themselves have loopholes.
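One concrete example of this cat-and-mouse dynamic is blink detection. Early face forgeries, trained mostly on open-eyed photos, rarely blinked, so a simple screening heuristic could flag them; newer forgeries synthesize blinks and slip past it. The Python sketch below (again using MediaPipe, with an assumed threshold and a placeholder video path) shows how such a now partly obsolete check works:

```python
# Illustrative sketch of a classic, now partly obsolete, deepfake screening
# cue: unnatural blinking. Threshold and video path are assumptions.
import cv2
import mediapipe as mp

# MediaPipe face-mesh landmark indices around the left eye (outer corner,
# two upper-lid points, inner corner, two lower-lid points).
LEFT_EYE = [33, 160, 158, 133, 153, 144]

def eye_aspect_ratio(pts):
    """Ratio of eyelid opening to eye width; drops sharply during a blink."""
    d = lambda a, b: ((a.x - b.x) ** 2 + (a.y - b.y) ** 2) ** 0.5
    p1, p2, p3, p4, p5, p6 = pts
    return (d(p2, p6) + d(p3, p5)) / (2.0 * d(p1, p4))

cap = cv2.VideoCapture("suspect_clip.mp4")    # hypothetical input video
blinks, closed, frames = 0, False, 0
with mp.solutions.face_mesh.FaceMesh(max_num_faces=1) as mesh:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frames += 1
        res = mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if not res.multi_face_landmarks:
            continue
        lm = res.multi_face_landmarks[0].landmark
        ear = eye_aspect_ratio([lm[i] for i in LEFT_EYE])
        if ear < 0.2 and not closed:          # eye just closed (assumed threshold)
            blinks, closed = blinks + 1, True
        elif ear >= 0.2:
            closed = False
cap.release()
print(f"{blinks} blinks in {frames} frames")  # real footage blinks every few seconds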

The relative lag in legal provisions also leaves openings for lawbreakers to exploit. Chen Jihong, a partner at Zhong Lun Law Firm, said that current law prohibits using information technology to forge images or otherwise infringe others' portrait rights, but there are no specific provisions on what counts as fair use of the technology or in which situations its use should be prohibited. Likewise, for acts such as collecting or obtaining personal voiceprints and photos, or using personal biometric information such as faces, fingerprints, DNA and irises, the law does not yet make clear within what scope they constitute a crime or what punishment they carry; further guidance from judicial rulings is needed.

Curb the abuse of synthesis technology so the public no longer has to worry about their "face"

It is the public's shared expectation that sensitive information such as faces, fingerprints, and voiceprints be protected, so that people no longer have to worry about their data "running naked" and damaging their privacy, property, or reputation.

The Opinions on Strengthening the Ethical Governance of Science and Technology, China's first national-level guiding document on the subject, was recently issued, underscoring the importance and urgency of the ethical governance of technology. This year's Supreme People's Court work report also mentioned the security of personal information, including facial data, multiple times.

Chen Jihong said that combating "face-swap" fraud requires regulation on several fronts: defining the boundaries of lawful use of the technology, establishing security assessment procedures for it, and legally sanctioning its abuse, so as to raise the cost of breaking the law.

Wu Hequan, an academician of the Chinese Academy of Engineering and an information technology expert, proposed that the abuse of deep-synthesis technology should be countered with technology itself, using continuous innovation and adversarial testing to improve and iterate detection techniques.

Beyond technical countermeasures, risk management of technology abuse should be made more systematic. "We need to establish quality standards for data sets, classify and grade the risks of relevant technologies by application scenario, and clarify the responsibilities of design and development units, operation and maintenance units, and data providers," said Qiu Huijun, deputy chief engineer of the National Industrial Information Security Development Research Center.

Experts remind the public that, faced with ever-changing "face-swap" scams, they should raise their guard: do not casually hand over biometric information such as faces and fingerprints, and do not over-share animated images, videos and the like; before transferring money online, verify the other party's identity through multiple channels such as phone and video calls. If a risk is discovered, call the police promptly. (Reporters Zhang Manzi, Zhang Chao, Chen Nuo)