The widely publicized "Room N" incident in South Korea has seen new developments. According to South Korean media reports, police discovered a number of "pornographic face-swapping" chat rooms during their investigation, containing large quantities of face-swapped photos and videos of Korean female idols.
Nandu reporters found that this is only the tip of the iceberg of the porn-related black and gray market. Today, a variety of AI technologies, face swapping among them, have become deeply embedded in the pornography industry.

Photos and videos in "Room N". Image source: internet
Police investigating "Room N"
uncover dedicated pornographic face-swapping chat rooms
According to the South Korean portal Naver, while investigating the "Room N" incident, police found four chat rooms on the platform involved that were dedicated to "adult face-swap synthesis"; the targets of the face swaps were all Korean female celebrities.
Unlike "Room N", which was joined by invitation, these dedicated chat rooms could only be accessed by entering a complicated registration address. In one face-swapping room devoted to female idol singers, more than 2,000 members had uploaded face-swapped photos and videos.
This is not the first time pornographic face-swapped videos have been exposed. In fact, Deepfakes, the technology behind AI face swapping, was put to exactly this use from the moment it was created: in videos uploaded by its developer, the faces of adult-film actresses were replaced with those of various female stars, and the results looked strikingly natural. It was through such videos that AI face-swapping technology gradually entered the public eye.
In the nearly three years since its emergence in 2017, AI face-swapping technology has become a weapon of the pornography industry. According to an earlier Nandu investigation, some foreign adult websites have introduced "face-swap sections" containing large numbers of pornographic videos themed on female celebrities from various countries.
Last year, the cybersecurity firm Deeptrace released a report on porn-related face swapping. It found that between December 2018 and July 2019, the number of AI face-swapped videos online nearly doubled, approaching 15,000, and that pornographic videos accounted for as much as 96% of them.
Notably, the predicament of Korean female idols was already visible in Deeptrace's report: among the ten most-viewed female celebrities, Korean idol singers held three spots, with 404 pornographic face-swapped videos drawing a combined 13.4 million views.
Korean idol singers have become the main victims. Image: AI Outpost
That Korean female idols have become the hardest-hit targets of face swapping is closely tied to the status of women in South Korea and to its popular culture.
Xi Lin, a doctoral student at Korea University's Graduate School of International Studies, pointed out in his dissertation that Hallyu culture tends to "objectify" female idols: on average, the more closely people follow Hallyu activities, the less they endorse gender equality.
Hye Jin Lee, an assistant professor of practice at the University of Southern California's Annenberg School for Communication and Journalism who studies Hallyu and global culture, told foreign media she suspected that "anti-fans" of Korean female idols were making the pornographic face-swapped videos. For idols, reputation and image are everything; defiling that image through pornographic face swaps gives anti-fans exactly the satisfaction they seek.
After the "Room N" incident, a staff member at the agency of one female idol singer told Korean media, helplessly, that content circulates online so rapidly that producers and distributors are hard to trace. Even knowing that pornographic face-swapped photos of their artist are spreading on the internet, the agency can only "let it go". "If we responded forcefully, the singer's image would also be damaged, which would be a headache in many ways," he said.
The much-publicized "Room N" incident in South Korea has made new progress. According to South Korean media sources, the police found a number of "pornographic face-changing" chat rooms during the investigation process, which contained a large number of face-changing photos and videos of Korean female idols.
Nandu reporters investigated and found that this is just the tip of the iceberg of pornography black ash production. Nowadays, a variety of AI technologies, including face swapping, have been deeply applied to the pornography-related industry.
Photos and videos in "Room N". Figure from the network
Police investigate Room N
Pull out the special chat room for pornographic face swapping
According to the South Korean website Naver, when the police investigated the "Room N" incident, they found four special chat rooms for "adult face-changing compounds" on the platform involved, and the face-changing objects were all Korean female artists.
Unlike "Room N", which is joined by invitation, these dedicated chat rooms need to be accessed by entering a complex registered address. In a face-changing room themed after female idol singers, more than 2,000 members uploaded face-changing photos and videos.
This is not the first time that a pornographic face-changing video has been exposed. In fact, the technology behind AI face swapping, Deepfakes, was used by developers for face swaps at the beginning of its birth: in the videos uploaded by developers, the faces of adult actresses were replaced by various female stars, and the effect was extremely natural. AI face-changing technology has gradually entered the public eye.
From its birth in 2017 to the present, after nearly three years of development, AI face-changing technology has become a weapon for the pornography industry. According to a previous investigation by Nandu reporters, some foreign adult websites have appeared in the "face changing area", which includes a large number of pornographic videos with the theme of female artists from various countries.
Last year, cybersecurity firm Deeptrace released a report on face-swap videos. It found that from December 2018 to July 2019, the number of AI face-swap videos online nearly doubled, approaching 15,000, and that pornographic videos accounted for as much as 96% of them.
Notably, the predicament of Korean female idols was already visible in Deeptrace's data: among the ten female celebrities whose videos drew the most views, Korean idol singers held three places, with 404 pornographic face-swap videos between them and a combined 13.4 million views.
Korean idol singers are among the main victims. Image: AI Outpost
That Korean female idols have become the primary targets of face-swapping is closely tied to the status of women in South Korea and to its popular culture.
Xi Lin, a doctoral student at Korea University's Graduate School of International Studies, noted in a thesis that Hallyu culture tends to "objectify" female idols: on average, the more attention people paid to Hallyu activities, the less they endorsed gender equality.
Hye Jin Lee, an assistant professor of practice at the University of Southern California's Annenberg School for Communication and Journalism who studies Hallyu and global culture, told foreign media she suspected that anti-fans ("black fans") of Korean female idols were behind some of the pornographic face-swap videos. For idols, reputation and image are everything; defiling that image through pornographic face-swaps gives anti-fans exactly the satisfaction they seek.
During the "Room N" scandal, a staff member at the agency of one female idol singer told Korean media helplessly that content circulates online so quickly that producers and distributors are hard to pin down. So even when the agency knows pornographic face-swapped photos of its artist are spreading, it can only let them go. "If we respond aggressively, the singer's image suffers too. It's a headache in many ways," he said.
Not just face swaps: AI is now deeply embedded in the industry
In the dark undercurrent of AI face-swapping, women have become the biggest victims.
The contrast is telling: according to Deeptrace, many male celebrities, entrepreneurs and even politicians, such as Facebook founder Mark Zuckerberg and US President Donald Trump, have been made into face-swap videos, but as entertainment. In pornographic face-swap videos, the subjects are, without exception, women.
Women appear in only 39% of non-pornographic face-swap videos, but in 100% of pornographic ones. Image: AI Outpost
These face-swapped videos are viewed and commented on by tens of thousands of people, while the women depicted may know nothing about them. Even ordinary women who are not celebrities have had photos of their daily lives, posted to social networks, stolen and exploited.
And face-swapping is not the only AI technology deeply embedded in the industry. Advances in image-recognition and image-generation algorithms have given AI the ability to fill in missing parts of a picture, giving rise to so-called "undressing" and "mosaic-removal" tools.
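The underlying technique here is image inpainting: predicting the content of a masked region from its surrounding context. The abusive tools reported on use learned generative models, but the basic idea can be illustrated with a harmless, minimal sketch in which missing pixels are iteratively replaced by the average of their neighbours (all names below are illustrative, not taken from any real tool):

```python
# Minimal diffusion-style inpainting sketch: unknown pixels in a masked
# region are repeatedly set to the mean of their known neighbours until
# the values settle. Real inpainting systems use trained generative
# models, but the principle -- filling a hole from surrounding context --
# is the same.

def inpaint(image, mask, iterations=200):
    """image: 2D list of floats; mask: 2D list of bools, True = missing."""
    h, w = len(image), len(image[0])
    # Initialise missing pixels to 0; they get filled from the boundary inward.
    out = [[0.0 if mask[y][x] else image[y][x] for x in range(w)]
           for y in range(h)]
    for _ in range(iterations):
        for y in range(h):
            for x in range(w):
                if mask[y][x]:
                    neighbours = [out[ny][nx]
                                  for ny, nx in ((y - 1, x), (y + 1, x),
                                                 (y, x - 1), (y, x + 1))
                                  if 0 <= ny < h and 0 <= nx < w]
                    out[y][x] = sum(neighbours) / len(neighbours)
    return out

# A flat grey image with one missing pixel is recovered from its context.
img = [[0.5] * 5 for _ in range(5)]
msk = [[False] * 5 for _ in range(5)]
msk[2][2] = True
filled = inpaint(img, msk)
print(round(filled[2][2], 3))  # converges to 0.5
```

On smooth regions this simple averaging already looks plausible, which is why context-filling approaches are convincing enough to be abused once paired with powerful learned models.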
In late June last year, an AI application called "DeepNude" appeared online. Built on such algorithms, it could identify and "remove" the clothing of the person in a photo, turning an ordinary picture into a fake nude. Amid the ensuing controversy, the software was pulled from its official site within days of launch. Yet Nandu reporters found that copies downloaded before the takedown still circulate quietly.
The "de-mosaicing" technique, which removes the pixelation in an image, is now fairly mature, and a large number of so-called "uncensored" videos circulate online as a result. In August last year, an "uncensored version" of a video featuring popular Japanese actress Yua Mikami sparked heated discussion. Mikami responded publicly on Instagram: "The leak of the uncensored video has become a topic of discussion, but sorry, this is an AI computer synthesis."
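Mikami's point, that the "uncensored" footage is synthesized, follows from how mosaicing works: each block of pixels is collapsed to a single average, so many different originals produce the same pixelated result and no genuine inverse exists. Any "de-mosaiced" detail is therefore invented by a model, not recovered. A short sketch (purely illustrative) of why the operation is not invertible:

```python
import numpy as np

def pixelate(img, block=4):
    """Mosaic a grayscale image: replace each block x block tile
    with that tile's mean value.

    Because averaging discards within-tile detail, distinct images can
    map to the identical mosaic; 'de-mosaicing' tools can only
    synthesize a plausible original, never recover the real one.
    Assumes image dimensions are divisible by `block`.
    """
    h, w = img.shape
    tiles = img.reshape(h // block, block, w // block, block)
    means = tiles.mean(axis=(1, 3), keepdims=True)
    return np.broadcast_to(means, tiles.shape).reshape(h, w)
```

Two different inputs with the same per-tile averages yield byte-identical mosaics, which is exactly the information loss that makes a true "decode" impossible.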
Yua Mikami's public response. Image from Instagram.
Notably, from Deepfakes to today's "stripping" and "de-mosaicing" tools, the developers involved have stayed hidden behind the network, their real identities a mystery. Because the software and tools they release are open source, they have circulated widely and greatly lowered the technical barrier to entry: even ordinary people with no grounding in artificial intelligence can quickly learn to use them from tutorials and instructions.
Around the world, women targeted by AI-generated pornography face a common predicament: although portrait rights and reputation rights in principle offer avenues for redress, the anonymity and fluidity of the internet often leave them with no practical way to defend those rights.
Since last year, regulators in various countries have begun to explore, hoping to find better ways to regulate.
The much-publicized "Room N" incident in South Korea has made new progress. According to South Korean media sources, the police found a number of "pornographic face-changing" chat rooms during the investigation process, which contained a large number of face-changing photos and videos of Korean female idols.
Nandu reporters investigated and found that this is just the tip of the iceberg of pornography black ash production. Nowadays, a variety of AI technologies, including face swapping, have been deeply applied to the pornography-related industry.
Photos and videos in "Room N". Figure from the network
Police investigating "Room N"
uncover chat rooms dedicated to pornographic face swaps
According to the South Korean portal Naver, while investigating the "Room N" case, police found four chat rooms on the platform involved that were dedicated to "adult face-swap composites". The faces being swapped in were all those of Korean female artists.
Unlike "Room N", which could only be joined by invitation, these dedicated chat rooms were accessed by entering a long registration address. In one room devoted to female idol singers, more than 2,000 members had uploaded face-swapped photos and videos.
This is not the first time pornographic face-swap videos have surfaced. In fact Deepfakes, the technology behind AI face swapping, was used for exactly this purpose from the moment it appeared: in the videos its developer uploaded, the faces of adult-film actresses were replaced with those of various female celebrities, and the effect was strikingly natural. That is how AI face swapping first entered the public eye.
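The Deepfakes approach is commonly described as a pair of autoencoders that share a single encoder: the encoder learns an identity-agnostic representation of a face, while each decoder learns to render one specific person. Swapping then means encoding person A's frame and decoding it with person B's decoder. The following is only a toy numpy sketch of that data flow under stated assumptions: the "networks" here are untrained random linear maps, and all variable names are illustrative, not from any real implementation.

```python
import numpy as np

# Toy sketch of the shared-encoder / two-decoder idea behind face swapping.
# A real system trains deep convolutional networks on thousands of face crops;
# here the "networks" are random linear maps, enough to show the data flow.

rng = np.random.default_rng(0)

DIM, LATENT = 64, 8                           # flattened face size, latent size
W_enc = rng.standard_normal((LATENT, DIM))    # shared encoder (trained on both people)
W_dec_a = rng.standard_normal((DIM, LATENT))  # decoder trained only on person A
W_dec_b = rng.standard_normal((DIM, LATENT))  # decoder trained only on person B

def encode(face):
    # Maps a face to an identity-agnostic representation (pose, expression).
    return W_enc @ face

def decode_as_b(latent):
    # Re-renders that representation with person B's appearance.
    return W_dec_b @ latent

face_a = rng.standard_normal(DIM)             # stand-in for one frame of person A
swapped = decode_as_b(encode(face_a))         # "A's expression, B's face"
print(swapped.shape)
```

Because only the decoders are identity-specific, a forger needs nothing more than enough footage of the two faces; that is why the technique spread so quickly once the training code was published.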
In the nearly three years since its emergence in 2017, AI face swapping has become a weapon of the pornography industry. A previous Nandu investigation found that some foreign adult websites had opened dedicated "face swap" sections hosting large numbers of pornographic videos built around female artists from many countries.
Last year, cybersecurity firm Deeptrace released a report on pornographic face swaps. It found that between December 2018 and July 2019 the number of AI face-swap videos online nearly doubled, approaching 15,000, of which as much as 96% were pornographic.
Notably, the plight of Korean female idols was already visible in Deeptrace's report: of the ten female artists whose face-swap videos drew the most views, three were Korean idol singers, with 404 pornographic face-swap videos between them and a combined 13.4 million views.
Korean idol singers became the main victims. Photo/AI outpost
That Korean female idols have become the hardest-hit group is closely tied to the status of women in South Korea and to its popular culture.
Xi Lin, a doctoral student at Korea University's Graduate School of International Studies, pointed out in a dissertation that Hallyu culture tends to "objectify" female idols; on average, the more closely people follow Hallyu activities, the less they endorse gender equality.
Hye Jin Lee, an assistant professor of practice at the University of Southern California's Annenberg School for Communication and Journalism who studies Hallyu and global culture, told foreign media she suspects that "anti-fans" of Korean female idols are behind the pornographic face-swap videos. For idols, reputation and image are everything; defiling that image with pornographic face swaps gives anti-fans exactly the satisfaction they seek.
During the "Room N" affair, a staffer at one female idol's agency told Korean media, helplessly, that content circulates across online platforms so freely that producers and distributors are hard to identify. Even knowing that pornographic face-swapped photos of their artist are circulating, the agency can only let it go. "If we responded forcefully, the singer's image would suffer too. It would be a headache in every way," he said.
Not just face swaps: AI technology is now deeply embedded in the industry
In the dark undertow of AI face swapping, women have become the biggest victims.
The contrast is stark. According to Deeptrace, many male artists, entrepreneurs and even politicians, Facebook founder Mark Zuckerberg and US President Donald Trump among them, have been put into face-swap videos made for entertainment. The subjects of pornographic face-swap videos, however, are without exception women.
Women appear in only 39% of non-pornographic face-swap videos, but in 100% of pornographic ones. Photo/AI outpost
These face-swap videos are viewed and commented on by tens of thousands of people, often without the women ever knowing. Ordinary women who are not celebrities have been targeted too, with photos of their daily lives posted on social networks stolen and exploited.
In fact, face swapping is not the only AI technique that has been absorbed into the industry. Advances in image recognition and image-synthesis algorithms have given AI the ability to fill in missing parts of a picture, giving rise to "stripping" and "mosaic-removal" tools.
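The core operation behind such tools is image inpainting: reconstructing a missing region from its surrounding context. The real tools use trained generative models; as a much weaker, non-learned illustration of the same idea, the sketch below fills a masked hole by repeatedly averaging each unknown pixel with its neighbours. Everything here (the test image, the function name) is invented for illustration.

```python
import numpy as np

# Naive diffusion-style inpainting: repeatedly replace each masked pixel with
# the average of its four neighbours. Real "stripping" tools use trained
# generative models; this only illustrates reconstructing a missing region
# from surrounding context.

def inpaint(img, mask, iters=200):
    """img: 2-D float array; mask: True where pixels are missing."""
    out = img.copy()
    out[mask] = out[~mask].mean()            # crude initial guess for the hole
    for _ in range(iters):
        p = np.pad(out, 1, mode="edge")      # 4-neighbour average via shifts
        avg = (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]) / 4
        out[mask] = avg[mask]                # update only the unknown pixels
    return out

# A smooth horizontal gradient with an 8x8 hole punched in the middle.
truth = np.linspace(0, 1, 32)[None, :] * np.ones((32, 1))
img = truth.copy()
mask = np.zeros_like(img, dtype=bool)
mask[12:20, 12:20] = True
img[mask] = 0.0

restored = inpaint(img, mask)
err = np.abs(restored[mask] - truth[mask]).max()
print(f"max reconstruction error in the hole: {err:.4f}")
```

On smooth content like this gradient, even naive averaging recovers the hole almost exactly; a trained generative model goes further and hallucinates plausible texture where averaging would only blur, which is what makes the abuse described above possible.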
In late June last year, an AI application called "DeepNude" appeared online. Built on algorithms that identify and "remove" the clothing of people in photos, it could turn an ordinary photo into a fake "nude". Amid the ensuing controversy the software was pulled from its official website within days of launch, but a Nandu investigation found that downloaded copies are still quietly circulating.
The "mosaic-removal" techniques that strip censorship mosaics from images are now fairly mature, and large numbers of so-called "uncensored" videos circulate online as a result. Last August, an "uncensored" video featuring popular Japanese actress Yua Mikami set off heated discussion. She responded publicly on Instagram: "The leaked uncensored video has become a topic of discussion, but sorry, it is an AI computer synthesis."
Yua Mikami's public response. Image from Instagram.
Notably, from Deepfakes to today's "stripping" and "mosaic-removal" tools, the developers have stayed hidden behind the network; their real identities remain a mystery. Because the software they release is open source, it has spread widely and dramatically lowered the bar to using the technology: even ordinary users with no grounding in artificial intelligence can get results quickly by following tutorials.
Around the world, women targeted by AI pornography face the same predicament: although portrait and reputation rights offer an avenue for redress in principle, the anonymity and fluidity of the Internet leave them with little practical means of defending those rights.
Since last year, regulators in various countries have begun exploring better ways to govern the technology.
At the end of May last year, the Cyberspace Administration of China, together with relevant departments, issued the Measures for the Administration of Data Security (Draft for Comment), which provides that network operators using big data, artificial intelligence or other technologies to automatically synthesize news, blog posts, forum posts, comments or other information must clearly label it as "synthesized", and must not automatically synthesize information to seek profit or to harm the interests of others.
Last October, California's governor signed AB 602, which prohibits sexually explicit deepfakes made without the depicted person's consent; "consent" means the person knowingly and voluntarily signs an agreement that includes a description of the sexually explicit material. A Californian who discovers a pornographic face-swap video made without permission can sue for damages: statutory damages of no less than US$1,500 and no more than US$30,000, rising to as much as US$150,000 if the act was committed with malice.
How do we keep AI technology from being turned to harm? We may be standing at the beginning of a new era.
Last October, the government of the U.S. state of California signed AB 602, which prohibits pornography without the consent of the person concerned – "consent" means that the party clearly knows and voluntarily signs an agreement (the agreement should include a description of pornography). If a California resident discovers that someone has made a pornographic face-swapping video without his permission, he or she can file a lawsuit in court and seek compensation. Statutory damages are not less than US$1500 but not more than US$30,000, and if the wrongful act is committed in bad faith, statutory damages may be increased up to US$150,000.
How to prevent AI technology from committing evil? We may be at the beginning of a new era.
Original report from the Southern Metropolis Daily (nddaily).
Police investigating Room N
uncover dedicated chat rooms for pornographic face swapping
According to the South Korean portal Naver, while investigating the "Room N" incident, police found four chat rooms on the platform dedicated to "adult face-swapping composites", all of which targeted Korean female artists.
Unlike "Room N", which could only be joined by invitation, these dedicated chat rooms could only be accessed by entering a complex registered address. In one face-swapping room themed around female idol singers, more than 2,000 members had uploaded face-swapped photos and videos.
This is not the first time pornographic face-swapping videos have been exposed. In fact, Deepfakes, the technology behind AI face swapping, was used for exactly this purpose from the moment it appeared: in the videos uploaded by its developer, the faces of adult-film actresses were replaced with those of various female celebrities, and the effect looked remarkably natural. AI face-swapping technology thus gradually entered the public eye.
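As has been widely documented, the core idea behind Deepfakes-style face swapping is an autoencoder with one shared encoder and a separate decoder per identity: the "swap" is simply encoding a face of person A and decoding it with person B's decoder. The sketch below illustrates only that data flow, using untrained random linear layers; all dimensions and names are illustrative, not the original implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

DIM_IN = 64 * 64      # a flattened 64x64 grayscale face (illustrative size)
DIM_LATENT = 256      # latent representation size (illustrative)

# Shared encoder: in training it learns identity-agnostic facial structure
W_enc = rng.standard_normal((DIM_LATENT, DIM_IN)) * 0.01
# One decoder per identity: each learns to render that person's appearance
W_dec_a = rng.standard_normal((DIM_IN, DIM_LATENT)) * 0.01
W_dec_b = rng.standard_normal((DIM_IN, DIM_LATENT)) * 0.01

def encode(face):
    # maps a face image to the shared latent space
    return np.tanh(W_enc @ face)

def decode(latent, W_dec):
    # maps a latent code back to image space with one identity's decoder
    return W_dec @ latent

face_a = rng.standard_normal(DIM_IN)  # a face image of person A, flattened

# Training (not shown) minimizes reconstruction error per identity.
# The swap itself is just: encode A's expression, decode with B's decoder.
swapped = decode(encode(face_a), W_dec_b)
print(swapped.shape)  # (4096,)
```

With random weights the output is meaningless noise; the point is only the architecture, which explains why one trained model can transplant any expression of A onto B.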
In the nearly three years since its emergence in 2017, AI face-swapping technology has become a weapon of the pornography industry. According to an earlier investigation by Nandu reporters, "face-swapping sections" have appeared on some foreign adult websites, containing large numbers of pornographic videos themed around female artists from various countries.
Last year, cybersecurity firm Deeptrace released a report on pornographic face swapping. It found that from December 2018 to July 2019, the number of AI face-swapping videos online nearly doubled, approaching 15,000, of which pornographic videos accounted for as much as 96%.
Notably, the plight of Korean female idols was already visible in Deeptrace's report: among the ten female artists whose face-swapped videos were viewed most, Korean idol singers occupied three places, with 404 pornographic face-swapping videos totaling 13.4 million views.
Korean idol singers became the main victims. Photo/AI outpost
That Korean female idols have become the primary targets of pornographic face swapping is closely tied to the status of women in South Korea and to Korean pop culture.
Xi Lin, a doctoral student at Korea University's Graduate School of International Studies, pointed out in a thesis that Hallyu culture tends to "objectify" female idols: on average, the more attention people pay to Hallyu activities, the less they endorse gender equality.
Hye Jin Lee, an assistant professor of practice at the University of Southern California's Annenberg School for Communication and Journalism who studies Hallyu and global culture, told foreign media that she suspected "anti-fans" of Korean female idols of making the pornographic face-swapping videos. For idols, reputation and image are everything; defiling that image through pornographic face-swapping videos gives anti-fans their full satisfaction.
After the Room N incident, a staff member at the agency of one female idol singer told Korean media helplessly that content circulates online so rapidly that producers and distributors are difficult to identify. Even knowing that pornographic face-swapped photos of their artist are circulating, the agency can only "let it go". "If we respond forcefully, the singer's image will also be damaged, which causes headaches in many ways," he said.
Not just face swapping: AI technology has been deeply applied to the pornography industry
In the dark undercurrent of AI face swapping, women have become the biggest victims.
The contrast is stark. According to cybersecurity firm Deeptrace, many male artists, entrepreneurs and even politicians, such as Facebook founder Mark Zuckerberg and US President Donald Trump, have been made into entertaining face-swap videos. The subjects of pornographic face-swapping videos, however, are without exception women.
Women are the subjects of only 39% of non-pornographic face-swapping videos, but 100% of pornographic ones. Image: AI outpost
These face-swapped videos of women are viewed and commented on by tens of thousands of people, often without the women themselves knowing. Some ordinary women who are not celebrities have also had photos of their daily lives, posted on social networking sites, stolen and exploited.
In fact, face swapping is not the only AI technology deeply applied to the industry. Advances in image recognition and image generation algorithms have given AI the ability to fill in missing parts of a picture, giving rise to "undressing" and "de-mosaicking" tools.
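"Filling in missing parts of a picture" is the image-processing task known as inpainting: pixels marked as missing are estimated from their surroundings. The tools described in this article rely on trained generative models; the toy sketch below, which merely diffuses neighbor averages into the hole, is only meant to show the basic idea and is not any real tool's method.

```python
import numpy as np

def inpaint(image, mask, iters=200):
    """Fill masked pixels by repeatedly averaging their 4-neighbors.

    image: 2D float array; mask: boolean array, True where pixels are missing.
    This is the crudest possible baseline, not a generative model.
    """
    out = image.copy()
    out[mask] = out[~mask].mean()  # rough initial guess for the hole
    for _ in range(iters):
        # average of up/down/left/right neighbors
        up = np.roll(out, 1, axis=0)
        down = np.roll(out, -1, axis=0)
        left = np.roll(out, 1, axis=1)
        right = np.roll(out, -1, axis=1)
        avg = (up + down + left + right) / 4.0
        out[mask] = avg[mask]  # update only the missing pixels
    return out

# toy example: a smooth horizontal gradient with a square hole in the middle
img = np.linspace(0.0, 1.0, 32)[None, :].repeat(32, axis=0)
mask = np.zeros_like(img, dtype=bool)
mask[12:20, 12:20] = True
restored = inpaint(img, mask)
```

On smooth content like this gradient, neighbor diffusion recovers the hole almost exactly; on textured content it only blurs, which is precisely the gap that learned generative inpainting closes.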
In late June last year, AI software called "DeepNude" appeared online. Based on artificial-intelligence algorithms, it could identify and "delete" the clothes of the person in a photo, turning an ordinary photo into a "nude photo". Amid the ensuing controversy, the software was pulled from its official website within days of launch. However, a Nandu investigation found that copies that had already been downloaded are still quietly circulating.
"De-mosaicking" techniques, which remove the mosaic censoring from an image, are now fairly mature, and large numbers of so-called "mosaic-free" videos circulate online as a result. Last August, a "mosaic-free" video featuring the popular Japanese actress Yua Mikami sparked heated discussion, and Mikami responded publicly on Instagram: "The leaked mosaic-free video has become a topic of discussion, but I'm sorry ~ it is an AI computer synthesis."
Yua Mikami responded publicly. Image from Instagram.
It is worth noting that from Deepfakes to today's "undressing" and "de-mosaicking" tools, the developers have stayed hidden behind the network, their real identities a mystery. Because the software and tools they release are open source, they have circulated widely and greatly lowered the threshold for using the technology: even ordinary people with no background in artificial intelligence can quickly learn to use them through tutorials and instructions.
Around the world, women targeted by AI-generated pornography face a common difficulty: although portrait rights and reputation rights offer avenues for redress, the anonymity and fluidity of the Internet leave them with little practical means of defending those rights.
Since last year, regulators in various countries have begun exploring better ways to govern the technology.
At the end of May last year, the Cyberspace Administration of China, together with relevant departments, issued the Measures for the Administration of Data Security (Draft for Comment), which requires that "network operators using big data, artificial intelligence and other technologies to automatically synthesize news, blog posts, posts, comments and other information shall mark such information conspicuously with the word 'synthesized', and must not automatically synthesize information for the purpose of seeking benefits or harming the interests of others".
Last October, California's governor signed AB 602, which prohibits pornographic deepfakes made without the consent of the person depicted; "consent" means the person knowingly and voluntarily signs an agreement that includes a description of the pornographic material. A California resident who discovers that someone has made a pornographic face-swap video of them without permission can sue and seek compensation: statutory damages of no less than US$1,500 and no more than US$30,000, rising to as much as US$150,000 if the act was committed with malice.
How to prevent AI technology from committing evil? We may be at the beginning of a new era.
Original report from the Southern Metropolis Daily (nddaily).
Nandu reporters: Feng Qunxing, Chen Zhifang