
Taylor Swift's deepfake nude photos went viral on social media. Her fans are fighting back

Author: Faint notes in the distant starry sky

Deepfake images of Taylor Swift are circulating online, making the singer the most famous victim of a problem that tech platforms and anti-abuse groups have struggled to address.

This week, pornographic and abusive fake images of Swift began to go viral on the social media platform X.

Her fervent fan base of "Swifties" quickly mobilized to counterattack on the platform formerly known as Twitter, launching the hashtag #ProtectTaylorSwift and flooding it with more positive images of the pop singer. Some said they were reporting accounts that shared the deepfakes.

Deepfake-detection group Reality Defender said it tracked a deluge of nonconsensual pornographic material depicting Swift, particularly on X. Some of the images also made their way to Facebook and other social media platforms owned by Meta.

"Unfortunately, by the time some of them were removed, they had already spread to millions of users," said Mason Allen, head of growth at Reality Defender.

The researchers found at least several dozen unique AI-generated images. The most widely shared were football-related, showing a painted or bloodied Swift in ways that objectified her and, in some cases, depicted violent harm to her deepfake likeness.

Researchers say the number of explicit deepfakes has grown in recent years as the technology used to generate such images has become more accessible and easier to use. In 2019, artificial intelligence company DeepTrace Labs released a report showing that these images were overwhelmingly weaponized against women. Most of the victims were Hollywood actors and K-pop singers, the report said.

Brittany Spanos, a senior writer at Rolling Stone magazine who teaches a course on Swift at New York University, said Swift's fans are quick to mobilize in support of their artist, especially those who take their fandom very seriously.

"If she does take this to court, it will be a big deal," she said. ”

Spanos said the deepfake pornography problem dovetails with other issues in Swift's past, noting that she sued a radio DJ in 2017 for allegedly groping her. Jurors awarded Swift $1 in damages, a sum her lawyer, Douglas Baldridge, called "a symbolic dollar, the value of which is immeasurable to all the women in this situation," in the midst of the MeToo movement. ($1 lawsuits have since become a trend, such as Gwyneth Paltrow's 2023 countersuit against a skier.)

When reached for comment on the fake images of Swift, X directed The Associated Press to a post from its safety account saying the company strictly prohibits the sharing of non-consensual nude images on its platform. The company has also sharply cut back its content moderation teams since Elon Musk took over the platform in 2022.

"Our team is actively removing all identified images and taking appropriate action against the accounts responsible for posting them," the company wrote in a post early Friday morning. "We are closely monitoring the situation to ensure that any further violations are addressed immediately and that the content in question is removed. ”

Meanwhile, Meta said in a statement that it strongly condemns "the content that has appeared across different internet services" and has worked to remove it.

"We will continue to monitor the platform for offending content and will take appropriate action if necessary," the company said. ”

Swift's representatives did not immediately respond to a request for comment on Friday.

Allen said the researchers are 90 percent confident the images were created by diffusion models, a type of generative AI that can produce new, realistic images from written prompts. The best known are Stable Diffusion, Midjourney and OpenAI's DALL-E. Allen's team did not attempt to determine the source.

Microsoft, which offers an image generator based partly on DALL-E, said Friday that it is investigating whether its tool has been misused. Like other commercial AI services, it says it does not allow "adult or non-consensual intimate content, and any repeated attempts to produce content that violates our policies may result in the loss of access to the service."

In an interview airing Tuesday on NBC Nightly News, Microsoft CEO Satya Nadella was asked about the Swift deepfakes by host Lester Holt. He said there is still a lot of work to be done in setting up AI safeguards, and that "we need to act quickly on this."

"It's absolutely worrying and scary, and therefore, we have to act," Nadella said. ”

Midjourney, OpenAI and Stability AI, the maker of Stable Diffusion, did not immediately respond to requests for comment.

Federal lawmakers who have introduced bills to restrict or criminalize deepfake pornography say the incident shows why the U.S. needs better protections.

"Women have been victims of non-consensual deepfakes for years, so what happened to Taylor Swift was much more common than most people realized," said Rep. Yvette D. Clarke, a Democrat from New York who proposed legislation that would require creators to digitally watermark deepfakes.

"Generative AI is helping to create better deepfakes at a fraction of the cost," Clark said. ”

U.S. Rep. Joe Morelle, another New York Democrat, is pushing a bill that would criminalize the sharing of deepfake pornography online. What happened to Swift is disturbing and becoming more common across the internet, Morelle said.

"The photos may be fake, but their impact is very real," Morel said in a statement. "In this increasingly digital world, where women are deepfaked every day, it's time to stop them. ”
