What exactly are people afraid of about Sora? | Science and Technology Observation


Cover News Reporter Yan Wenwen

What is real and what is fake? In the age of artificial intelligence, telling the two apart will clearly become harder.

Last week, artificial intelligence company OpenAI released Sora, a tool that generates hyper-realistic videos from text prompts, to great acclaim. Many predicted it would reshape the ecosystems of numerous industries by giving everyone an easier way to create video. The tool is currently being tested and evaluated for potential security risks, and no date has been set for its public release.

However, things are never that simple. Less than a week later, experts warned that new AI tools such as Sora are likely to lead to a proliferation of fake videos, making it increasingly difficult for people to tell what is real and what is not.


Video generated by Sora

Sora's capabilities leave people "scared"

In October 2023, U.S. President Joe Biden mentioned in an interview that he had seen an AI-generated video of himself online. After watching it, he said, "When did I ever say that?" The joke made Biden realize that an AI-generated clip of just three seconds is enough to impersonate a person, making it easy for fraudsters to use such videos to deceive victims' families.

And the arrival of Sora seems to foreshadow that future.

OpenAI explains on its website: "Sora is capable of generating complex scenes with multiple characters, specific types of motion, and accurate details of the subject and background. The model understands not only what the user asks for in the prompt, but also how those things exist in the physical world."

This means Sora can render details that obey physics: objects falling under gravity, the movement of human muscles while chewing, waves lapping against the shore as the tide moves... What's more, as an AI, Sora will make these scenes even more realistic through further training. That likely means that, in the future, people will find it difficult to tell whether a video they see is real or fake.

As a result, some experts have even called Sora "scary".

Dr. Newell, chief scientific officer at identity verification firm iProov, said Sora would make it "easier for malicious actors to generate high-quality fake videos."


Video generated by Sora

Artificial intelligence could influence elections

The experts' warnings come against an important backdrop: 2024 is a presidential election year in the United States, and AI technology may be misused in the political sphere. If, as noted above, even Biden hesitates when he sees an AI-generated video of himself, it will be far harder for ordinary voters to tell whether a candidate actually said something.

Experts warn that new AI tools could lead to an increase in fake videos ahead of the US presidential election.

Ezioni, the founder of TruMedia, worries that generative AI tools are evolving so quickly that they could be devastating in the age of social networks, and that they could hardly have arrived at a worse time. The past year has seen a rapid rise in the quality of AI-generated images, audio, and video, with companies such as OpenAI, Google, and Meta racing to build ever more advanced and easy-to-use tools.

While the demonstration videos released so far are wholesome showcases, experts warn that the new technology could trigger a fresh wave of malicious fakery.

In January 2024, top female singer Taylor Swift went public with her relationship with NFL player Travis Kelce. Almost overnight, fake explicit images of Taylor Swift at a football game went viral, spreading so widely that they even alarmed the White House. Asked about the matter, a White House spokesperson said: "We are concerned about the circulation of such images, or more precisely, false images, which is worrying."

In fact, two weeks before those images went viral, a video had circulated online of Taylor Swift endorsing a well-known cookware brand. Both Taylor Swift and the brand denied any involvement; it was yet another AI-generated video.

The question is: someone generates these videos, but does anyone police them? Who can be held accountable for these fakes?

On Friday, tech companies such as Amazon, Google, Meta and OpenAI signed an agreement to take "reasonable precautions" to prevent AI tools from being used to disrupt elections around the world.

"Everyone recognizes that no one tech company, no one government, no one civil society organization is able to deal with the advent of this technology and its possible nefarious use on their own," said Nick Clegg, Meta's president of global affairs.

The film and television industry could be upended

Many people are worried about the future of the film and television industry.

Michael Gracie, film director and visual effects expert, said: "Look at the progress we've made in image generation in just one year. Where will we be a year from now?"

In fact, in 2023, Hollywood went on strike for half a year, and artificial intelligence was one of the triggers.

On May 2, 2023, the Hollywood Writers Guild went on strike, partly to fight for a share of streaming and rerun revenue, and partly because the studios were keen to use artificial intelligence to replace some of the screenwriters' work. On July 14, the Screen Actors Guild joined the strike, with all 160,000 of its members responding, including actors, stunt performers, voice actors, and motion-capture performers. The strike lasted until November 9.

During the strike, there were reports that studios wanted to perform full-body scans of background actors while paying them for only a single day's work. Once a production company held those scans, documents, and likeness images, it could use them in any future film or television project as it pleased, without the background actors' consent and without paying them anything more.

Although the proposal ultimately did not stand, the studios are unlikely to abandon such a low-cost way of using actors in the future. Hence the worry about where the film and television industry is headed.

Perhaps, in the near future, it will be hard for audiences to see a live actor on screen at all.
