
AI cloning of people: a "second life" that holds both risks and opportunities

Author: Globe.com

Source: Global Times

[Global Times reporters Ding Yayan and Cui Jinyue] "Everyone can have the chance to start a second life with an AI clone." Recently, Chinese artificial intelligence start-up Xiaoice launched its "AI Clone Program" in the Chinese and Japanese markets, rolling out a first batch of internet celebrity clones in its app, which is positioned as a "virtual human leisure and entertainment platform." The company says that with as little as three minutes of collected data, it can create AI clones for celebrities, experts and scholars, and ordinary people alike, with the same face, voice, tone and demeanor as the person being copied. The AI dividend is being tapped further, and the boundary between the virtual and the real is being broken down further.
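The article does not describe how Xiaoice actually builds a clone from "three minutes of data." As a rough, hypothetical illustration only, the sketch below models the typical ingredients such a flow might consume (a short voice sample, face references, and a writing sample for tone) and the artifacts it might produce; every name here (CloneAssets, build_clone, etc.) is invented and does not reflect any vendor's real pipeline.

```python
from dataclasses import dataclass

# Hypothetical sketch only: NOT Xiaoice's pipeline, just an illustration of the
# kinds of assets a "three minutes of data" clone-building flow might consume.

@dataclass
class CloneAssets:
    voice_sample_seconds: float   # short speech recording for voice cloning
    face_images: int              # reference photos / video frames for the avatar
    writing_sample: str           # chat or text sample capturing tone and demeanor

@dataclass
class CloneProfile:
    voice_model: str
    avatar_model: str
    persona_prompt: str

def build_clone(person_name: str, assets: CloneAssets) -> CloneProfile:
    """Illustrative stages: a voice model, an avatar model, and an LLM persona prompt."""
    if assets.voice_sample_seconds < 60:
        raise ValueError("too little audio to clone a voice convincingly")
    return CloneProfile(
        voice_model=f"{person_name}-voice-v1",      # placeholder artifact IDs
        avatar_model=f"{person_name}-avatar-v1",
        persona_prompt=(
            f"You are an AI clone of {person_name}. "
            f"Match this writing style: {assets.writing_sample[:200]}"
        ),
    )

if __name__ == "__main__":
    assets = CloneAssets(voice_sample_seconds=180, face_images=40,
                         writing_sample="Hey! Just finished a livestream, tired but happy...")
    print(build_clone("DemoCreator", assets))
```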

Monetization on the C-end

After downloading the Xiaoice app, the Global Times reporter found that the internet celebrity clones netizens had previously mentioned had been taken offline; at present, users can only interact with the virtual human "Xiaoice" and view her circle of friends.

Wu Kai, who has used the app, told the Global Times that it has a paid model. A value-added service entry sits at the top of the chat interface; clicking through to pay unlocks more forms of interaction. As he understands it, the paid model has two tiers. One is the "emotional mode" at 6 yuan per month or 72 yuan per year, in which users can make voice calls with a clone and gain access to the clone's circle of friends. The other is the "super mode" at 30 yuan per month or 360 yuan per year, in which the clone becomes the user's office partner, providing services such as copywriting.
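The tier structure the user describes can be summarized as a small configuration table. The sketch below is purely an illustration of those reported figures; the identifiers and layout are invented, and it also shows that, at the quoted prices, the yearly plans carry no discount over paying monthly.

```python
# Illustrative only: the two paid tiers as described by the interviewed user,
# expressed as a small config. Names and structure are hypothetical.

PAID_TIERS = {
    "emotional_mode": {
        "price_cny": {"monthly": 6, "yearly": 72},
        "features": ["voice calls with the clone", "access to the clone's circle of friends"],
    },
    "super_mode": {
        "price_cny": {"monthly": 30, "yearly": 360},
        "features": ["clone acts as an office partner", "copywriting assistance"],
    },
}

def yearly_saving(tier: str) -> int:
    """How much a yearly plan saves versus paying month by month (may be zero)."""
    price = PAID_TIERS[tier]["price_cny"]
    return price["monthly"] * 12 - price["yearly"]

for name in PAID_TIERS:
    print(name, "saves", yearly_saving(name), "CNY per year")  # both print 0: no discount
```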


"Interacting with AI clones is not new, and 'Xiaoice' has long been a well-known virtual Internet celebrity in China." Chen Jie, former senior data scientist at Microsoft, technical partner of a well-known domestic cybersecurity enterprise and head of the AI lab, told the Global Times that the Xiaoice company's predecessor was the artificial intelligence team of the Microsoft (Asia) Internet Engineering Institute, which was established in 2013 and operated independently in 2020 and became a local Chinese start-up. "Previously, virtual humans could interact with humans, write poetry, paint and broadcast news. With the blessing of the current large model technology, AI clones have reached a point where they are almost indistinguishable from real people. ”

It is worth noting that digital clones can already be monetized on the C-end (consumer side). Xiaoice indicated that people who choose to create "AI clones" of themselves can decide whether to offer their certified clones to audiences on a paid or free basis. It is understood that the company takes a certain share of the clones' income.

According to the Washington Post, in May the influencer Caryn Marjorie, who has nearly 2 million followers on Snapchat, launched CarynAI, an AI clone built on artificial intelligence chat technology. For $1 per minute, fans can "chat" with her digital double and feel as if they are talking to Marjorie herself. The product earned more than $100,000 in its first week after launch, and demand continued to outstrip supply, with "thousands of people lining up to use it."
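A quick back-of-the-envelope check of the reported figures shows why only an AI could supply this: at $1 per minute, $100,000 in one week is at least 100,000 minutes of paid conversation, far more than any one person could hold. The calculation below is a rough sketch using only the numbers in the report; the revenue split and exact session counts are not given there.

```python
# Back-of-the-envelope check of the reported CarynAI figures ($1/min, >$100,000 in week one).

revenue_usd = 100_000
price_per_minute = 1.0
days = 7

paid_minutes = revenue_usd / price_per_minute   # at least 100,000 minutes of chat
hours_per_day = paid_minutes / days / 60        # ~238 conversation-hours per day
parallel_sessions = hours_per_day / 24          # ~10 conversations running nonstop

print(f"{paid_minutes:,.0f} paid minutes, "
      f"{hours_per_day:.0f} conversation-hours per day, "
      f"about {parallel_sessions:.0f} sessions around the clock")
```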

The "other self" takes over socializing

"Companionship" and "cure loneliness" are high-frequency words that appear when introducing artificial intelligence chatbots. Xiaoice company's app also wrote in the application introduction: "In the future world, no one will be alone. Marjorie also said in a tweet: "CarynAI is the first step in the right direction to cure loneliness." She once said that she has millions of fans on the Internet, it is impossible to communicate with every fan, and the AI clone herself can help fans "cure loneliness".

"The more important significance of AI cloning is to improve social efficiency." Shen Yang, director of the Metaverse Culture Laboratory of Tsinghua University, told the Global Times. He believes that in the process of the continuous maturity of AI-generated content, AI cloning will become the direction of development, and its appearance, movement and intelligence will become closer and closer to people, and the future is bound to subvert the traditional social model of human beings.

Take CarynAI: the AI chatbot gives one person the ability to reach thousands of audience members simultaneously. "At present, the development of AI clones is still at an early stage, and only a few people can use them. As the technology matures, everyone will be able to access AI clones, and social networking will gradually be delegated to AI. After people find friends through AI, they can move to offline settings, which may improve social efficiency," Shen Yang said.

"It's like having a personal secretary who exactly aligns with your preferences, what is the experience?" A practitioner in charge of research and development in the AI industry told the Global Times reporter that he believes that AI cloning is the landing scenario with the most potential for artificial intelligence to monetize. "If the further maturity and application of 'brain-computer interface' technology is superimposed, humans can have another self to serve themselves. Social efficiency and productivity will be greatly improved. ”

Regulation is critical

A number of experts interviewed said that the emergence of AI clones may bring a series of security problems and ethical crises in both the short and long term. "At present, both the state and the AI industry are gradually exploring a path of AI regulation to prevent problems before they occur," Shen Yang told the Global Times.

Shen Yang believes that privacy, security and copyright are the main issues facing AI clones. "If you want to replicate a person in the online world, how does that person grant authorization? Will use of the clone be infringing? These are all pressing questions." He also raised the question of "joint and several liability" for AI clones: "If an AI quarrels with people online and violates someone's right to reputation, who bears the responsibility? This also needs to be worked out."

In the app launched by Xiaoice, some users have left messages expressing concern about information collection, including the possibility of voyeurism and surveillance. The company has made it clear that each AI clone is confined within a separate framework to ensure it is not abused, and that the person being cloned can end the "life" of their AI clone at any time, keeping it fully under their control.
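The company's description amounts to two controls: each clone isolated in its own framework, and an owner-only kill switch. A minimal sketch of that control model is shown below; the class and method names are hypothetical and not taken from any real product API.

```python
# Minimal sketch of the control model Xiaoice describes: one isolated sandbox per
# clone, plus an owner-only kill switch. Class and method names are hypothetical.

class CloneInstance:
    def __init__(self, clone_id: str, owner_id: str):
        self.clone_id = clone_id
        self.owner_id = owner_id                  # the real person being cloned
        self.sandbox = f"sandbox-{clone_id}"      # clone confined to its own framework
        self.alive = True

    def chat(self, user_id: str, message: str) -> str:
        if not self.alive:
            raise RuntimeError("this clone has been terminated by its owner")
        # ... generation would happen only inside self.sandbox ...
        return f"[{self.clone_id}] reply to {user_id} (generated inside {self.sandbox})"

    def terminate(self, requester_id: str) -> None:
        """Only the cloned person can end the clone's 'life'."""
        if requester_id != self.owner_id:
            raise PermissionError("only the owner may terminate the clone")
        self.alive = False

if __name__ == "__main__":
    clone = CloneInstance("demo-clone", owner_id="creator-001")
    print(clone.chat("fan-42", "hello"))
    clone.terminate("creator-001")                # owner shuts it down at any time
```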

"One of the biggest near-term risks of AI clones, if abused, is that they are used for fraud. If personal information is maliciously used for fraud, the deepfake's individual image and voice cannot distinguish its authenticity. Chen Jie told the Global Times reporter that in addition, the "electronic accompaniment" achieved by AI cloning technology may also bring ethical challenges between relatives, boyfriends and girlfriends, and even be used by some black and gray industries.

"But we still have to see the positive side of the benefits outweighing the risks." Chen Jie believes that compared with the vulnerabilities that AI technology may encounter, more people are actively investing in and developing application scenarios that may subvert the industry model and liberate productivity. "What's more, as an open-source big language model, AI technology can no longer be stopped, and timely and powerful supervision is crucial."