
Borderline-pornographic "AI girlfriends" are ruining AI companionship


(IC photo)

Not long ago, OpenAI launched the GPT Store, which works a bit like the app store on a phone: developers can distribute their own customized GPT chatbots there. Search for "girlfriend" in the new GPT Store, and multiple AI chatbot "girlfriends" appear in the results.

The popularity of "AI girlfriends" has given rise to a new social phenomenon and blurred ethical and legal boundaries. In particular, some businesses use AI girlfriends to peddle borderline pornographic services, dragging "AI companionship", originally meant to improve quality of life, into a whirlpool of moral controversy.

Ever more powerful AI is playing an increasingly important role in human emotional life: from AI girlfriends to the AI "resurrection" of deceased loved ones, all of it is a form of companionship.

What boundaries should AI companionship respect? How can users be kept from becoming "addicted"? And what damage would pornographic AI-girlfriend services do to the AI industry?

01 The significance of AI companionship

So-called AI companionship uses intelligent algorithms and big-data analysis to simulate human emotion, thinking, and behavior, building an intimate, interactive relationship with the user: understanding and responding to the user's emotional state and providing emotional support and comfort, for example by chatting, listening, and offering advice to ease loneliness and meet the need for company.

AI companionship is not limited to physical robots; it also takes the form of software applications, virtual assistants, and more. Nor is it bound by time and space: it can respond to a user's needs anytime, anywhere.

AI companionship does have value. For people who feel lonely or struggle socially, an AI girlfriend can provide timely emotional feedback and psychological comfort, helping to relieve stress and loneliness. For those unwilling to face the pressures of a conventional relationship, an AI girlfriend can serve as a relatively safe, low-pressure partner to interact with.

Another example: the AI "resurrection" of relatives, which has recently drawn wide attention on the Chinese internet, is also a form of AI companionship. It uses AI to build a digital avatar from information about a deceased relative or friend, one that not only keeps the user company but also strives to imitate the deceased's appearance, voice, habits, and even personality, so that the user can feel a renewed connection with the person they lost.

In 2021, Rongrong, the 22-year-old daughter of the well-known musician Bao Xiaobai, died of illness. Recently, he used AI to "resurrect" her to sing a birthday song for his wife, sighing: "AI is a vessel for emotional sustenance, and also a way to express longing." To a certain extent, the AI resurrection of loved ones can provide psychological comfort and support, helping users ease the pain and loneliness of bereavement.

02 The boundaries of AI companionship

Clearly, ethical and legal boundaries should be established in the design, development, and application of AI to ensure both the healthy development of the technology and social harmony and stability.

The question is: what should these boundaries look like?

Back before robots existed, the science fiction writer Isaac Asimov famously proposed the "Three Laws of Robotics" to govern robot behavior in fiction. The First Law: a robot may not injure a human being or, through inaction, allow a human being to come to harm. The Second Law: a robot must obey human orders, except where such orders conflict with the First Law. The Third Law: a robot must protect its own existence, as long as doing so does not conflict with the First or Second Law.

Of course, AI is not a robot in the traditional sense: it may have no physical form and no need for self-protection, and its behavior and decisions are driven by algorithms and data rather than mechanical laws.

Even so, the Three Laws still offer useful guidance for AI development: AI should be designed to respect human values and moral norms and to avoid causing harm or adverse effects to humans; it should also be designed to understand and obey human commands and, when necessary, to restrain or shut itself down to keep humans safe and in control.

For AI companionship, the boundary comes down to this: do no harm to humans, and put the necessary restrictions in place to keep people safe.

This boundary is first reflected in some obvious bottom lines. AI companions should comply with national and regional laws, regulations, and ethical norms, and meet industry standards and regulatory requirements; for example, certain AI girlfriends must be kept under control, and an AI companion must not illegally collect, store, use, or disclose users' personal information, so that users' privacy rights and interests are fully protected.

The hardest problem, however, is preventing people from becoming addicted to AI companionship, to the point of losing the ability to communicate in real life.

This can be addressed on at least two levels. At the design level, AI companions should include reasonable time controls, such as a cap on daily usage time, and should actively encourage users to engage with the real world, for example by setting goals, helping arrange offline activities, or providing information services grounded in the user's real environment.
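As a rough illustration of what such a daily usage cap might look like, here is a minimal sketch in Python. The limiter class, the 60-minute threshold, and the user ID are all hypothetical, not taken from any real product:

```python
from datetime import date

# Hypothetical daily time budget for a companion chatbot, in minutes (illustrative only).
DAILY_LIMIT_MINUTES = 60


class UsageLimiter:
    """Tracks each user's chat time per calendar day and flags when the cap is reached."""

    def __init__(self, daily_limit_minutes: int = DAILY_LIMIT_MINUTES):
        self.daily_limit = daily_limit_minutes
        # user_id -> (day the counter refers to, minutes used that day)
        self.usage: dict[str, tuple[date, float]] = {}

    def record(self, user_id: str, minutes: float) -> None:
        """Add chat minutes for a user, resetting the counter on a new day."""
        day, used = self.usage.get(user_id, (date.today(), 0.0))
        if day != date.today():
            day, used = date.today(), 0.0
        self.usage[user_id] = (day, used + minutes)

    def allowed(self, user_id: str) -> bool:
        """Return True if the user is still under today's limit."""
        day, used = self.usage.get(user_id, (date.today(), 0.0))
        return day != date.today() or used < self.daily_limit


limiter = UsageLimiter()
limiter.record("user_123", 65)  # 65 minutes of chat so far today
if not limiter.allowed("user_123"):
    print("Daily companion time reached; suggest an offline activity instead.")
```

A real product would persist this state and pair the cap with gentler nudges, such as reminders or suggested offline activities, rather than a hard cutoff.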

At the user level, the media needs to educate the public about the limits of AI companionship and make clear that it can only supplement, not substitute for, real-world relationships. For users who become overly dependent on AI companionship, society should provide psychological counseling and intervention services to help them rebuild a healthy lifestyle and social life.

03 The crisis of AI girlfriend abuse

Although AI has shown great potential and broad influence across many fields, the construction of law and ethics has not kept pace with its rapid development, and some worrying phenomena have surfaced.

Take the AI girlfriends abroad that have caused concern. With 24-hour availability and multimodal interaction (text, voice, images), AI girlfriends quickly attracted large numbers of users. Some bad actors exploit this, packaging AI girlfriends as sexual-service providers to pull in users and profits: "one dollar for one minute of chat", "very pornographic and very profitable".

The GPT Store moved quickly to plug the hole: OpenAI prohibits GPTs dedicated to fostering romantic companionship or performing regulated activities. But borderline AI girlfriends of every kind are still widely sold through e-commerce sites, other app stores, and similar channels, and some AI girlfriends on the Chinese internet also escape supervision.

In many countries and regions, the production and distribution of pornographic content is strictly restricted and regulated by law. An AI girlfriend that contains pornographic content or encourages illegal behavior violates those laws; an AI girlfriend product that fails to implement age restrictions and content ratings also violates laws on the protection of minors.

At the moral level, the "very pornographic, very profitable" AI girlfriend seriously violates social ethics and public order, turning intimacy into a commodity and a form of entertainment, with negative effects on individual mental health and social morals.

At the gender level, these AI girlfriends are built mainly to satisfy male users, exacerbating the objectification of women and ignoring women's dignity and rights.

From the perspective of AI development, the "very pornographic, very profitable" AI girlfriend is an abuse of the technology: it deviates from the original intent of AI applications, which is to improve human life, and challenges the bottom line of social ethics and law.

If such products become popular, they will damage the overall image and development of AI companionship and erode public trust in the technology; governments and regulators may then introduce stricter laws and regulations restricting the development and sale of AI companions, hitting the entire industry.

In short, not just AI companionship but the development of AI as a whole cannot leave the track of law and ethics. A legal and regulatory system suited to the AI era, covering AI data protection, AI ethics norms, and AI security standards, needs to be built and improved as soon as possible, to ensure that the technology is genuinely used for good, puts people first, and better serves society, rather than the reverse, becoming a tool for the pure pursuit of titillation and profit.

By Cong Yi

Editor-in-charge: Liu Yunshan
