
Wang Qi: Social Platforms Have an Obligation to Control Online Violence | Reflections on the Case of the Boy Who Searched for His Birth Parents

Source: Peking University Fabao (PKULaw)

Peking University Legal Information Network 2022-01-27 18:14


Author | Wang Qi, Assistant Professor at the Law School of Beihang University (Beijing University of Aeronautics and Astronautics),

Deputy Director of its Civil Law Research Center, and contracted author of the Peking University Legal Information Network

The author has kindly authorized this repost.

On January 24, 2022, Liu Xuezhou, a boy who had been searching for his birth parents, died tragically. After learning of the suffering of this 15-year-old, and especially after reading the words of his farewell note, "born in lightness, may I return in purity," one can only sigh. The rights and wrongs of the family dispute can be left to public judgment, and whether criminal liability is involved should be decided by the competent authorities. Speaking from the author's own field of civil law, what most demands attention in this incident is the problem of online violence.

Liu Xuezhou wrote in his farewell note: "These past few days, people have kept attacking me through Douyin and Weibo private messages, cursing me... And when I tried to explain, I found that many, many of them, more than ninety percent, were small anonymous accounts..." He had endured far too many words of the kind telling him to die, calling him disgusting. The author believes these ugly words played an important part in Liu Xuezhou's decision to take his own life. There are many lessons to reflect on and learn from this tragedy; one that should not be overlooked is that the governance obligations of social platforms with respect to online violence need to be clarified and strengthened.

01 Social platforms play an indispensable role in governing online violence and command efficient means of governance

In recent years, almost every online public-opinion incident has been accompanied by a flood of violent speech. Whether measured by the harm to the parties involved or by the damage to public morals, that speech is often no less destructive than the incident itself; online violence has become a stubborn chronic disease of cyberspace that urgently needs a cure. Social platforms play an indispensable role both in providing relief to victims and in governing online violence. One might object that remedies and governance mechanisms for online violence already exist. As for remedies, there is the traditional route of tort law: the injured party claims tort liability against the perpetrators and demands that they delete the speech, apologize, and eliminate the harmful effects. A speaker may also bear criminal liability if his or her conduct constitutes the crime of insult or defamation under the Criminal Law.

As for governance, one can also turn to the responsible administrative organs. Why, then, emphasize the role of social platforms? The reason is that online violence, by its very nature, often renders traditional remedies and governance ineffective, or at least inefficient. Take tort claims as an example: much of the speech on social networks is posted anonymously, and it is no easy task for the injured party to obtain the identity of an online attacker. Even when identities are obtained, the perpetrators are usually numerous and scattered across different regions, which raises the difficulty and cost of judicial remedies.

Similarly, there are limits to governing online speech through administrative organs: compared with the enormous number of users and the massive volume of activity on social networks, the human and material resources of any administrative organ are a drop in the ocean, and comprehensive supervision is impossible. Against the backdrop of these dilemmas facing traditional relief and governance mechanisms, the role of social platforms themselves comes to the fore. For people on social networks, the courts and administrative organs are, after all, far away, while the platform is always present.

The next question is: what exactly do social platforms rely on to govern? After all, a platform is not a state organ and cannot mobilize state resources. The author believes the platform has its own unique means of governance, namely its information systems and data resources. No one doubts that in the era of big data, information systems and data are themselves important sources of influence: whoever controls them has, in fact, acquired real power. Weibo's swift action in this incident proves the point once again. On January 26, 2022, Weibo issued an announcement stating that, in connection with the widely discussed private-message attacks on Liu, the boy searching for his birth parents, the platform had investigated the private messages sent to Liu by other users between January 1 and the early morning of January 24, 2022.

Based on that investigation, the platform decided to suspend the private-message function of more than 1,000 users who had sent Liu private messages during this period. Earlier, on January 24, Weibo had issued a special notice: users who encounter attacks or harassment via private messages or comments may enable the privacy-protection function or the "comment firewall" as they see fit, and users whose posts are maliciously attacked may report them through the complaint portal, which the platform will handle promptly after verification. In a very short time, more than a thousand anonymous users were identified and had their private-message function suspended; through traditional judicial or administrative channels this would take a long time, yet Weibo accomplished it with ease. Because Weibo controls the information system and the relevant data, it can precisely locate a large number of users and act quickly. This is the power of the governance means that social platforms possess.
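For concreteness, the data-driven measure described above can be sketched in a few lines: given a log of private messages, find every account that messaged a target user within a time window and flag it for suspension of the private-message function. This is a minimal illustration only; all names here (`PrivateMessage`, `senders_to_suspend`, the sample IDs) are hypothetical, and Weibo's real systems are of course not public.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class PrivateMessage:
    sender_id: str
    recipient_id: str
    sent_at: datetime

def senders_to_suspend(log, target_id, start, end):
    """Return the distinct accounts that sent the target a DM in [start, end)."""
    return {
        m.sender_id
        for m in log
        if m.recipient_id == target_id and start <= m.sent_at < end
    }

# Hypothetical message log.
log = [
    PrivateMessage("a1", "victim", datetime(2022, 1, 10, 3, 0)),
    PrivateMessage("a2", "victim", datetime(2022, 1, 23, 22, 0)),
    PrivateMessage("a1", "other",  datetime(2022, 1, 12, 9, 0)),   # different recipient
    PrivateMessage("a3", "victim", datetime(2021, 12, 31, 23, 0)), # outside the window
]
window = (datetime(2022, 1, 1), datetime(2022, 1, 24, 6, 0))
print(sorted(senders_to_suspend(log, "victim", *window)))  # ['a1', 'a2']
```

The point of the sketch is precisely the one made above: once the platform holds the message log, locating every sender in a given window is a single pass over its own data, which no court or agency could replicate at that speed.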

02 The source of social platforms' obligation to govern online violence: the contract with users

Having established that social platforms actually have the capacity to govern online violence, the next question is why, at the normative level, they are obligated to do so. In other words, why should social platforms intervene in response to complaints, and sometimes intervene on their own initiative even without any application from the parties? This "ought" cannot be taken for granted; it requires a legal basis. After all, social platforms are for-profit legal persons whose main purpose is to earn profits through their operations, not to maintain public order or provide public services as state organs do.

Of course, one could argue for this obligation at the level of abstract legal principle, but the author believes that is unnecessary: a basis can be found at a more concrete level, namely the agreement between the platform and its users. Such an agreement is, in essence, a contract. Take the current Weibo Service Use Agreement as an example. Article 4, "Rules of Use," sets out the rules users must follow. Paragraph 4.9 requires that "users shall speak in a civilized manner when using Weibo services, respect the personality rights and identity rights of other users in accordance with the law, and jointly build a harmonious, civilized and courteous online social environment." Paragraph 4.10 provides that users must not "upload, display or disseminate any false, impersonating, harassing, defamatory, abusive or intimidating..." content. Paragraph 4.12 makes clear that "the Weibo operator has the right to review, supervise and handle users' conduct and information in using Weibo services." Paragraph 4.13 provides that users have the right to complain and that Weibo shall accept complaints. Paragraphs 8.3, 8.4 and 8.5, among others, further make clear that Weibo may take all necessary measures to eliminate the effects of user violations, including "changing, deleting or blocking the relevant content," "warning or banning offending accounts," and "changing, restricting or disabling some or all functions of offending accounts." Through these clauses, Weibo has in effect empowered itself by contract (the user agreement), acquiring a series of important powers to define, investigate, handle and sanction violations, while placing users under corresponding duties of cooperation and tolerance.

According to the principle of the consistency of rights and obligations in Article 131 of the Civil Code, "when civil subjects exercise their rights, they shall perform the obligations prescribed by law and agreed upon by the parties." The user agreement thus confers powers on the social platform and imposes corresponding obligations at the same time: in becoming the users' supervisor, the platform also becomes their protector. On the same contractual basis, the platform is obligated to respond promptly to complaints from users who suffer online violence, and even, when the violence has become too obvious to ignore, to intervene on its own initiative to protect the parties and stop it. If the platform fails to act in a timely manner, this first constitutes a breach of contract, and may additionally give rise to tort liability on the platform's part (see below).

03 The governance model: complaint-triggered intervention as the rule, active intervention as the supplement

The next question is how the platform should govern. Should it adopt a strong model of ex-ante control, for example filtering violent speech through keywords, or screening users in advance, banning them or forcibly canceling their accounts? The author believes this is inadvisable. A social platform is, in the end, a platform for free expression; its greatest attraction is that users can freely and efficiently publish and share speech, realizing the meaning and pleasure of social interaction. Requiring the platform to screen users or review their speech in advance runs counter to its fundamental business model and is unfair to it.
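To see why the author rejects the ex-ante model, it helps to see how crude its simplest form is. A keyword pre-filter blocks a post before publication if it contains any listed term; the keyword list and function names below are invented purely for illustration.

```python
# Hypothetical blocklist; real systems would be far larger, but share the flaw
# shown below: matching on words, not meaning.
BLOCKED_KEYWORDS = {"die", "worthless"}

def passes_prefilter(text: str) -> bool:
    """Reject the post outright if it contains any blocked keyword."""
    lowered = text.lower()
    return not any(kw in lowered for kw in BLOCKED_KEYWORDS)

print(passes_prefilter("You are worthless"))        # False: blocked before posting
print(passes_prefilter("Stay strong, don't give up"))  # True: published
print(passes_prefilter("I nearly died laughing"))   # False: harmless speech blocked too
```

The third case shows the problem: a filter that reviews every utterance in advance inevitably blocks lawful, even friendly, speech, which is exactly the tension with free expression that the author identifies.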

Moreover, freedom of speech is a basic right of citizens. Even where someone is shown to have a "black history" of improper speech, they cannot be permanently deprived of that freedom. And point-to-point communication on social platforms (such as Weibo private messages) is confidential communication protected by the system of secrecy of correspondence, which the platform has no right to inspect without authorization. A governance model centered on complaint-triggered intervention should therefore be adopted. After all, online violence differs from physical violence: it lacks the immediacy and urgency of direct danger to the person, leaving room for complaints to be processed.

As long as the platform provides a convenient, easy-to-use complaint mechanism (such as a "one-click complaint" function) and takes timely measures upon receiving a complaint, that is generally sufficient to protect the injured party. In some cases, however, the platform also has an obligation to intervene actively: where the online violence has become plainly obvious, the platform must act even without a complaint from the parties. So it was in the Liu Xuezhou incident. As far as the publicly available information shows, Liu Xuezhou did not complain about the private messages he received before his death; Weibo checked the relevant data on its own initiative after the incident aroused public concern, and reported the situation to the relevant departments. This model of complaint-triggered intervention supplemented by active intervention also accords with the liability framework that Articles 1195 and 1197 of the Civil Code establish for social platforms as network service providers.
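The complaint-triggered model favored here can be sketched as a simple queue: content stays up by default, and the platform acts only after a complaint is filed and verified. The class and method names below are illustrative only, not any real platform's API.

```python
from dataclasses import dataclass, field

@dataclass
class Platform:
    posts: dict                                   # post_id -> text
    removed: set = field(default_factory=set)
    complaints: list = field(default_factory=list)

    def file_complaint(self, post_id: str, reason: str) -> None:
        # The "one-click complaint" entry point: cheap for the victim to use.
        self.complaints.append((post_id, reason))

    def process_complaints(self, is_violating) -> None:
        # "Necessary measures in a timely manner" (cf. Civil Code art. 1195):
        # verify each complaint, then delete the offending content.
        while self.complaints:
            post_id, _reason = self.complaints.pop()
            if post_id in self.posts and is_violating(self.posts[post_id]):
                self.removed.add(post_id)
                del self.posts[post_id]

p = Platform(posts={"p1": "you should die", "p2": "good luck finding your family"})
p.file_complaint("p1", "abusive attack")
p.process_complaints(is_violating=lambda text: "die" in text)
print(sorted(p.removed))  # ['p1']  taken down after verified complaint
print(sorted(p.posts))    # ['p2']  stays up: never complained about
```

Note that nothing is reviewed before publication; the platform's duty of care begins only at the moment of notice, which is exactly the allocation of burdens that the safe-harbor rule discussed next is meant to achieve.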

First, under Article 1195, social platforms enjoy the protection of the "safe harbor" rule: even if information published by a user through the platform's services constitutes online violence or is otherwise unlawful, the platform's transmission of that information without prior review does not in itself constitute infringement. The main function of the safe harbor rule is to lower the platform's ex-ante duty of care, so that it need not review in advance every piece of information published or transmitted through its services (an almost unbearable burden for a network service provider); it need only take reasonable measures in a timely manner after receiving a complaint. Under the second sentence of Article 1195, paragraph 2, of the Civil Code, a platform that fails to take necessary measures in a timely manner "shall be jointly and severally liable with the network user for the enlarged portion of the damage."

In general, then, transmitting users' violent speech without censorship does not make the platform liable, but inaction or inadequate action after receiving a complaint does. If Article 1195 governs a platform's liability for failing to perform its passive (complaint-triggered) intervention obligation, Article 1197 governs liability for violating the obligation to intervene actively. It provides that "where a network service provider knows or should know that a network user is using its network services to infringe the civil rights and interests of others and fails to take necessary measures, it shall bear joint and several liability with that network user." The key question in applying this article is: under what circumstances should the platform be aware of the infringement, and thus obligated to intervene actively, even without a complaint from the injured party?

Article 6 of the Provisions of the Supreme People's Court on Several Issues Concerning the Application of Law in the Trial of Civil Dispute Cases Involving the Use of Information Networks to Infringe Personal Rights and Interests (amended 2020) lists a series of factors to be considered, including "the degree of social impact of the network information or the number of views within a certain period." Accordingly, when online violence causes a major social impact, the premise of Article 1197 is generally satisfied: the platform should at least know of the relevant facts and must take measures on its own initiative; if it does not, it faces liability under Article 1197.

04 The future direction of regulating online violence: urge social platforms to perform their governance function, develop more effective anti-online-violence technology, and formulate more complete anti-online-violence norms

According to news reports, more than 70% of college students surveyed believe they have been affected by online violence. Clearly, the prevention and control of online violence has a long way to go; it is bound to be a long-term and arduous project. What is the direction for the future? The author believes that, beyond the traditional judicial and administrative mechanisms, it is necessary to guide and urge social platforms to play a greater governance role, for the reasons set out above. As for concrete points of focus: having worked for a time in the courts' enforcement system, the author has some understanding of the people's courts' work in basically resolving the difficulty of enforcement. In the author's understanding, the courts had two key instruments for doing so: first, the informatization of enforcement, and second, its standardization.

Governing online violence can likewise start from these two aspects. On the one hand, more effective anti-online-violence technology must be developed. Online violence is a social disease born of information technology, and information technology can also be its most effective treatment. This especially requires the social networking industry, standing on the front line, to keep upgrading and adding new functions in light of practice, so as to prevent, identify and handle online violence more accurately and rapidly and to eliminate its effects to the greatest extent. That is the domain of technical experts, and the author will not presume to instruct them, so the discussion stops here. On the other hand, more complete anti-online-violence norms must be formulated.

A prominent feature of the mainland's Internet field is that practice has outpaced norms. On countering online violence, at present there are basically only the rules issued by social networks themselves; special norms at the level of laws and administrative regulations are still lacking. The platforms' own rules are of course important, with advantages such as operability and timeliness, and they should continue to be optimized, but they cannot replace legal norms. The focus of the next stage should therefore be the improvement of legal norms.

Germany offers a useful comparison. On October 1, 2017 it brought into force the so-called Network Enforcement Act (NetzDG, literally the "Act to Improve Enforcement of the Law in Social Networks"), and less than four years later, in June 2021, it amended the Act. The drafters' thinking was that hate speech, violent speech and other illegal speech on social networks were continuing to spread and grow, and that the culture of online debate often descended into personal attacks and aggressive, insulting speech, accompanied by a flood of rumors.

To combat illegal speech more effectively, special legislation was needed to push social network operators to handle complaints from users and others more quickly and appropriately. The Act contains a number of noteworthy measures.

First, it establishes at the statutory level the platforms' obligation to handle complaints, and sets strict standards for the complaint procedure.

Second, it imposes periodic disclosure obligations on social platforms. The public's knowledge of how social networks handle disputes is very limited: how many complaints are filed in a given period, how many end in deletion or retention, and who actually processes them behind the scenes are all unknown. The Act therefore prescribes strict disclosure obligations and the content to be disclosed: platforms must publish a transparency report every six months, on pain of heavy fines for late publication. Since the Act took effect, the three foreign social network giants, Facebook, YouTube and Twitter, have fulfilled this obligation, releasing reports on a half-yearly cycle. The reports show that platforms now generally handle complaints faster, that the excessive blocking of user speech once feared has not materialized, and that, on the whole, platforms have played a significant role in curbing illegal speech online.

Third, it provides for industry self-regulation bodies. Where a platform finds it difficult to determine whether complained-of content is illegal, it may refer the question to an industry self-regulation body for review. Such bodies play a dispute-resolution role within the industry; their advantage over the courts is greater efficiency and a better grasp of the realities of the Internet industry. The 2021 amendment further strengthened the platforms' reporting obligations, for example requiring disclosure of the use of algorithms to automatically identify speech and of the user groups that are vulnerable to, or prone to disseminating, disputed content, and it added a reconsideration procedure for parties who disagree with a platform's decision. Germany, too, has firmly grasped the crux of governing illegal online speech, namely the social platforms themselves: it uses law to urge them to perform their governance responsibilities and to fulfill disclosure obligations that continuously improve the transparency and standardization of governance. This practice is worth the mainland's study.

05 A few closing words

Everything under the sun has a bright side and a shadow side. If free and efficient communication is the bright side of social networks, then online violence and rumor are their shadow side. The ideal of the law is, of course, to let people enjoy more of the platforms' light while shielding them from the darkness, but it does not always succeed. Sadly, Liu Xuezhou ended up engulfed by the dark side of social networks. There is an old Chinese saying: "a kind word warms three winters; a cruel word chills even June." One cannot help thinking that if the private messages Liu Xuezhou received had carried not so many cruel words but words of encouragement and comfort, they might have given him confidence and courage and led him to a different choice. His start in life was not a good one, but had he persevered, he might have lived to see a future that did not disappoint him. Yet looking back at the suffering he endured, who can blame a 15-year-old for his choice?

Before setting down my pen, I could not help rereading Liu Xuezhou's farewell note, and one passage held my eyes: "This manuscript was written as I recalled those things again and again through many nights of dark collapse. By the time you read it, I should be living a happy and joyful life, a truly good life, growing up in the arms of my father and mother."

I believe that this young man's wish will come true in the next life.
