
In a future controlled by big data and algorithms, it may be difficult for people to do bad things

The buzzword "big data backstabbing" began to spread about four years ago and has gradually entered everyday social life. In the narrow sense, it refers to the practice of showing longtime customers higher prices than new customers for the same goods or services; more broadly, it also refers to the influence and control that algorithms exert over people's preferences.


Three years ago, when the English original of Jamie Susskind's The Power of Algorithms: How Do Humans Survive Together? was published, a series of scandals at domestic Internet companies was prompting intense public reflection on Internet technology. Didi, for example, had positioned its Hitch ride-sharing service as a "very sexy" social scene, a framing later linked to the murders of several young female passengers. At the time, Li Dabai, the translator of The Broken Ladder: How Inequality Affects Your Life, was working at a major Internet company and, amid the "996" work schedule, often pondered the interplay between the state, technology companies, and the public. "Back then, attention to technology in the domestic humanities and social sciences was focused mainly on traditional technology ethics and the challenges of artificial intelligence; the interaction between technology and politics, and the special status of platform enterprises, had not yet been fully discussed."

So when Li Dabai received the invitation to translate The Power of Algorithms, he agreed without hesitation. "The book's treatment of algorithms goes further: it focuses on how technology will interact with politics as the power of algorithms grows, and how that will affect major issues such as social governance, fairness, and social distribution." The Chinese translation of The Power of Algorithms has now been published, and Li Dabai has left big tech to pursue research in digital humanities.

Big Data and "Ethical Automation"

Jamie Susskind is a prominent British scholar, author, and practising barrister. Through The Power of Algorithms, he says, he wanted to explore a central question: to what extent, and in what ways, should our lives be guided and controlled by powerful digital systems?

By a life "guided and controlled", Susskind does not mean specific future applications of Internet technology in particular scenarios, such as the growing number of service robots or new breakthroughs in brain-computer interfaces. He means the impact that certain technologies and platforms, and the people who control them, may have on public life once they acquire strong powers in the digital world. As the book puts it: "The most important revolutions did not take place in philosophy departments, not even in parliaments and town squares, but in laboratories, research institutes, technology companies and data centers, most of which involved the development of digital technologies."

Take big data. It is everywhere in the digital world, and everyone is familiar with its collection of personal information and erosion of personal privacy. But Susskind is also particularly concerned with the relationship between big data and "freedom" in the sense used by political theorists. He argues that when technologies collect all of our data, we begin to avoid doing anything that might be considered shameful, guilty, or wrong. Meanwhile, some technology companies use big data to predict and prevent crime: one Israeli company, for example, has developed a system that reads people's facial features and classifies them as "high IQ", "white-collar criminals", "pedophiles", "terrorists" and so on.

The trend of using such technologies to predict and prevent crime is growing, and digital systems are increasingly used for law enforcement. Susskind even boldly speculates that, under the "monitoring" of technology, humans may one day achieve "ethical automation". "That is to say, future social management may no longer need to invest much effort in the moral admonition of its members; technical means could, to some extent, directly prevent possible crimes. For people in such a society, doing bad things would be extremely difficult," Li Dabai explained.

Yet Susskind notes that even if people's "plans and ambitions" become easier for computers to capture through big data, "ethical automation" is only relative: "anti-morality" may be automated too. He imagines an extreme scenario: a government intent on genocide, wanting to round up all members of a religion or ethnic group in a particular area, could simply determine people's identities from the data they leave behind: purchase records, posts on social media, address books on smartphones, and so on.

Injustice Caused by Data

When 600,000 people around the world submitted selfies to be judged by machine-learning algorithms, only 6 of the 44 photos selected as "most attractive" showed people who were not white; one website labeled historical concentration camp photos as "sports" and "climbing frames"... Susskind uses such examples to show that no matter how clever a computer's algorithms are, if they are instilled with a wrong or one-sided view, they will not treat a problem fairly. "This is data-based injustice."
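The mechanism behind such examples can be seen in miniature: a model trained on skewed data simply reproduces the skew. The sketch below is a toy illustration, not from the book (the group names are hypothetical), in which a trivial "predict the majority" model inherits whatever imbalance its training data contains.

```python
from collections import Counter

# Hypothetical training set: 90% of the examples labeled "attractive"
# happen to come from one group. The imbalance is in the data itself.
training_labels = ["group_a"] * 90 + ["group_b"] * 10

def majority_label(labels):
    """A deliberately naive model: always predict the most common
    label seen in training. It cannot help but echo the data's bias."""
    return Counter(labels).most_common(1)[0][0]

print(majority_label(training_labels))  # prints "group_a"
```

However sophisticated a real system is compared with this toy, the same principle holds: a one-sided training set yields one-sided predictions.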

The Power of Algorithms devotes considerable space to the relationship between algorithms and social equity. Susskind argues that in the digital world, algorithms will play a key role in distributing important social goods such as jobs, loans, housing, and insurance, and will increasingly be used to classify, rank, and score people, sorting them into social classes by status and prestige. Who matters, and who might as well not exist? Who is popular, and who is completely forgotten? These questions all bear on "recognition", an important aspect of social justice. In traditional societies they are settled by the state, the market, and society; in the digital world, social justice depends heavily on the people who run the relevant algorithms.

How can injustice in the digital world be avoided? Judging from The Power of Algorithms, Susskind's suggestions resemble those of earlier books such as Robot Ethics and Moral Machines: Teaching Robots Right from Wrong, which emphasize that computers can learn moral judgment through programming: "Why not consciously build justice into the system at the design stage? Whether it is equal treatment, equality of opportunity, or other principles that apply to the particular application."

Susskind therefore places special emphasis on the role programmers play in "designing for justice". He criticizes the fact that few engineers inside Internet technology companies are asked to think about the systemic consequences of their work; most only need to solve scattered technical problems. He calls on technology companies to cultivate more genuine "philosopher engineers" who can build an intellectual framework for clear, critical thinking about the political consequences of digital innovation.

Li Dabai, however, finds Susskind's vision too idealistic. "The author argues that code engineers should design algorithms on the basis of political philosophy and social policy, while also having humanistic literacy. This is somewhat like Plato's idea of training a 'philosopher king' to govern the city-state. But the philosopher king Plato envisioned had to be at least 50 years old, whereas code engineers today generally face an age crisis at 35."

Unlike Susskind, who pins justice and fairness on the individual "philosopher engineer", Li Dabai believes the body that shapes philosopher engineers should be the technology company itself, and the role of its leadership is especially important. Only when a company's decision-makers take "tech for good" as the enterprise's common goal can "good" be broken down and implemented at each level below, so that ordinary code engineers adopt a philosopher's mindset in their daily work and embed the goal of "tech for good" in their code. Only then can a technology company play a positive role in promoting social progress and justice.

