By Zhu Jing
Open a shopping app on your phone and the array of similar goods is dazzling; based on your recent clicks, a "best-sellers list" may help you choose quickly. Open a news app, and "today's trending searches" has already pushed the most-clicked content to you. At some companies, algorithms screen resumes according to rules designed by programmers, far more efficiently than manual review... Like an invisible sprite, the algorithm runs every day, recording and collecting your information, preferences, and search history, serving your work and life.
The idea behind algorithms is to provide optimal, universally applicable solutions to real problems. Whether they design the algorithmic rules or run the platforms that apply them, most practitioners uphold a theory of "algorithmic neutrality": the algorithm carries no values, it is neutral, and what effects it produces depends entirely on how it is used. But is that really the case? If an algorithm pushes "high-sales" goods with inflated figures, or vulgar content that panders to curiosity, then the quantified algorithm begins to erode users' trust and goodwill. And if the algorithm's design itself screens by gender, so that resume filtering discriminates by sex, then what looks perfectly harmless is in fact producing unfair outcomes and discrimination against certain groups.
It is clear, then, that an algorithm's design logic carries values and subtly shapes people's values. Take the application of algorithms to news. The content recommended to users is ranked by clicks and likes; with no gatekeeper, news text and images directly steer users' attention and emotions, and netizens' comments may keep fermenting, shaping users' ideas and views. Fed such feedback, some media keep producing eye-catching headlines and vulgar content; in 2017, the "Headline Q&A" feature of Toutiao was summoned by the Beijing Cyberspace Administration over vulgar topics. When algorithms are applied commercially to recommend goods, some engage in "big-data price discrimination" against loyal customers in order to expand profits and increase user stickiness, and some use users' purchase records to induce consumption, reshaping their purchasing habits and behavior. The once-viral report on delivery riders "trapped in the system" made people realize that how algorithms are used, and how their effects are evaluated, is a serious problem. In such scenarios, the algorithm that was supposed to provide an optimal solution instead becomes a "code dilemma" through some motive or design flaw.
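The ranking mechanism the article criticizes can be made concrete with a minimal sketch. This is not any platform's actual code; the item names and engagement numbers are invented for illustration. It shows how ordering purely by clicks and likes, with no gatekeeper weighing quality, pushes whatever is already sensational to the top.

```python
# Hypothetical sketch of engagement-only ranking. Items are sorted by raw
# clicks + likes, so the most sensational item rises regardless of quality.
# All titles and numbers are invented for illustration.

def rank_by_engagement(items):
    """Return items sorted by raw engagement (clicks + likes), descending."""
    return sorted(items, key=lambda it: it["clicks"] + it["likes"], reverse=True)

feed = [
    {"title": "In-depth policy analysis", "clicks": 120, "likes": 30},
    {"title": "Sensational rumor",        "clicks": 900, "likes": 400},
    {"title": "Local service notice",     "clicks": 60,  "likes": 10},
]

ranked = rank_by_engagement(feed)
# The rumor lands first purely because it was clicked and liked most,
# which in turn earns it more clicks — the feedback loop the article describes.
print([it["title"] for it in ranked])
```

Nothing in the sort key considers accuracy or social value; that omission, not any single line of code, is where the "neutral" tool encodes a value judgment.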
It should be noted that "algorithmic neutrality" reflects only an instrumental-rationality perspective. An algorithm is limited at the design level and cannot cover every sample. The more powerful the algorithm, the more deeply it penetrates people's work and life; once tied to commercial interests, its reach breaks out of basic engineering scenarios and inevitably extends into the social environment, affecting people's emotions, ideas, relationships, and behavior. The algorithm's technical problems then easily evolve into social problems, and, worse, the responsible party becomes hard to identify.
Therefore, greater participation and active intervention in this area are urgently needed. The recently issued "Provisions on the Administration of Algorithmic Recommendations in Internet Information Services" requires that recommendation service providers' algorithmic mechanisms, models, data, and application results be regularly reviewed, evaluated, and verified, ensuring that internet companies have rules to follow and always operate under supervision. This not only moves the governance threshold forward; it also signals that algorithms should serve the public and embody public attributes. Through strengthened governance, the application of algorithms can be effectively guided from personalized service toward being people-oriented.