Editor: David Layan
[Editor's introduction] How are algorithms made? Florian Jaton, a postdoctoral fellow at the University of Lausanne in Switzerland, wrote The Constitution of Algorithms, a book that explores algorithms from within and reveals their human side.
Algorithms have become an increasingly common part of our lives.
However, many studies of algorithms still treat them as "black boxes" that operate autonomously. Viewing algorithms in isolation, separated from the human factor, can lead to erroneous understandings and conclusions.
Florian Jaton, a postdoctoral researcher at the STS Lab at the University of Lausanne in Switzerland, wrote The Constitution of Algorithms, which reveals the human side of algorithms by exploring them from within.

Jaton's book takes a special approach, starting with seemingly unrelated entities such as people, desires, documents, and curiosities, and then studying how all of them come together and interact to form what we call "algorithms."
"When I first became interested in this issue in 2013, there was already a lot of literature on the social impact of algorithms, and it was quite critical in general," Jaton said. "These studies reveal how algorithms work in our lives, while also highlighting the opacity of algorithms."
While these studies are important, Jaton's focus on algorithms doesn't stop there.
"What concerns me in particular is that algorithms are mostly considered abstract entities made up of remote code and obscure mathematics. How do you really act on abstract, obscure entities? In my opinion, critical descriptions of algorithms do not produce much actionable insight."
The problem, Jaton argues, is that in the past algorithms were studied from the outside, through reports, software, and academic papers. That methodology filters out the elements that played an important role in the early formation of algorithms, and these elements are precisely the most fragile.
Jaton argues that a possible remedy is to introduce anthropology in the scientific sense, rather than relying solely on document analysis (although such analysis is still important).
Jaton spent two and a half years researching The Constitution of Algorithms, working as part of a team of research scientists involved in developing a computer vision algorithm.
Jaton participated in and documented discussions, data collection, programming sessions, code-debugging practices, and theoretical refinements. He realized that when algorithms are studied outside their real-world social context, much of the important work of creating them is overlooked.
Jaton breaks the composition of an algorithm down into three main phases: establishing the ground truth, programming, and the final formulation of the algorithm.
Step 1: Determine the Ground Truth
In fact, when a group of computer scientists, researchers, or engineers comes together to create an algorithm, the effort is initially driven by a range of elements, including desires, skills, means, and hopes.
For example, a research group might want to question or go beyond the results of a previously published scientific paper, and they have a set of mathematical tools and programming skills they can rely on to achieve this.
They may then have access to computational resources, academic papers, and digital tools that will help them accomplish their goals. Finally, they may want to make changes in a scientific field, such as improving the quality of medical imaging, or solving a problem that can be productized later, such as developing an algorithm that can detect defects in manufacturing plants.
However, before developing an algorithm that can meet these goals, the team must go through a process of "problematization" and "positivization". At this stage, the researchers must precisely define the problem they want to solve and determine the type of data needed to validate the algorithm.
For example, an object detection algorithm must specify the coordinates of the objects in each image, and possibly other parameters: does an image contain only one object, or possibly several? Do lighting conditions differ across the environments where the algorithm will be used? Do objects appear in various contexts, or always in the same context?
Once the problem is defined, researchers need to build a "ground truth" by collecting the right material, which is then used to validate the algorithm and the models built on it.
Taking computer vision algorithms as an example, researchers need to collect image datasets that match the problem description and can be used to train machine learning models.
In a "cat recognition" algorithm, the cat / not-cat label on each image is the ground truth.
These images must then be accompanied by the data needed to test the algorithm. For example, for an object detection algorithm, each image must be labeled with the bounding-box coordinates of the objects it contains.
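As a concrete illustration (my own sketch, not from Jaton's study), a ground truth for object detection is often just a set of images paired with labeled bounding boxes. The field names below are illustrative assumptions, loosely modeled on common annotation formats such as COCO:

```python
def make_annotation(image_id, label, box):
    """One labeled object: `box` is (x, y, width, height) in pixels."""
    x, y, w, h = box
    assert w > 0 and h > 0, "boxes must have positive size"
    return {"image_id": image_id, "label": label, "bbox": [x, y, w, h]}

# A tiny ground truth: each image may contain one or several labeled objects.
ground_truth = [
    make_annotation("img_001.jpg", "cat", (34, 50, 120, 80)),
    make_annotation("img_001.jpg", "dog", (200, 40, 90, 110)),
    make_annotation("img_002.jpg", "cat", (10, 10, 60, 60)),
]

# Group annotations per image, the form most evaluation code expects.
per_image = {}
for ann in ground_truth:
    per_image.setdefault(ann["image_id"], []).append(ann)

print(len(per_image))                 # 2 images
print(len(per_image["img_001.jpg"]))  # 2 objects in the first image
```

Every design choice here, such as whether an image may hold several objects, or how tightly boxes are drawn, is exactly the kind of early, fragile decision Jaton argues gets baked into the algorithm.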
It is important to understand that the way a problem and its ground truth are framed greatly affects the resulting algorithm and its effects.
For example, an object detection algorithm trained on a ground truth in which each image contains a single, centered object may work well on similar images, but will fail badly on images containing multiple scattered objects.
In fact, as Jaton points out in The Constitution of Algorithms, "What we get is an algorithm about the ground truth."
"So, once an algorithm produces a result, the reaction should be: which ground-truth database did this algorithm derive from?"
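To make this concrete (a toy illustration of mine, not from the book): a standard evaluation metric such as intersection-over-union only measures agreement with the chosen ground truth, so an algorithm's "quality" is always relative to that ground truth. The same prediction can look perfect against one annotator's boxes and poor against another's:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two (x, y, width, height) boxes."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    ix = max(ax, bx)
    iy = max(ay, by)
    iw = min(ax + aw, bx + bw) - ix
    ih = min(ay + ah, by + bh) - iy
    if iw <= 0 or ih <= 0:
        return 0.0  # boxes do not overlap
    inter = iw * ih
    union = aw * ah + bw * bh - inter
    return inter / union

# One prediction, two different ground truths for the same image:
prediction = (10, 10, 100, 100)
ground_truth_a = (10, 10, 100, 100)   # annotator A's box
ground_truth_b = (60, 60, 100, 100)   # annotator B drew the box elsewhere

print(iou(prediction, ground_truth_a))            # 1.0 -> "perfect" against A
print(round(iou(prediction, ground_truth_b), 3))  # 0.143 -> poor against B
```

Nothing about the detector changed between the two scores; only the ground truth did, which is precisely Jaton's point about asking where a result's ground truth came from.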
The process of investigating and documenting Ground Truth is extremely important for studying algorithms and their impact on society, especially when algorithms are developed to perform sensitive tasks.
There have been many examples of poor design causing algorithms to make critical mistakes, such as producing biased decisions, creating filter bubbles, and spreading fake news.
There is growing interest in understanding and mitigating algorithmic risks, and how thoroughly a ground truth is studied and documented will be key to addressing them.
In The Constitution of Algorithms, Jaton writes, "As long as the actual work that influences the composition of algorithms remains abstract and uncertain, it will remain very difficult to improve the ecology of algorithm construction and development."
Step 2: The Programming Process
Eventually, every algorithm reaches the programming stage, where a series of modules and instructions is created to solve the given problem and tested against the ground truth. While this process is often reduced to pure source code, Jaton writes in his book that programming is much more complex than stacking a series of instructions together.
In exploring what makes a program exist, the cognitivist view sees nothing beyond the program's formal instructions, yet it is precisely the higher-level practices around those instructions that need explaining. Using the computer as a metaphor for thought is also circular: this view ends up proposing various mental "programs" to explain computer programs and their development.
This view is flawed, and it is rooted in the history of computing: scientists, researchers, and businesses tried to frame computers as input-output systems built on the patterns of thought in the human brain.
The metaphor was then inverted: the human mind came to be seen as an "organic version" of the computer program. In other words, the computer program is treated as the original, and the mind as merely the brain's reproduction of it.
These metaphors reduce programming to "giving a digital brain (i.e., a computer) a string of instructions." This simplification has also changed how programmers are trained and evaluated: more emphasis is placed on writing instructions, while all the other practices that are valuable for developing software are ignored.
In his book, Jaton documents his and his team's experiences writing instructions, encountering bugs, and discussing problems with other team members.
He stresses that adjusting and improving programs along the way, communicating with team members to improve the code, and other steps that never show up in the final code are all very important. By reading his book, programmers can compare it with their own experience of programming and implementing algorithms, and discover important details that are usually overlooked.
"I think the most overlooked aspect of the programming process is the ad hoc code that ends up being deleted or not reflected in the final code."
Jaton says that micro-sociological analysis of programming practices is just getting started, so most existing claims and perspectives are still exploratory, and it is hard to say what direction the discipline will take. But he believes that understanding programming through micro-sociological research will make the design of algorithms more flexible.
A deeper understanding of programming practices is often what sets good programmers apart, and good programmers bring that in-depth understanding to the algorithm design community.
Step 3: The Final Formulation of the Algorithm
Eventually, once an algorithm is implemented and tested, it becomes a mathematical object that can be used in other algorithms. An algorithm must stand the test of time, prove its value in applications, and show that it is useful in other scientific studies or applications.
Once it passes these criteria, the algorithm no longer needs to be tested. It becomes the basis or a component of new algorithms and continues to serve future applications.
But it should be emphasized that once the problem, the ground truth, and the final implementation congeal into an abstract entity, all the small details of the process of building the algorithm disappear from view.
Once formed, an algorithm becomes the basis of other algorithms: it can help establish new ground truths, support programming, and feed into the formulation of further algorithms. A deeper understanding of the different stages of algorithm composition allows us to discuss algorithms more constructively and explore their broader implications.
Jaton says that understanding algorithms as a combination of ground truths, programming practices, and other algorithms makes the overall concept more complicated, but also much clearer.
By examining the significance of algorithms in this way, by paying attention to the complex networks formed by algorithms and the social components of their co-evolution, we can understand from a broader perspective why algorithms and humans are becoming more and more inseparable. Jaton said, "We also need to learn more in this field and constantly update our perspective."
Reading the book, some may feel that the workings and effects of algorithms have little to do with the concepts of sociology. On reflection, though, this distinction is counterproductive.
People who design new algorithms are themselves part of society; the algorithms they design are influenced by other algorithms, and they use other algorithms to build new ones.
It is therefore unrealistic to set aside the utility and use of algorithms entirely: sociological considerations need to be included in the design and implementation of algorithms.
Resources:
https://thenextweb.com/news/how-algorithms-are-made