There are 100 bottles of potion, one of which is poisoned, and a single drop of the poison is enough to kill a mouse within 24 hours. What is the smallest number of mice you need to identify the poisoned bottle within one day? This article works through the answer — let's take a look.

This is a classic Tencent interview question. Programmers have probably run into it before, but if you are seeing it for the first time it can be a real brain-teaser. Today, beyond explaining how to solve it, I mainly want to introduce some concepts from information theory, so that whenever you meet poison-testing or ball-weighing puzzles in the future, they will be easy cases!
Some people might say: I'll just use 100 mice, one per bottle. What did the mice ever do to you, that you would draft a whole extended family to taste poison for you? In fact the answer is simple: we only need 7 mice, and the key to the solution is binary coding.
1. How to find the poison with 7 mice: binary coding
Step 1: Number the bottles from 1 to 100 and convert each number into a 7-digit binary code (why 7 digits will become clear later). For example, bottle No. 1 becomes "0000001" and bottle No. 10 becomes "0001010".
Step 2: Take 7 mice and number them 1 to 7. Feed mouse No. 1 a drop from every bottle whose binary code has a 1 in the 1st position (counting from the left); feed mouse No. 2 a drop from every bottle whose code has a 1 in the 2nd position; and so on.
Step 3: A day later, see which mice have died. If the mouse with a given number died, the poisoned bottle's binary code has a 1 in the corresponding position; conversely, if that mouse is still alive, the code has a 0 in that position. For instance, if mice No. 2, 3, 5 and 7 end up dead, the poisoned bottle's code is "0110101", which is 53 in decimal — so bottle No. 53 is the poison.
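The three steps above can be sketched in a few lines of Python (the function names here are my own, purely for illustration):

```python
def mice_fed_from(bottle: int, n_mice: int = 7) -> list[int]:
    """Mice (numbered 1..n_mice) that drink from this bottle: mouse i drinks
    from every bottle whose i-th binary digit (from the left) is 1."""
    code = format(bottle, f"0{n_mice}b")              # e.g. 53 -> "0110101"
    return [i + 1 for i, bit in enumerate(code) if bit == "1"]

def find_poison(poison: int, n_mice: int = 7) -> int:
    """Feed every bottle per the scheme, observe deaths, decode the poison."""
    dead = set()
    for bottle in range(1, 101):                      # feed all 100 bottles
        if bottle == poison:                          # only the poison kills
            dead.update(mice_fed_from(bottle, n_mice))
    # a dead mouse i means the i-th binary digit of the poison's code is 1
    code = "".join("1" if i + 1 in dead else "0" for i in range(n_mice))
    return int(code, 2)

print(find_poison(53))    # mice 2, 3, 5, 7 die -> "0110101" -> 53
```

Decoding works because the observed life/death pattern of the 7 mice is exactly the binary code of the poisoned bottle.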
2. Why at least 7 mice? Information entropy
So far we have only shown that 7 mice are sufficient; we have not proved that 7 is optimal. To demonstrate that, we need the concept of "information entropy" from information theory.
First, let's get a feel for "entropy":
In information theory, the concept of entropy is similar to entropy in thermodynamics. If you remember your high-school chemistry, it mentioned a notion of "degree of disorder"; thermodynamic entropy is precisely the degree of disorder of a system, and you probably also remember the statement that spontaneous processes proceed in the direction of increasing entropy.
Thermodynamic entropy: the degree of disorder of a system
Information entropy: a measure of the uncertainty of information
In information theory, entropy measures the uncertainty of information: the more uncertain a piece of information is, the larger its entropy. The essence of information, in fact, is to eliminate this uncertainty.
Take coin tossing: each time, you can only guess heads or tails blindly, since both outcomes have probability 1/2. This kind of event has high uncertainty, so its information entropy is relatively large. By contrast, guessing whether the Chinese national football team will win the World Cup involves little uncertainty — it is a very low-probability event — so the information entropy is relatively small.
To avoid confusion, let's also define a few commonly used concepts in information theory:
1. Information entropy: a measure of the degree of uncertainty of information; an absolute quantity. (How to calculate it is introduced below.)
2. Information: anything that reduces uncertainty in a given situation can be called information; its antonym might be understood as "nonsense".
3. Amount of information: a measure of information, indicating how much information a specific event brings; a relative quantity. The lower the probability of an event, the more information its occurrence carries; conversely, the higher the probability, the less information it generates. For example, when news breaks that a celebrity has had an affair, and that celebrity had cultivated a wholesome public image, the message carries a great deal of information.
Next, back to how "information entropy" is actually calculated:
The amount of information measures what a specific event brings once it happens, while entropy is the expected amount of information before the outcome is known — taking all possible values of the random variable into account, it is the expectation of the information carried by all possible events.
So Shannon gave us the following formula (take note!):

H(X) = -Σ p(x_i) · log_b p(x_i)

where:
p is the probability mass function of X, which we can read as the probability that outcome x_i occurs;
b is the base of the logarithm. When b = 2, the unit of entropy is the bit.
Let's use the coin toss as an example and compute the information entropy of the random variable X = "the result of one coin toss":

H(X) = -(1/2 · log2(1/2) + 1/2 · log2(1/2)) = 1 bit

That is, the outcome of one coin toss carries 1 bit of information entropy, and a single binary digit (0 or 1) is enough to represent it.
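Shannon's formula is easy to turn into code; here is a minimal sketch (the helper name `entropy` is my own, not from the article):

```python
import math

def entropy(probs, b=2):
    """Shannon entropy: H(X) = -sum(p * log_b(p)) over all outcomes."""
    return -sum(p * math.log(p, b) for p in probs if p > 0)

print(entropy([0.5, 0.5]))        # fair coin: 1.0 bit
print(entropy([0.01, 0.99]))      # near-certain event: far less than 1 bit
```

Note how the lopsided distribution, like the national-team example above, has much lower entropy than the fair coin.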
Now let's actually solve the puzzle and calculate the entropies involved in finding the poison:
1. First, the information entropy of the random variable X = "which one of the 100 bottles is poisoned" is:

H(X) = -Σ (1/100) · log2(1/100) = log2(100) ≈ 6.64 bits

2. A mouse that has drunk the water is either alive or dead — two possible states — so the entropy of the random variable Y = "the state of 1 mouse after drinking" is:

H(Y) = log2(2) = 1 bit

3. n mice have 2^n possible joint states after drinking, so the information entropy of the random variable Z = "the states of n mice after drinking" is:

H(Z) = log2(2^n) = n bits

So, to find the poisoned bottle with the fewest mice, in terms of information entropy we must satisfy:
H(Z) >= H(X), i.e. n >= 6.64
The smallest such integer n is 7, meaning at least 7 mice are needed to find the poison.
If the entropy calculation above feels complicated, it can be simplified as follows:
1 mouse is either alive or dead and can represent two states, so n mice represent 2^n states;
the poison being in one of bottles 1 to 100 corresponds to 100 possible cases;
so we need to find the minimum n satisfying:
2^n >= 100, i.e. n >= 7
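As a quick sanity check of this bound in code:

```python
import math

bottles = 100
# Each mouse ends the day in one of 2 states (alive/dead), so n mice
# distinguish at most 2**n outcomes; we need the smallest n with 2**n >= 100.
n = math.ceil(math.log2(bottles))
print(n)                                  # 7
print(2 ** n, 2 ** (n - 1))               # 128 covers 100; 64 does not
```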
3. A harder version of the puzzle
In the harder version, the poison kills a mouse within 6 hours instead of 24. Reading the problem carefully: it does not say exactly when a poisoned mouse dies, only that death comes at most 6 hours after drinking. So if a mouse is still alive 6 hours after drinking from a bottle, that bottle is not the poison — and within the 24-hour deadline we can therefore run up to four rounds of feeding.
1. As before, the information entropy of the random variable X = "which one of the 100 bottles is poisoned" is:

H(X) = log2(100) ≈ 6.64 bits

2. This time, though, a mouse's state is a little different. Over the course of one day, its possible outcomes are:
1) Drinks a drop at hour 0 and dies by hour 6
2) Alive at hour 6, drinks a drop, and dies by hour 12
3) Alive at hour 12, drinks a drop, and dies by hour 18
4) Alive at hour 18, drinks a drop, and dies by hour 24
5) Still alive at hour 24, having survived everything it drank
As you can see, a mouse can end the day in any of 5 states, so the information entropy of the random variable Y = "the state of 1 mouse after 24 hours" is:

H(Y) = log2(5) ≈ 2.3219 bits

3. n mice have 5^n possible joint states after the day, so the information entropy of the random variable Z = "the states of n mice after 24 hours" is:

H(Z) = log2(5^n) = n · log2(5) ≈ 2.3219n bits

and we again require:
H(Z) >= H(X), i.e. 2.3219n >= 6.64, so n >= 2.86
The smallest such integer n is 3, meaning at least 3 mice are enough to find the poison.
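Plugging the numbers in, as a quick check:

```python
import math

H_X = math.log2(100)          # uncertainty of "which bottle is poisoned"
H_Y = math.log2(5)            # information one mouse can carry (5 states)
n = math.ceil(H_X / H_Y)      # smallest n with n * H_Y >= H_X
print(round(H_X, 2), round(H_Y, 4), n)    # 6.64 2.3219 3
```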
Here is some homework: how exactly do you use these 3 mice to find the poison?
Following the binary-coding idea from the simpler version, can we use the mouse's 5 possible states to construct a base-5 encoding scheme?
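For readers who want to check their answer afterwards, here is one possible scheme as a sketch (spoiler warning; the feeding convention is my own assumption, not from the original text: digit d in 0..3 means "feed that bottle to that mouse at hour 6·d", and digit 4 means "never feed it to that mouse"):

```python
def base5_code(bottle: int, digits: int = 3) -> list[int]:
    """Fixed-width base-5 code, most significant digit first (e.g. 53 -> [2, 0, 3])."""
    code = []
    for _ in range(digits):
        code.append(bottle % 5)
        bottle //= 5
    return code[::-1]

def simulate(poison: int, n_mice: int = 3) -> list[int]:
    """Observed state per mouse: 0..3 = died during that 6-hour window,
    4 = still alive at hour 24."""
    states = [4] * n_mice                 # start by assuming every mouse survives
    for slot in range(4):                 # feedings at hours 0, 6, 12, 18
        for mouse in range(n_mice):
            if states[mouse] == 4 and base5_code(poison)[mouse] == slot:
                states[mouse] = slot      # poisoned in this round: dies by hour 6*(slot+1)
    return states

def decode(states: list[int]) -> int:
    """The observed states are exactly the poison bottle's base-5 digits."""
    bottle = 0
    for s in states:
        bottle = bottle * 5 + s
    return bottle

# 3 base-5 digits distinguish 5**3 = 125 > 100 bottles
assert all(decode(simulate(b)) == b for b in range(1, 101))
```

The decoding mirrors the binary case: the time of each mouse's death (or its survival) reads off one base-5 digit of the poisoned bottle's number.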
4. Summary
The mouse-and-poison problem really is hard to get started on at first encounter. Even among those who have solved it, most stop at knowing that binary coding works; few think further about the principle and logic behind it.
Then, when the problem gets more complicated, most people just try things — if 7 mice won't do, try 8 — because they only know the coding trick: they know the what but not the why. Likewise, when we merely accumulate theoretical knowledge but lack practice in real application scenarios and the adjustments each situation demands, it all ends up as armchair theorizing.
So here is an approach I personally recommend: when you run into a fun problem with a clever solution, keep digging into the principle and logic behind it, and then work backwards to more application scenarios. This continually exercises both the breadth and depth of your thinking.
This article was originally published by @WINTER on Everyone Is a Product Manager. Reproduction without permission is prohibited.
The title image is from Pexels, under the CC0 license.