
Introduction to time complexity

Brief introduction

To calculate time complexity, we usually estimate the number of elementary operations the algorithm performs, assuming each operation takes the same fixed amount of time. Under this assumption, the total running time and the operation count differ by at most a constant factor.

Different inputs of the same size can still make an algorithm run for different amounts of time, so we usually use the worst-case complexity of the algorithm, denoted T(n): the maximum running time required over all inputs of size n. A less commonly used alternative is average-case complexity, which is typically considered only when explicitly stated. Time complexities can be classified by the nature of the function T(n); for example, an algorithm with T(n) = O(n) is called a "linear time algorithm", while an algorithm with T(n) = O(c^n) for some constant c > 1 is called an "exponential time algorithm".
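As a concrete illustration of exponential time (an added sketch, not from the original text), the naive recursive Fibonacci function is a standard example:

/* Naive recursive Fibonacci: a classic exponential-time example.
   Each call spawns two further calls, so the number of calls
   grows exponentially with n. */
long long fib(int n)
{
    if (n < 2)
        return n;                    /* fib(0) = 0, fib(1) = 1 */
    return fib(n - 1) + fib(n - 2);
}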

The time an algorithm takes is proportional to the number of times its statements are executed: an algorithm that executes more statements takes more time. The number of statement executions in an algorithm is called its statement frequency or time frequency, denoted T(n).

In general, the number of times the basic operation of an algorithm is repeated is a function of the problem size n, denoted T(n). If there is an auxiliary function f(n) such that the limit of T(n)/f(n) as n approaches infinity is a nonzero constant, then f(n) is said to be of the same order of magnitude as T(n). This is written T(n) = O(f(n)), and O(f(n)) is called the asymptotic time complexity of the algorithm, or time complexity for short.

Across different algorithms, if the number of statement executions is a constant, the time complexity is O(1). Algorithms with different time frequencies can still have the same time complexity: for example, T(n) = n² + 3n + 4 and T(n) = 4n² + 2n + 1 have different frequencies, but the same time complexity, O(n²).
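As a quick check using the limit definition above (a worked example added for illustration): taking f(n) = n², lim(n→∞) (4n² + 2n + 1)/n² = 4 ≠ 0 and lim(n→∞) (n² + 3n + 4)/n² = 1 ≠ 0, so both functions are O(n²).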

Time frequency

The time it takes to execute an algorithm cannot be calculated theoretically; it can only be known by running the algorithm on a machine. But we cannot, and need not, test every algorithm this way; we only need to know which algorithm takes more time and which takes less. The time an algorithm takes is proportional to the number of times its statements are executed: an algorithm that executes more statements takes more time. The number of statement executions in an algorithm is called its statement frequency or time frequency, denoted T(n).
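As an illustration (an added sketch, not part of the original text), counting the executions of each statement in a simple loop gives the time frequency directly:

int sum = 0;               /* executes 1 time */
for (i = 1; i <= n; ++i)   /* i=1 runs once; the test runs n+1 times; ++i runs n times */
{
    sum += i;              /* executes n times */
}
/* T(n) = 1 + 1 + (n + 1) + n + n = 3n + 3, which is O(n) */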

Examples of the time complexity of common algorithms

1. Constant order O(1)

No matter how many lines of code execute, as long as there is no complex structure such as a loop, the time complexity of the code is O(1). For example:

int i = 1;
int j = 2;
++i;
j++;
int m = i + j;

When the above code executes, its running time does not grow with any variable, so no matter how long such code is, even tens of thousands of lines, its time complexity can still be expressed as O(1).

2. Logarithmic order O(log n)

int i = 1;
while(i<n)
{
    i = i * 2;
}

As you can see from the code above, i is multiplied by 2 on every pass through the while loop, so i gets closer and closer to n. Suppose the loop exits after x iterations, when i first reaches or exceeds n; then 2^x = n, which gives x = log₂n.

That is to say, after about log₂n iterations the loop ends, so the time complexity of this code is O(log n). (The base of the logarithm is dropped in big-O notation, since changing the base only changes a constant factor.)
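A classic algorithm with this complexity is binary search, which halves the search range at every step. The sketch below is added for illustration and assumes a sorted array; it is not part of the original code:

/* Binary search over a sorted array: the range [lo, hi] halves on
   each iteration, so at most about log2(n) iterations run -- O(log n). */
int binary_search(const int *a, int n, int target)
{
    int lo = 0, hi = n - 1;
    while (lo <= hi)
    {
        int mid = lo + (hi - lo) / 2;   /* avoids overflow of (lo + hi) / 2 */
        if (a[mid] == target)
            return mid;                 /* found */
        else if (a[mid] < target)
            lo = mid + 1;               /* discard the left half */
        else
            hi = mid - 1;               /* discard the right half */
    }
    return -1;                          /* not found */
}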

3. Linear order O(n)

for(i=1; i<=n; ++i)
{
   j = i;
   j++;
}

In this code, the body of the for loop executes n times, so the time it consumes grows with n. Code of this type can use O(n) to express its time complexity.
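In practice, any single pass over n elements is linear time; for instance, summing an array (an added sketch, not from the original text):

/* Summing n elements: the loop body runs exactly n times -- O(n). */
long long sum_array(const int *a, int n)
{
    long long s = 0;
    for (int i = 0; i < n; ++i)
        s += a[i];
    return s;
}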

4. Linear logarithmic order O(n log n)

Linear logarithmic order O(n log n) is easy to understand: if code whose time complexity is O(log n) is repeated n times, the overall time complexity is n * O(log n), that is, O(n log n).

Taking the code above with a small modification as an example:

for(m=1; m<n; m++)
{
    i = 1;
    while(i<n)
    {
        i = i * 2;
    }
}
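Classic algorithms of this order include merge sort and heapsort. Below is a minimal merge sort sketch (added for illustration; it assumes the caller supplies a scratch buffer tmp of the same length as the array):

#include <string.h>

/* Merge sort: the array is halved log2(n) times, and each level of
   recursion does O(n) merging work, giving O(n log n) overall. */
void merge_sort(int *a, int *tmp, int lo, int hi)   /* sorts a[lo..hi) */
{
    if (hi - lo < 2)
        return;                         /* 0 or 1 elements: already sorted */
    int mid = lo + (hi - lo) / 2;
    merge_sort(a, tmp, lo, mid);        /* sort the left half  */
    merge_sort(a, tmp, mid, hi);        /* sort the right half */

    /* merge the two sorted halves into tmp, then copy back */
    int i = lo, j = mid, k = lo;
    while (i < mid && j < hi)
        tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
    while (i < mid) tmp[k++] = a[i++];
    while (j < hi)  tmp[k++] = a[j++];
    memcpy(a + lo, tmp + lo, (hi - lo) * sizeof(int));
}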

5. Square order O(n²)

The square order O(n²) is also easy to understand: if O(n) code is nested inside another loop that runs n times, its time complexity is O(n²).

Example:

for(x=1; x<=n; x++)
{
    for(i=1; i<=n; i++)
    {
        j = i;
        j++;
    }
}
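A familiar algorithm of this order is bubble sort, whose nested passes over the array perform on the order of n² comparisons in the worst case (an added sketch, not from the original text):

/* Bubble sort: the outer loop runs n-1 times and the inner loop up to
   n-1 times, so roughly n^2 comparisons -- O(n^2) in the worst case. */
void bubble_sort(int *a, int n)
{
    for (int pass = 0; pass < n - 1; ++pass)
        for (int i = 0; i < n - 1 - pass; ++i)
            if (a[i] > a[i + 1])
            {
                int t = a[i];           /* swap the adjacent out-of-order pair */
                a[i] = a[i + 1];
                a[i + 1] = t;
            }
}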

Summary: O(1) < O(log n) < O(n) < O(n log n) < O(n²)
