
Shailesh Kumar


Time Complexity & Asymptotic Notations

Time Complexity

Let us understand time complexity with an example.
Say you have two computers: one is old, with a low processing speed, and the other is brand new, with a very high processing speed. There are one million elements in an array, we want to run a linear search on this array, and the element we are searching for is not present in the array. The old computer takes 10 seconds to run the algorithm, while the brand new computer takes 1 second.

|            | Old Computer       | Brand New Computer |
|------------|--------------------|--------------------|
| Data       | 1 million elements | 1 million elements |
| Algorithm  | Linear Search      | Linear Search      |
| Target     | Does not exist     | Does not exist     |
| Time Taken | 10 seconds         | 1 second           |

Now, according to you, which of the two machines has the greater time complexity?
The answer is that both machines have the same time complexity.
That's because Time Complexity != Time Taken.
If you plot a graph for each machine, with input size on the x-axis and time on the y-axis, you'll see that even though the two lines have different slopes, both are straight lines, which means the time complexity of both machines is the same (i.e. O(N), or linear time).
Therefore, we can say that time complexity is a function that gives the relationship between input size and running time: it tells us how the time will grow as the input grows.
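To make the example concrete, here is a minimal sketch of linear search in Python (the function name and test data are illustrative). In the worst case, when the target is absent, it inspects all N elements, which is exactly why the time grows linearly with the input:

```python
def linear_search(arr, target):
    # Scan every element until we find the target.
    for i, value in enumerate(arr):
        if value == target:
            return i  # found: return its index
    return -1  # target does not exist in the array

# Worst case: the target is absent, so all N elements are checked.
data = list(range(1_000_000))        # 1 million elements
print(linear_search(data, -1))       # -1, after N comparisons -> O(N)
```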

Some Points to remember while analyzing time complexity:

  • Always look at the worst-case complexity.
  • Always consider the complexity for large inputs.
  • Ignore all constants, because we need the relationship, not the actual value of the time taken (see the sketch after this list).
  • Ignore the less dominant terms.
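As a quick illustration of dropping constants (a sketch with hypothetical functions): both loops below touch all N elements; one does roughly N operations and the other roughly 2N, yet both are O(N), because a constant factor doesn't change how time grows with N.

```python
def sum_once(arr):
    # ~N operations -> O(N)
    total = 0
    for x in arr:
        total += x
    return total

def sum_and_count(arr):
    # ~2N operations (two updates per element) -> still O(N)
    total, count = 0, 0
    for x in arr:
        total += x
        count += 1
    return total, count
```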

Why should we ignore the less dominant terms?
Let's answer this question with an example.
Say we have a time complexity of O(N^3 + logN), and here N = 1 million.
Now, if we put the value of N into the time complexity:

=> O( (1M)^3 + log(1M) )
Taking log base 10, log(1 million) = 6 (with base 2 it is about 20; either way the term is tiny), therefore,
=> O( (1M)^3 + 6 )

Here you can see that the value of log(N) is almost negligible compared to N^3. Therefore, we ignore the less dominant terms.
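You can check this arithmetic directly; a quick sketch:

```python
import math

n = 10**6
cube = n**3               # (1 million)^3 = 1e18
log_term = math.log10(n)  # log base 10 of 1 million = 6.0

print(cube)             # 1000000000000000000
print(log_term)         # 6.0
print(log_term / cube)  # ~6e-18: the log term is negligible next to N^3
```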

Asymptotic Notations:

Big-Oh Notation:

Symbol: O()

Definition:
It gives an upper bound on the growth of a function.
This means that if an algorithm has a time complexity of O(N^3), its running time may grow as slowly as O(1) (constant time), O(logN), and so on, but it will never grow faster than N^3.

Mathematics:

O(g(n)) = { f(n): there exist positive constants c and n0
            such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n0 }

Graph:
[Image: Graph of Big-Oh Notation]
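To see the definition in action, here is a small sketch (the function f(n) = 3n² + 5n and the witnesses c = 4, n0 = 5 are chosen purely for illustration): since 5n ≤ n² for all n ≥ 5, we get 3n² + 5n ≤ 4n², so f(n) is O(n²).

```python
def f(n):
    return 3 * n**2 + 5 * n  # example running-time function

def g(n):
    return n**2

c, n0 = 4, 5  # witnesses for the Big-O definition
# Check 0 <= f(n) <= c*g(n) for a range of n >= n0
assert all(0 <= f(n) <= c * g(n) for n in range(n0, 10_000))
print("f(n) = 3n^2 + 5n is O(n^2) with c=4, n0=5")
```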

Big-Omega Notation:

Symbol: Ω()

Definition:
It gives a lower bound on the growth of a function.
This means that if an algorithm has a time complexity of Ω(N^2), its running time grows at least as fast as N^2. It may grow like N^3, or even 2^N, but it will never grow slower than N^2.

Mathematics:

Ω(g(n)) = { f(n): there exist positive constants c and n0 
            such that 0 ≤ cg(n) ≤ f(n) for all n ≥ n0 }

Graph:
[Image: Graph of Big-Omega Notation]
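The same illustrative function from the Big-O sketch also has a matching lower bound: since 5n ≥ 0, we have 3n² ≤ 3n² + 5n for all n ≥ 1, so c = 3 and n0 = 1 witness that f(n) is Ω(n²).

```python
def f(n):
    return 3 * n**2 + 5 * n

def g(n):
    return n**2

c, n0 = 3, 1  # witnesses for the Big-Omega definition
# Check 0 <= c*g(n) <= f(n) for a range of n >= n0
assert all(0 <= c * g(n) <= f(n) for n in range(n0, 10_000))
print("f(n) = 3n^2 + 5n is Omega(n^2) with c=3, n0=1")
```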

Big-Theta Notation:

Symbol: Θ()

Definition:
It gives a tight bound on the growth of a function.
This means that if an algorithm has a time complexity of Θ(N^2), then its lower and upper bounds are Ω(N^2) and O(N^2) respectively.

Mathematics:

Θ(g(n)) = { f(n): there exist positive constants c1, c2 and n0
            such that 0 ≤ c1g(n) ≤ f(n) ≤ c2g(n) for all n ≥ n0 }

Graph:
[Image: Graph of Big-Theta Notation]
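Combining the two previous sketches gives a Θ bound for the same illustrative function: with c1 = 3, c2 = 4, and n0 = 5, we have 3n² ≤ 3n² + 5n ≤ 4n² for all n ≥ 5, so f(n) is Θ(n²).

```python
def f(n):
    return 3 * n**2 + 5 * n

def g(n):
    return n**2

c1, c2, n0 = 3, 4, 5  # witnesses for the Big-Theta definition
# Check 0 <= c1*g(n) <= f(n) <= c2*g(n) for a range of n >= n0
assert all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
print("f(n) = 3n^2 + 5n is Theta(n^2) with c1=3, c2=4, n0=5")
```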
