
Prince M

Asymptotic Notations: A Comprehensive Guide

Asymptotic notation is a powerful tool used to analyze algorithms and functions. It provides a standardized and abstract way of describing the growth rates of functions as the input size increases. We can compare and classify algorithms based on their efficiency and scalability with asymptotic notation.

This article explores the different types of asymptotic notation, including Big O (𝑂), Big Omega (Ω), and Big Theta (Θ), and their mathematical definitions. We will also delve into Little o (o) and Little Omega (ω) notations and their significance in analyzing upper and lower bounds.

We aim to provide a comprehensive understanding of asymptotic notation through examples and explanations.

What is Asymptotic Notation?

Asymptotic notation is a way to describe the performance of algorithms in terms of how their runtimes grow as the input size increases.

This is helpful when we want to compare different algorithms and determine which will be faster for large inputs.

Why Do We Need Asymptotic Notation?

Asymptotic notation is essential because it allows us to analyze and compare the efficiency of algorithms in a standardized and abstract way.

For example, suppose we write an algorithm, and when we plot its runtime against the input size, the graph looks as follows:

[Figure: runtime plot of the algorithm]

By looking at the graph, we cannot say if the algorithm's time complexity is linear or quadratic.
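One way to tell them apart empirically (a minimal sketch in Python; mystery_runtime below is a hypothetical stand-in for the plotted algorithm, not code from this article): double the input size and watch the ratio of runtimes. A linear algorithm roughly doubles; a quadratic one roughly quadruples.

# Sketch: estimate the growth order empirically by doubling the input size.
# `mystery_runtime` is a hypothetical stand-in for the algorithm we plotted.
def mystery_runtime(n: int) -> int:
    return 3 * n**2 + 2 * n + 1  # pretend we only observe the values, not the formula

for n in [1_000, 2_000, 4_000, 8_000]:
    ratio = mystery_runtime(2 * n) / mystery_runtime(n)
    print(f"n={n:>5}: T(2n)/T(n) = {ratio:.2f}")  # ~2 suggests linear, ~4 suggests quadratic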

By the way, if you are unfamiliar with time complexity, I suggest reading my article on Big O Notation.

If we try to derive the exact mathematical function describing the algorithm's runtime, we end up with an unwieldy expression that is difficult to study and explain.

So, instead of calculating the exact function, we use an asymptotic estimate (asymptotic notation).

There are five types of asymptotic notation: 𝑂 (Big O), Ω (Big Omega), Θ (Big Theta), o (Little o), and ω (Little Omega).

1. 𝑂 (Big O) Notation

The O-notation provides an upper bound on the asymptotic behavior of a function, indicating that the function does not grow faster than a certain rate determined by its highest-order term.

For example, take the function 3n^2 + 2n + 1. As you know, we ignore all non-dominant terms when analyzing complexity.

By the way, if you are unfamiliar with the rules for this calculation, I suggest reading my article on Big O Notation.

The highest-order term in this function is 3n^2, so we say that its rate of growth is n^2.

However, it might surprise you that we can also express it as O(n^3). Why? Because n^2 grows more slowly than n^3, it is still correct (though less precise) to state that the function does not exceed the growth rate of n^3.

In fact, this function is also O(n^4), O(n^5), and so on. Generally, we can express it as O(n^c) for any constant c >= 2.

[Figure: Big O notation]

Mathematically

Given two functions, f(n) and g(n), we say that f(n) is O(g(n)) if there exist positive constants c and n₀ such that:

0 ≤ f(n) ≤ c*g(n) for all n ≥ n₀

Before proving this claim, let's clarify the notation used in the definition above.

f(n)

This is the function representing the actual complexity of an algorithm, typically expressed as a function of the size of the input n.

For example, for the algorithm that checks whether a number is prime, f(n) could be the number of divisions the algorithm has to perform.
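As a concrete sketch (my own illustration, assuming a simple trial-division primality test; the article does not specify the algorithm), f(n) here is the count of division operations performed:

# Trial-division primality test that counts its division (modulo) operations.
def is_prime(n: int):
    divisions = 0
    if n < 2:
        return False, divisions
    d = 2
    while d * d <= n:          # only need to test divisors up to sqrt(n)
        divisions += 1
        if n % d == 0:
            return False, divisions
        d += 1
    return True, divisions

print(is_prime(97))  # (True, 8): roughly sqrt(n) divisions in the worst case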

g(n)

This is the function that we are using to compare with f(n). It's a simplification of the actual complexity that gives us a general idea of how the complexity grows with respect to the input size n.

For example, we might say that the complexity of sorting n items is O(n log n), where g(n) is n log n.

n₀

This is a constant representing the point from which the given condition (whether it's for Big O, Big Omega, or Big Theta) holds true.

This is typically used because, for smaller n, certain irregular behaviors might be observed that don't represent the general trend of the function.

For example, Car A has a speed of 50 mph and takes 6 hours to cover a distance of 300 miles. Car B has a speed of 60 mph and takes 5 hours to cover the same 300 miles. That is only a one-hour difference, but over 30,000 miles, Car A takes 600 hours while Car B takes 500. Notice how the gap widens as the input grows.

c

This is a positive constant multiplier. Depending on the context (Big O, Big Omega, Big Theta), we use this to bound the function f(n) by multiplying it with g(n).

For Big O, we're saying f(n) is no more than c times g(n) for sufficiently large n.

For Big Omega, we're saying f(n) is at least c times g(n) for sufficiently large n.

In the case of Big Theta, we use two constants, c1 and c2, to say that f(n) is bounded below by c1 times g(n) and above by c2 times g(n) for sufficiently large n.

Let's prove the claim for f(n) = 3n^2 + 2n + 1 and g(n) = n^2, choosing c = 6 and n₀ = 1:



0 ≤ f(n) ≤ c*g(n)

0 ≤ 3n^2 + 2n + 1 ≤ 6n^2



We can verify the inequality as follows:



0 ≤ 3n^2 + 2n + 1 ≤ 6n^2
0 ≤ 3n^2 + 2n + 1 ≤ 3n^2 + 3n^2 (since 2n + 1 ≤ 3n^2 for n ≥ 1)
0 ≤ 3n^2 + 2n + 1 ≤ 6n^2



Since the inequality holds for all n ≥ 1, we have successfully proven that f(n) is O(n^2).
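As a quick numeric sanity check (a sketch only; a finite check over an arbitrary range cannot replace the proof), we can verify the inequality for many values of n:

# Numeric spot-check of the Big O proof: f(n) <= c*g(n) with c = 6, n0 = 1.
f = lambda n: 3 * n**2 + 2 * n + 1
g = lambda n: n**2
c, n0 = 6, 1

assert all(0 <= f(n) <= c * g(n) for n in range(n0, 10_000))
print("f(n) <= 6*n^2 holds for every n tested")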

2. Ω(Omega) Notation

The Ω notation represents a lower bound on the asymptotic behavior of a function, indicating that the function grows at least as fast as a certain rate determined by its highest-order term.

For example, consider the function 3n^2 + 2n + 1. Since the highest-order term in this function is 3n^2, it grows at least as fast as n^2.

Therefore, we express its growth rate as Ω(n^2). Furthermore, this function is also Ω(n). We can generally express it as Ω(n^c) for any constant c <= 2.

[Figure: Big Omega notation]

Mathematically

Given two functions, f(n) and g(n), we say that "f(n) is Ω(g(n))" if there exist positive constants c and n₀ such that:

0 ≤ c*g(n) ≤ f(n) for all n ≥ n₀

Let's use the same function f(n) = 3n^2 + 2n + 1 and claim that f(n) is Ω(n^2).

We need to find constants c and n₀ that satisfy the inequality. Choosing c = 1 and n₀ = 1, we need to show that for all n greater than or equal to 1, the inequality holds:



0 ≤ c*g(n) ≤ f(n)

0 ≤ n^2 ≤ 3n^2 + 2n + 1 (since n^2 ≤ 3n^2 and 2n + 1 ≥ 0 for n ≥ 1)



Since the inequality holds for all n ≥ 1, we have successfully proven that f(n) is Ω(n^2).
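Again, a quick numeric spot-check (an illustration only, mirroring the Big O check above):

# Numeric spot-check of the Omega proof: c*g(n) <= f(n) with c = 1, n0 = 1.
f = lambda n: 3 * n**2 + 2 * n + 1
g = lambda n: n**2
c, n0 = 1, 1

assert all(0 <= c * g(n) <= f(n) for n in range(n0, 10_000))
print("n^2 <= f(n) holds for every n tested")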

3. Θ(Theta) Notation

The Theta (Θ)-notation represents a tight bound on the asymptotic behavior of a function. It specifies that a function grows exactly at a certain rate based on its highest-order term.

In other words, Theta-notation captures the growth rate of a function within a constant factor from above and below.

It's important to note that these constant factors may not be equal. If a function is both O(g(n)) and Ω(g(n)) for some function g(n), we can conclude that it is Θ(g(n)).

For example, let's consider the function 3n^2 + 2n + 1. Since this function is both O(n^2) and Ω(n^2), it can also be denoted as Θ(n^2). This means that the growth rate of the function matches the growth rate of n^2 within a constant factor from above and below.

[Figure: Big Theta notation]

Mathematically

Given two functions, f(n) and g(n), we say that "f(n) is Θ(g(n))" if there exist positive constants c1, c2, and n₀ such that:

0 ≤ c1*g(n) ≤ f(n) ≤ c2*g(n) for all n ≥ n₀

We previously chose c1 = 1, c2 = 6, and n₀ = 1. Now, we need to show that for all n greater than or equal to 1, the inequality holds:



0 ≤ c1*g(n) ≤ f(n) ≤ c2*g(n)

0 ≤ n^2 ≤ 3n^2 + 2n + 1 ≤ 6n^2
0 ≤ n^2 ≤ 3n^2 + 2n + 1 ≤ 3n^2 + 3n^2 (since n^2 ≤ 3n^2 and 2n + 1 ≤ 3n^2 for n ≥ 1)
0 ≤ n^2 ≤ 3n^2 + 2n + 1 ≤ 6n^2



Since the inequality holds for all n ≥ 1, we have successfully proven that f(n) = 3n^2 + 2n + 1 is Θ(n^2) using the chosen constants c1 = 1, c2 = 6, and n₀ = 1.

This shows that the function f(n) has an upper bound and a lower bound that are multiples of n^2, confirming that it grows at a similar rate to n^2 within a constant factor for all n greater than or equal to n₀. Hence, f(n) is Θ(n^2).
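The Theta claim simply combines the two earlier checks (again, a finite spot-check for illustration only):

# Numeric spot-check of the Theta proof: c1*g(n) <= f(n) <= c2*g(n).
f = lambda n: 3 * n**2 + 2 * n + 1
g = lambda n: n**2
c1, c2, n0 = 1, 6, 1

assert all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
print("n^2 <= f(n) <= 6*n^2 holds for every n tested")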

4. o (Little-o) notation

The asymptotic upper bound given by the O-notation may or may not accurately represent the actual growth rate of a function. When the bound is tight and matches the growth rate precisely, it is referred to as asymptotically tight.

For example, the bound 2n^2 ∈ O(n^2) is asymptotically tight. However, in cases where the bound does not precisely match the growth rate, we use the o-notation to indicate an upper bound that is not asymptotically tight.

[Figure: Little-o notation]

Mathematically

Given two functions, f(n) and g(n), we say that "f(n) is o(g(n))" if for any positive constant c, there exists a constant n₀ such that:

0 ≤ f(n) < c * g(n) for all n ≥ n₀

To prove this, let's consider the function f(n) = 3n^2 + 2n + 1 and the function g(n) = n^3. We want to show that f(n) is o(g(n)).

To do this, let's examine the following inequality for any positive constant c:



0 ≤ 3n^2 + 2n + 1 < c*n^3



Unlike with Big O, we cannot pick a single convenient value of c: the inequality must hold for every positive constant c, no matter how small. So, given an arbitrary c > 0, we need to find an n₀ such that the inequality holds for all n ≥ n₀.

From the Big O proof, we know that 3n^2 + 2n + 1 ≤ 6n^2 for all n ≥ 1. It therefore suffices to have 6n^2 < c*n^3, which holds whenever n > 6/c. Choosing n₀ = max(1, 6/c + 1) gives:

0 ≤ 3n^2 + 2n + 1 ≤ 6n^2 < c*n^3 for all n ≥ n₀



We can conclude that f(n) = 3n^2 + 2n + 1 is o(n^3) since it grows strictly slower than g(n) = n^3.
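An equivalent intuition (my own framing, consistent with the definition above): f(n) is o(g(n)) exactly when the ratio f(n)/g(n) tends to 0 as n grows. A short sketch makes this visible:

# Little-o intuition: f(n)/g(n) shrinks toward 0 as n grows.
f = lambda n: 3 * n**2 + 2 * n + 1
g = lambda n: n**3

for n in [10, 100, 1_000, 10_000]:
    print(f"n={n:>6}: f(n)/g(n) = {f(n) / g(n):.6f}")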

5. ω (Little Omega) Notation

By analogy, the ω-notation corresponds to the lower bound provided by the Ω-notation, just as the o-notation corresponds to the upper bound provided by the O-notation.

Just as o-notation denotes an upper bound that is not asymptotically tight, we use the ω-notation to represent a lower bound that is not asymptotically tight.

[Figure: Little Omega notation]

Mathematically

Given two functions, f(n) and g(n), we say that "f(n) is ω(g(n))" if for any positive constant c, there exists a constant n₀ such that:

0 ≤ c * g(n) < f(n) for all n ≥ n₀

To prove this, let's continue with f(n) = 3n^2 + 2n + 1, but this time take g(n) = n. We want to show that f(n) is ω(n). (Note that f(n) is not ω(n^2): it is Θ(n^2), so the lower bound n^2 is asymptotically tight.)

As with Little o, the inequality must hold for every positive constant c. Given any c > 0, we have c*n ≤ 3n^2 whenever n ≥ c/3, and since 2n + 1 > 0, choosing n₀ = max(1, c/3) gives:

0 ≤ c*n < 3n^2 + 2n + 1 for all n ≥ n₀



We have proven that f(n) = 3n^2 + 2n + 1 is ω(n) since it grows strictly faster than g(n) = n.
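Mirroring the Little-o sketch (again my own framing): f(n) is ω(g(n)) exactly when f(n)/g(n) grows without bound:

# Little-omega intuition: f(n)/g(n) grows without bound as n increases.
f = lambda n: 3 * n**2 + 2 * n + 1
g = lambda n: n  # note g(n) = n here; f(n) is NOT omega(n^2)

for n in [10, 100, 1_000, 10_000]:
    print(f"n={n:>6}: f(n)/g(n) = {f(n) / g(n):.1f}")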

Conclusion

Asymptotic notation is a common language for algorithm analysis, aiding in algorithm design, optimization, and comparison. It empowers us to make informed decisions about algorithm selection and enables us to predict how algorithms will perform as the input size increases.

Overall, asymptotic notation plays a vital role in computer science and algorithm analysis, providing a solid foundation for understanding and evaluating the performance characteristics of algorithms and functions.
