Joseph Trettevik

An Introduction to Big O Notation

Big O notation is a big topic, and its universal importance stems from the fact that it describes the efficiency of code written in any programming language. Since this is such a big topic, this post will cover the basics, and in following posts I’ll get into how to recognize the different types of Big O complexity, like O(log n), O(n), O( n^2 ), etc.

Time Complexity vs. Space Complexity

Big O can be used to describe the complexity of a section of code both in terms of runtime and space. The Big O time complexity describes the runtime in the worst-case scenario. So the code may run super fast if the array it's iterating through has a length of 10, but what about an array with a length of a million, or 10 million? Big O space complexity, on the other hand, describes how much memory is required to run a section of code in the worst-case scenario. For example, a for loop that copies an array will take a lot more memory to run than one that simply modifies an existing array.

Time Complexity

Let’s look at two functions to see how Big O describes their runtimes.



const doubleAtIndex = (array, index) => {
     array[index] = array[index] * 2;
}



Because this function only accesses and assigns a value at one location, the runtime will be the same whether the array length is 10 or 10 million. If the runtime is constant regardless of the input, the function is said to have a time complexity of O(1).



const doubleArrayValues = (array) => {
     for(let i = 0; i < array.length; i++) {
          array[i] = array[i] * 2;
     }
}



In this example, the value at every index of the array is getting doubled. Since the number of for loop iterations increases linearly with the length of the array, this code is said to have a runtime complexity of O(n).

Given these two examples, it's clear that the first one, with a time complexity of O(1), will run faster in almost every case. Could you find a specific input for which an O(n) function was faster than the O(1) function? Sure, but generally, as the complexity of a function increases, so does the runtime of the worst-case scenario.
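If you want to see the difference for yourself, you can time both functions with console.time. This is just a rough sketch, not a proper benchmark; the numbers will vary by machine and JavaScript engine, and the array size here is picked purely for illustration:


const bigArray = new Array(10_000_000).fill(1);

console.time("doubleAtIndex - O(1)");
doubleAtIndex(bigArray, 9999999); // touches a single element
console.timeEnd("doubleAtIndex - O(1)");

console.time("doubleArrayValues - O(n)");
doubleArrayValues(bigArray); // touches every element
console.timeEnd("doubleArrayValues - O(n)");


No matter how large you make the array, the first timer stays roughly flat, while the second grows with the array's length.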

Space Complexity

To understand space complexity, let's look at the last example again.



const doubleArrayValues = (array) => {
     for(let i = 0; i < array.length; i++) {
          array[i] = array[i] * 2;
     }
}



Since the array already exists in memory and this function is just updating the values in the array, the function does not use additional memory no matter how big the array is. This means the function has a space complexity of O(1).

However, what if the function made a copy of the array like in this example:



const doubleAndCopyArray = (array) => {
     let newArray = [];
     for(let i = 0; i < array.length; i++) {
          newArray[i] = array[i] * 2;
     }
     return newArray;
}



Now we are using additional memory, and the amount of memory increases linearly as the length of the array increases. This means the function has a space complexity of O(n).

Constants? Who needs 'em?

When determining Big O complexity, remember to drop any constants. Big O is meant to describe the scale at which the complexity of a section of code grows, not give an exact operation count. So the difference between O(n) and O(2n) is small potatoes when compared to the difference between O(n) and O( n^2 ).

So,

  • O(2n) becomes O(n)
  • O(n(n - 1)/2) becomes O( n^2 )
  • O( 2^n - 1 ) becomes O( 2^n )
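To make the first of those concrete, here's a hypothetical function that makes two full passes over an array. It performs roughly 2n operations, but the runtime still grows linearly with the length of the array, so we drop the constant and call it O(n):


const doubleThenIncrement = (array) => {
     // First pass: n iterations.
     for(let i = 0; i < array.length; i++) {
          array[i] = array[i] * 2;
     }
     // Second pass: n more iterations, about 2n in total.
     for(let i = 0; i < array.length; i++) {
          array[i] = array[i] + 1;
     }
}
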

Big Man on Campus

Like constants, any non-dominant terms should be dropped as well. This again comes back to the goal of Big O, which is to describe the scale of complexity, and non-dominant terms don't contribute much at scale. How do we know which term is dominant? Let's look at a graph of the rate of increase of common Big O terms.

Big O Complexity Chart

The steeper the angle of the curve, the more dominant the term.

So,

  • O( n^2 + n ) becomes O( n^2 )
  • O(n + log n) becomes O(n)
  • O(2^n + n^2 + n log n + n + log n) becomes O( 2^n )
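As a concrete example of the first rewrite, here's a hypothetical function whose nested loops run n * n times and whose final loop runs n more times, for n^2 + n operations in total. As n grows, the n^2 term dwarfs the n term, so the time complexity is O( n^2 ):


const sumAllPairs = (array) => {
     let total = 0;
     // Nested loops: n * n = n^2 iterations.
     for(let i = 0; i < array.length; i++) {
          for(let j = 0; j < array.length; j++) {
               total = total + array[i] + array[j];
          }
     }
     // One final pass: n more iterations, dropped as non-dominant.
     for(let i = 0; i < array.length; i++) {
          total = total + array[i];
     }
     return total;
}
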

Conclusion

So here are the main takeaways:

  • Big O notation helps us understand the complexity of code by describing the scale of the worst-case scenario.
  • Big O can describe both the Time Complexity and the Space Complexity.
  • Time Complexity describes the scale of the runtime in the worst case.
  • Space Complexity describes the scale of memory usage in the worst case.
  • Don't forget to drop the constants and the non-dominant terms when reporting the Big O of a section of code.

Song of the Week

Breathe In - Jordy Chandra | Spotify

