
Beginner's guide to finding the time complexity of an algorithm

Mera-Stackhouse ・ 3 min read

What is time complexity?

Time complexity is a way of comparing the abstract amount of time an algorithm takes to execute. It doesn't refer to actual wall-clock duration. It is only useful for comparison, and only when comparing the same discrete operations; not all algorithms count the same operations.

How to calculate Big O?

1. Break the code down to different parts

Here are some pieces we should recognize:

Group A:

  • Assignments, statements, accessing a certain element in an array, a comparison
  • Examples
array[i]
const b = 5
b > 4

Group B:

  • Loop or recursion that runs n number of times
  • Example
for (let i = 0; i < n; i++) {
  // do something
}

Group C:

  • Combining loops that run n times
  • Example
for (let i = 0; i < n; i++) {
  for (let j = 0; j < n; j++) {
    // do something
  }
}

Group D:

  • Other logic that dictates how many times we iterate over elements
  • Example
while (low <= high) {
  mid = Math.floor((low + high) / 2);
  if (target < list[mid]) {
    high = mid - 1;
  } else if (target > list[mid]) {
    low = mid + 1;
  } else {
    break;
  }
}

2. Decide how 'long' it will take to execute each piece

Most code can be broken down into the above pieces, and we can then evaluate each piece to understand how much 'time' that piece of the code will take.

Group A contains the most discrete pieces of code. The number of these specific actions is what we are counting. Each comparison, assignment, or array-index access takes about the same amount of time, which we consider 1 unit of time. Note: for certain algorithms that don't use comparisons or index reads, different actions are considered the baseline.

Group B is a simple loop. A loop like this iterates over the whole array or list of length n. For each element we do whatever the actions are inside the loop.

Group C is combining loops or recursions. Because the loops are nested, the outer loop runs n times, and for each of those runs the inner loop runs another n times: n * n, or n^2, actions in total.
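To see where n^2 comes from, here is a small sketch (the function name is just for illustration) that counts how many times the inner body of a nested loop actually runs:

```javascript
// Count how many times the innermost body of a nested loop executes.
function countNestedSteps(n) {
  let steps = 0;
  for (let i = 0; i < n; i++) {      // outer loop: n iterations
    for (let j = 0; j < n; j++) {    // inner loop: n iterations per outer pass
      steps++;                       // one unit of work per inner iteration
    }
  }
  return steps; // always n * n
}
```

Calling `countNestedSteps(4)` returns 16, i.e. 4^2, which is exactly the n^2 growth Big O captures.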

Group D is logic that will affect the time in relation to the logarithm of the input size. In this example we have a while loop that cuts the input size in half each time it runs.
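The Group D fragment above is the core of binary search. Fleshed out into a complete function (the name and the sorted-input assumption are mine, not from the article), it looks like this:

```javascript
// Binary search over a sorted array. Each pass of the loop halves the
// remaining search range, so it runs at most about log2(n) times.
function binarySearch(list, target) {
  let low = 0;
  let high = list.length - 1;
  while (low <= high) {
    const mid = Math.floor((low + high) / 2);
    if (target < list[mid]) {
      high = mid - 1;   // target is in the lower half
    } else if (target > list[mid]) {
      low = mid + 1;    // target is in the upper half
    } else {
      return mid;       // found: return the index
    }
  }
  return -1;            // not found
}
```

For a 1,000-element list this loop runs at most about 10 times (2^10 = 1024), which is the "much less than the whole list" behavior described below.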

There are plenty of things you can do in your code; you just need to figure out how that logic affects the number of elements you are touching or the number of times you are running over them.

3. Add them all up

After you've figured out the time for each element we simply need to add them all up and slap an O() on them!

Group A are all worth 1 unit of time, so we just use O(1).

Group B takes n units of time, so, you guessed it, it is O(n).

Group C takes n times as long as Group B, which is expressed as O(n^2). The more nested loops you add, the larger the exponent.

Finally, Group D. In our example we see that the while loop actually halves the size of the list we're searching each time it runs. This run time will be much less than running over the whole list. Repeatedly dividing the work like this is expressed as O(log n).

4. Don't sweat the small stuff

One of the nice things about Big O is that it isn't concerned with the details. Big O is a comparison of time as n moves towards infinity, where certain elements become trivial. Don't worry about O(1) parts, and don't worry about constants. If there are higher-order terms of n, you can drop the lesser ones as well. 5n^2 + n + 10 becomes O(n^2). As n gets larger, the highest-order term has the greatest impact on the order of magnitude of the whole expression.
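Putting the whole process together, here is a hypothetical function (the name and body are invented for illustration) that mixes the groups; counting its pieces gives roughly n^2 + n + 1 units, which simplifies to O(n^2):

```javascript
// A function mixing the groups: one assignment (Group A, 1 unit),
// one simple loop (Group B, n units), one nested loop (Group C, n^2 units).
function pairAndSum(array) {
  const n = array.length;

  let total = 0;                 // Group A: one assignment -> 1 unit
  for (let i = 0; i < n; i++) {
    total += array[i];           // Group B: runs n times -> n units
  }

  let pairs = 0;
  for (let i = 0; i < n; i++) {
    for (let j = 0; j < n; j++) {
      pairs++;                   // Group C: runs n * n times -> n^2 units
    }
  }

  return [total, pairs];
}
```

For a 3-element array the counts are 1 + 3 + 9 units; as the array grows, the n^2 part dominates, so the whole function is O(n^2).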

