Discussion on: What exactly is big O, big Θ, big Ω ?

Ben Lovy • Edited

EDIT: Please see @codemouse92's response on this thread for a more thorough and correct explanation!

These terms are ways of describing an algorithm's runtime as a function of its input size - its time complexity. If your input size is n, saying O(n) means you expect the runtime of the operation to scale linearly with the input: if you have four items, it will take twice as long as with two. A worse runtime is O(n^2) - this means your runtime grows quadratically as your input grows; you've probably got a nested loop or something. A Big-O of 1 - O(1) - is called constant time: the runtime of the operation doesn't depend on the size of the input at all.
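
For instance, here's a minimal sketch of what each class tends to look like in code (the function names are just illustrative, not from any library):

```python
def constant_lookup(items):
    # O(1): one step, regardless of how many items there are
    return items[0]

def linear_scan(items, target):
    # O(n): in the worst case, visits every item exactly once
    for item in items:
        if item == target:
            return True
    return False

def all_pairs(items):
    # O(n^2): a nested loop over the same input - n * n iterations
    pairs = []
    for a in items:
        for b in items:
            pairs.append((a, b))
    return pairs
```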

Big-O describes the worst case of an algorithm - your algorithm will take at most that long, even if the data arrives in the least favorable state possible. Conversely, Big-Ω is the best case - or, in other words, the lower bound: for a given input size, you can expect the algorithm to take at least that long. If your set is already sorted at input, how long does it take your sorting algorithm to notice this and exit?
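
As a concrete sketch of that sorted-input question, here's a hypothetical bubble sort with an early-exit check: on an already-sorted list it makes one pass, sees no swaps, and stops (the Ω(n) best case), while a reverse-sorted list forces the full O(n^2) worst case:

```python
def bubble_sort(items):
    items = list(items)  # sort a copy
    for end in range(len(items) - 1, 0, -1):
        swapped = False
        for i in range(end):
            if items[i] > items[i + 1]:
                items[i], items[i + 1] = items[i + 1], items[i]
                swapped = True
        if not swapped:
            # no swaps on this pass: the list is already sorted, so exit early
            break
    return items

print(bubble_sort([1, 2, 3, 4]))  # best case: one pass, no swaps
print(bubble_sort([4, 3, 2, 1]))  # worst case: every pass swaps
```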

Big-Θ is in the middle. It describes the expected average runtime. It gets more accurate as your input size increases, but for any workload you can state that you expect the runtime to fall between a lower bound and an upper bound. A solid quote from Khan Academy:

> When we say that a particular running time is Θ(n), we're saying that once n gets large enough, the running time is at least k1 * n and at most k2 * n, for some constants k1 and k2.
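
Written out symbolically (my paraphrase of that quote, using n0 as the threshold for "large enough"):

```latex
% T(n) = \Theta(n) means: there exist constants k_1, k_2 > 0 and a
% threshold n_0 such that the running time is pinned between two lines.
k_1 \cdot n \;\le\; T(n) \;\le\; k_2 \cdot n \qquad \text{for all } n \ge n_0
```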

Jason C. McDonald (@codemouse92) • Edited

Close, but not quite. (I'll admit, Khan Academy's explanation is more than a little vague at times.) Big-O, big-Θ, and big-Ω can all be used to measure the running time of the worst case. The difference is which bounds each notation defines.
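
For reference while reading that answer, here's a sketch of the standard textbook definitions - each notation fixes a bound, and any of them can be applied to the best, average, or worst case:

```latex
% Upper bound: T(n) eventually sits below some constant multiple of g(n).
T(n) \in O(g(n)) \iff \exists\, c > 0,\ n_0 : T(n) \le c \cdot g(n) \text{ for all } n \ge n_0

% Lower bound: T(n) eventually sits above some constant multiple of g(n).
T(n) \in \Omega(g(n)) \iff \exists\, c > 0,\ n_0 : T(n) \ge c \cdot g(n) \text{ for all } n \ge n_0

% Tight bound: both at once.
T(n) \in \Theta(g(n)) \iff T(n) \in O(g(n)) \text{ and } T(n) \in \Omega(g(n))
```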

I'm putting my entire answer in a separate comment.