
Jonathan Thomas

My recall of Big O and how understanding it will make all development easier.

Since my last post, I've been working on and off to get back into the flow of coding.

Recently I dove back into the Data Structures and Algorithms in Javascript Masterclass by Colt Steele. So far the concepts are starting to make sense, but to really reinforce the ideas some recall is needed, so here it goes.

In computer science, Big O is a measurement used to describe the efficiency of algorithms, i.e. how their running time or memory use grows as the input grows. There are best, worst and average cases that can come about depending on the data structures, algorithms and inputs involved.

Common runtimes include O(1), O(N), O(log N), O(N log N), O(N^2) and O(2^N).
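
To make those runtimes a little more concrete, here's a quick sketch I put together (my own examples, not from the course) of functions that grow at some of these rates:

```javascript
// O(N): one pass over the array, so the work grows linearly with the input size.
function linearSearch(arr, target) {
  for (let i = 0; i < arr.length; i++) {
    if (arr[i] === target) return i;
  }
  return -1;
}

// O(log N): binary search halves the remaining range each step (the array must already be sorted).
function binarySearch(sortedArr, target) {
  let low = 0;
  let high = sortedArr.length - 1;
  while (low <= high) {
    const mid = Math.floor((low + high) / 2);
    if (sortedArr[mid] === target) return mid;
    if (sortedArr[mid] < target) low = mid + 1;
    else high = mid - 1;
  }
  return -1;
}

// O(N^2): nested loops over the same array, so the work grows with the square of the input size.
function hasDuplicate(arr) {
  for (let i = 0; i < arr.length; i++) {
    for (let j = i + 1; j < arr.length; j++) {
      if (arr[i] === arr[j]) return true;
    }
  }
  return false;
}
```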

Runtimes are determined by counting the number of operations (and the amount of memory) a program uses as its input grows; O(1), a constant, is one example.

Ideally, taking runtimes into account could help when deploying a program. In the case of JavaScript in the browser, the actual run times can vary depending on the browsers themselves.

Memory usage, also known as space complexity, can also be described with big O, since the computer has to reserve memory to carry out operations.
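
As a rough sketch (again my own example), here are two ways to sum an array: one keeps O(1) extra space, the other builds a second array and so uses O(N) extra space:

```javascript
// O(1) space: only a single accumulator variable is kept, no matter how large arr is.
function sumInPlace(arr) {
  let total = 0;
  for (const n of arr) total += n;
  return total;
}

// O(N) space: builds a new array of running totals, so extra memory grows with the input.
function runningTotals(arr) {
  const totals = [];
  let sum = 0;
  for (const n of arr) {
    sum += n;
    totals.push(sum);
  }
  return totals;
}
```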

Best and worst case scenarios are determined by what the algorithm is doing and the input it gets. For example, running bubble sort over an array of nearly sorted items can be O(N), but in most cases the items are far from sorted, so we're left with O(N^2), since the algorithm has to repeatedly pass through the array and compare each element that isn't sorted yet.
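
Here's a rough bubble sort sketch (my own version, not taken from the course) with an early-exit check: if a full pass makes no swaps, the array is already sorted, which is what gives the O(N) best case on nearly sorted input, while the nested passes are what make the general case O(N^2):

```javascript
function bubbleSort(arr) {
  for (let i = arr.length - 1; i > 0; i--) {
    let swapped = false;
    for (let j = 0; j < i; j++) {
      if (arr[j] > arr[j + 1]) {
        // Swap adjacent elements that are out of order.
        [arr[j], arr[j + 1]] = [arr[j + 1], arr[j]];
        swapped = true;
      }
    }
    // Early exit: no swaps in this pass means the array is sorted,
    // so a nearly sorted input finishes in roughly one pass (O(N) best case).
    if (!swapped) break;
  }
  return arr;
}
```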

Anyway, this was my first foray into understanding big O theory, and constructive criticism and guidance is greatly appreciated.

If there is anything that I'm missing or could elaborate on, please let me know.

Thanks again,
Jonathan Thomas, future Software Engineer

Top comments (2)

Curtis Fenner

To be precise:
Big-O notation doesn't have anything inherently to do with program performance, runtime, memory usage, whatever.

It also doesn't have anything to do with "best case", "worst case", "average case", etc.

Big-O notation describes functions -- mathematical operations that take in numbers and spit out numbers.

It turns out this is really useful when we're talking about computer programs, but the concepts themselves are orthogonal. You can talk about programs without using big-O notation by saying specifically what you are counting (e.g., most possible, least possible, "average" (after fixing a distribution of inputs!) taken of nanoseconds, clock cycles, instructions executed, etc).

Jonathan Thomas

Thanks for the information, I'll try to be more precise in the future :)