Quick sort is a comparison-based sorting algorithm with great performance and time complexity considering how simple it is to implement. I personally prefer the recursive implementation of quick sort, which we will review in this post. Quick sort resembles merge sort to some degree, as both use a divide-and-conquer approach: the array is repeatedly split into two smaller parts, each part is sorted individually, and the results are joined back together in sorted order.

## Implementation

Below we can see an example implementation of quick sort using JavaScript.

```javascript
function quickSort(array) {
  if (array.length <= 1) return array;

  const pivot = array[0];
  const left = [];
  const right = [];

  for (const item of array.slice(1)) {
    if (item < pivot) {
      left.push(item);
    } else {
      right.push(item);
    }
  }

  return [...quickSort(left), pivot, ...quickSort(right)];
}
```

Quick sort is my go-to algorithm most of the time when I have to implement something custom, simply because of how simple, efficient and, pardon the pun, quick it is. Note that the implementation is recursive, so be aware of memory consumption and recursion depth on larger datasets.
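If that memory consumption is a concern, quick sort can also be written in place, swapping items within the original array instead of allocating new `left` and `right` arrays on every call. Below is a minimal sketch using the Lomuto partition scheme; the names `quickSortInPlace` and `partition` are my own, not part of the implementation above.

```javascript
// In-place quick sort using the Lomuto partition scheme.
// Sorts array between indices lo and hi without allocating
// new arrays on each recursive call.
function quickSortInPlace(array, lo = 0, hi = array.length - 1) {
  if (lo >= hi) return array;
  const p = partition(array, lo, hi);
  quickSortInPlace(array, lo, p - 1);
  quickSortInPlace(array, p + 1, hi);
  return array;
}

function partition(array, lo, hi) {
  const pivot = array[hi]; // last element as the pivot
  let i = lo;              // boundary of the "smaller than pivot" region
  for (let j = lo; j < hi; j++) {
    if (array[j] < pivot) {
      [array[i], array[j]] = [array[j], array[i]];
      i++;
    }
  }
  // Place the pivot between the two regions.
  [array[i], array[hi]] = [array[hi], array[i]];
  return i;
}
```

The trade-off is that the input array is mutated, whereas the version above returns a new sorted array and leaves the input untouched.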

In general, we start with a circuit-breaker condition: if the input array has 1 or fewer items, we just return it. Otherwise, we take the first item as the `pivot`, and for each item from the second to the last, we check whether the `item` is smaller than the `pivot`; if it is, we move it to the `left` array, otherwise to the `right` array. Finally, we return a new array where we use recursion to sort the `left` and `right` arrays, placing the pivot in the middle.
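Putting the steps together, here is a quick usage check (the implementation is repeated in compact form so the snippet runs on its own):

```javascript
function quickSort(array) {
  if (array.length <= 1) return array;
  const pivot = array[0];
  const left = [];
  const right = [];
  for (const item of array.slice(1)) {
    // Each item goes to one side of the pivot; duplicates end up on the right.
    (item < pivot ? left : right).push(item);
  }
  return [...quickSort(left), pivot, ...quickSort(right)];
}

console.log(quickSort([7, 2, 9, 4, 4, 1])); // → [1, 2, 4, 4, 7, 9]
```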

## Use Case and Performance

Quick sort has a great average-case time complexity of `O(n log n)`, also known as linearithmic time, which is the best possible for a comparison-based sorting algorithm. In the worst case the algorithm runs in `O(n²)`, also known as quadratic time, but this occurs rarely in practice; one example is an already sorted input when the first element is always chosen as the pivot.
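To see why sorted input is the worst case for a first-element pivot, note that every partition then puts all remaining items on one side, so the recursion degrades to `n` levels. A small sketch that counts comparisons illustrates this; the `countComparisons` helper is my own, written just for this demonstration.

```javascript
// Count the comparisons made by the first-element-pivot quick sort.
function countComparisons(array) {
  if (array.length <= 1) return 0;
  const pivot = array[0];
  const left = [];
  const right = [];
  let count = 0;
  for (const item of array.slice(1)) {
    count++; // one comparison against the pivot
    (item < pivot ? left : right).push(item);
  }
  return count + countComparisons(left) + countComparisons(right);
}

// Already sorted input hits the worst case: n(n-1)/2 comparisons.
console.log(countComparisons([1, 2, 3, 4, 5, 6, 7, 8])); // → 28
```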

Let's look at some example average runtimes for given input sizes (using a base-10 logarithm to keep the numbers simple):

Input size | Time complexity (Big O) |
---|---|
10 | O(10 log 10) = O(10) |
100 | O(100 log 100) = O(200) |
1000 | O(1,000 log 1,000) = O(3,000) |
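The base of the logarithm only changes a constant factor, so it doesn't affect the Big O class. A quick sketch comparing the growth of `n log n` against the quadratic `n²` of the simpler sorts, here using a base-2 logarithm:

```javascript
// Rough growth of n log n vs n² for a few input sizes (base-2 log).
for (const n of [10, 100, 1000]) {
  console.log(n, Math.round(n * Math.log2(n)), n * n);
}
// The gap widens quickly: at n = 1000, n log n is roughly
// 10,000 operations while n² is 1,000,000.
```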

Compared to the bubble sort, selection sort and insertion sort algorithms we have covered so far in this series, these performance characteristics are fantastic. Quick sort works well for datasets of most sizes, which makes it a great utility to have under your belt and, as I mentioned above, it is generally my go-to algorithm when I *need* to implement something custom.
