Rajat M

Originally published at Medium

Implement 5 Sorting Algorithms using JavaScript.

Most languages have a built-in method for sorting data. The common tendency among developers, especially those just beginning their journey, is to reach for this method and avoid writing their own implementation. But this can have unforeseen performance repercussions, so it is better to choose the sorting technique that is best suited for your current requirement.

The first 3 sorting algorithms that I cover in this article have an average time complexity of O(n²). They are fairly popular, and their approach to sorting data is much more intuitive.

The other 2 have an average time complexity of O(n*log n), and can be a bit tricky to comprehend if you have no prior knowledge of recursion. So, I would suggest that you go through this article to understand how recursion works.

In the following sections, I give a brief explanation of how each algorithm goes about sorting data. Then, I provide some pseudocode in case you wish to try implementing the algorithm on your own. Finally, I provide a gist with my implementation. I would suggest that you understand the pseudocode before diving into the code, as that will help you better grasp the use case for each algorithm.


Let’s get started with Bubble Sort, shall we? The space complexity for this algorithm is O(1) and the average time complexity is O(n²). The pseudocode is as follows:

  • Start iterating through the array, comparing 2 elements at a time.

  • Swap them as required.

  • At the end of the first pass, the largest number has bubbled up to the last index of the array, so ignore the last index in the next pass.

  • Continue these passes until the array is sorted.

An illustration to understand how bubble sort works

The code for the implementation in JS is as follows:

Note that the second implementation is slightly optimized to handle an array that is almost sorted.
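Here is a minimal sketch of both versions, following the pseudocode above (the names bubbleSort and bubbleSortOptimized are just illustrative, not necessarily those used in the gist):

```javascript
// Basic bubble sort: repeatedly compare adjacent elements and swap them.
function bubbleSort(arr) {
  for (let i = arr.length - 1; i > 0; i--) {
    // After each pass, the largest remaining element has bubbled to index i.
    for (let j = 0; j < i; j++) {
      if (arr[j] > arr[j + 1]) {
        [arr[j], arr[j + 1]] = [arr[j + 1], arr[j]]; // swap
      }
    }
  }
  return arr;
}

// Optimized version: if a full pass makes no swaps, the array is already
// sorted, so we can stop early. This helps with nearly sorted input.
function bubbleSortOptimized(arr) {
  for (let i = arr.length - 1; i > 0; i--) {
    let noSwaps = true;
    for (let j = 0; j < i; j++) {
      if (arr[j] > arr[j + 1]) {
        [arr[j], arr[j + 1]] = [arr[j + 1], arr[j]];
        noSwaps = false;
      }
    }
    if (noSwaps) break;
  }
  return arr;
}

console.log(bubbleSortOptimized([5, 3, 8, 1, 2])); // [1, 2, 3, 5, 8]
```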


The next sorting algorithm with a time complexity of O(n²) is Insertion Sort; it also has a space complexity of O(1). It is most useful in a scenario where you are receiving a series of numbers in real time and need to keep them in a sorted array.

The main concept to understand in this technique is that there is a portion of the array that is always sorted and a portion that remains unsorted.

  • Start by comparing the 2nd element with the 1st element, swap if necessary.

  • Iterate through the rest of the array. For each element, iterate through the sorted portion of the array and, by making comparisons, insert the element where it belongs.

  • Keep doing this until all the elements have been inserted into their correct positions.

An illustration to understand how insertion sort works

The code for the same is as shown below.
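A minimal sketch following the steps above (the name insertionSort is illustrative):

```javascript
// Insertion sort: grow a sorted portion at the front of the array by
// inserting each new element into its correct position.
function insertionSort(arr) {
  for (let i = 1; i < arr.length; i++) {
    const current = arr[i];
    let j = i - 1;
    // Shift larger elements of the sorted portion one slot to the right.
    while (j >= 0 && arr[j] > current) {
      arr[j + 1] = arr[j];
      j--;
    }
    // Insert the current element into the gap that was opened up.
    arr[j + 1] = current;
  }
  return arr;
}

console.log(insertionSort([4, 1, 7, 3])); // [1, 3, 4, 7]
```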


Selection Sort is the last sorting algorithm with a time complexity of O(n²) included in this article. Its space complexity is the same as the previous two techniques, i.e., O(1). The pseudocode for this algorithm is as follows.

  • Assume that the first element is the smallest. (Or largest, if sorting in descending order).

  • Find the minimum value from the array and swap this with the first element of the array. This completes one pass, wherein the smallest element of the array is now at the 0th index.

  • Repeat this procedure for the rest of the array elements, but for the next pass do not compare the element we just placed at the 0th index.

This algorithm is not that useful in most situations, but it still helps a beginner grasp the concepts involved in implementing an algorithm to solve a problem.

An illustration to understand how selection sort works.
My implementation is as follows. Note that the sorted array is built up from the 0th index.
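A minimal sketch following the pseudocode above (the name selectionSort is illustrative); the swap at the end of each pass places the minimum at the front of the unsorted portion:

```javascript
// Selection sort: on each pass, find the smallest element in the unsorted
// portion and swap it into the next slot of the sorted portion.
function selectionSort(arr) {
  for (let i = 0; i < arr.length - 1; i++) {
    let minIndex = i; // assume the first unsorted element is the smallest
    for (let j = i + 1; j < arr.length; j++) {
      if (arr[j] < arr[minIndex]) {
        minIndex = j;
      }
    }
    if (minIndex !== i) {
      [arr[i], arr[minIndex]] = [arr[minIndex], arr[i]]; // swap into place
    }
  }
  return arr;
}

console.log(selectionSort([9, 2, 5, 1])); // [1, 2, 5, 9]
```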


You may have noticed that it is quite difficult to get a performant sorting algorithm using these techniques. Hence, in order to have an algorithm that is better than O(n²) in terms of time complexity, we have to use recursion.

The next 2 techniques can seem less intuitive at first go. So do read the pseudocode before you jump to the code, in order to make sense of the procedure followed!

Both of them have an average time complexity of O(n * log n). Their space complexities vary depending on the technique.


Let’s take a look at how merge sort is able to use recursion to implement an algorithm with a better time complexity.

The main concept here is that an array with size 0 or 1 is inherently sorted. This means that if we are able to split our array into smaller subarrays of size 0 or 1, and merge them correctly, we have sorted our array!

So there are two things that we need to do before we can implement merge sort. We need to find a way to divide an array into halves continuously, until we end up with arrays of size 0 or 1. Then, we merge them in a way that results in a larger (but still sorted) array.

The pseudocode to continuously divide an array, and end up with a bunch of arrays of size 0 or 1, is as follows.

  • We use recursion to do this. Use slice() to halve the array, and do this until the base case of arr.length ≤ 1 is reached.

Now, let’s tackle the issue of merging two sorted subarrays (of size ≤ 1 to begin with) such that we end up with a single sorted array.

  • Start by making an empty array.

  • Compare the first elements of the 2 subarrays, and push the smaller of the two to the new array.

  • Suppose the 1st element of the 1st array is smaller; push it to the new array. Then compare the 2nd element of the 1st array to the 1st element of the 2nd array, and so on.

  • If we have exhausted the elements in either of the 2 subarrays, just push the remaining elements of the other subarray onto the new array we had created.
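Following these steps, a merge helper might look something like this sketch (the name merge and the parameter names left and right are illustrative):

```javascript
// Merge two already-sorted arrays into one sorted array.
function merge(left, right) {
  const result = [];
  let i = 0;
  let j = 0;
  // Repeatedly push the smaller of the two front elements.
  while (i < left.length && j < right.length) {
    if (left[i] <= right[j]) {
      result.push(left[i]);
      i++;
    } else {
      result.push(right[j]);
      j++;
    }
  }
  // One of the arrays is exhausted; push whatever remains of the other.
  while (i < left.length) result.push(left[i++]);
  while (j < right.length) result.push(right[j++]);
  return result;
}
```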

See the image below to understand how this technique works.

Illustration for merge sort. Note: The middle element is colored red.

The code for the merge sort algorithm is as follows. Note the use of a helper function to implement the merging of 2 subarrays; it is also pretty evident that the space complexity for this algorithm is O(n).
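A minimal sketch, assuming the merge() helper sketched above:

```javascript
// Merge sort: split the array in half recursively, then merge the sorted halves.
function mergeSort(arr) {
  // Base case: arrays of length 0 or 1 are already sorted.
  if (arr.length <= 1) return arr;
  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid));
  const right = mergeSort(arr.slice(mid));
  return merge(left, right); // merge() is the helper sketched earlier
}

console.log(mergeSort([6, 2, 9, 1, 5])); // [1, 2, 5, 6, 9]
```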


Lastly, let us see how quick sort justifies its name and goes about sorting an array.

It works by choosing a pivot element, and making sure that all the elements to the left of the pivot are less than the pivot (not necessarily sorted, they just need to be less than the pivot) and that all the elements to the right of the pivot are greater than it.

The only 2 tasks we need to do in order to implement quick sort’s algorithm are to correctly identify the index for the pivot and to place the pivot element at that index. Initially, the pivot can be any element in the array; in this example I shall consider the 0th element to be the initial pivot.

The pseudocode to correctly return the index for the pivot element is as follows. Note that this is also called the partition function.

  • Choose a pivot and store its index in a variable, say pivotIndex. Loop through the array; if the current element is less than the pivot, increment pivotIndex and swap the current element with the element present at the new pivotIndex.

  • After one iteration through the array, swap the pivot with the element present at the pivotIndex.
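A sketch of such a partition helper, taking the first element of the (sub)array as the pivot (the name partition and the default parameters are illustrative):

```javascript
// Partition: place the pivot (first element) at its correct index and
// return that index. Elements smaller than the pivot end up to its left.
function partition(arr, left = 0, right = arr.length - 1) {
  const pivot = arr[left];
  let pivotIndex = left;
  for (let i = left + 1; i <= right; i++) {
    if (arr[i] < pivot) {
      pivotIndex++;
      [arr[i], arr[pivotIndex]] = [arr[pivotIndex], arr[i]]; // swap
    }
  }
  // Put the pivot into its final position.
  [arr[left], arr[pivotIndex]] = [arr[pivotIndex], arr[left]];
  return pivotIndex;
}
```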

Once we have a helper function to do the above task, we need to recursively place all the pivot elements in their correct positions. The pseudocode to do that is as follows.

  • Assume left indicates the start of a subarray, and right indicates the last index of the subarray.

  • Do the following only if the left pointer is at a lesser index than the right pointer:

    • Start by calling partition() on the entire array, defaulting the left and right pointers to the first and last indices of the array respectively.
    • Then store the return value in pivotIndex.
    • Use this to recursively call quickSort() on the same array, but from left up until (pivotIndex - 1), for the left part of the array.
    • For the right part of the array, call quickSort() again, with the same array, but from (pivotIndex + 1) up to right.
  • Once left is no longer less than right, the subarray has at most one element, so we return the array.

The video shows a visualization of the quick sort algorithm. The pivot elements are colored yellow.


The code for implementing quick sort in JavaScript is as follows. Note that the space complexity is O(log n).
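A minimal sketch, assuming the partition() helper sketched earlier; the space usage comes from the recursion depth, which is O(log n) on average:

```javascript
// Quick sort: partition the array around a pivot, then recursively sort
// the subarrays on either side of it.
function quickSort(arr, left = 0, right = arr.length - 1) {
  if (left < right) {
    const pivotIndex = partition(arr, left, right); // helper from earlier
    quickSort(arr, left, pivotIndex - 1);  // sort the left part
    quickSort(arr, pivotIndex + 1, right); // sort the right part
  }
  return arr;
}

console.log(quickSort([7, 3, 9, 1, 4])); // [1, 3, 4, 7, 9]
```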


Now that you know how to implement these 5 sorting algorithms, the next step is to understand which technique works best for the situation you find yourself in. To see some common use cases, you can check out this article.
