The first three elementary algorithms that I recently wrote about are not good for large data sets. They are good to know for certain scenarios, but for, let's say, an array of 100,000 values they fall apart: bubble sort does on the order of n² ≈ 10,000,000,000 comparisons, while merge sort does about n log₂ n ≈ 1,700,000, roughly a few thousand times fewer (feel free to create an array with 100,000 values, run it through bubble sort, and compare it to this one when it's done; there's a quick timing sketch after the list below). The idea behind this particular sort is:
- splitting up
- merging
- sorting
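If you want to try that timing experiment yourself, here is a minimal sketch. It assumes a `bubbleSort` from the earlier posts and the `mergeSort` we build later in this post; both names are placeholders for whatever your implementations are called:

```js
// Rough timing comparison: bubble sort vs. merge sort on 100,000 random values.
// Assumes bubbleSort(arr) and mergeSort(arr) are defined elsewhere
// (e.g., from the earlier elementary-sorts post and this post).
const big = Array.from({ length: 100000 }, () => Math.floor(Math.random() * 1000000));

console.time("bubble sort");
bubbleSort([...big]); // copy so each sort gets the same unsorted input
console.timeEnd("bubble sort");

console.time("merge sort");
mergeSort([...big]);
console.timeEnd("merge sort");
```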
Merge sort exploits the fact that arrays with 0 or 1 elements are always sorted. It works by a divide and conquer approach: we break our array up into smaller arrays, continue until we have arrays of 0 or 1 elements, and then merge them back together. You probably guessed it, but this can be done either iteratively or recursively. The recursive solution is definitely the "better" route to go, however if you do not fully understand recursion (I may or may not lol) then iterative is probably better. Here is a photo that will more or less show you how it works:
So as you can see, we continue to split the arrays in halves until they are single elements, and then once down to those single elements we compare them and merge them back up into larger sorted arrays, then merge two of those arrays (comparing their first elements, since they are sorted) and so on and so forth.
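To make that concrete, here is a small trace on a made-up four-element array:

```
[8, 3, 5, 2]
split:  [8, 3]       [5, 2]
split:  [8]  [3]     [5]  [2]
merge:  [3, 8]       [2, 5]
merge:  [2, 3, 5, 8]
```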
An easy way to start the implementation of merge sort is to create a helper function that takes in two sorted arrays and merges them. Remember, these arrays can have a length of 1 or 2 or 3, as long as they are sorted. This helper will be used over and over again until finally a fully sorted array is returned. In this function we'll need a counter for each array, an array to push values into, and a while loop to keep going until all values have been accounted for. After this, all we'll have to do is break the initial input array into halves, and then halves again, until the pieces are single elements. This halving pattern is divide and conquer, and you'll run into it again, most famously in binary search: if we want to find a value in a sorted array that is one million elements long, we cut the search space in half each time and eventually land on the value we are looking for. That is far faster than iterating over every value and checking whether it is the one we want.
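Here is a minimal sketch of that merge helper. The names (`merge`, `result`, `i`, `j`) are just my choices, not anything official:

```js
// Merges two already-sorted arrays into one sorted array.
function merge(left, right) {
  const result = [];
  let i = 0; // counter for left
  let j = 0; // counter for right
  // While both arrays still have values, push the smaller front value.
  while (i < left.length && j < right.length) {
    if (left[i] <= right[j]) {
      result.push(left[i]);
      i++;
    } else {
      result.push(right[j]);
      j++;
    }
  }
  // One array is exhausted; the other's leftovers are already sorted.
  while (i < left.length) result.push(left[i++]);
  while (j < right.length) result.push(right[j++]);
  return result;
}

// merge([1, 4, 9], [2, 3, 10]) -> [1, 2, 3, 4, 9, 10]
```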
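And since the halving idea above is basically binary search, here is a quick sketch of that too, on a sorted array (again, the names are mine):

```js
// Binary search on a sorted array: halve the search space each pass.
// Returns the index of target, or -1 if it is not present.
function binarySearch(sorted, target) {
  let lo = 0;
  let hi = sorted.length - 1;
  while (lo <= hi) {
    const mid = Math.floor((lo + hi) / 2);
    if (sorted[mid] === target) return mid;
    if (sorted[mid] < target) lo = mid + 1; // target is in the right half
    else hi = mid - 1;                      // target is in the left half
  }
  return -1;
}
```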
To break the initial array into halves we'll use recursion. The plan is to split the array in half using slice, which returns a copy and so does not modify the original array. Once we have the "left" side, we'll continue to call our function recursively until we hit our base case, which in our case is a length of one or less. When we reach those base-case arrays, we'll call our helper function to merge the two sorted arrays back together; a sketch of this is just below. I also added (and please let it be known I just found this solution online) a fully iterative solution, in case recursion does not make sense to you.
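Here is a minimal sketch of that recursive split, reusing the `merge` helper sketched above:

```js
// Recursively split the array in half, then merge the sorted halves.
// Uses the merge() helper from earlier.
function mergeSort(arr) {
  // Base case: arrays of length 0 or 1 are already sorted.
  if (arr.length <= 1) return arr;
  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid)); // slice copies, so the original is untouched
  const right = mergeSort(arr.slice(mid));
  return merge(left, right);
}

// mergeSort([8, 3, 5, 2]) -> [2, 3, 5, 8]
```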
The easiest way to visualize the recursion process, in my opinion, is to either write it down or, so you can truly see what's going on, use the browser's developer tools: in the Sources tab you can add your own code snippets, write your JS there (or copy/paste mine), add breakpoints, and watch the actual call stack build up and unwind. Listen, I am not an expert in recursion, and I'm trying to add it to my code where I can daily, but you'll need a fundamental understanding of it before you can move on to more advanced topics such as tree traversals. The final thing I'll say about this is how efficient this particular algorithm is. The time complexity is O(n log n): we split the array about log n times, and each level of splitting does O(n) work in the merge step. That is as good as a general comparison sort can get, and a huge improvement over the O(n²) elementary sorts. The space complexity is O(n), because the slice and merge steps do create new arrays proportional to the input size (we are not sorting in place). So congrats on your first efficient / intermediate sorting algorithm.