Christmas 2021 - my favourite gift was the book **Grokking Algorithms** by Aditya Y. Bhargava. This book is perfect for somebody like me who has never formally studied computer science but has developed a deep interest in the subject.

Over the festive period I worked through the chapters and the code examples, making the small changes required to get them to run in Python 3 (the book examples are written in Python 2), and then converting them to JavaScript.

Below is my interpretation of some of the algorithms that the book focuses on, namely:

- Binary search
- Selection sort
- Quicksort

In later parts I will cover:

- Breadth-first search
- Dijkstra's algorithm
- Solving the Knapsack Problem with dynamic programming

## BINARY SEARCH

Imagine you have a sorted array and you are searching for a specific element that may, or may not, be in that array. How would you approach the search?

One way would be to start at array `index 0` and work your way through each element until you find what you are looking for. If your target element is the last one in the array, or it isn't in the array at all, you will need to access every element. That's the worst case scenario and it's customary to compare algorithm efficiency based on the worst case.

### Binary search - steps

Since the array is sorted you could use a binary search algorithm. Imagine you have a sorted array of 512 elements. Binary search works like this:

Your first step is to look at the middle element (index 256) to see if it is the element you are looking for. If it is, happy days! Chances are though that it won't be, in which case you ask yourself:

"Is this middle element higher or lower than my target element?"

If `array[256]` is higher, you know that your target element must be in the lower half, so you have immediately discarded half of the array.

Next, look at the middle element from those that remain and go through the same steps. Again you have eliminated half of the remaining elements.

Keep doing that until you either find your target element or discover that it's not in the array. Worst case scenario is that your target isn't in the array, or it's the very last element. But how many steps would it take you to find the solution in that worst case scenario?

Well, in an array of 512 elements the answer is **log₂ 512**. In other words, to what power do you have to raise the number 2 to get 512?

Answer: **9 steps.**
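The halving process can be sketched in a few lines of JavaScript. The `stepsToSearch` helper below is my own illustration, not from the book: it simply counts how many times an array of `n` elements can be halved before only one element remains.

```js
// Count how many halvings it takes to reduce n elements to 1 -
// this is the worst-case number of binary search steps.
function stepsToSearch(n) {
  let steps = 0;
  while (n > 1) {
    n = Math.floor(n / 2);
    steps++;
  }
  return steps;
}

console.log(stepsToSearch(512)); // 9
console.log(Math.log2(512));     // 9 - the same answer, directly
```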

### Comparison with simple search

Using the first method (known as simple search) on an array of 512 elements would take 512 steps (remember, we're looking at the worst case here). The 9 steps taken by binary search is clearly significantly quicker. And the difference is magnified with larger data sets.

Imagine that you need to search an array of 1 billion elements and your super fast computer can process 1000 elements per second. Binary search would deliver an answer in 30 milliseconds (2³⁰ ≈ 1.073 billion) while simple search would take more than 11 days.
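A quick back-of-the-envelope check of those figures, assuming the same 1000 checks per second:

```js
const n = 1e9;              // 1 billion elements
const perSecond = 1000;     // checks per second

// Simple search, worst case: one check per element.
console.log(n / perSecond / 86400);   // ≈ 11.57 days

// Binary search, worst case: one check per halving.
console.log(Math.ceil(Math.log2(n))); // 30 steps, i.e. 30 ms at 1000/sec
```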

Below is my JavaScript version of binary search.

```
function binarySearch(arr, target) {
  let low = 0;
  let high = arr.length - 1;

  while (low <= high) {
    const mid = Math.floor((low + high) / 2);
    const guess = arr[mid];
    if (guess === target) {
      return mid;
    }
    if (guess > target) {
      high = mid - 1;
    } else {
      low = mid + 1;
    }
  }
  return null;
}

const myList = [1, 3, 5, 7, 9, 11, 13, 15];
console.log(binarySearch(myList, 5)); // 2
console.log(binarySearch(myList, 12)); // null
```

## SELECTION SORT

The first algorithm that we looked at, binary search, only works on a sorted array. Selection sort is one method that you can use to get an array into a sorted state and it works as follows:

### Selection sort - steps

1. Loop through your unsorted array.
2. Find the lowest value element.
3. Extract that element and place it in a new array at index `0`.
4. Loop through the remaining elements of the unsorted array.
5. Find the lowest value element.
6. Extract that element and add it to the end of the new array.
7. Repeat until the original, unsorted array is empty, by which time the new array is a sorted array of the same elements.

Below is my JavaScript version of selection sort. The Python code in the book makes use of a for loop in the main selection_sort() function whose initial length is determined by the length of the original, unsorted array. I preferred to use a while loop to avoid the risk of referencing an out-of-range array index with the original array shrinking on each iteration.

```
function findSmallest(arr) {
  let smallest = arr[0];
  let smallestIndex = 0;
  arr.forEach((el, index) => {
    if (el < smallest) {
      smallest = el;
      smallestIndex = index;
    }
  });
  return smallestIndex;
}

function selectionSort(arr) {
  const newArr = [];
  while (arr.length > 0) {
    const smallest = findSmallest(arr);
    newArr.push(arr.splice(smallest, 1)[0]);
  }
  return newArr;
}

console.log(selectionSort([5, 3, 6, 2, 10])); // [ 2, 3, 5, 6, 10 ]
console.log(selectionSort(['grape', 'apple', 'banana', 'kiwi'])); // [ 'apple', 'banana', 'grape', 'kiwi' ]
```

### Efficiency

It's worth mentioning here that selection sort is a slow algorithm. For an unsorted array of `n` items, that array has to be looped through `n` times. It therefore takes roughly n² operations.

But, hang on a minute, n reduces by 1 on each iteration so it's not quite n²; surely it's more like (n/2) × n operations.

That's true, but in the world of algorithm performance measurement, constants (like the 1/2 in the previous sentence) are ignored, so selection sort has an efficiency of **n²**.
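You can see those numbers concretely by counting comparisons. The instrumented sketch below is my own illustration, not from the book: it mirrors the selection sort above but tallies every element comparison instead of building the sorted array.

```js
// Count the element comparisons selection sort makes on an array
// of n items: (n-1) + (n-2) + ... + 1 = n * (n - 1) / 2.
function countComparisons(arr) {
  const copy = [...arr];
  let comparisons = 0;
  while (copy.length > 0) {
    let smallestIndex = 0;
    for (let i = 1; i < copy.length; i++) {
      comparisons++; // one comparison per remaining element
      if (copy[i] < copy[smallestIndex]) smallestIndex = i;
    }
    copy.splice(smallestIndex, 1);
  }
  return comparisons;
}

console.log(countComparisons([...Array(10).keys()])); // 45, i.e. 10 * 9 / 2
```

For n = 10 that's 45 comparisons: about half of n², which is exactly the constant that Big O notation throws away.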

## QUICKSORT

As its name suggests, *quicksort* is somewhat quicker than selection sort. It's what's known as a divide and conquer algorithm and uses a technique similar to that used in binary search in that it breaks down the problem into smaller and smaller chunks.

It also relies on recursion, a subject that I won't go into in any depth here other than to say that it is a technique that relies on a function being able to call itself repeatedly until what's known as the "base case" is reached, at which point the function returns its result.

Recursion also relies on the inner workings of the call stack. Until the base case is reached, every call to the function is incomplete and is held *in limbo* in the call stack. When the base case is reached, and the function finally returns its result, the results of each preceding function call can then be passed down as each completed function is popped off the call stack and the final result is output from the initial call to the recursive function.

It's vitally important to include a valid base case in a recursive function, otherwise the function will continue calling itself forever, or at least until the call stack overflows.
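A minimal example of all this is a recursive factorial function. Each call waits *in limbo* on the call stack until the base case returns, at which point the multiplications unwind:

```js
// Recursive factorial: the base case (n === 1) stops the recursion.
function factorial(n) {
  if (n === 1) {
    return 1; // base case reached - the stack can start unwinding
  }
  return n * factorial(n - 1); // recursive case, held on the call stack
}

console.log(factorial(5)); // 120
```

Without that `n === 1` check, `factorial(5)` would keep calling itself with ever-smaller numbers until the call stack overflowed.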

That's probably a rather confusing explanation of the workings of recursion. If you want to understand it more fully I recommend getting your own copy of Grokking Algorithms. Aditya Bhargava does a wonderful job of explaining it with lots of hand-drawn illustrations.

I can also recommend this talk by Al Sweigart on the subject:

https://www.youtube.com/watch?v=fhDsjfLSmVk

### Quicksort steps

Quicksort works by selecting an array element at random. This becomes the *"pivot"*. Remaining elements are compared against the pivot and divided into "less than" and "greater than" arrays.

Each of the less and greater arrays is run through the same process, and so on and so on until the base case is reached (ie. the array is only one element long so cannot be sorted) at which point all of the recursive function calls can return and everything is put back together at the end in sorted order.

Below is my JavaScript take on quicksort based on the Python version in the book. The Python version is very succinct. It makes use of list comprehensions, a very neat technique, and Python's ability simply to add lists together.

I used JavaScript's filter function in place of Python's list comprehensions and the array spread operator to facilitate the adding together of all of the elements in the recursive return statement.

```
function quicksort(arr) {
  if (arr.length < 2) {
    return arr;
  } else {
    const pivotIndex = Math.floor(Math.random() * arr.length);
    const pivot = arr[pivotIndex];
    const reduced = [...arr.slice(0, pivotIndex), ...arr.slice(pivotIndex + 1)];
    const less = reduced.filter(v => v <= pivot);
    const greater = reduced.filter(v => v > pivot);
    return [...quicksort(less), pivot, ...quicksort(greater)];
  }
}

console.log(quicksort([10, 5, 2, 3])); // [ 2, 3, 5, 10 ]
```

Any element can be the pivot in quicksort but choosing an element at random will yield the greatest time efficiency in the average case, namely: **n log n**. (In algorithm efficiency terms, "log" is assumed always to refer to log₂ and it's customary simply to omit the 2.)

## Summary

This article introduced the concept of algorithms by looking at the simpler examples. Not all algorithms are created equally efficient and the idea of time efficiency was introduced.

The subject of recursion also featured. Recursion is a technique often used in algorithms which is notoriously difficult for beginners to wrap their head around.

Part 2 of this series will look at graphs and breadth-first search.

Cover image by Clem Onojeghuo on Unsplash
