When working with a large input dataset, you may need to adopt a different algorithm. Imagine you need to search a large list of numbers, say 1 million entries. If you iterate through the entire list one index at a time with a conventional loop until you find the target element, the worst case requires 1 million comparisons (linear time complexity, O(n)).
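To make the comparison concrete, a minimal linear search in JavaScript could look like this (the function name and the choice of returning false are just for illustration, mirroring the binary search example further down):

function linearSearch(value, list) {
  for (let i = 0; i < list.length; i++) { //check every element, one index at a time
    if (list[i] === value) {
      return i; //found it: return the index
    }
  }
  return false; //checked the entire list without finding the value
}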
Now, by adopting binary search, you won't need more than about 20 comparisons in the worst case (logarithmic time complexity, O(log n)), since log2(1,000,000) ≈ 20. And this number barely grows even if your input data increases significantly: doubling the list to 2 million numbers adds only one more comparison.
However, this is only possible with ordered lists.
If you want to know more about this topic, I found a good resource on W3Schools: https://www.w3schools.com/dsa/dsa_algo_binarysearch.php.
I've also included a simple example of a binary search implementation below.
function binarySearch(value, list) {
  let left = 0; //start of the list
  let right = list.length - 1; //end of the list

  while (left <= right) {
    const middle = Math.floor((left + right) / 2); //index in the middle of the current range
    const potentialValue = list[middle]; //candidate value at the middle

    if (value === potentialValue) { //is the candidate the value you are looking for?
      return middle; //if so, return its index in the list
    } else if (value < potentialValue) { //is the value smaller than the candidate?
      right = middle - 1; //if so, it's to the left, so ignore everything to the right of the middle
    } else {
      left = middle + 1; //if not, it's to the right, so ignore everything to the left of the middle
    }
    //the loop repeats, halving the search range each time, until the value is found
  }

  //the loop ended without finding the value, so it's not in the list
  return false;
}
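As a quick sanity check, here is how you might call it; the sample array is just an assumption for illustration, and note that it must already be sorted:

const numbers = [2, 5, 8, 12, 16, 23, 38, 56, 72, 91]; //must already be sorted
console.log(binarySearch(23, numbers)); //5 (the index of 23)
console.log(binarySearch(7, numbers)); //false (7 is not in the list)

If your data isn't sorted yet, you would need to sort it first, for example with numbers.sort((a, b) => a - b), before binary search can be used.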