Shrutik

Shaping Data Landscapes Using Sorting Algorithms 🏞️

Intro:
A sorting algorithm is a systematic procedure or method for arranging elements in a list, array, or data structure in a specific order, typically in ascending or descending order based on some key criteria.
Sorting is a fundamental operation in computer science and data processing, as it allows data to be organized for efficient searching, retrieval, and analysis.
Sorting algorithms play a crucial role in various applications, including databases, information retrieval, data visualization, and more.

Factors to consider when choosing a sorting algorithm:
1. Time Complexity:
Time complexity measures how an algorithm's running time grows with the size of the input, and it is the main yardstick for an algorithm's effectiveness.
If an algorithm sorts 10 elements quickly but struggles once the input scales up to 1000 elements, its time complexity is considered poor.
In general, an O(N log N) time complexity is considered better than O(N²).

2. Space Complexity:
This criterion compares the amount of additional memory or storage an algorithm needs beyond the input itself. An algorithm that consumes a lot of extra space on larger inputs is a poor choice for sorting large data sets.

3. Stability:
A sorting algorithm is stable if it preserves the relative order of equal elements in the sorted output. Stability is crucial in certain applications, such as sorting by multiple criteria.
For example, take the five numbers 4, 2, 2, 3, 7. Sorting them in increasing order gives 2, 2, 3, 4, 7. The key question is whether the two 2s appear in the same relative order as in the input. If they do, the algorithm used is stable.
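
Since two equal integers look identical, stability is easier to observe with records. The following Python sketch uses made-up (name, score) pairs and Python's built-in sorted(), which is stable:

# Hypothetical records: (name, score). Two entries share the score 2.
items = [("a", 4), ("b", 2), ("c", 2), ("d", 3), ("e", 7)]

# sorted() in Python is stable: entries with equal keys keep their input order.
result = sorted(items, key=lambda item: item[1])

print(result)  # [('b', 2), ('c', 2), ('d', 3), ('a', 4), ('e', 7)]
# "b" still comes before "c", so the relative order of the equal keys is preserved.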

4. Adaptivity:
An adaptive sorting algorithm performs better when the input data is partially sorted. It adjusts its strategy to take advantage of existing order.

5. Recursiveness:
If the algorithm uses recursion to sort a data set, it is called a recursive algorithm; otherwise, it is non-recursive.

Different Sorting Algorithms:
1. Bubble Sort:

  • Imagine you have a row of numbers, and you start comparing the first two numbers.
  • If the first number is bigger than the second, you swap them.
  • Then, you move one position to the right and do the same comparison and swap if necessary.
  • You keep doing this until you reach the end of the row.
  • You repeat this process for the entire row until no more swaps are needed. It's like repeatedly bubbling up the largest number to the end of the row.

Pseudo code:

for i from 0 to (n-2):
    for j from 0 to (n-i-2):
        if list[j] > list[j+1]:
            swap(list[j], list[j+1])
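A minimal Python sketch of the same idea (any language will do); the swapped flag implements the "repeat until no more swaps are needed" step from the description above:

def bubble_sort(nums):
    n = len(nums)
    for i in range(n - 1):
        swapped = False
        # After each pass, the largest remaining element has bubbled to the end.
        for j in range(n - i - 1):
            if nums[j] > nums[j + 1]:
                nums[j], nums[j + 1] = nums[j + 1], nums[j]
                swapped = True
        if not swapped:  # no swaps means the list is already sorted
            break
    return nums

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]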

2. Insertion Sort:

  • Think of it like sorting a deck of cards.
  • You start with one card (the first card is already considered sorted).
  • Then, you pick up the next card and insert it into the correct position among the already sorted cards.
  • You repeat this process for all the cards, one by one.
  • It's like building a sorted hand of cards from an initially unsorted pile.

Pseudo code:

for i from 1 to (n-1):
    key = list[i]
    j = i - 1
    while j >= 0 and list[j] > key:
        list[j + 1] = list[j]
        j = j - 1
    list[j + 1] = key

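Here is the same logic as a small Python sketch:

def insertion_sort(nums):
    for i in range(1, len(nums)):
        key = nums[i]          # the "card" being placed
        j = i - 1
        # Shift larger elements one slot to the right to make room for key.
        while j >= 0 and nums[j] > key:
            nums[j + 1] = nums[j]
            j -= 1
        nums[j + 1] = key
    return nums

print(insertion_sort([7, 3, 5, 1]))  # [1, 3, 5, 7]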

3. Selection Sort:

  • Imagine you have a list of numbers.
  • You look through the list to find the smallest number and put it in the first position.
  • Then, you find the second smallest number in the remaining list and put it in the second position.
  • You keep doing this until you've sorted the entire list.
  • It's like repeatedly selecting the smallest number and putting it in its correct place.

Pseudo code:

for i from 0 to (n-1):
    min_index = i
    for j from (i+1) to (n-1):
        if list[j] < list[min_index]:
            min_index = j
    swap(list[i], list[min_index])

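A Python sketch of the same procedure:

def selection_sort(nums):
    n = len(nums)
    for i in range(n - 1):
        min_index = i
        # Find the smallest element in the unsorted tail nums[i:].
        for j in range(i + 1, n):
            if nums[j] < nums[min_index]:
                min_index = j
        nums[i], nums[min_index] = nums[min_index], nums[i]
    return nums

print(selection_sort([4, 2, 9, 1]))  # [1, 2, 4, 9]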

4. Merge Sort:

  • Picture dividing a big pile of papers into smaller piles.
  • You keep dividing until you have individual pieces.
  • Then, you start merging these pieces in a sorted manner.
  • You continue merging until you have one big sorted pile.
  • It's like breaking things down into manageable parts and then organizing them.

Pseudo code:

MergeSort(list[]):
    if length(list) <= 1:
        return list
    else:
        mid = floor(length(list) / 2)
        left = MergeSort(list[0 to mid-1])
        right = MergeSort(list[mid to end])
        return Merge(left, right)

Merge(left[], right[]):
    result = []
    while left is not empty and right is not empty:
        if left[0] <= right[0]:
            append left[0] to result
            remove left[0] from left
        else:
            append right[0] to result
            remove right[0] from right
    append remaining elements of left and right to result
    return result
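The same divide-and-merge idea as a Python sketch:

def merge_sort(nums):
    if len(nums) <= 1:
        return nums
    mid = len(nums) // 2
    left = merge_sort(nums[:mid])
    right = merge_sort(nums[mid:])
    return merge(left, right)

def merge(left, right):
    result = []
    i = j = 0
    # Repeatedly take the smaller front element of the two sorted halves.
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    # One of the halves may still have elements left; append them.
    result.extend(left[i:])
    result.extend(right[j:])
    return result

print(merge_sort([5, 2, 8, 1, 9, 3]))  # [1, 2, 3, 5, 8, 9]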

5. Quick Sort:

  • Think of it as organizing a group of people by height.
  • You pick someone as a "pivot" (say, the last person in the line, as in the pseudo code below).
  • You then divide the group into two: people taller than the pivot and people shorter than the pivot.
  • You repeat this process for the two smaller groups.
  • Eventually, everyone is in their right order.
  • It's like repeatedly dividing and conquering until everyone is where they belong.

Pseudo code:

QuickSort(list[], low, high):
    if low < high:
        pivot_index = Partition(list, low, high)
        QuickSort(list, low, pivot_index - 1)
        QuickSort(list, pivot_index + 1, high)

Partition(list[], low, high):
    pivot = list[high]
    i = low - 1
    for j from low to high - 1:
        if list[j] <= pivot:
            i = i + 1
            swap list[i] and list[j]
    swap list[i + 1] and list[high]
    return i + 1
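A Python sketch of the same scheme, using the last element as the pivot just like the pseudo code:

def quick_sort(nums, low=0, high=None):
    if high is None:
        high = len(nums) - 1
    if low < high:
        pivot_index = partition(nums, low, high)
        quick_sort(nums, low, pivot_index - 1)   # sort the "shorter" group
        quick_sort(nums, pivot_index + 1, high)  # sort the "taller" group
    return nums

def partition(nums, low, high):
    pivot = nums[high]  # last element as pivot
    i = low - 1
    for j in range(low, high):
        if nums[j] <= pivot:
            i += 1
            nums[i], nums[j] = nums[j], nums[i]
    nums[i + 1], nums[high] = nums[high], nums[i + 1]  # place pivot in its final spot
    return i + 1

print(quick_sort([9, 4, 7, 1, 3]))  # [1, 3, 4, 7, 9]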

6. Heap Sort:

  • Imagine a bunch of numbers as balls in a row.
  • You arrange them into a special structure called a "heap" (like a neatly stacked pile).
  • Then, you repeatedly take the top ball (the largest) off the heap and put it in the sorted list.
  • You keep doing this until the heap is empty.
  • It's like always picking the biggest thing and adding it to your sorted collection.

Pseudo code:

HeapSort(list[]):
    BuildMaxHeap(list)
    for i from (length(list) - 1) down to 1:
        swap list[0] and list[i]
        Heapify(list, 0, i - 1)

BuildMaxHeap(list[]):
    n = length(list)
    for i from (n/2 - 1) down to 0:
        Heapify(list, i, n - 1)

Heapify(list[], index, heap_size):
    largest = index
    left_child = 2 * index + 1
    right_child = 2 * index + 2
    if left_child <= heap_size and list[left_child] > list[largest]:
        largest = left_child
    if right_child <= heap_size and list[right_child] > list[largest]:
        largest = right_child
    if largest != index:
        swap list[index] and list[largest]
        Heapify(list, largest, heap_size)
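A Python sketch of the same steps; here the third argument is the index of the last element still inside the heap:

def heap_sort(nums):
    n = len(nums)
    # Build a max-heap: start from the last parent node and sift down.
    for i in range(n // 2 - 1, -1, -1):
        heapify(nums, i, n - 1)
    # Repeatedly move the current maximum (root) to the end of the list.
    for end in range(n - 1, 0, -1):
        nums[0], nums[end] = nums[end], nums[0]
        heapify(nums, 0, end - 1)
    return nums

def heapify(nums, index, last_index):
    # Sift nums[index] down until the max-heap property holds again.
    largest = index
    left, right = 2 * index + 1, 2 * index + 2
    if left <= last_index and nums[left] > nums[largest]:
        largest = left
    if right <= last_index and nums[right] > nums[largest]:
        largest = right
    if largest != index:
        nums[index], nums[largest] = nums[largest], nums[index]
        heapify(nums, largest, last_index)

print(heap_sort([6, 2, 8, 1, 5]))  # [1, 2, 5, 6, 8]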

That was all for today's issue. Be sure to try implementing each algorithm in your preferred language.
I have kept things simple for you by providing the pseudo code for each.
Things will be taking a wild turn from here on, so practice plenty of problems on these concepts to keep up.
After this series is complete, a new problem-solving series will be released.
Be sure to check that out in the future.

I hope this post was informative and helpful.
If you have any questions, please feel free to leave a comment below.

Happy Coding πŸ‘πŸ»!
Thank You
