In this article, we discuss some of the most important sorting algorithms in Data Structures & Algorithms, explaining each one with pseudocode.
#Sorting Algorithms: Types & Uses
Sorting is a way of arranging data in ascending or descending order. While there are many sorting algorithms in general, this article focuses on eight: bubble sort, selection sort, insertion sort, merge sort, quick sort, shell sort, heap sort, and comb sort.
Also, here are some terms you need to be familiar with before you go ahead and learn the sorting algorithms.
- Time Complexity: the amount of time an algorithm takes to run, expressed as a function of the size of its input. In other words, time complexity measures efficiency: how long the algorithm takes to process an input of a given size.
- Space Complexity: the amount of memory an algorithm uses (including its input) to execute and produce its result.
Now, let's explore the algorithms one by one.
#Sorting Algorithm 1: Bubble Sort
Bubble sort is the simplest of these algorithms. It compares each pair of adjacent elements in the array and swaps them if the first is greater than the second; otherwise it simply moves on to the next pair. Each pass "bubbles" the largest remaining element to the end of the array, and the passes repeat until the whole array is sorted.
Bubble Sort Pseudo Code:
for i = 0 to n - 1 // to keep track of number of cycles
for j = 0 to n - i - 1 // to compare the elements within the particular cycle
// swap if one element is greater than its adjacent element
if arr[j] > arr[j + 1] then
swap(arr[j], arr[j + 1])
#Sorting Algorithm 2: Selection Sort
In the selection sort algorithm, the array is divided into two sections: a sorted section on the left and an unsorted section on the right. On each iteration, it finds the smallest element in the unsorted section and appends it to the sorted section.
Selection Sort Pseudo Code:
for i = 0 to n - 1
// Find the minimum element in the unsorted part
min = i
for j = i + 1 to n - 1
if arr[j] < arr[min] then
min = j
end if
end for
// Swap the found minimum element with the first unsorted element
swap(arr[min], arr[i])
end for
#Sorting Algorithm 3: Insertion Sort
The insertion sort algorithm orders elements in ascending order. On every pass, it takes one element from the unsorted part of the array and inserts it into its correct position in the sorted part, until the whole array is sorted.
Insertion Sort PseudoCode:
for i = 1 to n - 1
key = arr[i]
j = i - 1
// shift sorted elements greater than key one position to the right
while j >= 0 and arr[j] > key
arr[j + 1] = arr[j]
j = j - 1
// insert key into the gap this opened up
arr[j + 1] = key
#Sorting Algorithm 4: Merge Sort
Merge sort follows the Divide & Conquer paradigm. Given an array of n elements, the unsorted list is first divided into n sublists of one element each; a one-element list is trivially sorted. Adjacent sublists are then merged in sorted order, and the merging repeats until a single sorted array remains.
Pseudocode of Merge Sort:
procedure merge(array, start, mid, end)
num1 = mid - start + 1
num2 = end - mid
// Copy data to temporary arrays arr1 and arr2
for i = 0 to num1 - 1
arr1[i] = arr[start + i]
for j = 0 to num2 - 1
arr2[j] = arr[mid + 1 + j]
// Merge the temp arrays back into arr
i = 0 // Initial index of first subarray
j = 0 // Initial index of second subarray
k = start // Initial index of merged subarray
while (i < num1 and j < num2)
if (arr1[i] <= arr2[j]) then
arr[k] = arr1[i]
i = i + 1
else
arr[k] = arr2[j]
j = j + 1
end if
k = k + 1
// Copy the remaining elements of arr1, if there are any
while (i < num1)
arr[k] = arr1[i]
i = i + 1
k = k + 1
// Copy the remaining elements of arr2, if there are any
while (j < num2)
arr[k] = arr2[j]
j = j + 1
k = k + 1
procedure divide(array, start, end)
if(start < end) then
mid = (start + end) / 2
divide(arr, start, mid)
divide(arr, mid + 1, end)
merge(arr, start, mid, end)
#Sorting Algorithm 5: Quick Sort
Quicksort also follows the Divide and Conquer paradigm. It picks an element as the pivot and partitions the array around it. Different versions of quicksort pick the pivot in different ways:
- Always pick the first element as pivot.
- Always pick the last element as pivot.
- Pick a random element as pivot.
- Pick the median as pivot.
Pseudocode of quicksort:
procedure partition(array, low, high)
pivot = arr[high] // assuming last element of the array as the pivot element
i = (low - 1) // assuming the index of i pointer as one position less than the first element
for j = low to high - 1 // assuming the index of j pointer as the first position
// If current element is smaller than or equal to pivot
if (arr[j] <= pivot) then
i++ // increment index of i pointer and swap the elements at index i and j
temp = arr[i]
arr[i] = arr[j]
arr[j] = temp
// swapping the pivot (last) element and element at i + 1 index
temp = arr[i + 1]
arr[i + 1] = arr[high]
arr[high] = temp
// returning the index of pivot element having lesser elements to the left and greater elements to the right
return (i + 1)
procedure quick_sort(array, low, high)
if (low < high)
// partitioning the single array into two sub-arrays
pi = partition(arr, low, high)
// sorting the sub-arrays
quick_sort(arr, low, pi - 1)
quick_sort(arr, pi + 1, high)
#Sorting Algorithm 6: Shell Sort
Shell sort is also called Shellsort or Shell's method, and is sometimes known as the "diminishing increment sort".
Shell sort is an improvement on insertion sort. It starts by comparing and swapping pairs of elements far apart from each other, then progressively reduces the gap between the elements being compared. This is faster than neighbour-exchange algorithms such as bubble sort because the early long-distance swaps move elements close to their final positions quickly.
Pseudo Code of shell sort:
for gap = n / 2; gap > 0; gap = gap / 2
// Do a gapped insertion sort
// The first gap elements arr[0..gap-1] are already in gapped order
// keep adding one more element until the entire array is gap sorted
for i = gap to n - 1
// add arr[i] to the elements that have been gap sorted
// save arr[i] in temp and make a empty space at index i
temp = arr[i]
// shift earlier gap-sorted elements up until the correct location for arr[i] is found
for j = i; j >= gap and arr[j - gap] > temp; j = j - gap
arr[j] = arr[j - gap]
// put temp (the original arr[i]) in its correct position
arr[j] = temp
#Sorting Algorithm 7: Heap Sort
Heap sort is an improved version of straight selection sort: the array is viewed as a binary max-heap, the largest element (the root) is repeatedly selected and exchanged with the last element of the unsorted part, and the heap property is restored after each exchange.
#Sorting Algorithm 8: Comb Sort
Comb sort is an improved version of bubble sort. The basic idea is to eliminate "turtles", i.e. small values near the end of the list, which slow plain bubble sort down badly; comb sort does this by comparing elements that are a large gap apart and shrinking the gap on each pass.
Pseudo Code of Comb Sort:
// Shrink the gap by a factor of roughly 1.3
procedure NextGap(gap)
gap = (gap * 10) / 13
if (gap < 1) then
gap = 1
return gap

// Start with the gap equal to the array size
gap = n
// Initialize the variable swapped as true to make sure the while loop runs
swapped = true
// Keep running the loop while gap is more than 1 and last iteration causes a swap
while (gap != 1 or swapped == true)
// Find next gap
gap = NextGap(gap)
// Initialize swapped as false so that we can check if swap happened or not
swapped = false
// Compare all elements that are at the distance of current gap value
for i = 0 to n - gap
if (arr[i] > arr[i + gap]) then
swap(arr[i], arr[i + gap])
swapped = true;