A Brief Summary of Sorting Algorithms, Part 1
- Basics:
- in-place sorting vs. out-of-place sorting
- internal sorting vs. external sorting
- stable vs. unstable sorting
- Bubble Sort
- Selection Sort
- Insertion Sort
- Merge Sort
- Quick Sort
- Heap Sort
Internal vs. external sorting
Internal sorting and external sorting describe where the sorting takes place:
- internal sorting happens entirely in main memory
- external sorting uses hard disks or other external storage for data that does not fit in memory
Stable vs. unstable sorting
A sorting algorithm is said to be stable if two objects with equal keys appear in the same order in sorted output as they appear in the input array to be sorted.
- stable sorting algorithms include:
- Bubble Sort
- Insertion Sort
- Merge Sort
- Counting Sort
Bubble Sort
Bubble Sort is a stable sorting algorithm. The algorithm repeatedly compares adjacent elements and swaps them if the left one is larger than the right one. Time complexity of bubble sort is O(n*n).
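The description above can be sketched in Python; this is a minimal implementation (the `bubbleSort` name follows the placeholder in the original listing), not the post's original code:

```python
def bubbleSort(arr):
    n = len(arr)
    for i in range(n - 1):
        # After each pass, the largest remaining element bubbles to the end.
        for j in range(n - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
    return arr
```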
Selection Sort
In every iteration of selection sort, the minimum element (assuming ascending order) in the unsorted subarray is picked and moved to the end of the sorted subarray. The usual in-place implementation is not stable, though selection sort can be made stable. Time complexity of selection sort is O(n*n).
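A minimal sketch of the in-place version described above (the `selectionSort` name follows the placeholder in the original listing):

```python
def selectionSort(arr):
    n = len(arr)
    for i in range(n - 1):
        # Find the minimum element in the unsorted subarray arr[i:].
        min_idx = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_idx]:
                min_idx = j
        # Move it to the end of the sorted subarray arr[:i].
        arr[i], arr[min_idx] = arr[min_idx], arr[i]
    return arr
```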
Insertion Sort
Insertion sort is stable. In every iteration of insertion sort, the first element of the unsorted half is taken and inserted into its correct position in the sorted half of the array. Time complexity of insertion sort is O(n*n).
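A minimal sketch of the idea above (the `insertionSort` name follows the placeholder in the original listing); shifting with `>` rather than `>=` is what keeps equal keys in their original order:

```python
def insertionSort(arr):
    for i in range(1, len(arr)):
        key = arr[i]
        # Shift larger elements of the sorted half one slot to the right.
        j = i - 1
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        # Insert the key into its correct position.
        arr[j + 1] = key
    return arr
```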
Merge Sort
Merge sort is stable. Merge sort is a divide-and-conquer algorithm: it divides the input array into two halves, calls itself on each half, and then merges the two sorted halves. Time complexity is O(n*log(n)).
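A minimal sketch of the divide-and-conquer scheme above (the `mergeSort` name follows the placeholder in the original listing); this version returns a new sorted list rather than sorting in place, and taking from the left half on ties (`<=`) preserves stability:

```python
def mergeSort(arr):
    if len(arr) <= 1:
        return arr
    # Divide: sort each half recursively.
    mid = len(arr) // 2
    left = mergeSort(arr[:mid])
    right = mergeSort(arr[mid:])
    # Conquer: merge the two sorted halves.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```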
Quick Sort
For quick sort, we pick an element as the pivot and partition the array around it, so that elements smaller than the pivot end up before it and elements larger than the pivot end up after it. Quick sort then divides and conquers the two halves recursively. Average time complexity of quick sort is O(n*log(n)); the worst case is O(n*n). The standard in-place version is not stable, although quick sort can be made stable with extra space.
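A sketch of the scheme above using the common Lomuto partition (the `partition` signature follows the placeholder in the original listing). Note two assumptions here: this version uses the last element as the pivot rather than a random one, and the `quickSort` driver function is added for completeness:

```python
def partition(arr, low, high):
    pivot = arr[high]  # last element as pivot (a random pivot also works)
    i = low - 1
    for j in range(low, high):
        # Grow the "smaller than pivot" region on the left.
        if arr[j] <= pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    # Place the pivot between the two regions.
    arr[i + 1], arr[high] = arr[high], arr[i + 1]
    return i + 1

def quickSort(arr, low, high):
    if low < high:
        p = partition(arr, low, high)
        quickSort(arr, low, p - 1)   # sort the smaller half
        quickSort(arr, p + 1, high)  # sort the larger half
    return arr
```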
Heap Sort
- Build a max heap from the input data.
- At this point, the largest item is stored at the root of the heap. Swap it with the last item of the heap and reduce the heap size by 1. Finally, heapify the root of the tree.
- Repeat the above steps while the size of the heap is greater than 1.
Heapify: this procedure recursively sifts a node down until the subtree rooted at it satisfies the max-heap property. Time complexity of heapify is O(log(n)) and building a heap takes O(n). Thus, heap sort has an overall time complexity of O(n*log(n)).
def heapify(arr, n, i):
    largest = i        # Initialize largest as root
    l = 2 * i + 1      # left = 2*i + 1
    r = 2 * i + 2      # right = 2*i + 2
    # left child larger than root?
    if l < n and arr[i] < arr[l]:
        largest = l
    # right child larger than largest so far?
    if r < n and arr[largest] < arr[r]:
        largest = r
    # change the root
    if largest != i:
        arr[i], arr[largest] = arr[largest], arr[i]  # swap
        # Continue sifting down the affected subtree
        heapify(arr, n, largest)

def heapSort(arr):
    n = len(arr)
    # Build a max heap; nodes n//2 .. n-1 are leaves and already heaps.
    for i in range(n // 2 - 1, -1, -1):
        heapify(arr, n, i)
    # Repeatedly move the root (max) to the end and shrink the heap.
    for i in range(n - 1, 0, -1):
        arr[i], arr[0] = arr[0], arr[i]  # swap
        heapify(arr, i, 0)
    return arr