My wife and I are avid board gamers, but oddly one of the things we enjoy most is organizing the game itself. We’ve often joked about buying a game just to organize it. I love the “pop” as the chits break free from their cardboard bonds; discovering how each piece looks and sorting them according to size, shape and color; figuring out how to put the pieces back in the box without having any of them break before the next time we play the game.

I think this love of tidiness is also what draws me to algorithms and sorting algorithms in particular. You start with a jumbled array of items and by following some simple steps you arrive at something organized, something that makes sense.

## Insertion Sort

**Average Complexity**: O(n^2)

**Advantages**: Small footprint and fairly quick for small datasets.

The insertion sort is probably the simplest sorting algorithm. It’s actually very similar to what most people do when they’re handed a deck of cards and are asked to sort them.

An insertion sort is done by building a sorted array in place. For each element, compare it to the elements before it. If it’s larger or the same, it stays where it is. If it’s smaller, it gets swapped with the elements before it until it’s either the first element or it’s larger than or equal to the element before it.

### Python implementation

```python
def insertion_sort(items):
    # Start with the second item.
    for i in range(1, len(items)):
        # Cache the position of the active item.
        active_item = i
        # Any items before the active item are considered to be in sorted order.
        for j in range(i - 1, -1, -1):
            # If our active item is smaller than the comparator, swap places.
            if items[j] > items[active_item]:
                items[j], items[active_item] = items[active_item], items[j]
                # Update the position of our active item.
                active_item = j
            else:
                # The item is in sorted order; break out of the loop.
                break
    return items
```


## Mergesort

**Average Complexity**: O(n log(n))

**Advantages**: Highly parallelizable and stable.

Mergesort is a divide-and-conquer algorithm. It splits the input data until it has a single element list for each item in the input and then recursively merges these together in sorted order.

Python’s sort methods use a variant called *timsort* which is essentially a hybrid of mergesort and insertion sort.
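Stability means that items which compare equal keep their original relative order. A quick sketch of why that matters, using only the built-in `sorted` (which, as noted, is timsort under the hood):

```python
# Sort pairs by their second value; the built-in sort is stable, so
# pairs with equal values keep their original left-to-right order.
pairs = [("b", 2), ("a", 1), ("c", 2), ("d", 1)]
by_value = sorted(pairs, key=lambda p: p[1])
print(by_value)  # [('a', 1), ('d', 1), ('b', 2), ('c', 2)]
```

Note that `("b", 2)` still appears before `("c", 2)` after sorting; an unstable sort would make no such guarantee.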

### Python implementation

```python
def mergesort(items):
    if len(items) <= 1:
        # A single element array is always sorted.
        return items

    # Find the middle point of the list.
    pivot = len(items) // 2

    # Divide it into two lists and get the sorted versions.
    left = mergesort(items[:pivot])
    right = mergesort(items[pivot:])

    merged = []
    i, j = 0, 0

    # Merge the lists by comparing each item in both lists and inserting
    # them into a new list in sorted order.
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1

    # Append any remaining items in the left and right lists.
    merged += left[i:]
    merged += right[j:]

    return merged
```


## Quicksort

**Average Complexity**: O(n log(n))

**Advantages**: In practice it’s faster than mergesort and has a smaller memory footprint.

Quicksort works by picking an element in the array to be the pivot, and then dividing all the other elements in the array into two arrays. One contains items that are smaller than the pivot; the other contains items that are larger. Recursion is then used to sort the subdivided arrays.

### Python implementation

```python
def quicksort(items):
    if len(items) <= 1:
        # A single element array is always sorted.
        return items

    # Select the last item as the pivot.
    pivot = items.pop()

    lesser, greater = [], []

    # Loop through the items and append them to the correct list.
    for item in items:
        if item < pivot:
            lesser.append(item)
        else:
            greater.append(item)

    # Recursively sort the lesser and greater items and return the result.
    return quicksort(lesser) + [pivot] + quicksort(greater)
```
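The memory-footprint advantage comes from in-place variants, which partition the array by swapping elements rather than allocating new lists. A minimal sketch of one such variant (Lomuto partitioning; `quicksort_inplace` is my name, not part of the implementation above):

```python
def quicksort_inplace(items, lo=0, hi=None):
    """Sort items in place by partitioning around the last element."""
    if hi is None:
        hi = len(items) - 1
    if lo >= hi:
        return items

    pivot = items[hi]
    # Move every element smaller than the pivot to the front of the range;
    # `boundary` tracks where the next small element belongs.
    boundary = lo
    for i in range(lo, hi):
        if items[i] < pivot:
            items[i], items[boundary] = items[boundary], items[i]
            boundary += 1

    # Drop the pivot between the two partitions.
    items[boundary], items[hi] = items[hi], items[boundary]

    quicksort_inplace(items, lo, boundary - 1)
    quicksort_inplace(items, boundary + 1, hi)
    return items
```

The only extra space used is the recursion stack, versus the new `lesser` and `greater` lists allocated at every level above.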


## Heap Sort

**Average Complexity**: O(n log(n))

**Advantages**: A better worst-case running time than quicksort, though in most cases it’s slower in practice.

Heap sort is interesting when compared to the other three algorithms here, in that it doesn’t use an array as its primary data structure; it uses a heap. A heap is a type of priority queue, generally implemented as a binary tree where any given node always has a higher priority (in the case of sorting, a smaller number) than its children.
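Even though the heap is conceptually a tree, it can live in a flat array: the children of the node at index `i` sit at `2i + 1` and `2i + 2`, and its parent at `(i - 1) // 2`. A small sketch (the helper name is mine) that checks this heap property on example arrays:

```python
def is_min_heap(items):
    """Check that every parent is <= its children in the implicit tree."""
    for i in range(len(items)):
        for child in ((2 * i) + 1, (2 * i) + 2):
            if child < len(items) and items[i] > items[child]:
                return False
    return True

print(is_min_heap([1, 2, 5, 3, 4]))  # True: every parent <= its children
print(is_min_heap([2, 1, 3]))        # False: 2 is larger than its child 1
```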

Once you have a heap, returning a sorted array is very simple: you just keep popping off the root node until the tree is empty. This can also be done much more simply by using Python’s built-in `heapq` module.
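For comparison, here is a sketch of the same idea built on the standard library’s `heapq` (the function name `heapsort_stdlib` is mine):

```python
import heapq

def heapsort_stdlib(iterable):
    """Heap sort via the built-in heapq module."""
    heap = list(iterable)
    heapq.heapify(heap)  # Bottom-up heap construction in O(n).
    # Repeatedly pop the smallest remaining element.
    return [heapq.heappop(heap) for _ in range(len(heap))]

print(heapsort_stdlib([4, 3, 5, 2, 1]))  # [1, 2, 3, 4, 5]
```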

### Python implementation

```python
class Heap(object):
    def __init__(self, iterable=None):
        # We're going to store our heap in an array.
        self.items = []
        self.item_count = 0
        for item in iterable or []:
            self.add(item)

    def add(self, value):
        self.items.append(value)
        self.item_count += 1
        self._balance_up(self.item_count - 1)

    def _balance_up(self, index):
        if index == 0:
            return
        # Since we're dealing with an array, we use some maths to figure out
        # which of the indexes is the parent of the current item.
        parent = (index - 1) // 2
        if self.items[parent] > self.items[index]:
            self.items[parent], self.items[index] = self.items[index], self.items[parent]
            self._balance_up(parent)

    def _balance_down(self, index=0):
        if index >= len(self.items):
            return

        bounds = len(self.items) - 1

        # Get the position of the children of the current element.
        left, right = (index * 2) + 1, (index * 2) + 2

        # Find which child has the smallest value.
        smallest = index
        if left <= bounds and self.items[left] < self.items[smallest]:
            smallest = left
        if right <= bounds and self.items[right] < self.items[smallest]:
            smallest = right

        # If the current element doesn't have the smallest value, swap it with
        # its child and continue to balance the tree.
        if index != smallest:
            self.items[smallest], self.items[index] = self.items[index], self.items[smallest]
            self._balance_down(smallest)

    def pop(self):
        # Swap the root element with the last, pop it off, and re-balance
        # the tree.
        last_index = self.item_count - 1
        self.items[0], self.items[last_index] = self.items[last_index], self.items[0]
        item = self.items.pop()
        self.item_count -= 1
        self._balance_down()
        return item

    def is_empty(self):
        return self.item_count == 0


def heapsort(iterable):
    result = []
    heap = Heap(iterable)
    while not heap.is_empty():
        result.append(heap.pop())
    return result
```
