# Insertion Sort

A simple sorting algorithm that builds the sorted array one element at a time. Great for small datasets and nearly sorted arrays.
Insertion sort is one of the simplest sorting algorithms. It works the way you might sort playing cards in your hand: pick up one card at a time and insert it into its correct position among the cards already in your hand.
## How It Works
- Start with the second element (assume first is already sorted)
- Compare it with elements before it
- Shift larger elements right
- Insert the current element in its correct position
- Repeat for all remaining elements
Sorting `[5, 2, 4, 6, 1, 3]`:

```
Pass 1: insert 2   (2 < 5, shift 5 right)
        [5, 2, 4, 6, 1, 3] -> [2, 5, 4, 6, 1, 3]
Pass 2: insert 4   (4 < 5, shift 5 right; 4 > 2, insert)
        [2, 5, 4, 6, 1, 3] -> [2, 4, 5, 6, 1, 3]
Pass 3: insert 6   (6 > 5, already in place)
        [2, 4, 5, 6, 1, 3] -> [2, 4, 5, 6, 1, 3]
Pass 4: insert 1   (1 < all, shift all right)
        [2, 4, 5, 6, 1, 3] -> [1, 2, 4, 5, 6, 3]
Pass 5: insert 3   (3 < 4, 5, 6; 3 > 2, insert)
        [1, 2, 4, 5, 6, 3] -> [1, 2, 3, 4, 5, 6]
```
## Implementation
```python
def insertion_sort(arr):
    for i in range(1, len(arr)):
        key = arr[i]  # Element to insert
        j = i - 1
        # Shift elements greater than key to the right
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        # Insert key in correct position
        arr[j + 1] = key
    return arr

print(insertion_sort([5, 2, 4, 6, 1, 3]))
# [1, 2, 3, 4, 5, 6]
```
```go
func insertionSort(arr []int) []int {
    for i := 1; i < len(arr); i++ {
        key := arr[i]
        j := i - 1
        for j >= 0 && arr[j] > key {
            arr[j+1] = arr[j]
            j--
        }
        arr[j+1] = key
    }
    return arr
}
```
```rust
fn insertion_sort(arr: &mut [i32]) {
    for i in 1..arr.len() {
        let key = arr[i];
        let mut j = i as i32 - 1;
        while j >= 0 && arr[j as usize] > key {
            arr[(j + 1) as usize] = arr[j as usize];
            j -= 1;
        }
        arr[(j + 1) as usize] = key;
    }
}
```
## Time Complexity

**Best case: O(n).** When the array is already sorted, the inner loop never executes:

```
Already sorted: [1, 2, 3, 4, 5]
```

Each element is compared once and stays in place.

**Worst case: O(n²).** When the array is reverse sorted, every element must be moved:

```
Reverse sorted: [5, 4, 3, 2, 1]
```

Total comparisons for n = 5: 1 + 2 + 3 + 4 = 10 = n(n-1)/2, which is O(n²).

**Average case: O(n²).** On average, each element moves about half the distance it would in the worst case, which still grows quadratically.
## Space Complexity

**O(1)** — in-place sorting. Only a constant amount of extra memory is used, for `key` and the loop variables.
## Properties

| Property         | Insertion Sort |
|------------------|----------------|
| Time (best)      | O(n)           |
| Time (worst/avg) | O(n²)          |
| Space            | O(1)           |
| Stable           | Yes            |
| In-place         | Yes            |
| Adaptive         | Yes            |
### Stability

A sorting algorithm is stable if equal elements maintain their relative order. Insertion sort is stable because elements are shifted only when they are strictly greater (`arr[j] > key`), never greater-or-equal.
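A quick sketch of stability in action (the record layout and the `insertion_sort_by_key` helper are invented for this illustration): sorting (key, label) pairs by key alone, the pairs with equal keys come out in their original input order.

```python
# Insertion sort over records, comparing only the extracted key.
# Strict '>' means equal keys are never shifted past each other.
def insertion_sort_by_key(records, key):
    for i in range(1, len(records)):
        item = records[i]
        j = i - 1
        while j >= 0 and key(records[j]) > key(item):
            records[j + 1] = records[j]
            j -= 1
        records[j + 1] = item
    return records

cards = [(3, "a"), (1, "b"), (3, "c"), (1, "d")]
insertion_sort_by_key(cards, key=lambda r: r[0])
print(cards)  # [(1, 'b'), (1, 'd'), (3, 'a'), (3, 'c')]
```

The two 1-keys keep "b" before "d" and the two 3-keys keep "a" before "c", exactly as they appeared in the input.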
### Adaptive
An adaptive algorithm performs better on partially sorted data. Insertion sort's O(n) best case on sorted data makes it adaptive—the more sorted the input, the faster it runs.
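To make the adaptive behavior concrete, here is a sketch that counts inner-loop comparisons (the `insertion_sort_count` helper is made up for this demo): sorted input costs about n comparisons, reversed input about n²/2.

```python
# Same algorithm, instrumented to count comparisons of arr[j] vs key.
def insertion_sort_count(arr):
    comparisons = 0
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0:
            comparisons += 1          # one comparison per inner-loop step
            if arr[j] <= key:
                break                 # key is in place, stop early
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key
    return comparisons

print(insertion_sort_count(list(range(10))))         # 9  (already sorted)
print(insertion_sort_count(list(range(10, 0, -1))))  # 45 (reverse sorted)
```

For n = 10, the sorted input needs n − 1 = 9 comparisons while the reversed input needs n(n−1)/2 = 45.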
## When to Use Insertion Sort

**Good for:**
- Small arrays (n < 50)
- Nearly sorted arrays
- Online sorting (data arrives one element at a time)
- When simplicity matters more than speed
- Hybrid algorithms (used for small subarrays in Timsort, Introsort)
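The online case can be sketched with the standard library's `bisect.insort`, which performs exactly one insertion-sort step (a binary search plus a shift-and-insert) per arriving element:

```python
import bisect

# Online sorting: keep a running sorted list as values arrive one at a time.
stream = [5, 2, 4, 6, 1, 3]
sorted_so_far = []
for x in stream:
    bisect.insort(sorted_so_far, x)  # binary search, then shift and insert
    print(sorted_so_far)
# final state: [1, 2, 3, 4, 5, 6]
```

After each arrival the prefix seen so far is fully sorted, which batch algorithms like merge sort cannot offer.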
**Not good for:**
- Large unsorted datasets
- When O(n log n) performance is required
## Nearly Sorted Data
For arrays where each element is at most k positions from its sorted position, insertion sort runs in O(nk) time:
```python
# Nearly sorted: each element is at most 1 position from its sorted spot
arr = [2, 1, 3, 5, 4, 7, 6, 8, 9]
# After insertion sort: [1, 2, 3, 4, 5, 6, 7, 8, 9]
# Each element moves at most k = 1 positions: O(n * k) = O(n) for constant k
```
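A sketch that verifies the bound by counting shifts (the `insertion_sort_shifts` counter is added purely for illustration). In this particular array each element sits at most one position from its final spot, so the total number of shifts stays far below n²:

```python
# Instrumented insertion sort that counts how many elements get shifted.
def insertion_sort_shifts(arr):
    shifts = 0
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            shifts += 1               # one shift per displaced element
            j -= 1
        arr[j + 1] = key
    return shifts

print(insertion_sort_shifts([2, 1, 3, 5, 4, 7, 6, 8, 9]))  # 3
```

Only 3 shifts for n = 9, consistent with the O(nk) bound when k is a small constant.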
## Recursive Version
Insertion sort can be expressed recursively:
```python
def insertion_sort_recursive(arr, n=None):
    if n is None:
        n = len(arr)
    # Base case: first element is already sorted
    if n <= 1:
        return
    # Sort first n-1 elements
    insertion_sort_recursive(arr, n - 1)
    # Insert last element in sorted portion
    key = arr[n - 1]
    j = n - 2
    while j >= 0 and arr[j] > key:
        arr[j + 1] = arr[j]
        j -= 1
    arr[j + 1] = key
```
## Binary Insertion Sort
Use binary search to find insertion position (reduces comparisons but not shifts):
```python
import bisect

def binary_insertion_sort(arr):
    for i in range(1, len(arr)):
        key = arr[i]
        # Find position using binary search
        pos = bisect.bisect_left(arr, key, 0, i)
        # Shift elements and insert
        arr[pos+1:i+1] = arr[pos:i]
        arr[pos] = key
    return arr
```
Time complexity: O(n log n) comparisons, but still O(n²) shifts.
## Comparison with Other O(n²) Sorts

| Aspect    | Insertion | Selection | Bubble |
|-----------|-----------|-----------|--------|
| Best case | O(n)      | O(n²)     | O(n)   |
| Swaps     | O(n²)     | O(n)      | O(n²)  |
| Adaptive  | Yes       | No        | Yes    |
| Stable    | Yes       | No        | Yes    |
| Online    | Yes       | No        | No     |
## Use in Hybrid Algorithms
Modern sorting libraries use insertion sort for small subarrays:
```python
def hybrid_sort(arr, threshold=16):
    """Quicksort with insertion sort for small arrays."""
    if len(arr) <= threshold:
        return insertion_sort(arr)
    # Normal quicksort for larger arrays
    pivot = arr[len(arr) // 2]
    left = [x for x in arr if x < pivot]
    middle = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]
    return hybrid_sort(left) + middle + hybrid_sort(right)
```
### Why Small Array Threshold?
For small arrays, the overhead of recursion in merge sort or quicksort exceeds the benefit of O(n log n). Insertion sort's simplicity and low overhead make it faster for small n, typically around 16-64 elements.
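One rough way to probe this (exact numbers vary by machine and Python version) is to time two pure-Python routines against each other on a tiny array; both functions are redefined here so the snippet is self-contained, and no particular winner is guaranteed:

```python
import random
import timeit

# Pure-Python insertion sort, as in the implementation above.
def insertion_sort(arr):
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key
    return arr

# Pure-Python quicksort, with its recursion and list-building overhead.
def quicksort(arr):
    if len(arr) <= 1:
        return arr
    pivot = arr[len(arr) // 2]
    return (quicksort([x for x in arr if x < pivot])
            + [x for x in arr if x == pivot]
            + quicksort([x for x in arr if x > pivot]))

data = [random.random() for _ in range(16)]
t_ins = timeit.timeit(lambda: insertion_sort(data[:]), number=5000)
t_qs = timeit.timeit(lambda: quicksort(data), number=5000)
print(f"insertion sort: {t_ins:.4f}s   quicksort: {t_qs:.4f}s")
```

On typical machines the insertion sort wins at this size, which is why hybrid sorts hand off small subarrays to it; the exact crossover point has to be measured per implementation.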
## Key Takeaways
- Insertion sort builds the sorted array one element at a time
- O(n²) average/worst case, but O(n) best case for sorted data
- In-place, stable, and adaptive
- Excellent for small arrays and nearly sorted data
- Used as a building block in hybrid sorting algorithms
- Think of it like sorting cards in your hand
Insertion sort may not be the fastest algorithm, but its simplicity, stability, and excellent performance on small or nearly sorted data make it a practical choice in many real-world scenarios.
