Overview of Sorting Algorithms
Introduction
Sorting is one of the most fundamental problems in computer science and a building block of many other algorithms. Many sorting algorithms exist, each with its own strengths and ideal use cases.
Special-Case Sorting Algorithms
Binary Array Sorting (Two Types of Elements)
- Problem: Sort an array with only two types of elements (e.g., 0s and 1s)
- Solution: Use partition algorithms from quicksort
- Naive Partition: Stable but requires O(n) extra space
- Lomuto Partition: In-place and simple, but does more swaps than Hoare
- Hoare Partition: In-place and fastest in practice (fewest swaps on average)
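The Hoare-style two-pointer partition can be sketched for a 0/1 array as follows (a minimal, non-stable sketch; the function name is illustrative):

```python
def sort_binary(arr):
    """Two-pointer (Hoare-style) partition: moves all 0s before all 1s in one pass."""
    lo, hi = 0, len(arr) - 1
    while lo < hi:
        # Advance past 0s from the left and 1s from the right.
        while lo < hi and arr[lo] == 0:
            lo += 1
        while lo < hi and arr[hi] == 1:
            hi -= 1
        if lo < hi:
            arr[lo], arr[hi] = arr[hi], arr[lo]
    return arr
```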
Array with Three Possible Values
- Problem: Sort arrays with three distinct values (e.g., 0s, 1s, and 2s)
- Solution: Extended quicksort partitioning
- Same options as binary sorting (Naive/Lomuto/Hoare)
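For three values, the standard single-pass extension is Dijkstra's Dutch National Flag partition, sketched here (function name is illustrative):

```python
def dutch_flag(arr):
    """Single-pass three-way partition for values 0, 1, 2 (Dutch National Flag)."""
    lo, mid, hi = 0, 0, len(arr) - 1
    while mid <= hi:
        if arr[mid] == 0:
            arr[lo], arr[mid] = arr[mid], arr[lo]  # 0 goes to the front region
            lo += 1
            mid += 1
        elif arr[mid] == 1:
            mid += 1                               # 1 stays in the middle
        else:
            arr[mid], arr[hi] = arr[hi], arr[mid]  # 2 goes to the back region
            hi -= 1
    return arr
```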
Small Range Values (Counting Sort)
- When to use: Array size = n, value range = k where k is small (e.g., 100-200)
- Time: O(n + k)
- Space: O(k)
- Best for: Small ranges (k < n)
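A stable counting-sort sketch; note that this stable variant needs O(n + k) space because it builds a separate output array:

```python
def counting_sort(arr, k):
    """Stable counting sort for integers in [0, k). O(n + k) time."""
    count = [0] * k
    for x in arr:
        count[x] += 1
    # Prefix sums give each value's first position in the output.
    pos = [0] * k
    for v in range(1, k):
        pos[v] = pos[v - 1] + count[v - 1]
    out = [0] * len(arr)
    for x in arr:
        out[pos[x]] = x
        pos[x] += 1
    return out
```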
Large Range Values (Radix Sort)
- When to use: Range is polynomial in n, e.g., n² or n³ (such as 0-100,000,000 for n = 10,000)
- Time: O(d·(n + b)) for d digits in base b; effectively O(n) when the range is nᶜ for a constant c
- Space: O(n)
- Advantage: Better than counting sort for large ranges
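An LSD (least-significant-digit) radix-sort sketch for non-negative integers, assuming base-10 digits:

```python
def radix_sort(arr, base=10):
    """LSD radix sort: repeated stable bucketing by digit, least significant first."""
    if not arr:
        return arr
    max_val = max(arr)
    exp = 1
    while max_val // exp > 0:
        # Stable pass: distribute by the current digit, then concatenate in order.
        buckets = [[] for _ in range(base)]
        for x in arr:
            buckets[(x // exp) % base].append(x)
        arr = [x for b in buckets for x in b]
        exp *= base
    return arr
```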
Uniformly Distributed Data (Bucket Sort)
- When to use: Data uniformly distributed (e.g., floating points 0.0-1.0)
- Performance: O(n) average case
- Best for: Uniform distributions
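A bucket-sort sketch assuming values uniformly distributed in [0, 1); under that assumption each bucket holds O(1) elements on average, giving O(n) expected time:

```python
def bucket_sort(arr):
    """Bucket sort for floats uniformly distributed in [0, 1)."""
    n = len(arr)
    if n == 0:
        return arr
    buckets = [[] for _ in range(n)]
    for x in arr:
        buckets[min(int(x * n), n - 1)].append(x)
    out = []
    for b in buckets:
        out.extend(sorted(b))  # buckets are tiny on average, so this is cheap
    return out
```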
Costly Memory Writes
- Algorithms:
- Selection Sort:
- Time: O(n²)
- Minimizes swaps: at most n − 1 (good for EEPROM)
- Cycle Sort:
- Time: O(n²)
- Performs the minimum possible number of memory writes
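A cycle-sort sketch that also counts writes, illustrating the minimal-writes property (names are illustrative):

```python
def cycle_sort(arr):
    """Cycle sort: each element is written at most once, directly to its final slot."""
    writes = 0
    n = len(arr)
    for start in range(n - 1):
        item = arr[start]
        # Find where `item` belongs by counting smaller elements to its right.
        pos = start
        for i in range(start + 1, n):
            if arr[i] < item:
                pos += 1
        if pos == start:
            continue  # already in place: zero writes
        while item == arr[pos]:
            pos += 1  # skip duplicates
        arr[pos], item = item, arr[pos]
        writes += 1
        # Rotate the rest of the cycle until we come back to `start`.
        while pos != start:
            pos = start
            for i in range(start + 1, n):
                if arr[i] < item:
                    pos += 1
            while item == arr[pos]:
                pos += 1
            arr[pos], item = item, arr[pos]
            writes += 1
    return arr, writes
```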
Adjacent Swaps Only
- Algorithm: Bubble Sort (or its bidirectional variant, Cocktail Sort)
- Use case: When only adjacent swaps allowed
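A bubble-sort sketch that uses only adjacent swaps, with the standard early-exit optimization:

```python
def bubble_sort(arr):
    """Bubble sort: only adjacent swaps; stops early once a pass makes no swap."""
    for end in range(len(arr) - 1, 0, -1):
        swapped = False
        for i in range(end):
            if arr[i] > arr[i + 1]:
                arr[i], arr[i + 1] = arr[i + 1], arr[i]
                swapped = True
        if not swapped:
            break  # array already sorted
    return arr
```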
Small Arrays (n ≤ 20)
- Best Algorithm: Insertion Sort
- Why: Simple and efficient for tiny datasets
- Note: Used by many standard libraries for small subarrays
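A minimal insertion-sort sketch, the form typically used on small subarrays:

```python
def insertion_sort(arr):
    """Insertion sort: stable, in-place, fast on tiny or nearly sorted arrays."""
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        # Shift larger elements right, then drop `key` into the gap.
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key
    return arr
```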
Limited Extra Memory (Shell Sort)
- When to use: Embedded systems with minimal memory
- Time: O(n log² n) [optimized]
- Space: O(1)
- Advantage: No recursion stack needed
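A shell-sort sketch with the simple halving gap sequence; note the O(n log² n) bound quoted above requires a better gap sequence (e.g., Pratt's), while the halving sequence shown here is the common textbook form:

```python
def shell_sort(arr):
    """Shell sort: gapped insertion sort. O(1) extra space, no recursion."""
    n = len(arr)
    gap = n // 2
    while gap > 0:
        # Insertion sort over elements `gap` apart.
        for i in range(gap, n):
            key = arr[i]
            j = i
            while j >= gap and arr[j - gap] > key:
                arr[j] = arr[j - gap]
                j -= gap
            arr[j] = key
        gap //= 2
    return arr
```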
General-Purpose Sorting Algorithms
Comparison-Based Sorting (Ω(n log n) lower bound)
Algorithm | Time | Space | Stable | Notes |
---|---|---|---|---|
Merge Sort | O(n log n) | O(n) | Yes | Best for linked lists, stable |
Heap Sort | O(n log n) | O(1) | No | Not suitable for linked lists |
Quick Sort | O(n log n)* | O(log n) | No | *O(n²) worst-case but fastest |
Key Points:
- QuickSort: Fastest in practice (average case)
- MergeSort:
- Stable
- Excellent for linked lists/external sorting
- Parallelizes well
- HeapSort:
- In-place but not stable
- Cannot be easily adapted for linked lists
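MergeSort's stability comes from always taking the left element on ties; a top-down sketch:

```python
def merge_sort(arr):
    """Top-down merge sort: stable because '<=' takes the left element on ties."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:  # '<=' preserves the order of equal keys
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out
```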
Hybrid Sorting Algorithms (Used in Standard Libraries)
- Tim Sort (Python):
- Combination of Merge Sort + Insertion Sort
- Uses Insertion Sort for small subarrays
- Intro Sort (C++ std::sort):
- Hybrid of QuickSort + HeapSort + Insertion Sort
- Starts with QuickSort
- Falls back to HeapSort for bad partitions
- Uses Insertion Sort for small arrays
- Stable Sort (C++ std::stable_sort):
- Typically implemented with Merge Sort
- Used when stability is required
JavaScript's Array.sort()
- Implementation varies by engine:

Engine | Algorithm | Notes |
---|---|---|
V8 (Chrome) | Tim Sort (since 2018) | Previously used QuickSort |
SpiderMonkey (Firefox) | Merge Sort | Stable implementation |
JavaScriptCore (Safari) | Hybrid QuickSort + Insertion | Non-stable |

```javascript
// String sorting (lexicographic by default)
['banana', 'apple'].sort(); // ['apple', 'banana']

// Numeric sorting requires a comparator
[10, 2, 1].sort((a, b) => a - b); // [1, 2, 10]
```
Summary Table
Use Case | Best Algorithm(s) | Time Complexity |
---|---|---|
Binary arrays | QuickSort Partition | O(n) |
Small ranges | Counting Sort | O(n + k) |
Large ranges | Radix Sort | O(n) |
Uniform distribution | Bucket Sort | O(n) avg |
Memory-constrained | Shell Sort | O(n log² n) |
General purpose | QuickSort/MergeSort | O(n log n) |
Stable sorting | Merge Sort | O(n log n) |
Small arrays (n ≤ 20) | Insertion Sort | O(n²) |