
Best, Average and Worst Case Analysis

Introduction

When analyzing algorithms, we often encounter functions whose running time depends on the input: the same algorithm may take constant time in some scenarios and linear time in others. To analyze such algorithms properly, we divide all possible inputs into three cases:

  • Best Case: Minimum time complexity
  • Average Case: Expected time complexity across all inputs
  • Worst Case: Maximum time complexity

Example 1: Simple Array Sum Function

Let's start with a basic example to understand order of growth:

function getSum(arr) {
    let sum = 0;                    // constant work (part of C2)
    for (let i = 0; i < arr.length; i++) {
        sum += arr[i];              // constant work per iteration → C1*n in total
    }
    return sum;                     // constant work (part of C2)
}

Analysis:

  • Constant operations: Initialization of sum and i, plus the return statement → C2
  • Linear operations: Loop condition check, increment, and addition, each performed n times → C1 * n
  • Total time: C1*n + C2
  • Order of growth: O(n) (linear)

This function always has the same complexity regardless of input, so:

  • Best Case: O(n)
  • Average Case: O(n)
  • Worst Case: O(n)
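To make this concrete, here is a small sketch of getSum instrumented with an iteration counter (the counter and the name getSumCounted are illustrative additions, not part of the original function):

```javascript
// getSum restated with a counter so the sketch is self-contained;
// iterations records how many times the loop body runs.
function getSumCounted(arr) {
    let sum = 0;
    let iterations = 0;
    for (let i = 0; i < arr.length; i++) {
        sum += arr[i];
        iterations++;   // one unit of work per element, regardless of values
    }
    return { sum, iterations };
}
```

Every array of length n costs exactly n iterations, whatever its contents, which is why best, average, and worst case all coincide at O(n).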

Example 2: Conditional Array Sum Function

Now let's examine a more complex function with different behaviors:

function conditionalSum(arr) {
    if (arr.length % 2 === 0) {     // Even number of elements
        return 0;                   // Constant time
    }
    
    // Odd number of elements - calculate sum
    let sum = 0;
    for (let i = 0; i < arr.length; i++) {
        sum += arr[i];
    }
    return sum;
}

Analysis by Cases:

Best Case Analysis

  • Scenario: Array has even number of elements
  • Operations: Length calculation + modulo check + return
  • Time Complexity: O(1) (constant)
  • Example: [2, 3] → returns 0 immediately

Worst Case Analysis

  • Scenario: Array has odd number of elements
  • Operations: Length check + full array traversal + sum calculation
  • Time Complexity: O(n) (linear)
  • Example: [2, 3, 1] → returns 6 after processing all elements

Average Case Analysis

  • Assumption: Even- and odd-sized arrays are equally likely
  • Calculation:
    • 50% of the time: C1 (constant time for even-sized arrays)
    • 50% of the time: C2*n + C3 (linear time for odd-sized arrays)
    • Average: (C1/2) + (C2*n + C3)/2 = (C2/2)*n + (C1 + C3)/2
  • Time Complexity: O(n) (the n term dominates once constants are ignored)
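The gap between the two cases can be observed directly. Below is a sketch of conditionalSum instrumented with an operation counter (the opCount field and the name conditionalSumCounted are illustrative additions):

```javascript
// conditionalSum restated with a loop-iteration counter so the two
// cases can be compared on concrete inputs.
function conditionalSumCounted(arr) {
    if (arr.length % 2 === 0) {
        return { sum: 0, opCount: 0 };  // best case: no loop iterations
    }
    let sum = 0;
    let opCount = 0;
    for (let i = 0; i < arr.length; i++) {
        sum += arr[i];
        opCount++;                      // one unit of work per element
    }
    return { sum, opCount };            // worst case: n iterations
}
```

An even-sized input like [2, 3] performs zero loop iterations, while an odd-sized input like [2, 3, 1] performs one iteration per element.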

When to Use Each Analysis

Best Case Analysis

  • Usage: Rarely used in practice
  • Reason: Provides limited practical value
  • Problem: Doesn't give meaningful performance guarantees

Average Case Analysis

  • Usage: Ideal but often impractical
  • Challenges:
    • Requires knowledge of all possible inputs
    • Need to know probability distribution of inputs
    • Often involves complex mathematical analysis
  • When feasible: Make reasonable assumptions about input distribution

Worst Case Analysis

  • Usage: Most commonly used in algorithm analysis
  • Advantages:
    • Provides performance guarantees
    • Easier to analyze than average case
    • Gives upper bound on algorithm performance
  • Approach: Consider the input that causes maximum execution time

Real-World Example: Insertion Sort

Insertion Sort demonstrates all three cases clearly:

Best Case: O(n)

  • Scenario: Array is already sorted
  • Example: [1, 2, 3, 4, 5]
  • Behavior: Each element is compared once and found to be in correct position

Worst Case: O(n²)

  • Scenario: Array is sorted in reverse order
  • Example: [5, 4, 3, 2, 1]
  • Behavior: Each element must be compared with all previous elements

Average Case: O(n²)

  • Scenario: Random unsorted array
  • Analysis: On average, each element is compared with half of the previous elements
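The three cases above can be verified with a sketch of insertion sort instrumented with a comparison counter (the counter and return shape are illustrative additions used to contrast the cases):

```javascript
// Insertion sort with a comparison counter. On a sorted array each key
// is compared once and the inner loop stops immediately (n-1 comparisons);
// on a reverse-sorted array each key is compared with all previous
// elements (n*(n-1)/2 comparisons).
function insertionSort(arr) {
    const a = arr.slice();          // copy so the input is not mutated
    let comparisons = 0;
    for (let i = 1; i < a.length; i++) {
        const key = a[i];
        let j = i - 1;
        while (j >= 0) {
            comparisons++;
            if (a[j] <= key) break; // best case: already in place, stop here
            a[j + 1] = a[j];        // shift the larger element right
            j--;
        }
        a[j + 1] = key;
    }
    return { sorted: a, comparisons };
}
```

For n = 5, the sorted input [1, 2, 3, 4, 5] needs 4 comparisons (linear), while the reversed input [5, 4, 3, 2, 1] needs 10 (quadratic growth).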

Key Principles

1. Algorithm Categories

Single-case algorithms: Always have the same complexity

  • Finding maximum/minimum in array
  • Simple array traversal
  • Basic mathematical operations

Multi-case algorithms: Complexity varies with input

  • Searching algorithms
  • Sorting algorithms with optimizations
  • Algorithms with conditional logic
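Linear search is perhaps the simplest multi-case algorithm, and serves as a quick sketch of the idea:

```javascript
// Linear search: cost depends on where (or whether) the target appears.
function linearSearch(arr, target) {
    for (let i = 0; i < arr.length; i++) {
        if (arr[i] === target) {
            return i;   // best case: target at index 0 → O(1)
        }
    }
    return -1;          // worst case: target absent → O(n), full scan
}
```

Searching [7, 3, 9] for 7 stops after one comparison, while searching for a missing value like 4 examines every element.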

2. Analysis Priority

  1. Worst Case (Most Important)

    • Provides performance guarantees
    • Used in most algorithm analysis
    • Critical for system design
  2. Average Case (Desirable but Complex)

    • Most realistic performance measure
    • Requires statistical assumptions
    • Often mathematically challenging
  3. Best Case (Least Useful)

    • Rarely considered in practice
    • Provides no meaningful guarantees
    • Can be misleading

3. Practical Guidelines

  • For system design: Always consider worst-case performance
  • For optimization: Understand average-case behavior
  • For comparison: Use worst-case analysis as standard metric

Summary

Understanding best, average, and worst-case analysis is crucial for:

  • Performance prediction: Know how algorithms behave under different conditions
  • Algorithm selection: Choose appropriate algorithms for specific use cases
  • System design: Plan for worst-case scenarios to ensure reliability
  • Optimization: Focus efforts on improving common-case performance

Remember: While average case gives the most realistic performance picture, worst-case analysis provides the guarantees needed for robust system design. Most algorithm analysis focuses on worst-case complexity because it's both practical to compute and provides meaningful performance bounds.