Analysis of Common Loops in JavaScript

Introduction

Understanding the time complexity of loops is fundamental to algorithm analysis. This document covers the analysis of five common loop patterns that appear frequently in JavaScript programs, as well as multiple loop scenarios. For each loop, we assume that the work inside the loop body is constant (Θ(1)) unless otherwise specified.

Part 1: Single Loop Patterns

Loop Pattern 1: Linear Increment Loop

Code Structure

for (let i = 0; i < n; i += c) {
    // Constant work here
}

Where:

  • n is the user input (variable)
  • c is a business constant (e.g., 2, 3, 5)

Examples

Example 1: n = 10, c = 2
  • Loop iterations: i = 0, 2, 4, 6, 8
  • When i becomes 10, loop exits
  • Total iterations: 5 times
Example 2: n = 20, c = 6
  • Loop iterations: i = 0, 6, 12, 18
  • When i becomes 24 (> 20), loop exits
  • Total iterations: 4 times

Analysis

  • Number of iterations: ⌈n/c⌉ (ceiling of n/c)
  • Time Complexity: Θ(n)
  • Reasoning: We ignore constants and ceiling/floor operations in asymptotic analysis
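
One way to sanity-check this count is to instrument the loop with a counter and compare it against Math.ceil(n / c). The sketch below does exactly that (the helper name countLinearIncrement is purely illustrative):

// Count how many times the linear increment loop body executes
// and compare it with the closed-form count ceil(n / c).
function countLinearIncrement(n, c) {
    let iterations = 0;
    for (let i = 0; i < n; i += c) {
        iterations++; // stands in for the constant work
    }
    return iterations;
}

console.log(countLinearIncrement(10, 2), Math.ceil(10 / 2)); // 5 5
console.log(countLinearIncrement(20, 6), Math.ceil(20 / 6)); // 4 4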

Loop Pattern 2: Linear Decrement Loop

Code Structure

for (let i = n; i > 0; i -= c) {
    // Constant work here
}

Examples

Example 1: n = 10, c = 2
  • Loop iterations: i = 10, 8, 6, 4, 2
  • When i becomes 0, loop exits
  • Total iterations: 5 times
Example 2: n = 20, c = 6
  • Loop iterations: i = 20, 14, 8, 2
  • When i becomes -4, loop exits
  • Total iterations: 4 times

Analysis

  • Number of iterations: ⌈n/c⌉
  • Time Complexity: Θ(n)

Loop Pattern 3: Exponential Increment Loop

Code Structure

for (let i = 1; i < n; i *= c) {
    // Constant work here
}

Examples

Example 1: n = 33, c = 2
  • Loop iterations: i = 1, 2, 4, 8, 16, 32
  • When i becomes 64 (> 33), loop exits
  • Total iterations: 6 times
Example 2: n = 81, c = 3
  • Loop iterations: i = 1, 3, 9, 27
  • When i becomes 81, the condition i < 81 fails and the loop exits
  • Total iterations: 4 times

Mathematical Analysis

The loop body runs with i = c⁰, c¹, c², ..., c^(k-1), where k is the number of iterations.

For the k-th (last) iteration to execute: c^(k-1) < n

Taking the logarithm: k - 1 < log_c(n)

Therefore: k < log_c(n) + 1, which gives k = ⌈log_c(n)⌉

Analysis

  • Number of iterations: ⌈log_c(n)⌉
  • Time Complexity: Θ(log n)
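
The same counting trick verifies the logarithmic behavior; the sketch below (countMultiplicative is an illustrative name) reproduces the counts from the two examples above:

// Count how many times the multiplicative (i *= c) loop body executes.
function countMultiplicative(n, c) {
    let iterations = 0;
    for (let i = 1; i < n; i *= c) {
        iterations++;
    }
    return iterations;
}

console.log(countMultiplicative(33, 2)); // 6, since log2(33) ≈ 5.04 and ⌈5.04⌉ = 6
console.log(countMultiplicative(81, 3)); // 4, since log3(81) = 4 exactly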

Loop Pattern 4: Division Loop

Code Structure

for (let i = n; i > 1; i = Math.floor(i / c)) {
    // Constant work here
}

Examples

Example 1: n = 33, c = 2
  • Loop iterations: i = 33, 16, 8, 4, 2
  • When i becomes 1, loop exits
  • Total iterations: 5 times
Example 2: n = 81, c = 3
  • Loop iterations: i = 81, 27, 9, 3
  • When i becomes 1, loop exits
  • Total iterations: 4 times

Mathematical Analysis

The loop sequence (ignoring the floors): n, n/c, n/c², ..., n/c^(k-1)

For the k-th (last) iteration to execute: n/c^(k-1) > 1

Equivalently: c^(k-1) < n

Taking the logarithm: k - 1 < log_c(n)

So: k < log_c(n) + 1

Analysis

  • Number of iterations: ⌊log_c(n)⌋
  • Time Complexity: Θ(log n)
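
A counter in the same style (countDivision is an illustrative helper) confirms the counts from the examples:

// Count how many times the division loop body executes.
function countDivision(n, c) {
    let iterations = 0;
    for (let i = n; i > 1; i = Math.floor(i / c)) {
        iterations++;
    }
    return iterations;
}

console.log(countDivision(33, 2)); // 5 = ⌊log2(33)⌋
console.log(countDivision(81, 3)); // 4 = log3(81)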

Loop Pattern 5: Exponential Exponentiation Loop

Code Structure

for (let i = 2; i < n; i = Math.pow(i, c)) {
    // Constant work here
}

Examples

Example 1: n = 33, c = 2
  • Loop iterations: i = 2, 4, 16
  • Next would be 16² = 256 (> 33), so loop exits
  • Total iterations: 3 times
Example 2: n = 512, c = 3
  • Loop iterations: i = 2, 8
  • When i becomes 8³ = 512, the condition i < 512 fails and the loop exits
  • Total iterations: 2 times

Key Observation

Even for very large values of n, this loop runs only a handful of iterations: with c = 2, even n = 10⁹ gives just 5 iterations (i = 2, 4, 16, 256, 65536). This indicates extremely slow growth.

Mathematical Analysis

The loop sequence: 2, 2^c, 2^(c²), 2^(c³), ..., 2^(c^(k-1))

For the k-th (last) iteration to execute: 2^(c^(k-1)) < n

Taking logarithm: c^(k-1) < log₂(n)

Taking logarithm again: k-1 < log_c(log₂(n))

Therefore: k < log_c(log₂(n)) + 1

Analysis

  • Number of iterations: ⌈log_c(log₂(n))⌉
  • Time Complexity: Θ(log log n)
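
To make the double-logarithmic growth concrete, the sketch below (countPowerLoop is an illustrative name) counts the iterations for the two examples and for a much larger input:

// Count how many times the exponentiation loop body executes.
function countPowerLoop(n, c) {
    let iterations = 0;
    for (let i = 2; i < n; i = Math.pow(i, c)) {
        iterations++;
    }
    return iterations;
}

console.log(countPowerLoop(33, 2));  // 3  (i = 2, 4, 16)
console.log(countPowerLoop(512, 3)); // 2  (i = 2, 8)
console.log(countPowerLoop(1e9, 2)); // 5  (i = 2, 4, 16, 256, 65536)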

Part 2: Multiple Loop Analysis

When analyzing functions with multiple loops, we need to consider how loops are arranged: sequentially (one after another) or nested (one inside another).

Example 1: Sequential Loops

Code Structure

// First loop: Linear
for (let i = 0; i < n; i++) {
    // Constant work
}
 
// Second loop: Logarithmic
for (let i = 1; i < n; i *= 2) {
    // Constant work
}
 
// Third loop: Constant
for (let i = 1; i <= 99; i++) {
    // Constant work
}

Analysis

  • First loop: Θ(n) - runs n times
  • Second loop: Θ(log n) - exponential increment pattern
  • Third loop: Θ(1) - runs a fixed 99 times, independent of n

Overall Time Complexity

When loops are sequential, we add their complexities:

Total = Θ(n) + Θ(log n) + Θ(1)

Since we ignore lower-order terms:

Total = Θ(n)
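
The addition rule can also be checked empirically. The sketch below (countSequential is an illustrative helper) counts the total number of constant-work executions across the three loops; the linear term clearly dominates as n grows:

// Total constant-work executions of the three sequential loops.
function countSequential(n) {
    let operations = 0;
    for (let i = 0; i < n; i++) operations++;    // Θ(n)
    for (let i = 1; i < n; i *= 2) operations++; // Θ(log n)
    for (let i = 1; i <= 99; i++) operations++;  // Θ(1)
    return operations;
}

console.log(countSequential(1000));    // 1109     (1000 + 10 + 99)
console.log(countSequential(1000000)); // 1000119  (1000000 + 20 + 99)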

Example 2: Nested Loops

Code Structure

for (let i = 0; i < n; i++) {        // Outer loop: Θ(n)
    for (let j = 1; j < n; j *= 2) { // Inner loop: Θ(log n)
        // Constant work
    }
}

Analysis

  • Outer loop: Runs n times
  • Inner loop: Runs log n times for each iteration of outer loop
  • Constant work: Executed n × log n times

Overall Time Complexity

When loops are nested, we multiply their complexities:

Total = Θ(n) × Θ(log n) = Θ(n log n)
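
The multiplication rule can be seen directly by counting how many times the innermost statement runs (countNested is an illustrative helper):

// Count executions of the constant-work statement in the nested loops.
function countNested(n) {
    let operations = 0;
    for (let i = 0; i < n; i++) {
        for (let j = 1; j < n; j *= 2) {
            operations++;
        }
    }
    return operations;
}

console.log(countNested(8));  // 24  (8 outer iterations × 3 inner iterations)
console.log(countNested(16)); // 64  (16 × 4)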

Example 3: Mixed Loops (Sequential + Nested)

Code Structure

// First part: Nested loops
for (let i = 0; i < n; i++) {        // Θ(n)
    for (let j = 1; j < n; j *= 2) { // Θ(log n)
        // Constant work
    }
}
 
// Second part: Nested loops
for (let i = 0; i < n; i++) {        // Θ(n)
    for (let j = 0; j < n; j++) {    // Θ(n)
        // Constant work
    }
}

Analysis

  • First part: Θ(n log n) - from nested analysis above
  • Second part: Θ(n²) - n × n iterations

Overall Time Complexity

Since these are sequential sections, we add them:

Total = Θ(n log n) + Θ(n²)

Since n² grows faster than n log n, we keep the higher-order term:

Total = Θ(n²)

Example 4: Multiple Input Variables

Code Structure

// First part: depends on n
for (let i = 0; i < n; i++) {        // Θ(n)
    for (let j = 1; j < n; j *= 2) { // Θ(log n)
        // Constant work
    }
}
 
// Second part: depends on m
for (let i = 0; i < m; i++) {        // Θ(m)
    for (let j = 0; j < m; j++) {    // Θ(m)
        // Constant work
    }
}

Analysis

  • First part: Θ(n log n)
  • Second part: Θ(m²)

Overall Time Complexity

Since n and m are different input variables, we cannot ignore either term:

Total = Θ(n log n + m²)

Real-World Examples

  1. Graph Algorithms: Often expressed in terms of vertices (V) and edges (E)
    • Example: O(V + E) or O(V²)
  2. Merging Sorted Arrays: Merging an array of size n with an array of size m
    • Time complexity: O(n + m)
  3. Matrix Operations: Operating on matrices of different dimensions
    • Example: Multiplying an n×m matrix by an m×p matrix: O(nmp)
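
As a concrete illustration of the second example, here is a minimal sketch of merging two sorted arrays: every element of either array is pushed exactly once, so the total work is Θ(n + m).

// Merge two already-sorted arrays into one sorted array.
// Each element of a (length n) and b (length m) is visited once: Θ(n + m).
function mergeSorted(a, b) {
    const result = [];
    let i = 0, j = 0;
    while (i < a.length && j < b.length) {
        if (a[i] <= b[j]) {
            result.push(a[i++]);
        } else {
            result.push(b[j++]);
        }
    }
    // At most one of the arrays still has elements left to copy.
    while (i < a.length) result.push(a[i++]);
    while (j < b.length) result.push(b[j++]);
    return result;
}

console.log(mergeSorted([1, 3, 5], [2, 4, 6, 8])); // [1, 2, 3, 4, 5, 6, 8]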

Summary Table

Loop Pattern                  Code Pattern   Time Complexity   Growth Rate
Linear Increment              i += c         Θ(n)              Linear
Linear Decrement              i -= c         Θ(n)              Linear
Exponential Increment         i *= c         Θ(log n)          Logarithmic
Division                      i /= c         Θ(log n)          Logarithmic
Exponential Exponentiation    i = i^c        Θ(log log n)      Double Logarithmic

Multiple Loop Rules

Loop Arrangement    How to Combine                          Example
Sequential          Add complexities                        Θ(n) + Θ(log n) = Θ(n)
Nested              Multiply complexities                   Θ(n) × Θ(log n) = Θ(n log n)
Mixed               Apply rules separately, then combine    Θ(n log n) + Θ(n²) = Θ(n²)

Important Notes

Base Independence in Logarithms

When expressing asymptotic complexity, the base of logarithms doesn't matter because:

  • Different bases differ only by a constant factor
  • Constants are ignored in asymptotic analysis
  • Therefore, we can write log n instead of log_c(n)
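
This follows from the change-of-base formula, log_c(n) = log₂(n) / log₂(c): switching bases only multiplies the result by the fixed constant 1 / log₂(c). A quick check:

// log base c of n, computed via the change-of-base formula.
function logBase(c, n) {
    return Math.log2(n) / Math.log2(c);
}

console.log(Math.log2(1024));  // 10
console.log(logBase(4, 1024)); // 5 (exactly half, because log2(4) = 2 is a constant factor)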

Growth Rate Comparison

From slowest-growing to fastest-growing:

  1. Θ(log log n) - Extremely slow growth
  2. Θ(log n) - Logarithmic growth
  3. Θ(n) - Linear growth
  4. Θ(n log n) - Linearithmic growth
  5. Θ(n²) - Quadratic growth

Practical Implications

  • Log log n: Almost constant for practical input sizes
  • Log n: Very efficient, suitable for large datasets
  • Linear: Acceptable for most applications, scales directly with input
  • n log n: Common in efficient algorithms (merge sort, heap sort)
  • n²: Becomes impractical for large inputs

Key Takeaways

  1. Pattern Recognition: Learn to identify these common loop patterns quickly
  2. Mathematical Foundation: Understanding the mathematical derivation helps in analyzing complex nested loops
  3. Asymptotic Behavior: Focus on the dominant term and ignore constants
  4. Practical Impact: Even small differences in complexity classes can have huge performance implications for large inputs
  5. Multiple Loops:
    • Sequential loops: Add complexities
    • Nested loops: Multiply complexities
    • Always keep the highest-order term in the final result
  6. Multiple Inputs: When dealing with different input variables, express complexity in terms of all relevant variables

Understanding these fundamental loop patterns and combination rules forms the foundation for analyzing more complex algorithms and nested loop structures.