Analysis of Recursion
Recursive algorithms are functions that call themselves to solve smaller instances of the same problem. Analyzing their time complexity involves understanding recurrence relations, which describe how the problem size reduces with each recursive call. Let’s break it down with examples.
What is a Recursive Function?
A recursive function is a function that calls itself during its execution. For example:
```javascript
function fun(n) {
    if (n <= 0) return; // Base case
    console.log("GFG");
    fun(Math.floor(n / 2)); // Recursive call on half the input
    fun(Math.floor(n / 2)); // Recursive call on half the input
}
```

Recurrence Relation
To analyze recursive algorithms, we use recurrence relations, which express the time complexity in terms of smaller inputs.
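As a sketch of how such a relation is solved, repeated substitution (unrolling) reduces the recurrence step by step until the base case is reached. For example, with a constant $c$ of work per call:

$$
T(n) = T(n-1) + c = T(n-2) + 2c = \cdots = T(0) + c \cdot n = \Theta(n)
$$

The same idea, applied via a recursion tree, handles recurrences with multiple recursive calls, as in the examples below.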
Example 1: Simple Recursion
```javascript
function fun(n) {
    if (n <= 0) return; // Base case: T(0) = Θ(1)
    console.log("GFG"); // Θ(1) work per call
    fun(Math.floor(n / 2)); // T(n/2)
    fun(Math.floor(n / 2)); // T(n/2)
}
```

- Recurrence Relation: $T(n) = 2T(n/2) + \Theta(1)$
- $2T(n/2)$: Two recursive calls with half the input size.
- $\Theta(1)$: Constant work done in each call.
- Base Case: $T(0) = \Theta(1)$.
- Time Complexity: $\Theta(n)$. The recursion tree doubles in width at each of its $\log_2 n$ levels, giving $\Theta(n)$ total calls, each doing constant work.
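To sanity-check this, an instrumented variant (a hypothetical helper, not part of the original code) can count the total number of calls the pattern makes:

```javascript
// Counts the calls made by Example 1's pattern: two recursive
// calls on floor(n / 2), constant work per call.
function countCalls(n) {
    if (n <= 0) return 1; // the base-case call itself
    return 1 + countCalls(Math.floor(n / 2)) + countCalls(Math.floor(n / 2));
}

console.log(countCalls(8));  // → 31
console.log(countCalls(16)); // → 63
```

Doubling n roughly doubles the call count (31 → 63), which is exactly linear growth, matching $\Theta(n)$.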
Example 2: Recursion with a Loop
```javascript
function fun(n) {
    if (n <= 0) return; // Base case (guard n <= 0, not just n == 0)
    for (let i = 0; i < n; i++) { // Θ(n) work per call
        console.log("GFG");
    }
    fun(Math.floor(n / 2)); // T(n/2)
    fun(Math.floor(n / 3)); // T(n/3)
}
```

- Recurrence Relation: $T(n) = T(n/2) + T(n/3) + \Theta(n)$
- Base Case: $T(0) = \Theta(1)$.
- Time Complexity: $\Theta(n)$. Since $1/2 + 1/3 = 5/6 < 1$, the work per level of the recursion tree shrinks geometrically, so the total is dominated by the top-level $\Theta(n)$ term (bounded by $n / (1 - 5/6) = 6n$).
Example 3: Linear Recursion
```javascript
function fun(n) {
    if (n <= 1) return; // Base case: T(1) = Θ(1)
    console.log("GFG"); // Θ(1) work per call
    fun(n - 1); // T(n-1)
}
```

- Recurrence Relation: $T(n) = T(n-1) + \Theta(1)$
- Base Case: $T(1) = \Theta(1)$.
- Time Complexity: $\Theta(n)$.
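Linear recursion is the easiest case to verify: the pattern makes exactly one call per value of n down to the base case, so fun(n) triggers n calls in total for n ≥ 1. A small counter (hypothetical helper, not from the original) confirms this:

```javascript
// Example 3 makes one call per value of n down to the base case.
function countLinearCalls(n) {
    if (n <= 1) return 1; // the base-case call itself
    return 1 + countLinearCalls(n - 1);
}

console.log(countLinearCalls(10)); // → 10
```

Note that the call depth here is also n, so for very large inputs an iterative loop avoids the risk of a stack overflow that deep linear recursion carries.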
Conclusion
In this article, we studied how to analyze recursive algorithms using recurrence relations. We explored examples like simple recursion, recursion with loops, and linear recursion, and learned how to express their time complexity. By understanding recurrence relations and base cases, you can determine the efficiency of recursive functions and improve their performance.