Recurrence relation

A recurrence relation is used to analyze the time complexity of recursive algorithms in terms of input size.

  • A recurrence relation is a mathematical equation in which any term is defined in terms of its previous terms.

Example — Reverse an array (Decrease by a constant)

  • Reversing an array of size n reduces to the subproblem of reversing an (n-2) size array, after swapping the two end elements.
public void reverse(int[] a, int l, int r) {
    if (l >= r)
        return;

    int tmp = a[l]; a[l] = a[r]; a[r] = tmp; // swap the two end elements
    reverse(a, l + 1, r - 1); // input size decreased by 2
}
  • Time complexity => Time complexity of solving an (n-2) size problem + Time complexity of swapping operation
  • recurrence relation => T(n) = T(n-2) + c, where T(0) = T(1) = c. Equivalently, T(n) = T(n-2) + O(1).
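Unrolling the recurrence, T(n) = T(n-2) + c = T(n-4) + 2c = ... = T(0 or 1) + (n/2)c, which is O(n). This can be checked empirically by counting calls; a minimal sketch (the `calls` counter is added here for illustration and is not part of the original code):

```java
public class ReverseCost {
    static int calls = 0;

    static void reverse(int[] a, int l, int r) {
        calls++;                                 // each call does O(1) extra work
        if (l >= r) return;                      // base case: 0 or 1 element left
        int tmp = a[l]; a[l] = a[r]; a[r] = tmp; // swap the two end elements
        reverse(a, l + 1, r - 1);                // input size decreased by 2
    }

    public static void main(String[] args) {
        int n = 10;
        int[] a = new int[n];
        reverse(a, 0, n - 1);
        // T(n) = T(n-2) + c gives roughly n/2 + 1 calls => linear, O(n)
        System.out.println(calls); // prints 6 for n = 10
    }
}
```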

Methods to solve recurrence relations

  1. Recursion Tree Method
  2. Master Theorem

Recursion Tree Method

Time complexity of a recursive function depends on 2 factors.
1. Total number of recursive calls
2. Time complexity of additional operations for each recursive call.

Merge sort

  • recurrence relation: T(n) = 2T(n/2) + cn, T(1) = c
  • cn = extra cost of merging the solution of two smaller sub-problems of size n/2
Source — https://www.enjoyalgorithms.com/blog/time-complexity-analysis-of-recursion-in-programming
  • 1st level = cn (root)
  • 2nd level = cn/2 + cn/2 => cn
  • 3rd level = cn/4 + cn/4 + cn/4 + cn/4 => cn
  • at bottom => n nodes, each contribute a cost of c. so total cost is cn
  • level i has 2^i nodes, each contributing a cost of c(n/2^i)
  • total cost of any ith level = 2^i * (c(n/2^i)) = cn
  • Here, the recursion tree is a full binary tree structure where each node has two children, and each level is completely full.
  • At each level, the input size decreases by a factor of 2, and the recursion tree will terminate at an input size of 1. So the height of the recursion tree is logn.
  • The total number of levels in a binary tree = height of the binary tree + 1 = logn + 1.
  • To compute the total cost represented by the recurrence, we add up the costs of all the levels. The recursion tree has logn + 1 levels, each costing cn. So the total cost = cn * (logn + 1) = cnlogn + cn.
  • After ignoring the low-order term and the constant c in (cnlogn + cn), the time complexity of merge sort = O(nlogn).
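The recurrence T(n) = 2T(n/2) + cn analyzed above corresponds to the standard top-down merge sort; a sketch (assuming the usual array-copy merge):

```java
import java.util.Arrays;

public class MergeSort {
    static void mergeSort(int[] a, int l, int r) {
        if (l >= r) return;          // size 1 (or 0): T(1) = c
        int m = l + (r - l) / 2;
        mergeSort(a, l, m);          // T(n/2)
        mergeSort(a, m + 1, r);      // T(n/2)
        merge(a, l, m, r);           // cn: linear-time merge of the two halves
    }

    static void merge(int[] a, int l, int m, int r) {
        int[] left = Arrays.copyOfRange(a, l, m + 1);
        int[] right = Arrays.copyOfRange(a, m + 1, r + 1);
        int i = 0, j = 0, k = l;
        while (i < left.length && j < right.length)
            a[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
        while (i < left.length) a[k++] = left[i++];
        while (j < right.length) a[k++] = right[j++];
    }

    public static void main(String[] args) {
        int[] a = {5, 2, 9, 1, 7};
        mergeSort(a, 0, a.length - 1);
        System.out.println(Arrays.toString(a)); // prints [1, 2, 5, 7, 9]
    }
}
```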

TL;DR

  • The time complexity of recursion depends on how many recursive calls the function makes and how quickly the input size shrinks.
  • If a function calls itself twice and the input only decreases by a constant, its time complexity is O(2^n).
  • If it calls itself three times under the same condition, its time complexity is O(3^n), and so on.
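Naive Fibonacci is the classic example of a function that calls itself twice while the input decreases by a constant. A sketch with a call counter (the counter is added here for illustration):

```java
public class FibCalls {
    static long calls = 0;

    static long fib(int n) {
        calls++;
        if (n <= 1) return n;           // base cases
        return fib(n - 1) + fib(n - 2); // two recursive calls per node
    }

    public static void main(String[] args) {
        System.out.println(fib(10));    // prints 55
        // T(n) = T(n-1) + T(n-2) + c grows as O(2^n)
        System.out.println(calls);      // 177 calls for n = 10
    }
}
```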

Master Theorem

  • The master theorem is used to determine the time complexity of divide and conquer algorithms, which divide an input into smaller subproblems of equal size.
  • It is basically a straightforward method for obtaining the solution for recurrences of the type:

T(n) = aT(n/b) + O(n^k), where a ≥ 1 and b > 1.

  • The master theorem recurrence describes the time complexity of an algorithm that divides a problem of size n into a subproblems, each of size n/b, where a and b are positive constants.
  • Here, a subproblems are solved recursively, each in time T(n/b), and O(n^k) is the cost of dividing the problem and combining the solutions of the subproblems.
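The theorem compares log_b(a) against k to decide which part of the tree dominates. A minimal sketch of that case analysis (the helper name and string output format are illustrative, not from the original post):

```java
public class MasterTheorem {
    // Solve T(n) = a*T(n/b) + O(n^k) via the three master theorem cases.
    static String solve(int a, int b, int k) {
        double critical = Math.log(a) / Math.log(b); // log_b(a)
        double eps = 1e-9;                           // tolerance for float comparison
        if (critical > k + eps)
            return "O(n^log_" + b + "(" + a + "))";  // leaves dominate
        if (Math.abs(critical - k) <= eps)
            return "O(n^" + k + " log n)";           // cost balanced across levels
        return "O(n^" + k + ")";                     // root dominates
    }

    public static void main(String[] args) {
        System.out.println(solve(2, 2, 1)); // merge sort: O(n^1 log n)
        System.out.println(solve(4, 2, 1)); // leaves dominate: O(n^log_2(4)) = O(n^2)
        System.out.println(solve(2, 2, 2)); // root dominates: O(n^2)
    }
}
```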

Time complexity

  1. Tree where each node makes two recursive calls

number of nodes = 2^n
TC = number of nodes * work done = 2^n * work done

  2. Tree where each node makes one fewer recursive call than its parent

number of nodes = n!
TC = number of nodes * work done = n! * work done
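The second shape arises in permutation generation: the root has n choices, the next level n-1, and so on, giving n! leaves. A sketch that counts the leaves of that tree (the `leaves` counter and helper are illustrative):

```java
public class PermCount {
    static int leaves = 0;

    // Build permutations by picking each unused element in turn;
    // the branching factor shrinks by 1 at each level of the tree.
    static void permute(boolean[] used, int remaining) {
        if (remaining == 0) { leaves++; return; } // one full permutation built
        for (int i = 0; i < used.length; i++) {
            if (used[i]) continue;
            used[i] = true;
            permute(used, remaining - 1);
            used[i] = false;
        }
    }

    public static void main(String[] args) {
        int n = 4;
        permute(new boolean[n], n);
        System.out.println(leaves); // prints 24, i.e. 4! leaves
    }
}
```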
