Master's Algorithm

Abstract

Master's Theorem is a quick way to find an algorithm's time complexity directly from its recurrence relation. It can be applied to both dividing and decreasing recurrences, each of which we'll look into in detail ahead.

Scope

  • This article starts with the need for Master's Theorem and its definition.
  • Master's Theorem can be applied to 'dividing' and 'decreasing' recurrence relations, both of which are discussed in detail.
  • Further, it covers the cases that make up the computation involved in Master's Theorem.
  • To trust any theorem, we need to understand how its result is reached. Hence, the idea behind Master's Theorem for dividing functions is discussed in detail, with each term in the recurrence form and each comparison explained.
  • Master's Theorem is made easy for the reader by walking through the proof and solving examples for both dividing and decreasing functions.
  • Every theorem or method comes with underlying limitations, which are important to understand so that they can be avoided.

Introduction

The most widely used way to analyze or compare algorithms is to compute their time complexities. As an algorithm gets more complex, calculating its time complexity function gets harder too.

Recursive functions call themselves in their body, and calculating their time complexity by the other commonly used methods can get messy. Master's method is the most convenient way to compute the time complexity function of such recurrence relations.

We can apply Master's Theorem for:

  1. Dividing functions
  2. Decreasing Functions

Master's Method for Dividing Functions

Highlights:

  • Used to directly calculate the time complexity function of 'dividing' recurrence relations of the form:
    • T(n) = aT(n/b) + f(n)
  • where f(n) = θ(n^k log^p n)
  • Compare log_b a and k to decide the final time complexity function.

Master's Theorem is the most convenient method to compute the time complexity function of recurrence relations. Master's Theorem for dividing functions can only be applied to recurrence relations of the form T(n) = aT(n/b) + f(n), where f(n) = θ(n^k log^p n),
for example:

  • T(n) = 2T(n/2) + n^2
  • T(n) = T(n/4) + n log n

where,

n = input size (the size of the problem)
a = count of subproblems in the dividing recursive function
n/b = size of each subproblem (assuming all subproblems have the same size)

where a, b, and k are constants that satisfy the following conditions:

  1. a>=1
  2. b>1
  3. k>=0

T(n) = aT(n/b) + θ(n^k log^p n)

Master's Theorem states that:

  • Case 1) If log_b a > k, then:
    • T(n) = θ(n^{log_b a})
  • Case 2) If log_b a = k, then:
    • a) If p > -1, then T(n) = θ(n^k log^{p+1} n)
    • b) If p = -1, then T(n) = θ(n^k loglog n)
    • c) If p < -1, then T(n) = θ(n^k)
  • Case 3) If log_b a < k, then:
    • a) If p >= 0, then T(n) = θ(n^k log^p n)
    • b) If p < 0, then T(n) = θ(n^k)

The same can also be written as:

  • Case 1- If a > b^k, then:
    • T(n) = θ(n^{log_b a})
  • Case 2- If a = b^k, then:
    • a) If p > -1, then T(n) = θ(n^{log_b a} log^{p+1} n)
    • b) If p = -1, then T(n) = θ(n^{log_b a} loglog n)
    • c) If p < -1, then T(n) = θ(n^{log_b a})
  • Case 3- If a < b^k, then:
    • a) If p >= 0, then T(n) = θ(n^k log^p n)
    • b) If p < 0, then T(n) = θ(n^k)

Hence, using the values of a, b, k and p, we can easily find the time complexity function of any dividing recurrence relation of the above-stated form. We'll solve some examples in the upcoming section.
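The case analysis above can be sketched as a small helper. The Python function below (`master_dividing` is our own illustrative name, not a standard routine) takes a, b, k, and p and returns the predicted bound as a string:

```python
import math

def master_dividing(a, b, k, p):
    """Illustrative sketch of Master's Theorem for dividing functions:
    T(n) = a*T(n/b) + theta(n^k * log^p n), with a >= 1, b > 1, k >= 0."""
    assert a >= 1 and b > 1 and k >= 0
    log_b_a = math.log(a, b)
    if math.isclose(log_b_a, k):        # Case 2: log_b(a) = k
        if p > -1:
            return f"theta(n^{k:g} * log^{p + 1:g}(n))"
        if p == -1:
            return f"theta(n^{k:g} * loglog(n))"
        return f"theta(n^{k:g})"
    if log_b_a > k:                     # Case 1: recursion dominates
        return f"theta(n^{log_b_a:g})"
    if p >= 0:                          # Case 3: f(n) dominates
        return f"theta(n^{k:g} * log^{p:g}(n))"
    return f"theta(n^{k:g})"

# Merge sort: T(n) = 2T(n/2) + n -> a=2, b=2, k=1, p=0
print(master_dividing(2, 2, 1, 0))  # theta(n^1 * log^1(n)), i.e. theta(n log n)
```

The float comparison uses `math.isclose` because `math.log(a, b)` can carry rounding noise even when log_b a is exactly an integer.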

Master's Theorem for Decreasing Functions

Highlights:

  • Used to directly calculate the time complexity function of 'decreasing' recurrence relations of the form:
    • T(n) = aT(n-b) + f(n)
  • where f(n) = θ(n^k)
  • The value of 'a' decides the time complexity function for the 'decreasing' recurrence relation.

For decreasing functions of the form T(n) = aT(n-b) + f(n),
where f(n) = θ(n^k),
for example:

  • T(n) = T(n-2) + 1
  • T(n) = 2T(n-1) + n^2

where:
n = input size (or the size of the problem)
a = count of subproblems in the decreasing recursive function
n-b = size of each subproblem (assuming all subproblems have the same size)

Here a, b, and k are constants that satisfy the following conditions:

  • a>0, b>0
  • k>=0

Case 1) If a < 1, then T(n) = θ(n^k)

Case 2) If a = 1, then T(n) = θ(n^{k+1})

Case 3) If a > 1, then T(n) = θ(a^{n/b} * f(n))

Hence, given a decreasing function of the above-mentioned form, we can compare it against the general form to get the values of a, b, and k. With those, we can find the time complexity function by looking at the three cases above. We'll solve some examples in an upcoming section to understand this better.
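These three cases can be wrapped up in a small helper as well. The sketch below (with `master_decreasing` as our own illustrative name) assumes the recurrence has already been matched against the general form to extract a, b, and k:

```python
def master_decreasing(a, b, k):
    """Illustrative sketch of Master's Theorem for decreasing functions:
    T(n) = a*T(n - b) + theta(n^k), with a > 0, b > 0, k >= 0."""
    assert a > 0 and b > 0 and k >= 0
    if a < 1:        # Case 1: contributions shrink, f(n) dominates
        return f"theta(n^{k:g})"
    if a == 1:       # Case 2: about n/b levels, each doing theta(n^k) work
        return f"theta(n^{k + 1:g})"
    # Case 3: the number of subproblems grows exponentially
    return f"theta({a:g}^(n/{b:g}) * n^{k:g})"

print(master_decreasing(1, 2, 0))   # T(n) = T(n-2) + 1    -> theta(n^1)
print(master_decreasing(2, 1, 2))   # T(n) = 2T(n-1) + n^2 -> theta(2^(n/1) * n^2)
```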

Idea Behind Master's Theorem for Dividing Functions

Highlights:

  • Calculate n^{log_b a} and compare it with f(n) to decide the final time complexity of the recurrence relation T(n)

The idea behind Master's Theorem is to compute n^{log_b a} and compare it with f(n). The time complexity function comes out to be whichever one dominates the other.

For example:

  • In Case 1, n^{log_b a} dominates f(n), so the time complexity function T(n) comes out to be θ(n^{log_b a}).
  • In Case 2, when log_b a = k, neither term dominates, and the time complexity function picks up an extra log factor, e.g. θ(n^{log_b a} log n) when p = 0.
  • In Case 3, n^{log_b a} is dominated by f(n). Hence, the f(n) term takes more time, and the time complexity function is θ(f(n)), i.e., θ(n^k log^p n).

Now, Why do We Compare n^{log_b a} with f(n) and not Some Other Computation?

Given T(n) = aT(n/b) + f(n), we need to calculate the time complexity of T(n). If we calculate the time taken by each of the two terms individually and see which one dominates, we can define the time complexity function of T(n) to be equal to the dominating one.

To do so, let's first understand the first term, i.e., aT(n/b), and then calculate the time taken by it.

Understanding the term aT(n/b): for our problem of size n, we divide the whole problem into 'a' subproblems of size n/b each. Suppose T'(n) is the time taken by the first term.

Hence,
=> T'(n) = aT(n/b)
T(n/b) can again be divided into 'a' subproblems of size n/b^2 each.

Hence,
=> T'(n) = aT(n/b) = a^2 T(n/b^2)
Similarly,
=> T'(n) = a^3 T(n/b^3), and so on, up to
=> T'(n) = a^i T(n/b^i)


Let's assume n = b^m (when we halve a problem each time, we can say that n = 2^m; here we divide the problem by b each time, hence n = b^m):
=> n = b^m
=> log_b n = m

At the i-th level, the size of the sub-problem reduces to 1, i.e., at the i-th level:

=> n/b^i = 1
=> n = b^i
=> log_b n = i = m
Therefore, i = m at the last level, where the size of the sub-problem reduces to 1.

Using this deduction, we can re-write T'(n) as:

=> T'(n) = a^i T(n/b^i)
=> T'(n) = a^{log_b n} T(b^i/b^i)
=> T'(n) = a^{log_b n} T(1)
Hence, T'(n) = θ(a^{log_b n}) = θ(n^{log_b a})

T'(n) was assumed to be the time complexity function of the first term. Hence, we have proved that the first term takes θ(n^{log_b a}) time. That is why we compare n^{log_b a} (i.e., the first term) with f(n) (i.e., the second term): whichever dominates decides the final time complexity function of the recurrence relation.
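The last step relies on the identity a^{log_b n} = n^{log_b a} (take log_b of both sides to see they agree). A quick numeric spot-check with arbitrarily chosen values:

```python
import math

# Spot-check of the identity a^(log_b n) = n^(log_b a) used above.
a, b, n = 3, 2, 1024
lhs = a ** math.log(n, b)   # a^(log_b n)
rhs = n ** math.log(a, b)   # n^(log_b a)
print(lhs, rhs)             # both equal 3^10 = 59049 (up to float error)
assert math.isclose(lhs, rhs)
```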

Proof of Master's Algorithm

T(n) = aT(n/b) + O(n^d)

This form of recurrence relation is in the form of a tree, as shown below, where every iteration divides the problem into 'a' sub-problems of size (n/b).

(Figure: recursion tree in which every level divides each problem into 'a' subproblems of size n/b)

At the last level of the tree, we will be left with problems of size 1. This is the level where we have the shortest possible sub-problems.

Last level = log_b n
Size of sub-problem at the last level = 1

Let us now calculate the work done at each level; that will help us compute the total work done, i.e., the sum of the work done over all log_b n levels.

  • Work done at level 0 => O(n^d)
    (n^d comes from the recurrence relation)
  • Work done at level 1 => a * O((n/b)^d) => (a/b^d) * O(n^d). At this level, the problem is divided into 'a' subproblems of size n/b each.
  • Work done at level k => (a/b^d)^k * O(n^d)
  • Hence, Total Work Done => Σ_{k=0}^{log_b n} (a/b^d)^k * O(n^d)

Note: This Summation forms a Geometric Series.

Now, let us look at every single case and prove it.

Proof of Case I: When d < log_b a

  • In this case, the work done increases as we move down the tree, i.e., towards the lower levels with more sub-problems.
  • The total work done is a Geometric Series with common ratio r = a/b^d > 1. Hence, the last term of the geometric series has the maximum effect.
  • Hence, the total work done is dominated by the last term, since all the other terms of the geometric series can be ignored compared to it.
  • Hence, we can substitute k = log_b n in the total-work expression.

=> O(n^d * (a/b^d)^{log_b n})
=> O(n^d * a^{log_b n} / b^{d log_b n})
=> O(n^d * n^{log_b a} / n^d)
=> O(n^{log_b a})

Proof of Case II: When d = log_b a

  • Here a/b^d = 1, which means the work done by each term in the geometric series is equal.
  • Hence, the total work done is the work done at each level * the number of levels.

=> O(Σ_{i=0}^{log_b n} O(n^d))
=> (1 + log_b n) * O(n^d)
=> O(n^d log n)

Proof of Case III: When d > log_b a

  • This is exactly the opposite of the first case discussed.
  • Here a/b^d < 1, so this is a decreasing Geometric Series. Hence, the first term is the maximum term and the only one that matters.
  • Hence, the total work done = the first term, i.e., O(n^d).

Hence, Proved!
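The three cases can also be sanity-checked numerically. The sketch below (an illustration, not part of the proof) evaluates T(n) = aT(n/b) + n^d exactly for powers of b and divides by the bound each case predicts, so the ratios should settle near a constant:

```python
import math

# Evaluate the recurrence exactly (n restricted to powers of b, T(1) = 1)
# and compare against the predicted bound for each case.
def T(n, a, b, d):
    return 1 if n <= 1 else a * T(n // b, a, b, d) + n ** d

cases = [
    ("Case I   (d < log_b a): T(n)=4T(n/2)+n   vs n^2", 4, 2, 1, lambda n: n ** 2),
    ("Case II  (d = log_b a): T(n)=2T(n/2)+n   vs n log n", 2, 2, 1, lambda n: n * math.log2(n)),
    ("Case III (d > log_b a): T(n)=2T(n/2)+n^2 vs n^2", 2, 2, 2, lambda n: n ** 2),
]
for name, a, b, d, bound in cases:
    ratios = [T(2 ** e, a, b, d) / bound(2 ** e) for e in (10, 14, 18)]
    print(name, [round(r, 3) for r in ratios])
```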

Examples of Master's Theorem for Dividing Function

We'll now solve a few examples to understand Master's Theorem better. Before solving these examples, remember:

Master's Theorem can solve 'dividing' recurrence relations of the form T(n) = aT(n/b) + f(n), where f(n) = θ(n^k log^p n),

where the constants a, b, and k satisfy the following conditions:

  • a>=1
  • b>1
  • k>=0

Example 1: T(n) = T(n/2) + n^2

After comparison: a = 1, b = 2, k = 2, p = 0, f(n) = n^2

  • Step 1: Calculate log_b a and k
    • log_b a = log_2 1 = 0
    • k = 2
  • Step 2: Compare log_b a with k
    • log_b a < k (Case 3)
  • Step 3: Calculate the time complexity
    • If p >= 0, then T(n) = θ(n^k log^p n)
    • T(n) = θ(n^2 log^0 n)
    • T(n) = θ(n^2)

Final Time complexity: θ(n^2)
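As a quick numeric sanity check (illustrative only), we can evaluate the recurrence of Example 1 directly for powers of two; T(n)/n^2 settles towards the constant 4/3 coming from the geometric series 1 + 1/4 + 1/16 + ...:

```python
# Evaluate T(n) = T(n/2) + n^2 exactly for powers of two, with T(1) = 1.
def T(n):
    return 1 if n <= 1 else T(n // 2) + n * n

for n in (2 ** 8, 2 ** 12, 2 ** 16):
    print(n, round(T(n) / n ** 2, 4))  # approaches 4/3 ~ 1.3333
```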

Example 2: T(n) = 2T(√n) + log n

This recurrence relation is not of the form aT(n/b) + f(n), but it can be converted to that form and then solved with Master's Theorem. Let us convert it:

Let log n = m => n = 2^m

On replacing n with 2^m everywhere, we get:

T(2^m) = 2T(2^{m/2}) + m
Suppose T(2^m) = S(m)
=> S(m) = 2S(m/2) + m
Finally, we have an equation of the form S(m) = aS(m/b) + f(m), where a = 2, b = 2, k = 1, p = 0, and f(m) = m

  • Step 1: Calculate log_b a and k
    • log_b a = log_2 2 = 1
    • k = 1
  • Step 2: Compare log_b a with k
    • log_b a = k (Case 2)
  • Step 3: Calculate the time complexity
    • If p > -1, then S(m) = θ(m^k log^{p+1} m)
    • S(m) = θ(m^1 log^1 m)
    • S(m) = θ(m log m)
  • Step 4: Substitute back m = log n

Final Time complexity: θ(log n * loglog n)
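Since Step 3 gives S(m) = θ(m log m) with m = log n, the original recurrence should grow like log n * loglog n. A numeric check (illustrative only, with base case T(n <= 2) = 1 chosen for convenience):

```python
import math

# Evaluate T(n) = 2T(sqrt(n)) + log2(n) numerically and compare it
# against log2(n) * log2(log2(n)); the ratio should level off.
def T(n):
    return 1.0 if n <= 2 else 2 * T(math.sqrt(n)) + math.log2(n)

for n in (2 ** 16, 2 ** 64, 2 ** 256):
    m = math.log2(n)
    print(int(m), round(T(n) / (m * math.log2(m)), 3))
```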

Examples of Master's Theorem for Decreasing Function

We can solve 'decreasing' functions of the following form using Master's Theorem for decreasing functions: T(n) = aT(n-b) + f(n), where f(n) = θ(n^k)

Here a, b, and k are constants that satisfy the following conditions:

  • a>0, b>0
  • k>=0

Example 1:

T(n) = 2T(n-2) + n
After comparison:
a = 2, b = 2, k = 1, f(n) = n

  • Step 1: Compare 'a' with 1
    • Since a > 1,
    • T(n) = θ(a^{n/b} * f(n))
  • Step 2: Calculate the time complexity
    • Putting in the values a = 2, b = 2, f(n) = n:
    • T(n) = θ(2^{n/2} * n)

Final Time complexity: θ(2^{n/2} * n)
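A quick growth-rate check (illustrative only): since the bound carries the factor 2^{n/2}, T should roughly double every time n increases by 2, i.e. T(n+2)/T(n) should approach 2:

```python
# Evaluate T(n) = 2T(n-2) + n exactly (base case T(n <= 0) = 1) and check
# that T(n+2)/T(n) approaches 2, matching the 2^(n/2) factor in the bound.
def T(n):
    return 1 if n <= 0 else 2 * T(n - 2) + n

print([round(T(n + 2) / T(n), 4) for n in (10, 20, 40)])
```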

Example 2:

T(n) = (1/2)T(n-1) + n^2

After comparison: a = 1/2, b = 1, k = 2, f(n) = n^2

  • Step 1: Compare 'a' with 1
    • Since a < 1,
    • T(n) = θ(n^k)
  • Step 2: Calculate the time complexity
    • Putting in the value k = 2:
    • T(n) = θ(n^2)

Final Time complexity: θ(n^2)

Limitations of Master's Method

Highlights:

A recurrence relation cannot be solved using Master's Theorem if:

  • T(n) is not a monotone function
  • a is not a constant
  • f(n) is not a polynomial

Master's Theorem can only be used for recurrence relations of the form:

  • T(n) = aT(n/b) + f(n), where f(n) = θ(n^k log^p n) (Dividing Recurrence Relation), or
  • T(n) = aT(n-b) + f(n), where f(n) = θ(n^k) (Decreasing Recurrence Relation)

This marks the limitations of Master's Theorem, which cannot be used if:

  • T(n) is not a monotone function. For example: T(n) = cos n oscillates instead of growing steadily, so no single asymptotic case applies.
  • a is not a constant. Consider this: T(n) = nT(n/3) + f(n). This relation cannot be solved for its time complexity because a non-constant 'a' changes our comparison function (as discussed above, we compare n^{log_b a} with f(n) to find the dominating term).

If a is not constant, this changes Master's Theorem's conditions altogether.

  • f(n) is not a polynomial (possibly with a polylogarithmic factor). For similar reasons, Master's Theorem can only be applied to the above-stated form under the given constraints; a function like f(n) = 2^n does not fit θ(n^k log^p n).

Conclusion

  • Recurrence relations of the type

    • T(n) = aT(n/b) + f(n), where f(n) = θ(n^k log^p n) (Dividing Recurrence Relation), or
    • T(n) = aT(n-b) + f(n), where f(n) = θ(n^k) (Decreasing Recurrence Relation)

where,
n = input size
a = count of subproblems in the recursive function
n/b or n-b = size of each subproblem (assuming all subproblems have the same size)
can be solved using Master's Theorem, and we can directly calculate the time complexity of such relations.

  • Necessary constraints applied to this form are:

    1. a>=1
    2. b>1
    3. k>=0

    and

    1. a>0,
    2. b>0
    3. k>=0

    respectively.

  • The final time complexity is found by comparing the time complexity of the first term, i.e., aT(n/b) (which works out to n^{log_b a}), with the second term, i.e., f(n), in 'dividing' functions, and by comparing 'a' with 1 in the case of 'decreasing' functions.

  • We cannot apply this Theorem to a recurrence relation where

    1. a is not a constant,
    2. T(n) is not a monotone function, or
    3. f(n) is not a polynomial.