Master Theorem Calculator
Enter a, b, and d for T(n) = aT(n/b) + O(n^d) to determine Big O complexity.
Shows which case applies, the critical exponent, and algorithm efficiency class.
The Master Theorem
The Master Theorem gives the time complexity of divide-and-conquer recurrences of the form:
T(n) = a × T(n/b) + O(n^d)
| Variable | Meaning | Constraint |
|---|---|---|
| a | Number of subproblems created at each step | a >= 1 |
| b | Factor by which input size shrinks per step | b > 1 |
| d | Exponent of work done outside the recursive calls | d >= 0 |
The three cases are determined by comparing d with the critical exponent log_b(a):
| Case | Condition | Result | Who dominates |
|---|---|---|---|
| Case 1 | d < log_b(a) | T(n) = O(n^log_b(a)) | Recursive calls dominate |
| Case 2 | d = log_b(a) | T(n) = O(n^d * log n) | Work balanced at every level |
| Case 3 | d > log_b(a) | T(n) = O(n^d) | Top-level work dominates |
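The case analysis above maps directly to code. Below is a minimal sketch of such a calculator in Python; the function and helper names (`master_theorem`, `fmt_exp`, `poly`) are illustrative, not part of any particular library.

```python
import math

def fmt_exp(e: float) -> str:
    """Render an exponent, collapsing near-integers (log_2(8) = 2.999... -> 3)."""
    r = round(e)
    return str(r) if math.isclose(e, r) else f"{e:.3f}"

def poly(e: float) -> str:
    """Render n^e, writing plain 'n' when the exponent is 1."""
    s = fmt_exp(e)
    return "n" if s == "1" else f"n^{s}"

def master_theorem(a: float, b: float, d: float) -> str:
    """Classify T(n) = a*T(n/b) + O(n^d) into a Big O bound."""
    if a < 1 or b <= 1 or d < 0:
        raise ValueError("Master Theorem needs a >= 1, b > 1, d >= 0")
    crit = math.log(a, b)          # critical exponent log_b(a)
    if math.isclose(d, crit):      # Case 2: work balanced at every level
        return "O(log n)" if d == 0 else f"O({poly(d)} log n)"
    if d < crit:                   # Case 1: recursive calls dominate
        return f"O({poly(crit)})"
    return f"O({poly(d)})"         # Case 3: top-level work dominates
```

For example, `master_theorem(2, 2, 1)` returns `"O(n log n)"` (merge sort) and `master_theorem(7, 2, 2)` returns `"O(n^2.807)"` (Strassen). The `math.isclose` comparison matters because `log_b(a)` is computed in floating point, so an exact `==` against d would misclassify borderline inputs like a=8, b=2.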
Classic algorithm examples:
| Algorithm | Recurrence | a | b | d | log_b(a) | Case | Result |
|---|---|---|---|---|---|---|---|
| Binary search | T(n/2) + O(1) | 1 | 2 | 0 | 0 | 2 | O(log n) |
| Merge sort | 2T(n/2) + O(n) | 2 | 2 | 1 | 1 | 2 | O(n log n) |
| Strassen matrix mult | 7T(n/2) + O(n^2) | 7 | 2 | 2 | 2.807 | 1 | O(n^2.807) |
| Karatsuba multiply | 3T(n/2) + O(n) | 3 | 2 | 1 | 1.585 | 1 | O(n^1.585) |
| Naive matrix mult | 8T(n/2) + O(n^2) | 8 | 2 | 2 | 3 | 1 | O(n^3) |
| Linear search | 1T(n-1) + O(1) | — | — | — | — | N/A | Not applicable |
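The critical-exponent column of the table can be reproduced with a few lines of Python (the `examples` dict here is just an illustration of the table rows, not an API):

```python
import math

examples = {                # algorithm: (a, b)
    "binary search": (1, 2),
    "merge sort":    (2, 2),
    "strassen":      (7, 2),
    "karatsuba":     (3, 2),
    "naive matmul":  (8, 2),
}
for name, (a, b) in examples.items():
    # log_b(a) = critical exponent from the table's log_b(a) column
    print(f"{name}: log_{b}({a}) = {math.log(a, b):.3f}")
```

Note that linear search is deliberately absent: its recurrence shrinks n by subtraction, not division, so there is no b to plug in.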
When does the theorem NOT apply?
- When the problem size reduction is not by a constant factor (e.g., T(n) = T(n-1) + …)
- When a < 1 or b <= 1
- When the non-recursive work is not a plain polynomial n^d (e.g., n * log n); handling those requires the general statement of the theorem, written with f(n) in place of O(n^d)
- Case 3 requires an additional “regularity condition”: a * f(n/b) <= c * f(n) for some constant c < 1, where f(n) is the non-recursive work (automatically satisfied when f(n) = n^d)
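For polynomial work the regularity condition reduces to a simple ratio test: a * (n/b)^d = (a / b^d) * n^d, so the condition holds exactly when a / b^d < 1, which is equivalent to d > log_b(a). A small sketch (the function name is my own):

```python
def regularity_holds(a: float, b: float, d: float) -> bool:
    """For polynomial work f(n) = n^d, a*f(n/b) = (a / b**d) * f(n),
    so the regularity condition (c < 1) holds iff a / b**d < 1."""
    return a / b**d < 1

# T(n) = 2T(n/2) + O(n^2): d = 2 > log_2(2) = 1, Case 3, ratio 2/4 = 0.5
assert regularity_holds(2, 2, 2)
# T(n) = 2T(n/2) + O(n): Case 2, ratio is exactly 1, so the condition fails
assert not regularity_holds(2, 2, 1)
```

This is why the condition is rarely mentioned when the recurrence is given in the a/b/d form: whenever Case 3 applies to n^d work, regularity comes for free.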
How to identify a, b, d from pseudocode:
- Count how many recursive calls are made at each level → that is a
- Look at the input size passed to each recursive call → if n/2 then b = 2, if n/3 then b = 3
- Measure the work done outside the recursive calls → if O(1) then d = 0, if O(n) then d = 1, if O(n^2) then d = 2
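As a worked example of reading a, b, d off real code, here is merge sort (one of the classic examples from the table above) annotated with those three counts:

```python
def merge_sort(xs: list) -> list:
    """Sort a list; annotated to show a = 2, b = 2, d = 1."""
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])    # recursive call 1 \
    right = merge_sort(xs[mid:])   # recursive call 2 / two calls -> a = 2,
                                   # each on half the input -> b = 2
    # Merge step: one linear pass over both halves -> O(n) work -> d = 1
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```

With a = 2, b = 2, d = 1 we get log_2(2) = 1 = d, so Case 2 applies and the result is O(n log n), matching the table.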
The critical exponent log_b(a) explained: the recursion tree has n^log_b(a) leaves, so log_b(a) is the exponent of the work the algorithm would do if it only divided and recombined with no extra cost. If d matches it exactly, the work is uniform across all recursion levels (Case 2, which adds a log factor). If d is lower, the deeper levels dominate (Case 1). If d is higher, the top level dominates (Case 3).
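This level-by-level picture is easy to make concrete: at depth k there are a^k subproblems of size n/b^k, so the non-recursive work at that level is a^k * (n/b^k)^d. A small sketch (assuming n is a power of b; the function name is my own):

```python
import math

def work_per_level(a: float, b: float, d: float, n: int) -> list:
    """Non-recursive work a^k * (n/b^k)^d at each recursion depth k."""
    depth = round(math.log(n, b))  # assumes n is a power of b
    return [a**k * (n / b**k) ** d for k in range(depth + 1)]

print(work_per_level(7, 2, 2, 64))  # Case 1: grows with depth, leaves dominate
print(work_per_level(2, 2, 1, 64))  # Case 2: flat, 64.0 at every level
print(work_per_level(1, 2, 2, 64))  # Case 3: shrinks, top level dominates
```

The three printed lists show the geometric series behind each case: increasing (bottom-heavy), constant (the log-factor case), and decreasing (top-heavy).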