Recurrences are central to analyzing the running time of recursive algorithms: they express the time complexity in terms of smaller input sizes and are typically solved with the substitution method, recursion trees, or the master method. Using merge sort as an example, we formulate its recurrence and solve it by substitution, guessing a solution and proving that the guess holds. We then examine recursion trees and the master method, detailing how each determines time complexity in its applicable cases.
Recurrences (in color) It continues…
Recurrences • When an algorithm calls itself recursively, its running time is described by a recurrence • A recurrence describes a function in terms of its value on smaller inputs • There are three methods for solving them: the substitution method, the recursion-tree method, and the master method
What it looks like • This is the recurrence of MERGE-SORT: T(n) = Θ(1) if n = 1, T(n) = 2T(n/2) + Θ(n) if n > 1 • What this says is that the time is constant when n = 1 • Otherwise, the time is twice the time to sort an array of half the size, plus Θ(n) time to merge the sorted sub-arrays
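As an illustration, here is a minimal top-down merge sort sketch (not taken from the slides): the two recursive calls account for the 2T(n/2) term, and the merge loop accounts for the Θ(n) term.

```python
def merge_sort(a):
    """Top-down merge sort; each call sorts a list of length n."""
    n = len(a)
    if n <= 1:
        return a                      # base case: Theta(1) work
    mid = n // 2
    left = merge_sort(a[:mid])        # first recursive call:  T(n/2)
    right = merge_sort(a[mid:])       # second recursive call: T(n/2)
    # Merge the two sorted halves: Theta(n) work.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 4, 7, 1, 3, 2, 6]))  # [1, 2, 2, 3, 4, 5, 6, 7]
```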
Substitution • Similar to mathematical induction • Guess a solution, then prove the guess holds for the recursive calls • A powerful method, but it can sometimes be difficult to guess the solution!
Substitution • Example: T(n) = 2T(n/2) + n • Guess that T(n) = O(n lg n) • We must prove that T(n) ≤ cn lg n for an appropriate constant c > 0 • Assume the bound holds for n/2 as well: T(n/2) ≤ c(n/2) lg(n/2) • Then T(n) ≤ 2(c(n/2) lg(n/2)) + n = cn lg(n/2) + n = cn(lg n - lg 2) + n = cn lg n - cn + n ≤ cn lg n, for all c ≥ 1
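A quick numeric sanity check of the guess: the sketch below evaluates the recurrence exactly at powers of two and confirms T(n) ≤ c·n·lg n. The base value T(1) = 1 and the constant c = 2 are illustrative assumptions, not part of the slide.

```python
from functools import lru_cache
from math import log2

# Sanity check of the guess T(n) <= c * n * lg(n).
# T(1) = 1 and c = 2 are illustrative choices, not taken from the slide.
@lru_cache(maxsize=None)
def T(n):
    return 1 if n == 1 else 2 * T(n // 2) + n

c = 2
for k in range(1, 11):
    n = 2 ** k
    print(n, T(n), c * n * log2(n), T(n) <= c * n * log2(n))
```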
Subtleties • Let T(n) = T(⌈n/2⌉) + T(⌊n/2⌋) + 1 • Guess that T(n) = O(n), i.e., T(n) ≤ cn • Then T(⌈n/2⌉) + T(⌊n/2⌋) ≤ c⌈n/2⌉ + c⌊n/2⌋, so T(n) ≤ cn + 1 (note the extra "+ 1"!), which does not imply T(n) ≤ cn • Here the guess is correct, but the induction step is off by a constant!
Subtleties • We strengthen our guess to T(n) ≤ cn - b • Then T(n) ≤ (c⌈n/2⌉ - b) + (c⌊n/2⌋ - b) + 1 = cn - 2b + 1 ≤ cn - b, for all b ≥ 1
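The strengthened guess can also be spot-checked numerically. The sketch below assumes an illustrative base case T(1) = 1 and the constants c = 2, b = 1, none of which are specified on the slide.

```python
from functools import lru_cache

# Check the strengthened guess T(n) <= c*n - b for
# T(n) = T(ceil(n/2)) + T(floor(n/2)) + 1.
# T(1) = 1, c = 2, and b = 1 are illustrative choices.
@lru_cache(maxsize=None)
def T(n):
    if n == 1:
        return 1
    return T((n + 1) // 2) + T(n // 2) + 1   # ceil(n/2) and floor(n/2)

c, b = 2, 1
print(all(T(n) <= c * n - b for n in range(1, 2001)))  # expected: True
```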
One Last Example • Original equation: T(n) = 2T(√n) + lg n • Let m = lg n; then T(2^m) = 2T(2^(m/2)) + m • Let S(m) = T(2^m), so S(m) = 2S(m/2) + m • We know S(m) = O(m lg m), so T(n) = T(2^m) = S(m) = O(m lg m) = O(lg n lg lg n)
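As a sanity check on the change of variables, this sketch evaluates T(n) = 2T(√n) + lg n at n = 2^(2^k), where the square roots stay exact, and compares it with lg n · lg lg n. The base case T(2) = 1 is an illustrative assumption.

```python
from math import log2

# Evaluate T(n) = 2*T(sqrt(n)) + lg n at n = 2**(2**k) and compare it
# with lg(n) * lg(lg(n)); the base case T(2) = 1 is assumed.
def T(n):
    if n <= 2:
        return 1
    return 2 * T(round(n ** 0.5)) + log2(n)

for k in range(1, 7):
    n = 2 ** (2 ** k)
    bound = log2(n) * log2(log2(n))
    print(n, T(n), bound, T(n) / bound)   # the ratio settles toward a constant
```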
Recursion Tree Method • A recursion tree is built from the recurrence, with one node per subproblem • We sum the cost within each level, then add up the per-level costs to get the total • Usually used to generate a good guess for the substitution method • Can also serve as a direct proof if the sums are done carefully • Example: T(n) = 3T(n/4) + Θ(n²)
[Recursion tree for T(n) = 3T(n/4) + cn²: the root costs cn², its 3 children cost c(n/4)² each, the 9 nodes at depth 2 cost c(n/16)² each, and so on down to the T(1) leaves; the per-level costs are cn², (3/16)cn², (3/16)²cn², …, and the bottom level holds Θ(n^(log_4 3)) leaves.]
Questions • How many levels does this tree have? • The subproblem size at depth i is n/4^i • The subproblem hits size 1 when n/4^i = 1, i.e., n = 4^i, i.e., i = log_4 n • Therefore the tree has log_4 n + 1 levels (depths 0, 1, 2, …, log_4 n) • There are 3^i nodes at depth i • The cost of level i is 3^i · c(n/4^i)² = (3/16)^i cn² • The last level has 3^(log_4 n) = n^(log_4 3) nodes
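The per-level costs can be added up numerically. The sketch below, with illustrative constants c = 1 and leaf cost T(1) = 1, totals the internal levels (3/16)^i·cn² plus the n^(log_4 3) leaves and compares the result against the geometric-series bound (16/13)cn², supporting the O(n²) guess.

```python
# Total up the recursion-tree levels for T(n) = 3*T(n/4) + c*n^2.
# c = 1 and a leaf cost of 1 are illustrative constants.
def tree_total(n, c=1.0):
    depth, size = 0, n                 # count depths until the subproblem size hits 1
    while size > 1:
        size //= 4
        depth += 1
    internal = sum(c * n * n * (3 / 16) ** i for i in range(depth))
    leaves = 3 ** depth                # 3^(log_4 n) = n^(log_4 3) leaves, each costing 1
    return internal + leaves

for n in (4 ** k for k in range(1, 8)):
    print(n, tree_total(n), (16 / 13) * n * n)   # total stays below (16/13)*c*n^2
```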
The Master Method • Applies when the recurrence has the form T(n) = aT(n/b) + f(n), with constants a ≥ 1 and b > 1 • Case 1: if f(n) = O(n^(log_b a - ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a)) • Case 2: if f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) lg n) • Case 3: if f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and if af(n/b) ≤ cf(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n))
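For recurrences whose driving function is a plain polynomial f(n) = n^d, the three cases reduce to comparing d with the critical exponent log_b a. The helper below, master_case, is a hypothetical sketch under that simplification; it skips the case-3 regularity condition, which holds automatically for polynomial f.

```python
from math import isclose, log

def master_case(a, b, d):
    """Classify T(n) = a*T(n/b) + Theta(n^d) by the master method.

    Simplified sketch: only polynomial driving functions f(n) = n^d are
    handled, and the case-3 regularity check is skipped (it always holds
    for polynomials).
    """
    crit = log(a, b)                       # critical exponent log_b(a)
    if isclose(d, crit):                   # d = log_b(a)            -> case 2
        return f"case 2: Theta(n^{d} lg n)"
    if d < crit:                           # f polynomially smaller  -> case 1
        return f"case 1: Theta(n^{round(crit, 3)})"
    return f"case 3: Theta(n^{d})"         # f polynomially larger   -> case 3
```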
Example • T(n) = 9T(n/3) + n • a = 9, b = 3, f(n) = n, so n^(log_b a) = n^(log_3 9) = n² • f(n) = n = O(n^(log_3 9 - ε)) with ε = 1 • Case 1 applies, so T(n) = Θ(n²) • T(n) = T(2n/3) + 1 • a = 1, b = 3/2, f(n) = 1, so n^(log_b a) = n^(log_(3/2) 1) = n^0 = 1 • Case 2 applies, so T(n) = Θ(lg n)
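Using the hypothetical master_case helper sketched above, both examples classify as expected; the third call is an extra example not taken from the slides.

```python
print(master_case(9, 3, 1))    # T(n) = 9T(n/3) + n          -> case 1, Theta(n^2)
print(master_case(1, 1.5, 0))  # T(n) = T(2n/3) + 1          -> case 2, Theta(lg n)
print(master_case(4, 2, 3))    # extra: T(n) = 4T(n/2) + n^3 -> case 3, Theta(n^3)
```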
Example … • T(n) = 3T(n/4) + n lg n • a = 3, b = 4, f(n) = n lg n • n^(log_b a) = n^(log_4 3) = O(n^0.793) • f(n) = Ω(n^(log_4 3 + ε)) where ε ≈ 0.2 (solve for it) • For large n, af(n/b) = 3(n/4) lg(n/4) ≤ (3/4) n lg n = cf(n) for c = 3/4 • Case 3 applies, so T(n) = Θ(n lg n)
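The case-3 regularity condition can also be spot-checked numerically; the snippet below tests a·f(n/b) ≤ c·f(n) for f(n) = n lg n with a = 3, b = 4, and c = 3/4 over a range of n (the range itself is an arbitrary choice).

```python
from math import log2

# Spot-check the regularity condition a*f(n/b) <= c*f(n)
# for f(n) = n*lg(n), with a = 3, b = 4, c = 3/4.
def f(n):
    return n * log2(n)

a, b, c = 3, 4, 3 / 4
print(all(a * f(n / b) <= c * f(n) for n in range(4, 100_000)))  # expected: True
```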
When it doesn't work… • T(n) = 2T(n/2) + n lg n • a = 2, b = 2, f(n) = n lg n • You would think that case 3 should apply, since f(n) = n lg n grows faster than n^(log_b a) = n • But f(n) is not polynomially larger: f(n)/n^(log_b a) = (n lg n)/n = lg n, which is asymptotically smaller than n^ε for every constant ε > 0
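A small numeric illustration of why the polynomial gap fails: for any fixed ε > 0 (ε = 0.1 below, chosen arbitrarily), the ratio lg n / n^ε tends to 0, so n lg n is not Ω(n^(1+ε)).

```python
# For any fixed eps > 0 the ratio lg(n) / n**eps tends to 0, so
# f(n) = n*lg(n) is not Omega(n^(1+eps)); eps = 0.1 is an arbitrary choice.
eps = 0.1
for k in (10, 20, 50, 100, 200):
    n = 2.0 ** k                      # so lg(n) = k exactly
    print(k, k / n ** eps)            # the ratio lg(n) / n^eps shrinks toward 0
```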