
Lecture 9



  1. Lecture 9

  2. Recurrences and Running Time • A recurrence is an equation or inequality that describes a function in terms of its value on smaller inputs, e.g. T(n) = T(n-1) + n • Recurrences arise when an algorithm contains recursive calls to itself • What is the actual running time of the algorithm? • Need to solve the recurrence

  3. Example Recurrences
  • T(n) = T(n-1) + n → Θ(n²): recursive algorithm that loops through the input to eliminate one item
  • T(n) = T(n/2) + c → Θ(lg n): recursive algorithm that halves the input in one step
  • T(n) = T(n/2) + n → Θ(n): recursive algorithm that halves the input but must examine every item in the input
  • T(n) = 2T(n/2) + 1 → Θ(n): recursive algorithm that splits the input into 2 halves and does a constant amount of other work
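As a concrete (hypothetical) instance of the first recurrence, a recursive sort that scans all n elements to find the largest, moves it to the end, and then recurses on the remaining n-1 elements does T(n) = T(n-1) + n work, i.e. Θ(n²). A minimal Python sketch:

```python
def recursive_selection_sort(a):
    """Sort a list by repeatedly moving the largest element to the end.

    Each call scans all n elements (Theta(n) work) and recurses on the
    remaining n-1, so the running time follows T(n) = T(n-1) + n = Theta(n^2).
    """
    if len(a) <= 1:
        return a
    i = a.index(max(a))        # Theta(n) scan for the largest element
    a[i], a[-1] = a[-1], a[i]  # move it to the last position
    return recursive_selection_sort(a[:-1]) + [a[-1]]
```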

  4. Methods for Solving Recurrences • Iteration method • Substitution method • Recursion tree method • Master method

  5. The Iteration Method • Convert the recurrence into a summation and try to bound it using known series • Iterate the recurrence until the initial condition is reached. • Use back-substitution to express the recurrence in terms of n and the initial (boundary) condition.

  6. The Iteration Method T(n) = c + T(n/2)
  T(n) = c + T(n/2)
       = c + c + T(n/4)
       = c + c + c + T(n/8)
       …
  Assume n = 2^k. After k iterations the initial condition is reached:
  T(n) = c + c + … + c + T(1)   (k = lg n times)
       = c lg n + T(1)
       = Θ(lg n)

  7. Iteration Method – Example T(n) = n + 2T(n/2). Assume n = 2^k.
  T(n) = n + 2T(n/2)
       = n + 2(n/2 + 2T(n/4)) = n + n + 4T(n/4)
       = n + n + 4(n/4 + 2T(n/8)) = n + n + n + 8T(n/8)
       …
       = i·n + 2^i T(n/2^i)
       = k·n + 2^k T(1)
       = n lg n + n T(1)
       = Θ(n lg n)
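The closed form from back-substitution can be checked numerically. This sketch (my own, assuming the base case T(1) = 1) evaluates the recurrence exactly for powers of two and compares it with n lg n + n·T(1):

```python
def T(n):
    # Exact value of the recurrence T(n) = n + 2*T(n/2), with an
    # assumed base case T(1) = 1, evaluated for powers of two.
    return 1 if n == 1 else n + 2 * T(n // 2)

# Back-substitution gives T(n) = n*lg(n) + n*T(1) = n*lg(n) + n for n = 2^k.
for k in range(1, 11):
    n = 2 ** k
    assert T(n) == n * k + n   # k = lg n
```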

  8. The substitution method • Guess a solution • Use induction to prove that the solution works

  9. Substitution method • Guess a solution • T(n) = O(g(n)) • Induction goal: apply the definition of the asymptotic notation • T(n) ≤ d g(n), for some d > 0 and n ≥ n0 • Induction hypothesis: T(k) ≤ d g(k) for all k < n • Prove the induction goal • Use the induction hypothesis to find some values of the constants d and n0 for which the induction goal holds (strong induction)

  10. Substitution Method T(n) = 2T(n/2) + n = O(n lg n)
  • Thus, we need to show that T(n) ≤ c n lg n with an appropriate choice of c
  • Inductive hypothesis: assume T(n/2) ≤ c (n/2) lg (n/2)
  • Substitute back into the recurrence to show that T(n) ≤ c n lg n follows, when c ≥ 1:
  T(n) = 2T(n/2) + n
       ≤ 2(c (n/2) lg (n/2)) + n
       = c n lg(n/2) + n
       = c n lg n – c n lg 2 + n
       = c n lg n – c n + n
       ≤ c n lg n for c ≥ 1
  Hence T(n) = O(n lg n) for c ≥ 1

  11. Recursion-tree method • A recursion tree models the costs (time) of a recursive execution of an algorithm. • The recursion-tree method can be unreliable, since it depends on spotting a pattern from only a few levels of the tree. • It is good for generating guesses that are then verified with the substitution method.

  12.–15. Example of recursion tree • Solve T(n) = T(n/4) + T(n/2) + n² • The root costs n²; its children cost (n/4)² and (n/2)², so level i of the tree contributes (5/16)^i n² in total • The level costs form a decreasing geometric series: T(n) ≤ n² (1 + 5/16 + (5/16)² + …) = n² / (1 − 5/16) = (16/11) n² • Since the root alone already costs n², T(n) = Θ(n²)
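The recursion tree's level sums form the geometric series n²(1 + 5/16 + (5/16)² + …) ≤ n²/(1 − 5/16) = (16/11)n², suggesting T(n) = Θ(n²). A small check (my own sketch, with an assumed base case T(1) = 1 and integer division) evaluates the recurrence exactly and confirms it stays between n² and roughly (16/11)n²:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # Exact value of the recurrence T(n) = T(n/4) + T(n/2) + n^2,
    # with an assumed base case T(1) = 1 and integer subproblem sizes.
    return 1 if n <= 1 else T(n // 4) + T(n // 2) + n * n

# Root cost n^2 is a lower bound; the geometric series gives the upper bound
# (16/11) n^2 (plus a little slack for the base cases):
for k in range(4, 16):
    n = 2 ** k
    assert n * n <= T(n) <= (16 / 11) * n * n + n
```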

  16. Master’s method • When analyzing algorithms, recall that we only care about the asymptotic behavior. • Recursive algorithms are no different: rather than solve the recurrence relation associated with the cost of an algorithm exactly, it is enough to give an asymptotic characterization. • The main tool for doing this is the master theorem.

  17.–21. Master’s method • Applies to recurrences of the form T(n) = a T(n/b) + f(n), with constants a ≥ 1, b > 1, and f(n) asymptotically positive • Compare f(n) against n^(log_b a): • Case 1: if f(n) = O(n^(log_b a − ε)) for some ε > 0, then T(n) = Θ(n^(log_b a)) • Case 2: if f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) lg n) • Case 3: if f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and a f(n/b) ≤ c f(n) for some c < 1 and all sufficiently large n, then T(n) = Θ(f(n))
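For the common special case f(n) = Θ(n^d), the master theorem reduces to comparing d with the critical exponent log_b a. A small helper (my own sketch, not from the slides) makes the three cases explicit:

```python
import math

def master_theorem(a, b, d):
    """Asymptotic solution of T(n) = a*T(n/b) + Theta(n^d).

    Simplified master theorem for polynomial driving functions
    (a >= 1, b > 1); returns a human-readable Theta bound.
    """
    crit = math.log(a, b)            # critical exponent log_b(a)
    if math.isclose(d, crit):
        return f"Theta(n^{d} * lg n)"    # case 2: costs balanced across levels
    if d < crit:
        return f"Theta(n^{crit:.3g})"    # case 1: leaves dominate
    return f"Theta(n^{d})"               # case 3: root dominates

# Recurrences from the earlier slides:
# T(n) = 2T(n/2) + n  ->  master_theorem(2, 2, 1)  ->  Theta(n^1 * lg n)
# T(n) = T(n/2) + c   ->  master_theorem(1, 2, 0)  ->  Theta(n^0 * lg n) = Theta(lg n)
# T(n) = 2T(n/2) + 1  ->  master_theorem(2, 2, 0)  ->  Theta(n^1) = Theta(n)
```

Note that case 3 of the full theorem also requires the regularity condition a·f(n/b) ≤ c·f(n), which holds automatically for f(n) = n^d with d > log_b a.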

  22. Counting Sort CountingSort(A, B, k) ⊳ n = length of A
  for i = 1 to k
      C[i] = 0
  for j = 1 to n
      C[A[j]] = C[A[j]] + 1
  for i = 2 to k
      C[i] = C[i] + C[i-1]
  for j = n downto 1
      B[C[A[j]]] = A[j]
      C[A[j]] = C[A[j]] - 1

  23. Sorting in linear time Counting sort: no comparisons between elements. • Input: A[1 . . n], where A[j] ∈ {1, 2, …, k}. • Output: B[1 . . n], sorted. • Auxiliary storage: C[1 . . k].
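The pseudocode translates directly to Python once the 1-based array indices are shifted to 0-based lists. A runnable sketch:

```python
def counting_sort(A, k):
    """Stable counting sort of A, where each key is in {1, ..., k}.

    A direct translation of the slide's pseudocode to 0-based Python lists.
    """
    n = len(A)
    B = [0] * n            # output array
    C = [0] * (k + 1)      # C[i] will hold the counts; index 0 unused
    for j in range(n):     # Loop 2: count occurrences, C[i] = |{key = i}|
        C[A[j]] += 1
    for i in range(2, k + 1):        # Loop 3: prefix sums, C[i] = |{key <= i}|
        C[i] += C[i - 1]
    for j in range(n - 1, -1, -1):   # Loop 4: place from the right for stability
        B[C[A[j]] - 1] = A[j]        # -1 converts the 1-based rank to a 0-based index
        C[A[j]] -= 1
    return B

counting_sort([4, 1, 3, 4, 3], 4)   # -> [1, 3, 3, 4, 4]
```

Scanning A from right to left in the last loop is what makes the sort stable: equal keys are placed from the highest remaining rank downward, preserving their input order.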

  24.–38. Counting-sort example Input: A[1 . . 5] = [4, 1, 3, 4, 3], k = 4
  Loop 1: for i ← 1 to k do C[i] ← 0
      C = [0, 0, 0, 0]
  Loop 2: for j ← 1 to n do C[A[j]] ← C[A[j]] + 1 ⊳ C[i] = |{key = i}|
      C = [0,0,0,1] → [1,0,0,1] → [1,0,1,1] → [1,0,1,2] → [1,0,2,2]
  Loop 3: for i ← 2 to k do C[i] ← C[i] + C[i–1] ⊳ C[i] = |{key ≤ i}|
      C' = [1,1,2,2] → [1,1,3,2] → [1,1,3,5]
  Loop 4: for j ← n downto 1 do B[C[A[j]]] ← A[j]; C[A[j]] ← C[A[j]] – 1
      j = 5: B[3] = 3, C = [1,1,2,5]
      j = 4: B[5] = 4, C = [1,1,2,4]
      j = 3: B[2] = 3, C = [1,1,1,4]
      j = 2: B[1] = 1, C = [0,1,1,4]
      j = 1: B[4] = 4, C = [0,1,1,3]
  Result: B = [1, 3, 3, 4, 4]

  39. Analysis
  for i ← 1 to k do C[i] ← 0 ⊳ Θ(k)
  for j ← 1 to n do C[A[j]] ← C[A[j]] + 1 ⊳ Θ(n)
  for i ← 2 to k do C[i] ← C[i] + C[i–1] ⊳ Θ(k)
  for j ← n downto 1 do B[C[A[j]]] ← A[j]; C[A[j]] ← C[A[j]] – 1 ⊳ Θ(n)
  Total: Θ(n + k)

  40. Running time If k = O(n), then counting sort takes Θ(n) time. • But, sorting takes Ω(n lg n) time! • Where’s the fallacy? Answer: • Comparison sorting takes Ω(n lg n) time. • Counting sort is not a comparison sort. • In fact, not a single comparison between elements occurs!

  41. Stable sorting A = [4, 1, 3, 4, 3] → B = [1, 3, 3, 4, 4] Counting sort is a stable sort: it preserves the input order among equal elements. Exercise: What other sorts have this property?
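Stability is easiest to see when the records carry satellite data. In this sketch (my own example, using Python's built-in stable sort to illustrate the property), the letter tags record the original input order, and equal keys keep that order after sorting:

```python
# Records share keys; the letter tag records the original input position.
records = [(4, 'a'), (1, 'b'), (3, 'c'), (4, 'd'), (3, 'e')]

# A stable sort (Python's sorted, like counting sort) keeps records with
# equal keys in their original relative order: 'c' stays before 'e',
# and 'a' stays before 'd'.
stable = sorted(records, key=lambda r: r[0])
# -> [(1, 'b'), (3, 'c'), (3, 'e'), (4, 'a'), (4, 'd')]
```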
