
CS 3343: Analysis of Algorithms



  1. CS 3343: Analysis of Algorithms Review for final

  2. Review for the final • In chronological order • Only the more important concepts • Very likely to appear on your final • Not meant to be exhaustive

  3. Asymptotic notations • O: Big-Oh • Ω: Big-Omega • Θ: Theta • o: Small-oh • ω: Small-omega • Intuitively: • O is like ≤ • o is like < • Ω is like ≥ • ω is like > • Θ is like =

  4. Big-Oh • Math: • O(g(n)) = {f(n): ∃ positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) ∀ n > n0} • Or: lim n→∞ g(n)/f(n) > 0 (if the limit exists) • Engineering: • g(n) grows at least as fast as f(n) • g(n) is an asymptotic upper bound of f(n) • Intuitively it is like f(n) ≤ g(n)

  5. Big-Oh • Claim: f(n) = 3n^2 + 10n + 5 ∈ O(n^2) • Proof: 3n^2 + 10n + 5 ≤ 3n^2 + 10n^2 + 5n^2 = 18n^2 when n > 1 • Therefore, let c = 18 and n0 = 1 • We have f(n) ≤ c·n^2, ∀ n > n0 • By definition, f(n) ∈ O(n^2)
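
A quick numeric spot-check of the witnesses c = 18, n0 = 1 chosen above (a throwaway sketch, not part of the slides):

    # Verify f(n) = 3n^2 + 10n + 5 <= 18 n^2 for a large range of n > 1.
    def f(n):
        return 3 * n**2 + 10 * n + 5

    assert all(f(n) <= 18 * n**2 for n in range(2, 100_000))
    print("f(n) <= 18 n^2 holds for all tested n > 1")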

  6. Big-Omega • Math: • Ω(g(n)) = {f(n): ∃ positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) ∀ n > n0} • Or: lim n→∞ f(n)/g(n) > 0 (if the limit exists) • Engineering: • f(n) grows at least as fast as g(n) • g(n) is an asymptotic lower bound of f(n) • Intuitively it is like g(n) ≤ f(n)

  7. Big-Omega • Claim: f(n) = n^2 / 10 ∈ Ω(n) • Proof: f(n) = n^2 / 10, g(n) = n • g(n) = n ≤ n^2 / 10 = f(n) when n > 10 • Therefore, c = 1 and n0 = 10

  8. Theta • Math: • Θ(g(n)) = {f(n): ∃ positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) ∀ n > n0} • Or: lim n→∞ f(n)/g(n) = c, for some constant 0 < c < ∞ • Or: f(n) = O(g(n)) and f(n) = Ω(g(n)) • Engineering: • f(n) grows in the same order as g(n) • g(n) is an asymptotically tight bound of f(n) • Intuitively it is like f(n) = g(n) • Θ(1) means constant time.

  9. Theta • Claim: f(n) = 2n^2 + n = Θ(n^2) • Proof: • We just need to find three constants c1, c2, and n0 such that • c1·n^2 ≤ 2n^2 + n ≤ c2·n^2 for all n > n0 • A simple solution is c1 = 2, c2 = 3, and n0 = 1

  10. Using limits to compare orders of growth • lim n→∞ f(n)/g(n) = 0 ⟹ f(n) ∈ o(g(n)) (and O(g(n))) • lim n→∞ f(n)/g(n) = c, 0 < c < ∞ ⟹ f(n) ∈ Θ(g(n)) (and both O(g(n)) and Ω(g(n))) • lim n→∞ f(n)/g(n) = ∞ ⟹ f(n) ∈ ω(g(n)) (and Ω(g(n)))

  11. Compare 2^n and 3^n • lim n→∞ 2^n / 3^n = lim n→∞ (2/3)^n = 0 • Therefore, 2^n ∈ o(3^n), and 3^n ∈ ω(2^n)

  12. Compare n^0.5 and log n • lim n→∞ n^0.5 / log n = ? • By L'Hôpital's rule, compare the derivatives: • (n^0.5)' = 0.5 n^-0.5 • (log n)' = 1/n • lim n→∞ (0.5 n^-0.5) / (1/n) = lim n→∞ 0.5 n^0.5 = ∞ • Therefore, log n ∈ o(n^0.5)

  13. Compare 2^n and n! • 2^n / n! = (2/1)(2/2)(2/3)⋯(2/n) ≤ 2 · 1 · (2/n) = 4/n for n ≥ 3, so lim n→∞ 2^n / n! = 0 • Therefore, 2^n = o(n!)
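
The three limit comparisons above can be spot-checked symbolically; a minimal sketch assuming SymPy is available (not part of the original slides):

    # Verify the limits from slides 11-13 symbolically.
    import sympy as sp

    n = sp.symbols('n', positive=True)
    print(sp.limit(2**n / 3**n, n, sp.oo))             # 0  => 2^n in o(3^n)
    print(sp.limit(sp.sqrt(n) / sp.log(n), n, sp.oo))  # oo => log n in o(n^0.5)
    print(sp.limit(2**n / sp.factorial(n), n, sp.oo))  # 0  => 2^n in o(n!)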

  14. More advanced dominance ranking • Slowest- to fastest-growing: 1 ≺ log n ≺ n^0.5 ≺ n ≺ n log n ≺ n^2 ≺ n^3 ≺ 2^n ≺ n! ≺ n^n

  15. General plan for analyzing time efficiency of a non-recursive algorithm • Decide on a parameter for input size • Identify the most-executed line (basic operation) • Ask: worst-case = average-case? • T(n) = Σ_i c_i·t_i (sum over all lines of cost × times executed) • T(n) = Θ(f(n))

  16. Analysis of insertion sort

                                              cost   times
    InsertionSort(A, n) {
      for j = 2 to n {                        c1     n
        key = A[j]                            c2     n-1
        i = j - 1                             c3     n-1
        while (i > 0) and (A[i] > key) {      c4     S
          A[i+1] = A[i]                       c5     S-(n-1)
          i = i - 1                           c6     S-(n-1)
        }                                            0
        A[i+1] = key                          c7     n-1
      }                                              0
    }

  (S = total number of times the while-loop test is executed)
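
A runnable Python version of the pseudocode above (my own sketch, using 0-based indexing rather than the slides' 1-based arrays):

    # Insertion sort, mirroring the pseudocode above (0-based indexing).
    def insertion_sort(a):
        for j in range(1, len(a)):           # for j = 2 to n
            key = a[j]
            i = j - 1
            while i >= 0 and a[i] > key:     # shift larger elements right
                a[i + 1] = a[i]
                i -= 1
            a[i + 1] = key                   # place key in its final slot
        return a

    print(insertion_sort([5, 2, 4, 6, 1, 3]))   # [1, 2, 3, 4, 5, 6]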

  17. Best case • Array already sorted • Inner loop stops immediately: A[i] <= key on the first test, or i = 0 • S = n - 1, so T(n) = Θ(n) • [figure: sorted prefix A[1..j-1], key at position j]

  18. Worst case • Array originally in reverse order • Inner loop stops only when i = 0, after shifting the entire sorted prefix • S = 2 + 3 + … + n = Θ(n^2), so T(n) = Θ(n^2) • [figure: key at position j shifts past every element of the sorted prefix]

  19. Average case • Array in random order • Inner loop stops when A[i] <= key, on average about halfway through the sorted prefix • S ≈ Σ_j j/2 = Θ(n^2), so T(n) = Θ(n^2) • [figure: key at position j moves past roughly half the sorted prefix]
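
A quick experiment (my own addition, not from the slides) that counts inner-loop shifts to confirm the three cases empirically:

    # Count element shifts for sorted, reversed, and random inputs.
    import random

    def shifts(data):
        a, count = list(data), 0
        for j in range(1, len(a)):
            key, i = a[j], j - 1
            while i >= 0 and a[i] > key:
                a[i + 1] = a[i]
                i -= 1
                count += 1
            a[i + 1] = key
        return count

    n = 1000
    print(shifts(range(n)))                    # best case: 0 shifts
    print(shifts(range(n, 0, -1)))             # worst case: n(n-1)/2 = 499500
    print(shifts(random.sample(range(n), n)))  # average: about n(n-1)/4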

  20. Find the order of growth for sums • How to find out the actual order of growth? • Remember some formulas • Learn how to guess and prove

  21. Arithmetic series • An arithmetic series is a sequence of numbers such that the difference between any two successive members is a constant d. e.g.: 1, 2, 3, 4, 5 or 10, 12, 14, 16, 18, 20 • In general: • Recursive definition: a_n = a_(n-1) + d • Closed form, or explicit formula: a_n = a_1 + (n-1)d

  22. Sum of arithmetic series • If a_1, a_2, …, a_n is an arithmetic series, then • Σ_(i=1..n) a_i = n(a_1 + a_n)/2 • e.g.: 1 + 2 + … + n = n(n+1)/2 = Θ(n^2)

  23. Geometric series • A geometric series is a sequence of numbers such that the ratio between any two successive members is a constant r. e.g.: 1, 2, 4, 8, 16, 32 or 10, 20, 40, 80, 160 or 1, 1/2, 1/4, 1/8, 1/16 • In general: • Recursive definition: a_n = r · a_(n-1) • Closed form, or explicit formula: a_n = a_1 · r^(n-1)

  24. Sum of geometric series • Σ_(i=0..n) r^i = (r^(n+1) - 1)/(r - 1) for r ≠ 1 • If r < 1: the sum is Θ(1) (bounded by the constant 1/(1-r)) • If r > 1: the sum is Θ(r^n) (the largest term dominates) • If r = 1: the sum is n + 1 = Θ(n)
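
The closed form can be sanity-checked against a direct summation; a throwaway sketch:

    # Compare the geometric-sum closed form to a brute-force sum.
    def geometric_sum(r, n):
        return sum(r**i for i in range(n + 1))

    for r in (0.5, 2.0, 3.0):
        n = 20
        closed = (r**(n + 1) - 1) / (r - 1)
        assert abs(geometric_sum(r, n) - closed) < 1e-6
    print("closed form matches direct summation")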

  25. Important formulas • Σ_(i=1..n) 1 = n • Σ_(i=1..n) i = n(n+1)/2 = Θ(n^2) • Σ_(i=1..n) i^2 = n(n+1)(2n+1)/6 = Θ(n^3) • Σ_(i=0..n) 2^i = 2^(n+1) - 1 = Θ(2^n) • Σ_(i=1..n) 1/i = Θ(log n)

  26. Sum manipulation rules • Σ c·a_i = c Σ a_i • Σ (a_i + b_i) = Σ a_i + Σ b_i • Σ_(i=1..n) a_i = Σ_(i=1..k) a_i + Σ_(i=k+1..n) a_i • Example: Σ_(i=1..n) (2i + 3) = 2 Σ_(i=1..n) i + 3n = n(n+1) + 3n = Θ(n^2)

  27. Recursive algorithms (divide and conquer) • General idea: • Divide a large problem into smaller ones • By a constant ratio • By a constant or some variable • Solve each smaller one recursively or explicitly • Combine the solutions of smaller ones to form a solution for the original problem

  28. How to analyze the time-efficiency of a recursive algorithm? • Express the running time on input of size n as a function of the running time on smaller problems

  29. Analyzing merge sort • MERGE-SORT A[1 . . n] • 1. If n = 1, done. — Θ(1) • 2. Recursively sort A[1 . . ⌈n/2⌉] and A[⌈n/2⌉+1 . . n]. — 2T(n/2) • 3. "Merge" the 2 sorted lists. — f(n) • Sloppiness: should be T(⌈n/2⌉) + T(⌊n/2⌋), but it turns out not to matter asymptotically.

  30. Analyzing merge sort • Divide: Trivial. • Conquer: Recursively sort 2 subarrays (2 subproblems, each of size n/2). • Combine: Merge two sorted subarrays (the work of dividing and combining). • T(n) = 2T(n/2) + f(n) + Θ(1) • What is the time for the base case? Constant: Θ(1) • What is f(n)? Merging takes Θ(n) • What is the growth order of T(n)? Θ(n log n); see the sketch below and the recursion tree in slides 38-48
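
For concreteness, a minimal runnable merge sort (my own sketch; the slides give only pseudocode):

    # Merge sort: T(n) = 2T(n/2) + Theta(n), which solves to Theta(n log n).
    def merge_sort(a):
        if len(a) <= 1:                       # base case: Theta(1)
            return a
        mid = len(a) // 2
        left = merge_sort(a[:mid])            # T(n/2)
        right = merge_sort(a[mid:])           # T(n/2)
        merged, i, j = [], 0, 0               # merge step: Theta(n)
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        return merged + left[i:] + right[j:]

    print(merge_sort([5, 2, 4, 6, 1, 3]))     # [1, 2, 3, 4, 5, 6]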

  31. Solving recurrences • The running time of many algorithms can be expressed in one of the following two recursive forms: • T(n) = aT(n/b) + f(n), or • T(n) = aT(n - b) + f(n) • Challenge: how to solve the recurrence to get a closed form, e.g. T(n) = Θ(n^2) or T(n) = Θ(n log n), or at least some bound such as T(n) = O(n^2)?

  32. Solving recurrence • Recurrence tree (iteration) method - Good for guessing an answer • Substitution method - Generic method, rigid, but may be hard • Master method - Easy to learn, useful in limited cases only - Some tricks may help in other cases

  33. The master method • The master method applies to recurrences of the form T(n) = aT(n/b) + f(n), where a ≥ 1, b > 1, and f is asymptotically positive. • Divide the problem into a subproblems, each of size n/b • Conquer the subproblems by solving them recursively. • Combine subproblem solutions • Divide + combine takes f(n) time.

  34. Master theorem • T(n) = aT(n/b) + f(n) • Key: compare f(n) with n^(log_b a) • CASE 1: f(n) = O(n^(log_b a - ε)) ⟹ T(n) = Θ(n^(log_b a)) • CASE 2: f(n) = Θ(n^(log_b a)) ⟹ T(n) = Θ(n^(log_b a) log n) • CASE 3: f(n) = Ω(n^(log_b a + ε)) and a·f(n/b) ≤ c·f(n) for some c < 1 ⟹ T(n) = Θ(f(n)) • e.g.: merge sort: T(n) = 2T(n/2) + Θ(n) • a = 2, b = 2 ⟹ n^(log_b a) = n ⟹ CASE 2 ⟹ T(n) = Θ(n log n)

  35. Case 1 • Compare f(n) with n^(log_b a): • f(n) = O(n^(log_b a - ε)) for some constant ε > 0 • f(n) grows polynomially slower than n^(log_b a) (by an n^ε factor) • Solution: T(n) = Θ(n^(log_b a)), i.e., aT(n/b) dominates • e.g. T(n) = 2T(n/2) + 1 • T(n) = 4T(n/2) + n • T(n) = 2T(n/2) + log n • T(n) = 8T(n/2) + n^2

  36. Case 3 • Compare f(n) with n^(log_b a): • f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0 • f(n) grows polynomially faster than n^(log_b a) (by an n^ε factor) • Solution: T(n) = Θ(f(n)), i.e., f(n) dominates • e.g. T(n) = T(n/2) + n • T(n) = 2T(n/2) + n^2 • T(n) = 4T(n/2) + n^3 • T(n) = 8T(n/2) + n^4

  37. Case 2 • Compare f(n) with n^(log_b a): • f(n) = Θ(n^(log_b a)) • f(n) and n^(log_b a) grow at a similar rate • Solution: T(n) = Θ(n^(log_b a) log n) • e.g. T(n) = T(n/2) + 1 • T(n) = 2T(n/2) + n • T(n) = 4T(n/2) + n^2 • T(n) = 8T(n/2) + n^3
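
A small helper (entirely my own, not from the slides) that applies the three cases when f(n) is a pure power n^d, reproducing several of the examples above:

    # Master theorem for T(n) = a*T(n/b) + n^d, with f(n) a pure power of n.
    # (For pure powers, the Case 3 regularity condition holds automatically.)
    import math

    def master(a, b, d):
        crit = math.log(a, b)                 # critical exponent log_b(a)
        if d < crit:
            return f"Theta(n^{crit:g})"       # Case 1: recursion dominates
        if d == crit:
            return f"Theta(n^{d:g} log n)"    # Case 2: balanced
        return f"Theta(n^{d:g})"              # Case 3: f(n) dominates

    print(master(2, 2, 1))   # merge sort -> Theta(n^1 log n)
    print(master(4, 2, 1))   # Case 1     -> Theta(n^2)
    print(master(2, 2, 2))   # Case 3     -> Theta(n^2)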

  38-48. Recursion tree • Solve T(n) = 2T(n/2) + dn, where d > 0 is constant. • Expanding T(n) level by level (the slides build this tree up one frame at a time):

                          dn                     → level sum: dn
                  dn/2          dn/2             → level sum: dn
              dn/4   dn/4   dn/4   dn/4          → level sum: dn
               …       …      …      …
           Θ(1)  Θ(1)  Θ(1)  …  Θ(1)  Θ(1)       → #leaves = n, level sum: Θ(n)

  • Height: h = log n • Each of the log n internal levels sums to dn • Total: dn · log n + Θ(n) = Θ(n log n)
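
To double-check the Θ(n log n) total, one can expand the recurrence numerically; a throwaway sketch (d = 1 and n a power of two are my choices):

    # Expand T(n) = 2T(n/2) + d*n exactly and compare with d*n*log2(n) + n
    # (the log n levels of dn work, plus Theta(n) for the leaves).
    import math

    def T(n, d=1):
        if n <= 1:
            return 1                          # Theta(1) base case
        return 2 * T(n // 2, d) + d * n

    for n in (16, 256, 65536):
        print(n, T(n), int(n * math.log2(n) + n))   # the two columns match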

  49. Substitution method • The most general method to solve a recurrence (prove O and Ω separately): • Guess the form of the solution • (e.g. using recursion trees, or expansion) • Verify by induction (inductive step).

  50. Proof by substitution • Recurrence: T(n) = 2T(n/2) + n. • Guess: T(n) = O(n log n) (e.g. by the recursion-tree method) • To prove, we have to show T(n) ≤ c n log n for some c > 0 and for all n > n0 • Proof by induction: assume it is true for T(n/2), prove that it is also true for T(n). This means: • Fact: T(n) = 2T(n/2) + n • Assumption: T(n/2) ≤ c(n/2) log(n/2) • Need to prove: T(n) ≤ c n log n • Substituting: T(n) ≤ 2 · c(n/2) log(n/2) + n = cn(log n - 1) + n = cn log n - (c - 1)n ≤ cn log n for any c ≥ 1
