Chapter 2

Presentation Transcript


  1. Chapter 2 Analysis Framework 2.1 – 2.2

  2. Homework • Remember questions due Wed. • Read sections 2.1 and 2.2 • pages 41-50 and 52-59

  3. Agenda: Analysis of Algorithms • Blackboard • Issues: • Correctness • Time efficiency • Space efficiency • Optimality • Approaches: • Theoretical analysis • Empirical analysis

  4. Time Efficiency Time efficiency is analyzed by determining the number of repetitions of the basic operation as a function of input size

  5. Theoretical analysis of time efficiency • Basic operation: the operation that contributes most towards the running time of the algorithm. • Running time: T(n) ≈ c_op · C(n), where n is the input size, c_op is the execution time of the basic operation, and C(n) is the number of times the basic operation is executed.

  6. Input size and basic operation examples

  7. Empirical analysis of time efficiency • Select a specific (typical) sample of inputs • Use physical unit of time (e.g., milliseconds) OR • Count actual number of basic operations • Analyze the empirical data
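
A minimal sketch of the timing approach in C; the function being measured is a hypothetical stand-in, not anything from the slides:

    #include <stdio.h>
    #include <time.h>

    /* Hypothetical stand-in for the algorithm being measured: sums 0..n-1. */
    static long algorithm_under_test(long n) {
        long s = 0;
        for (long i = 0; i < n; i++)
            s += i;
        return s;
    }

    int main(void) {
        /* A small, typical sample of input sizes. */
        for (long n = 1000000; n <= 8000000; n *= 2) {
            clock_t start = clock();
            long r = algorithm_under_test(n);
            double ms = 1000.0 * (double)(clock() - start) / CLOCKS_PER_SEC;
            printf("n = %ld: result %ld, %.3f ms\n", n, r, ms);
        }
        return 0;
    }

The slide's alternative, counting basic operations instead of wall-clock time, would replace the clock calls with a counter incremented inside the loop.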

  8. Best-case, average-case, worst-case For some algorithms, efficiency depends on the type of input: • Worst case: W(n) – maximum over inputs of size n • Best case: B(n) – minimum over inputs of size n • Average case: A(n) – “average” over inputs of size n • Number of times the basic operation will be executed on typical input • NOT the average of the worst and best cases • Expected number of basic-operation repetitions, treated as a random variable under some assumption about the probability distribution of all possible inputs of size n

  9. Example: Sequential search • Problem: Given a list of n elements and a search key K, find an element equal to K, if any. • Algorithm: Scan the list and compare its successive elements with K until either a matching element is found (successful search) or the list is exhausted (unsuccessful search) • Worst case • Best case • Average case
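
A minimal C sketch of the algorithm (the names are mine, not the book's pseudocode); the basic operation is the comparison with K:

    /* Returns the index of the first element equal to K, or -1 if K is absent. */
    int seq_search(const int a[], int n, int K) {
        for (int i = 0; i < n; i++)
            if (a[i] == K)    /* basic operation: one comparison per element */
                return i;     /* successful search */
        return -1;            /* unsuccessful search: list exhausted */
    }

Worst case: K is last or absent, so C_worst(n) = n. Best case: K is first, so C_best(n) = 1. Average case, under the usual assumption that K is present with probability p and equally likely to be at any position: C_avg(n) = p(n+1)/2 + n(1-p).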

  10. Types of formulas for basic operation count • Exact formula e.g., C(n) = n(n-1)/2 • Formula indicating order of growth with specific multiplicative constant e.g., C(n) ≈ 0.5n² • Formula indicating order of growth with unknown multiplicative constant e.g., C(n) ≈ cn²

  11. Order of growth • Most important: Order of growth within a constant multiple as n→∞ • Example: • How much faster will the algorithm run on a computer that is twice as fast? • How much longer does it take to solve a problem of double the input size? • See table 2.1
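
A quick worked case (quadratic growth, chosen here only as an illustration, not taken from the slide): suppose C(n) = (1/2)n(n-1).

    On a machine twice as fast: T(n) ≈ (c_op/2)·C(n), so the running time is cut roughly in half.
    On double the input size: C(2n)/C(n) = 2n(2n-1) / (n(n-1)) ≈ 4, so it takes about four times as long.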

  12. Table 2.1

  13. Day 4: Agenda • O, Θ, Ω • Limits • Definitions • Examples • Code → O(?)

  14. Homework • Due on Monday 11:59PM • Electronic submission see website. • Try to log into Blackboard • Finish reading 2.1 and 2.2 • page 60-61, questions 2, 3, 4, 5, & 9

  15. Asymptotic growth rate A way of comparing functions that ignores constant factors and small input sizes • O(g(n)): class of functions t(n) that grow no faster than g(n) • Θ(g(n)): class of functions t(n) that grow at same rate as g(n) • Ω(g(n)): class of functions t(n) that grow at least as fast as g(n) see figures 2.1, 2.2, 2.3

  16. Big-oh

  17. Big-omega

  18. Big-theta

  19. Using Limits Compute lim_{n→∞} t(n)/g(n): • 0 ⇒ t(n) grows slower than g(n) • a positive constant c ⇒ t(n) grows at the same rate as g(n) • ∞ ⇒ t(n) grows faster than g(n)

  20. L’Hôpital’s Rule: if t(n) and g(n) both tend to ∞ and their derivatives exist, then lim_{n→∞} t(n)/g(n) = lim_{n→∞} t′(n)/g′(n)

  21. Examples: • 10n vs. 2n² • n(n+1)/2 vs. n² • log_b n vs. log_c n
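
One way to work these with the limit rule above (my arithmetic, not transcribed from the slide):

    lim 10n / (2n²) = lim 5/n = 0, so 10n grows slower than 2n²: 10n ∈ O(2n²).
    lim [n(n+1)/2] / n² = lim (n+1)/(2n) = 1/2, a positive constant, so n(n+1)/2 ∈ Θ(n²).
    log_b n / log_c n = (ln n / ln b) / (ln n / ln c) = ln c / ln b, a positive constant, so log_b n ∈ Θ(log_c n).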

  22. Definition • f(n) = O( g(n) ) if there exists • a positive constant c and • a non-negative integer n₀ • such that f(n) ≤ c·g(n) for every n > n₀ Examples: • 10n is O(2n²) • 5n+20 is O(10n)
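
Concrete constants for the two examples (one possible choice, not the only one):

    5n + 20 ≤ 5n + 5n = 10n for every n ≥ 4, so c = 10 and n₀ = 4 work for the second example.
    10n ≤ 2n² exactly when n ≥ 5, so c = 1 and n₀ = 4 work for the first (the inequality holds for every n > 4).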

  23. Basic Asymptotic Efficiency classes

  24. Non-recursive algorithm analysis Analysis Steps: • Decide on parameter n indicating input size • Identify algorithm’s basic operation • Determine worst, average, and best case for input of size n • Set up summation for C(n) reflecting algorithm’s loop structure • Simplify summation using standard formulas

  25. Example for (x = 0; x < n; x++) a[x] = max(b[x],c[x]);
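
Setting up the count for this loop (the basic operation is the call to max): C(n) = Σ_{x=0}^{n-1} 1 = n, so the algorithm is Θ(n). Best and worst case coincide, since the loop never exits early.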

  26. Example for (x = 0; x < n; x++) for (y = x; y < n; y++) a[x][y] = max(b[x],c[y]);
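
Here the inner loop runs n − x times, so C(n) = Σ_{x=0}^{n-1} (n − x) = n + (n−1) + … + 1 = n(n+1)/2, which is Θ(n²).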

  27. Example for (x = 0; x < n; x++) for (y = 0; y < n/2; y++) for (z = 0; z < n/3; z++) a[z] = max(a[x],c[y]);
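
With independent bounds the counts multiply: C(n) = n · (n/2) · (n/3) = n³/6 (ignoring floors), so the algorithm is Θ(n³); the constant 1/6 does not change the efficiency class.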

  28. Example y = n while (y > 0) if (a[y--] == b[y--]) break;
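
A cleaned-up reading of this loop in C, since the two y-- side effects in one expression are unsequenced in C and a[n] would start one past the last index of a 0-based array; this is one plausible interpretation of the slide's pseudocode, not the only one:

    y = n;
    while (y > 0) {
        y = y - 1;            /* step down from the top of the arrays */
        if (a[y] == b[y])     /* basic operation: one comparison per pass */
            break;            /* early exit on the first match */
    }

Worst case: no positions match, the loop runs n times, C_worst(n) = n, so O(n). Best case: a[n-1] == b[n-1] and C_best(n) = 1. The original's double decrement would roughly halve the number of passes, but the order of growth is unchanged.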

  29. Day 5: Agenda • Go over the answer to hw2 • Try electronic submission • Do some problems on the board • Time permitting: Recursive algorithm analysis

  30. Homework • Remember to electronically submit hw3 before Tues. morning • Read section 2.3 thoroughly! Pages 61-67

  31. Day 6: Agenda • First, what have we learned so far about non-recursive algorithm analysis • Second, log_b n and bⁿ: the enigma is solved. • Third, what is the deal with Abercrombie & Fitch • Fourth, recursive analysis toolkit.

  32. Homework 4 and Exam 1 • Last homework before Exam 1 • Due on Friday 2/6 (electronic?) • Will be returned on Monday 2/9 • All solutions will be posted next Monday 2/9 • Exam 1 Wed. 2/11

  33. Homework 4 • Page 68 questions 4, 5 and 6 • Pages 77 and 78 questions 8 and 9 • We will do similar example questions all day on Wed 2/4

  34. What have we learned? • Non-recursive algorithm analysis • First, Identify problem size. • Typically, • a loop counter • an array size • a set size • the size of a value

  35. Basic operations • Second, identify the basic operation • Usually a small block of code or even a single statement that is executed over and over. • Sometimes the basic operation is a comparison that is hidden inside of the loop • Example: • while (target != a[x]) x++;

  36. Single loops • One loop from 1 to n • O(n) • Be aware this is the same as • 2 independent loops from 1 to n • c independent loops from 1 to n • A loop from 5 to n-1 • A loop from 0 to n/2 • A loop from 0 to n/c

  37. Nested loops for (x = 0; x < n; x++) for (y = 0; y < n; y++) → O(n²) for (x = 0; x < n; x++) for (y = 0; y < n; y++) for (z = 0; z < n; z++) → O(n³) But remember: We can have c independent nested loops, or a loop can be cut short after n/c iterations; the constant c does not change the efficiency class.

  38. Most non-recursive algorithms reduce to one of these efficiency classes

  39. What else? • Best cases often arise when loops terminate early for specific inputs. • For worst cases, consider the following: Is it possible that a loop will NOT terminate early? • Average cases are not the midpoint or average of the best and worst cases • Average cases require us to consider a set of typical inputs • We won’t really worry too much about average case until we hit more difficult problems.

  40. Anything else? • Important questions you should be asking: • How does log₂n arise in non-recursive algorithms? • How does 2ⁿ arise? • Dr. B. Please, show me the actual loops that cause this!

  41. Log₂n for (x = 1; x < n; x = x*2) Basic operation; for every item in the list do Basic operation; eliminate or discard half the items; • Note: These types of log₂n loops can be nested inside of O(n), O(n²), or O(nᵏ) loops, which leads to • n log n • n² log n • nᵏ log n

  42. Log₂n • Which, by the way, is pretty much equivalent to log_b n, ln n, or the magical lg n • But, don’t be mistaken • O(n log n) is different than O(n), even though log n grows so slowly.

  43. Last thing: 2ⁿ • How does 2ⁿ arise in real algorithms? • Let’s consider nⁿ: • How can you program n nested loops? • It’s impossible, right? • So, how the heck does it happen in practice?

  44. Think • Write an algorithm that prints “*” 2ⁿ times. • Is it hard? • It depends on how you think.

  45. Recursive way fun(int x) { if (x > 0) { print “*”; fun(x-1); fun(x-1); } } Then call the function fun(n);

  46. Non-recursive way for (x = 0; x < pow(2,n); x++) print “*”; WTF? Be very wary of your input size. If your input size is truly n, then this is truly O(2ⁿ) with just one loop.

  47. That’s it! • That’s really it. • That’s everything I want you to know about non-recursive algorithm analysis • Well, for now.

  48. Enigma • We know that log₂n = Θ(log₃n) • But what about 2ⁿ = Θ(3ⁿ)? • Here is how I will disprove it… • This gives you an idea of how I like to see questions answered.
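
One way the disproof could go (my version of the argument, using the limit rule from slide 19, not the transcript's):

    For the logs: log₂n = log₃n / log₃2 = (1/log₃2) · log₃n, a constant multiple, so log₂n = Θ(log₃n).
    For the exponentials: lim_{n→∞} 2ⁿ/3ⁿ = lim (2/3)ⁿ = 0, so 2ⁿ grows strictly slower than 3ⁿ; 2ⁿ ∈ O(3ⁿ) but 3ⁿ ∉ O(2ⁿ), hence 2ⁿ ≠ Θ(3ⁿ).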

  49. Algorithm Analysis Recursive Algorithm Toolkit

  50. Example Recursive evaluation of n! • Definition: n! = 1*2*…*(n-1)*n • Recursive definition of n!: if n = 0 then F(n) := 1 else F(n) := F(n-1) * n; return F(n) • Recurrence for the number of multiplications M(n) to compute n!: M(n) = M(n-1) + 1 for n > 0, with M(0) = 0
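
A minimal C sketch of this recursion (my code, not the book's), with a counter that confirms the recurrence M(n) = M(n−1) + 1, M(0) = 0, i.e. M(n) = n multiplications:

    #include <stdio.h>

    static long mults = 0;      /* counts the basic operation: multiplication */

    long fact(int n) {
        if (n == 0)
            return 1;           /* base case: 0! = 1, no multiplication */
        mults++;                /* one multiplication per call with n > 0 */
        return fact(n - 1) * n;
    }

    int main(void) {
        int n = 5;                          /* sample input */
        long result = fact(n);              /* compute before printing the counter */
        printf("%d! = %ld using %ld multiplications\n", n, result, mults);
        return 0;                           /* prints: 5! = 120 using 5 multiplications */
    }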
