
Algorithm Efficiency



Presentation Transcript


  1. Algorithm Efficiency Chapter 10

  2. What Is a Good Solution? • A program incurs a real and tangible cost. • Computing time • Memory required • Difficulties encountered by users • Consequences of incorrect actions by program • A solution is good if … • The total cost incurred over all phases of its life … is minimal

  3. What Is a Good Solution? • Important elements of the solution • Good structure • Good documentation • Efficiency • Be concerned with efficiency when • Developing the underlying algorithm • Choosing objects and designing the interaction between those objects

  4. Measuring Efficiency of Algorithms • Important because • Choice of algorithm has significant impact • Examples • Responsive word processors • Grocery checkout systems • Automatic teller machines • Video/Gaming machines • Life support systems • E-Commerce web sites

  5. Measuring Efficiency of Algorithms • Analysis of algorithms: the area of computer science that provides tools for contrasting the efficiency of different algorithms • Comparison of algorithms should focus on significant differences in efficiency (order-of-magnitude differences as input size increases) • We compare algorithms, not programs • How do we measure efficiency? • Space utilization: amount of memory required • Time required to accomplish the task • Time efficiency depends on: • size of input (and, for some algorithms, input order) • speed of machine • quality of source code • quality of compiler • The last three vary from one platform to another

  6. Measuring Efficiency of Algorithms • Difficulties with comparing programs (instead of algorithms) • How the algorithms are coded • What computer will be used • What data the program should use • Algorithm analysis should be independent of specific implementations, computers, and data • We can count the number of times instructions are executed; this gives us a measure of the efficiency of an algorithm • So we measure computing time as: f(n) = computing time of an algorithm for input of size n = number of times the instructions are executed

  7. The Execution Time of Algorithms • An algorithm’s execution time is related to the number of operations it requires. • Example: Towers of Hanoi • The solution for n disks requires 2^n − 1 moves • If each move requires time m • The solution requires (2^n − 1) · m time units
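The 2^n − 1 count can be checked by brute force. A minimal sketch (the function name is ours): the classic recursive solution moves n − 1 disks aside, moves the largest disk, then moves the n − 1 disks back, so the move count satisfies M(n) = 2·M(n − 1) + 1.

```cpp
#include <cstdint>

// Count the moves made by the classic recursive Towers of Hanoi
// solution: move n-1 disks aside, move the largest, move n-1 back.
std::uint64_t hanoiMoves(int n) {
    if (n == 0) return 0;                          // no disks, no moves
    return hanoiMoves(n - 1) + 1 + hanoiMoves(n - 1);
}
// For any n this returns 2^n - 1, matching the count above.
```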

  8. Example: Calculating the Mean
  Task (# times executed):
  • Initialize the sum to 0 (1)
  • Initialize index i to 0 (1)
  • While i < n do the following (n + 1)
  • a) Add x[i] to sum (n)
  • b) Increment i by 1 (n)
  • Return mean = sum/n (1)
  Total: f(n) = 3n + 4
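As C++ this might look as follows; a sketch only, with the slide's operation counts noted in comments (the names x, n, sum are ours):

```cpp
#include <cstddef>

// Mean of n values, annotated with the operation counts from the slide.
double mean(const double x[], std::size_t n) {
    double sum = 0.0;          // 1: initialize the sum to 0
    std::size_t i = 0;         // 1: initialize index i to 0
    while (i < n) {            // n+1: loop tests (the last test fails)
        sum += x[i];           // n: add x[i] to sum
        ++i;                   // n: increment i by 1
    }
    return sum / n;            // 1: return mean = sum/n
}                              // total f(n) = 3n + 4
```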

  9. Computing Time Order of Magnitude • As the number of inputs increases • f(n) = 3n + 4 grows at a rate proportional to n • Thus f(n) has the "order of magnitude" n • For the computing time of an algorithm on input of size n, • f(n) is said to have order of magnitude g(n), • Written as: f(n) is O(g(n)) • Defined as: f(n) is O(g(n)) iff there exist positive constants C and N0 such that 0 ≤ f(n) ≤ C·g(n) for all n ≥ N0
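For f(n) = 3n + 4, one concrete witness pair is C = 4 and N0 = 4, since 3n + 4 ≤ 4n whenever n ≥ 4. A sketch that checks the inequality numerically (function name ours):

```cpp
// Check the Big-O witness C = 4, N0 = 4 for f(n) = 3n + 4:
// 3n + 4 <= 4n must hold for every n >= N0 = 4.
bool witnessHolds(long upTo) {
    for (long n = 4; n <= upTo; ++n)
        if (3 * n + 4 > 4 * n) return false;   // inequality fails
    return true;
}
```

Note the inequality genuinely fails below N0 (at n = 3, 13 > 12), which is why the definition only demands it "for all n ≥ N0".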

  10. Big Oh Notation • Another way of saying this: the complexity of the algorithm is O(g(n)). • Example: For the Mean-Calculation Algorithm, f(n) is O(n) • Note that constants and multiplicative factors are ignored. • f(x) ∈ O(g(x)) since there exist c > 0 (e.g., c = 1) and N0 (e.g., N0 = 5) such that f(x) ≤ c·g(x) whenever x ≥ N0. (In the accompanying graph, the x-axis is n and the y-axis is time.)

  11. Algorithm Growth Rates • Measure an algorithm’s time requirement as a function of problem size • Most important thing to learn • How quickly the algorithm’s time requirement grows as a function of problem size • The figure demonstrates the contrast in growth rates

  12. Big Oh Notation • g(n) is usually simple: 1, log2 log2 n, log2 n, n, n log2 n, n^2, n^3, …, 2^n • Note the graph of common computing times

  13. Big Oh Notation • Graphs of common computing times

  14. Algorithm Growth Rates • Time requirements as a function of problem size n

  15. Analysis and Big O Notation • The graphs of 3·n^2 and n^2 − 3·n + 10

  16. Analysis and Big O Notation • Order of growth of some common functions

  17. Common Computing Time Functions

  18. Analysis and Big O Notation • Worst-case analysis • Worst case analysis usually considered • Easier to calculate, thus more common • Average-case analysis • More difficult to perform • Must determine relative probabilities of encountering problems of given size

  19. Computing in Real Time • Suppose each instruction can be done in 1 microsecond • For n = 256, how long do the various common f(n) take?
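At 1 microsecond per instruction and n = 256, the common growth rates give roughly: log2 n ≈ 8 µs, n = 256 µs, n log2 n = 2,048 µs, n^2 = 65,536 µs (about 0.07 s), n^3 ≈ 16.8 s, while 2^256 µs exceeds the age of the universe. A sketch computing the polynomial rows (function name ours; 2^n omitted because it overflows any built-in type):

```cpp
#include <cmath>

// Microseconds needed for problem size n at 1 microsecond per
// instruction, for a few common growth rates.
double microsFor(double n, int which) {
    switch (which) {
        case 0: return std::log2(n);       // log2 n
        case 1: return n;                  // n
        case 2: return n * std::log2(n);   // n log2 n
        case 3: return n * n;              // n^2
        default: return n * n * n;         // n^3
    }
}
```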

  20. Keeping Your Perspective • ADT used makes a difference • Array-based getEntry is O(1) • Link-based getEntry is O(n) • Choosing an implementation of an ADT • Consider how frequently certain operations will occur • Seldom used but critical operations must also be efficient
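A minimal sketch contrasting the two getEntry implementations, assuming a bare-bones singly linked node (all names are ours, not the textbook's ADT interface): the array version is a single indexing step, while the linked version must walk i links from the head.

```cpp
#include <vector>
#include <cstddef>

// Array-based getEntry: one indexing step, O(1).
int getEntryArray(const std::vector<int>& a, std::size_t i) {
    return a[i];
}

struct Node { int item; Node* next; };

// Link-based getEntry: walk i links from the head, O(n) in the worst case.
int getEntryList(const Node* head, std::size_t i) {
    const Node* cur = head;
    while (i-- > 0) cur = cur->next;   // one pointer hop per position
    return cur->item;
}
```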

  21. Keeping Your Perspective • If problem size is always small • Possible to ignore algorithm’s efficiency • Weigh trade-offs between • Algorithm’s time and memory requirements • Compare algorithms for style and efficiency

  22. Efficiency of Searching Algorithms • Sequential search • Worst case: O(n) • Average case: O(n) • Best case: O(1) • Binary search • Worst case: O(log2 n) • Average case: O(log2 n) • Best case: O(1) • At the same time, maintaining the array in sorted order requires overhead, and that cost can be substantial

  23. Sequential Search Algorithm
  linear search(x: integer; a1, ..., an: distinct integers)
  i := 1
  while (i ≤ n and x ≠ ai)
      i := i + 1
  if i ≤ n then location := i
  else location := 0
  {location is the index (subscript) of the term that equals x, or 0 if x is not found}
  • Give f(n) for Sequential Search. Count operations.
  • Give the worst-case g(n) for Sequential Search.
  • What is the average case for Sequential Search? On average, where do we find x in the array a?
  • Give the average-case O(g(n)) for Sequential Search.
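The pseudocode above translates directly to C++. A sketch (names ours; unlike the slide's 1-based version, this uses 0-based indexing and returns -1 rather than 0 when x is absent):

```cpp
#include <cstddef>

// Sequential search: return the 0-based index of x in a[0..n-1],
// or -1 if x is not found.
long seqSearch(const int a[], std::size_t n, int x) {
    for (std::size_t i = 0; i < n; ++i)   // up to n+1 loop tests
        if (a[i] == x) return (long)i;    // found early: best case O(1)
    return -1;                            // scanned everything: worst case O(n)
}
```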

  24. Seq. Search Average Case Analysis
  E[X] = Σ (x · Pr{X = x}); assume a uniform probability distribution. Read E[X] as the expectation (average) of the discrete variable X. X is a function that maps elements of a sample space to the real numbers. For Sequential Search, the sample space is the finite set of comparison counts required to find the key. We assume the key is located in the array a.
  Comparisons (x) | Pr{X = x} | x · Pr{X = x}
  1 | 1/n | 1/n
  2 | 1/n | 2/n
  3 | 1/n | 3/n
  4 | 1/n | 4/n
  ...
  n | 1/n | n/n
  Sum = (1/n) · Σ i for 1 ≤ i ≤ n = (n(n+1))/2n = (n+1)/2
  On average, we find the key at the middle.
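The closed form (n + 1)/2 can be checked by summing the table directly. A sketch (function name ours) that accumulates i · Pr{X = i} = i · (1/n) over all n positions:

```cpp
// Average comparisons for sequential search when the key is equally
// likely to be at any of the n positions: (1/n) * sum_{i=1..n} i.
double avgComparisons(int n) {
    double sum = 0.0;
    for (int i = 1; i <= n; ++i)
        sum += i * (1.0 / n);   // i comparisons, each with probability 1/n
    return sum;                 // equals (n + 1) / 2
}
```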

  25. Binary Search Algorithm
  binary search(x: integer; a1, ..., an: increasing integers)
  i := 1 {i is the left endpoint}
  j := n {j is the right endpoint}
  while i < j
      m := ⌊(i + j)/2⌋
      if x > am then i := m + 1
      else j := m
  if x = ai then location := i
  else location := -1
  {location is the index (subscript) of the term that equals x, or -1 if x is not found}
  • Give f(n) for Binary Search.
  • Give the worst-case g(n) for Binary Search.
  • What is the average case for Binary Search? On average, where do we find x in the array a?
  • Give the average-case O(g(n)) for Binary Search.
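A C++ sketch of the same loop shape (names ours; 0-based indexing): the interval [lo, hi] is halved until one candidate remains, then a single equality test decides.

```cpp
#include <cstddef>

// Binary search on a sorted array: shrink [lo, hi] until one
// candidate remains, then check it. Returns the index of x, or -1.
long binSearch(const int a[], std::size_t n, int x) {
    if (n == 0) return -1;
    std::size_t lo = 0, hi = n - 1;
    while (lo < hi) {                     // about log2(n) iterations
        std::size_t mid = (lo + hi) / 2;  // floor of the midpoint
        if (x > a[mid]) lo = mid + 1;     // key must be in the right half
        else hi = mid;                    // key is at mid or to its left
    }
    return a[lo] == x ? (long)lo : -1;    // single final equality test
}
```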

  26. Binary Search Average Case Analysis E[X] = Σ (x · Pr{X = x}); read E[X] as the expectation (average) of the discrete variable X. X is a function that maps elements of a sample space to the real numbers. For Binary Search, the sample space is the finite set of comparison counts required to find the key. We assume the key is located in the array a. Comparisons (x) | Pr{X = x} | x · Pr{X = x} • You fill in the table. • On average, where should we find the key in the array?

  27. Examples
  Assuming a linked list of n nodes, the statements
  Node *cur = head;
  while (cur != nullptr)
  {
      cout << cur->item << endl;
      cur = cur->next;
  } // end while
  require ______ assignment(s). The code segment is O(___)?
  Consider an algorithm that contains loops of this form:
  for (i = 1 through n)
      for (j = 1 through i)
          for (k = 1 through 10)
              Task T
  If task T requires t time units, the innermost loop on k requires ___ time units. The middle loop on j requires ___ time units. The code segment is O(____)?
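One way to check a hand-derived count for nested loops like the second example is to count executions by brute force. A sketch (function name ours) that tallies how many times task T would run:

```cpp
// Count how many times task T runs in the triple loop above, so a
// hand-derived formula can be checked against brute force.
long countTaskRuns(int n) {
    long count = 0;
    for (int i = 1; i <= n; ++i)           // outer loop: n passes
        for (int j = 1; j <= i; ++j)       // middle loop: i passes
            for (int k = 1; k <= 10; ++k)  // inner loop: 10 passes
                ++count;                   // one execution of task T
    return count;                          // equals 10 * n * (n + 1) / 2
}
```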

  28. Examples
  Order the following functions from smallest growth rate to largest: n^2, n, 2^n, log2 n.
  Use Big-O notation to specify the asymptotic run-time of the following code segments. Assume variables a and b are unsigned ints.
  while (a != 0)
  {
      cout << a << " ";
      a /= 2;
  } = O(___)?
  if (a > b) cout << a << endl;
  else cout << b << endl; = O(___)?
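For the halving loop, the iteration count can be measured directly. A sketch (function name ours): each pass divides a by 2, so the body runs ⌊log2 a⌋ + 1 times for a > 0, which is how the answer to the first blank is usually justified.

```cpp
// Count iterations of the halving loop "while (a != 0) { ...; a /= 2; }".
// For a > 0 this returns floor(log2(a)) + 1.
int halvingSteps(unsigned a) {
    int steps = 0;
    while (a != 0) {
        ++steps;    // one loop-body execution (one cout in the slide)
        a /= 2;     // integer halving
    }
    return steps;
}
```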

  29. End Chapter 10
