This chapter explores the efficiency of algorithms, highlighting the complexities in measuring performance through execution time and memory requirements. It introduces Big-O notation as a vital tool for characterizing the time complexity of algorithms as input sizes increase. Through examples, the chapter demonstrates how to count executions and analyze nested loops, leading to conclusions about time complexity. It emphasizes the importance of Big-O in comparing algorithm performance, outlining cases for best, worst, and average scenarios. This foundation is essential for optimizing algorithms effectively.
Efficiency of Algorithms • Difficult to get a precise measure of the performance of an algorithm or program • Can characterize a program by how its execution time or memory requirements grow as a function of increasing input size • Big-O notation • A simple way to estimate the big-O of an algorithm or program is to look at its loops and check whether they are nested (see the sketch below)
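A rough, hypothetical Java illustration of that heuristic (the class and method names are invented for this sketch): one loop over n items suggests O(n), while two nested loops over n items suggest O(n²).

```java
public class LoopGrowth {
    // One loop over n items: the body runs n times, suggesting O(n).
    static int singleLoop(int n) {
        int count = 0;
        for (int i = 0; i < n; i++) {
            count++;
        }
        return count;
    }

    // Two nested loops over n items: the inner body runs n * n times,
    // suggesting O(n^2).
    static int nestedLoops(int n) {
        int count = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                count++;
            }
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(singleLoop(1000));   // 1000
        System.out.println(nestedLoops(1000));  // 1000000
    }
}
```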
Counting Executions • Generally, • A simple statement counts as “1” • Sequential statements add: statement 1; statement 2 counts as 1 + 1 • Loops multiply: the cost of the loop body times the number of times the loop runs (see the sketch below)
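A small hypothetical fragment (not from the original slides) applying these counting rules; the exact count T(n) = 1 + 2n comes from one statement before the loop and two statements per iteration.

```java
public class CountingRules {
    // Hypothetical fragment used to apply the counting rules above.
    static void printRunningSums(int[] data) {
        int sum = 0;                               // 1 execution
        for (int i = 0; i < data.length; i++) {    // loop body runs n times
            sum += data[i];                        // 1 per iteration
            System.out.println(sum);               // 1 per iteration
        }
        // Sequential statements add, the loop multiplies:
        // T(n) = 1 + 2n, which is O(n).
    }

    public static void main(String[] args) {
        printRunningSums(new int[] {3, 1, 4, 1, 5});
    }
}
```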
Counting Example • Consider a pair of nested loops whose inner body contains three simple statements (a sketch consistent with these counts follows below). • The first time through the outer loop, the inner loop is executed n – 1 times; the next time n – 2, and the last time once. • So we have • T(n) = 3(n – 1) + 3(n – 2) + … + 3, or • T(n) = 3((n – 1) + (n – 2) + … + 1)
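The slide's original code listing is not reproduced here; the following is a hypothetical reconstruction whose statement counts match the analysis above, with a counter so the total can be checked against the closed form derived on the next slide.

```java
public class TriangularCount {
    // Nested loops in which the inner loop runs n-1 times on the first
    // pass of the outer loop, then n-2 times, ..., then once.
    static long countedWork(int n) {
        long executions = 0;
        for (int i = 0; i < n - 1; i++) {
            for (int j = i + 1; j < n; j++) {
                executions += 3;   // stand-in for a body of three simple statements
            }
        }
        return executions;         // 3(n-1) + 3(n-2) + ... + 3
    }

    public static void main(String[] args) {
        int n = 10;
        System.out.println(countedWork(n));          // 135
        System.out.println(1.5 * n * n - 1.5 * n);   // 135.0, matching the closed form
    }
}
```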
Counting Example (continued) • We can reduce the expression in parentheses to: • n × (n – 1) / 2 • So, T(n) = 1.5n² – 1.5n • This polynomial is zero when n is 1. For values greater than 1, 1.5n² is always greater than 1.5n² – 1.5n • Therefore, we can use 1 for n₀ and 1.5 for c to conclude that T(n) is O(n²)
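Written out as a worked derivation (a restatement of the slide's steps using the big-O constants c and n₀):

```latex
\[
T(n) = 3\bigl((n-1) + (n-2) + \cdots + 1\bigr)
     = 3 \cdot \frac{n(n-1)}{2}
     = 1.5n^2 - 1.5n
\]
\[
\text{For } n \ge n_0 = 1:\quad
T(n) = 1.5n^2 - 1.5n \le 1.5n^2 = c\,n^2
\quad\text{with } c = 1.5,
\quad\text{so } T(n) \text{ is } O(n^2).
\]
```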
Importance of Big-O • Says little about performance for small values of N • Shows how running time grows as the input size grows • Regardless of comparative performance for small N, the algorithm with the smaller big-O will eventually win (see the sketch below) • Technically, what we usually mean when we say big-O is really big-Theta
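A rough, hypothetical benchmark sketch (not from the slides) illustrating that point: an O(N²) selection sort versus the library's O(N log N) Arrays.sort. Absolute timings depend on the machine and JIT warm-up, but the smaller big-O pulls ahead as N grows.

```java
import java.util.Arrays;
import java.util.Random;

public class GrowthDemo {
    // O(N^2) selection sort, used only for comparison.
    static void selectionSort(int[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            int min = i;
            for (int j = i + 1; j < a.length; j++) {
                if (a[j] < a[min]) min = j;
            }
            int tmp = a[i]; a[i] = a[min]; a[min] = tmp;
        }
    }

    public static void main(String[] args) {
        Random rand = new Random(42);
        for (int n = 1_000; n <= 64_000; n *= 4) {
            int[] data = rand.ints(n).toArray();
            int[] copy1 = data.clone();
            int[] copy2 = data.clone();

            long t0 = System.nanoTime();
            selectionSort(copy1);                 // O(N^2)
            long t1 = System.nanoTime();
            Arrays.sort(copy2);                   // O(N log N) on average
            long t2 = System.nanoTime();

            System.out.printf("n=%6d  selection=%8.1f ms  Arrays.sort=%6.1f ms%n",
                    n, (t1 - t0) / 1e6, (t2 - t1) / 1e6);
        }
    }
}
```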
Some Rules • If you have to read or write N items of data, your program is at least O(N) • Search programs range from O(log N) to O(N) • Sort programs range from O(N log N) to O(N²)
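As an illustration of the O(log N) end of the search range (an example added here, not from the slides), a standard binary search, which assumes the array is already sorted:

```java
public class BinarySearchDemo {
    // Binary search halves the remaining range on each step, so it
    // examines at most about log2(N) elements: the O(log N) end of
    // the search range mentioned above.
    static int binarySearch(int[] sorted, int target) {
        int low = 0, high = sorted.length - 1;
        while (low <= high) {
            int mid = low + (high - low) / 2;   // avoids overflow for large indices
            if (sorted[mid] == target) return mid;
            if (sorted[mid] < target) low = mid + 1;
            else high = mid - 1;
        }
        return -1;   // not found
    }

    public static void main(String[] args) {
        int[] sorted = {2, 3, 5, 7, 11, 13, 17};
        System.out.println(binarySearch(sorted, 11));  // 4
        System.out.println(binarySearch(sorted, 4));   // -1
    }
}
```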
Cases • Different inputs of the same size may take different amounts of time to run • Best case • Worst case • Average case • Example: consider sequential search (sketched below)
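A minimal sequential-search sketch with the three cases annotated; the case analysis is the standard one, filled in here rather than taken verbatim from the slides.

```java
public class SequentialSearch {
    // Scans the array from the front until the target is found.
    // Best case:    target is the first element      -> 1 comparison,  O(1)
    // Worst case:   target is absent or last         -> n comparisons, O(n)
    // Average case: target equally likely anywhere   -> about n/2 comparisons, still O(n)
    static int sequentialSearch(int[] data, int target) {
        for (int i = 0; i < data.length; i++) {
            if (data[i] == target) return i;
        }
        return -1;   // not found
    }

    public static void main(String[] args) {
        int[] data = {9, 4, 7, 1, 6};
        System.out.println(sequentialSearch(data, 9));   // best case: index 0
        System.out.println(sequentialSearch(data, 6));   // last element: index 4
        System.out.println(sequentialSearch(data, 8));   // absent: -1
    }
}
```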