
308-203A Introduction to Computing II Lecture 5: Complexity of Algorithms


Presentation Transcript


  1. 308-203A Introduction to Computing II, Lecture 5: Complexity of Algorithms. Fall Session 2000

  2. How long does a program take to run? • Depends on what the input is • May depend only on some parameter(s) of the input. Example: copying a String depends on its length: “a”.clone( ) is less work than “abc…z”.clone( )

  3. To Quantify That... • Assume some simple operations take fixed time, e.g. a[i] = 0; => 1 time unit • Complex operations depend on how many simpler operations are performed, e.g. for i := 1 to n do a[i] = 0; => n time units

  4. Do constants matter? Q. What if a[i] = 0 and x = 0 don’t take the same time? A. As it happens, this isn’t as important as loops, recursion, etc. Therefore, we will use an asymptotic notation, which ignores the exact values of these constants...

  5. The Big O( ) Definition: If there is a function f(n) and constants N and c such that, for any input of length n > N, the running time of a program P is bounded as Time(P) < c·f(n), we say that P is O(f(n))
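In standard notation, the definition on this slide can be written out as follows (a direct transcription, with Time_P(n) denoting the running time of P on inputs of length n):

```latex
\exists\, c > 0,\ \exists\, N \ \text{such that}\ \forall\, n > N:\quad
\mathrm{Time}_P(n) \;<\; c \cdot f(n)
\qquad\Longrightarrow\qquad P \ \text{is}\ O(f(n))
```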

  6. The Big O( ) What does it really mean? [Graph: Time vs. n; for n > N the running time stays below c·f(n), while for n < N it may exceed it]

  7. The Big O( ) WARNING: CORRUPT NOTATION We write… g(n) = O(f(n)) even though g(n) and O(f(n)) are not equivalent.

  8. Examples
  x = 1; => O(1) (constant)
  for j := 1 to n do A[j] = 0; => O(n) (linear)
  for j := 1 to n do for k := 1 to n do A[j][k] = 0; => O(n²) (quadratic)
  for (int j = n; j != 0; j /= 2) A[j] = 0; => O(log n) (logarithmic)
  f(n): prints all strings of { a, b }* of length n => O(2ⁿ) (exponential)
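As a sketch of why the halving loop on this slide is logarithmic, the hypothetical counter below (not part of the lecture) tallies its iterations:

```java
// Counts iterations of the slide's halving loop: j takes the values
// n, n/2, n/4, ..., 1, which is about log2(n) + 1 steps, hence O(log n).
public class HalvingLoop {
    static int iterations(int n) {
        int count = 0;
        for (int j = n; j != 0; j /= 2)
            count++;
        return count;
    }

    public static void main(String[] args) {
        System.out.println(iterations(1024)); // 11 = log2(1024) + 1
        System.out.println(iterations(1));    // 1
    }
}
```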

  9. Worst Case Analysis Big O( ) is worst-case in that the real running time may be much less ( f(n) is an upper bound): Example: String s = … ; for (int j = 0; j < s.length( ); j++) if (s.charAt(j) == 'a') break;

  10. Worst Case Analysis Big O( ) is worst-case in that the real running time may be much less ( f(n) is an upper bound): Example: String s = … ; for (int j = 0; j < s.length( ); j++) if (s.charAt(j) == 'a') break; Time = O(n)
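The scan from this slide can be made runnable; the iteration counter is a hypothetical addition used only to show the gap between best and worst case:

```java
// A minimal runnable sketch of the slide's scan: the loop may stop at the
// first 'a', so the actual running time varies, but if no 'a' occurs it
// scans all n characters -- the worst case is O(n).
public class ScanDemo {
    // Returns how many loop iterations the scan performs (helper added
    // for illustration; not part of the lecture's code).
    static int iterations(String s) {
        int count = 0;
        for (int j = 0; j < s.length(); j++) {
            count++;
            if (s.charAt(j) == 'a') break; // best case: stops immediately
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(iterations("abcde")); // 1 (best case)
        System.out.println(iterations("xyzzy")); // 5 (worst case: whole string)
    }
}
```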

  11. Best-case Analysis • We may choose to analyze the least time the program could take to run • This is called big-Ω notation • If P is O( f(n) ) and Ω( f(n) ), we say: P is Θ( f(n) )

  12. Intuitively... O, Θ, and Ω do for functions what less-than-or-equal, equal, and greater-than-or-equal do for numbers: f(x) = O( g(x) ) is like (i ≤ j); f(x) = Θ( g(x) ) is like (i = j); f(x) = Ω( g(x) ) is like (i ≥ j)

  13. A little more notation Lower-case letters act like the corresponding strict inequalities (<, >), i.e. it is known that f(x) ≠ Θ( g(x) ): f(x) = o( g(x) ) “little-oh”; f(x) = ω( g(x) ) “little-omega”

  14. Some Things to Note 1. O( ) is a bound, so: • If P = O( 1 ), it is also true that P = O( n ) • If P = O( nᵏ ), it is also true that P = O( nʲ ) for j > k 2. If P = O( f(n) + g(n) ) and f(n) = O( g(n) ), then P = O( g(n) ) Example: P = O( x² + x ) => P = O( x² )
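The example on this slide can be checked directly against the definition of O( ):

```latex
x^2 + x \;\le\; x^2 + x^2 \;=\; 2x^2 \qquad (x \ge 1)
```

so taking c = 2 and N = 1 in the definition gives x² + x = O(x²).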

  15. More examples: What about adding two numbers?? 1) In what parameter do we do the analysis? 2) O, Θ, and Ω?

  16. More examples: What about adding two numbers?? Let d be the number of digits in each number (assume the same length):
      a_d  a_(d-1)  a_(d-2)  ...  a_i  ...  a_3  a_2  a_1
  +   b_d  b_(d-1)  b_(d-2)  ...  b_i  ...  b_3  b_2  b_1
  = c_(d+1)  c_d  c_(d-1)  c_(d-2)  ...  c_i  ...  c_3  c_2  c_1

  17. More examples: What about adding two numbers?? (The same digit-by-digit addition as on slide 16.) We do exactly one (primitive) addition for each of d digits => Θ( d )
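The digit-by-digit addition on these slides can be sketched in Java (the array representation and helper name are assumptions for illustration):

```java
// A minimal sketch of the slide's addition: one primitive addition per
// digit, so the running time is Theta(d) in the number of digits d.
public class DigitAdd {
    // a and b each hold d decimal digits, least-significant digit first;
    // the result c has d + 1 digits (the top one is the final carry).
    static int[] add(int[] a, int[] b) {
        int d = a.length;
        int[] c = new int[d + 1];
        int carry = 0;
        for (int i = 0; i < d; i++) {      // exactly d iterations
            int sum = a[i] + b[i] + carry; // one primitive addition per digit
            c[i] = sum % 10;
            carry = sum / 10;
        }
        c[d] = carry;
        return c;
    }

    public static void main(String[] args) {
        // 47 + 85 = 132, stored least-significant first: {2, 3, 1}
        int[] c = add(new int[]{7, 4}, new int[]{5, 8});
        System.out.println(c[0] + " " + c[1] + " " + c[2]); // 2 3 1
    }
}
```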

  18. The parameter is important! Let’s say we did the analysis on the number itself rather than how many digits it contains… … is it still linear ???

  19. The parameter is important! Let’s say we did the analysis on the number itself rather than how many digits it contains… … is it still linear ??? NO! If the number is n, then d ≈ log n, so O( d ) = O( log n )

  20. So what is O(1) in Java • Primitive math operations (i.e. +, -, *, / on ints, doubles, etc.) • Accessing simple variables (and data members) • Accessing an array element, A[i]

  21. So what is not O(1) in Java • Method calls usually aren’t: it depends on the body of the method • This includes Java library calls, like those in java.lang.Math • Loops of any kind

  22. Another example: Exponentiation What is the order of growth of, and can we do better than:
  Function exp(m, n) ::= {
      result := 1
      while (n > 0)
          result := result * m
          n := n - 1
  }

  23. Another example: Exponentiation What is the order of growth, and can we do better? Answer #1: O( n ) Answer #2: yes…

  24. Better Exponentiation Observe: We can rewrite exponentiations like 5¹³ = 5 · (5²)² · ((5²)²)² This takes only five multiplications (three squarings plus two products) instead of twelve

  25. Better Exponentiation
  Function exp(m, n) ::= {
      result := 1
      while (n > 0)
          if (n is even)
              m := m²
              n := n / 2
          else
              result := result * m
              n := n - 1
  }
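The pseudocode on this slide translates to runnable Java as follows; the multiplication counter is an added illustration (not part of the lecture) to make the O(log n) behavior visible:

```java
// A Java sketch of the slide's square-and-multiply exponentiation loop.
public class FastExp {
    static long mults = 0;              // counts primitive multiplications

    static long exp(long m, long n) {
        long result = 1;
        while (n > 0) {
            if (n % 2 == 0) {           // n even: square the base, halve n
                m = m * m;
                n = n / 2;
            } else {                    // n odd: fold one factor into result
                result = result * m;
                n = n - 1;
            }
            mults++;                    // one multiplication either way
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(exp(5, 13)); // 1220703125
        System.out.println(mults);      // 6 multiplications, vs. 13 naively
    }
}
```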

  26. Order of Growth?? Best case: we divide n by 2 on every iteration until n = 1 => Ω( log n ) iterations. Worst case: if we’re forced into the other branch (n odd), n will be even the next time, so at most: 2 log n = O( log n ) iterations. Conclusion: Θ( log n )

  27. Any questions?
