
Lab of CS103 Data Structure and C++




Presentation Transcript


  1. Lab of CS103 Data Structure and C++ by Han Young Ryoo, Joseph Gomes

  2. Big-O Notation • Definition: T(n) is O(f(n)) iff (if and only if) there exist positive constants c and n0 such that T(n) <= c f(n), for all n >= n0

  3. Examples • T(n) = 3n + 2 is O(n) since 3n + 2 <= 4n for all n >= 2 where c=4 and n0 = 2. • T(n) = 10n^2 + 4n + 2 is O(n^2) since 10n^2 + 4n + 2 <= 11n^2 for all n >= 5 where c=11 and n0 = 5.
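As a quick check (a minimal sketch, not part of the original slides), the second bound can be verified numerically for a range of n; the limit of 1,000,000 is arbitrary:

    #include <iostream>

    // Check 10n^2 + 4n + 2 <= 11n^2 for n >= 5, up to an arbitrary limit.
    int main() {
        for (long long n = 5; n <= 1000000; n++) {
            long long T = 10 * n * n + 4 * n + 2;   // T(n)
            long long bound = 11 * n * n;           // c * f(n) with c = 11
            if (T > bound) {
                std::cout << "bound fails at n = " << n << "\n";
                return 1;
            }
        }
        std::cout << "10n^2 + 4n + 2 <= 11n^2 held for every tested n >= 5\n";
        return 0;
    }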

  4. Why do we need Big-O • To decide whether an algorithm is adequate for a problem • To know when to look for a better algorithm rather than a better implementation: an algorithm with poor complexity will always be too slow on a big enough input, no matter how well it is implemented • Quicksort vs. bubble sort (O(n log n) vs. O(n^2)): quicksort running on a small desktop computer can beat bubble sort running on a supercomputer if there are a lot of numbers to sort. To sort 1,000,000 numbers, quicksort takes about 20,000,000 steps on average, while bubble sort takes about 1,000,000,000,000 steps!
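The claim can be tested in practice. The following sketch (not from the original slides) times std::sort, which is O(n log n) on average, against a simple O(n^2) bubble sort on the same random data; the array size is kept far below 1,000,000 only so the bubble sort finishes in reasonable time.

    #include <algorithm>
    #include <chrono>
    #include <cstddef>
    #include <cstdio>
    #include <random>
    #include <utility>
    #include <vector>

    // Simple O(n^2) bubble sort for comparison.
    void bubbleSort(std::vector<int>& a) {
        for (std::size_t i = a.size(); i > 1; i--)
            for (std::size_t j = 0; j + 1 < i; j++)
                if (a[j] > a[j + 1]) std::swap(a[j], a[j + 1]);
    }

    int main() {
        std::mt19937 gen(42);
        std::vector<int> data(20000);              // 20,000 keys keeps the bubble sort run short
        for (int& x : data) x = (int)(gen() % 1000000);

        std::vector<int> a = data, b = data;       // sort identical copies

        auto t0 = std::chrono::steady_clock::now();
        std::sort(a.begin(), a.end());             // O(n log n) on average
        auto t1 = std::chrono::steady_clock::now();
        bubbleSort(b);                             // O(n^2)
        auto t2 = std::chrono::steady_clock::now();

        using ms = std::chrono::milliseconds;
        std::printf("std::sort:   %lld ms\n", (long long)std::chrono::duration_cast<ms>(t1 - t0).count());
        std::printf("bubble sort: %lld ms\n", (long long)std::chrono::duration_cast<ms>(t2 - t1).count());
        return 0;
    }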

  5. How to determine • Sequence of statements: statement 1; statement 2; ... statement k; The total time is found by adding the times for all statements: total time = time(statement 1) + time(statement 2) + ... + time(statement k)
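For instance (a minimal sketch, not from the original slides), the total cost of the function below is just the sum of its per-statement costs:

    #include <cstddef>
    #include <vector>

    long long sequenceExample(const std::vector<int>& data) {
        long long sum = 0;                      // statement 1: O(1)
        std::size_t n = data.size();            // statement 2: O(1)
        for (std::size_t i = 0; i < n; i++)     // statement 3: O(n)
            sum += data[i];
        return sum;                             // statement 4: O(1)
    }
    // total time = O(1) + O(1) + O(n) + O(1) = O(n)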

  6. Rules for using big-O • Ignoring constant factors O(c f(N)) = O(f(N)), where c is a constant Example) O(20 N^3) = O(N^3)

  7. Rules for using big-O • Ignoring smaller terms If a<b then O(a+b) = O(b) Example) O(N^2+N) = O(N^2)

  8. Rules for using big-O • Upper bound only If a<b then an O(a) algorithm is also an O(b) algorithm. Example) an O(N) algorithm is also an O(N^2) algorithm (but not vice versa).

  9. Rules for using big-O • N and log N are "bigger" than any constant, from an asymptotic view (that means for large enough N). So if k is a constant, an O(N + k) algorithm is also O(N), by ignoring smaller terms. Similarly, an O(log N + k) algorithm is also O(log N).
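A familiar O(log N) algorithm is binary search on a sorted array. The sketch below (not from the original slides) halves the search range on every pass, and its constant-time setup is absorbed by the rule above, O(log N + k) = O(log N):

    #include <vector>

    // Binary search on a sorted vector; returns the index of target or -1.
    int binarySearch(const std::vector<int>& a, int target) {
        int lo = 0;                          // constant-time setup
        int hi = (int)a.size() - 1;
        while (lo <= hi) {                   // runs about log2(N) times
            int mid = lo + (hi - lo) / 2;    // written this way to avoid overflow of lo + hi
            if (a[mid] == target) return mid;
            if (a[mid] < target) lo = mid + 1;
            else                 hi = mid - 1;
        }
        return -1;                           // not found
    }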

  10. Rules for using big-O • An O(N log N + N) algorithm, which is O(N(log N + 1)), can be simplified to O(N log N).

  11. Simple Statements • If each statement is "simple" (only involves basic operations) then the time for each statement is constant and the total time is also constant: O(1).

  12. if-then-else statements • if (condition) { sequence of statements 1 } else { sequence of statements 2 } • Either sequence 1 will execute, or sequence 2 will execute. • The worst-case time is the slower of the two possibilities: max(time(sequence 1), time(sequence 2)). • If sequence 1 is O(N) and sequence 2 is O(1), what is the worst-case time for the whole if-then-else statement?
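A concrete illustration (a minimal sketch, not from the original slides): sequence 1 below is an O(N) loop and sequence 2 is a single O(1) statement, so the worst case for the whole statement is max(time(sequence 1), time(sequence 2)):

    #include <vector>

    long long branchExample(bool summarize, const std::vector<int>& data) {
        if (summarize) {
            long long sum = 0;                       // sequence 1: O(N) loop
            for (int x : data)
                sum += x;
            return sum;
        } else {
            return data.empty() ? 0 : data.front();  // sequence 2: O(1)
        }
    }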

  13. loops • for (i = 0; i < N; i++) { sequence of statements } • The loop executes N times, so the sequence of statements also executes N times. • If the statements are O(1), what is the total time for the for loop?
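For example (a minimal sketch, not from the original slides), the loop body below is constant time and it runs once per iteration, i.e. N times in total:

    // Fills out[0..N-1]; the O(1) body executes N times.
    void fillSquares(int* out, int N) {
        for (int i = 0; i < N; i++) {
            out[i] = i * i;                 // O(1) body
        }
    }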

  14. Nested loops • for (i = 0; i < N; i++) { for (j = 0; j < M; j++) { sequence of statements } } • The complexity is O(N * M). • In a common special case the stopping condition of the inner loop is j < N instead of j < M (i.e., the inner loop also executes N times); the total complexity for the two loops is then O(N^2).
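For instance (a minimal sketch, not from the original slides), summing an N-by-M matrix executes the innermost statement N * M times:

    #include <vector>

    long long sumMatrix(const std::vector<std::vector<int>>& m) {
        long long total = 0;
        for (const std::vector<int>& row : m)   // N rows
            for (int value : row)               // M entries per row
                total += value;                 // executed N * M times
        return total;
    }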

  15. Function/Method calls • When a statement involves a method call, the complexity of the statement includes the complexity of the method call. Assume that you know that method f takes constant time, and that method g takes time proportional to (linear in) the value of its parameter k. Then the statements below have the time complexities indicated. f(k); // O(1) g(k); // O(k) • Example: for (j = 0; j < N; j++) { g(N);} What is the complexity?
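One possible pair of such methods (hypothetical definitions, not from the original slides; only their costs match the assumptions above) is sketched below, so the cost of each call site can be read off directly:

    #include <cstdio>

    void f(int k) {
        std::printf("f called with %d\n", k);   // O(1): a fixed number of operations
    }

    void g(int k) {
        long long sum = 0;
        for (int i = 0; i < k; i++)             // O(k): work grows linearly with k
            sum += i;
        std::printf("g(%d) computed %lld\n", k, sum);
    }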

  16. A program that calculates execution time in milliseconds
      #include <sys/timeb.h>
      #include <stdio.h>
      int main(int argc, char* argv[]) {
          struct _timeb time_start, time_end;
          _ftime(&time_start);
          for (int i = 1; i <= 10000; i++) {
              unsigned m = 1;   // unsigned: repeated doubling would overflow a signed int
              for (int j = 1; j < 100000; j++) {
                  m *= 2;
              }
          }
          _ftime(&time_end);
          // include the seconds field; subtracting only millitm can give a wrong (even negative) result
          long elapsed = (long)(time_end.time - time_start.time) * 1000
                       + (time_end.millitm - time_start.millitm);
          printf("elapsed time = %ld milliseconds\n", elapsed);
          return 0;
      }
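The _ftime interface above is Windows-specific. A portable alternative (a sketch, not part of the original slides) measures the same loop with std::chrono from C++11:

    #include <chrono>
    #include <cstdio>

    int main() {
        auto start = std::chrono::steady_clock::now();

        for (int i = 1; i <= 10000; i++) {
            volatile unsigned m = 1;            // volatile so the compiler cannot optimize the work away
            for (int j = 1; j < 100000; j++)
                m *= 2;
        }

        auto end = std::chrono::steady_clock::now();
        auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(end - start);
        std::printf("elapsed time = %lld milliseconds\n", (long long)ms.count());
        return 0;
    }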

  17. Exercises • Two loops in a row: for (i = 0; i < N; i++) { sequence of statements } for (j = 0; j < M; j++) { sequence of statements } How would the complexity change if the second loop went to N instead of M?

  18. Exercises • A nested loop followed by a non-nested loop: for (i = 0; i < N; i++) { for (j = 0; j < N; j++) { sequence of statements } } for (k = 0; k < N; k++) { sequence of statements }

  19. Exercises • A nested loop with dependent loop index: for (i = 0; i < N; i++) { for (j = i; j < N; j++) { sequence of statements } } • A nested loop in which the number of times the inner loop executes depends on the value of the outer loop index

  20. Quiz • for (i = 0; i < N; i++) { for (j = 0; j < N; j++) { C[i][j] = 0; for (k = 0; k < N; k++) { C[i][j] = C[i][j] + A[i][k] * B[k][j]; } } }

  21. Quiz • Sorting for (i = N-1; i > 0; i--) for (j = 0; j < i; j++) if (a[j] > a[j+1]) swap a[j] and a[j+1];
