
Complexity



Presentation Transcript


  1. Complexity Lecture 17-19 Ref. Handout p 66 - 77

  2. Measuring Program Performance How can one algorithm be ‘better’ than another? • Time taken to execute? • Size of code? • Space used when running? • Scaling properties?

  3. Scaling [Graph: computation time vs. amount of data]

  4. Time and Counting initialise; for (i = 1; i <= n; i++) { do something }; finish. Time needed: t1 to initialise, t2 to set up the loop, t4 for each iteration, t5 to finish. The total ‘cost’ is t1 + t2 + n∗t4 + t5. Instead of actual time, each ti can be treated as how many instructions are needed.
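The counting idea on this slide can be sketched in runnable Python (the function and its default `t` values are mine, purely illustrative): instead of timing the loop, count the "instructions" it executes.

```python
# Sketch of the slide's counting idea: count instructions rather than time.
# The cost components t1, t2, t4, t5 are taken from the slide; defaults of 1
# are an assumption for illustration.

def loop_cost(n, t1=1, t2=1, t4=1, t5=1):
    """Return the instruction count t1 + t2 + n*t4 + t5 for the slide's loop."""
    count = t1          # initialise
    count += t2         # set up loop
    for _ in range(n):
        count += t4     # each iteration: "do something"
    count += t5         # finish
    return count

# Growing n by 10 grows the cost by 10*t4: the cost is linear in n.
print(loop_cost(10))   # 13 with all t_i = 1
print(loop_cost(20))   # 23
```

Doubling the data roughly doubles the count, which is exactly the linear cost function f(x) = c + m∗x derived on the next slide.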

  5. The computation cost can be estimated as a function f(x) • In the above example, cost is f(x) = t1 + t2 + x∗t4 + t5 = c + m∗x (where c = t1 + t2 + t5 and m = t4). [Graph: f(x) = c + m∗x, a linear function with slope m and intercept c; x = size of data]

  6. The computation cost can be estimated as a function f(x) – more examples Cost functions can be of any type, depending on the algorithm, e.g. f(x) = c (c a constant – cost stays the same), h(x) = log(x) (cost grows very slowly), or g(x) = x² (quadratic growth). [Graph: the curves f(x) = c, h(x) = log(x), and g(x) = x² against x (size of data)]

  7. The Time Complexity • A cost function for an algorithm indicates how computation time grows when the amount of data increases • A special term for it – TIME COMPLEXITY (complexity = time complexity / space complexity; space complexity studies memory usage, but mostly we pay attention to time complexity) • “Algorithm A’s time complexity is linear” = A’s computation time is proportional to the size of A’s input data

  8. The Big-O Notation • How do we compare (evaluate) two cost functions? • Is n + 100∗n² better than 1000∗n²? • It has been agreed to use a standard measurement – an order-of-magnitude notation – the Big-O notation

  9. The Big-O Notation - definition • A function f(x) is said to be big-O of a function g(x) if there is an integer N and a constant C such that f(x) ≤ C∗g(x) for all x ≥ N. We write f(x) = O(g(x))

  10. The Big-O Notation - Examples f(x) = 4 + 7∗x, g(x) = x: take C = 8, N = 4; then f(x) ≤ 8∗g(x) for all x ≥ 4, so 4 + 7∗x = O(x). f(x) = 3∗x, g(x) = x: take C = 3, N = 0; then f(x) ≤ 3∗g(x), so 3∗x = O(x). [Graph: f(x) = 4 + 7x lies below 8∗g(x) for all x beyond N]

  11. The Big-O Notation - Examples f(x) = x + 100∗x², g(x) = x²: take C = 101, N = 1; when x > 1, x + 100∗x² < x² + 100∗x² = 101∗g(x), so x + 100∗x² = O(x²). f(x) = 20 + x + 5∗x² + 8∗x³, g(x) = x³: take C = 15, N = 20; when x > 20, 20 + x + 5∗x² + 8∗x³ < x³ + x³ + 5∗x³ + 8∗x³ = 15∗g(x), so 20 + x + 5∗x² + 8∗x³ = O(x³).

  12. Power of n A smaller power is always better eventually: 1000000000 ∗ n < n² for large enough n. The biggest power counts most: 10∗n⁴ + 23∗n³ + 99∗n² + 200∗n + 77 < 11∗n⁴ for large enough n.

  13. Using Big-O Notation for classifying algorithms 23∗n³ + 11, 1000000000000000∗n³ + 100000000∗n² + ..., or even 100∗n² + n – all the cost functions above are O(n³): performance is no worse than C∗n³ eventually.

  14. Working Out Big-O Ignore constants in multiplication – 10∗n is O(n). Keep only the biggest term in sums – n² + 100∗n + 2 is O(n²). Leave products as they are – n∗log(n) is O(n∗log(n)).

  15. Memo for In-class test 17 [ /5]

  16. Some Typical Algorithms - Search Search an array of size n for a particular item • Sequential search: time taken is O( ) • Binary search: time taken is O( )
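The two searches can be sketched in Python (function names are mine); the complexities noted in the comments are the ones given later on the summary slide.

```python
# Minimal sketches of the two searches from the slide.

def sequential_search(items, target):
    """Scan every element in turn: O(n) time (see the summary slide)."""
    for i, x in enumerate(items):
        if x == target:
            return i
    return -1

def binary_search(sorted_items, target):
    """Halve the search range each step: O(log(n)) on a sorted list."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

data = [1, 3, 5, 7, 9, 11]
print(sequential_search(data, 7))  # 3
print(binary_search(data, 7))      # 3
```

Note that binary search only earns its O(log(n)) bound because the list is already sorted; on unsorted data only the sequential scan is correct.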

  17. Some Typical Algorithms - Sorting Insertion sort: use 2 lists – unsorted and sorted. insertionSort(input_list) { unsorted = input_list (assume size = n); sorted = an empty list; loop from i = 0 to n-1 do { insert(unsorted[i], sorted) }; return sorted } Time taken is O( )
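A runnable Python version of the slide's pseudocode might look like this (the `insert` helper is my own filling-in of the unspecified insert step):

```python
# Insertion sort following the slide: move items one at a time from the
# unsorted list into their correct place in the sorted list.

def insert(item, sorted_list):
    """Insert item at its correct position in an already-sorted list.
    Finding the position can scan the whole list: O(n) per insert,
    hence O(n^2) overall in the worst case."""
    i = 0
    while i < len(sorted_list) and sorted_list[i] < item:
        i += 1
    sorted_list.insert(i, item)

def insertion_sort(input_list):
    unsorted = input_list        # assume size n
    sorted_list = []             # an empty list
    for i in range(len(unsorted)):
        insert(unsorted[i], sorted_list)
    return sorted_list

print(insertion_sort([3, 5, 1, 2, 6]))  # [1, 2, 3, 5, 6]
```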

  18. Some Typical Algorithms - Sorting Bubble sort: swap if xᵢ < xᵢ₋₁, for all adjacent pairs; do this n times (n = length of the list). Example list: x0 = 3, x1 = 5, x2 = 1, x3 = 2, x4 = 6. Time taken is O( )
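The slide's bubble sort, written out as a Python sketch (the function name is mine): n passes, each comparing every adjacent pair.

```python
# Bubble sort as described on the slide: n passes over the list,
# swapping any adjacent pair that is out of order. Two nested loops
# of length n give the O(n^2) cost quoted on the summary slide.

def bubble_sort(xs):
    xs = list(xs)                  # work on a copy
    n = len(xs)
    for _ in range(n):             # do this n times
        for i in range(1, n):
            if xs[i] < xs[i - 1]:  # swap if x_i < x_(i-1)
                xs[i - 1], xs[i] = xs[i], xs[i - 1]
    return xs

print(bubble_sort([3, 5, 1, 2, 6]))  # [1, 2, 3, 5, 6]
```

On an already-sorted list a pass makes no swaps, which is why a version that stops after a swap-free pass achieves the O(n) best case mentioned on slide 20.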

  19. Some Typical Algorithms - Sorting Quick sort quickSort(list) { if list has zero or one number, return list; randomly pick a pivot xᵢ to create two partitions, smaller than xᵢ and larger than xᵢ; return quickSort(SmallerList) + xᵢ + quickSort(LargerList) } // ‘+’ means appending two lists into one. Time taken is O( )
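A Python rendering of the slide's quickSort pseudocode (one small addition of mine: duplicates of the pivot are kept in a middle list so repeated values are handled):

```python
import random

# Quick sort following the slide: pick a random pivot, partition into
# smaller and larger lists, sort each recursively, then append.
# '+' below appends lists, as on the slide.

def quick_sort(lst):
    if len(lst) <= 1:                       # zero or one number: done
        return lst
    pivot = random.choice(lst)              # randomly pick a pivot
    smaller = [x for x in lst if x < pivot]
    equal   = [x for x in lst if x == pivot]
    larger  = [x for x in lst if x > pivot]
    return quick_sort(smaller) + equal + quick_sort(larger)

print(quick_sort([3, 5, 1, 2, 6]))  # [1, 2, 3, 5, 6]
```

A balanced random pivot halves the list each level, giving the O(n∗log(n)) typical cost; a consistently bad pivot degrades this to the O(n²) worst case noted on slide 20.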

  20. Worst Cases, Best Cases, and Average Cases • Complexity: performance on the average case • Worst/best case: the worst (best) performance which an algorithm can get • E.g. 1 – bubble sort and insertion sort can get O(n) in the best case (when the list is already ordered) • E.g. 2 – quick sort can get O(n²) in the worst case (when the partition doesn’t work well)

  21. Sorting a sorted list Example: 1 2 3 4 5 6 7 8 9

  22. A Summary on Typical Algorithms • Sequential search: O(n) • Binary search (sorted list): O(log(n)) • Bubble sort: O(n²), best O(n), worst O(n²) • Insertion sort: O(n²), best O(n), worst O(n²) • Quick sort: O(log(n)∗n), best O(log(n)∗n), worst O(n²)

  23. Can you order these functions? (smaller comes first) log(n), √n, log(n)∗n, n^(4/3), (log(n))², n², 2ⁿ, n
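One way to build intuition for this quiz is to evaluate every function at a single large n and compare the values; a sketch (names and the choice n = 10⁶ are mine). This only illustrates the asymptotic order, it doesn't prove it, and n must be large enough: at n = 1000, √n is still smaller than (log(n))², even though (log(n))² is eventually the smaller of the two.

```python
import math

# Evaluate each function from the slide at n = 10**6 and sort by value.
n = 10**6
funcs = {
    "log(n)":     math.log(n),
    "(log(n))^2": math.log(n) ** 2,
    "sqrt(n)":    math.sqrt(n),
    "n":          float(n),
    "n*log(n)":   n * math.log(n),
    "n^(4/3)":    float(n) ** (4 / 3),
    "n^2":        float(n) ** 2,
    # 2**(10**6) overflows a float; it dwarfs everything else anyway.
    "2^n":        math.inf,
}
for name, value in sorted(funcs.items(), key=lambda kv: kv[1]):
    print(f"{name:12s} {value:.4g}")
```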

  24. Memo for In-class test 18 [ /5]

  25. Speeding up the Machine [Table: for each complexity class, how the solvable problem size grows when a machine that can solve a problem of size N in one day is sped up]

  26. Code Breaking A number is the product of two primes. How to work out the prime factors? E.g. 35 = 7∗5, 5959 = 59∗101, 48560209712519 = 6850049 ∗ 7089031. Complexity is O(10ⁿ) where n is the number of digits (or O(s) where s is the number). Typically n is about 150.
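A brute-force factoring sketch (my own, for illustration): trial division tries divisors up to √s, which for an n-digit number is on the order of 10^(n/2) steps; still exponential in the number of digits, in line with the slide's point.

```python
# Trial-division factoring of a product of two primes.
# For an n-digit number s this loop runs up to sqrt(s) ~ 10**(n/2) times,
# so each extra digit multiplies the work by about sqrt(10).

def factor_semiprime(s):
    """Return (p, q) with p * q == s, trying divisors 2, 3, 4, ..."""
    d = 2
    while d * d <= s:
        if s % d == 0:
            return d, s // d
        d += 1
    return s, 1  # s itself is prime

print(factor_semiprime(35))    # (5, 7)
print(factor_semiprime(5959))  # (59, 101)
```

At the slide's typical size of about 150 digits, this brute force is hopeless, which is exactly the point of the next slide on the cost of each extra digit.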

  27. Cost of Speeding Up For each additional digit, multiply the time by 10 (as the complexity is O(10ⁿ)). For 10 extra digits, the time increases from 1 second to 317 years. For 18 extra digits, the time increases from 1 second to twice the age of the universe.

  28. Where Different Formulas Appear • O(log(n)): find an item in an ordered collection • O(n): look at every item • O(n²): for every item, look at every other item • O(2ⁿ): look at every subset of the items

  29. Do Big-O Times Matter? Is n³ worse than 100000000000000000∗n? Algorithm A is O(n²) for 99.999% of cases and O(2ⁿ) for 0.001% of cases. Algorithm B is always O(n¹⁰). Which one is better?

  30. Degree of Difficulty Polynomial time – ‘tractable’: log(n), n, n², n³, ..., n∗log(n). Exponential time – ‘intractable’: 2ⁿ, 3ⁿ, ..., nⁿ, n^(2ⁿ).

  31. A Simple O(2n) Problem Given n objects of different weights, split them into two equally heavy piles (or say if this isn’t possible)
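The O(2ⁿ) cost comes from examining every subset. A brute-force sketch of this (my own; the slide only states the problem):

```python
from itertools import combinations

# Brute force for the slide's problem: try every subset of the objects
# (there are 2**n of them) looking for one that holds half the total weight.

def split_into_equal_piles(weights):
    """Return (pile1, pile2) of equal total weight, or None if impossible."""
    total = sum(weights)
    if total % 2 != 0:
        return None                       # odd total: no equal split exists
    half = total // 2
    items = list(weights)
    for r in range(len(items) + 1):
        for subset in combinations(range(len(items)), r):
            if sum(items[i] for i in subset) == half:
                pile1 = [items[i] for i in subset]
                pile2 = [items[i] for i in range(len(items))
                         if i not in subset]
                return pile1, pile2
    return None

print(split_into_equal_piles([1, 2, 3, 5, 7, 8]))  # one valid equal split
```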

  32. Hard Problems Made Easy [Diagram: data for problem A → magic machine (for problem A) → possible answer → answer checker → correct answer?]

  33. NP Problems A problem is in the class NP (Non-deterministic Polynomial time) if the checking requires polynomial time, i.e. O(nᶜ) where c is a fixed number. Note: NP is a class of problems, not algorithms.

  34. Example Given n objects of different weights, split them into two equally heavy piles. Finding a split by the algorithm takes O(2ⁿ). Checking a proposed split for {1, 2, 3, 5, 7, 8}, e.g. 1 + 2 + 3 + 7 = 8 + 5 ✔, takes O(n).
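The checking step can be made concrete with a short sketch (function name is mine). The weight comparison is a single O(n) pass; verifying that the two piles really use exactly the given objects is done here by sorting, which strictly costs O(n∗log(n)), but the point stands: checking is polynomial.

```python
# Verifying a proposed split: sum each pile once and compare, and check
# that the two piles together are exactly the original multiset of objects.

def check_split(pile1, pile2, objects):
    """True iff pile1 and pile2 partition objects and weigh the same."""
    return (sorted(pile1 + pile2) == sorted(objects)
            and sum(pile1) == sum(pile2))

objects = [1, 2, 3, 5, 7, 8]
print(check_split([1, 2, 3, 7], [8, 5], objects))  # True  (13 == 13)
print(check_split([1, 2, 3], [5, 7, 8], objects))  # False (6 != 20)
```

This is the asymmetry the NP definition captures: finding a split may take O(2ⁿ), but checking a proposed one is cheap.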

  35. In the Previous Example ... • We know there is an algorithm that takes O(2ⁿ) But ... • How do we know there isn’t a better algorithm which only takes polynomial time?

  36. Some Classes of Problems NP: problems which can be solved in polynomial time using a magic box. P: problems which can be solved in polynomial time. [Diagram: P drawn inside NP]

  37. Problems in NP but not in P P is a subset of NP, but is NP a subset of P, i.e. does P = NP? Most problems for which the best known algorithm is O(2ⁿ) are in NP, but we don’t know whether they lie outside P. [Diagram: P inside NP, with the region between them marked ‘?’]

  38. The Hardest NP Problems If a polynomial time algorithm is found for any problem in NP-Complete, then every problem in NP can be solved in polynomial time, i.e. P = NP. [Diagram: NP-Complete problems inside NP, outside P]

  39. Examples of NP-Complete Problems • The travelling salesman problem • Finding the shortest common superstring • Checking whether two finite automata accept the same language • Given three positive integers a, b, and c, do there exist positive integers x and y such that a∗x² + b∗y² = c?

  40. How to be Famous Find a polynomial time algorithm for this problem Given n objects of different weights, split them into two equally heavy piles (or say if this isn’t possible)

  41. Memo for In-class test 19 [ /5]
