
Discrete Math and Its Application to Computer Science




Presentation Transcript


  1. Discrete Math and Its Application to Computer Science UBİ 501 Lecture - 3 İlker Kocabaş E.Ü Uluslararası Bilgisayar Enstitüsü Bornova - İzmir

  2. Flow ALGORITHMS Introduction Algorithmic Complexity Growing Functions NUMBER THEORY Modular Arithmetic Primary Numbers Greatest Common Divisor (gcd) & Least Common Multipier (lcd) Ecludian Algorithm for gcd Number Systems: Decimal, Binary, Octal, ….

  3. Algorithms: Introduction (1) • Algorithm: • A finite set of precise instructions for performing a computation or for solving a problem. • Synonyms for an algorithm are: program, recipe, procedure, and many others. • Pseudocode (T: Sözde Kod) • Describing an algorithm in a specific computer language requires complex instructions and is difficult to understand. • Pseudocode is an intermediate step between natural language and a programming language.

  4. Algorithms (1): Pseudocode Example • Algorithm 1: Finding the maximum element in a finite sequence
procedure max(a1, a2, …, an: integers)    (INPUT)
  max := a1
  for i := 2 to n
    if max < ai then max := ai
  output max                              (OUTPUT)
(Each step is precise and unambiguous: DEFINITENESS.)
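A minimal runnable Python version of Algorithm 1, for readers who want to try it; the function name find_max and the sample list are illustrative, not from the slides:

```python
def find_max(a):
    """Return the largest element of a non-empty sequence of integers."""
    maximum = a[0]                 # max := a1
    for x in a[1:]:                # for i := 2 to n
        if maximum < x:            # if max < ai then max := ai
            maximum = x
    return maximum                 # output max

print(find_max([3, 7, 2, 9, 4]))   # prints 9
```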

  5. Algorithms: Basic Problems in CS • Searching (T: Arama) Algorithms • Find the element ai that equals x • Linear Search, Binary Search, … • Algorithm 2: Linear Search Algorithm
procedure linear search(x: integer, a1, a2, …, an: distinct integers)
  i := 1
  while (i ≤ n and x ≠ ai)
    i := i + 1
  if i ≤ n then location := i
  else location := 0
  output location
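A hedged Python sketch of Algorithm 2; the name linear_search, the 0-based list indexing, and the 1-based returned location are illustrative choices:

```python
def linear_search(x, a):
    """Return the 1-based position of x in a, or 0 if x does not occur."""
    i = 1
    while i <= len(a) and x != a[i - 1]:   # while (i <= n and x != ai)
        i += 1                             # i := i + 1
    return i if i <= len(a) else 0         # location := i, or 0 if not found

print(linear_search(5, [1, 3, 5, 7]))      # prints 3
print(linear_search(8, [1, 3, 5, 7]))      # prints 0
```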

  6. Algorithms (1): Basic Problems in CS • Linear Search Example • Find x = 5
while (i ≤ n and x ≠ ai)
  i := i + 1
if i ≤ n then location := i
else location := 0

  7. Algorithms (1): Basic Problems in CS • Sorting (T: Sıralama) Algorithms • Sort a sequence A according to a given order criterion • Bubble Sort, Insertion Sort, …
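Since Bubble Sort is named above, here is a minimal Python sketch of it (the in-place list version and the sample input are illustrative):

```python
def bubble_sort(a):
    """Sort a list in place by repeatedly swapping adjacent out-of-order pairs."""
    n = len(a)
    for i in range(n - 1):                 # after pass i, the last i+1 slots hold their final values
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

print(bubble_sort([5, 1, 4, 2, 8]))        # prints [1, 2, 4, 5, 8]
```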

  8. Algorithms (1): Basic Problems in CS • Merging (T: Birleştirme) Algorithms • Merge two ordered sequences A & B into a single ordered sequence
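A small Python sketch of merging two already-sorted sequences; the two-pointer formulation is one common way to do it, not necessarily the exact procedure the lecture has in mind:

```python
def merge(a, b):
    """Merge two sorted sequences into one sorted list."""
    result, i, j = [], 0, 0
    while i < len(a) and j < len(b):       # take the smaller front element each time
        if a[i] <= b[j]:
            result.append(a[i]); i += 1
        else:
            result.append(b[j]); j += 1
    result.extend(a[i:])                   # at most one of these tails is non-empty
    result.extend(b[j:])
    return result

print(merge([1, 4, 6], [2, 3, 5, 7]))      # prints [1, 2, 3, 4, 5, 6, 7]
```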

  9. Algorithms: Algorithmic Complexity (2) • How can the efficiency of an algorithm be analyzed? • Time: the time used by a computer to solve the problem • Space: the amount of computer memory required to implement the algorithm

  10. Algorithms (2): Running Time • Running time: • Measure the actual time spent by an implementation of the algorithm. • Deficiencies: • Actual running time changes from platform to platform (1 GHz ≠ 2 GHz) • It gives no information about how the time varies with n (the input size) or with the input order. • Better: count the basic operations or steps executed by the algorithm

  11. Algorithms (2): Running Time • Running time: • Count the basic operations or steps executed by the algorithm • Comparison (T: karşılaştırma) [e.g. X < Y] • Assignment (T: Atama) [e.g. X := 5] • Increment/Decrement [e.g. X := X ± 1] • Function output [e.g. return/output X] • Addition/Subtraction/Multiplication/Division • …

  12. Algorithms (2): Running Time • Count the basic operations or steps executed by the algorithm • Best-Case Analysis: the minimum number of operations executed over all inputs of a given size. • Average-Case Analysis: the average number of operations used to solve the problem over all inputs of a given size. • Worst-Case Analysis: the maximum number of operations executed over all inputs of a given size.
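To make the best-case/worst-case distinction concrete, here is an illustrative instrumented linear search that also reports its comparison count; the counter and the sample calls are additions, not part of the slides:

```python
def linear_search_count(x, a):
    """Linear search that also returns the number of element comparisons performed."""
    comparisons = 0
    for i, value in enumerate(a, start=1):
        comparisons += 1
        if value == x:
            return i, comparisons          # found: location and cost so far
    return 0, comparisons                  # not found: one comparison per element

print(linear_search_count(1, [1, 2, 3, 4]))   # best case:  (1, 1)
print(linear_search_count(9, [1, 2, 3, 4]))   # worst case: (0, 4)
```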

  13. Algorithms (2): Algorithm 3, Surjectivity
procedure isOnto( f: function from {1, 2, …, n} to {1, 2, …, m} )
• if ( m > n )                    1 step comp.
•   return false                  1 step, end (if executed)
• soFarIsOnto := true             1 step ass.
• for j := 1 to m                 m loops: 1 step comp. + 1 step increment
•   soFarIsOnto := false          1 step ass.
•   for i := 1 to n               n loops: 2 steps comp. + inc.
•     if ( f(i) = j )             1 step comp.
•       soFarIsOnto := true       1 step ass.
•   if ( !soFarIsOnto )           1 step negation
•     return false                1 step, end (if executed)
• return true                     1 step, end
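A minimal Python sketch of Algorithm 3, assuming the function f is supplied as a dict (or any mapping) from {1, …, n} to {1, …, m}; the name is_onto and that calling convention are illustrative:

```python
def is_onto(f, n, m):
    """Return True if f: {1,...,n} -> {1,...,m} hits every value in {1,...,m}."""
    if m > n:                      # a function onto a strictly larger set is impossible
        return False
    for j in range(1, m + 1):      # for j := 1 to m
        so_far_is_onto = False
        for i in range(1, n + 1):  # for i := 1 to n
            if f[i] == j:
                so_far_is_onto = True
        if not so_far_is_onto:     # nothing maps to j
            return False
    return True

print(is_onto({1: 1, 2: 2, 3: 1}, 3, 2))   # True
print(is_onto({1: 1, 2: 1, 3: 1}, 3, 2))   # False (2 is never hit)
```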

  14. Algorithms (2): Algorithm 3, Surjectivity • Best-Case Analysis (m > n): only the initial test and return execute
• if ( m > n )                    1 step comp.
•   return false                  1 step, end
• Worst-Case Analysis: 2 + m(5n + 3) = 5mn + 3m + 2
• if ( m > n )                    1
•   return false                  (not executed)
• soFarIsOnto := true             1
• for j := 1 to m                 m × [ 1 + 1
•   soFarIsOnto := false                + 1
•   for i := 1 to n                     + n × ( 1 + 1
•     if ( f(i) = j )                           + 1
•       soFarIsOnto := true                     + 1
•   if ( !soFarIsOnto )                         + 1 ) ]
•     return false                (not executed)
• return true                     1 step, end

  15. Algorithm (2): Comparing Running Times • At most 5mn + 3m + 2 operations for the first algorithm • At most 5m + 2n + 2 for the second algorithm • Worst case occurs when m ≈ n, so replace m by n: 5n^2 + 3n + 2 vs. 8n + 2 • To tell which is better, look at the dominant terms: 5n^2 vs. 8n • So the second algorithm is better.
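A tiny numerical comparison of the two worst-case counts just quoted (the sample sizes are arbitrary):

```python
first = lambda n: 5 * n * n + 3 * n + 2   # worst-case count of the first algorithm, with m = n
second = lambda n: 8 * n + 2              # worst-case count quoted for the second algorithm

for n in (1, 2, 10, 100):
    print(n, first(n), second(n))
# n=1: 10 vs 10; n=2: 28 vs 18; n=10: 532 vs 82; n=100: 50302 vs 802.
# The quadratic count loses badly as n grows, which is what the dominant term predicts.
```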

  16. Running Time Issues: Big-O Response • Asymptotic notation (Big-O, Big-Ω, Big-Θ) gives a partial resolution to these problems: • For large n the largest term dominates, so 5n^2 + 3n + 2 is modeled by just n^2.

  17. Running Time Issues: Big-O Response • Asymptotic notation (Big-O, Big-Ω, Big-Θ) gives a partial resolution to these problems: • Basic steps of different lengths just change 5n^2 to Cn^2 for some constant C, so they don't change the largest term.

  18. Running Time Issues: Big-O Response • Asymptotic notation (Big-O, Big-Ω, Big-Θ) gives a partial resolution to these problems: • Basic operations on different (but well-designed) platforms differ by a constant factor. Again, this changes 5n^2 to Cn^2 for some constant.

  19. Running Time Issues: Big-O Response • Asymptotic notation (Big-O, Big-Ω, Big-Θ) gives a partial resolution to these problems: • Even if we overestimate by assuming iterations of while-loops that never occur, we may still be able to show that the overestimate is only a different constant multiple of the largest term.

  20. Big-O, Big-Ω, Big-Θ • Useful for computing algorithmic complexity, i.e. the amount of time that it takes for a computer program to run.

  21. Notational Issues • Big-O notation is a way of comparing functions. The notation is unconventional: • EG: 3x^3 + 5x^2 - 9 = O(x^3) • This doesn't mean "3x^3 + 5x^2 - 9 equals the function O(x^3)" • It actually means "3x^3 + 5x^2 - 9 is dominated by x^3" • Read as: "3x^3 + 5x^2 - 9 is big-O of x^3"

  22. Intuitive Notion of Big-O • Asymptotic notation captures the behavior of functions for large values of x. • EG: The dominant term of 3x^3 + 5x^2 - 9 is x^3. • As x becomes larger and larger, the other terms become insignificant and only x^3 remains in the picture:

  23. Intuitive Notion of Big-O, domain [0, 2] • [Plot: y = 3x^3 + 5x^2 - 9, y = x^3, y = x^2, y = x]

  24. Intuitive Notion of Big-O, domain [0, 5] • [Plot: the same four curves]

  25. Intuitive Notion of Big-O, domain [0, 10] • [Plot: the same four curves]

  26. Intuitive Notion of Big-O, domain [0, 100] • [Plot: the same four curves]

  27. Intuitive Notion of Big-O • In fact, 3x^3 + 5x^2 - 9 is smaller than 5x^3 for large enough values of x: • [Plot: y = 5x^3, y = 3x^3 + 5x^2 - 9, y = x^2, y = x]

  28. Big-O: Formal Definition • f(x) is asymptotically dominated by g(x) if some constant multiple of g(x) is at least as large as f(x) as x goes to infinity: • DEF: Let f, g be functions with domain R≥0 or N and codomain R. If there are constants C and k such that for all x > k, |f(x)| ≤ C·|g(x)|, then we write f(x) = O(g(x)).

  29. Common Misunderstanding • It's true that 3x^3 + 5x^2 - 9 = O(x^3), as we'll prove shortly. However, also true are: • 3x^3 + 5x^2 - 9 = O(x^4) • x^3 = O(3x^3 + 5x^2 - 9) • sin(x) = O(x^4) • NOTE: CS usage of big-O typically involves mentioning only the most dominant term: "The running time is O(x^2.5)". Mathematically, big-O is more subtle.

  30. Big-O: Example • EG: Show that 3x^3 + 5x^2 - 9 = O(x^3). • The previous graphs suggest C = 5 is a good guess. • Find k so that 3x^3 + 5x^2 - 9 ≤ 5x^3 for x > k.

  31. EG: Show that 3x^3 + 5x^2 - 9 = O(x^3). Find k so that 3x^3 + 5x^2 - 9 ≤ 5x^3 for x > k • Collect terms: 5x^2 ≤ 2x^3 + 9

  32. EG: Show that 3x^3 + 5x^2 - 9 = O(x^3). Find k so that 3x^3 + 5x^2 - 9 ≤ 5x^3 for x > k • Collect terms: 5x^2 ≤ 2x^3 + 9 • What k will make 5x^2 ≤ x^3 for x > k?

  33. EG: Show that 3x^3 + 5x^2 - 9 = O(x^3). Find k so that 3x^3 + 5x^2 - 9 ≤ 5x^3 for x > k • Collect terms: 5x^2 ≤ 2x^3 + 9 • What k will make 5x^2 ≤ x^3 for x > k? • k = 5!

  34. EG: Show that 3x^3 + 5x^2 - 9 = O(x^3). Find k so that 3x^3 + 5x^2 - 9 ≤ 5x^3 for x > k • Collect terms: 5x^2 ≤ 2x^3 + 9 • What k will make 5x^2 ≤ x^3 for x > k? • k = 5! • So for x > 5, 5x^2 ≤ x^3 ≤ 2x^3 + 9

  35. EG: Show that 3x^3 + 5x^2 - 9 = O(x^3). Find k so that 3x^3 + 5x^2 - 9 ≤ 5x^3 for x > k • Collect terms: 5x^2 ≤ 2x^3 + 9 • What k will make 5x^2 ≤ x^3 for x > k? • k = 5! • So for x > 5, 5x^2 ≤ x^3 ≤ 2x^3 + 9 • Solution: C = 5, k = 5 (not unique!)

  36. EG: Show that 3x^3 + 5x^2 - 9 = O(x^3). Find k so that 3x^3 + 5x^2 - 9 ≤ 5x^3 for x > k • Collect terms: 5x^2 ≤ 2x^3 + 9 • What k will make 5x^2 ≤ x^3 for x > k? • k = 5! • So for x > 5, 5x^2 ≤ x^3 ≤ 2x^3 + 9 • Solution: C = 5, k = 5 (not unique!)
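A quick numeric sanity check of the witnesses C = 5, k = 5; it only samples a finite range of x, so it illustrates rather than proves the inequality:

```python
f = lambda x: 3 * x**3 + 5 * x**2 - 9
g = lambda x: x**3

C, k = 5, 5
assert all(abs(f(x)) <= C * abs(g(x)) for x in range(k + 1, 1000))
print("|f(x)| <= 5|x^3| held at every sampled x > 5")
```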

  37. Big-O: Negative Example • x^4 ≠ O(3x^3 + 5x^2 - 9): no pair C, k exists for which x > k implies C(3x^3 + 5x^2 - 9) ≥ x^4. • Argue using limits: x^4 always catches up, regardless of C.

  38. Big-O and Limits • LEMMA: If the limit as x → ∞ of the quotient |f(x) / g(x)| exists (and is finite), then f(x) = O(g(x)). • EG: 3x^3 + 5x^2 - 9 = O(x^3). • Compute: (3x^3 + 5x^2 - 9) / x^3 = 3 + 5/x - 9/x^3 → 3 as x → ∞, • …so the big-O relationship is proved.

  39. Little-o and Limits • DEF: If the limit as x → ∞ of the quotient |f(x) / g(x)| is 0, then f(x) = o(g(x)). • EG: 3x^3 + 5x^2 - 9 = o(x^3.1). • Compute: (3x^3 + 5x^2 - 9) / x^3.1 = 3/x^0.1 + 5/x^1.1 - 9/x^3.1 → 0 as x → ∞.

  40. Big-Ω and Big-Θ • Big-Ω: the reverse of big-O, i.e. f(x) = Ω(g(x)) ⟺ g(x) = O(f(x)), so f(x) asymptotically dominates g(x). • Big-Θ: domination in both directions, i.e. f(x) = Θ(g(x)) ⟺ f(x) = O(g(x)) and f(x) = Ω(g(x)) • Synonym for f = Θ(g): "f is of order g"

  41. Useful facts • Any polynomial is big-Θ of its largest term • EG: x^4/100000 + 3x^3 + 5x^2 - 9 = Θ(x^4) • The sum of two functions is big-O of the biggest • EG: x^4 ln(x) + x^5 = O(x^5) • Non-zero constants are irrelevant • EG: 17x^4 ln(x) = O(x^4 ln(x))

  42. Big-O, Big-Ω, Big-Θ: Examples • Q: Order the following from smallest to largest asymptotically. Group together all functions which are big-Θ of each other: • [The ten functions appeared as rendered formulas on the slide and are not reproduced in this transcript.]

  43. Big-O, Big-Ω, Big-Θ: Examples • A: [The answer listed the ten functions in asymptotic order, again as rendered formulas; item 3 was justified by the change-of-base formula.]

  44. Incomparable Functions • Given two functions f(x) and g(x), it is not always the case that one dominates the other; in that case f and g are asymptotically incomparable. • EG: f(x) = |x^2 sin(x)| vs. g(x) = 5x^1.5
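A short numeric illustration of the incomparability; the two sample points are chosen for convenience (near multiples of π the sine factor kills f, near odd multiples of π/2 it pushes f above g):

```python
import math

f = lambda x: abs(x**2 * math.sin(x))
g = lambda x: 5 * x**1.5

for x in (100 * math.pi, 100.5 * math.pi):   # sin is ~0 at the first point, ~1 at the second
    print(round(x), "f =", round(f(x)), "g =", round(g(x)))
# At the first point f is essentially 0 while g is large; at the second f exceeds g.
# That oscillation recurs for arbitrarily large x, which is why neither f = O(g) nor g = O(f).
```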

  45. Incomparable Functions • [Plot: y = x^2, y = |x^2 sin(x)|, y = 5x^1.5]

  46. Incomparable Functions • [Plot: the same three curves over a wider range]

  47. Big-O: A Grain of Salt • Big-O notation gives a good first guess for deciding which algorithm is faster. In practice, the guess isn't always correct. • Consider the time functions n^6 vs. 1000n^5.9. Asymptotically, the second is better. • Purported advances reported in theoretical computer science publications are often examples of this kind. • The following graph shows the relative performance of the two algorithms:

  48. Big-O: A Grain of Salt • [Plot: running time in days vs. input size n for T(n) = 1000n^5.9 and T(n) = n^6, assuming each operation takes a nanosecond, i.e. the computer runs at 1 GHz]

  49. Big-O: A Grain of Salt • In fact, 1000n^5.9 only catches up to n^6 when 1000n^5.9 = n^6, i.e. 1000 = n^0.1, i.e. n = 1000^10 = 10^30 operations • = 10^30 / 10^9 = 10^21 seconds • ≈ 10^21 / (3×10^7) ≈ 3×10^13 years • ≈ 3×10^13 / (2×10^10) ≈ 1500 universe lifetimes!
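The same back-of-the-envelope arithmetic as a Python sketch; the 1 GHz rate and the ~2×10^10-year "universe lifetime" are the slide's own assumptions, and 3.15×10^7 seconds per year is an approximation:

```python
n = 1000 ** 10                  # crossover: 1000 * n**5.9 == n**6  =>  n**0.1 == 1000, n = 1e30
seconds = n / 1e9               # one operation per nanosecond (1 GHz)
years = seconds / 3.15e7        # ~3.15e7 seconds in a year
lifetimes = years / 2e10        # ~2e10 years per universe lifetime (slide's figure)
print(f"n = {n:.0e}, {years:.1e} years, about {lifetimes:.0f} universe lifetimes")
```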

  50. Algorithms: Extra 1 • The world of computation can be subdivided into three classes: • Tractable Problems • Polynomial worst-case complexity (P class) • Θ(n^c), c ≥ 1 constant [e.g. the Bubble Sort algorithm is Θ(n^2)] • Intractable Problems (NP class) • Exponential worst-case complexity (E class) • Θ(c^n), c > 1 constant [e.g. the Satisfiability algorithm is Θ(2^n)] • Factorial worst-case complexity (F class) • Θ(n!) [e.g. the Traveling Salesman algorithm is Θ(n!)] • Unsolvable Problems • No algorithm exists for solving them • e.g. the Halting Problem
