
ALGORITHMS THIRD YEAR




  1. ALGORITHMS THIRD YEAR BANHA UNIVERSITY FACULTY OF COMPUTERS AND INFORMATICS Lecture Four Dr. Hamdy M. Mousa

  2. Growth of Functions

  3. Overview We know that: • The order of growth of the running time gives a simple characterization of an algorithm's efficiency. • Once the input size n becomes large enough, merge sort, with its Θ(n lg n) worst-case running time, beats insertion sort, whose worst-case running time is Θ(n²). • When we look at input sizes large enough to make only the order of growth of the running time relevant, we are studying the asymptotic efficiency of algorithms.
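The crossover can be seen numerically. Below is a minimal Python sketch (an added illustration, not part of the lecture): the constant factors C_INSERT and C_MERGE are invented for the example, with merge sort deliberately given a much larger constant.

```python
import math

# Invented cost-model constants (illustration only): insertion sort gets a
# small constant, merge sort a 25x larger one -- n*lg(n) still wins eventually.
C_INSERT = 2.0
C_MERGE = 50.0

def insertion_cost(n):
    return C_INSERT * n * n            # modeled worst case: c * n^2

def merge_cost(n):
    return C_MERGE * n * math.log2(n)  # modeled worst case: c * n * lg(n)

# Find the first n where merge sort's modeled cost drops below insertion sort's.
n = 2
while merge_cost(n) >= insertion_cost(n):
    n += 1
print(f"merge sort model is cheaper for all n >= {n}")
```

Once the ratio n / lg n exceeds C_MERGE / C_INSERT, merge sort's modeled cost stays below insertion sort's for every larger n.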

  4. Overview • A way to describe behavior of functions in the limit. We're studying asymptotic efficiency. • Describe growth of functions. • Focus on what's important by abstracting away low-order terms and constant factors. • How we indicate running times of algorithms. • A way to compare "sizes" of functions: O ≈ ≤, Ω ≈ ≥, Θ ≈ =, o ≈ <, ω ≈ >.

  5. Asymptotic notation The notations we use to describe the asymptotic running time of an algorithm are defined in terms of functions whose domains are the set of natural numbers N = {0, 1, 2, ...}. Such notations are convenient for describing the worst-case running-time function T(n).

  6. Θ-notation • The worst-case running time of insertion sort is T(n) = Θ(n²). • Let us define what this notation means. For a given function g(n), we denote by Θ(g(n)) the set of functions Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1 g(n) ≤ f(n) ≤ c2 g(n) for all n ≥ n0 }.
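A small Python sanity check of this definition (added illustration; a finite scan can support, but never prove, an asymptotic claim). The model f(n) = n(n−1)/2 is insertion sort's worst-case comparison count:

```python
def sandwiched(f, g, c1, c2, n0, n_max=10_000):
    """Check 0 <= c1*g(n) <= f(n) <= c2*g(n) for all n0 <= n <= n_max."""
    return all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, n_max + 1))

f = lambda n: n * (n - 1) / 2    # worst-case comparisons of insertion sort
g = lambda n: n * n
print(sandwiched(f, g, c1=1/4, c2=1/2, n0=2))  # True: consistent with f = Theta(n^2)
```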

  7. O-notation O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c g(n) for all n ≥ n0 }. • g(n) is an asymptotic upper bound for f(n). • If f(n) ∈ O(g(n)), we write f(n) = O(g(n)) (will precisely explain this soon).

  8. O-notation Example: 2n² = O(n³), with c = 1 and n0 = 2. Examples of functions in O(n²): n², n² + n, n² + 1000n, 1000n² + 1000n. Also: n, n/1000, n^1.99999, n²/lg lg lg n.
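The witnesses can be checked numerically. A hedged Python sketch (added illustration; each pair (c, n0) is one valid choice of witnesses, not the only one):

```python
# Slide example: 2n^2 = O(n^3) with c = 1 and n0 = 2.
c, n0 = 1, 2
assert all(0 <= 2 * n**2 <= c * n**3 for n in range(n0, 10_000))

# A few members of O(n^2), each with witness constants (c, n0):
members = {
    "n^2":             (lambda n: n * n,              1, 1),
    "n^2 + 1000n":     (lambda n: n * n + 1000 * n,   2, 1000),
    "1000n^2 + 1000n": (lambda n: 1000 * (n * n + n), 2000, 1),
    "n/1000":          (lambda n: n / 1000,           1, 1),
}
for name, (f, c, n0) in members.items():
    assert all(f(n) <= c * n**2 for n in range(n0, 5000)), name
print("all finite checks passed")
```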

  9. Ω-notation Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c g(n) ≤ f(n) for all n ≥ n0 }.

  10. Ω-notation • g(n) is an asymptotic lower bound for f(n).

  11. Θ-notation Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1 g(n) ≤ f(n) ≤ c2 g(n) for all n ≥ n0 }. g(n) is an asymptotically tight bound for f(n). Example: n²/2 − 2n = Θ(n²), with c1 = 1/4, c2 = 1/2, and n0 = 8.
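A quick numeric check of the slide's exact constants (added illustration, finite range only):

```python
# n^2/2 - 2n = Theta(n^2) with witnesses c1 = 1/4, c2 = 1/2, n0 = 8.
f = lambda n: n * n / 2 - 2 * n
g = lambda n: n * n
c1, c2, n0 = 1 / 4, 1 / 2, 8
assert all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
print("the sandwich holds on the sampled range")  # lower bound is tight at n = 8
```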

  12. Theorem f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)). Leading constants and low-order terms don't matter.

  13. Asymptotic notation in equations • When on the right-hand side: O(n²) stands for some anonymous function in the set O(n²). • 2n² + 3n + 1 = 2n² + Θ(n) means 2n² + 3n + 1 = 2n² + f(n) for some f(n) ∈ Θ(n). • In particular, f(n) = 3n + 1.

  14. By the way, we interpret the number of anonymous functions as equal to the number of times the asymptotic notation appears: • ∑_{i=1}^{n} O(i) is OK: 1 anonymous function. • O(1) + O(2) + ... + O(n) is not OK: n hidden constants ⇒ no clean interpretation.
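To see why the single-function reading is sound (an added sketch; the particular f is an arbitrary choice): if one fixed f satisfies f(i) ≤ c·i for all i ≥ 1, then the whole sum is at most c·n(n+1)/2, which is O(n²).

```python
# One anonymous function: a single f with f(i) <= c*i, summed for i = 1..n,
# is bounded by c * n(n+1)/2 = O(n^2).
c = 3
f = lambda i: 3 * i - 1      # an arbitrary fixed member of O(i)
for n in (10, 100, 1000):
    total = sum(f(i) for i in range(1, n + 1))
    assert total <= c * n * (n + 1) / 2
print("the sum of one O(i) function is O(n^2) on the sampled range")
```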

  15. When on the left-hand side: No matter how the anonymous functions are chosen on the left-hand side, there is a way to choose the anonymous functions on the right-hand side to make the equation valid. Interpret 2n² + Θ(n) = Θ(n²) as meaning: for all functions f(n) ∈ Θ(n), there exists a function g(n) ∈ Θ(n²) such that 2n² + f(n) = g(n). Can chain together: 2n² + 3n + 1 = 2n² + Θ(n) = Θ(n²).

  16. Interpretation: • First equation: There exists f(n) ∈ Θ(n) such that 2n² + 3n + 1 = 2n² + f(n). • Second equation: For all g(n) ∈ Θ(n) (such as the f(n) used to make the first equation hold), there exists h(n) ∈ Θ(n²) such that 2n² + g(n) = h(n).

  17. o-notation o(g(n)) = { f(n) : for all constants c > 0, there exists a constant n0 > 0 such that 0 ≤ f(n) < c g(n) for all n ≥ n0 }.
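The "for all c" quantifier is what separates o from O. An added Python sketch exhibiting, for each sampled c, an n0 that works for f(n) = 2n against g(n) = n² (so 2n = o(n²)):

```python
import math

# 2n < c*n^2 holds exactly when n > 2/c, so n0 = floor(2/c) + 1 works.
f = lambda n: 2 * n
g = lambda n: n * n
for c in (10.0, 1.0, 0.1, 0.001):
    n0 = math.floor(2 / c) + 1
    assert all(f(n) < c * g(n) for n in range(n0, n0 + 1000))
    print(f"c = {c:>6}: f(n) < c*g(n) for all checked n >= {n0}")
```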

  18. ω-notation ω(g(n)) = { f(n) : for all constants c > 0, there exists a constant n0 > 0 such that 0 ≤ c g(n) < f(n) for all n ≥ n0 }.

  19. Comparisons of functions Relational properties: Transitivity: f(n) = Θ(g(n)) and g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n)). Same for O, Ω, o, and ω. Reflexivity: f(n) = Θ(f(n)). Same for O and Ω. Symmetry: f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n)). Transpose symmetry: f(n) = O(g(n)) if and only if g(n) = Ω(f(n)). f(n) = o(g(n)) if and only if g(n) = ω(f(n)).
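Each property follows directly from the definitions by combining witness constants. As one example, a derivation sketch for the transitivity of O (added here, not from the slides):

```latex
% Assume f(n) = O(g(n)) and g(n) = O(h(n)) with witnesses:
%   f(n) <= c_1 g(n) for all n >= n_1,  and  g(n) <= c_2 h(n) for all n >= n_2.
% Then for every n >= max(n_1, n_2):
\[
  0 \le f(n) \le c_1\, g(n) \le c_1 c_2\, h(n),
\]
% so f(n) = O(h(n)) with witnesses c = c_1 c_2 and n_0 = \max(n_1, n_2).
```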

  20. Comparisons of functions Comparisons: • f(n) is asymptotically smaller than g(n) if f(n) = o(g(n)). • f(n) is asymptotically larger than g(n) if f(n) = ω(g(n)). No trichotomy: although intuitively we can liken O to ≤, Ω to ≥, etc., unlike real numbers, where exactly one of a < b, a = b, or a > b holds, we might not be able to compare two functions. Example: n^(1 + sin n) and n, since 1 + sin n oscillates between 0 and 2.
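An added Python illustration of the oscillation: the ratio n^(1+sin n)/n equals n^(sin n), which keeps crossing 1, so neither function eventually dominates the other:

```python
import math

for n in range(2, 16):
    ratio = n ** (1 + math.sin(n)) / n   # equals n ** sin(n)
    side = "above" if ratio > 1 else "below"
    print(f"n = {n:2d}: n^(1+sin n)/n = {ratio:9.4f}  ({side} 1)")
```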

  21. Standard notations and common functions Monotonicity: • f(n) is monotonically increasing if m ≤ n ⇒ f(m) ≤ f(n). • f(n) is monotonically decreasing if m ≤ n ⇒ f(m) ≥ f(n). • f(n) is strictly increasing if m < n ⇒ f(m) < f(n). • f(n) is strictly decreasing if m < n ⇒ f(m) > f(n).
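A hedged Python sketch of these definitions over a finite range (the sample functions are arbitrary; on a range, checking consecutive pairs suffices, by transitivity):

```python
def monotonically_increasing(f, lo=0, hi=100):
    return all(f(m) <= f(m + 1) for m in range(lo, hi))

def strictly_increasing(f, lo=0, hi=100):
    return all(f(m) < f(m + 1) for m in range(lo, hi))

print(monotonically_increasing(lambda n: n // 2))  # True: flat steps are allowed
print(strictly_increasing(lambda n: n // 2))       # False: values repeat
print(strictly_increasing(lambda n: n * n))        # True on this range
```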

  22. Exponentials For all real a > 0, m, and n: a^0 = 1, a^1 = a, a^(-1) = 1/a, (a^m)^n = a^(mn) = (a^n)^m, a^m · a^n = a^(m+n). Any exponential function with base a > 1 grows faster than any polynomial: n^b = o(a^n) for all constants a > 1 and b.
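An added Python illustration of the last claim; the choices b = 10 and a = 2 are arbitrary:

```python
# n^b / a^n -> 0 as n grows, even for a modest base and a large exponent.
b, a = 10, 2.0
for n in (10, 50, 100, 200, 400):
    print(f"n = {n:3d}: n^{b} / {a:g}^n = {n**b / a**n:.3e}")
```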

  23. Exponentials A surprisingly useful inequality: for all real x, e^x ≥ 1 + x. As x gets closer to 0, e^x gets closer to 1 + x.
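A finite Python spot-check of the inequality (illustration only; the sample points are arbitrary):

```python
import math

for x in (-5.0, -1.0, -0.5, 0.0, 1e-3, 0.5, 1.0, 5.0):
    gap = math.exp(x) - (1 + x)
    assert gap >= 0                                     # e^x >= 1 + x holds
    print(f"x = {x:8.3f}: e^x - (1 + x) = {gap:.6f}")   # gap shrinks near x = 0
```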

  24. Logarithms Notations: lg n = log_2 n (binary logarithm), ln n = log_e n (natural logarithm), lg^k n = (lg n)^k (exponentiation), lg lg n = lg(lg n) (composition). Logarithm functions apply only to the next term in the formula, so that lg n + k means (lg n) + k, and not lg(n + k).
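The notations spelled out with Python's math module (an added illustration; n = 1024 and k = 5 are arbitrary):

```python
import math

n, k = 1024, 5
lg = math.log2
print(lg(n))           # lg n     = 10
print(math.log(n))     # ln n     ~ 6.93
print(lg(n) ** 3)      # lg^3 n   = (lg n)^3 = 1000   (exponentiation)
print(lg(lg(n)))       # lg lg n  = lg(lg n) ~ 3.32   (composition)
# Precedence: lg n + k means (lg n) + k, not lg(n + k).
print(lg(n) + k, lg(n + k))   # 15.0 vs ~10.007
```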

  25. In the expression log_b a: • If we hold b constant, then the expression is strictly increasing as a increases. • If we hold a constant, then the expression is strictly decreasing as b increases.

  26. Useful identities, for all real a > 0, b > 0, c > 0, and n, where logarithm bases are not 1: a = b^(log_b a), log_c(ab) = log_c a + log_c b, log_b(a^n) = n log_b a, log_b a = (log_c a)/(log_c b), log_b(1/a) = −log_b a, log_b a = 1/(log_a b), a^(log_b c) = c^(log_b a).
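A numeric spot-check of these identities (added illustration; the values of a, b, c, n are arbitrary, and floating point needs a tolerance):

```python
import math

a, b, c, n = 7.0, 3.0, 5.0, 4.0
log = math.log                          # log(x, base); natural log by default
close = lambda x, y: math.isclose(x, y, rel_tol=1e-9)

assert close(a, b ** log(a, b))                     # a = b^(log_b a)
assert close(log(a * b, c), log(a, c) + log(b, c))  # log_c(ab) = log_c a + log_c b
assert close(log(a ** n, b), n * log(a, b))         # log_b(a^n) = n log_b a
assert close(log(a, b), log(a, c) / log(b, c))      # change of base
assert close(log(1 / a, b), -log(a, b))             # log_b(1/a) = -log_b a
assert close(log(a, b), 1 / log(b, a))              # log_b a = 1/(log_a b)
assert close(a ** log(c, b), c ** log(a, b))        # a^(log_b c) = c^(log_b a)
print("all identities hold numerically")
```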

  27. Changing the base of a logarithm from one constant to another only changes the value by a constant factor, so we usually don’t worry about logarithm bases in asymptotic notation. Convention is to use lg within asymptotic notation, unless the base actually matters.

  28. Factorials n! = 1 · 2 · 3 ··· n, with 0! = 1. A weak upper bound is n! ≤ n^n. Stirling's approximation, n! = √(2πn) (n/e)^n (1 + Θ(1/n)), gives a tighter estimate and implies n! = o(n^n), n! = ω(2^n), and lg(n!) = Θ(n lg n).
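An added Python comparison of n! against Stirling's formula; the relative error shrinks roughly like 1/(12n), consistent with the Θ(1/n) correction term:

```python
import math

def stirling(n):
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

for n in (1, 5, 10, 20, 50):
    exact = math.factorial(n)
    rel_err = (exact - stirling(n)) / exact
    print(f"n = {n:2d}: rel. error = {rel_err:.2e}   (1/(12n) = {1/(12*n):.2e})")
```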
