This chapter explores the step count method for evaluating program performance, focusing on how it accounts for time spent in all program functions. Key concepts include defining "steps" in terms of execution time and operations, with examples using recursive functions and matrix operations. The chapter also delves into asymptotic notation, comparing time complexities of various algorithms, and establishing their growth rates. Key definitions include Big O, Omega, and Theta notations, providing a comprehensive understanding of algorithm performance.
Chapter 2 Program Performance – Part 2
Step Counts • Instead of accounting for the time spent only on selected operations, the step-count method accounts for the time spent in all parts of the program/function • Program step: loosely defined as a syntactically or semantically meaningful segment of a program whose execution time is independent of the instance characteristics • Examples of single steps:
   return a + b * c / (a - b) * 4;
   x = y;
Use a global variable count to tally program steps. For the iterative sum function, the final count is 2n + 3.
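A minimal sketch of what the instrumented function might look like (the names sum, tsum, and the global count are assumptions; the counting convention is chosen to reproduce the 2n + 3 total quoted below):

int count = 0;   // global step counter

template <class T>
T sum(T a[], int n)
{  // return the sum of a[0..n-1]
   T tsum = 0;
   count++;              // step for tsum = 0
   for (int i = 0; i < n; i++) {
      count++;           // step for the loop test
      tsum += a[i];
      count++;           // step for the addition
   }
   count++;              // step for the final, failing loop test
   count++;              // step for the return
   return tsum;
}
// Total: 1 + 2n + 1 + 1 = 2n + 3 steps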
Counting steps in a recursive function
• t_rsum(n) = 2, for n = 0
• t_rsum(n) = 2 + t_rsum(n−1), for n > 0
• Expanding: t_rsum(n) = 2 + 2 + t_rsum(n−2) = … = 2n + t_rsum(0)
• So t_rsum(n) = 2(n + 1), for n ≥ 0
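A sketch of a recursive sum consistent with this recurrence (rsum is an assumed name, using the same global count): each call contributes one step for the conditional and one for its return.

template <class T>
T rsum(T a[], int n)
{  // return the sum of a[0..n-1], recursively
   count++;              // step for the if-test
   if (n > 0) {
      count++;           // step for the recursive return
      return rsum(a, n - 1) + a[n - 1];
   }
   count++;              // step for the base-case return
   return 0;
}
// t_rsum(0) = 2; t_rsum(n) = 2 + t_rsum(n-1), giving 2(n+1)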
Counting steps in Matrix Addition: count = 2·rows·cols + 2·rows + 1
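A sketch of the instrumented addition (the function and parameter names are assumptions), counting one step per loop test and one per assignment:

template <class T>
void add(T **a, T **b, T **c, int rows, int cols)
{  // c = a + b, all rows x cols matrices
   for (int i = 0; i < rows; i++) {
      count++;                          // outer loop test
      for (int j = 0; j < cols; j++) {
         count++;                       // inner loop test
         c[i][j] = a[i][j] + b[i][j];
         count++;                       // assignment step
      }
      count++;                          // inner loop's final, failing test
   }
   count++;                             // outer loop's final, failing test
}
// Total: rows * (2*cols + 2) + 1 = 2*rows*cols + 2*rows + 1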
Matrix Transpose

#include <utility>   // for std::swap

template <class T>
void transpose(T **a, int rows)
{  // transpose the rows x rows matrix a in place
   for (int i = 0; i < rows; i++)
      for (int j = i + 1; j < rows; j++)
         std::swap(a[i][j], a[j][i]);
}
Inefficient way to compute the prefix sums b[j] = a[0] + … + a[j] for j = 0, 1, …, n−1. Note: the number of steps per execution (S/E) of sum() varies with its parameters.
Steps Per Execution
• sum(a, n) requires 2n + 3 steps
• sum(a, j + 1) requires 2(j + 1) + 3 = 2j + 5 steps
• The assignment statement b[j] = sum(a, j + 1) therefore takes 2j + 6 steps
• Total: ∑_{j=0}^{n−1} (2j + 6) = n(n − 1) + 6n = n(n + 5)
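A sketch of this inefficient computation (prefixSums is an assumed name); it recomputes every prefix from scratch by calling the sum function above:

template <class T>
void prefixSums(T a[], T b[], int n)
{  // b[j] = a[0] + ... + a[j], for j = 0..n-1
   for (int j = 0; j < n; j++)
      b[j] = sum(a, j + 1);   // 2j + 5 steps for the call, +1 for the assignment
}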
Average for successful searches
• x has an equal probability (1/n) of being any one of the n elements of a.
• The step count when x is a[j] depends on j; the average is obtained by summing the counts over j = 0, …, n−1 and dividing by n.
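This refers to sequential search; a sketch is below (seqSearch is an assumed name, and the counting convention is an assumption). When x == a[j], the loop body executes j + 1 times, so the step count grows linearly in j, and averaging over the n equally likely positions gives a count linear in n.

template <class T>
int seqSearch(T a[], int n, const T& x)
{  // return the position of x in a[0..n-1], or -1 if not found
   for (int i = 0; i < n; i++) {
      count++;              // loop test and comparison
      if (a[i] == x) {
         count++;           // step for the successful return
         return i;
      }
   }
   count++;                 // step for the failure return
   return -1;
}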
Insertion – Average
• The step count for inserting into position j of a sorted n-element array is 2n − 2j + 3
• With each of the n + 1 positions equally likely, the average count is:
  (1/(n+1)) ∑_{j=0}^{n} (2n − 2j + 3) = (1/(n+1)) · (n+1)(n+3) = n + 3
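A sketch of insertion into a sorted array (insert is an assumed name); placing x into position j shifts the n − j elements a[j..n−1] right by one, which is where the 2n − 2j + 3 count comes from:

template <class T>
void insert(T a[], int n, const T& x)
{  // a[0..n-1] is sorted; insert x so that a[0..n] is sorted
   int i;
   for (i = n - 1; i >= 0 && x < a[i]; i--)
      a[i + 1] = a[i];   // shift one element right
   a[i + 1] = x;
}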
Asymptotic Notation • Objectives of performance evaluation: • Compare the time complexities of two programs that perform the same function • Predict the growth in run time as the instance characteristics change • Neither the operation-count nor the step-count method is accurate for these objectives: • Operation count: counts some operations and ignores others • Step count: the definition of a step is inexact
Asymptotic Notation • Given two programs: • Program A with complexity c₁n² + c₂n • Program B with complexity c₃n • Program B is faster than program A for sufficiently large values of n. • For small values of n, either could be faster, and it may not matter anyway. • There is a break-even point for n beyond which B is always faster than A.
Asymptotic Notation • Describes the behavior of the space and time complexities of programs for LARGE instance characteristics • Establishes a relative order among functions • Lets us compare their relative rates of growth • Allows us to make meaningful, though inexact, statements about the complexity of programs
Mathematical background T(n) denotes the time or space complexity of a program. Big-Oh: the growth rate of T(n) is ≤ that of f(n) • T(n) = O(f(n)) iff constants c > 0 and n₀ exist such that T(n) ≤ c·f(n) for all n ≥ n₀ • f is an upper-bound function for T • Example: "Algorithm A is O(n²)" means that, for data sets big enough (n > n₀), algorithm A executes at most c·n² steps (c a positive constant).
The Idea • Example: • 1000n is larger than n² for small values of n • n² grows at a faster rate, so n² eventually becomes the larger function • Here we have T(n) = 1000n, f(n) = n², n₀ = 1000, and c = 1: • T(n) ≤ c·f(n) for all n ≥ n₀ • Thus we say that 1000n = O(n²) • Note that we can get a tighter upper bound (1000n is also O(n))
Example • Suppose T(n) = 10n² + 4n + 2 • For n ≥ 2, T(n) ≤ 10n² + 5n • For n ≥ 5, T(n) ≤ 11n² • So T(n) = O(n²)
Big-Oh Ratio Theorem • T(n) = O(f(n)) iff lim_{n→∞} T(n)/f(n) ≤ c for some finite constant c • In this case f(n) dominates T(n).
Examples • Suppose T(n) = 10n² + 4n + 2 • T(n)/n² = 10 + 4/n + 2/n² • lim_{n→∞} T(n)/n² = 10, a finite constant • So T(n) = O(n²)
Common Orders of Magnitude

Function     Name
1            Constant
log n        Logarithmic
log² n       Log-squared
n            Linear
n log n      —
n²           Quadratic
n³           Cubic
2ⁿ           Exponential
n!           Factorial
Loose Bounds • Suppose T(n) = 10n² + 4n + 2 • 10n² + 4n + 2 ≤ 11n³ for n ≥ 2, so T(n) = O(n³) • This bound is loose; we want the smallest upper bound: T(n) = O(n²)
Polynomials • If T(n) = aₘnᵐ + … + a₁n + a₀, then T(n) = O(nᵐ)
Omega Notation – Lower Bound Omega: • T(n) = Ω(g(n)) iff constants c > 0 and n₀ exist such that T(n) ≥ c·g(n) for all n ≥ n₀ • Establishes a lower bound • e.g., T(n) = c₁n² + c₂n: • c₁n² + c₂n ≥ c₁n² for all n ≥ 1 • So T(n) ≥ c₁n² for all n ≥ 1 • T(n) is Ω(n²) • Note: T(n) is also Ω(n) and Ω(1). We want the largest lower bound.
Omega Ratio Theorem • T(n) = Ω(f(n)) iff lim_{n→∞} f(n)/T(n) ≤ c for some finite constant c.
Lower Bound of Polynomials • If T(n) = aₘnᵐ + … + a₁n + a₀, then T(n) = Ω(nᵐ) • Example: T(n) = n⁴ + 3500n³ + 400n² + 1 is Ω(n⁴)
Theta Notation Theta: when O and Ω meet, we indicate that with Θ notation • Definition: T(n) = Θ(h(n)) iff constants c₁, c₂ > 0 and n₀ exist such that c₁·h(n) ≤ T(n) ≤ c₂·h(n) for all n ≥ n₀ • Equivalently, T(n) = Θ(h(n)) iff T(n) = O(h(n)) and T(n) = Ω(h(n)) • e.g., T(n) = 3n + 8: 3n ≤ 3n + 8 ≤ 11n for n ≥ 1, so T(n) = Θ(n) • T(n) = 20 log₂ n + 8 = Θ(log₂ n): log₂ n ≤ 20 log₂ n + 8 ≤ 21 log₂ n for all n ≥ 256
Theta Notation (cont'd) • T(n) = 1000n • T(n) = O(n²) • but T(n) ≠ Θ(n²) because T(n) ≠ Ω(n²)
Theta of Polynomials • If T(n) = aₘnᵐ + … + a₁n + a₀, then T(n) = Θ(nᵐ)
Little-o Notation Little-oh: the growth rate of T(n) is strictly less than that of p(n) • T(n) = o(p(n)) iff T(n) = O(p(n)) and T(n) ≠ Ω(p(n)) • e.g., T(n) = 1000n is o(n²)
Simplifying Rules • If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n)) (transitivity). • If f(n) is O(k·g(n)) for any constant k > 0, then f(n) is O(g(n)). • If f₁(n) = O(g₁(n)) and f₂(n) = O(g₂(n)), then (a) (f₁ + f₂)(n) = O(max(g₁(n), g₂(n))), (b) f₁(n) · f₂(n) = O(g₁(n) · g₂(n))
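For example, if f₁(n) = O(n²) and f₂(n) = O(n log n), rule (a) gives (f₁ + f₂)(n) = O(n²), and rule (b) gives f₁(n) · f₂(n) = O(n³ log n).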
Some Points • Do NOT include constants or low-order terms inside a Big-Oh. • For example, T(n) = O(2n²) and T(n) = O(n² + n) are the same as T(n) = O(n²).
Examples • Example 1: a = b; — this assignment takes constant time, so it is Θ(1) • Example 2:
   sum = 0;
   for (i = 0; i <= n; i++)
      sum += n;
• Time complexity is Θ(n)
Examples (cont'd)
   a = 0;
   for (i = 1; i <= n; i++)
      for (j = 1; j <= n; j++)
         a++;
• Time complexity is Θ(n²)
Examples (cont'd)
   a = 0;
   for (i = 1; i <= n; i++)
      for (j = 1; j <= i; j++)
         a++;
• The a++ statement executes 1 + 2 + … + n = n(n + 1)/2 times
• Time complexity is Θ(n²)
Examples (cont'd)
   a = 0;                        // Θ(1)
   for (i = 1; i <= n; i++)
      for (j = 1; j <= i; j++)
         a++;                    // Θ(n²)
   for (k = 1; k <= n; k++)
      A[k] = k - 1;              // Θ(n)
• Total time complexity: Θ(1) + Θ(n²) + Θ(n) = Θ(n²)
Examples (cont'd) • Not all doubly nested loops execute n² times:
   a = 0;
   for (i = 1; i <= n; i++)
      for (j = 1; j <= n; j *= 2)
         a++;
• The inner loop executes about log₂ n times; the outer loop executes n times
• Time complexity is Θ(n log₂ n)
General rule: first determine the asymptotic complexity of each statement, then add them up.