BIG-O --- Algorithms • Purpose: Be able to evaluate the relative efficiency of various algorithms that are used to process data • We need to be able to evaluate all of the major sort and search algorithms as well as the various implementations of the Abstract Data Types
Intro: As we began to discuss in the lecture on Algorithms, we need to be able to evaluate processes and algorithms based on a uniform set of criteria. • Big-O provides us with a method for such evaluations
BIG-O --- Algorithms • Searching (locating an element with a target value in a list of values) and Sorting are 2 tasks used to illustrate the concept of an algorithm • Algorithms typically use iteration or recursion; this property differentiates them from straightforward code
Algorithms are also generic in nature. The same algorithm applies to a whole set of initial states and produces corresponding final states for each of them. • A sorting algorithm, for example, must apply to any list regardless of the values of its elements
Furthermore, an algorithm must be independent of the SIZE of the task ( n )
Analyze the time efficiency and space requirements of algorithms in an abstract way, without regard to specific data types and other implementation details
Criteria to evaluate an algorithm: Space required Amount of time Complexity
SPACE: • Number and size of simple variables • Number and total size of components of compound variable • Is space dependent on size of input?
TIME: • Not necessarily measured in real clock time • Look for a characteristic operation (e.g., a comparison) • Express the time required in terms of this characteristic operation
COMPLEXITY: • Has little to do with how complex an algorithm “looks” • A function of the size (number) of input values • Average time • based on the probability of the occurrence of inputs • Worst time • based on the most unfavorable input
Efficiency of TIME: • Number of Comparisons (if a > b) • Number of Assignments (a = c)
We will also disregard the specifics of the hardware or the programming language so: • we will not be measuring time efficiency in real time • we will not measure in terms of the number of required program instructions/statements
We will discuss performance in terms of abstract “steps” necessary to complete a task and we assume that each step takes the same amount of time • We can compare different algorithms that accomplish the same task
Because we can predict the long-term behavior of functions without knowing the exact constants that describe them, we can ignore constant factors in the analysis of execution time
BIG-O --- Big-O Notation • We can evaluate algorithms in terms of Best Case , Worst Case and Average Case • We assume that the number of “steps” or n is a large number • Analyze loops especially nested loops
Sequential Search algorithms grow linearly with the size (number) of the elements • We match the target value against each array value until a match is found or the entire array is scanned
Sequential Search • Worst Case is we locate the target element in the last element • Best Case is the target value is found on the first attempt • Average Case finds the target value in the middle of the array
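The best/worst/average cases above can be sketched in code. The slides don't tie the examples to any language, so Python is assumed here; the function name is illustrative:

```python
# A minimal sequential (linear) search sketch: compare the target against
# each element in turn until a match is found or the list is exhausted.
def sequential_search(values, target):
    for i, v in enumerate(values):
        if v == target:
            return i      # best case: target is the first element (1 comparison)
    return -1             # worst case: target is last or absent (n comparisons)

data = [7, 3, 9, 1, 5]
assert sequential_search(data, 7) == 0    # best case
assert sequential_search(data, 5) == 4    # worst case
assert sequential_search(data, 8) == -1   # target not present
```

Counting the characteristic operation (the `==` comparison) gives 1 in the best case, n in the worst case, and about n/2 on average.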
Binary Search algorithms (compared as applied to the same task as the sequential search) grow logarithmically with the size (number) of the elements • Elements must be ordered • We compare the target value against the middle element of the array and proceed left (if smaller) or right (if larger) until a match is found
Binary Search For example, where n=7, we try a[3] then proceed left or right
Binary Search • Worst Case and Average Case locate the target element in about log n comparisons (3 for n = 7; the average case is only about one less than the worst case) • Best Case is the target value is found on the first attempt • The execution time of a Binary Search is approximately proportional to the log of n
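The halving procedure described above can be sketched as follows (Python is assumed, as elsewhere; the name `binary_search` is illustrative):

```python
# Binary search over a sorted list: probe the middle element and
# discard half of the remaining range on every comparison -- O(log n).
def binary_search(sorted_values, target):
    lo, hi = 0, len(sorted_values) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_values[mid] == target:
            return mid
        elif target < sorted_values[mid]:
            hi = mid - 1      # proceed left
        else:
            lo = mid + 1      # proceed right
    return -1

data = [1, 3, 5, 7, 9, 11, 13]            # n = 7: first probe is a[3]
assert binary_search(data, 7) == 3        # best case: found on the first probe
assert binary_search(data, 13) == 6       # worst case: ~log2(7) ≈ 3 probes
```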
(A) Linear Growth (B) Logarithmic Growth • Logarithmic Growth is SLOWER than Linear Growth • Asymptotically, a Binary Search is FASTER than a Sequential Search as Linear Time eventually surpasses Logarithmic Time
Growth Rate Functions: • reference functions used to compare the rates of growth of algorithms
BIG-O --- Big-O Notation: Growth Rate Functions • O(1) Constant Time --- the time required does not depend on the size of the task • The algorithm requires the same fixed number of steps regardless of the size of the task: • Push and Pop Stack operations • Insert and Remove Queue operations • Finding the median value in a sorted Array
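As a small sketch of constant time (Python is assumed here, using its built-in list as a stack), a push or pop takes the same fixed number of steps no matter how many items are already stored:

```python
# O(1) sketch: push and pop on a list-backed stack cost the same
# (amortized) fixed amount of work regardless of the stack's size.
stack = list(range(1_000_000))   # a stack holding a million items

stack.append(42)     # push: constant time, independent of len(stack)
top = stack.pop()    # pop: constant time

assert top == 42
assert len(stack) == 1_000_000
```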
O(n) Linear Time --- time grows in direct proportion to n (the number of tasks) • The algorithm requires a number of steps proportional to the size of the task • 20 tasks = 20
O(n) Examples: • Traversal of a List • Finding the min or max element in a List • Finding the min or max element in a sequential search of unsorted elements • Traversing a Tree with n nodes • Calculating n-factorial, iteratively • Calculating the nth Fibonacci number, iteratively
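The last example in the list above can be sketched in a few lines (Python assumed; `fib_iter` is an illustrative name). One pass of n loop iterations means the step count grows linearly with n:

```python
# Iterative Fibonacci: a single loop of n steps, so O(n) time
# and O(1) extra space.
def fib_iter(n):
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

assert fib_iter(0) == 0
assert fib_iter(10) == 55
```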
O(n^2) Quadratic Time • The number of operations is proportional to the size of the task SQUARED • 20 tasks = 400
O(n^2) Examples: • Simplistic sorting algorithms such as a Selection Sort of n elements • Comparing 2 two-dimensional arrays (matrices) of size n by n • Finding duplicates in an unsorted list of n elements (implemented with 2 nested loops)
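The duplicate-finding example can be sketched directly (Python assumed; the function name is illustrative). The two nested loops perform roughly n·(n−1)/2 comparisons, which is proportional to n²:

```python
# Finding duplicates with two nested loops: every pair of elements
# is compared, ~n*(n-1)/2 comparisons in the worst case -- O(n^2).
def has_duplicates(values):
    n = len(values)
    for i in range(n):
        for j in range(i + 1, n):
            if values[i] == values[j]:
                return True
    return False

assert has_duplicates([4, 1, 7, 1]) is True
assert has_duplicates([4, 1, 7, 2]) is False
```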
O(log n) Logarithmic Time --- the log of n is significantly lower than n • 20 tasks = 4.3 • Used in many “divide and conquer” algorithms, like binary search, and is the basis for using binary search trees and heaps
O(log n) • for example, a binary search tree of one million elements would take, at most, 20 steps to locate a target • Binary Search in a Sorted list of n elements • Insert and Find Operations for a Binary Search Tree with n nodes • Insert and Find Operations for a Heap with n nodes
O(n log n) “n log n” Time • 20 tasks = 86.4 • More advanced sorting algorithms like Quicksort, Mergesort (which will be discussed in detail in the next chapter)
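Since Mergesort is only covered in detail in the next chapter, here is just a minimal sketch of why it is O(n log n) (Python assumed): the list is halved log₂(n) times, and each level of recursion merges a total of n elements.

```python
# Merge sort sketch: halve the list (log2(n) levels of recursion),
# then merge n elements per level -- O(n log n) overall.
def merge_sort(values):
    if len(values) <= 1:
        return values
    mid = len(values) // 2
    left = merge_sort(values[:mid])
    right = merge_sort(values[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # merge step: linear in n
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

assert merge_sort([5, 2, 9, 1, 5]) == [1, 2, 5, 5, 9]
```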
O(a^n) (a > 1) Exponential Time • 20 tasks = 2^20 ≈ 1,000,000 (for a = 2), and the count explodes as n grows • Recursive Fibonacci implementation (a > 3/2) • Towers of Hanoi (a = 2) • Generating all permutations of n symbols
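The recursive Fibonacci case can be sketched with a call counter (Python assumed; the global counter is purely illustrative). Each call spawns two more, so the number of calls grows roughly as aⁿ with a between 3/2 and 2:

```python
# Naive recursive Fibonacci: exponential time, because every call
# (above the base cases) makes two further recursive calls.
calls = 0

def fib_rec(n):
    global calls
    calls += 1
    if n < 2:
        return n
    return fib_rec(n - 1) + fib_rec(n - 2)

assert fib_rec(20) == 6765
assert calls > 10_000   # over 20,000 calls just to compute fib(20)
```

Compare this with the iterative O(n) version, which needs only 20 loop iterations for the same result.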
The best time in the preceding growth rate functions is constant time, O(1) • The worst time in the preceding growth rate functions is exponential time, O(a^n), which quickly overwhelms even the fastest computers, even for a relatively small n
Polynomial Growth: linear, quadratic, cubic… The highest power of n dominates the polynomial (see Ill Lam SDT p.42 Top) • considered manageable as compared to exponential growth
[Growth-rate chart: time t versus n, with curves for Constant O(1), Log n, Linear, N log n, Quadratic, and Exponential growth]
Log n has a slower asymptotic growth rate than linear growth: a thousandfold increase in the size of the task, n, results in only a fixed, moderate increase in the number of operations required.
For a given ALGORITHM, you can see how it falls on the following grid to determine its “Order of Growth” or time efficiency. • Use this grid as a “Rule of Thumb” when evaluating the BIG-O of an algorithm.
Let this grid help narrow down possible solutions, but make sure you “memorize” the other charts and use them when attacking an order of growth problem • BIG-O Analysis handout has an example where the “rule of thumb” will result in an incorrect assumption
BIG-O --- Big-O Notation: Rule of Thumb

ALGORITHM                    | SINGLE LOOP      | NESTED LOOP(S)
-----------------------------|------------------|------------------------
Straightforward (sequential) | Linear           | Quadratic
processing                   | O(n)             | O(n^2), O(n^3)…
-----------------------------|------------------|------------------------
Divide-and-conquer           | Logarithmic      | n log n
processing                   | O(log n)         | O(n log n)
Sample Test Times; N = 50,000

FUNCTION                | RUNNING TIME | EXAMPLE ALGORITHMS
------------------------|--------------|--------------------------------------------------
Log N (Log Time)        | 15.6         | Binary Search
N (Linear Time)         | 50,000       | Linked-List or Tree Traversal, Sequential Search
N Log N                 | 780,482      | Quicksort, Mergesort
N^2 (Quadratic Time)    | 2.5 * 10^9   | Selection Sort, matrices, 2 nested loops
a^N (Exponential Time)  | 3.8 * 10^21  | recursion, Fibonacci, permutations
REVIEW 3 EXAMPLES IN THE HANDOUT: Lambert p.40-41 Examples 1.7, 1.8 & 1.9
PROJECTS: • BIG-O Exercises 1 through 5 • Workbook problems 12 through 20 • Multiple Choice Problems