BIG-O --- Algorithms




  1. BIG-O --- Algorithms • Purpose: Be able to evaluate the relative efficiency of various algorithms that are used to process data • We need to be able to evaluate all of the major sort and search algorithms as well as the various implementations of the Abstract Data Types

  2. Resources: Java Methods Data Structures Chapter 8 p.191

  3. Intro: As we began to discuss in the lecture on Algorithms, we need to be able to evaluate processes and algorithms based on a uniform criteria. • Big-O provides us with a method for such evaluations

  4. BIG-O --- Algorithms • Searching (locating an element with a target value in a list of values) and Sorting are 2 tasks used to illustrate the concept of an algorithm • Algorithms typically use iteration or recursion. This property differentiates them from straightforward sequential code

  5. Algorithms are also generic in nature. The same algorithm applies to a whole set of initial states and produces corresponding final states for each of them. • A sorting algorithm, for example, must apply to any list regardless of the values of its elements

  6. Furthermore, an algorithm must be independent of the SIZE of the task (n)

  7. Analyze the time efficiency and space requirements of algorithms in an abstract way, looking at an algorithm without regard to specific data types and other implementation details

  8. Criteria to evaluate an algorithm: Space required Amount of time Complexity

  9. SPACE: • Number and size of simple variables • Number and total size of components of compound variable • Is space dependent on size of input?

  10. TIME: • Not necessarily measured in real clock time • Look for a characteristic operation (e.g., a comparison) • Express the time required in terms of this characteristic operation

  11. COMPLEXITY: • Has little to do with how complex an algorithm “looks” • A function of the size (number) of input values • Average time • based on the probability of the occurrence of inputs • Worst time • based on the most unfavorable input

  12. We will focus on time efficiency

  13. Efficiency of TIME: • Number of Comparisons (if a > b) • Number of Assignments (a = c)

  14. We will also disregard the specifics of the hardware or the programming language so: • we will not be measuring time efficiency in real time • we will not measure in terms of the number of required program instructions/statements

  15. We will discuss performance in terms of abstract “steps” necessary to complete a task and we assume that each step takes the same amount of time • We can compare different algorithms that accomplish the same task

  16. The theory that we can predict the long-term behavior of functions without specific knowledge of the exact constants used in describing the function allows us to ignore constant factors in the analysis of execution time

  17. BIG-O --- Big-O Notation • We can evaluate algorithms in terms of Best Case, Worst Case and Average Case • We assume that the number of “steps”, n, is a large number • Analyze loops, especially nested loops

  18. Sequential Search algorithms grow linearly with the size (number) of the elements • We match the target value against each array value until a match is found or the entire array is scanned

  19. Sequential Search • Worst Case: the target value is found at the last element (or not at all), requiring n comparisons • Best Case: the target value is found on the first attempt • Average Case: the target value is found in the middle of the array, requiring about n/2 comparisons
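
The sequential search described above can be sketched in Java as follows (the class and method names here are illustrative, not from the slides):

```java
public class SequentialSearch {
    // Scan each element until a match is found or the array is exhausted.
    // Best case: 1 comparison; average: about n/2; worst: n comparisons
    // (target in the last element, or not present at all).
    public static int search(int[] a, int target) {
        for (int i = 0; i < a.length; i++) {
            if (a[i] == target) {
                return i; // found at index i
            }
        }
        return -1; // not found after scanning all n elements
    }
}
```

The single loop over n elements is what makes this algorithm O(n).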

  20. Binary Search algorithms (compared as applied to the same task as the sequential search) grow logarithmically with the size (number) of the elements • Elements must be ordered • We compare the target value against the middle element of the array and proceed left (if smaller) or right (if larger) until a match is found

  21. Binary Search • For example, where n = 7, we try a[3] first, then proceed left or right

  22. Binary Search • Worst Case and Average Case: we locate the target element in about log n comparisons (3 for n = 7; the average case is only about one less than the worst case) • Best Case: the target value is found on the first attempt • The execution time of a Binary Search is approximately proportional to the log of n
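
A minimal Java sketch of the binary search described above (class name illustrative; note that for a 7-element array the first probe is a[3], as on slide 21):

```java
public class BinarySearch {
    // Requires a sorted (ascending) array. Each comparison halves the
    // remaining range, so the worst case is about log2(n) comparisons.
    public static int search(int[] a, int target) {
        int left = 0;
        int right = a.length - 1;
        while (left <= right) {
            int middle = left + (right - left) / 2; // avoids int overflow
            if (a[middle] == target) {
                return middle;
            } else if (a[middle] < target) {
                left = middle + 1;  // proceed right
            } else {
                right = middle - 1; // proceed left
            }
        }
        return -1; // not found
    }
}
```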

  23. (A) Linear Growth vs. (B) Logarithmic Growth • Logarithmic Growth is SLOWER than Linear Growth • Asymptotically, a Binary Search is FASTER than a Sequential Search, as Linear Time eventually surpasses Logarithmic Time

  24. Big-O represents the Order of Growth for an Algorithm

  25. Growth Rate Functions: • reference functions used to compare the rates of growth of algorithms

  26. BIG-O --- Big-O Notation: Growth Rate Functions • O(1) Constant Time • The algorithm requires the same fixed number of steps regardless of the size of the task: • Push and Pop Stack operations • Insert and Remove Queue operations • Finding the median value in a sorted Array

  27. O(n) Linear Time --- the time grows in direct proportion to n (the size of the task) • The algorithm requires a number of steps proportional to the size of the task • for n = 20: about 20 steps

  28. O(n) Examples: • Traversal of a List • Finding the min or max element in a List • Finding the min or max element in a sequential search of unsorted elements • Traversing a Tree with n nodes • Calculating n-factorial, iteratively • Calculating the nth Fibonacci number, iteratively
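
As a sketch of one of these O(n) examples, the nth Fibonacci number computed iteratively (class name illustrative):

```java
public class IterativeFibonacci {
    // A single loop that runs n times: O(n) time, O(1) space.
    // Contrast with the naive recursive version, which is exponential.
    public static long fibonacci(int n) {
        long prev = 0; // F(0)
        long curr = 1; // F(1)
        for (int i = 0; i < n; i++) {
            long next = prev + curr;
            prev = curr;
            curr = next;
        }
        return prev; // F(n)
    }
}
```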

  29. O(n^2) Quadratic Time • The number of operations is proportional to the size of the task SQUARED • for n = 20: about 400 steps

  30. O(n^2) Examples: • Simplistic sorting algorithms such as a Selection Sort of n elements • Comparing 2 two-dimensional arrays (matrices) of size n by n • Finding duplicates in an unsorted list of n elements (implemented with 2 nested loops)
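
The last example, finding duplicates with 2 nested loops, might look like this in Java (a minimal sketch; names are illustrative):

```java
public class FindDuplicates {
    // Two nested loops compare every pair of elements:
    // n(n-1)/2 comparisons in the worst case, so O(n^2).
    public static boolean hasDuplicates(int[] a) {
        for (int i = 0; i < a.length; i++) {
            for (int j = i + 1; j < a.length; j++) {
                if (a[i] == a[j]) {
                    return true; // a pair of equal elements found
                }
            }
        }
        return false; // every pair checked, no duplicates
    }
}
```

The nested loop over an outer loop of size n is the telltale sign of quadratic growth.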

  31. O(log n) Logarithmic Time --- the log of n is significantly lower than n • for n = 20: about 4.3 steps (log2 20) • Used in many “divide and conquer” algorithms, like binary search, and is the basis for using binary search trees and heaps

  32. O(log n) • for example, a binary search tree of one million elements would take, at most, 20 steps to locate a target • Binary Search in a Sorted list of n elements • Insert and Find Operations for a Binary Search Tree with n nodes • Insert and Find Operations for a Heap with n nodes

  33. O(n log n) “n log n” Time • for n = 20: about 86.4 steps (20 × log2 20) • More advanced sorting algorithms like Quicksort and Mergesort (which will be discussed in detail in the next chapter)

  34. O(a^n) (a > 1) Exponential Time • for n = 20 with a = 2: 2^20 ≈ 1,000,000 steps • Recursive Fibonacci implementation (a > 3/2) • Towers of Hanoi (a = 2) • Generating all permutations of n symbols
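
The recursive Fibonacci implementation listed above, sketched in Java (class name illustrative):

```java
public class RecursiveFibonacci {
    // Each call spawns two more calls, so the total number of calls
    // grows exponentially with n (lower-bounded by roughly (3/2)^n).
    public static long fibonacci(int n) {
        if (n < 2) {
            return n; // base cases: F(0) = 0, F(1) = 1
        }
        return fibonacci(n - 1) + fibonacci(n - 2);
    }
}
```

Compare this with the iterative O(n) version on slide 28: both compute the same values, but this one redoes the same subproblems over and over.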

  35. The best time in the preceding growth rate functions is constant time, O(1) • The worst time in the preceding growth rate functions is exponential time, O(a^n), which quickly overwhelms even the fastest computers, even for a relatively small n

  36. Polynomial Growth: linear, quadratic, cubic… The highest power of n dominates the polynomial (see Ill Lam SDT p.42 Top) • Polynomial growth is considered manageable as compared to exponential growth

  37. [Graph: running time t vs. task size n for Constant O(1), Log n, Linear, N log n, Quadratic and Exponential growth]

  38. Log n has a slower asymptotic growth rate than linear growth: a thousandfold increase in the size of the task, n, results in only a fixed, moderate increase in the number of operations required.

  39. For a given ALGORITHM, you can see how it falls on the following grid to determine its “Order of Growth” or time efficiency. • Use this grid as a “Rule of Thumb” when evaluating the BIG-O of an algorithm.

  40. Let this grid help narrow down possible solutions, but make sure you “memorize” the other charts and use them when attacking an order of growth problem • BIG-O Analysis handout has an example where the “rule of thumb” will result in an incorrect assumption

  41. BIG-O --- Big-O Notation: Rule of Thumb

  ALGORITHM        | SINGLE LOOP | NESTED LOOP(S)
  -----------------|-------------|------------------
  STRAIGHTFORWARD  | LINEAR      | QUADRATIC
  PROCESSING       | O(N)        | O(N^2), O(N^3)…
  (SEQUENTIAL)     |             |
  -----------------|-------------|------------------
  DIVIDE AND       | LOGARITHMIC | N LOG N
  CONQUER          | O(LOG N)    | O(N LOG N)
  PROCESSING       |             |

  42. Sample Test Times; N = 50,000

  FUNCTION             | RUNNING TIME                                | EXAMPLE ALGORITHMS
  ---------------------|---------------------------------------------|-------------------------------------------------
  Log N (Log Time)     | 15.6                                        | Binary Search
  N (Linear Time)      | 50,000                                      | Linked List or Tree Traversal, Sequential Search
  N Log N              | 780,482                                     | Quicksort, Mergesort
  N^2 (Quadratic Time) | 2.5 * 10^9                                  | Selection Sort, matrices, 2 nested loops
  a^N (Exponential)    | astronomically large (2^50,000 ≈ 10^15,051) | recursion, Fibonacci, permutations

  43. REVIEW 3 EXAMPLES IN THE HANDOUT: Lambert p.40-41 Examples 1.7, 1.8 & 1.9

  44. PROJECTS: • BIG-O Exercises 1 through 5 • Workbook problems 12 through 20 • Multiple Choice Problems

  45. TEST IS THE DAY AFTER THE LABS ARE DUE
