
Algorithm Design and Analysis (ADA)



Presentation Transcript


  1. Algorithm Design and Analysis (ADA), 242-535, Semester 1, 2013-2014 • Part 4: Divide and Conquer • Objective: look at several divide and conquer examples (merge sort, binary search), and three approaches for calculating their running time

  2. Overview • Divide and Conquer • A Faster Sort: merge sort • The Iteration Method • Recursion Trees • Merge Sort vs Insertion Sort • Binary Search • Recursion Tree Examples • Iteration Method Examples • The Master Method

  3. 1. Divide and Conquer • Divide the problem into subproblems • Conquer the subproblems by solving them recursively • Combine the subproblem solutions. (A small worked sketch follows below.)
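The three steps can be seen in a small, complete example that is not on the slides: finding the maximum of an array by divide and conquer. The class name MaxDC and the sample values are mine; this is an illustrative sketch, not part of the lecture.

    public class MaxDC {
        // Returns the largest value in A[left..right].
        static int max(int[] A, int left, int right) {
            if (left == right)                      // base case: a single element
                return A[left];
            int mid = (left + right) / 2;           // Divide: split the range in half
            int maxLeft  = max(A, left, mid);       // Conquer: solve the left half
            int maxRight = max(A, mid + 1, right);  // Conquer: solve the right half
            return Math.max(maxLeft, maxRight);     // Combine: take the larger result
        }

        public static void main(String[] args) {
            int[] A = {20, 13, 7, 2, 12, 11, 9, 1};
            System.out.println(max(A, 0, A.length - 1));  // prints 20
        }
    }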

  4. 2. A Faster Sort: Merge Sort
  MERGESORT(A, left, right)
      if left < right                      // if left >= right, do nothing
          mid := floor((left + right) / 2)
          MergeSort(A, left, mid)
          MergeSort(A, mid+1, right)
          Merge(A, left, mid, right)
      return
  Initial call: MergeSort(A, 1, n)
  (A Java rendering of this pseudocode follows below.)
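A minimal Java sketch of the pseudocode above. It assumes 0-based indices (the pseudocode is 1-based) and relies on the merge() method shown on slide 10.

    // Sorts A[left..right] in place; relies on merge() from slide 10.
    static void mergeSort(int[] A, int left, int right) {
        if (left < right) {                 // if left >= right, do nothing
            int mid = (left + right) / 2;   // floor((left + right) / 2)
            mergeSort(A, left, mid);        // sort the left half
            mergeSort(A, mid + 1, right);   // sort the right half
            merge(A, left, mid, right);     // merge the two sorted halves
        }
    }
    // Initial call: mergeSort(A, 0, A.length - 1);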

  5. A faster sort: MergeSort (diagram). Input A[1..n] is split into A[1..mid] and A[mid+1..n]; MERGESORT is applied to each half, giving sorted A[1..mid] and sorted A[mid+1..n]; MERGE combines them into the sorted output.

  6. Tracing MergeSort() (diagram: the array is recursively split into halves, then the halves are merged back together).

  7. Merging two sorted arrays. Repeatedly compare the front elements of the two arrays and output the smaller one; in the diagram the sorted arrays [2, 7, 13, 20] and [1, 9, 11, 12] are merged, producing 1 2 7 9 11 12 ... Time = one pass through each array = O(n) to merge a total of n elements (linear time).

  8. Analysis of Merge Sort (statement / effort), as shown on the previous slides:
    MergeSort(A, left, right)              -- T(n)
      if (left < right) {                  -- O(1)
        mid = floor((left + right) / 2);   -- O(1)
        MergeSort(A, left, mid);           -- T(n/2)
        MergeSort(A, mid+1, right);        -- T(n/2)
        Merge(A, left, mid, right);        -- O(n)
      }

  9. merge() Code • merge(A, left, mid, right) • Merges two adjacent sorted subranges of an array A • left == the index of the first element of the first range • mid == the index of the last element of the first range • right == the index of the last element of the second range

  10. void merge(int[] A, int left, int mid, int right) {
        int[] temp = new int[right - left + 1];
        int aIdx = left;       // index into the first (left) range
        int bIdx = mid + 1;    // index into the second (right) range
        for (int i = 0; i < temp.length; i++) {
            if (aIdx > mid)
                temp[i] = A[bIdx++];          // 1st range exhausted: copy 2nd range
            else if (bIdx > right)
                temp[i] = A[aIdx++];          // 2nd range exhausted: copy 1st range
            else if (A[aIdx] <= A[bIdx])
                temp[i] = A[aIdx++];          // take the smaller front element
            else
                temp[i] = A[bIdx++];
        }
        // copy back into A
        for (int j = 0; j < temp.length; j++)
            A[left + j] = temp[j];
    }
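A small test driver, not on the slides, assuming mergeSort() (sketched after slide 4) and merge() above are defined in the same class:

    public static void main(String[] args) {
        int[] A = {20, 13, 7, 2, 12, 11, 9, 1};           // sample data (the values from the merge example)
        mergeSort(A, 0, A.length - 1);                    // sort the whole array
        System.out.println(java.util.Arrays.toString(A)); // [1, 2, 7, 9, 11, 12, 13, 20]
    }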

  11. 3. The Iteration Method • Up to now, we have been solving recurrences using the Iteration method • Write T() as a recursive equation using big-Oh • Convert T() equation into algebra (replace O()'s) • Expand the recurrence • Rewrite the recursion into a summation • Convert algebra back to O()

  12. MergeSort Running Time • Recursive T() equation: • T(1) = O(1) • T(n) = 2T(n/2) + O(n), for n > 1 • Convert to algebra • T(1) = a • T(n) = 2T(n/2) + cn

  13. Recurrence for Merge Sort • The expression T(n) = 2T(n/2) + cn (with T(1) = a), from the previous slide, is called a recurrence. • A recurrence is an equation that describes a function in terms of its value on smaller inputs.

  14. T(n) = 2T(n/2) + cn
          = 2(2T(n/2/2) + c(n/2)) + cn
          = 2^2 T(n/2^2) + cn(2/2) + cn
          = 2^2 T(n/2^2) + cn(2/2 + 1)
          = 2^2 (2T(n/2^2/2) + c(n/2^2)) + cn(2/2 + 1)
          = 2^3 T(n/2^3) + cn(2^2/2^2) + cn(2/2 + 1)
          = 2^3 T(n/2^3) + cn(2^2/2^2 + 2/2 + 1)
          ...
          = 2^k T(n/2^k) + cn(2^(k-1)/2^(k-1) + 2^(k-2)/2^(k-2) + ... + 2^2/2^2 + 2/2 + 1)

  15. So we have:
  T(n) = 2^k T(n/2^k) + cn(2^(k-1)/2^(k-1) + ... + 2^2/2^2 + 2/2 + 1)   // k-1 fraction terms, each equal to 1
  For k = log2 n: n = 2^k, so the T() argument becomes 1, and
  T(n) = 2^k T(1) + cn((k-1) + 1) = na + cn log2 n = O(n) + O(n log2 n) = O(n log2 n)
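The same derivation written out in standard notation (a restatement of slides 14 and 15, not new content):

    \begin{align*}
    T(n) &= 2T(n/2) + cn \\
         &= 4T(n/4) + 2cn \\
         &= 8T(n/8) + 3cn \\
         &\;\;\vdots \\
         &= 2^{k} T(n/2^{k}) + k\,cn .
    \end{align*}
    Setting $k = \log_2 n$ (so that $n/2^{k} = 1$) gives
    $T(n) = n\,T(1) + cn\log_2 n = an + cn\log_2 n = O(n\log_2 n)$.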

  16. 4. Recursion Trees • A graphical technique for finding a big-oh solution to a recurrence • Draw a tree of recursive function calls • Each tree node gets assigned the big-oh work done during its call to the function. • The big-oh equation is the sum of work at all the nodes in the tree.

  17. MergeSort Recursion Tree Solve T(n) = 2T(n/2) + cn, where c > 0 is constant. • We usually omit stating the base case because our algorithms always run in time O(1) when n is a small constant.

  18.–20. MergeSort Recursion Tree. Solve T(n) = 2T(n/2) + cn, where c > 0 is constant. (The tree is built up over three slides: first the root T(n); then a root costing cn with two children T(n/2); then a second level costing cn/2 + cn/2 with four T(n/4) children.)

  21. MergeSort Recursion Tree (completed). Solve T(n) = 2T(n/2) + cn, where c > 0 is constant. Every level of the tree sums to cn (cn at the root, cn/2 + cn/2 at the next level, four cn/4's at the next, ...). The height is h = log n, and there are n leaves, each costing O(1), i.e. O(n) in total. Total = cn log n + O(n) = O(n log n).
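Summing the tree level by level, in symbols (a restatement of the slide's picture, not new content):

    \[
    T(n) \;=\; \underbrace{cn + cn + \cdots + cn}_{\log_2 n\ \text{levels}}
          \;+\; \underbrace{n \cdot O(1)}_{n\ \text{leaves}}
    \;=\; cn\log_2 n + O(n) \;=\; O(n\log n).
    \]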

  22. Height and No. of Leaves • Node values going down the tree: n → n/2 → n/4 → ... → 1 • So n(1/2)^h = 1, where h = height • n = 2^h (take logs of both sides) • log2 n = h • No. of nodes per level: 1 → 2 → 2^2 → ... → no. of leaves • no. of leaves = 2^h = 2^(log2 n) = n^(log2 2) = n^1 = n (after h halving steps; why? see the identity on the next slide)

  23. Logarithm Equalities (the slide's formulas were images and did not survive the transcript; the identity it justifies is sketched below).
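A hedged reconstruction of the identity presumably shown here, which is the one the previous slide relies on:

    \[
    a^{\log_b c} = c^{\log_b a}
    \qquad\text{so}\qquad
    2^{h} = 2^{\log_2 n} = n^{\log_2 2} = n^{1} = n .
    \]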

  24. 5. Merge Sort vs Insertion Sort • O(n lg n) grows more slowly than O(n^2). • In other words, merge sort is asymptotically faster than insertion sort in the worst case. • In practice, merge sort beats insertion sort for n > 30 or so.

  25. Timing Comparisons • Running time estimates: • A laptop executes 10^8 compares/second. • A supercomputer executes 10^12 compares/second. • Lesson 1: good algorithms are better than supercomputers.
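A concrete illustration of the lesson (the input size n = 10^9 and the roughly n^2 vs n log2 n compare counts are my assumptions, not from the slide):

    \[
    n = 10^{9}:\qquad
    \frac{n^2}{10^{8}\ \text{compares/s}} = 10^{10}\ \text{s} \approx 300\ \text{years (insertion sort, laptop)},
    \qquad
    \frac{n\log_2 n}{10^{8}} \approx 300\ \text{s (merge sort, laptop)}.
    \]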

  26. 6. Binary Search • Binary Search from part 3 is a divide and conquer algorithm. • Find an element in a sorted array: • Divide: check the middle element. • Conquer: recursively search one subarray. • Combine: easy; return the index. • Example: find 9 in 3 5 7 8 9 12 15

  27.–31. Binary Search (continued). The slide is repeated as the search steps through 3 5 7 8 9 12 15: each step checks the middle element of the current range and recurses into one half until 9 is found.

  32. Binary Search Code (again)
    int binSrch(char A[], int i, int j, char key) {
        int k;
        if (i > j)                 /* key not found */
            return -1;
        k = (i + j) / 2;
        if (key == A[k])           /* key found */
            return k;
        if (key < A[k])
            j = k - 1;             /* search the left half */
        else
            i = k + 1;             /* search the right half */
        return binSrch(A, i, j, key);
    }

  33. Running Time (again). n == the size of the range of the array being looked at. • Using big-oh: • Basis: T(1) = O(1) • Induction: T(n) = O(1) + T(n/2), for n > 1 (the half searched has at most floor(n/2) elements) • As algebra: • Basis: T(1) = a • Induction: T(n) = c + T(n/2), for n > 1 • The running time for binary search is O(log2 n).
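Solving the algebraic recurrence by the iteration method, the same steps as for merge sort:

    \begin{align*}
    T(n) &= c + T(n/2) \\
         &= 2c + T(n/4) \\
         &= 3c + T(n/8) \\
         &\;\;\vdots \\
         &= kc + T(n/2^{k}).
    \end{align*}
    For $k = \log_2 n$ the argument reaches 1, so
    $T(n) = c\log_2 n + T(1) = c\log_2 n + a = O(\log_2 n)$.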

  34. Recurrence for Binary Search: T(n) = 1·T(n/2) + O(1), where 1 is the number of subproblems, n/2 is the subproblem size, and O(1) is the work of dividing and combining.

  35. BS Recursion tree Solve T(n) = T(n/2) + c, where c > 0 is constant. • We usually don't bother with the base case because our algorithms always run in time O(1) when n is a small constant.

  36.–38. BS Recursion Tree. Solve T(n) = T(n/2) + c, where c > 0 is constant. (The tree is built up over three slides: first the root T(n); then a root costing c with one child T(n/2); then another level costing c with child T(n/4).)

  39. BS Recursion Tree (completed). Solve T(n) = T(n/2) + c, where c > 0 is constant. The tree is a single chain: every level costs c, the height is h = log2 n, and the single leaf costs a (the base case). Total = c log2 n + a = O(log2 n).

  40. Two recurrences so far • Merge Sort: T(n) = 2T(n/2) + O(n) = O(n log n) • Binary Search: T(n) = T(n/2) + O(1) = O(log n) • The big-oh running times were calculated in two ways: the iteration method and recursion trees. • Let's do some more examples of both. (A quick numerical check of these two results follows below.)
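A quick numerical sanity check, not from the slides: evaluate the two recurrences directly (taking a = c = 1) and divide by the claimed growth rates; both ratios settle near a constant. The class name RecurrenceCheck is mine.

    public class RecurrenceCheck {
        // T(n) = 2T(n/2) + n, T(1) = 1   (merge sort, with a = c = 1)
        static long mergeSortT(long n) {
            return (n <= 1) ? 1 : 2 * mergeSortT(n / 2) + n;
        }
        // T(n) = T(n/2) + 1, T(1) = 1    (binary search, with a = c = 1)
        static long binSearchT(long n) {
            return (n <= 1) ? 1 : binSearchT(n / 2) + 1;
        }
        public static void main(String[] args) {
            for (long n = 2; n <= 1 << 20; n *= 2) {
                double log2n = Math.log(n) / Math.log(2);
                System.out.printf("n=%8d  T_ms/(n log n)=%.3f  T_bs/log n=%.3f%n",
                        n, mergeSortT(n) / (n * log2n), binSearchT(n) / log2n);
            }
        }
    }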

  41. 7. Recursion Tree Examples. Example 1: solve T(n) = T(n/4) + T(n/2) + n^2.

  42.–48. Example 1 (tree built up level by level). The root costs n^2 and has children T(n/4) and T(n/2); these expand to nodes costing (n/4)^2 and (n/2)^2 with children T(n/16), T(n/8), T(n/8), T(n/4); and so on down to O(1) leaves. Summing each level: the root level is n^2, the next level is (n/4)^2 + (n/2)^2 = (5/16)n^2, the next is (25/256)n^2, ...

  49. Example 1 (completed). The level sums are n^2, (5/16)n^2, (25/256)n^2, ... so Total = n^2(1 + 5/16 + (5/16)^2 + (5/16)^3 + ...) = (16/11)n^2 = O(n^2), a geometric series (see the next slide).

  50. Geometric Series Reminder: 1 + x + x^2 + ... + x^n = (x^(n+1) - 1)/(x - 1) for x ≠ 1, and 1 + x + x^2 + ... = 1/(1 - x) for |x| < 1.
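Applying the infinite-series formula to Example 1's total (the finite sum over the tree's levels is bounded above by the infinite one):

    \[
    n^2 \sum_{k \ge 0} \left(\tfrac{5}{16}\right)^{k}
    = n^2 \cdot \frac{1}{1 - \tfrac{5}{16}}
    = \tfrac{16}{11}\, n^2
    = O(n^2).
    \]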
