Chapter 3 Ellis Horowitz, Sartaj Sahni



  1. Chapter 3 Ellis Horowitz, Sartaj Sahni Divide-and-Conquer

  2. Divide-and-Conquer The best-known algorithm design strategy: divide an instance of the problem into two or more smaller instances; solve the smaller instances recursively; obtain the solution to the original (larger) instance by combining these solutions

  3. Divide-and-Conquer Technique (cont.) [Diagram] A problem of size n is divided into subproblem 1 of size n/2 and subproblem 2 of size n/2; a solution to subproblem 1 and a solution to subproblem 2 are combined into a solution to the original problem.

  4. Control Abstraction for Divide and Conquer

  Algorithm DAndC(P)
  {
      if Small(P) then return S(P);
      else
      {
          divide P into smaller instances P1, P2, ..., Pk, k >= 1;
          // apply DAndC to each of these sub-problems
          return Combine(DAndC(P1), DAndC(P2), ..., DAndC(Pk));
      }
  }
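
  The control abstraction can be made concrete with a small example. The sketch below (our own, not from the text) instantiates Small(P), S(P), divide, and Combine to find the maximum of an array: Small(P) is a one-element range, S(P) returns that element, and Combine takes the larger of the two sub-results.

```cpp
#include <vector>

// Divide-and-conquer maximum of a[lo..hi], following the
// DAndC control abstraction:
//   Small(P): a single element; S(P): return it.
//   divide:   split the range at its midpoint.
//   Combine:  the larger of the two sub-results.
int dandcMax(const std::vector<int>& a, int lo, int hi) {
    if (lo == hi) return a[lo];               // Small(P) -> S(P)
    int mid = (lo + hi) / 2;                  // divide
    int left  = dandcMax(a, lo, mid);         // conquer P1
    int right = dandcMax(a, mid + 1, hi);     // conquer P2
    return left > right ? left : right;       // Combine
}
```

  Note how the two sub-problems are of the same type as the original, which is what makes the general recurrence on the next slide applicable.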

  5. The General Divide-and-Conquer Recurrence
  • For DAndC algorithms that produce sub-problems of the same type
  • In general, a problem instance of size n can be divided into b instances of size n/b, with a of them needing to be solved. Here a and b are constants; a ≥ 1 and b > 1.
  • We get the following recurrence for the running time T(n):
      T(n) = T(1)              for n = 1
      T(n) = a T(n/b) + f(n)   for n > 1, where n is a power of b, i.e. n = b^k
  • f(n) accounts for the time spent on dividing the problem into smaller ones and on combining their solutions.

  6. Solving a Recurrence Using the Substitution Method (Example 3.2)
  Given the general recurrence for DAndC
      T(n) = a T(n/b) + f(n)   for n > 1
      T(n) = T(1)              for n = 1
  • Let a = 2, b = 2, T(1) = 2 and f(n) = n
      T(n) = 2T(n/2) + n
           = 2[2T(n/4) + n/2] + n = 4T(n/4) + 2n
           = 4[2T(n/8) + n/4] + 2n = 8T(n/8) + 3n
      after k substitutions
           = 2^k T(n/2^k) + kn,   where log2 n ≥ k ≥ 1
      for k = log2 n, i.e. n = 2^k:
      T(n) = n T(n/n) + n log2 n = n T(1) + n log2 n
           = 2n + n log2 n        (since T(1) = 2)
  Therefore T(n) ∈ Θ(n log n)
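
  The closed form derived above can be checked numerically: the sketch below (helper names are ours, not the text's) evaluates the recurrence T(1) = 2, T(n) = 2T(n/2) + n directly and compares it with 2n + n log2 n for powers of 2.

```cpp
// Evaluate the recurrence T(1) = 2, T(n) = 2*T(n/2) + n
// for n a power of 2.
long long Trec(long long n) {
    if (n == 1) return 2;
    return 2 * Trec(n / 2) + n;
}

// The closed form derived by substitution: 2n + n*log2(n).
long long closedForm(long long n) {
    long long k = 0;
    for (long long m = n; m > 1; m /= 2) ++k;  // k = log2 n
    return 2 * n + n * k;
}
```

  For example, T(8) = 2T(4) + 8 = 40 and the closed form gives 2·8 + 8·3 = 40 as well.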

  7. Solve using the substitution method
  • T(n) = T(n/2) + cn   for n > 1,   T(n) = T(1)   for n = 1
      T(n) = T(n/2) + cn
           = [T(n/4) + cn/2] + cn = T(n/4) + [1/2 + 1] cn
           = [T(n/8) + cn/4] + [1/2 + 1] cn = T(n/8) + [1/4 + 1/2 + 1] cn
      after k substitutions
           = T(n/2^k) + [1/2^(k-1) + 1/2^(k-2) + ... + 1/2 + 1] cn,   where log2 n ≥ k ≥ 1
           = T(n/2^k) + [1 + 2 + ... + 2^(k-2) + 2^(k-1)] cn / 2^(k-1)
           = T(n/2^k) + [2^k - 1] cn / 2^(k-1)
      for k = log2 n (i.e. n = 2^k):
      T(n) = T(n/n) + [n - 1] 2cn / n
           = T(1) + 2cn - 2c
           = c1 + 2cn - 2c        (assuming T(1) = c1)
  Therefore T(n) ∈ Θ(n)

  8. Solving a Recurrence Using Another Method
  Given the recurrence
      T(n) = T(1)              for n = 1
      T(n) = a T(n/b) + f(n)   for n > 1
  solving the above using the substitution method gives
      T(n) = n^(log_b a) [T(1) + u(n)]        (1)
  where u(n) is determined by h(n) = f(n) / n^(log_b a).
  Using Table 1, the value of u(n) for various h(n) can be obtained.

  9. Example 3.3
  Consider the following recurrence, when n is a power of 2:
      T(n) = T(1)   for n = 1,   T(n) = T(n/2) + c   for n > 1
  Here a = 1, b = 2 and f(n) = c.
      h(n) = f(n) / n^(log_b a) = c / n^(log2 1) = c   (since log2 1 = 0, n^(log2 1) = 1)
  which can be written as c (log n)^0, therefore h(n) ∈ Θ((log n)^0).
  Referring to Table 1 for u(n): u(n) ∈ Θ(log n).
  Using T(n) = n^(log_b a) [T(1) + u(n)] = n^(log2 1) [c + Θ(log n)] = Θ(log n) + c
  Therefore T(n) ∈ Θ(log n)

  10. Example 3.4
  Consider the following recurrence, when n is a power of 2:
      T(n) = T(1)   for n = 1,   T(n) = 2T(n/2) + cn   for n > 1
  Here a = 2, b = 2 and f(n) = cn.
      h(n) = f(n) / n^(log_b a) = cn / n^(log2 2) = cn / n = c   (since log2 2 = 1, n^(log2 2) = n)
  which can be written as c (log n)^0, therefore h(n) ∈ Θ((log n)^0).
  Referring to Table 1 for u(n): u(n) ∈ Θ(log n).
  Using T(n) = n^(log_b a) [T(1) + u(n)] = n [c + Θ(log n)]
  Therefore T(n) ∈ Θ(n log n)

  11. Example 3.5
  Consider the following recurrence, when n is a power of 2:
      T(n) = 7T(n/2) + 18n^2   for n ≥ 2
  Here a = 7, b = 2 and f(n) = 18n^2.
      h(n) = f(n) / n^(log_b a) = 18n^2 / n^(log2 7) = 18 n^(2 - log2 7) = O(n^r)
  where r = 2 - log2 7 < 0   (since log2 7 ≈ 2.81).
  Referring to Table 1 for u(n): u(n) ∈ O(1).
  Using T(n) = n^(log_b a) [T(1) + u(n)] = n^(log2 7) [c + O(1)]
  Therefore T(n) ∈ Θ(n^(log2 7))

  12. Example 3.6
  Consider the following recurrence, when n is a power of 3:
      T(n) = 9T(n/3) + 4n^6   for n ≥ 3
  Here a = 9, b = 3 and f(n) = 4n^6.
      h(n) = f(n) / n^(log_b a) = 4n^6 / n^(log3 9) = 4n^6 / n^2 = 4n^4 = Ω(n^4)   (log3 9 = 2)
  Referring to Table 1 for u(n): u(n) ∈ Θ(h(n)) = Θ(n^4).
  Using T(n) = n^(log_b a) [T(1) + u(n)] = n^(log3 9) [c + Θ(n^4)] = n^2 [c + Θ(n^4)]
  Therefore T(n) ∈ Θ(n^6)

  13. Example 3.a
  Consider the following recurrence, when n is a power of 2:
      T(n) = T(n/2) + cn   for n ≥ 2
  Here a = 1, b = 2 and f(n) = cn.
      h(n) = f(n) / n^(log_b a) = cn / n^(log2 1) = cn = Ω(n)   (log2 1 = 0)
  Referring to Table 1 for u(n): u(n) ∈ Θ(h(n)) = Θ(n).
  Using T(n) = n^(log_b a) [T(1) + u(n)] = n^(log2 1) [c + Θ(n)] = [c + Θ(n)]
  Therefore T(n) ∈ Θ(n)

  14. Divide-and-Conquer Examples • Sorting: mergesort and quicksort • Binary tree traversals • Binary search (?) • Defective Chess Board

  15. Binary Search Very efficient algorithm for searching in sorted array: K vs A[0] . . . A[m] . . . A[n-1] If K = A[m], stop (successful search); otherwise, continue searching by the same method in A[0..m-1] if K < A[m] and in A[m+1..n-1] if K > A[m]

  16. Recursive Binary Search

  int BinSrch(Type a[], int l, int r, Type x)
  // Given an array a[l:r] of elements in non-decreasing
  // order, determine whether x is present, and
  // if so, return j such that x == a[j]; else return 0.
  {
      if (l == r) { // If Small(P)
          if (x == a[l]) return l;
          else return 0;
      }
      else { // Reduce P into a smaller sub-problem.
          int mid = (l + r) / 2;
          if (x == a[mid]) return mid;
          else if (x < a[mid]) return BinSrch(a, l, mid-1, x);
          else return BinSrch(a, mid+1, r, x);
      }
  }

  17. Non-Recursive Binary Search

  int BinSearch(Type a[], int n, Type x)
  // Given an array a[1:n] of elements in non-decreasing
  // order, n >= 0, determine whether x is present, and
  // if so, return j such that x == a[j]; else return 0.
  {
      int low = 1, high = n;
      while (low <= high) {
          int mid = (low + high) / 2;
          if (x < a[mid]) high = mid - 1;
          else if (x > a[mid]) low = mid + 1;
          else return mid;
      }
      return 0;
  }

  18. Binary Search - Three Examples
  a:        [1]  [2]  [3]  [4]  [5]  [6]  [7]  [8]  [9]  [10]  [11]  [12]  [13]  [14]
  elements: -15  -6   0    7    9    23   54   82   101  112   125   131   142   151

  19. Binary Decision Tree
  • a:           [1]  [2]  [3]  [4]  [5]  [6]  [7]  [8]  [9]  [10]  [11]  [12]  [13]  [14]
  • Elements:    -15  -6   0    7    9    23   54   82   101  112   125   131   142   151
  • Comparisons:  3    4    2    4    3    4    1    4    3    4     2     4     3     4
  • Average number of comparisons for a successful search: 45/14 ≈ 3.21
  • For an unsuccessful search: (3 + 14*4)/15 ≈ 3.93
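
  The comparison counts above can be reproduced by instrumenting the non-recursive search: charging one comparison per loop iteration (i.e., per decision-tree node visited) and summing over all 14 keys gives the 45 used in the average. This is our own checking sketch, not code from the text.

```cpp
#include <vector>

// Decision-tree depth of a successful search for x in the
// 1-indexed sorted array a[1:n]: one comparison is charged
// per loop iteration, matching the counts on the slide.
int comparisons(const std::vector<int>& a, int n, int x) {
    int low = 1, high = n, count = 0;
    while (low <= high) {
        int mid = (low + high) / 2;
        ++count;                        // one tree node visited
        if (x < a[mid]) high = mid - 1;
        else if (x > a[mid]) low = mid + 1;
        else return count;
    }
    return -1;                          // not found
}

// Sum of the comparison counts over all 14 keys of the slide's array.
int totalComparisons() {
    std::vector<int> a = {0, -15, -6, 0, 7, 9, 23, 54, 82,
                          101, 112, 125, 131, 142, 151}; // a[0] unused
    int total = 0;
    for (int i = 1; i <= 14; ++i) total += comparisons(a, 14, a[i]);
    return total;
}
```

  The middle element a[7] = 54 takes 1 comparison, the elements at depth 2 take 2, and so on, giving 45 in total.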

  20. Time Analysis of Binary Search
      T(n) = a            for n = 1   (a a constant)
      T(n) = T(n/2) + c   for n > 1   (c a constant)
  • Let a = 1, b = 2, T(1) = a and f(n) = c
      T(n) = T(n/2) + c
           = [T(n/4) + c] + c = T(n/4) + 2c
           = [T(n/8) + c] + 2c = T(n/8) + 3c
      after k substitutions
           = T(n/2^k) + kc
           = T(1) + kc,        where n = 2^k
           = a + c log2 n,     where k = log2 n
      T(n) ∈ Θ(log n)

  21. Time Analysis of Binary Search
  If n is in the range [2^(k-1), 2^k), then BinSearch makes at most k element comparisons for a successful search and either k-1 or k comparisons for an unsuccessful search.
  Computing time of BinSearch:
  • Successful searches: best Θ(1), average Θ(log n), worst Θ(log n)
  • Unsuccessful searches: best, average, worst Θ(log n)
  This is VERY fast: e.g., C_worst(10^6) = 20.
  Optimal for searching a sorted array.
  A bad (degenerate) example of divide-and-conquer.

  22. Binary Search Using One Comparison Per Cycle

  int BinSearch1(Type a[], int n, Type x)
  // Same specifications as BinSearch except n > 0.
  {
      int low = 1, high = n + 1; // high is one more than possible.
      while (low < (high - 1)) {
          int mid = (low + high) / 2;
          // Only one comparison in the loop.
          if (x < a[mid]) high = mid;
          else low = mid;          // x >= a[mid]
      }
      if (x == a[low]) return low; // x is present.
      else return 0;               // x is not present.
  }

  23. Comparison between BinSearch and BinSearch1
  • BinSearch1 is better than BinSearch in the average case
  • BinSearch1 is worse than BinSearch in the best case
  • best case of BinSearch: Θ(1)
  • best case of BinSearch1: Θ(log n)
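
  The best-case trade-off can be demonstrated by counting element comparisons in instrumented versions of the two searches (our own sketch; the counter placement is an assumption, charging each two-way test separately). BinSearch can stop at the very first probe, while BinSearch1 always runs its loop to the end before its single equality test.

```cpp
#include <vector>

// BinSearch-style loop: up to two element comparisons per iteration,
// but it can return as soon as the key is hit.
int binSearchCmps(const std::vector<int>& a, int n, int x) {
    int low = 1, high = n, c = 0;
    while (low <= high) {
        int mid = (low + high) / 2;
        ++c; if (x < a[mid]) { high = mid - 1; continue; }
        ++c; if (x > a[mid]) { low = mid + 1; continue; }
        return c;                  // found after c comparisons
    }
    return c;
}

// BinSearch1-style loop: exactly one comparison per iteration,
// plus one final equality test.
int binSearch1Cmps(const std::vector<int>& a, int n, int x) {
    int low = 1, high = n + 1, c = 0;
    while (low < high - 1) {
        int mid = (low + high) / 2;
        ++c;                       // the only comparison in the loop
        if (x < a[mid]) high = mid; else low = mid;
    }
    ++c;                           // final equality test
    return c;
}

// Best case for both: searching for the middle key of a[1:15] = 1..15.
int bestCaseBinSearch() {
    std::vector<int> a(16);
    for (int i = 1; i <= 15; ++i) a[i] = i;
    return binSearchCmps(a, 15, 8);
}
int bestCaseBinSearch1() {
    std::vector<int> a(16);
    for (int i = 1; i <= 15; ++i) a[i] = i;
    return binSearch1Cmps(a, 15, 8);
}
```

  Searching for the middle key of 15 elements, BinSearch stops after its first probe while BinSearch1 still halves the range down to a single candidate.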

  24. Mergesort
  • Split array A[0..n-1] into two about-equal halves and make copies of each half in arrays B and C
  • Sort arrays B and C recursively
  • Merge sorted arrays B and C into array A as follows:
    • Repeat the following until no elements remain in one of the arrays: compare the first elements in the remaining unprocessed portions of the arrays, and copy the smaller of the two into A, incrementing the index indicating the unprocessed portion of that array
    • Once all elements in one of the arrays are processed, copy the remaining unprocessed elements from the other array into A

  25. Merge Sort

  void MergeSort(int low, int high)
  // a[low : high] is a global array to be sorted.
  // Small(P) is true if there is only one element to
  // sort. In this case the list is already sorted.
  {
      if (low < high) { // If there is more than one element
          // Divide P into sub-problems: find where to split the set.
          int mid = (low + high) / 2;
          // Solve the sub-problems.
          MergeSort(low, mid);
          MergeSort(mid + 1, high);
          // Combine the solutions.
          Merge(low, mid, high);
      }
  }

  26. Merge

  void Merge(int low, int mid, int high)
  // a[low:high] is a global array containing two sorted
  // subsets in a[low:mid] and in a[mid+1:high]. The goal
  // is to merge these two sets into a single set in
  // a[low:high]. b[] is an auxiliary global array.
  {
      int h = low, i = low, j = mid + 1, k;
      while ((h <= mid) && (j <= high)) {
          if (a[h] <= a[j]) { b[i] = a[h]; h++; }
          else              { b[i] = a[j]; j++; }
          i++;
      }
      if (h > mid)
          for (k = j; k <= high; k++) { b[i] = a[k]; i++; }
      else
          for (k = h; k <= mid; k++) { b[i] = a[k]; i++; }
      for (k = low; k <= high; k++) a[k] = b[k];
  }

  27. Merge Sort Example 3.8 • a[1:10]=(310, 285, 179, 652, 351, 423, 861, 254, 450, 520) • Hand simulation • (310 285 179 652 351 | 423 861 254 450 520) split • (310 285 179 | 652 351 | 423 861 254 450 520) split • (310 285 | 179 | 652 351 | 423 861 254 450 520) split • (310 | 285 | 179 | 652 351 | 423 861 254 450 520) split • (285 310 | 179 | 652 351 | 423 861 254 450 520) merge • (179 285 310 | 652 351 | 423 861 254 450 520) merge • (179 285 310 | 652 | 351 | 423 861 254 450 520) split • (179 285 310 | 351 652 | 423 861 254 450 520) merge • (179 285 310 351 652 | 423 861 254 450 520) merge • (179 285 310 351 652 | 423 861 254 | 450 520) split • ……

  28. Tree of Calls of MergeSort • a[1:10]=(310, 285, 179, 652, 351, 423, 861, 254, 450, 520)

  29. Tree of Calls of Merge • a[1:10]=(310, 285, 179, 652, 351, 423, 861, 254, 450, 520)

  30. Time Complexity of Merge Sort
      T(n) = a             for n = 1   (a a constant)
      T(n) = 2T(n/2) + cn  for n > 1   (c a constant)
  • Let a = 2, b = 2, T(1) = a and f(n) = cn
      T(n) = 2T(n/2) + cn
           = 2[2T(n/4) + cn/2] + cn = 4T(n/4) + 2cn
           = 4[2T(n/8) + cn/4] + 2cn = 8T(n/8) + 3cn
      after k substitutions
           = 2^k T(n/2^k) + kcn
           = n T(1) + kcn,        where n = 2^k
           = an + cn log2 n,      where k = log2 n
      T(n) ∈ O(n log n)

  31. Analysis of Mergesort • All cases have same efficiency: Θ(n log n) • Space requirement: Θ(n) (not in-place) • Can be implemented without recursion (bottom-up)
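
  The bottom-up (non-recursive) implementation mentioned above can be sketched as follows. This is our own sketch, using 0-indexed arrays and std::inplace_merge in place of the text's Merge routine: runs of width 1, 2, 4, ... are merged pairwise until a single run covers the array.

```cpp
#include <algorithm>
#include <vector>

// Bottom-up mergesort: no recursion, just log2(n) passes,
// each merging adjacent sorted runs of the current width.
void bottomUpMergeSort(std::vector<int>& a) {
    int n = static_cast<int>(a.size());
    for (int width = 1; width < n; width *= 2) {
        for (int lo = 0; lo < n - width; lo += 2 * width) {
            int mid = lo + width;                    // end of left run
            int hi = std::min(lo + 2 * width, n);    // end of right run
            std::inplace_merge(a.begin() + lo, a.begin() + mid,
                               a.begin() + hi);
        }
    }
}

// Demo on the array from Example 3.8.
bool demoSorted() {
    std::vector<int> v{310, 285, 179, 652, 351, 423, 861, 254, 450, 520};
    bottomUpMergeSort(v);
    return std::is_sorted(v.begin(), v.end())
        && v.front() == 179 && v.back() == 861;
}
```

  The pass structure mirrors the tree of Merge calls on slide 29, visited level by level from the leaves up.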

  32. Two Inefficiencies of MergeSort
  • Not in place (it uses another array b[])
    • Copying between a[] and b[] is needed
  • Space and time for the stack due to recursion
    • For small set sizes, most of the time is consumed by recursion instead of sorting
  • Improvements
    • Use link[1:n], containing integers in the range [0, n], interpreted as pointers (indices) to elements of a[]
  • Example
      link: [1] [2] [3] [4] [5] [6] [7] [8]
             6   4   7   1   3   0   8   0
    • Q = 2 denotes Q = (2, 4, 1, 6) and a sorted sublist (i.e., a[2] <= a[4] <= a[1] <= a[6])
    • R = 5 denotes R = (5, 3, 7, 8) and a sorted sublist (i.e., a[5] <= a[3] <= a[7] <= a[8])

  33. Improved Merge Sort
  • Uses insertion sort for the small problem (since insertion sort works exceedingly fast on arrays of fewer than, say, 16 elements)

  int MergeSort1(int low, int high)
  // The global array a[low : high] is sorted in
  // non-decreasing order using the auxiliary array
  // link[low:high]. The values in link will represent a
  // list of the indices low through high giving a[] in
  // sorted order. A pointer to the beginning of the list
  // is returned.
  {
      if ((high - low + 1) < 16)
          return InsertionSort1(a, link, low, high);
      else {
          int mid = (low + high) / 2;
          int q = MergeSort1(low, mid);
          int r = MergeSort1(mid + 1, high);
          return Merge1(q, r);
      }
  }

  34. Improved Merge

  int Merge1(int q, int r)
  // q and r are pointers to lists contained in the global
  // array link[0:n]. link[0] is introduced only for
  // convenience and need not be initialized. The lists
  // pointed at by q and r are merged and a pointer to the
  // beginning of the merged list is returned.
  {
      int i = q, j = r, k = 0; // The new list starts at link[0].
      while (i && j) {         // While both lists are nonempty do
          if (a[i] <= a[j]) {  // Find the smaller key.
              // Add a new key to the list.
              link[k] = i; k = i; i = link[i];
          }
          else {
              link[k] = j; k = j; j = link[j];
          }
      }
      if (!i) link[k] = j;
      else link[k] = i;
      return link[0];
  }

  35. Improved Merge Sort

  36. Quicksort
  • Division: partition a[1:n] into two subarrays a[1:m] and a[m+1:n] such that a[i] <= a[j] for all 1 <= i <= m and m+1 <= j <= n
  • No need for merging
  • Each of the two subarrays can be sorted independently

  37. Partition

  int Partition(Type a[], int m, int p)
  // Within a[m], a[m+1], ..., a[p-1] the elements are
  // rearranged in such a manner that if initially
  // t == a[m], then after completion a[q] == t for some q
  // between m and p-1, a[k] <= t for m <= k < q, and
  // a[k] >= t for q < k < p. q is returned.
  {
      Type v = a[m]; int i = m, j = p;
      do {
          do i++; while (a[i] < v);
          do j--; while (a[j] > v);
          if (i < j) Interchange(a, i, j);
      } while (i < j);
      a[m] = a[j]; a[j] = v;
      return j;
  }

  inline void Interchange(Type a[], int i, int j)
  {
      Type p = a[i]; a[i] = a[j]; a[j] = p;
  }

  38. Quick Sort Example
  (i.e., partitioned into [60 45 50 55] 65 [85 80 75 70])
  • Another example: [15 22 13 27 12 10 20 25]
  • Why +∞ at a[n+1]?
  • Explain using the example
       1   2   3   4   5   6   7   8   9   10
     [65  12  15  11  16  20  23  17  21  +∞]

  39. QuickSort

  void QuickSort(int p, int q)
  // Sorts the elements a[p], ..., a[q], which reside in
  // the global array a[1:n], into ascending order;
  // a[n+1] is considered to be defined and must be >=
  // all the elements in a[1:n].
  {
      if (p < q) { // If there is more than one element
          // Divide P into two sub-problems.
          int j = Partition(a, p, q+1);
          // j is the position of the partitioning element.
          // Solve the sub-problems.
          QuickSort(p, j-1);
          QuickSort(j+1, q);
          // There is no need for combining solutions.
      }
  }

  40. Best Case Analysis of Quicksort
  • Best case: all splits happen in the middle - Θ(n log n)
      T(n) = 2T(n/2) + n   for n > 1,   T(1) = 0
      T(n) = 2T(n/2) + n
           = 2[2T(n/4) + n/2] + n = 4T(n/4) + 2n
           = 4[2T(n/8) + n/4] + 2n = 8T(n/8) + 3n
      after k substitutions
           = 2^k T(n/2^k) + kn
           = n T(1) + kn,      where n = 2^k
           = n log2 n,         where k = log2 n
      T(n) ∈ Θ(n log n)

  41. Worst Case Analysis of Quicksort
  • Worst case: an already-sorted array! - Θ(n^2)
      C_worst(n) = (n+1) + n + ... + 3 = (n+1)(n+2)/2 - 3
      C_worst(n) ∈ Θ(n^2)

  42. Average Case Analysis of Quicksort
  • Average case: random arrays - Θ(n log n)
  • C_A(n) is much less than C_worst(n)
  • Under the assumption that the partitioning element v has an equal probability of being the i-th smallest element, the recurrence is given by
      C_A(n) = (n+1) + (1/n) Σ_{1<=k<=n} [C_A(k-1) + C_A(n-k)]   for n > 1,
      C_A(0) = 0 and C_A(1) = 0
  Solving the above gives C_A(n) ∈ Θ(n log n)
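
  The recurrence can be evaluated numerically as a sanity check (our own sketch; by symmetry, the sum of C_A(k-1) + C_A(n-k) over k = 1..n collapses to twice the prefix sum of C_A(0..n-1), which makes a simple dynamic program possible).

```cpp
#include <vector>

// Evaluate C_A(n) = (n+1) + (2/n) * sum_{k=0}^{n-1} C_A(k),
// with C_A(0) = C_A(1) = 0, by dynamic programming.
double averageCaseCA(int n) {
    std::vector<double> c(n + 1, 0.0);
    double prefix = 0.0;                 // running sum of c[0..i-1]
    for (int i = 2; i <= n; ++i) {
        prefix += c[i - 1];
        c[i] = (i + 1) + 2.0 * prefix / i;
    }
    return c[n];
}
```

  Small values can be checked by hand: C_A(2) = 3 + 0 = 3, C_A(3) = 4 + (2/3)·3 = 6, C_A(4) = 5 + (2/4)·9 = 9.5, and the growth stays far below the quadratic worst case.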

  43. Iterative Quick Sort

  void QuickSort2(int p, int q)
  // Sorts the elements in a[p:q].
  {
      Stack<int> stack(SIZE); // SIZE is 2*log(n).
      do {
          while (p < q) {
              int j = Partition(a, p, q+1);
              if ((j - p) < (q - j)) {
                  stack.Add(j+1); stack.Add(q);
                  q = j - 1;
              }
              else {
                  stack.Add(p); stack.Add(j-1);
                  p = j + 1;
              }
          } // Sort the smaller subfile first.
          if (stack.StackEmpty()) return;
          stack.Delete(q); stack.Delete(p);
      } while (1);
  }

  • The smaller of the two sub-arrays is always sorted first by the iterative version
  • Maximum stack depth
    • QuickSort (recursive version): n-1
    • QuickSort2 (iterative version): O(log n)

  44. Performance Measurement • Experiments • Comparison between QuickSort and MergeSort on Sun 10/30 • Data made by random integer generator in the range [0,1000] • Average-case on 50 random inputs

  45. Randomized Quick Sort

  // Aims at selecting a better partitioning element.
  void RQuickSort(int p, int q)
  // Sorts the elements a[p], ..., a[q], which reside in
  // the global array a[1:n], into ascending order. a[n+1]
  // is considered to be defined and must be >= all the
  // elements in a[1:n].
  {
      if (p < q) {
          if ((q - p) > 5)
              Interchange(a, random() % (q-p+1) + p, p);
          int j = Partition(a, p, q+1);
          // j is the position of the partitioning element.
          RQuickSort(p, j-1);
          RQuickSort(j+1, q);
      }
  }

  46. Quicksort Improvements
  • Improvements:
    • better pivot selection: median-of-three partitioning
    • switch to insertion sort on small sub-files
    • elimination of recursion
  These combine to a 20-25% improvement
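
  The first improvement, median-of-three partitioning, can be sketched as follows (our own helper, not code from the text): instead of always taking a[l] as the partitioning element, take the median of the first, middle, and last elements, which avoids the worst case on already-sorted input.

```cpp
// Return the median of three values: the candidate
// partitioning element under median-of-three selection,
// typically applied as medianOfThree(a[l], a[(l+r)/2], a[r]).
int medianOfThree(int x, int y, int z) {
    if ((x <= y && y <= z) || (z <= y && y <= x)) return y;
    if ((y <= x && x <= z) || (z <= x && x <= y)) return x;
    return z;
}
```

  On a sorted sub-file the median of the three samples is the middle element, so the split lands near the center rather than degenerating as on slide 41.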

  47. Tiling a Defective Chessboard [Figure: a real chessboard]

  48. Our Definition of a Chessboard [Figures: 1x1, 2x2, 4x4, 8x8 boards] A chessboard is an n x n grid, where n is a power of 2.

  49. A Defective Chessboard [Figures: 1x1, 2x2, 4x4, 8x8 boards] A defective chessboard is a chessboard that has one unavailable (defective) position.

  50. A Triomino A triomino is an L-shaped object that can cover three squares of a chessboard. A triomino has four orientations.
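
  The tiling itself is a classic divide-and-conquer: split the 2^k x 2^k defective board into four quadrants, place one triomino at the center so that it covers one square in each of the three quadrants that do not contain the defect, and recurse; each quadrant then has exactly one defective (or already-covered) square. The sketch below is our own simplified implementation of this idea.

```cpp
#include <vector>

static int tileId = 0; // next triomino number to place

// Tile the size x size sub-board with top-left corner (top, left),
// whose single defective/covered square is (defRow, defCol).
void tile(std::vector<std::vector<int>>& b, int top, int left,
          int defRow, int defCol, int size) {
    if (size == 1) return;            // single square: it is the defect
    int id = ++tileId, half = size / 2;
    int midR = top + half, midC = left + half;
    // One triomino at the center covers the corner square of each
    // quadrant that does NOT contain the defect.
    if (!(defRow < midR && defCol < midC))  b[midR-1][midC-1] = id;
    if (!(defRow < midR && defCol >= midC)) b[midR-1][midC]   = id;
    if (!(defRow >= midR && defCol < midC)) b[midR][midC-1]   = id;
    if (!(defRow >= midR && defCol >= midC)) b[midR][midC]    = id;
    // Recurse into each quadrant with its real or induced defect.
    bool tl = defRow < midR && defCol < midC;
    bool tr = defRow < midR && defCol >= midC;
    bool bl = defRow >= midR && defCol < midC;
    bool br = defRow >= midR && defCol >= midC;
    tile(b, top, left, tl ? defRow : midR-1, tl ? defCol : midC-1, half);
    tile(b, top, midC, tr ? defRow : midR-1, tr ? defCol : midC, half);
    tile(b, midR, left, bl ? defRow : midR, bl ? defCol : midC-1, half);
    tile(b, midR, midC, br ? defRow : midR, br ? defCol : midC, half);
}

// Tile an 8x8 board with the defect at (0,0) and verify that all
// 63 remaining squares are covered using 21 triominoes.
bool tileDemo() {
    int n = 8;
    std::vector<std::vector<int>> b(n, std::vector<int>(n, 0));
    b[0][0] = -1;                     // mark the defective square
    tileId = 0;
    tile(b, 0, 0, 0, 0, n);
    int covered = 0;
    for (int r = 0; r < n; ++r)
        for (int c = 0; c < n; ++c)
            if (b[r][c] > 0) ++covered;
    return covered == n * n - 1 && tileId == (n * n - 1) / 3;
}
```

  Since each call with size > 1 places exactly one triomino, a 2^k x 2^k board uses (4^k - 1)/3 triominoes, covering every square except the defect.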
