
RAIK 283: Data Structures & Algorithms

Presentation Transcript


  1. RAIK 283: Data Structures & Algorithms Divide and Conquer (I) Dr. Ying Lu ylu@cse.unl.edu Design and Analysis of Algorithms – Chapter 5

  2. RAIK 283: Data Structures & Algorithms • Giving credit where credit is due: • Most of the lecture notes are based on the slides from the Textbook’s companion website • http://www.aw-bc.com/info/levitin • Some examples and slides are based on lecture notes created by Dr. Ben Choi, Louisiana Technical University and Dr. Chuck Cusack, Hope College • I have modified many of their slides and added new slides. Design and Analysis of Algorithms – Chapter 5

  3. Divide and Conquer The most well-known algorithm design strategy: • Divide instance of problem into two or more smaller instances • Solve smaller instances recursively • Obtain solution to original (larger) instance by combining these solutions Design and Analysis of Algorithms – Chapter 5
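
A minimal sketch of the three steps in Java, using the array-maximum problem purely as a toy illustration (this example is not from the slides):

    // Toy divide-and-conquer example: maximum of A[lo..hi].
    static int maxDC(int[] A, int lo, int hi) {
        if (lo == hi) return A[lo];               // base case: a single element
        int mid = (lo + hi) / 2;                  // divide into two smaller instances
        int leftMax  = maxDC(A, lo, mid);         // solve the left half recursively
        int rightMax = maxDC(A, mid + 1, hi);     // solve the right half recursively
        return Math.max(leftMax, rightMax);       // combine the two solutions
    }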

  4. Divide-and-conquer Technique [Diagram: a problem of size n is divided into subproblem 1 of size n/2 and subproblem 2 of size n/2; a solution to each subproblem is obtained, and the two are combined into a solution to the original problem] Design and Analysis of Algorithms – Chapter 5

  5. Divide and Conquer Examples • Sorting: mergesort and quicksort • Tree traversals • Matrix multiplication: Strassen’s algorithm • Closest pair problem • Convex hull: QuickHull algorithm Design and Analysis of Algorithms – Chapter 5

  6. Review Mergesort Design and Analysis of Algorithms – Chapter 5

  7. Mergesort Algorithm: • Split array A[1..n] in two and make copies of each half in arrays B[1..⌈n/2⌉] and C[1..⌊n/2⌋] • Sort arrays B and C • Merge sorted arrays B and C into array A Design and Analysis of Algorithms – Chapter 5

  8. (first  last)2 last first Sort recursively by Mergesort Sort recursively by Mergesort Sorted Sorted Merge Sorted Using Divide and Conquer: Mergesort • Mergesort Strategy Design and Analysis of Algorithms – Chapter 5

  9. Mergesort Algorithm: • Split array A[1..n] in two and make copies of each half in arrays B[1..⌈n/2⌉] and C[1..⌊n/2⌋] • Sort arrays B and C • Merge sorted arrays B and C into array A Design and Analysis of Algorithms – Chapter 5

  10. Mergesort Algorithm: • Split array A[1..n] in two and make copies of each half in arrays B[1..⌈n/2⌉] and C[1..⌊n/2⌋] • Sort arrays B and C • Merge sorted arrays B and C into array A as follows: • Repeat the following until no elements remain in one of the arrays: • compare the first elements in the remaining unprocessed portions of the arrays • copy the smaller of the two into A, while incrementing the index indicating the unprocessed portion of that array • Once all elements in one of the arrays are processed, copy the remaining unprocessed elements from the other array into A. Design and Analysis of Algorithms – Chapter 5

  11. Algorithm: Mergesort Input: Array E and indices first and last, such that the elements E[i] are defined for first ≤ i ≤ last. Output: E[first], …, E[last] is a sorted rearrangement of the same elements void mergeSort(Element[] E, int first, int last) { if (first < last) { int mid = (first+last)/2; mergeSort(E, first, mid); mergeSort(E, mid+1, last); merge(E, first, mid, last); } return; } Design and Analysis of Algorithms – Chapter 5
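
A runnable Java rendering of the pseudocode above, with a merge helper that follows the element-by-element copying described on slide 10. The use of an int array and of Arrays.copyOfRange for the temporary halves B and C are assumptions made for this sketch, not part of the slides:

    import java.util.Arrays;

    // Sorts E[first..last] in place by recursive mergesort.
    static void mergeSort(int[] E, int first, int last) {
        if (first < last) {
            int mid = (first + last) / 2;
            mergeSort(E, first, mid);         // sort the left half
            mergeSort(E, mid + 1, last);      // sort the right half
            merge(E, first, mid, last);       // merge the two sorted halves
        }
    }

    // Merges the sorted runs E[first..mid] and E[mid+1..last] back into E.
    static void merge(int[] E, int first, int mid, int last) {
        int[] B = Arrays.copyOfRange(E, first, mid + 1);        // copy of the left run
        int[] C = Arrays.copyOfRange(E, mid + 1, last + 1);     // copy of the right run
        int i = 0, j = 0, k = first;
        while (i < B.length && j < C.length)                    // copy the smaller front element
            E[k++] = (B[i] <= C[j]) ? B[i++] : C[j++];
        while (i < B.length) E[k++] = B[i++];                   // remaining left elements
        while (j < C.length) E[k++] = C[j++];                   // remaining right elements
    }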

  12. In-Class Exercise • P175 Problem 5.1.6 Apply Mergesort to sort the list E, X, A, M, P, L, E in alphabetical order. Design and Analysis of Algorithms – Chapter 5

  13. Algorithm: Mergesort void mergeSort(Element[] E, int first, int last) { if (first < last) { int mid = (first+last)/2; mergeSort(E, first, mid); mergeSort(E, mid+1, last); merge(E, first, mid, last); } return; } Time Efficiency Design and Analysis of Algorithms – Chapter 5

  14. Mergesort Complexity • Mergesort always partitions the array equally. • Thus, the recursive depth is always Θ(lg n) • The amount of work done at each level is Θ(n) • Intuitively, the complexity should be Θ(n lg n) • We have, • T(n) = 2T(n/2) + Θ(n) for n > 1, T(1) = 0, whose solution is Θ(n lg n) • The amount of extra memory used is Θ(n) Design and Analysis of Algorithms – Chapter 5
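
Unrolling the recurrence makes the Θ(n lg n) bound explicit (a standard derivation, assuming n is a power of 2 and writing the per-level work as cn):

    \begin{aligned}
    T(n) &= 2T(n/2) + cn = 4T(n/4) + 2cn = \cdots = 2^k T(n/2^k) + k\,cn \\
         &= n\,T(1) + cn \lg n = cn \lg n \in \Theta(n \lg n) \qquad (k = \lg n,\ T(1) = 0).
    \end{aligned}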

  15. Efficiency of Mergesort • Number of comparisons is close to theoretical minimum for comparison-based sorting: • ⌈lg n!⌉ ≈ n lg n − 1.44n • All cases have same efficiency: Θ(n log n) • Space requirement: Θ(n) (NOT in-place) • Can be implemented without recursion (bottom-up) Design and Analysis of Algorithms – Chapter 5
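
A sketch of the bottom-up (non-recursive) variant mentioned in the last bullet, assuming the merge helper shown after slide 11; runs of width 1, 2, 4, ... are merged in successive passes:

    // Bottom-up mergesort: repeatedly merge adjacent runs of doubling width.
    static void mergeSortBottomUp(int[] E) {
        int n = E.length;
        for (int width = 1; width < n; width *= 2) {
            for (int first = 0; first + width < n; first += 2 * width) {
                int mid  = first + width - 1;                      // end of the left run
                int last = Math.min(first + 2 * width - 1, n - 1); // end of the right run
                merge(E, first, mid, last);
            }
        }
    }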

  16. Animation http://math.hws.edu/TMCM/java/xSortLab/index.html Design and Analysis of Algorithms – Chapter 5

  17. The master theorem • Given: a divide and conquer algorithm • Then, the Master Theorem gives us a cookbook for the algorithm’s running time: Design and Analysis of Algorithms – Chapter 5

  18. A general divide-and-conquer recurrence T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^d), T(1) = c • a < b^d: T(n) ∈ Θ(n^d) • a = b^d: T(n) ∈ Θ(n^d lg n) • a > b^d: T(n) ∈ Θ(n^(log_b a)) Please refer to Appendix B for the proof Design and Analysis of Algorithms – Chapter 5
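
As a quick check, applying this cookbook to the mergesort recurrence from slide 14 (a worked example, not on the slide):

    T(n) = 2T(n/2) + \Theta(n): \quad a = 2,\ b = 2,\ d = 1,\ a = b^d
    \;\Rightarrow\; T(n) \in \Theta(n^d \lg n) = \Theta(n \lg n).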

  19. In-class exercise • Page 175: Problem 5 • 5. Find the order of growth for solutions of the following recurrences. • a. T(n) = 4T(n/2) + n, T(1) = 1 • b. T(n) = 4T(n/2) + n^2, T(1) = 1 • c. T(n) = 4T(n/2) + n^3, T(1) = 1 Design and Analysis of Algorithms – Chapter 5

  20. In-Class Exercise • 5.1.9 Let A[0..n-1] be an array of n distinct real numbers. A pair (A[i], A[j]) is said to be an inversion if these numbers are out of order, i.e., i < j but A[i] > A[j]. Design an O(n log n) algorithm for counting the number of inversions. Design and Analysis of Algorithms – Chapter 5

  21. Review Quicksort Design and Analysis of Algorithms – Chapter 5

  22. p A[i]≤p A[i]p Quicksort by Hoare (1962) • Select a pivot (partitioning element) • Rearrange the list so that all the elements in the positions before the pivot are smaller than or equal to the pivot and those after the pivot are larger than or equal to the pivot • Exchange the pivot with the last element in the first (i.e., ≤) sublist – the pivot is now in its final position • Sort the two sublists recursively Design and Analysis of Algorithms – Chapter 5

  23. p A[i]≤p A[i]p Quicksort by Hoare (1962) • Select a pivot (partitioning element) • Rearrange the list so that all the elements in the positions before the pivot are smaller than or equal to the pivot and those after the pivot are larger than or equal to the pivot • Exchange the pivot with the last element in the first (i.e., ≤) sublist – the pivot is now in its final position • Sort the two sublists recursively • Apply quicksort to sort the list 7 2 9 10 5 4 Design and Analysis of Algorithms – Chapter 5

  24. Quicksort Example • Recursive implementation with the leftmost array entry selected as the pivot element. Design and Analysis of Algorithms – Chapter 5

  25. Quicksort Animated Example: http://math.hws.edu/TMCM/java/xSortLab/index.html Design and Analysis of Algorithms – Chapter 5

  26. Quicksort Algorithm • Input: • Array E and indices first and last, s.t. elements E[i] are defined for first ≤ i ≤ last • Output: • E[first], …, E[last] is a sorted rearrangement of the array • void quickSort(Element[] E, int first, int last) { if (first < last) { Element pivotElement = E[first]; int splitPoint = partition(E, pivotElement, first, last); quickSort(E, first, splitPoint - 1); quickSort(E, splitPoint + 1, last); } return; } Design and Analysis of Algorithms – Chapter 5
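
A runnable Java sketch of the pseudocode above. The partition helper is not shown on the slide; this version follows the strategy on slide 22 (scan from both ends, swap out-of-place pairs, then exchange the pivot with the last element of the ≤ sublist) and assumes the pivot passed in is E[first], as the quickSort code guarantees:

    static void quickSort(int[] E, int first, int last) {
        if (first < last) {
            int pivotElement = E[first];                               // leftmost entry as pivot
            int splitPoint = partition(E, pivotElement, first, last);
            quickSort(E, first, splitPoint - 1);                       // sort the <= sublist
            quickSort(E, splitPoint + 1, last);                        // sort the >= sublist
        }
    }

    // Partitions E[first..last] around pivot (== E[first]); returns the pivot's final index.
    static int partition(int[] E, int pivot, int first, int last) {
        int i = first, j = last + 1;
        do {
            do { i++; } while (i <= last && E[i] < pivot);      // scan right for an element >= pivot
            do { j--; } while (E[j] > pivot);                   // scan left for an element <= pivot
            if (i < j) { int t = E[i]; E[i] = E[j]; E[j] = t; } // swap the out-of-place pair
        } while (i < j);
        E[first] = E[j];   // move the split-point element into the pivot's old slot
        E[j] = pivot;      // the pivot is now in its final position
        return j;
    }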

  27. Quicksort Analysis • Partition can be done in O(n) time, where n is the size of the array • Let T(n) be the number of comparisons required by Quicksort • If the pivot ends up at position k, then we have • T(n) = T(n−k) + T(k−1) + n • To determine best-, worst-, and average-case complexity we need to determine the values of k that correspond to these cases. Design and Analysis of Algorithms – Chapter 5

  28. Quicksort Analysis • Partition can be done in O(n) time, where n is the size of the array • Let T(n) be the number of comparisons required by Quicksort • If the pivot ends up at position k, then we have • T(n) = T(n−k) + T(k−1) + n • To determine best-, worst-, and average-case complexity we need to determine the values of k that correspond to these cases. Design and Analysis of Algorithms – Chapter 5

  29. Best-Case Complexity • The best case is clearly when the pivot always partitions the array equally. • Intuitively, this would lead to a recursive depth of at most lg n calls • We can actually prove this. In this case • T(n) ≤ T(n/2) + T(n/2) + n ∈ Θ(n lg n) Design and Analysis of Algorithms – Chapter 5

  30. Worst-Case and Average-Case Complexity • The worst case is when the pivot always ends up in the first or last element. That is, it partitions the array as unequally as possible. • In this case • T(n) = ? Design and Analysis of Algorithms – Chapter 5

  31. Worst-Case and Average-Case Complexity • The worst case is when the pivot always ends up in the first or last element. That is, it partitions the array as unequally as possible. • In this case • T(n) = T(n−1) + T(1−1) + n = T(n−1) + n = n + (n−1) + … + 1 = n(n+1)/2 ∈ Θ(n²) • Average case is rather complex, but is where the algorithm earns its name. The bottom line is: Design and Analysis of Algorithms – Chapter 5

  32. Summary of Quicksort • Best case: split in the middle — Θ(n log n) • Worst case: sorted array! — Θ(n²) • Average case: random arrays — Θ(n log n) Design and Analysis of Algorithms – Chapter 5

  33. Summary of Quicksort • Best case: split in the middle — Θ(n log n) • Worst case: sorted array! — Θ(n²) • Average case: random arrays — Θ(n log n) • Memory requirement? Design and Analysis of Algorithms – Chapter 5

  34. Summary of Quicksort • Best case: split in the middle — Θ(n log n) • Worst case: sorted array! — Θ(n²) • Average case: random arrays — Θ(n log n) • Memory requirement? • In-place sorting algorithm • Considered the method of choice for internal sorting of large files (n ≥ 10000) Design and Analysis of Algorithms – Chapter 5

  35. Summary of Quicksort • Best case: split in the middle — Θ(n log n) • Worst case: sorted array! — Θ(n²) • Average case: random arrays — Θ(n log n) • Considered the method of choice for internal sorting of large files (n ≥ 10000) • Improvements: • better pivot selection: median-of-three partitioning avoids the worst case on sorted files • switch to insertion sort on small subfiles • elimination of recursion • together these give a 20-25% improvement Design and Analysis of Algorithms – Chapter 5
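
A sketch of two of these improvements (median-of-three pivot selection and an insertion-sort cutoff), reusing the partition helper shown after slide 26; the cutoff value 10 and the helper names are assumptions chosen for illustration:

    static final int CUTOFF = 10;   // assumed threshold for switching to insertion sort

    static void quickSortImproved(int[] E, int first, int last) {
        if (last - first + 1 <= CUTOFF) {
            insertionSort(E, first, last);              // small subfiles: insertion sort
        } else {
            medianOfThree(E, first, last);              // better pivot avoids the sorted worst case
            int splitPoint = partition(E, E[first], first, last);
            quickSortImproved(E, first, splitPoint - 1);
            quickSortImproved(E, splitPoint + 1, last);
        }
    }

    // Moves the median of E[first], E[mid], E[last] into E[first] so it becomes the pivot.
    static void medianOfThree(int[] E, int first, int last) {
        int mid = (first + last) / 2;
        if (E[mid]  < E[first]) swap(E, mid, first);
        if (E[last] < E[first]) swap(E, last, first);
        if (E[last] < E[mid])   swap(E, last, mid);   // now E[first] <= E[mid] <= E[last]
        swap(E, first, mid);                          // place the median in the pivot slot
    }

    static void insertionSort(int[] E, int first, int last) {
        for (int i = first + 1; i <= last; i++) {
            int v = E[i], j = i - 1;
            while (j >= first && E[j] > v) { E[j + 1] = E[j]; j--; }
            E[j + 1] = v;
        }
    }

    static void swap(int[] E, int a, int b) { int t = E[a]; E[a] = E[b]; E[b] = t; }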

  36. In-class exercises Design and Analysis of Algorithms – Chapter 5
