
Design and Analysis of Algorithms Quicksort






Presentation Transcript


  1. Design and Analysis of Algorithms: Quicksort. Haidong Xue, Summer 2012, at GSU

  2. Review of insertion sort and merge sort • Insertion sort • Algorithm • Worst case number of comparisons = O(?) • Merge sort • Algorithm • Worst case number of comparisons = O(?)
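For reference on the "?"s above: insertion sort needs O(n^2) comparisons in the worst case, merge sort O(n lg n). A minimal Java sketch of insertion sort with a comparison counter (class and method names are mine, not from the slides):

```java
// Insertion sort with a counter for key comparisons.
// Worst case (reverse-sorted input): n(n-1)/2 comparisons = O(n^2).
public class InsertionSortDemo {
    public static long sort(int[] a) {
        long comparisons = 0;
        for (int i = 1; i < a.length; i++) {
            int key = a[i];
            int j = i - 1;
            // Shift elements larger than key one slot to the right.
            while (j >= 0) {
                comparisons++;
                if (a[j] <= key) break;   // key belongs right after a[j]
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = key;
        }
        return comparisons;
    }
}
```

On a reverse-sorted array of length 5 this performs 1+2+3+4 = 10 comparisons, matching n(n-1)/2.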

  3. Sorting Algorithms

  4. Quicksort Algorithm • Input: A[1, …, n] • Output: A[1, …, n], where A[1]<=A[2]<=…<=A[n] • Quicksort: • if(n<=1) return; • Choose the pivot p = A[n] • Put all elements less than p on the left; put all elements larger than p on the right; put p in the middle. (Partition) • Quicksort(the array on the left of p) • Quicksort(the array on the right of p)

  5. Quicksort Algorithm • Quicksort example: start with 2 8 7 1 3 5 6 4; partitioning on the pivot 4 gives 2 1 3 4 7 5 6 8, with the current pivot 4 in its final position. The left subarray 2 1 3 and the right subarray 7 5 6 8 are then quicksorted recursively, and previous pivots never move again.

  6. Quicksort Algorithm • More detail about partition • Input: A[1, …, n] (p=A[n]) • Output: A[1, …, k-1, k, k+1, …, n], where A[1, …, k-1] < A[k], A[k+1, …, n] > A[k], and A[k]=p • Partition: • t = the tail of the smaller array (initially 0) • for i = 1 to n-1 { if(A[i]<p) { exchange A[t+1] with A[i]; t = t+1; } } • exchange A[t+1] with A[n]; return t+1;
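The partition described above can be sketched in Java (0-based indices, Lomuto-style with the slide's `tail` pointer; the class name is mine):

```java
// Lomuto-style partition: the pivot is the last element a[hi].
// Returns the pivot's final index; on return, everything left of
// that index is < pivot and everything right of it is >= pivot.
public class PartitionDemo {
    public static int partition(int[] a, int lo, int hi) {
        int pivot = a[hi];
        int tail = lo - 1;               // end of the "smaller than pivot" region
        for (int i = lo; i < hi; i++) {
            if (a[i] < pivot) {
                tail++;                  // grow the smaller region by one slot
                int t = a[tail]; a[tail] = a[i]; a[i] = t;
            }
        }
        // Final step: put the pivot between the two regions.
        int t = a[tail + 1]; a[tail + 1] = a[hi]; a[hi] = t;
        return tail + 1;
    }
}
```

On the deck's example array 2 8 7 1 3 5 6 4 this produces 2 1 3 4 7 5 6 8 with the pivot 4 landing at index 3.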

  7. Quicksort Algorithm • Partition example (pivot p = 4, tail starts at 0): 2 8 7 1 3 5 6 4 • i=1: 2<4, exchange 2 with A[tail+1] (itself), tail=1 • i=2: 8, do nothing • i=3: 7, do nothing

  8. Quicksort Algorithm • Partition example, continued • i=4: 1<4, exchange 1 with A[tail+1]=8, tail=2: 2 1 7 8 3 5 6 4 • i=5: 3<4, exchange 3 with A[tail+1]=7, tail=3: 2 1 3 8 7 5 6 4 • i=6: 5, do nothing • i=7: 6, do nothing • Final step: exchange A[n]=4 with A[tail+1]=8: 2 1 3 4 7 5 6 8

  9. Quicksort Algorithm The final version of quicksort: Quicksort(A, p, r){ if(p<r){ //subarrays of size <= 1 are already sorted q = partition(A, p, r); //smaller ones on the left; //larger ones on the right Quicksort(A, p, q-1); Quicksort(A, q+1, r); } }
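This final pseudocode maps directly to Java; a minimal, self-contained sketch (0-based indices instead of the slides' 1-based ones; the class name is mine):

```java
public class QuicksortDemo {
    // Quicksort(A, p, r): sort a[p..r] in place.
    public static void quicksort(int[] a, int p, int r) {
        if (p < r) {                      // subarrays of size <= 1 are sorted
            int q = partition(a, p, r);   // smaller ones left, larger ones right
            quicksort(a, p, q - 1);
            quicksort(a, q + 1, r);
        }
    }

    // Lomuto partition with pivot a[r], as on slide 6.
    private static int partition(int[] a, int p, int r) {
        int pivot = a[r];
        int tail = p - 1;
        for (int i = p; i < r; i++) {
            if (a[i] < pivot) {
                tail++;
                int t = a[tail]; a[tail] = a[i]; a[i] = t;
            }
        }
        int t = a[tail + 1]; a[tail + 1] = a[r]; a[r] = t;
        return tail + 1;
    }
}
```

Calling quicksort(a, 0, a.length - 1) on 2 8 7 1 3 5 6 4 sorts it into 1 2 3 4 5 6 7 8.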

  10. Analysis of Quicksort Time complexity • Worst case: O(n^2) • Expected: O(n lg n) Space complexity, i.e. extra memory • 0 extra elements are copied, i.e. O(1) (ignoring the recursion stack)

  11. Analysis of Quicksort Worst case • The most unbalanced partition: the pivot is the smallest or largest element, splitting n elements into subproblems of size n-1 and 0, so T(n) = T(n-1) + Θ(n) • Is it the worst case? • Strict proof that this gives T(n) = Θ(n^2) Expected time complexity: O(n lg n) • Strict proof

  12. Strict proof of the worst case time complexity Proof that T(n) = T(n-1) + Θ(n) implies T(n) = O(n^2), by induction: • When n=1, T(1) = Θ(1) <= c·1^2 for a large enough constant c • Hypothesis: T(k) <= c·k^2 when k < n • Induction statement: T(n) <= c·n^2

  13. Strict proof of the worst case time complexity Since T(n) = T(n-1) + Θ(n) <= c(n-1)^2 + dn = c·n^2 - (2c-d)n + c <= c·n^2 whenever c >= d and n >= 1, we get T(n) = O(n^2); the recurrence also sums to at least 1+2+…+n = Ω(n^2), so T(n) = Θ(n^2).

  14. Strict proof of the expected time complexity • Given A[1, …, n], considering the sorted order of its elements, the chance for 2 elements, the i-th and j-th smallest (i < j), to be compared is 2/(j-i+1). • The total expected number of comparisons is calculated as: the sum over i=1..n-1 and j=i+1..n of 2/(j-i+1) = O(n lg n)
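The double sum in this analysis can be checked numerically: the probability that the i-th and j-th smallest elements are compared is 2/(j-i+1), and summing over all pairs grows like 2n ln n. A small sketch (class and method names are mine):

```java
public class ExpectedComparisons {
    // E[total comparisons] = sum over all pairs i < j of 2/(j-i+1).
    public static double expected(int n) {
        double total = 0;
        for (int i = 1; i < n; i++)
            for (int j = i + 1; j <= n; j++)
                total += 2.0 / (j - i + 1);
        return total;
    }
}
```

For example, expected(2) = 1 (two elements are always compared exactly once), and expected(3) = 2 + 2/3: each adjacent pair is always compared, while the min and max are compared only when one of them is the first pivot.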

  15. Java implementation of Quicksort • Professional programmers DO NOT implement sorting algorithms by themselves • We do it for practicing algorithm implementation techniques and understanding those algorithms • Code quicksort • Sort.java // the abstract class • Quicksort_Haydon.java // my quicksort implementation • SortingTester.java // correctness tester

  16. Design and Analysis of Algorithms: Review on time complexity, “in place”, “stable”. Haidong Xue, Summer 2012, at GSU

  17. Time complexity of algorithms • Execution time? • Pros: easy to obtain; Cons: not accurate • Number of instructions? • Pros: very accurate; Cons: calculation is not straightforward • Number of certain operations? • Pros: easy to calculate, generally accurate; Cons: not always very accurate Asymptotic notations Pros? Cons?

  18. Time complexity of algorithms In the worst case, for a loop of the form for(int i=0; i<A.length; i++){ if(A[i]==key) return i; } return -1; with C1: cost of “assign a value”, C2: cost of “<”, C3: cost of “increment by 1”, C4: cost of “==”, C5: cost of “return”: Worst case T(n) = C1 + (n+1)·C2 + n·C3 + n·C4 + C5 = Θ(n) Assuming C4 >> C1, C2, C3, C5: worst case T(n) ≈ n·C4 = Θ(n)
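The code being costed on this slide is not fully recoverable from the transcript; it appears to be a linear search, so here is a plausible reconstruction (a sketch, not the original):

```java
public class LinearSearchDemo {
    // Returns the index of key in a, or -1 if absent.
    // Worst case (key absent): the "==" test runs n times,
    // so T(n) = C1 + (n+1)*C2 + n*C3 + n*C4 + C5 = Theta(n).
    public static int search(int[] a, int key) {
        for (int i = 0; i < a.length; i++) {  // C1 once; C2, C3 per iteration
            if (a[i] == key) return i;        // C4 per iteration; C5 once
        }
        return -1;                            // C5
    }
}
```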

  19. “in place” and “stable” in sorting algorithms • In place: only O(1) extra memory is used • Stable: equal elements keep their original relative order • Example: sorting 2 3a 3b 1 3c 5 (the letters mark the three equal 3s) • Stable: 1 2 3a 3b 3c 5 • Not stable: e.g. 1 2 3b 3a 3c 5

  20. Design and Analysis of Algorithms: Heapsort. Haidong Xue, Summer 2012, at GSU

  21. Max-Heap • Based on a complete binary tree: every level is completely filled, except possibly the last, which is filled from left to right

  22. Max-Heap • Satisfies the max-heap property: parent >= children 16 14 10 8 7 9 3 2 Since it is a complete tree, it can be put into an array without losing its structure information.

  23. Max-Heap Stored in an array: indices 1 2 3 4 5 6 7 8 hold 16 14 10 8 7 9 3 2

  24. Max-Heap • Use an array as a heap For the element at index i: Parent index = parent(i) = floor(i/2) Left child index = left(i) = 2*i Right child index = right(i) = 2*i + 1 Last non-leaf node index = floor(length/2) Example with the heap 16 14 10 8 7 9 3 2: for i=3, parent = floor(1.5) = 1, left = 2*i = 6, right = 2*i+1 = 7; last non-leaf node = floor(8/2) = 4

  25. Max-Heapify • Input: A complete binary tree rooted at i, whose left and right subtrees are max-heaps; the last node index • Output: A max-heap rooted at i. • Algorithm: 1. Choose the largest node among node i, left(i), right(i). 2. if(the largest node is not i){ • Exchange i with the largest node • Max-Heapify the affected subtree }
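Slides 24 and 25 translate directly to Java; a sketch using the slides' 1-based indexing (slot 0 of the array is unused; names are mine):

```java
public class MaxHeapifyDemo {
    static int parent(int i) { return i / 2; }     // floor(i/2)
    static int left(int i)   { return 2 * i; }
    static int right(int i)  { return 2 * i + 1; }

    // Restore the max-heap property at node i, assuming the subtrees
    // rooted at left(i) and right(i) are already max-heaps.
    // n is the index of the last node currently in the heap.
    public static void maxHeapify(int[] a, int i, int n) {
        int l = left(i), r = right(i);
        int largest = i;
        if (l <= n && a[l] > a[largest]) largest = l;
        if (r <= n && a[r] > a[largest]) largest = r;
        if (largest != i) {
            int t = a[i]; a[i] = a[largest]; a[largest] = t;
            maxHeapify(a, largest, n);   // fix the affected subtree
        }
    }
}
```

On the slide-26 example, a root value 2 above the max-heap children 16 and 10 sinks down to a leaf, yielding 16 14 10 8 7 9 3 2.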

  26. Max-Heapify Example Start with 2 at the root over the max-heap children 16 and 10: 2 16 10 14 7 9 3 8; the 2 sinks to a leaf, yielding 16 14 10 8 7 9 3 2.

  27. Heapsort for a heap • Input: a heap A • Output: a sorted array A Algorithm: 1. Last node index l = A’s last node index 2. From the last element to the second{ exchange (current, root); l--; Max-Heapify(A, root, l); }

  28. Heapsort example 16 14 10 3 9 8 7 2

  29. Heapsort example 14 8 10 3 9 2 7 16

  30. Heapsort example 10 8 9 14 3 2 7 16

  31. Heapsort example 9 8 3 14 10 2 7 16

  32. Heapsort example 8 7 3 14 10 2 9 16

  33. Heapsort example 7 2 3 14 10 8 9 16

  34. Heapsort example 3 2 7 14 10 8 9 16

  35. Array -> Max-Heap • Input: an array A • Output: a max-heap A Algorithm: Considering A as a complete binary tree, from the last non-leaf node down to the first one{ Max-Heapify(A, current index, last index); }

  36. Build Heap Example Starting from the array 7 10 8 16 14 and heapifying each non-leaf node from the last one up to the root produces the max-heap 16 14 8 10 7.

  37. Heapsort • Input: array A • Output: sorted array A Algorithm: 1. Build-Max-Heap(A) 2. Last node index l = A’s last node index 3. From the last element to the second{ exchange (current, root); l--; Max-Heapify(A, root, l); } Let’s try it

  38. Analysis of Heapsort • Input: array A • Output: sorted array A Algorithm: 1. Build-Max-Heap(A): O(n) (O(nlgn) is a looser valid bound) 2. Last node index l = A’s last node index: O(1) 3. From the last element to the second{ exchange (current, root); l--; Max-Heapify(A, root, l); }: the loop runs n-1 times and each Max-Heapify costs O(lgn), so O(nlgn) Total: O(nlgn)
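Putting slides 35 and 37 together, a self-contained Java heapsort might look as follows (1-based storage with slot 0 unused; the class name is mine):

```java
public class HeapsortDemo {
    // Sort a[1..n] in place (slot 0 is unused padding).
    public static void heapsort(int[] a) {
        int n = a.length - 1;
        // Build-Max-Heap: heapify from the last non-leaf down to the root. O(n)
        for (int i = n / 2; i >= 1; i--) maxHeapify(a, i, n);
        // Repeatedly move the max to the end and shrink the heap. O(n lg n)
        for (int last = n; last >= 2; last--) {
            int t = a[1]; a[1] = a[last]; a[last] = t;
            maxHeapify(a, 1, last - 1);
        }
    }

    // Same Max-Heapify as on slide 25.
    private static void maxHeapify(int[] a, int i, int n) {
        int l = 2 * i, r = 2 * i + 1, largest = i;
        if (l <= n && a[l] > a[largest]) largest = l;
        if (r <= n && a[r] > a[largest]) largest = r;
        if (largest != i) {
            int t = a[i]; a[i] = a[largest]; a[largest] = t;
            maxHeapify(a, largest, n);
        }
    }
}
```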

  39. Design and Analysis of Algorithms: Non-comparison sort (sorting in linear time). Haidong Xue, Summer 2012, at GSU

  40. Comparison based sorting • Algorithms that determine sorted order based only on comparisons between the input elements What is the lower bound?

  41. Lower bounds for comparison based sorting For an n-element array, how many possible input orderings are there? Factorial of n: n! Any comparison sort corresponds to a binary decision tree: each internal node is a comparison (“<?”) with Yes/No branches, and each leaf is one output ordering, so the tree must have at least n! leaves. What is the shortest tree that can have n! leaves? A perfect binary tree of height lg(n!). As a result, the worst-case number of comparisons is at least lg(n!) = Ω(nlgn).
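The height bound lg(n!) can be evaluated directly, which makes the Ω(n lg n) lower bound concrete. A quick numeric sketch (names are mine):

```java
public class LowerBoundDemo {
    // lg(n!) = sum of log2(k) for k = 2..n; any comparison sort
    // needs at least ceil(lg(n!)) comparisons in the worst case.
    public static double lgFactorial(int n) {
        double s = 0;
        for (int k = 2; k <= n; k++) s += Math.log(k) / Math.log(2);
        return s;
    }
}
```

For n = 8 there are 8! = 40320 orderings, so at least ceil(lg 40320) = 16 comparisons are needed in the worst case.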

  42. Sorting in linear time • Can we sort an array in linear time? • Yes, but not for free • E.g. sort cards into 13 slots by rank • What if there is more than one element in the same slot?

  43. Counting Sort • Input: array A[1, … , n]; k (elements in A have values from 1 to k) • Output: sorted array A Algorithm: • Create a counter array C[1, …, k] • Create an auxiliary array B[1, …, n] • Scan A once, record element frequency in C • Calculate prefix sum in C • Scan A in the reverse order, copy each element to B at the correct position according to C. • Copy B to A
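The six steps above can be sketched in Java (0-based arrays for A and B, a 1..k counter array; the class name is mine):

```java
public class CountingSortDemo {
    // Sort a, whose values lie in 1..k, stably in O(n + k).
    public static int[] countingSort(int[] a, int k) {
        int[] c = new int[k + 1];          // counter array C[1..k]
        int[] b = new int[a.length];       // auxiliary array B
        for (int x : a) c[x]++;            // record element frequencies
        for (int v = 2; v <= k; v++) c[v] += c[v - 1];  // prefix sums
        // Scan A in reverse order: this keeps the sort stable.
        for (int i = a.length - 1; i >= 0; i--) {
            b[c[a[i]] - 1] = a[i];         // prefix sum gives the position
            c[a[i]]--;
        }
        return b;                          // caller copies B back to A
    }
}
```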

  44. Counting Sort A: 7 3 2 5 3 6 6 2 (k = 7) Counts C: 0 2 2 0 1 2 1 (for values 1..7) Prefix sums C: 0 2 4 4 5 7 8 (position indicators) Scanning A in reverse order and placing each element at position C[value], then decrementing C[value], fills B: 2 2 3 3 5 6 6 7

  45. Analysis of Counting Sort • Input: array A[1, … , n]; k (elements in A have values from 1 to k) • Output: sorted array A Algorithm (time cost per step): • Create a counter array C[1, …, k]: O(k) time, O(k) space • Create an auxiliary array B[1, …, n]: O(n) time, O(n) space • Scan A once, record element frequencies in C: O(n) • Calculate prefix sums in C: O(k) • Scan A in the reverse order, copy each element to B at the correct position according to C: O(n) • Copy B to A: O(n) Time: O(n+k) = O(n) (if k=O(n)) Space: O(n+k) = O(n) (if k=O(n))

  46. Radix-Sort • Input: array A[1, … , n]; d (the number of digits each element has) • Output: sorted array A Algorithm: for each digit, from the least significant to the most significant{ use a stable sort to sort A on that digit } T(n)=O(d(n+k))
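A sketch of this loop, using a stable counting sort on each decimal digit (base 10 is my assumption, so k = 10; names are mine):

```java
public class RadixSortDemo {
    // LSD radix sort on non-negative ints with at most d decimal digits,
    // using a stable counting sort per digit: T(n) = O(d(n + 10)).
    public static void radixSort(int[] a, int d) {
        int div = 1;
        for (int pass = 0; pass < d; pass++, div *= 10) {
            int[] c = new int[10];
            int[] b = new int[a.length];
            for (int x : a) c[(x / div) % 10]++;            // digit frequencies
            for (int v = 1; v < 10; v++) c[v] += c[v - 1];  // prefix sums
            for (int i = a.length - 1; i >= 0; i--) {       // reverse scan = stable
                int digit = (a[i] / div) % 10;
                b[--c[digit]] = a[i];
            }
            System.arraycopy(b, 0, a, 0, a.length);
        }
    }
}
```

Stability of the per-digit sort is what makes this correct: ties on the current digit preserve the order established by the earlier, less significant digits.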

  47. Summary Design strategies: Divide and conquer Employ a special data structure Trade off between time and space

  48. Knowledge tree Algorithms • Analysis: asymptotic notations (O(), o(), Ω(), ω(), Θ()), probabilistic analysis, … • Classic data structures: heap, hashing, binary tree, RBT, … • Algorithms for classic problems: sorting (Quicksort, Heapsort, Mergesort, …), shortest path, matrix multiplication, … • Design: divide & conquer, dynamic programming, greedy, …
