
Theory of Algorithms: Decrease and Conquer




  1. Theory of Algorithms: Decrease and Conquer James Gain and Edwin Blake {jgain | edwin} @cs.uct.ac.za Department of Computer Science University of Cape Town August - October 2004

  2. Objectives • To introduce the decrease-and-conquer mind set • To show a variety of decrease-and-conquer solutions: • Depth-First Graph Traversal • Breadth-First Graph Traversal • Fake-Coin Problem • Interpolation Search • To discuss the strengths and weaknesses of a decrease-and-conquer strategy

  3. Decrease and Conquer • Reduce the problem instance to a smaller instance of the same problem and extend the solution: • Solve the smaller instance • Extend the solution of the smaller instance to obtain a solution to the original problem • Also called inductive or incremental • Unlike divide-and-conquer, the work is not split evenly over subproblems (e.g. Binary Search) [Slide diagram: a problem of size n is reduced to a subproblem of size n-1; a solution to the subproblem is extended to a solution to the original problem]

  4. Flavours of Decrease and Conquer • Decrease by a constant (usually 1): the instance is reduced by the same constant on each iteration • Insertion sort • Graph searching: DFS, BFS • Topological sorting • Generating combinatorial objects • Decrease by a constant factor (usually 2): the instance is reduced by the same factor on each iteration • Binary search • Fake-coin problem • Variable-size decrease: the size-reduction pattern varies from one iteration to the next • Euclid’s algorithm • Interpolation Search
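As a concrete instance of decrease by one, here is a minimal Python sketch of the insertion sort listed above (the function name and list interface are illustrative, not from the slides): a[0..i-1] is the already-solved smaller instance, and a[i] is inserted to extend the solution.

def insertion_sort(a):
    # Decrease by one: assume a[0..i-1] is sorted, insert a[i] into it
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:   # shift larger elements right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

print(insertion_sort([5, 2, 4, 6, 1, 3]))   # [1, 2, 3, 4, 5, 6]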

  5. Exercise: Spot the Difference • Problem: Derive algorithms for computing aⁿ (a raised to the power n) using: • Brute force • Divide and conquer • Decrease by one • Decrease by a constant factor (halve the problem size) • Hint: each can be described in a single line
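If you want to check your answers afterwards, one possible set of Python sketches follows (the function names are illustrative); each recursive body is essentially the single line the hint refers to.

def power_brute(a, n):
    # Brute force: multiply a together n times
    result = 1
    for _ in range(n):
        result *= a
    return result

def power_divide(a, n):
    # Divide and conquer: a**n = a**(n//2) * a**(n - n//2)
    if n <= 1:
        return a if n == 1 else 1
    return power_divide(a, n // 2) * power_divide(a, n - n // 2)

def power_dec_one(a, n):
    # Decrease by one: a**n = a**(n-1) * a
    return 1 if n == 0 else power_dec_one(a, n - 1) * a

def power_dec_factor(a, n):
    # Decrease by a constant factor: a**n = (a**(n//2))**2, times a if n is odd
    if n == 0:
        return 1
    half = power_dec_factor(a, n // 2)
    return half * half * (a if n % 2 else 1)

print(power_brute(2, 10), power_divide(2, 10),
      power_dec_one(2, 10), power_dec_factor(2, 10))   # 1024 1024 1024 1024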

  6. Graph Traversal • Many problems require processing all graph vertices in a systematic fashion • Data structures reminder: a graph can be stored as an adjacency matrix or as adjacency lists • Graph traversal strategies: • Depth-first search (traversal for the Brave) • Breadth-first search (traversal for the Cautious) [Slide figure: a small sample graph together with its adjacency lists]
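A minimal sketch of those two standard representations in Python, for a hypothetical 4-vertex graph with edges a-b, b-c, c-d, d-a (the concrete graph is an assumption, not the one pictured on the slide).

# Adjacency matrix: entry [i][j] is 1 iff vertex i and vertex j are adjacent
vertices = ['a', 'b', 'c', 'd']
adj_matrix = [
    [0, 1, 0, 1],   # a: adjacent to b and d
    [1, 0, 1, 0],   # b: adjacent to a and c
    [0, 1, 0, 1],   # c: adjacent to b and d
    [1, 0, 1, 0],   # d: adjacent to a and c
]

# Adjacency lists: each vertex maps to the list of its neighbours
adj_lists = {
    'a': ['b', 'd'],
    'b': ['a', 'c'],
    'c': ['b', 'd'],
    'd': ['a', 'c'],
}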

  7. Depth-First Search • Explore graph always moving away from last visited vertex • Similar to preorder tree traversal

DFS(G):                        // G = (V, E)
  count ← 0
  mark each vertex as 0
  FOR each vertex v ∈ V DO
    IF v is marked as 0
      dfs(v)

dfs(v):
  count ← count + 1
  mark v with count
  FOR each vertex w adjacent to v DO
    IF w is marked as 0
      dfs(w)
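A direct Python transcription of the DFS pseudocode above, assuming the adjacency-list representation (dict of neighbour lists) sketched earlier; the names and the small sample graph are illustrative.

def DFS(graph):
    # graph: dict mapping each vertex to a list of adjacent vertices
    count = 0
    mark = {v: 0 for v in graph}          # 0 = unvisited

    def dfs(v):
        nonlocal count
        count += 1
        mark[v] = count                   # preorder (push) number
        for w in graph[v]:
            if mark[w] == 0:
                dfs(w)

    for v in graph:
        if mark[v] == 0:
            dfs(v)
    return mark                           # vertex -> order of first visit

g = {'a': ['b', 'e'], 'b': ['a', 'f'], 'e': ['a', 'f'], 'f': ['b', 'e']}
print(DFS(g))   # {'a': 1, 'b': 2, 'e': 4, 'f': 3}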

  8. Example: DFS • Graph vertices: a b c d e f g h • Traversal stack (pre = push, post = pop), each vertex labelled (push number, pop number): a(1,8) b(2,7) f(3,2) e(4,1) g(5,6) c(6,5) d(7,4) h(8,3) • Push order: a b f e g c d h • Pop order: e f h d c g b a

  9. DFS Forest • DFS forest: a graph representing the traversal structure • Types of edges: • Tree edges: edge to the next vertex in the traversal • Back edges: edge in the graph to an ancestor node • Forward edges: edge in the graph to a descendant (digraphs only) • Cross edges: none of the above [Slide figure: the DFS forest of the example graph a b c d e f g h, showing its tree edges and a back edge]

  10. Notes on Depth-First Search • Implementable with different graph structures: • Adjacency matrices: Θ(V²) • Adjacency linked lists: Θ(V+E) • Yields two orderings: • preorder: as vertices are 1st encountered (pushed) • postorder: as vertices become dead-ends (popped) • Applications: • Checking connectivity, finding connected components • Checking acyclicity • Searching state-space of problems for solution (AI)
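As an example of the acyclicity application, a minimal sketch (assumed names, undirected adjacency-list graph): a cycle exists exactly when DFS reaches an already-visited vertex other than the one it came from, i.e. when it finds a back edge.

def has_cycle(graph):
    # graph: dict vertex -> list of neighbours (undirected)
    visited = set()

    def dfs(v, parent):
        visited.add(v)
        for w in graph[v]:
            if w not in visited:
                if dfs(w, v):
                    return True
            elif w != parent:          # already-visited non-parent vertex = back edge
                return True
        return False

    return any(dfs(v, None) for v in graph if v not in visited)

print(has_cycle({'a': ['b'], 'b': ['a', 'c'], 'c': ['b']}))            # False (a path)
print(has_cycle({'a': ['b', 'c'], 'b': ['a', 'c'], 'c': ['a', 'b']}))  # True (a triangle)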

  11. Breadth-First Search • Move across to all neighbours of the last visited vertex • Similar to level-by-level tree traversal • Instead of a stack, breadth-first search uses a queue

BFS(G):                        // G = (V, E)
  count ← 0
  mark each vertex as 0
  FOR each vertex v ∈ V DO
    IF v is marked as 0
      bfs(v)

bfs(v):
  count ← count + 1
  mark v with count
  initialize queue with v
  WHILE queue not empty DO
    a ← front of queue
    FOR each vertex w adjacent to a DO
      IF w is marked as 0
        count ← count + 1
        mark w with count
        add w to queue
    remove a from queue
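A direct Python transcription of the BFS pseudocode above, again assuming an adjacency-list dict; the deque plays the role of the queue and the names are illustrative.

from collections import deque

def BFS(graph):
    # graph: dict mapping each vertex to a list of adjacent vertices
    count = 0
    mark = {v: 0 for v in graph}          # 0 = unvisited

    def bfs(v):
        nonlocal count
        count += 1
        mark[v] = count
        queue = deque([v])
        while queue:
            a = queue[0]                  # front of queue
            for w in graph[a]:
                if mark[w] == 0:
                    count += 1
                    mark[w] = count
                    queue.append(w)
            queue.popleft()               # remove a from the queue

    for v in graph:
        if mark[v] == 0:
            bfs(v)
    return mark                           # vertex -> visit (enqueue) order

g = {'a': ['b', 'e'], 'b': ['a', 'f'], 'e': ['a', 'f'], 'f': ['b', 'e']}
print(BFS(g))   # {'a': 1, 'b': 2, 'e': 3, 'f': 4}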

  12. Example: BFS • Graph vertices: a b c d e f g h • Traversal queue (each vertex labelled with its visit number): a1 b2 e3 f4 g5 c6 h7 d8 • Order: a b e f g c h d [Slide figure: the BFS forest of the example graph, showing its tree edges and a cross edge]

  13. Notes on Breadth-First Search • BFS has the same efficiency as DFS and can be implemented with: • Adjacency matrices: Θ(V²) • Adjacency linked lists: Θ(V+E) • Yields a single ordering of vertices • Applications: same as DFS, but can also find paths from a vertex to all other vertices with the smallest number of edges
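A minimal sketch of that last application (assumed names): because BFS visits vertices level by level, the first time a vertex is reached corresponds to a path with the fewest edges from the start vertex.

from collections import deque

def bfs_distances(graph, start):
    # graph: dict vertex -> list of neighbours; returns vertex -> edge count from start
    dist = {start: 0}
    queue = deque([start])
    while queue:
        a = queue.popleft()
        for w in graph[a]:
            if w not in dist:             # first time reached = smallest number of edges
                dist[w] = dist[a] + 1
                queue.append(w)
    return dist

g = {'a': ['b', 'e'], 'b': ['a', 'f'], 'e': ['a', 'f'], 'f': ['b', 'e']}
print(bfs_distances(g, 'a'))   # {'a': 0, 'b': 1, 'e': 1, 'f': 2}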

  14. The Fake-Coin Problem: Decrease by a Constant Factor • Problem: • Among n identical-looking coins, one is a fake (and weighs less) • We have a balance scale which can compare any two sets of coins • Algorithm: • Divide into two piles of ⌊n/2⌋ coins each (keeping a coin aside if n is odd) • If they weigh the same then the set-aside coin is the fake • Otherwise proceed recursively with the lighter pile • Efficiency: • W(n) = W(⌊n/2⌋) + 1 for n > 1 • W(n) = ⌊log₂ n⌋ ∈ Θ(log₂ n) • But there is a better Θ(log₃ n) algorithm (divide the coins into three piles instead of two)
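A minimal Python sketch of the two-pile algorithm; the function name and the weight dictionary standing in for the balance scale are illustrative assumptions.

def find_fake(coins, weight):
    # coins: list of coin labels; weight: dict label -> weight (the "balance scale")
    # Exactly one coin is lighter than the rest; return its label
    if len(coins) == 1:
        return coins[0]
    half = len(coins) // 2
    left, right = coins[:half], coins[half:2 * half]
    spare = coins[2 * half:]                  # coin set aside when n is odd
    w_left = sum(weight[c] for c in left)     # one weighing on the balance scale
    w_right = sum(weight[c] for c in right)
    if w_left == w_right:
        return spare[0]                       # the set-aside coin is the fake
    return find_fake(left if w_left < w_right else right, weight)

coins = list('abcdefg')
weight = {c: 10 for c in coins}
weight['e'] = 9                               # 'e' is the lighter fake
print(find_fake(coins, weight))               # e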

  15. Euclid’s GCD: Variable-Size Decrease • Problem: • The Greatest Common Divisor of two integers m and n is the largest integer that divides both exactly • Alternative solutions: • Consecutive integer checking (brute force) • Identify common prime factors (transform and conquer) • Euclid’s solution: • gcd(m, n) = gcd(n, m mod n) • gcd(m, 0) = m • The right-hand arguments shrink, but by neither a constant amount nor a constant factor • Example: • gcd(60, 24) = gcd(24, 12) = gcd(12, 0) = 12
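Euclid's rule translates almost verbatim into Python; a minimal iterative sketch:

def gcd(m, n):
    # gcd(m, n) = gcd(n, m mod n); gcd(m, 0) = m
    while n != 0:
        m, n = n, m % n
    return m

print(gcd(60, 24))   # 12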

  16. Interpolation Search: Variable-Size Decrease • Mimics the way humans search through a phone book (look near the beginning for ‘Brown’) • Assumes that the values between the leftmost (A[l]) and rightmost (A[r]) list elements increase linearly • Algorithm (search key = v, index to find = i): • Binary search, but with a floating split position i • Set up the straight line through (l, A[l]) and (r, A[r]) • Find the point P = (x, y) on this line with y = v, then i = ⌊x⌋ • x = l + (v - A[l])(r - l) / (A[r] - A[l]) • Efficiency: • Average = Θ(log₂ log₂ n + 1), Worst = Θ(n) [Slide figure: the interpolation line plotted over the array, with value (A[l], v, A[r]) on the vertical axis and index (l, i, r) on the horizontal axis]
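A minimal Python sketch under the assumptions above (sorted list A, integer probe index taken from the interpolation line); the guard for A[l] == A[r] avoids division by zero and is an implementation detail, not from the slide.

def interpolation_search(A, v):
    # Return an index of v in the sorted list A, or -1 if v is absent
    l, r = 0, len(A) - 1
    while l <= r and A[l] <= v <= A[r]:
        if A[l] == A[r]:                      # flat segment: avoid division by zero
            return l if A[l] == v else -1
        # probe position on the line through (l, A[l]) and (r, A[r]) at height v
        i = l + (v - A[l]) * (r - l) // (A[r] - A[l])
        if A[i] == v:
            return i
        if A[i] < v:
            l = i + 1                         # variable-size decrease: discard A[l..i]
        else:
            r = i - 1
    return -1

A = [3, 9, 13, 18, 21, 27, 33, 45, 51, 60]
print(interpolation_search(A, 27))   # 5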

  17. Strengths and Weaknesses of Decrease-and-Conquer • Strengths: • Can be implemented either top down (recursively) or bottom up (without recursion) • Often very efficient (possibly Θ(log n)) • Leads to powerful forms of graph traversal (Breadth- and Depth-First Search) • Weaknesses: • Less widely applicable (especially decrease by a constant factor)
