
Self-Adjusting Computation Umut Acar Carnegie Mellon University


Presentation Transcript


  1. Self-Adjusting Computation
  Umut Acar, Carnegie Mellon University
  Joint work with Guy Blelloch, Robert Harper, Srinath Sridhar, Jorge Vittes, Maverick Woo

  2. Dynamic Algorithms
  • Maintain their input-output relationship as the input changes
  • Example: a dynamic MST algorithm maintains the MST of a graph as the user inserts/deletes edges
  • Useful in many applications involving interactive systems, motion, ...
  Workshop on Dynamic Algorithms and Applications

  3. Developing Dynamic Algorithms, Approach I: Dynamic by design
  • Many papers: Agarwal, Atallah, Bash, Bentley, Chan, Cohen, Demaine, Eppstein, Even, Frederickson, Galil, Guibas, Henzinger, Hershberger, King, Italiano, Mehlhorn, Overmars, Powell, Ramalingam, Roditty, Reif, Reps, Sleator, Tamassia, Tarjan, Thorup, Vitter, ...
  • Efficient algorithms, but can be complex

  4. Approach II: Re-execute the algorithm when the input changes
  • Very simple
  • General
  • Poor performance

  5. Smart re-execution
  • Suppose we can identify the pieces of the execution affected by the input change
  • Re-execute by re-building only the affected pieces
  (Figure: Execution (A, I) vs. Execution (A, I+e))

  6. Smart Re-execution
  • Time to re-execute = O(distance between the two executions)
  (Figure: Execution (A, I) vs. Execution (A, I+e))

  7. Incremental Computation or Dynamization
  • General techniques for transforming static algorithms into dynamic ones
  • Many papers: Alpern, Demers, Field, Hoover, Horwitz, Hudak, Liu, de Moor, Paige, Pugh, Reps, Ryder, Strom, Teitelbaum, Weiser, Yellin, ...
  • The most effective techniques are
    • Static Dependence Graphs [Demers, Reps, Teitelbaum ‘81]
    • Memoization [Pugh, Teitelbaum ‘89]
  • These techniques work well for certain problems
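The memoization technique named on this slide can be illustrated with a small Python sketch (my own toy example, not code from the talk): results are cached per input cell, so re-running the computation on an input that shares structure with the old input reuses the earlier work instead of recomputing it.

```python
# Memoization-based incremental recomputation: results are cached per
# cons cell, so re-execution on an input that shares structure with the
# previous input reuses the cached suffix computations.

class Cons:
    def __init__(self, head, tail=None):
        self.head, self.tail = head, tail

memo = {}     # id(cell) -> computed result
calls = 0     # counts actual (non-cached) evaluations

def total(cell):
    """Sum a cons list, reusing any suffix summed in an earlier run."""
    global calls
    if cell is None:
        return 0
    if id(cell) in memo:
        return memo[id(cell)]
    calls += 1
    result = cell.head + total(cell.tail)
    memo[id(cell)] = result
    return result

# Build [2, 3, 4] and sum it: three fresh evaluations.
l = Cons(2, Cons(3, Cons(4)))
assert total(l) == 9 and calls == 3

# "Change" the input by prepending 1: the shared suffix is not recomputed.
l2 = Cons(1, l)
assert total(l2) == 10 and calls == 4   # only the new cell was evaluated
```

This works well exactly when the changed input shares most of its structure with the old one, which is the limitation the slide alludes to.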

  8. Bridging the two worlds
  • Dynamization simplifies the development of dynamic algorithms, but generally yields inefficient algorithms
  • Algorithmic techniques yield good performance
  • Can we have the best of both worlds?

  9. Our Work
  • Dynamization techniques:
    • Dynamic dependence graphs [Acar, Blelloch, Harper ‘02]
    • Adaptive memoization [Acar, Blelloch, Harper ‘04]
  • Stability: a technique for analyzing performance [ABHVW ‘04]
    • Provides a reduction from dynamic to static problems
    • Reduces solving a dynamic problem to finding a stable solution to the corresponding static problem
  • Example: dynamizing the parallel tree contraction algorithm [Miller, Reif ‘85] yields an efficient solution to the dynamic trees problem [Sleator, Tarjan ‘83], [ABHVW SODA ‘04]

  10. Outline
  • Dynamic Dependence Graphs
  • Adaptive Memoization
  • Applications to
    • Sorting
    • Kinetic Data Structures, with experimental results
    • Retroactive Data Structures

  11. Dynamic Dependence Graphs
  • Control dependences arise from function calls

  12. Dynamic Dependence Graphs
  • Control dependences arise from function calls
  • Data dependences arise from reading/writing the memory
  (Figure: dependence graph with vertices a, b, c)
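A data dependence can be sketched in a few lines of Python (the class and method names here are my own illustration, not the actual library's API): a modifiable cell records which computations read it, so a write to the cell identifies exactly the pieces of the execution that are affected.

```python
# A toy modifiable reference: read() records a data-dependence edge from
# the cell to the reading computation; write() reruns exactly the
# computations that depend on the changed value.

class Mod:
    def __init__(self, value):
        self.value = value
        self.readers = []          # data-dependence edges

    def read(self, fn):
        self.readers.append(fn)    # record the dependence, then run the reader
        fn(self.value)

    def write(self, value):
        self.value = value
        for fn in self.readers:    # rerun only the dependent reads
            fn(value)

out = []
a = Mod(2)
a.read(lambda v: out.append(v * 10))   # this computation depends on a

a.write(5)                             # the change propagates along the edge
assert out == [20, 50]
```

A real dynamic dependence graph also tracks control dependences (which calls created which reads) so that stale readers can be discarded; this sketch shows only the data edges.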

  13. Change Propagation
  (Figure: dependence graph with vertices a, b, c, before and after an input change)

  14. Change Propagation
  (Figure: continued)

  15. Change Propagation with Memoization
  (Figure: dependence graph before and after an input change)

  16. Change Propagation with Memoization
  (Figure: continued)

  17. Change Propagation with Memoization
  (Figure: continued)

  18. Change Propagation with Adaptive Memoization
  (Figure: dependence graph before and after an input change)

  19. The Internals
  1. Order Maintenance Data Structure [Dietz, Sleator ‘87]
    • Time-stamp the vertices of the DDG in sequential execution order
  2. Priority queue for change propagation
    • priority = time stamp
    • Re-execute functions in sequential execution order
    • Ensures that a value is updated before being read
  3. Hash tables for memoization
    • Remember results from the previous execution only
  4. Constant-time equality tests
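The core of the change-propagation loop can be sketched in Python (a simplification I wrote for illustration: plain integer time stamps stand in for the order-maintenance structure, and the function names are mine). Affected reads are re-executed in sequential execution order via a priority queue keyed on their time stamps, which guarantees a value is updated before anything reads it.

```python
# Change propagation as a priority-queue loop: affected computations are
# queued by time stamp and re-executed in sequential execution order.

import heapq

queue = []   # (time_stamp, reader) pairs, smallest time stamp first
log = []

def affected(time_stamp, reader):
    heapq.heappush(queue, (time_stamp, reader))

def propagate():
    while queue:
        _, reader = heapq.heappop(queue)
        reader()   # re-executing a read may enqueue later affected reads

# Reads created at times 3 and 7 were affected by an input change;
# propagation reruns them in execution order regardless of insert order.
affected(7, lambda: log.append("read@7"))
affected(3, lambda: log.append("read@3"))
propagate()
assert log == ["read@3", "read@7"]
```

The real system uses the order-maintenance structure because re-execution creates and deletes computations dynamically, so fixed integer time stamps would not suffice.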

  20. Standard Quicksort

  fun qsort (l) =
    let
      fun qs (l, rest) =
        case l of
          NIL => rest
        | CONS(h, t) =>
            let
              (smaller, bigger) = split (h, t)
              sbigger = qs (bigger, rest)
            in
              qs (smaller, CONS(h, sbigger))
            end
    in
      qs (l, NIL)
    end

  21. Dynamic Quicksort

  fun qsort (l) =
    let
      fun qs (l, rest, d) =
        read (l, fn l' =>
          case l' of
            NIL => write (d, rest)
          | CONS(h, t) =>
              let
                (less, bigger) = split (h, t)
                sbigger = mod (fn d => qs (bigger, rest, d))
              in
                qs (less, CONS(h, sbigger), d)
              end)
    in
      mod (fn d => qs (l, NIL, d))
    end

  22. Performance of Quicksort
  • Dynamized Quicksort updates its output in expected
    • O(log n) time for insertions/deletions at the end of the input
    • O(n) time for insertions/deletions at the beginning of the input
    • O(log n) time for insertions/deletions at a random location
  • Other results, for insertions/deletions anywhere in the input:
    • Dynamized Mergesort: expected O(log n)
    • Dynamized Insertion Sort: expected O(n)
    • Dynamized minimum/maximum/sum/...: expected O(log n)

  23. Function Call Tree for Quicksort
  (Figure: call tree)

  24. Function Call Tree for Quicksort
  (Figure: continued)

  25. Function Call Tree for Quicksort
  (Figure: continued)

  26. Insertion at the end of the input
  (Figure: call tree)

  27. Insertion in the middle
  (Figure: call tree)

  28. Insertion in the middle
  (Figure: continued)

  29. Insertion at the start, in linear time
  Input: 15, 30, 26, 1, 5, 16, 27, 9, 3, 35, 46
  (Figure: quicksort call tree for this input)

  30. Insertion at the start, in linear time
  Input: 20, 15, 30, 26, 1, 5, 16, 27, 9, 3, 35, 46
  (Figure: call trees before and after inserting 20 at the start)

  31. Kinetic Data Structures [Basch, Guibas, Hershberger ‘99]
  • Goal: maintain properties of continuously moving objects
  • Example: a kinetic convex-hull data structure maintains the convex hull of a set of continuously moving objects

  32. Kinetic Data Structures
  • Run a static algorithm to obtain a proof of the property
  • Certificate = Comparison + Failure time
  • Insert the certificates into a priority queue
    • Priority = Failure time
  • A framework for handling motion [Guibas, Karavelas, Russel, ALENEX 04]:

    while queue ≠ empty do {
      certificate = remove (queue)
      flip (certificate)
      update the certificate set (proof)
    }
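The event loop above can be made concrete with a runnable toy (my own example, not the library from the talk): maintain the sorted order of points moving on a line. Each certificate asserts the order of one adjacent pair and carries the failure time at which the pair crosses; events are popped from a priority queue in failure-time order.

```python
# Kinetic sorted order of points moving on a line with x(t) = x0 + v*t.
# A certificate is an adjacent pair in the order plus its failure time.

import heapq

# (x0, v): point 0 overtakes point 1 at t = 1 and point 2 at t = 2.
points = [(0.0, 1.0), (1.0, 0.0), (2.0, 0.0)]
order = [0, 1, 2]            # indices sorted by current position

def failure_time(i, j, now):
    """When does points[i] (currently left of points[j]) overtake it?"""
    (x0, v0), (x1, v1) = points[i], points[j]
    if v0 <= v1:
        return None          # never crosses
    t = (x1 - x0) / (v0 - v1)
    return t if t > now else None

def certificates(now):
    pq = []
    for k in range(len(order) - 1):
        t = failure_time(order[k], order[k + 1], now)
        if t is not None:
            heapq.heappush(pq, (t, k))
    return pq

now, pq = 0.0, certificates(0.0)
while pq:
    now, k = heapq.heappop(pq)                        # next failure
    order[k], order[k + 1] = order[k + 1], order[k]   # flip the pair
    pq = certificates(now)   # update the proof (rebuilt wholesale here
                             # for brevity; a real KDS patches it locally)

assert order == [1, 2, 0]    # point 0 has overtaken both others
```

Rebuilding the certificate set after each event is the step a real kinetic data structure does incrementally, and the step the next slide replaces with change propagation.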

  33. Kinetic Data Structures via Self-Adjusting Computation
  • Update the proof automatically with change propagation
  • A library for kinetic data structures [Acar, Blelloch, Vittes]
  • Quicksort: expected O(1), Mergesort: expected O(1)
  • Quick Hull, Chan’s algorithm, Merge Hull: expected O(log n)

    while queue ≠ empty do {
      certificate = remove (queue)
      flip (certificate)
      propagate ()
    }

  34. Quick Hull: Find Min and Max
  (Figure: points A-P in the plane)
  [A B C D E F G H I J K L M N O P]

  35. Quick Hull: Furthest Point
  [A B D F G H J K M O P]

  36. Quick Hull: Filter
  [[A B F J] [J O P]]

  37. Quick Hull: Find left hull
  [[A B] [B J] [J O] [O P]]

  38. Quick Hull: Done
  [[A B] [B J] [J O] [O P]]

  39. Static Quick Hull

  fun findHull (line as (p1, p2), l, hull) =
    let
      pts = filter l (fn p => Geo.lineside (p, line))
    in
      case pts of
        EMPTY => CONS(p1, hull)
      | _ =>
          let
            pm = max (Geo.dist line) l
            left = findHull ((pm, p2), l, hull)
            full = findHull ((p1, pm), l, left)
          in
            full
          end
    end

  fun quickHull l =
    let
      (mx, xx) = minmax (Geo.minX, Geo.maxX) l
    in
      findHull ((mx, xx), l, CONS(xx, NIL))
    end

  40. Kinetic Quick Hull

  fun findHull (line as (p1, p2), l, hull, dest) =
    let
      pts = filter l (fn p => Kin.lineside (p, line))
    in
      modr (fn dest =>
        read l (fn l =>
          case l of
            NIL => write (dest, CONS(p1, hull))
          | _ =>
              read (max (Kin.dist line) l) (fn pm =>
                let
                  gr = modr (fn d => findHull ((pm, p2), l, hull, d))
                in
                  findHull ((p1, pm), l, gr, dest)
                end)))
    end

  fun quickHull l =
    let
      (mx, xx) = minmax (Kin.minX, Kin.maxX) l
    in
      modr (fn d =>
        read (mx, xx) (fn (mx, xx) =>
          findHull ((mx, xx), l, CONS(xx, NIL), d)))
    end

  41. Kinetic Quick Hull
  (Plot: certificates per event vs. input size)

  42. Dynamic and Kinetic Changes
  • Often interested in dynamic as well as kinetic changes
    • Insert and delete objects
    • Change the motion plan, e.g., direction, velocity
  • Easily programmed via self-adjusting computation
    • Example: the Kinetic Quick Hull code is both dynamic and kinetic
  • Batch changes
  • Real-time changes: can maintain partially correct data structures (stop propagation when time expires)

  43. Retroactive Data Structures [Demaine, Iacono, Langerman ‘04]
  • Can change the sequence of operations performed on the data structure
  • Example: a retroactive queue would allow the user to go back in time and insert/remove an item

  44. Retroactive Data Structures via Self-Adjusting Computation
  • Dynamize the static algorithm that takes as input the list of operations performed
  • Example: retroactive queues
    • Input: list of insert/remove operations
    • Output: list of items removed
    • Retroactive change: change the input list and propagate
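The reduction on this slide can be sketched in Python (my own illustration: plain re-execution stands in for change propagation, and the function name is hypothetical). A retroactive queue is just the static function from an operation history to the list of removed items; a retroactive change edits the history and recomputes.

```python
# Retroactive FIFO queue as a static function over its operation history.
# Re-execution here plays the role that change propagation plays in the
# self-adjusting version.

from collections import deque

def removed_items(ops):
    """Run a FIFO queue over ('insert', x) / ('remove',) operations."""
    q, out = deque(), []
    for op in ops:
        if op[0] == "insert":
            q.append(op[1])
        else:
            out.append(q.popleft())
    return out

history = [("insert", "a"), ("insert", "b"), ("remove",), ("remove",)]
assert removed_items(history) == ["a", "b"]

# Retroactive change: go back in time and insert "x" before the removes.
history.insert(1, ("insert", "x"))
assert removed_items(history) == ["a", "x"]
```

With change propagation in place of full re-execution, only the part of the run affected by the edited prefix would be rebuilt.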

  45. Rake and Compress Trees [Acar, Blelloch, Vittes]
  • Obtained by dynamizing tree contraction [ABHVW ‘04]
  • Experimental analysis:
    • Implemented and applied to a broad set of applications
    • Path queries, subtree queries, non-local queries, etc.
    • For path queries, compared to Link-Cut Trees [Werneck]:
      • Structural changes are relatively slow
      • Data changes are faster

  46. Conclusions
  • Automatic dynamization techniques can yield efficient dynamic and kinetic algorithms/data structures
  • General-purpose techniques for
    • transforming static algorithms into dynamic and kinetic ones
    • analyzing their performance
  • Applications to kinetic and retroactive data structures
  • Reduce dynamic problems to static problems
  • Future work: lots of interesting problems
    • Dynamic/kinetic/retroactive data structures

  47. Thank you!
