CS 155, Programming Paradigms, Fall 2014, SJSU: design techniques


Presentation Transcript


  1. CS 155, Programming Paradigms, Fall 2014, SJSU: design techniques. Jeff Smith

  2. Algorithm design techniques • Several classes of algorithms exist, based on the technique used for their design • divide-and-conquer algorithms • dynamic programming algorithms • greedy algorithms • randomized algorithms • branch-and-bound (backtracking) algorithms

  3. Divide & conquer algorithms • Relatively easy to design • Usually possible • especially when processing recursive structures • Relatively easy to prove correct • Analysis is often straightforward • e.g., by using the “Master Theorem” (Theorem 4.1 of CLRS) • Often rather efficient • especially for processing recursive structures • but sometimes subproblems recur frequently

  4. Divide & conquer algorithms – good examples • Good examples • binary search • mergesort • Strassen’s matrix multiplication (CLRS, Sec. 4.2) • Other examples • selection sort, insertion sort • integer multiplication (analog of Strassen)
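To make the divide-and-conquer pattern concrete, here is a minimal mergesort sketch in Python (my own illustration, not from the slides). Its recurrence T(n) = 2T(n/2) + Θ(n) is exactly the kind the Master Theorem handles, giving Θ(n log n).

```python
def merge_sort(a):
    """Divide-and-conquer sort: split the list, sort each half, merge."""
    if len(a) <= 1:                    # base case: already sorted
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])         # conquer the left half
    right = merge_sort(a[mid:])        # conquer the right half
    merged, i, j = [], 0, 0            # combine: merge the two sorted halves
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```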

  5. Dynamic programming algorithms • Often easy to design • same intuitions as divide and conquer • but usually implemented bottom up • solutions to subproblems are saved in a table • Usually possible • although not always better than divide & conquer • Analysis is often straightforward • time complexity is often that of constructing the table
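As a minimal sketch of the bottom-up style (my illustration, not from the slides): Fibonacci numbers computed by filling a table, so each subproblem is solved exactly once, versus the naive top-down recursion that recomputes the same subproblems exponentially often.

```python
def fib_naive(n):
    """Top-down divide & conquer: recomputes subproblems, exponential time."""
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

def fib_dp(n):
    """Bottom-up dynamic programming: each subproblem solved once, O(n) time."""
    table = [0, 1] + [0] * max(0, n - 1)     # table of saved subproblem solutions
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]   # reuse earlier entries
    return table[n]

print(fib_dp(30))      # 832040, computed with 29 additions
print(fib_naive(30))   # same answer, but over a million recursive calls
```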

  6. Dynamic programming issues • Not always highly efficient • except in the case of repeated subproblems • when they can be more efficient than divide-and-conquer versions • e.g., Fibonacci numbers & binomial coefficients • Good examples • all-pairs shortest paths (CLRS 25.2) • optimal binary search trees (CLRS 15.5) • matrix chains (CLRS, Sec. 15.2)
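A rough sketch of the all-pairs shortest paths example (Floyd-Warshall, CLRS 25.2), written here for illustration; it assumes an adjacency-matrix input with float('inf') marking missing edges, and fills the distance table one allowed intermediate vertex at a time.

```python
def floyd_warshall(dist):
    """All-pairs shortest paths by dynamic programming (CLRS 25.2).
    dist is an n x n matrix; dist[i][j] = edge weight or float('inf')."""
    n = len(dist)
    d = [row[:] for row in dist]              # copy so the input is untouched
    for k in range(n):                        # allow vertex k as an intermediate
        for i in range(n):
            for j in range(n):
                # keep the better of: current path, or path routed through k
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d

INF = float('inf')
graph = [[0, 3, INF],
         [INF, 0, 1],
         [2, INF, 0]]
print(floyd_warshall(graph))   # [[0, 3, 4], [3, 0, 1], [2, 5, 0]]
```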

  7. Optimal substructure • A dynamic programming solution to a problem requires that the problem exhibit optimal substructure • A problem exhibits optimal substructure if an optimal solution to the problem contains within it optimal solutions to subproblems. • For such problems, we need only solve subproblems recursively • without worrying about how their solutions interact

  8. Greedy algorithms • Often easy to design • A correct greedy algorithm often isn't available • Often difficult to prove correct • Usually easy to analyze • Usually efficient • Good examples • Dijkstra's algorithm (single-source shortest paths) • Prim's and Kruskal's algorithms (spanning trees) • Huffman’s algorithm (minimum-length encoding)
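Here is a hedged sketch of the Dijkstra example (my own code, not the course's), assuming the graph is given as a dict of adjacency lists with non-negative edge weights. The greedy step is committing permanently to the unvisited vertex with the smallest tentative distance.

```python
import heapq

def dijkstra(graph, source):
    """Greedy single-source shortest paths.
    graph: {vertex: [(neighbor, weight), ...]} with non-negative weights."""
    dist = {v: float('inf') for v in graph}
    dist[source] = 0
    heap = [(0, source)]                      # (tentative distance, vertex)
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:                       # stale queue entry, skip it
            continue
        for v, w in graph[u]:                 # greedy choice for u is now final
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    return dist

g = {'a': [('b', 2), ('c', 5)], 'b': [('c', 1)], 'c': []}
print(dijkstra(g, 'a'))   # {'a': 0, 'b': 2, 'c': 3}
```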

  9. Randomized algorithms • Not always easy to design • Not always helpful • Not always correct • and probability of correctness often hard to find • Average-case analysis is often awkward • this is generally the relevant metric • Often faster than competing algorithms • Good examples • quicksort (with randomly chosen pivot)
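A small sketch of the quicksort example (again my own illustration): picking the pivot uniformly at random makes the expected running time O(n log n) on every input, since no fixed input can force the worst case.

```python
import random

def quicksort(a):
    """Randomized quicksort: a random pivot gives expected O(n log n) time."""
    if len(a) <= 1:
        return a
    pivot = random.choice(a)                  # the randomized step
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return quicksort(less) + equal + quicksort(greater)

print(quicksort([3, 7, 1, 4, 1, 9]))          # [1, 1, 3, 4, 7, 9]
```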

  10. Backtracking, best-first, and branch-and-bound algorithms • Advantages • often easy to design • usually available • easy to prove correct • usually easy to analyze for worst case behavior • Why we’ll cover them only if time permits • often the average-case behavior is most relevant, but very difficult to analyze • often inefficient
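For a flavor of backtracking, here is a short n-queens sketch (my own example, not from the course): extend a partial placement one row at a time, and undo any choice that cannot lead to a solution.

```python
def n_queens(n):
    """Backtracking: place one queen per row, pruning conflicting placements."""
    solutions = []

    def safe(cols, col):
        row = len(cols)                       # row about to be filled
        return all(c != col and abs(c - col) != row - r
                   for r, c in enumerate(cols))

    def place(cols):
        if len(cols) == n:                    # all rows filled: record a solution
            solutions.append(cols[:])
            return
        for col in range(n):
            if safe(cols, col):
                cols.append(col)              # choose
                place(cols)                   # explore
                cols.pop()                    # un-choose (backtrack)

    place([])
    return solutions

print(len(n_queens(6)))   # 4 solutions for the 6x6 board
```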
