
UMass Lowell Computer Science 91.503 Analysis of Algorithms Prof. Karen Daniels


Presentation Transcript


  1. UMass Lowell Computer Science 91.503 Analysis of Algorithms, Prof. Karen Daniels. Design Patterns for Optimization Problems: Dynamic Programming • Matrix Parenthesizing • Longest Common Subsequence • Activity Selection

  2. Algorithmic Paradigm Context • Solve subproblem(s), then make choice • Make choice, then solve subproblem(s) • Subproblem solution order

  3. Dynamic Programming Approach to Optimization Problems • Characterize structure of an optimal solution. • Recursively define value of an optimal solution. • Compute value of an optimal solution, typically in bottom-up fashion. • Construct an optimal solution from computed information. (separate slides for rod cutting) source: 91.503 textbook Cormen, et al.
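The rod-cutting example mentioned above is covered on separate slides; purely as an illustrative sketch (my own Python, with a hypothetical cut_rod name and price list, not the course's code), the four steps map onto a bottom-up implementation roughly like this:

  # Illustrative sketch only: the four DP steps applied to rod cutting.
  def cut_rod(prices, n):
      """Bottom-up rod cutting.
      Step 1: an optimal cut of length n starts with some first piece i,
              followed by an optimal cut of the remaining n - i.
      Step 2: r[j] = max over 1 <= i <= j of (prices[i] + r[j - i]), r[0] = 0.
      Step 3: fill r[] bottom-up.
      Step 4: record the first-piece choice so the cuts can be rebuilt.
      """
      r = [0] * (n + 1)          # r[j] = best revenue for rod of length j
      first_cut = [0] * (n + 1)  # choice that achieves r[j]
      for j in range(1, n + 1):
          best = float("-inf")
          for i in range(1, j + 1):
              if prices[i] + r[j - i] > best:
                  best = prices[i] + r[j - i]
                  first_cut[j] = i
          r[j] = best
      return r[n], first_cut

  # Example (prices indexed by length; prices[0] unused):
  # cut_rod([0, 1, 5, 8, 9], 4) -> (10, [0, 1, 2, 3, 2])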

  4. Matrix Parenthesization Dynamic Programming

  5. Example: Matrix Parenthesization Definitions • Given “chain” of n matrices: <A1, A2, …, An> • Compute product A1 A2 … An efficiently • Multiplication order matters! • Matrix multiplication is associative • Minimize “cost” = number of scalar multiplications source: 91.503 textbook Cormen, et al.

  6. Example: Matrix Parenthesization Step 1: Characterizing an Optimal Solution • Observation: • Any parenthesization of Ai Ai+1 … Aj must split it between Ak and Ak+1 for some k. • THM: Optimal Matrix Parenthesization: • If an optimal parenthesization of Ai Ai+1 … Aj splits at k, then • the parenthesization of the prefix Ai Ai+1 … Ak must itself be an optimal parenthesization. • Why? • If there existed a less costly way to parenthesize the prefix, then substituting that parenthesization would yield a less costly way to parenthesize Ai Ai+1 … Aj, contradicting the optimality of that parenthesization. common DP proof technique: “cut-and-paste” proof by contradiction source: 91.503 textbook Cormen, et al.

  7. Example: Matrix Parenthesization Step 2: A Recursive Solution Recursive definition of minimum parenthesization cost:
  m[i,j] = 0                                                        if i = j
  m[i,j] = min{ m[i,k] + m[k+1,j] + p_{i-1} p_k p_j : i <= k < j }  if i < j
How many distinct subproblems? (each matrix A_i has dimensions p_{i-1} x p_i) source: 91.503 textbook Cormen, et al.
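As a sketch of how this recurrence reads in code (my own Python, not from the slides), a direct unmemoized transcription looks like the following. It revisits the same (i, j) subproblems exponentially often, which is exactly what the tables of Steps 3 and 4, or the memoization of slide 10, avoid:

  # Direct (unmemoized) transcription of the recurrence -- a sketch only.
  # p[i-1] x p[i] are the dimensions of matrix A_i, as on the slide.
  def rec_matrix_chain(p, i, j):
      if i == j:
          return 0
      best = float("inf")
      for k in range(i, j):          # i <= k < j
          cost = (rec_matrix_chain(p, i, k)
                  + rec_matrix_chain(p, k + 1, j)
                  + p[i - 1] * p[k] * p[j])
          best = min(best, cost)
      return best

  # Only Theta(n^2) distinct (i, j) pairs exist, but this recursion
  # recomputes them again and again.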

  8. Example: Matrix Parenthesization Step 3: Computing Optimal Costs (figure: m and s tables filled in bottom-up by increasing chain length; the stray values 2,500, 2,625, 1,000, 0 are m[i,j] entries from the example table) s: value of k that achieves optimal cost in computing m[i,j] source: 91.503 textbook Cormen, et al.
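A bottom-up sketch in the spirit of the textbook's MATRIX-CHAIN-ORDER (this Python version and its 1-based padding are my own) fills m and s by increasing chain length:

  def matrix_chain_order(p):
      n = len(p) - 1                       # number of matrices
      m = [[0] * (n + 1) for _ in range(n + 1)]   # m[i][j] = min cost
      s = [[0] * (n + 1) for _ in range(n + 1)]   # s[i][j] = best split k
      for length in range(2, n + 1):       # chain length
          for i in range(1, n - length + 2):
              j = i + length - 1
              m[i][j] = float("inf")
              for k in range(i, j):
                  q = m[i][k] + m[k + 1][j] + p[i - 1] * p[k] * p[j]
                  if q < m[i][j]:
                      m[i][j] = q
                      s[i][j] = k          # k that achieves the optimum
      return m, s

  # With the textbook's dimensions p = [30, 35, 15, 5, 10, 20, 25]:
  # m[1][6] == 15125 scalar multiplications.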

  9. Example: Matrix Parenthesization Step 4: Constructing an Optimal Solution
PRINT-OPTIMAL-PARENS(s, i, j)
  if i == j
    print “A”_i
  else
    print “(”
    PRINT-OPTIMAL-PARENS(s, i, s[i, j])
    PRINT-OPTIMAL-PARENS(s, s[i, j] + 1, j)
    print “)”
source: 91.503 textbook Cormen, et al.
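A runnable counterpart to PRINT-OPTIMAL-PARENS, assuming the s table produced by the matrix_chain_order sketch above (returning a string instead of printing is my own choice):

  def optimal_parens(s, i, j):
      if i == j:
          return f"A{i}"
      return ("("
              + optimal_parens(s, i, s[i][j])
              + optimal_parens(s, s[i][j] + 1, j)
              + ")")

  # Usage with the tables from the previous sketch:
  # m, s = matrix_chain_order([30, 35, 15, 5, 10, 20, 25])
  # optimal_parens(s, 1, 6)  ->  "((A1(A2A3))((A4A5)A6))"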

  10. Example: Matrix Parenthesization: Memoization • Provides Dynamic Programming efficiency • But with a top-down strategy • Uses recursion • Fills in the m table “on demand” • (can modify to fill in the s table)
MEMOIZED-MATRIX-CHAIN(p)
  n = p.length - 1
  let m[1..n, 1..n] be a new table
  for i = 1 to n
    for j = i to n
      m[i,j] = ∞
  return LOOKUP-CHAIN(m, p, 1, n)
LOOKUP-CHAIN(m, p, i, j)
  if m[i,j] < ∞
    return m[i,j]
  if i == j
    m[i,j] = 0
  else for k = i to j - 1
    q = LOOKUP-CHAIN(m, p, i, k) + LOOKUP-CHAIN(m, p, k+1, j) + p_{i-1} p_k p_j
    if q < m[i,j]
      m[i,j] = q
  return m[i,j]
source: 91.503 textbook Cormen, et al.
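The same top-down, fill-on-demand idea as a Python sketch (my own code; a dictionary stands in for the m table initialized to ∞):

  def memoized_matrix_chain(p):
      n = len(p) - 1
      memo = {}                                # m table, filled on demand

      def lookup_chain(i, j):
          if (i, j) in memo:                   # already solved: reuse it
              return memo[(i, j)]
          if i == j:
              q = 0
          else:
              q = min(lookup_chain(i, k) + lookup_chain(k + 1, j)
                      + p[i - 1] * p[k] * p[j]
                      for k in range(i, j))
          memo[(i, j)] = q
          return q

      return lookup_chain(1, n)

  # memoized_matrix_chain([30, 35, 15, 5, 10, 20, 25]) -> 15125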

  11. Longest Common Subsequence Dynamic Programming

  12. Example: Longest Common Subsequence (LCS): Motivation • Strand of DNA: string over finite set {A,C,G,T} • each element of set is a base: adenine, guanine, cytosine or thymine • Compare DNA similarities • S1 = ACCGGTCGAGTGCGCGGAAGCCGGCCGAA • S2 = GTCGTTCGGAATGCCGTTGCTCTGTAAA • One measure of similarity: • find the longest string S3 containing bases that also appear (not necessarily consecutively) in S1 and S2 • S3 = GTCGTCGGAAGCCGGCCGAA source: 91.503 textbook Cormen, et al.

  13. Example: LCS Definitions source: 91.503 textbook Cormen, et al. • A sequence Z = <z1, z2, …, zk> is a subsequence of X = <x1, x2, …, xm> if there exists a strictly increasing sequence <i1, i2, …, ik> of indices of X such that x_{i_j} = z_j for all 1 <= j <= k • example: <B, C, D, B> is a subsequence of <A, B, C, B, D, A, B> with index sequence <2, 3, 5, 7> • Z is a common subsequence of X and Y if Z is a subsequence of both X and Y • example: for X = <A, B, C, B, D, A, B> and Y = <B, D, C, A, B, A>, <B, C, A> is a common subsequence but not longest; <B, C, B, A> is a common subsequence. Longest? Longest Common Subsequence Problem: Given 2 sequences X, Y, find a maximum-length common subsequence Z.

  14. Example: LCS Step 1: Characterize an LCS THM 15.1: Optimal LCS Substructure Given sequences X = <x1, …, xm>, Y = <y1, …, yn>, and any LCS Z = <z1, …, zk> of X and Y: 1 if xm = yn, then zk = xm = yn and Zk-1 is an LCS of Xm-1 and Yn-1 2 if xm ≠ yn, then zk ≠ xm implies Z is an LCS of Xm-1 and Y 3 if xm ≠ yn, then zk ≠ yn implies Z is an LCS of X and Yn-1 (using prefix notation) PROOF: based on producing contradictions 1 a) Suppose zk ≠ xm. Appending xm = yn to Z contradicts the longest nature of Z. b) To establish the longest nature of Zk-1, suppose a common subsequence W of Xm-1 and Yn-1 has length > k-1. Appending xm = yn to W yields a common subsequence of length > k, a contradiction. 2 To establish optimality (longest nature): a common subsequence W of Xm-1 and Y of length > k would also be a common subsequence of Xm, Y, contradicting the longest nature of Z. 3 Similar to proof of (2) source: 91.503 textbook Cormen, et al.

  15. Example: LCS Step 2: A Recursive Solution Implications of Theorem 15.1: Is xm = yn? • yes: find LCS(Xm-1, Yn-1); then LCS1(X, Y) = LCS(Xm-1, Yn-1) + xm • no: find LCS(Xm-1, Y) and LCS(X, Yn-1); then LCS2(X, Y) = max(LCS(Xm-1, Y), LCS(X, Yn-1)) An LCS of 2 sequences contains, as a prefix, an LCS of prefixes of the sequences.

  16. Example: LCS Step 2: A Recursive Solution (continued) • Overlapping subproblem structure: Θ(mn) distinct subproblems (conditions of the problem can exclude some subproblems!) • Recurrence for length of optimal solution:
  c[i,j] = 0                                 if i = 0 or j = 0
  c[i,j] = c[i-1,j-1] + 1                    if i,j > 0 and x_i = y_j
  c[i,j] = max(c[i,j-1], c[i-1,j])           if i,j > 0 and x_i ≠ y_j
source: 91.503 textbook Cormen, et al.
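A direct memoized transcription of this recurrence (my own Python sketch, not the course's code):

  from functools import lru_cache

  def lcs_len_recursive(X, Y):
      @lru_cache(maxsize=None)
      def c(i, j):
          # c(i, j) = length of an LCS of the prefixes X[:i] and Y[:j]
          if i == 0 or j == 0:
              return 0
          if X[i - 1] == Y[j - 1]:
              return c(i - 1, j - 1) + 1
          return max(c(i, j - 1), c(i - 1, j))

      return c(len(X), len(Y))

  # Theta(mn) distinct (i, j) subproblems, as noted above.
  # lcs_len_recursive("ABCBDAB", "BDCABA") == 4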

  17. Example: LCS Step 3: Compute Length of an LCS What is the asymptotic worst-case time complexity? (figure: c table for the running example, with arrows representing the b table) source: 91.503 textbook Cormen, et al.
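A bottom-up sketch in the spirit of the textbook's LCS-LENGTH (this Python version is my own and skips the separate b table; the sketch after the next slide reconstructs an LCS from c directly):

  def lcs_length(X, Y):
      m, n = len(X), len(Y)
      # c[i][j] = length of an LCS of X[:i] and Y[:j]
      c = [[0] * (n + 1) for _ in range(m + 1)]
      for i in range(1, m + 1):
          for j in range(1, n + 1):
              if X[i - 1] == Y[j - 1]:
                  c[i][j] = c[i - 1][j - 1] + 1
              else:
                  c[i][j] = max(c[i - 1][j], c[i][j - 1])
      return c                     # Theta(mn) time and space

  # lcs_length("ABCBDAB", "BDCABA")[7][6] == 4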

  18. Example: LCS Step 4: Construct an LCS source: 91.503 textbook Cormen, et al.
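A sketch of Step 4 (my own Python, assuming the lcs_length sketch above): walk the c table backwards, following the same case analysis as the b-table arrows on the slide.

  def reconstruct_lcs(c, X, Y):
      i, j = len(X), len(Y)
      out = []
      while i > 0 and j > 0:
          if X[i - 1] == Y[j - 1]:         # diagonal arrow: symbol is in the LCS
              out.append(X[i - 1])
              i, j = i - 1, j - 1
          elif c[i - 1][j] >= c[i][j - 1]: # up arrow
              i -= 1
          else:                            # left arrow
              j -= 1
      return "".join(reversed(out))

  # c = lcs_length("ABCBDAB", "BDCABA")
  # reconstruct_lcs(c, "ABCBDAB", "BDCABA")  ->  "BCBA"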

  19. Activity Selection Dynamic Programming …leading to a Greedy Algorithm…

  20. Activity Selection Optimization Problem • Problem Instance: • Set S = {a1, a2, ..., an} of n activities • Each activity i has: • start time: si • finish time: fi • Activities require exclusive use of a common resource. • Activities i, j are compatible iff non-overlapping: si >= fj or sj >= fi (the intervals [si, fi) and [sj, fj) do not overlap) • Objective: • select a maximum-sized set of mutually compatible activities source: 91.404 textbook Cormen, et al.

  21. Activity Selection (figure: activities 1 to 8 drawn as intervals against a time axis; axes labeled Activity Number and Time/Duration) What is an answer in this case?

  22. Activity Selection A solution to Sij that includes ak produces 2 subproblems: 1) Sik (activities that start after ai finishes and finish before ak starts) 2) Skj (activities that start after ak finishes and finish before aj starts) c[i,j] = size of a maximum-size subset of mutually compatible activities in Sij. source: 91.404 textbook Cormen, et al.
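A sketch of this DP formulation in Python (my own code, not the course's). It assumes the recurrence c[i,j] = 0 if Sij is empty, and otherwise max over ak in Sij of c[i,k] + c[k,j] + 1, with activities sorted by finish time and fictitious sentinel activities a0 and a(n+1) bracketing the whole problem, in the style of the textbook's treatment:

  from functools import lru_cache

  def max_compatible_activities(activities):
      """activities: list of (start, finish) pairs; returns the size of a
      largest mutually compatible subset via the c[i,j] recurrence."""
      n = len(activities)
      # Sort by finish time so Sij is contained in {a_(i+1), ..., a_(j-1)};
      # a_0 and a_(n+1) are sentinels bracketing the real activities.
      a = ([(float("-inf"), float("-inf"))]
           + sorted(activities, key=lambda act: act[1])
           + [(float("inf"), float("inf"))])

      @lru_cache(maxsize=None)
      def c(i, j):
          # size of a maximum set of mutually compatible activities in Sij
          best = 0
          for k in range(i + 1, j):
              # a_k is in Sij iff it starts after a_i finishes
              # and finishes before a_j starts
              if a[k][0] >= a[i][1] and a[k][1] <= a[j][0]:
                  best = max(best, c(i, k) + c(k, j) + 1)
          return best

      return c(0, n + 1)

  # A small example (not the instance pictured on slide 21):
  # max_compatible_activities([(1, 4), (3, 5), (0, 6), (5, 7),
  #                            (3, 9), (5, 9), (6, 10), (8, 11)])  ->  3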
