
Dynamic programming vs Greedy algo – con’t


Presentation Transcript


  1. Dynamic programming vs Greedy algo – con’t KNAPSACK
  • Input: a number W and a set of n items, the i-th item has a weight wi and a cost ci
  • Output: a subset of items with total weight ≤ W
  • Objective: maximize cost
  Version 1: Items are divisible.

  2. KNAPSACK – divisible: a greedy solution
  KNAPSACK-DIVISIBLE(n,c,w,W)
  • sort items in decreasing order of ci/wi
  • i = 1
  • currentW = 0
  • while (currentW + wi < W) {
  •   take item of weight wi and cost ci
  •   currentW += wi
  •   i++
  • }
  • take W-currentW portion of item i
  Correctness:
  Running time:
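A runnable sketch of KNAPSACK-DIVISIBLE in Python (the (cost, weight) list format and the function name are choices for illustration, not from the slides):

```python
def knapsack_divisible(items, W):
    """Greedy fractional knapsack.

    items: list of (cost, weight) pairs; W: capacity.
    Returns the maximum total cost when items may be split.
    """
    # Sort by cost/weight ratio, best ratio first.
    items = sorted(items, key=lambda cw: cw[0] / cw[1], reverse=True)
    total_cost = 0.0
    current_w = 0.0
    for cost, weight in items:
        if current_w + weight <= W:
            # Whole item fits: take it. (Taking an exactly fitting item here
            # matches the slide's strict '<' plus its final fractional step.)
            current_w += weight
            total_cost += cost
        else:
            # Take only the fraction that still fits, then stop.
            fraction = (W - current_w) / weight
            total_cost += cost * fraction
            break
    return total_cost


# Example: capacity 50, items given as (cost, weight)
print(knapsack_divisible([(60, 10), (100, 20), (120, 30)], 50))  # 240.0
```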

  3. KNAPSACK – indivisible Version 2: Items are indivisible. Does previous algorithm work for this version of KNAPSACK?
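It does not: once items are indivisible, the ratio-greedy choice can block a strictly better subset. A small instance (numbers chosen only for illustration) makes this concrete by comparing greedy's pick with a brute-force optimum:

```python
from itertools import combinations

# Capacity and (cost, weight) items; the cost/weight ratios are 5, 4, 4.
W = 10
items = [(30, 6), (20, 5), (20, 5)]

# Greedy by ratio takes the (30, 6) item first; then neither (20, 5) item fits,
# so greedy ends with cost 30. Brute force over all subsets finds the optimum.
best = max(sum(c for c, w in sub)
           for r in range(len(items) + 1)
           for sub in combinations(items, r)
           if sum(w for c, w in sub) <= W)
print(best)  # 40, from taking the two (20, 5) items
```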

  4. KNAPSACK – indivisible: a dyn-prog solution The heart of the algorithm: S[k][v] =

  5. KNAPSACK – indivisible: a dyn-prog solution The heart of the algorithm: S[k][v] = maximum cost of a subset of the first k items, where the weight of the subset is at most v

  6. KNAPSACK – indivisible: a dyn-prog solution
  The heart of the algorithm: S[k][v] = maximum cost of a subset of the first k items, where the weight of the subset is at most v
  KNAPSACK-INDIVISIBLE(n,c,w,W)
  • init S[0][v]=0 for every v=0,…,W
  • init S[k][0]=0 for every k=0,…,n
  • for v=1 to W do
  •   for k=1 to n do
  •     S[k][v] = S[k-1][v]
  •     if (wk ≤ v) and (S[k-1][v-wk]+ck > S[k][v])
  •       then S[k][v] = S[k-1][v-wk]+ck
  • RETURN S[n][W]
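A direct Python transcription of this pseudocode (a sketch; it assumes the costs and weights are passed as ordinary 0-indexed lists):

```python
def knapsack_indivisible(c, w, W):
    """0/1 knapsack by dynamic programming.

    c[i], w[i]: cost and weight of item i (0-indexed); W: capacity.
    Returns S[n][W], the maximum achievable cost.
    """
    n = len(c)
    # S[k][v] = max cost of a subset of the first k items with total weight <= v
    S = [[0] * (W + 1) for _ in range(n + 1)]
    for v in range(1, W + 1):
        for k in range(1, n + 1):
            S[k][v] = S[k - 1][v]  # case 1: skip item k
            if w[k - 1] <= v and S[k - 1][v - w[k - 1]] + c[k - 1] > S[k][v]:
                S[k][v] = S[k - 1][v - w[k - 1]] + c[k - 1]  # case 2: take item k
    return S[n][W]


print(knapsack_indivisible(c=[30, 20, 20], w=[6, 5, 5], W=10))  # 40
```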

  7. KNAPSACK – indivisible: a dyn-prog solution
  The heart of the algorithm: S[k][v] = maximum cost of a subset of the first k items, where the weight of the subset is at most v
  KNAPSACK-INDIVISIBLE(n,c,w,W)
  • init S[0][v]=0 for every v=0,…,W
  • init S[k][0]=0 for every k=0,…,n
  • for v=1 to W do
  •   for k=1 to n do
  •     S[k][v] = S[k-1][v]
  •     if (wk ≤ v) and (S[k-1][v-wk]+ck > S[k][v])
  •       then S[k][v] = S[k-1][v-wk]+ck
  • RETURN S[n][W]
  Running time:

  8. KNAPSACK – indivisible: a dyn-prog solution
  The heart of the algorithm: S[k][v] = maximum cost of a subset of the first k items, where the weight of the subset is at most v
  KNAPSACK-INDIVISIBLE(n,c,w,W)
  • init S[0][v]=0 for every v=0,…,W
  • init S[k][0]=0 for every k=0,…,n
  • for v=1 to W do
  •   for k=1 to n do
  •     S[k][v] = S[k-1][v]
  •     if (wk ≤ v) and (S[k-1][v-wk]+ck > S[k][v])
  •       then S[k][v] = S[k-1][v-wk]+ck
  • RETURN S[n][W]
  How to output a solution?
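One common way to output a solution, sketched below under the same assumptions: keep the whole table S and walk back from S[n][W], checking for each k whether the entry differs from S[k-1][v]; if it does, item k must have been taken.

```python
def knapsack_solution(c, w, W):
    """Return (best cost, list of chosen item indices) by tracing back through S."""
    n = len(c)
    S = [[0] * (W + 1) for _ in range(n + 1)]
    for v in range(1, W + 1):
        for k in range(1, n + 1):
            S[k][v] = S[k - 1][v]
            if w[k - 1] <= v:
                S[k][v] = max(S[k][v], S[k - 1][v - w[k - 1]] + c[k - 1])

    # Trace back: if dropping item k changes the value, item k was taken.
    chosen, v = [], W
    for k in range(n, 0, -1):
        if S[k][v] != S[k - 1][v]:
            chosen.append(k - 1)
            v -= w[k - 1]
    return S[n][W], sorted(chosen)


print(knapsack_solution(c=[30, 20, 20], w=[6, 5, 5], W=10))  # (40, [1, 2])
```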

  9. Problem: Huffman Coding
  Def: binary character code = assignment of binary strings to characters
  e.g. ASCII code (a fixed-length code):
    A = 01000001
    B = 01000010
    C = 01000011
    …
  How to decode: 01000001010000100100001101000001 ?

  10. Problem: Huffman Coding
  Def: binary character code = assignment of binary strings to characters
  e.g. code (a variable-length code):
    A = 0
    B = 10
    C = 11
    …
  How to decode: 0101001111 ?

  11. Problem: Huffman Coding
  Def: binary character code = assignment of binary strings to characters
  e.g. code (a variable-length code):
    A = 0
    B = 10
    C = 11
    …
  Def: A code is prefix-free if no codeword is a prefix of another codeword.
  How to decode: 0101001111 ?
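Prefix-freeness is what makes decoding unambiguous: scanning left to right, the moment the accumulated bits match a codeword that symbol can be emitted. A small sketch (the function name and dict-based code table are illustrative choices, using the code from this slide):

```python
def decode_prefix_free(bits, code):
    """Decode a bit string with a prefix-free code given as {symbol: codeword}."""
    inverse = {cw: sym for sym, cw in code.items()}
    out, current = [], ""
    for bit in bits:
        current += bit
        if current in inverse:  # prefix-freeness: the first match is the symbol
            out.append(inverse[current])
            current = ""
    if current:
        raise ValueError("dangling bits: " + current)
    return "".join(out)


print(decode_prefix_free("0101001111", {"A": "0", "B": "10", "C": "11"}))  # ABBACC
```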

  12. Problem: Huffman Coding
  Def: binary character code = assignment of binary strings to characters
  e.g. another code (a variable-length code):
    A = 1
    B = 10
    C = 11
    …
  Def: A code is prefix-free if no codeword is a prefix of another codeword.
  How to decode: 10101111 ?

  13. Problem: Huffman Coding
  Def: Huffman coding is an optimal prefix-free code.
  Optimization problems
  • Input:
  • Output:
  • Objective:

  14. Problem: Huffman Coding
  Def: Huffman coding is an optimal prefix-free code.
  Huffman coding
  • Input: an alphabet with frequencies
  • Output: a prefix-free code
  • Objective: minimize expected number of bits per character
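In symbols (a restatement of the objective above, writing fi for the frequency of character ai and C(ai) for its codeword):

$$\min_{C\ \text{prefix-free}} \; \sum_{i=1}^{n} f_i \cdot \operatorname{length}(C(a_i))$$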

  15. Problem: Huffman Coding
  Example: A 60%, B 20%, C 10%, D 10%
  Is fixed-width coding optimal?
  Huffman coding
  • Input: an alphabet with frequencies
  • Output: a prefix-free code
  • Objective: minimize expected number of bits per character

  16. Problem: Huffman Coding
  Example: A 60%, B 20%, C 10%, D 10%
  Is fixed-width coding optimal?
  NO, there exists a prefix-free code using 1.6 bits per character!
  Huffman coding
  • Input: an alphabet with frequencies
  • Output: a prefix-free code
  • Objective: minimize expected number of bits per character
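The 1.6 figure can be checked with a code such as A=0, B=10, C=110, D=111 (one optimal prefix-free code for these frequencies; the slide does not spell it out): the expected length is 0.6×1 + 0.2×2 + 0.1×3 + 0.1×3 = 1.6 bits per character, versus 2 bits for any fixed-width code on four symbols.

```python
freqs = {"A": 0.6, "B": 0.2, "C": 0.1, "D": 0.1}
code  = {"A": "0", "B": "10", "C": "110", "D": "111"}  # one optimal prefix-free code
expected_bits = sum(freqs[s] * len(code[s]) for s in freqs)
print(round(expected_bits, 2))  # 1.6, versus 2.0 for a fixed-width 2-bit code
```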

  17. Problem: Huffman Coding
  Huffman( [a1,f1], [a2,f2], …, [an,fn] )
  • if n = 1 then
  •   code[a1] ← ""
  • else
  •   let fi, fj be the two smallest f's
  •   Huffman( [ai, fi+fj], [a1,f1], …, [an,fn] )   (the argument list omits ai and aj)
  •   code[aj] ← code[ai] + "0"
  •   code[ai] ← code[ai] + "1"
  Huffman coding
  • Input: an alphabet with frequencies
  • Output: a prefix-free code
  • Objective: minimize expected number of bits per character
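An iterative Python version of the same procedure, using a min-heap in place of the recursion (a sketch; the symbol-to-frequency dict input is an assumption, not from the slides):

```python
import heapq
from itertools import count


def huffman(freqs):
    """Build a prefix-free code for {symbol: frequency} by repeatedly merging
    the two least frequent entries, as in the recursive Huffman procedure."""
    tie = count()  # tie-breaker so the heap never compares symbol lists
    heap = [(f, next(tie), [sym]) for sym, f in freqs.items()]
    heapq.heapify(heap)
    code = {sym: "" for sym in freqs}
    if len(heap) == 1:
        return code  # single symbol: empty codeword, as in the base case
    while len(heap) > 1:
        f1, _, syms1 = heapq.heappop(heap)  # the two smallest frequencies
        f2, _, syms2 = heapq.heappop(heap)
        for s in syms1:                     # prepend one bit to every symbol
            code[s] = "0" + code[s]         # already inside each merged group
        for s in syms2:
            code[s] = "1" + code[s]
        heapq.heappush(heap, (f1 + f2, next(tie), syms1 + syms2))
    return code


print(huffman({"A": 0.6, "B": 0.2, "C": 0.1, "D": 0.1}))
# An optimal code such as {'A': '1', 'B': '00', 'C': '010', 'D': '011'};
# exact bits depend on tie-breaking, but all outcomes give 1.6 bits/char on average.
```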

  18. Problem: Huffman Coding
  Lemma 1: Let x,y be the symbols with frequencies fx > fy. Then in an optimal prefix code length(Cx) ≤ length(Cy).

  19. Problem: Huffman Coding
  Lemma 1: Let x,y be the symbols with frequencies fx > fy. Then in an optimal prefix code length(Cx) ≤ length(Cy).
  Lemma 2: If w is a longest codeword in an optimal code then there exists another codeword of the same length.

  20. Problem: Huffman Coding
  Lemma 1: Let x,y be the symbols with frequencies fx > fy. Then in an optimal prefix code length(Cx) ≤ length(Cy).
  Lemma 2: If w is a longest codeword in an optimal code then there exists another codeword of the same length.
  Lemma 3: Let x,y be the symbols with the smallest frequencies. Then there exists an optimal prefix code such that the codewords for x and y differ only in the last bit.

  21. Problem: Huffman Coding Lemma 3: Let x,y be the symbols with the smallest frequencies. Then there exists an optimal prefix code such that the codewords for x and y differ only in the last bit.

  22. Problem: Huffman Coding
  Lemma 1: Let x,y be the symbols with frequencies fx > fy. Then in an optimal prefix code length(Cx) ≤ length(Cy).
  Lemma 2: If w is a longest codeword in an optimal code then there exists another codeword of the same length.
  Lemma 3: Let x,y be the symbols with the smallest frequencies. Then there exists an optimal prefix code such that the codewords for x and y differ only in the last bit.
  Theorem: The prefix code output by the Huffman algorithm is optimal.
