# CS 332: Algorithms
##### Presentation Transcript

1. CS 332: Algorithms: Go Over Midterm; Intro to Graph Algorithms. David Luebke, 4/2/2014

2. Problem 1: Recurrences • Give asymptotic bounds for the following recurrences. Justify by naming the case of the master theorem, iterating, or substitution. a. T(n) = T(n-2) + 1. What is the solution? How would you show it? By iteration: T(n) = 1 + 1 + 1 + … + 1, with Θ(n) terms, so T(n) = O(n)
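The iteration above can be sanity-checked numerically; a minimal sketch in Python, assuming base cases T(0) = T(1) = 1 (the base case is not stated on the slide):

```python
def T(n):
    """Evaluate T(n) = T(n-2) + 1 by iteration, assuming T(0) = T(1) = 1."""
    work = 1           # base case contributes one unit
    while n >= 2:
        work += 1      # each level of the recursion adds 1
        n -= 2
    return work

# work grows by 1 for every step of size 2, so T(n) = n // 2 + 1 = O(n)
```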

3. Problem 1: Recurrences b. T(n) = 2T(n/2) + n lg² n • This is a tricky one! What case of the master theorem applies? • Answer: case 2, as generalized in Exercise 4.4-2: if f(n) = Θ(n^(log_b a) lg^k n), where k ≥ 0, then T(n) = Θ(n^(log_b a) lg^(k+1) n) • Here a = b = 2 and k = 2, thus T(n) = Θ(n lg³ n)
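The generalized case 2 bound can also be checked directly with a recursion tree: level i has 2^i subproblems of size n/2^i, and summing the per-level costs gives (a sketch consistent with the slide's answer):

```latex
T(n) = \sum_{i=0}^{\lg n - 1} 2^i \cdot \frac{n}{2^i}\lg^2\!\frac{n}{2^i}
     = n \sum_{i=0}^{\lg n - 1} (\lg n - i)^2
     = n \sum_{j=1}^{\lg n} j^2
     = \Theta(n \lg^3 n).
```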

4. Problem 1: Recurrences c. T(n) = 9T(n/4) + n² • Which case of the master theorem applies? • A: case 3, since n^(log₄ 9) ≈ n^1.58 = O(n^(2-ε)) • What is the answer? • A: Θ(n²)

5. Problem 1: Recurrences d. T(n) = 3T(n/2) + n • What case of the master theorem applies? • A: case 1 • What is the answer? • A: T(n) = Θ(n^(log₂ 3))

6. Problem 1: Recurrences e. T(n) = T(n/2 + √n) + n • Recognize this one? Remember the solution? • A: O(n) • Proof by substitution: • Assume T(n) ≤ cn • Then T(n) ≤ c(n/2 + √n) + n = cn/2 + c√n + n = cn − cn/2 + c√n + n = cn − (cn/2 − c√n − n). What could c and n be to make the second term positive?
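Finishing the substitution (one valid choice of constants among many):

```latex
\frac{cn}{2} - c\sqrt{n} - n \ge 0
\;\iff\; n\left(\frac{c}{2} - 1\right) \ge c\sqrt{n}
\;\iff\; \sqrt{n} \ge \frac{2c}{c - 2} \quad (c > 2),
```

so, for example, c = 4 and n ≥ 16 make the subtracted term nonnegative, giving T(n) ≤ cn and hence T(n) = O(n).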

7. Problem 2: Heaps • Implement BUILD-HEAP() as a recursive divide-and-conquer procedure

8. Problem 2: Heaps • Implement BUILD-HEAP() as a recursive divide-and-conquer procedure:

```
BuildHeap(A, i) {
    if (i <= length(A)/2) {
        BuildHeap(A, 2*i);
        BuildHeap(A, 2*i + 1);
        Heapify(A, i);
    }
}
```
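A runnable version of this divide-and-conquer sketch in Python. The slide's 1-indexed logic is mapped onto a 0-indexed list, and a standard sift-down Heapify is filled in; those details go beyond the slide and are assumptions:

```python
def heapify(a, i, n):
    """Sift a[i] down until the subtree rooted at i is a max-heap (0-indexed)."""
    while True:
        left, right, largest = 2 * i + 1, 2 * i + 2, i
        if left < n and a[left] > a[largest]:
            largest = left
        if right < n and a[right] > a[largest]:
            largest = right
        if largest == i:
            return
        a[i], a[largest] = a[largest], a[i]
        i = largest

def build_heap(a, i=0):
    """Recursive divide-and-conquer BuildHeap: heapify both subtrees, then the root."""
    n = len(a)
    if i < n // 2:              # i is an internal node
        build_heap(a, 2 * i + 1)
        build_heap(a, 2 * i + 2)
        heapify(a, i, n)
```

Calling `build_heap(a)` on any list rearranges it in place into a max-heap, exactly as the pseudocode does when started at the root.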

9. Problem 2: Heaps • Describe the running time of your algorithm as a recurrence

10. Problem 2: Heaps • Describe the running time of your algorithm as a recurrence: T(n) = ???

11. Problem 2: Heaps • Describe the running time of your algorithm as a recurrence: T(n) = 2T(n/2) + O(lg n)

12. Problem 2: Heaps • Describe the running time of your algorithm as a recurrence: T(n) = 2T(n/2) + O(lg n) • Solve the recurrence: ???

13. Problem 2: Heaps • Describe the running time of your algorithm as a recurrence: T(n) = 2T(n/2) + O(lg n) • Solve the recurrence: T(n) = Θ(n) by case 1 of the master theorem
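Case 1 applies because the driving function lg n is polynomially smaller than n^(log_b a) = n; as a sketch:

```latex
a = 2,\; b = 2,\; n^{\log_b a} = n, \qquad
f(n) = O(\lg n) = O\!\left(n^{1-\varepsilon}\right)\ \text{for any }\varepsilon \in (0,1)
\;\Longrightarrow\; T(n) = \Theta(n).
```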

14. Problem 3: Short Answer • Prove that (n+1)² = O(n²) by giving the constants n₀ and c used in the definition of O notation • A: c = 2, n₀ = 3 • Need (n+1)² ≤ cn² for all n ≥ n₀ • (n+1)² ≤ 2n² iff 2n + 1 ≤ n², which is true for n ≥ 3
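These constants can be sanity-checked numerically (a sketch; the range bound 1000 is arbitrary and of course no substitute for the proof):

```python
def witness_holds(c=2, n0=3, n_max=1000):
    """Check the O-notation witness: (n+1)**2 <= c * n**2 for all n in [n0, n_max]."""
    return all((n + 1) ** 2 <= c * n * n for n in range(n0, n_max + 1))
```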

15. Problem 3: Short Answer • Suppose that you want to sort n numbers, each of which is either 0 or 1. Describe an asymptotically optimal method. • What is one method? • What is another?
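One Θ(n) method, sketched here for illustration, is a counting pass: tally the zeros, then rewrite the output. (The slide deliberately leaves the answers open; this is one possibility, not necessarily the one intended.)

```python
def sort_bits(a):
    """Sort a list of 0/1 values in Theta(n) time: count the zeros, then rewrite."""
    zeros = a.count(0)
    return [0] * zeros + [1] * (len(a) - zeros)
```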

16. Problem 3: Short Answer • Briefly describe what we mean by a randomized algorithm, and give two examples. • Necessary: • Behavior determined in part by a random-number generator • Nice but not necessary: • Used to ensure no sequence of operations will guarantee worst-case behavior (adversary scenario) • Examples: • randomized quicksort, randomized select, skip lists, universal hashing

17. Problem 4: BST, Red-Black Trees • Label this tree with {6, 22, 9, 14, 13, 1, 8} so that it is a legal binary search tree. [Tree figure, labeled: root 13; left child 6, right child 14; 6's children 1 and 9; 9's left child 8; 14's right child 22.] • What's the smallest number? Where does it go? What's the next smallest? Where does it go? etc.

18. Problem 4: BST, Red-Black Trees • Label this tree with R and B so that it is a legal red-black tree: • The root is always black • black-height(right subtree) = black-height(left subtree) • [Figure: the tree from slide 17, annotated that for the subtree rooted at 9 to work its child 8 must be red, and likewise 14's child 22 must be red.]

19. Problem 4: BST, Red-Black Trees • Label this tree with R and B so that it is a legal red-black tree: [Figure: 13 colored B; 8 and 22 colored R.] • Can't have two reds in a row

20. Problem 4: BST, Red-Black Trees • Label this tree with R and B so that it is a legal red-black tree: • For the subtree rooted at 6 to work, both of its children (1 and 9) must be black [Figure: 13 B; 14 B; 1 and 9 B; 8 and 22 R.]

21. Problem 4: BST, Red-Black Trees • Label this tree with R and B so that it is a legal red-black tree. Final labeling: 13 B; 6 R, 14 B; 1 B, 9 B, 22 R; 8 R

22. Problem 4: BST, Red-Black Trees • Rotate so the left child of the root becomes the new root. Can it be labeled as a red-black tree? [Figure after the rotation: root 6; left child 1, right child 13; 13's children 9 and 14; 9's left child 8; 14's right child 22.]

23. Problem 4: BST, Red-Black Trees • Rotate so the left child of the root becomes the new root. Can it be labeled as a red-black tree? Yes: 6 B; 1 B, 13 R; 9 B, 14 B; 8 R, 22 R

24. Graphs • A graph G = (V, E) • V = set of vertices • E = set of edges = subset of V × V • Thus |E| = O(|V|²)

25. Graph Variations • Variations: • A connected graph has a path from every vertex to every other • In an undirected graph: • Edge (u,v) = edge (v,u) • No self-loops • In a directed graph: • Edge (u,v) goes from vertex u to vertex v, notated u→v

26. Graph Variations • More variations: • A weighted graph associates weights with either the edges or the vertices • E.g., a road map: edges might be weighted w/ distance • A multigraph allows multiple edges between the same vertices • E.g., the call graph in a program (a function can get called from multiple other functions)

27. Graphs • We will typically express running times in terms of |E| and |V| (often dropping the |·|'s) • If |E| ≈ |V|² the graph is dense • If |E| ≈ |V| the graph is sparse • If you know you are dealing with dense or sparse graphs, different data structures may make sense

28. Representing Graphs • Assume V = {1, 2, …, n} • An adjacency matrix represents the graph as an n × n matrix A: • A[i, j] = 1 if edge (i, j) ∈ E (or the weight of the edge) • A[i, j] = 0 if edge (i, j) ∉ E
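A minimal sketch of this representation in Python (directed and unweighted; using 0-indexed vertices rather than the slide's 1..n is an implementation choice here):

```python
def adjacency_matrix(n, edges):
    """Return the n x n matrix A with A[i][j] = 1 iff (i, j) is in the edge set."""
    a = [[0] * n for _ in range(n)]
    for i, j in edges:
        a[i][j] = 1
    return a
```

Note that the matrix occupies Θ(n²) entries no matter how few edges there are, which is exactly the storage issue the next slides discuss.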

29-30. Graphs: Adjacency Matrix • Example: [Figure: a directed graph on vertices 1-4 with edges labeled a-d, shown alongside its adjacency matrix; the drawing did not survive transcription.]

31. Graphs: Adjacency Matrix • How much storage does the adjacency matrix require? • A: O(V²) • What is the minimum amount of storage needed by an adjacency matrix representation of an undirected graph with 4 vertices? • A: 6 bits • Undirected graph ⇒ matrix is symmetric • No self-loops ⇒ don't need the diagonal • That leaves the 4·3/2 = 6 entries above the diagonal

32. Graphs: Adjacency Matrix • The adjacency matrix is a dense representation • Usually too much storage for large graphs • But can be very efficient for small graphs • Most large interesting graphs are sparse • E.g., planar graphs, in which no edges cross, have |E| = O(|V|) by Euler's formula • For this reason the adjacency list is often a more appropriate representation

33. Graphs: Adjacency List • Adjacency list: for each vertex v ∈ V, store a list of vertices adjacent to v • Example: [Figure: directed graph on vertices 1-4 with edges 1→2, 1→3, 2→3, 4→3] • Adj[1] = {2,3} • Adj[2] = {3} • Adj[3] = {} • Adj[4] = {3} • Variation: can also keep a list of edges coming into each vertex
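The slide's example as runnable Python (a sketch; a dict of lists is one common encoding of the adjacency list):

```python
def adjacency_list(n, edges):
    """Return Adj, where Adj[v] lists the vertices adjacent to v (vertices 1..n)."""
    adj = {v: [] for v in range(1, n + 1)}
    for u, v in edges:
        adj[u].append(v)
    return adj

# The graph from the slide: edges 1->2, 1->3, 2->3, 4->3
adj = adjacency_list(4, [(1, 2), (1, 3), (2, 3), (4, 3)])
```

Total storage is the n list headers plus one entry per edge: Θ(V + E), as the next slide counts.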

34. Graphs: Adjacency List • How much storage is required? • The degree of a vertex v = # of incident edges • Directed graphs have in-degree and out-degree • For directed graphs, the number of items in the adjacency lists is Σ out-degree(v) = |E|, which takes Θ(V + E) storage (why?) • For undirected graphs, the number of items is Σ degree(v) = 2|E| (handshaking lemma), also Θ(V + E) storage • So: adjacency lists take O(V + E) storage

35. The End • Coming up: actually doing something with graphs

36. Exercise 1 Feedback • First, my apologies… • Harder than I thought • Too late to help with the midterm • Proof by substitution: • T(n) = T(n/2 + √n) + n • Most people assumed it was O(n lg n)… why? • It resembled a proof from class: T(n) = 2T(n/2 + 17) + n • The correct intuition: n/2 dominates the √n term, so it resembles T(n) = T(n/2) + n, which is O(n) by the master theorem • Still, if it's O(n) it's O(n lg n), right?

37. Exercise 1: Feedback • So, prove by substitution that T(n) = T(n/2 + √n) + n = O(n lg n) • Assume T(n) ≤ cn lg n • Then T(n) ≤ c(n/2 + √n) lg(n/2 + √n) + n ≤ c(n/2 + √n) lg(3n/2) + n ≤ …
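One way the bound might be finished (this continuation is not on the slide and is an assumption): expand the product and fold everything below cn lg n into a subtracted term,

```latex
c\left(\tfrac{n}{2} + \sqrt{n}\right)\lg\tfrac{3n}{2} + n
= cn\lg n - \left(\tfrac{cn}{2}\lg\tfrac{2n}{3} - c\sqrt{n}\lg\tfrac{3n}{2} - n\right)
\le cn\lg n
```

for sufficiently large n, since (cn/2) lg(2n/3) grows as Θ(n lg n) while the other bracketed terms are o(n lg n).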