
Greedy Minimum Spanning Tree Algorithm


Presentation Transcript


  1. Greedy Minimum Spanning Tree Algorithm
The next greedy spanning tree algorithm we examine is Prim’s Algorithm. Like Kruskal’s algorithm, Prim builds a spanning tree by adding one edge at a time. The difference is that in Prim’s algorithm the edges chosen so far must always form a tree on their vertices. Thus the component partition used for Kruskal (and not needed here) would have at most one set with more than one element.
General Formulation:
Prim:
  T ← an arbitrary vertex of G
  for (i = 1 to n-1) {
    e ← minimum-weight edge not in T that has exactly one endpoint in T
    T ← T + e
  }

  2. Greedy Minimum Spanning Tree Algorithms
General Formulations:
Kruskal:
  T ← a minimum-weight edge
  for (i = 1 to n-2) {
    e ← minimum-weight edge not in T that does not create a simple cycle when added to T
    T ← T + e
  }
Prim:
  T ← an arbitrary vertex of G
  for (i = 1 to n-1) {
    e ← minimum-weight edge not in T that has exactly one endpoint in T
    T ← T + e
  }

  3. Implementing Prim’s Algorithm
• The two minimum spanning tree algorithms are fairly simple conceptually
• It is not hard to follow each algorithm by hand on a specific weighted graph
• As seen previously with Kruskal’s algorithm, what makes each algorithm work efficiently is the underlying data structures
• The data structure supports the critical step in which a new edge is chosen for the tree or forest
• We will illustrate the geometric tracing of Prim’s algorithm first
• We will then show how the data structures are used; this is critical to an understanding of the complexity of the algorithm

  4. Example: Prim’s Algorithm We will illustrate Prim’s Algorithm by applying it to our computer network. We also list the edges in the order in which they were inserted. We suppose that San Francisco is the vertex chosen to start the process.

  5. Example: Prim’s Algorithm
T ← T + e, where e is a minimum-weight edge with exactly one endpoint in T
Edges inserted so far: {San Francisco, Denver}

  6. Example: Prim’s Algorithm
T ← T + e, where e is a minimum-weight edge with exactly one endpoint in T
Edges inserted so far: {San Francisco, Denver}, {San Francisco, Chicago}

  7. Example: Prim’s Algorithm
T ← T + e, where e is a minimum-weight edge with exactly one endpoint in T
Edges inserted so far: {San Francisco, Denver}, {San Francisco, Chicago}, {Chicago, Atlanta}

  8. Example: Prim’s Algorithm
T ← T + e, where e is a minimum-weight edge with exactly one endpoint in T
Edges inserted so far: {San Francisco, Denver}, {San Francisco, Chicago}, {Chicago, Atlanta}, {Atlanta, New York}
Total Cost: $3700/Month

  9. Implementing Prim’s Algorithm
• To implement Prim, we keep a list of candidate edges to add to the tree T
• Instead of a list of all such edges, we maintain exactly one minimum-weight edge for each vertex not in T
• For each vertex u of G not in T, let min[u] be the weight of a lightest edge with one endpoint equal to u and the other endpoint in T
• Moreover, let parent[u] be a vertex of T such that w(u, parent[u]) = min[u]
• The next vertex to be added to T is the vertex u having the least value of min[u]; the edge to be added is (u, parent[u])
• Once we add this vertex and edge, we must check whether adding u to the tree has changed the value of min[v] for any vertex v still outside T
• This can only happen if v is adjacent to u and w(u,v) < min[v]
• If so, we need to change min[v] to w(u,v) and change parent[v] to u (a sketch of this step follows below)
• This leads to the capabilities needed by the data structure that maintains the values min[v]
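To make the update step concrete, here is a minimal C sketch of it. It assumes, purely for illustration, an n x n adjacency matrix w stored row-major with the sentinel NO_EDGE for missing edges, vertices numbered 0..n-1, and arrays inT, min, and parent indexed by vertex; none of these names come from the slides.

#include <limits.h>

#define NO_EDGE INT_MAX   /* sentinel meaning "no edge"; an assumption for this sketch */

/* Illustrative update step: vertex u has just been added to the tree.
 * For each vertex v still outside the tree, if the edge (u,v) is lighter
 * than v's current best connection min[v], record the new best weight and
 * remember u as v's parent. */
void update_candidates(int n, const int *w, const int *inT,
                       int *min, int *parent, int u)
{
    for (int v = 0; v < n; v++) {
        if (!inT[v] && w[u * n + v] != NO_EDGE && w[u * n + v] < min[v]) {
            min[v] = w[u * n + v];   /* lighter connection to the tree found */
            parent[v] = u;           /* remember which tree vertex provides it */
        }
    }
}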

  10. Data Structure of min[v]
• What data structure should we use to maintain the values min[v] for the vertices not in T?
• If we were to use an unsorted array, we might have to search the entire array to find the vertex with the smallest min-value
• If we use a sorted array, then we will have to re-sort the array every time we update a min-value after adding a vertex to T
• Neither of these is acceptable
• Since we need to delete the vertex with minimum value, a min-heap seems a good idea
• However, we will also need to be able to update values in the heap
• This is difficult to do in the regular heap representation studied earlier
• Instead we will use an indirect heap representation

  11. Data Structure for min[v]
• We will use the min array as the key array in the indirect heap representation
• To indicate that the value in the key array at index i is not in the heap, we will set into[i] to -1
• Thus we can test whether key[i] is in the heap by checking to see that into[i] is positive

  12. Data Structure for min[v]
• The data structure we will use will be an object h with the following internal structure:
  • an array key[] to hold the data values
  • arrays into[] and outof[] which, together with the key array, define the heap
• A vital operation on our object is the initialization method init(A,n), where A is an array of n elements
• This method copies the entries of A into h.key[] and executes the Heapify algorithm for the indirect heap
• The other operations are given on the next slide

  13. Prim Implementation
• The container h is an ADT that supports the following operations:
  • h.init(min,n): initializes h to the values in min
  • h.del(): deletes and returns an item in h with minimum weight
  • h.isin(w): returns true if w is the vertex of an item in h
  • h.keyval(w): returns the weight corresponding to w in h
  • h.decrease(w,wgt): changes the weight associated with w to wgt, a smaller value
• As described previously, we use a binary min-heap to implement h, using an indirect representation
• We then get the following running times for the above operations:
  • h.init(min,n): Θ(n)
  • h.del(): Θ(lg n)
  • h.isin(w): Θ(1)
  • h.keyval(w): Θ(1)
  • h.decrease(w,wgt): Θ(lg n)
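One way to picture this container in C is sketched below. The struct layout and the function prototypes are assumptions that simply mirror the slide's notation (key, into, outof and the five operations); they are not code from the text.

/* Sketch of the indirect min-heap container used by Prim (names are assumptions). */
typedef struct {
    int size;     /* number of items currently in the heap                          */
    int *key;     /* key[w] = weight currently associated with vertex w             */
    int *into;    /* into[w] = heap node holding key[w], or -1 if w is not in the heap */
    int *outof;   /* outof[j] = key index (vertex) stored at heap node j            */
} IndHeapStr;

void initHeap(IndHeapStr *h, const int *min, int n);   /* Θ(n)                  */
int  delMin(IndHeapStr *h);                            /* Θ(lg n)               */
int  isin(const IndHeapStr *h, int w);                 /* Θ(1): into[w] > 0     */
int  keyval(const IndHeapStr *h, int w);               /* Θ(1): key[w]          */
void decrease(IndHeapStr *h, int w, int wgt);          /* Θ(lg n)               */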

  14. [Figure: the initial tree consists of vertex S; the highlighted edge at each remaining vertex indicates its closest connection to the current tree]

  15. Prim Implementation
• We assume an adjacency list representation for weighted graphs
• The entries in the adjacency list for vertex v are objects with two fields, one for the other endpoint of the edge and another for the weight of the edge joining the two vertices
• Thus adj[v] is a pointer to the first object in the adjacency list for v
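A possible C declaration for such an adjacency-list record is sketched below; the field names vert, weight, and next are chosen to match the pseudocode on the next slide, and everything else is an assumption.

/* One record in the adjacency list of a vertex (illustrative sketch). */
typedef struct EdgeNode {
    int vert;               /* the other endpoint of the edge               */
    int weight;             /* weight of the edge joining the two vertices  */
    struct EdgeNode *next;  /* next record in this vertex's list, or NULL   */
} EdgeNode;

/* adj[v] would point to the first EdgeNode in the adjacency list for v. */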

  16. Prim Implemented
prim(adj, start, parent) {
  n = adj.last
  for i = 1 to n {
    min[i] = ∞
    parent[i] = nil
  }
  min[start] = 0
  h.init(min, n)
  for i = 1 to n {
    v = h.del()
    ref = adj[v]              // pointer to the first record in the adjacency list
    while (ref != null) {
      u = ref.vert            // the adjacent vertex
      if (h.isin(u) && ref.weight < h.keyval(u)) {
        parent[u] = v
        h.decrease(u, ref.weight)
      }
      ref = ref.next          // the next adjacent vertex, or null
    }
  }
}

  17. Prim Implemented (with running-time annotations)
prim(adj, start, parent) {
  n = adj.last
  for i = 1 to n                   // Θ(n) overall
    min[i] = ∞
  min[start] = 0
  parent[start] = 0
  h.init(min, n)
  for i = 1 to n {
    v = h.del()                    // Θ(lg n), done n times, so Θ(n lg n)
    ref = adj[v]
    while (ref != null) {          // executed twice for each edge, so Θ(m) iterations of the while-loop overall
      w = ref.vert
      if (h.isin(w) && ref.weight < h.keyval(w)) {
        parent[w] = v
        h.decrease(w, ref.weight)  // O(lg n), executed at most once each time through the while-loop, so O(m lg n)
      }
      ref = ref.next
    }
  }
}

  18. Prim Running Time
• Since m ≥ n-1, the previous analysis shows that the worst-case running time for Prim’s algorithm is O(m lg n)
• One can show that the algorithm is also Ω(m lg n) and thus is Θ(m lg n)
• See the text, pages 291-294
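Putting the annotations from the previous slide together as a single (routine) calculation, with n vertices and m edges:

  Θ(n) for initialization + n·Θ(lg n) for the n deletions + O(m lg n) for the decrease operations
  = O((n + m) lg n) = O(m lg n), since m ≥ n − 1.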

  19. Prim Correctness
• The proof is analogous to the proof for Kruskal, showing that at each step the partial tree that has been formed is a subgraph of a minimal spanning tree for the graph
• We now turn to the indirect min-heap representation needed by Prim’s algorithm

  20. Indirect Heap Structures
• We maintain one array, key, to hold the values that may be stored in the heap, in some arbitrary order
• For example, the key array indices may correspond to the vertices of a graph, and the value in the key array is the value being stored at some node in the heap
• We will never move values from one position to another in the key array
• Thus, when we wish to modify the value corresponding to, say, a vertex, we know where to find it
• We will not use a heap array per se
• Instead, we define the configuration of the heap structure by means of the key array and two other arrays:
  • into: into[i] is the index (node number) in the heap structure where key[i] is found
  • outof: outof[i] is the index in the key array where the key value of node i of the heap structure is found

  21. Example: Indirect Heaps
• We will convert the following heap to an indirect heap representation:
[Figure: a 10-node heap drawn as a tree (node indices 1-10) alongside the key array (key indices 1-10) and the into and outof arrays; into identifies the node containing key[10], and outof gives the key index for the value in node 10]

  22. Example: Indirect Heaps
• We will convert the following heap to an indirect heap representation:
[Figure: the same 10-node heap with the key, into, and outof arrays; into maps a key index to the node containing it, and outof maps a node index back to the key index]
• Key property: into[outof[i]] = i and outof[into[j]] = j
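The key property can be checked mechanically. Below is a small illustrative C helper; the function name and the 1-based array convention are assumptions chosen to match the slides' indexing.

/* Returns 1 if the into and outof arrays are mutually consistent, 0 otherwise. */
int indirectHeapConsistent(const int *into, const int *outof, int nkeys, int heapsize)
{
    /* Every key index i still in the heap must satisfy outof[into[i]] == i. */
    for (int i = 1; i <= nkeys; i++)
        if (into[i] > 0 && outof[into[i]] != i)
            return 0;
    /* Every heap node j must satisfy into[outof[j]] == j. */
    for (int j = 1; j <= heapsize; j++)
        if (into[outof[j]] != j)
            return 0;
    return 1;
}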

  23. Example: Indirect Heap Swap
We will show how to swap two values (G and E) in an indirect heap.
Note that E is in node 5 and is at key index 6; also, G is in node 3 and is at key index 9.
[Figure: the key, into, and outof arrays before the swap]

  24. Example: Indirect Heap Swap
E is in node 5 at key index 6; G is in node 3 at key index 9.
When we swap E and G, we must update the outof array.
[Figure: the key, into, and outof arrays]

  25. Example: Indirect Heap Swap
When we swap E and G, we must update the outof array:
  swap(outof[3], outof[5])
[Figure: the key, into, and outof arrays]

  26. Example: Indirect Heap Swap
When we swap E and G, we must update the outof array and then update the into array:
  swap(outof[3], outof[5])
  into[outof[3]] = 3; into[outof[5]] = 5
[Figure: the key, into, and outof arrays]

  27. Example: Indirect Heap Swap
• Swapping two values (G and E) in an indirect heap:
  swap(outof[3], outof[5]);
  into[outof[3]] = 3; into[outof[5]] = 5
[Figure: the key, into, and outof arrays after the swap]

  28. Example: Indirect Heap Swap
[Figure: the original heap structure (nodes 1-10) and the new heap structure after executing the swap code: swap(outof[3], outof[5]); into[outof[3]] = 3; into[outof[5]] = 5]

  29. Code for Indirect Heap Swapping
#include <stdio.h>   /* for printf */
#include <stdlib.h>  /* for exit   */

void indirectSwap(IndHeapStr *H, int node1, int node2) {
    int tmp;
    if (node1 < 1 || node2 < 1 || node1 > H->size || node2 > H->size) {
        printf("Attempting to swap nonexistent nodes ");
        printf("in an indirect Heap structure\n");
        exit(1);
    }
    // swap outof entries
    tmp = H->outof[node1];
    H->outof[node1] = H->outof[node2];
    H->outof[node2] = tmp;
    // adjust into entries
    H->into[H->outof[node1]] = node1;
    H->into[H->outof[node2]] = node2;
}
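As a brief usage sketch, the fragment below replays the swap from the earlier example (E at node 5 / key index 6, G at node 3 / key index 9). It assumes the IndHeapStr layout sketched earlier in these notes, and the initialization shown is made up purely for illustration.

#include <stdio.h>

int main(void)
{
    int into[11]  = {0};   /* key index -> heap node (1-based; unused slots left 0) */
    int outof[11] = {0};   /* heap node -> key index                                */

    /* E is in node 5 at key index 6; G is in node 3 at key index 9. */
    into[6] = 5;  outof[5] = 6;
    into[9] = 3;  outof[3] = 9;

    IndHeapStr H;
    H.size  = 10;
    H.key   = NULL;        /* key values are not touched by the swap */
    H.into  = into;
    H.outof = outof;

    indirectSwap(&H, 3, 5);

    /* Expect outof[3] = 6, outof[5] = 9, into[6] = 3, into[9] = 5. */
    printf("outof[3]=%d outof[5]=%d into[6]=%d into[9]=%d\n",
           outof[3], outof[5], into[6], into[9]);
    return 0;
}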

  30. Increasing a Value in an Indirect Heap
Increase 12 to 89.
[Figure: the key, into, and outof arrays before the increase]

  31. Increasing a Value in an Indirect Heap
Increase 12 to 89.
[Figure: the key, into, and outof arrays]

  32. Increasing a Value in an Indirect Heap
Increase 12 to 89; restore the heap property by “siftup”.
Move the parent value down:
  outof[8] = outof[4]
  into[outof[8]] = 8
[Figure: the key, into, and outof arrays]

  33. Increasing a Value in an Indirect Heap
Increase 12 to 89; restore the heap property by “siftup”.
Move the parent value down:
  outof[8] = outof[4]
  into[outof[8]] = 8
[Figure: the key, into, and outof arrays]

  34. Increasing a Value in an Indirect Heap
Increase 12 to 89; restore the heap property by “siftup”.
Move the parent value down:
  outof[8] = outof[4]
  into[outof[8]] = 8
[Figure: the key, into, and outof arrays]

  35. Increasing a Value in an Indirect Heap
Increase 12 to 89; restore the heap property by “siftup”.
  outof[4] = outof[2]
  into[outof[4]] = 4
[Figure: the key, into, and outof arrays]

  36. Increasing a Value in an Indirect Heap
Increase 12 to 89; restore the heap property by “siftup”.
  outof[4] = outof[2]
  into[outof[4]] = 4
[Figure: the key, into, and outof arrays]

  37. Increasing a Value in an Indirect Heap
Increase 12 to 89; restore the heap property by “siftup”.
Slot ok, so:
  outof[2] = 8
  into[8] = 2
[Figure: the key, into, and outof arrays]

  38. Increasing a Value in an Indirect Heap
Increase 12 to 89; restore the heap property by “siftup”.
Slot ok, so:
  outof[2] = 8
  into[8] = 2
[Figure: the key, into, and outof arrays]

  39. Increasing a Value in an Indirect Heap
Increase 12 to 89; restore the heap property by “siftup”.
Slot ok, so:
  outof[2] = 8
  into[8] = 2
Done!
[Figure: the final key, into, and outof arrays]

  40. Code for Heap Increase Function
• Input parameters: i (a key index) and newval. Output parameters: none.
increase(i, newval) {
  key[i] = newval
  c = into[i]                    // node currently holding key[i]; a possible location for newval
  while (c > 1) {                // while the possible location is not the root
    p = c/2                      // parent location
    if (key[outof[p]] ≥ newval)  // we can put newval into position c
      break                      // because its parent has value ≥ newval, so break out of the while loop
    else {
      outof[c] = outof[p]        // move the value at p down to c
      into[outof[c]] = c
      c = p                      // move c up one level
    }
  }
  outof[c] = i                   // put newval into position c
  into[i] = c
}
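Prim's algorithm needs the mirror-image operation, a decrease on a min-heap, which is what h.decrease(w,wgt) performs; the sift-up is the same except that the comparison is reversed. Below is a hedged C sketch reusing the IndHeapStr layout assumed earlier; it is not code from the text.

/* Decrease the key stored at key index i to newval (assumed <= the old value)
 * and restore min-heap order by sifting the item up; 1-based node numbering. */
void decrease(IndHeapStr *h, int i, int newval)
{
    h->key[i] = newval;
    int c = h->into[i];             /* node currently holding key[i]           */
    while (c > 1) {                 /* stop when we reach the root             */
        int p = c / 2;              /* parent node                             */
        if (h->key[h->outof[p]] <= newval)
            break;                  /* parent is already no larger than newval */
        h->outof[c] = h->outof[p];  /* move the parent's value down to node c  */
        h->into[h->outof[c]] = c;
        c = p;                      /* continue from the parent's slot         */
    }
    h->outof[c] = i;                /* place the decreased item                */
    h->into[i] = c;
}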

  41. Running Time of Indirect Heap Increase
• The worst that can happen is that you are increasing the value at a leaf that is at the lowest level of the tree and that the new value is greater than the value at the root
• In that case you must move up the longest path, so the running time is Θ(h), where h is the height of the tree
• Thus, as a function of the number n of values in the heap, the running time is Θ(lg n)

  42. Homework
Page 294: #2
Special problem on the next two slides
If the indirect representation of heaps has been covered: Page 149: #28

  43. (a) Prim’s algorithm is being run on the graph below. The current tree vertices are the black nodes and the vertices not in the tree are the white nodes. The shaded edges are the tree edges. Fill in the values in the parent array and the key array for the white nodes. Recall that the nodes not in the tree are kept in a priority queue based on the key values.

  44. (b) Now fill in the values in the two arrays after the next vertex and edge are added to the tree. Make sure you shade out the table entries for the new tree vertex.
