In this detailed lecture, we cover the fundamentals of priority queues and heaps, including the essential operations ExtractMin, DecreaseKey, and Insert, as well as the heap property. The priority queue is implemented using heaps, which not only allow efficient data handling but also enable sorting via Heapsort. We then turn to the Bellman-Ford algorithm, focusing on its ability to handle graphs with negative edge weights, and finish with the all-pairs shortest-path problem and the Floyd-Warshall algorithm. The lecture explains the vertex distance estimates, the role of negative cycles, and the correctness of the algorithms through detailed explanations and examples.
Algorithms (and Data Structures) Lecture 7 MAS 714 part 2 Hartmut Klauck
Data Structure • Store n vertices and their distance estimate d(v) • Operations: • ExtractMin: Find (and remove) the vertex with minimum d(v) • DecreaseKey(v,x): replace the key of v with a smaller value x • Initialize • Insert(v,x): insert v with key x • Test for emptiness • Such a data structure is called a Priority Queue
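For reference, the operations above can be collected into an abstract interface; a minimal sketch (the class and method names are our own, not part of the lecture):

```python
from abc import ABC, abstractmethod

class PriorityQueue(ABC):
    """Hypothetical interface for the operations listed above;
    the heap on the following slides is one possible implementation."""

    @abstractmethod
    def insert(self, v, x):
        """Insert vertex v with key x."""

    @abstractmethod
    def extract_min(self):
        """Find and remove the vertex with minimum key."""

    @abstractmethod
    def decrease_key(self, v, x):
        """Replace the key of v with the smaller value x."""

    @abstractmethod
    def is_empty(self):
        """Test for emptiness."""
```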
Heaps • We will implement a priority queue with a heap • Heaps can also be used for sorting! • Heapsort: Insert all elements, then ExtractMin until empty • If all operations take time O(log n) we can sort in time O(n log n)
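As a quick illustration of the Heapsort idea, here is a minimal sketch using Python's standard heapq module rather than the array heap developed on the following slides; the function name heapsort is our own.

```python
import heapq

def heapsort(values):
    """Insert all elements into a min-heap, then ExtractMin until empty."""
    heap = []
    for v in values:
        heapq.heappush(heap, v)        # n inserts, O(log n) each
    return [heapq.heappop(heap) for _ in range(len(heap))]  # n ExtractMins

print(heapsort([5, 1, 4, 2, 3]))       # [1, 2, 3, 4, 5]
```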
Heaps • A heap is an array of length n • It can hold at most n keys/vertices/numbers • The keys in the array are not sorted, but their order has the heap property • Namely, they can be viewed as a binary tree in which parents have smaller keys than their children • ExtractMin is easy! • Unfortunately we need to work to maintain the heap property after removing the root
Heaps • Keys in a heap are arranged as a complete binary tree whose last level is filled from the left up to some point • Example: (figure omitted)
Heaps • Heap property: • The element in cell i is smaller than the elements in cells 2i and 2i+1 • Example: (figure omitted: the same heap drawn as a tree and as an array)
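For concreteness, a small made-up example using the 1-indexed array layout of the slides, together with a check of the heap property (written with ≤ so that equal keys are allowed):

```python
# Hypothetical example heap; cell 0 is unused so that the root sits in cell 1.
H = [None, 1, 3, 2, 7, 4, 6, 5]
SIZE = 7

def is_min_heap(H, size):
    """Check that H[i] <= H[2i] and H[i] <= H[2i+1] for every position i."""
    for i in range(1, size + 1):
        for child in (2 * i, 2 * i + 1):
            if child <= size and H[i] > H[child]:
                return False
    return True

print(is_min_heap(H, SIZE))  # True: 1 sits above 3 and 2, 3 above 7 and 4, 2 above 6 and 5
```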
Heaps • Besides the array storing the keys we also keep a counter SIZE that tells us how many keys are in H • in cells 1…SIZE
Procedures for Heaps • Initialize: • Declare the array H of correct length n • Set SIZE to 0 • Test for Emptiness: • Check SIZE • FindMin: • Minimum is in H[1] • All this in time O(1)
Procedures for Heaps • Parent(i) is ⌊i/2⌋ • LeftChild(i) is 2i • RightChild(i) is 2i+1
Procedures for Heaps • Suppose we remove the root and replace it with the last element of the heap (H[SIZE]) • Now we still have a tree, but we (probably) violate the heap property at some vertex i, i.e., H[i]>H[2i] or H[i]>H[2i+1] • The procedure Heapify(i) will fix this • Heapify(i) assumes that the subtrees below i are correct heaps, but there is a (possible) violation at i • And no other violations in H (i.e., above i)
Procedures for Heaps • Heapify(i) • l=LeftChild(i), r=RightChild(i) • If l ≤ SIZE and H[l]<H[i] Smallest=l else Smallest=i • If r ≤ SIZE and H[r]<H[Smallest] Smallest=r • If Smallest ≠ i • Swap H[i] and H[Smallest] • Heapify(Smallest)
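A direct Python transcription of this pseudocode might look as follows; the 1-indexed array convention and the function name are our own choices:

```python
def heapify(H, size, i):
    """Restore the heap property at position i, assuming the subtrees below i
    are already correct heaps (H is 1-indexed, cell 0 unused)."""
    l, r = 2 * i, 2 * i + 1
    smallest = i
    if l <= size and H[l] < H[i]:
        smallest = l
    if r <= size and H[r] < H[smallest]:
        smallest = r
    if smallest != i:
        H[i], H[smallest] = H[smallest], H[i]   # push the violating key one level down
        heapify(H, size, smallest)              # at most one call per level, O(log n) total
```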
Heapify • Running Time: • A heap with SIZE=n has depth at most log n • Running time is dominated by the number of recursive calls • Each call leads to a subheap that is 1 level shallower • Time O(log n)
Procedures for Heaps • ExtractMin(): • min=H[1] • H[1]=H[SIZE] • SIZE=SIZE-1 • Heapify(1) • Return min • Time is O(log n)
Procedures for Heaps • DecreaseKey(i,key) • If H[i]<key return error • H[i]=key \\ Now Parent(i) might violate the heap property • While i>1 and H[Parent(i)]>H[i] • Swap H[Parent(i)] and H[i], i=Parent(i) \\ Move the element towards the root • Time is O(log n)
Procedures for Heaps • Insert(key): • SIZE=SIZE+1 • H[SIZE]=∞ • DecreaseKey(SIZE,key) • Time is O(log n)
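Putting ExtractMin, DecreaseKey and Insert together, a minimal self-contained sketch of the array heap could look like this (the class and method names are our own; ∞ is represented by math.inf):

```python
import math

class MinHeap:
    def __init__(self, n):
        self.H = [None] * (n + 1)   # 1-indexed array, cell 0 unused
        self.size = 0

    def heapify(self, i):
        l, r, smallest = 2 * i, 2 * i + 1, i
        if l <= self.size and self.H[l] < self.H[smallest]:
            smallest = l
        if r <= self.size and self.H[r] < self.H[smallest]:
            smallest = r
        if smallest != i:
            self.H[i], self.H[smallest] = self.H[smallest], self.H[i]
            self.heapify(smallest)

    def extract_min(self):
        minimum = self.H[1]
        self.H[1] = self.H[self.size]        # move the last key to the root
        self.size -= 1
        self.heapify(1)                      # fix the single violation at the root
        return minimum

    def decrease_key(self, i, key):
        assert key <= self.H[i], "new key must not be larger than the old one"
        self.H[i] = key
        while i > 1 and self.H[i // 2] > self.H[i]:   # walk the key towards the root
            self.H[i // 2], self.H[i] = self.H[i], self.H[i // 2]
            i = i // 2

    def insert(self, key):
        self.size += 1
        self.H[self.size] = math.inf         # new leaf with key infinity ...
        self.decrease_key(self.size, key)    # ... then decrease it to the real key

h = MinHeap(5)
for x in [4, 1, 3, 2]:
    h.insert(x)
print([h.extract_min() for _ in range(4)])   # [1, 2, 3, 4]
```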
Note for Dijkstra • DecreaseKey(i,x) works on the vertex that is stored in position i in the heap • But we want to decrease the key for vertex v! • We need to remember the position of every v in the heap H • Keep an array pos[1...n] • Whenever we move a vertex in H we need to update pos
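One way to keep pos consistent is to route every swap inside H through a helper that updates pos at the same time; a hypothetical sketch (here H[i] stores a vertex id rather than a key):

```python
def swap(H, pos, i, j):
    """Swap the vertices in heap positions i and j and record their new positions."""
    H[i], H[j] = H[j], H[i]
    pos[H[i]] = i
    pos[H[j]] = j
```

With this bookkeeping in place, decreasing the key of a vertex v simply becomes DecreaseKey(pos[v], x).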
The single-source shortest-path problem with negative edge weights • Input: Graph G, weight function W, start vertex s • Output: a bit indicating if there is a negative cycle reachable from s AND (if not) the shortest paths from s to all v
Bellman-Ford Algorithm • Initialize d(s)=0, d(v)=∞ for all other v • For i=1 to n-1: • Relax all edges (u,v) • For all (u,v): if d(v)>d(u)+W(u,v) then output: "negative cycle!" • Remark: d(v) and π(v) contain the distance from s and the predecessor in a shortest-path tree
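A minimal Python sketch of the algorithm as stated above; the edge-list representation, the vertex numbering 0…n-1 and the return convention are our own choices:

```python
import math

def bellman_ford(n, edges, s):
    """edges is a list of (u, v, w) triples. Returns (has_negative_cycle, d, pred)."""
    d = [math.inf] * n
    pred = [None] * n
    d[s] = 0
    for _ in range(n - 1):                  # n-1 rounds of relaxing every edge
        for u, v, w in edges:
            if d[u] + w < d[v]:
                d[v] = d[u] + w
                pred[v] = u
    for u, v, w in edges:                   # one extra pass detects a negative cycle
        if d[u] + w < d[v]:
            return True, d, pred
    return False, d, pred

# Small example with a negative edge but no negative cycle:
print(bellman_ford(3, [(0, 1, 4), (1, 2, -5), (0, 2, 2)], 0))
# (False, [0, 4, -1], [None, 0, 1])
```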
Running time • Running time is O(nm) • n-1 times relax all m edges
Correctness • Assume that no cycle of negative length is reachable from s • Theorem: After n-1 iterations of the for-loop we have d(v)=δ(s,v) for all v. • Lemma: Let v0,…,vk be a shortest path from s=v0 to vk. Relax the edges (v0,v1),…,(vk-1,vk) successively, in this order. Then d(vk)=δ(s,vk). This holds regardless of other relaxations performed.
Correctness • Proof of the theorem: • Let v denote a reachable vertex • Let s,…,v be a shortest path with k edges • k ≤ n-1 • In every iteration all edges are relaxed • By the lemma d(v) is correct after k ≤ n-1 iterations • For all unreachable vertices we have d(v)=∞ at all times • To show: the algorithm decides the existence of negative cycles correctly • No negative cycle present: for all edges (u,v): • d(v)=δ(s,v) ≤ δ(s,u)+W(u,v)=d(u)+W(u,v), so the final test passes
Correctness • If a negative cycle exists: • Let v0,…,vk be a cycle of negative length with v0=vk • Assume the algorithm does NOT stop with an error message, then • d(vi) ≤ d(vi-1)+W(vi-1,vi) for all i=1,…,k • Hence Σi=1…k d(vi) ≤ Σi=1…k d(vi-1) + Σi=1…k W(vi-1,vi)
Correctness • v0=vk and all other vertices appear exactly once on each side, so Σi=1…k d(vi) = Σi=1…k d(vi-1) • d(vi) < ∞ for all these vertices (they are reachable), hence the sums are finite and cancel, leaving 0 ≤ Σi=1…k W(vi-1,vi), contradicting the assumption that the cycle has negative length
The Lemma Lemma: Let v0,…,vk be a shortest path from s=v0 to vk. Relax the edges (v0,v1),…,(vk-1,vk) successively, in this order. Then d(vk)=δ(s,vk). This holds regardless of other relaxations performed. Proof: By induction we show that after relaxing (vi-1,vi) the value d(vi) is correct. Base: i=0, d(v0)=d(s)=0 is correct. Step: assume d(vi-1) is correct. According to an earlier observation, after relaxing (vi-1,vi) also d(vi) is correct. Once d(v) is correct, the value stays correct: d(v) is always an upper bound on δ(s,v) and relaxations never increase it.
Application of Bellman-Ford • Distributed networks • We look for the distance of vertices from s • The computation can be performed in a distributed way, without • global control • global knowledge about the network • Dijkstra needs global knowledge • Running time: n-1 phases, the vertices compute in parallel
All-pairs shortest path • Given a graph • Variants: • directed/undirected • weighted/unweighted/pos./neg. weights • Output: For all pairs of vertices u,v: • Distance in G (APD: All-pairs distances) • Shortest Paths (APSP: All-pairs shortest-path)
APSP • APD: n² outputs, running time at least n² • Can just use an adjacency matrix • APSP problem: how to represent n² paths? • It is easy to construct a graph such that for Ω(n²) vertex pairs the distance is Ω(n) • Simply writing down the paths then requires output length Ω(n³)
APSP output convention • Implicit representation of shortest paths as a successor matrix • The successor matrix S is n×n, with S[i,j]=k for the neighbor k of i that comes first on a shortest path from i to j • Easy to compute the shortest path from i to j using S: • e.g. S[i,j]=k, S[k,j]=l, S[l,j]=a, S[a,j]=j gives the path i,k,l,a,j
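Reading a path off S is a short loop; a hypothetical sketch using 0-indexed Python lists and None for "no path":

```python
def path_from_successors(S, i, j):
    """Return the vertices on the shortest path from i to j, following S[i][j],
    the first vertex after i on a shortest i-to-j path."""
    if S[i][j] is None:
        return None          # j is not reachable from i
    path = [i]
    while i != j:
        i = S[i][j]
        path.append(i)
    return path
```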
APSP: some observations • Edge weights ≥ 0: use Dijkstra n times, running time: O(nm+n² log n) • Unweighted graphs: n times BFS for time O(nm+n²) • For dense graphs m=Θ(n²) and we get O(n³) • Can we save work?
Floyd-Warshall Algorithm • Input: G, a directed graph with positive and negative weights, no negative cycles • O(n³) algorithm based on Dynamic Programming • Compute shortest paths (from u to v) that use only vertices 1…k as intermediate vertices
Floyd-Warshall Algorithm • Definition: • d[u,v,k] = length of the shortest path from u to v that (besides u,v) uses vertices from {1,…,k} only • d[u,v,0]=W(u,v) • which is ∞ if (u,v) is not an edge • Recursion: • d[u,v,k] = minimum of • d[u,v,k-1] (paths using only 1,…,k-1) • d[u,k,k-1] + d[k,v,k-1] (paths also using k)
Floyd-Warshall Algorithm • Initialize d[u,v,0]=W(u,v) for all u,v • For k=1,…,n • compute d[u,v,k] for all u,v • Total running time: O(n³)
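A minimal Python sketch of the algorithm; it uses the common in-place refinement that keeps a single n×n table reused across all k (which computes the same values as the three-dimensional d[u,v,k]), and it assumes W[u][u]=0 and W[u][v]=∞ for missing edges:

```python
import math

def floyd_warshall(W):
    """W is an n x n weight matrix (math.inf for missing edges, 0 on the diagonal).
    Returns the matrix of shortest-path distances."""
    n = len(W)
    d = [row[:] for row in W]              # d[u][v] starts as d[u,v,0] = W(u,v)
    for k in range(n):                     # phase k: allow vertex k as an intermediate vertex
        for u in range(n):
            for v in range(n):
                if d[u][k] + d[k][v] < d[u][v]:
                    d[u][v] = d[u][k] + d[k][v]
    return d

INF = math.inf
W = [[0, 3, INF],
     [INF, 0, -1],
     [1, INF, 0]]
print(floyd_warshall(W))   # [[0, 3, 2], [0, 0, -1], [1, 4, 0]]
```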
Floyd-Warshall Algorithm • Computing the paths: exercise • Note that this algorithm is very simple, with no fancy data structures, so the constant factors are small
Dynamic Programming • The values d[u,v,0] are given immediately • The values d[u,v,n] are the solution to the problem • We can easily compute all d[u,v,k] once we know all d[u,v,k-1] • This process of computing solutions bottom-up is called dynamic programming • Note the difference to computing top-down by recursion!
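For contrast, the same recurrence computed top-down with memoized recursion; a sketch under the same assumptions as above (zero diagonal, ∞ for missing edges), with apd_top_down as our own name. The memoization table plays the role of the bottom-up array, but the recursion depth limits this variant to small n.

```python
import math
from functools import lru_cache

def apd_top_down(W):
    """Top-down version of the d[u,v,k] recurrence with memoization."""
    n = len(W)

    @lru_cache(maxsize=None)
    def d(u, v, k):
        if k == 0:
            return W[u][v]                  # d[u,v,0] = W(u,v)
        # either avoid vertex k-1 (0-indexed), or route through it
        return min(d(u, v, k - 1), d(u, k - 1, k - 1) + d(k - 1, v, k - 1))

    return [[d(u, v, n) for v in range(n)] for u in range(n)]

INF = math.inf
W = [[0, 3, INF], [INF, 0, -1], [1, INF, 0]]
print(apd_top_down(W))   # same distances as the bottom-up computation above
```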