
State space representations and search strategies - 2


Presentation Transcript


  1. State space representations and search strategies - 2 Spring 2007, Juris Vīksna

  2. Search strategies - A*
<S,P,I,G,W> - state space
h*(x) - a minimum path weight from x to the goal state
h(x) - heuristic estimate of h*(x)
g*(x) - a minimum path weight from I to x
g(x) - estimate of g*(x) (i.e. the minimal weight found so far)
f*(x) = g*(x) + h*(x)
f(x) = g(x) + h(x)

  3. Search strategies - A* [Adapted from J. Pearl]

  4. Search strategies - A*
A*Search(state space Σ = <S,P,I,G,W>, h)
  Open ← {<h(I),0,I>}
  Closed ← ∅
  while Open ≠ ∅ do
    <fx,gx,x> ← ExtractMin(Open)   [minimum for fx]
    if Goal(x,Σ) then return x
    Insert(<fx,gx,x>, Closed)
    for y ∈ Child(x,Σ) do
      gy = gx + W(x,y)
      fy = gy + h(y)
      if there is no <f,g,y> ∈ Closed with f ≤ fy then
        Insert(<fy,gy,y>, Open)   [replace existing <f,g,y>, if present]
  return fail
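A minimal executable sketch of the pseudocode above, assuming the state space is supplied as a successor function, an edge-weight function and a goal test; the names a_star, successors, weight and the toy graph at the bottom are illustrative, not from the slides. The one deviation from the pseudocode is that, instead of replacing an existing Open entry, a duplicate is pushed and stale copies are discarded when popped.

```python
import heapq
from itertools import count

def a_star(start, is_goal, successors, weight, h):
    """Sketch of the A* loop above: Open is a min-heap of <f, g, state> entries,
    Closed records the best f value each expanded state was expanded with."""
    tie = count()                                   # tie-breaker so states are never compared
    open_heap = [(h(start), 0, next(tie), start)]   # <h(I), 0, I>
    closed = {}                                     # state -> smallest f among expanded copies
    while open_heap:
        f_x, g_x, _, x = heapq.heappop(open_heap)   # ExtractMin by f
        if is_goal(x):
            return x, g_x
        if x in closed and closed[x] <= f_x:        # a copy with f <= f_x was already expanded
            continue
        closed[x] = f_x
        for y in successors(x):
            g_y = g_x + weight(x, y)
            f_y = g_y + h(y)
            if y not in closed or closed[y] > f_y:
                # the slide replaces an existing Open entry; with heapq we push a
                # duplicate instead and skip stale copies when they are popped
                heapq.heappush(open_heap, (f_y, g_y, next(tie), y))
    return None                                     # fail

# toy usage on a small weighted graph with an admissible heuristic
edges = {"I": {"a": 1, "b": 4}, "a": {"goal": 5}, "b": {"goal": 1}}
h0 = {"I": 2, "a": 3, "b": 1, "goal": 0}
print(a_star("I", lambda s: s == "goal",
             lambda s: edges.get(s, {}),
             lambda s, t: edges[s][t],
             lambda s: h0[s]))                      # ('goal', 5): path I -> b -> goal
```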

  5. Complete search Definition An algorithm is said to be complete if it terminates with a solution when one exists.

  6. Admissible search Definition An algorithm is admissible if it is guaranteed to return an optimal solution (with minimal possible path weight from the start state to a goal state) whenever a solution exists.

  7. Dominant search Definition An algorithm A is said to dominate algorithm B, if every node expanded by A is also expanded by B. Similarly, A strictly dominates B if A dominates B and B does not dominate A. We will also use the phrase “more efficient than” interchangeably with dominates.

  8. Optimal search Definition An algorithm is said to be optimal over a class of algorithms if it dominates all members of that class.

  9. Locally finite state spaces
Definition
A state space <S,P,I,G,W> is locally finite, if
• for every x ∈ S, there is only a finite number of y ∈ S such that (x,y) ∈ P
• there exists ε > 0 such that for all (x,y) ∈ P we have W(x,y) ≥ ε.

  10. Completeness of A* Theorem The A* algorithm is complete on locally finite state spaces.

  11. Admissibility of A* Definition A heuristic function h is said to be admissible if 0 ≤ h(n) ≤ h*(n) for all n ∈ S.

  12. Admissibility of A* Theorem An A* algorithm which uses an admissible heuristic function is admissible on locally finite state spaces.

  13. Admissibility of A* Lemma If A* uses an admissible heuristic function h, then at any time before A* terminates there exists a node n' in Open such that f(n') ≤ f*(I).

  14. Admissibility of A* Theorem An A* algorithm which uses an admissible heuristic function is admissible on locally finite state spaces.

  15. Informedness of heuristic functions Definition A heuristic function h2 is said to be more informed than h1, if both h1 and h2 are admissible and h2(n) > h1(n) for every non-goal node n ∈ S. Similarly, an A* algorithm using h2 is said to be more informed than that using h1.

  16. Dominance of A* Theorem If A2* is more informed than A1*, then A2* dominates A1*.

  17. Dominance of A* Lemma Any node expanded by A* cannot have an f value exceeding f*(I), i.e. f(n) ≤ f*(I) for all nodes expanded.

  18. Dominance of A* Lemma Every node n on Open for which f(n) < f*(I) will eventually be expanded by A*.

  19. C-bounded paths Definition We say that path P is C-bounded if every node along this path satisfies gP(n) + h(n) ≤ C. Similarly, if a strict inequality holds for every n along P, we say that P is strictly C-bounded. When it becomes necessary to identify which heuristic was used, we will use the notation C(h)-bounded.

  20. C-bounded paths Theorem A sufficient condition for A* to expand a node n is that there exists some strictly f*(I)-bounded path P from I to n.

  21. C-bounded paths Theorem A necessary condition for A* to expand a node n is that there exists an f*(I)-bounded path P from I to n.

  22. Dominance of A* Theorem If A2* is more informed than A1*, then A2* dominates A1*.

  23. Consistent heuristic functions Definition A heuristic function h is said to be consistent, if h(n) ≤ k(n,n') + h(n') for all nodes n and n' (where k(n,n') denotes the weight of the cheapest path from n to n').

  24. Monotone heuristic functions Definition A heuristic function h is said to be monotone, if h(n) ≤ W(n,n') + h(n') for all (n,n') ∈ P.
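A small sketch of how the monotonicity condition can be checked edge by edge; the function name is_monotone, the triple-based edge list, and the straight-line toy example are assumptions made only for illustration.

```python
def is_monotone(edge_list, h):
    """Check h(n) <= W(n, n') + h(n') for every edge; edge_list holds (n, n', W(n, n')) triples."""
    return all(h(n) <= w + h(n2) for n, n2, w in edge_list)

# toy usage: nodes on a line, h = straight-line distance to node 3 (the goal)
pos = {1: 0.0, 2: 1.0, 3: 3.0}
h = lambda n: abs(pos[3] - pos[n])
edge_list = [(1, 2, 1.0), (2, 3, 2.0), (1, 3, 3.5)]
print(is_monotone(edge_list, h))   # True: the estimate never drops by more than the edge weight
```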

  25. Monotonicity and consistency Theorem Monotonicity and consistency are equivalent properties.

  26. Monotonicity and admissibility Theorem Every monotone heuristic is also admissible.

  27. A* with monotone heuristic Theorem An A* algorithm with monotone heuristic finds optimal paths to all expanded nodes, i.e. g(n) = g*(n) for all <fn,gn,n> ∈ Closed.

  28. Some terminology
A* - the algorithm we have just discussed
A - basically the same as A*, but we check whether we have reached a goal state already at the time when nodes are generated
Z* - a generalization of A*: instead of f(x) = g(x) + h(x) it uses a more general function f(x') = F(E(x), f(x), h(x'))
Z - related to Z* in the same way as A is related to A*

  29. Implementation issues
A*Search(state space Σ = <S,P,I,G,W>, h)
  Open ← {<h(I),0,I>}
  Closed ← ∅
  while Open ≠ ∅ do
    <fx,gx,x> ← ExtractMin(Open)   [minimum for fx]
    if Goal(x,Σ) then return x
    Insert(<fx,gx,x>, Closed)
    for y ∈ Child(x,Σ) do
      gy = gx + W(x,y)
      fy = gy + h(y)
      if there is no <f,g,y> ∈ Closed with f ≤ fy then
        Insert(<fy,gy,y>, Open)   [replace existing <f,g,y>, if present]
  return fail

  30. Implementation issues - Heaps
• They are binary trees with all levels completed, except the lowest one, which may have an uncompleted section on the right side
• They satisfy the so-called Heap Property - for each subtree of the heap, the key at the root of the subtree must not be larger than the keys at its (left and right) children, so the minimum key is always at the root (as required for ExtractMin)

  31. Implementation issues - Heaps [Figure: example min-heap with keys 1, 2, 12, 3, 45, 13]

  32. Implementation issues - Heaps Insert [Figure: Insert on an example heap with keys 2, 3, 12, 7, 45, 13, 1] T(n) = Θ(h) = Θ(log n)

  33. Implementation issues - Heaps Delete [Figure: Delete on an example heap with keys 2, 3, 12, 7, 45, 13, 14] T(n) = Θ(h) = Θ(log n)

  34. Implementation issues - Heaps ExtractMin [Figure: ExtractMin on an example heap with keys 1, 3, 12, 7, 45, 13, 14] T(n) = Θ(h) = Θ(log n)
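A compact sketch of an array-based binary min-heap supporting the Insert and ExtractMin operations discussed above, each running in Θ(log n) by sifting along one root-to-leaf path; the class and method names are illustrative, not from the slides.

```python
class MinHeap:
    """Array-based binary min-heap: the children of index i are 2i+1 and 2i+2."""
    def __init__(self):
        self.a = []

    def insert(self, key):                 # Θ(log n): sift the new key up
        self.a.append(key)
        i = len(self.a) - 1
        while i > 0 and self.a[(i - 1) // 2] > self.a[i]:
            self.a[i], self.a[(i - 1) // 2] = self.a[(i - 1) // 2], self.a[i]
            i = (i - 1) // 2

    def extract_min(self):                 # Θ(log n): move the last key to the root, sift down
        root, last = self.a[0], self.a.pop()
        if self.a:
            self.a[0] = last
            i = 0
            while True:
                l, r, smallest = 2 * i + 1, 2 * i + 2, i
                if l < len(self.a) and self.a[l] < self.a[smallest]:
                    smallest = l
                if r < len(self.a) and self.a[r] < self.a[smallest]:
                    smallest = r
                if smallest == i:
                    break
                self.a[i], self.a[smallest] = self.a[smallest], self.a[i]
                i = smallest
        return root

h = MinHeap()
for k in [12, 3, 45, 13, 2, 1]:
    h.insert(k)
print([h.extract_min() for _ in range(6)])   # [1, 2, 3, 12, 13, 45]
```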

  35. Implementation issues - BST
• T is a binary search tree, if
• it is a binary tree (with a key associated with each node)
• for each node x in T, the keys at all nodes of the left subtree of x are not larger than the key at node x, and the keys at all nodes of the right subtree of x are not smaller than the key at node x

  36. Implementation issues - BST [Figure: example binary search tree with keys 7, 3, 12, 5, 11]

  37. Implementation issues - BST Insert [Figure: Insert on an example BST with keys 7, 3, 13, 2, 5, 12, 14, 11, 9, 8, 10]

  38. Implementation issues - BST Delete [Figure: Delete on an example BST with keys 7, 3, 13, 2, 5, 12, 14, 9, 8, 10, 11]

  39. Implementation issues - BST Delete [Figure: Delete example (continued)]

  40. Implementation issues - BST Delete [Figure: Delete on a second example BST with keys 7, 3, 13, 2, 5, 12, 35, 20, 18, 30, 31, 19]
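A minimal, unbalanced BST sketch with Insert and an ExtractMin that removes the leftmost node, illustrating how a BST could also hold the Open list; the class and method names are assumptions, and without balancing each operation costs O(h), which can degrade to O(n).

```python
class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

class BST:
    def __init__(self):
        self.root = None

    def insert(self, key):
        """Walk down, going left for keys <= node key and right otherwise."""
        if self.root is None:
            self.root = Node(key)
            return
        x = self.root
        while True:
            if key <= x.key:
                if x.left is None:
                    x.left = Node(key); return
                x = x.left
            else:
                if x.right is None:
                    x.right = Node(key); return
                x = x.right

    def extract_min(self):
        """Remove and return the leftmost key (the minimum)."""
        parent, x = None, self.root
        while x.left is not None:
            parent, x = x, x.left
        if parent is None:
            self.root = x.right          # the root itself was the minimum
        else:
            parent.left = x.right        # splice out the leftmost node
        return x.key

t = BST()
for k in [7, 3, 13, 2, 5, 12, 14]:
    t.insert(k)
print([t.extract_min() for _ in range(3)])   # [2, 3, 5]
```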

  41. Implementation issues - AVL trees
• T is an AVL tree, if
• it is a binary search tree
• for each node x in T we have Height(LC(x)) − Height(RC(x)) ∈ {−1, 0, 1}
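A short sketch that checks the AVL balance condition above on a given binary tree; the standalone Node class, the height convention for the empty tree (−1), and the two example trees are assumptions made for illustration only.

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def height(x):
    """Height of a (sub)tree; the empty tree has height -1 by convention here."""
    return -1 if x is None else 1 + max(height(x.left), height(x.right))

def is_avl(x):
    """Check Height(LC(x)) - Height(RC(x)) in {-1, 0, 1} at every node."""
    return (x is None or
            (abs(height(x.left) - height(x.right)) <= 1
             and is_avl(x.left) and is_avl(x.right)))

balanced   = Node(7, Node(3, Node(2), Node(5)), Node(13))
unbalanced = Node(7, Node(3, Node(2, Node(1))), None)
print(is_avl(balanced), is_avl(unbalanced))   # True False
```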

  42. Implementation issues - skip lists “Perfect” Skip List

  43. How to choose a heuristic?
• Original problem P → relaxed problem P': a set of constraints, with one or more constraints removed
• P is complex → P' becomes simpler
• Use the cost of a best solution path from n in P' as h(n) for P
• Admissibility: h ≤ h*, since the cost of a best solution in P >= the cost of a best solution in P'
[Figure: the solution space of P is contained in the solution space of P']

  44. How to choose a heuristic - 8-puzzle
• Example: 8-puzzle
• Constraints: to move from cell A to cell B
  cond1: there is a tile on A
  cond2: cell B is empty
  cond3: A and B are adjacent (horizontally or vertically)
• Removing cond2: h2 (sum of Manhattan distances of all misplaced tiles)
• Removing cond2 and cond3: h1 (# of misplaced tiles)
• Removing cond3: h3, a new heuristic function
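A sketch of h1 and h2 for the 8-puzzle, assuming states are 9-tuples in row-major order with 0 for the blank and the goal layout given by GOAL; this encoding and the example state are assumptions, not taken from the slides.

```python
GOAL = (1, 2, 3, 4, 5, 6, 7, 8, 0)        # assumed goal layout, 0 = blank

def h1(state, goal=GOAL):
    """Number of misplaced tiles (the blank is not counted)."""
    return sum(1 for s, g in zip(state, goal) if s != 0 and s != g)

def h2(state, goal=GOAL):
    """Sum of Manhattan distances of the tiles from their goal cells."""
    where = {tile: divmod(i, 3) for i, tile in enumerate(goal)}
    total = 0
    for i, tile in enumerate(state):
        if tile != 0:
            r, c = divmod(i, 3)
            gr, gc = where[tile]
            total += abs(r - gr) + abs(c - gc)
    return total

state = (1, 2, 3, 4, 5, 6, 0, 7, 8)       # example state: tiles 7 and 8 shifted left
print(h1(state), h2(state))               # 2 2
```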

  45. How to choose a heuristic - 8-puzzle
h3:
  repeat
    if the current empty cell A is to be occupied by tile x in the goal, move x to A;
    otherwise, move into A any arbitrary misplaced tile
  until the goal is reached
h2 >= h3 >= h1
[Figure: example start state with h1(start) = 7, h2(start) = 18, h3(start) = 7]
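A sketch of the h3 procedure above, which counts the moves made by the repeat loop of the relaxed puzzle in which any tile may jump into the empty cell; it uses the same tuple encoding and assumed goal as the previous sketch.

```python
def h3(state, goal=(1, 2, 3, 4, 5, 6, 7, 8, 0)):
    """Count the moves of the relaxed puzzle (cond3 removed, cond1 and cond2 kept):
    if the empty cell's goal tile x exists, move x into it; otherwise move any
    misplaced tile into the empty cell; repeat until the goal is reached."""
    s, goal = list(state), list(goal)
    moves = 0
    while s != goal:
        blank = s.index(0)
        wanted = goal[blank]                # tile that should occupy the empty cell
        if wanted != 0:
            src = s.index(wanted)           # move that tile into the empty cell
        else:
            # the blank already sits in its goal cell: move any misplaced tile into it
            src = next(i for i in range(9) if s[i] != 0 and s[i] != goal[i])
        s[blank], s[src] = s[src], 0
        moves += 1
    return moves

print(h3((1, 2, 3, 4, 5, 6, 0, 7, 8)))     # 2 (here h1 = h2 = h3 = 2)
```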

  46. How to choose a heuristic - TSP • Example: TSP. A legal tour is a (Hamiltonian) circuit

  47. How to choose a heuristic - TSP
[Figure: the given complete graph, a legal tour, other second-degree graphs]
• Example: TSP. A legal tour is a (Hamiltonian) circuit
• It is a connected second-degree graph (each node has exactly two adjacent edges)
• Removing the connectivity constraint leads to h1: find the cheapest second-degree graph within the given graph (with O(n^3) complexity)

  48. How to choose a heuristic - TSP
[Figure: the given graph, a legal tour, other minimum spanning trees]
• A tour is a spanning tree (when one edge is removed) with the constraint that each node has at most 2 adjacent edges
• Removing this constraint leads to h2: find a minimum spanning tree of the given graph (with O(n^2/log n) complexity)
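A sketch of the MST-based relaxation h2: the weight of a minimum spanning tree over the cities is a lower bound on the cost of any tour, since removing one edge from a tour leaves a spanning tree. The simple O(n^2) Prim scan, the dictionary-of-dictionaries distance format, and the toy instance below are assumptions for illustration.

```python
def mst_weight(dist, nodes):
    """Weight of a minimum spanning tree over `nodes`, using a simple O(n^2) Prim scan;
    dist[u][v] is the distance between cities u and v."""
    nodes = list(nodes)
    if len(nodes) <= 1:
        return 0.0
    best = {v: dist[nodes[0]][v] for v in nodes[1:]}   # cheapest edge from the tree to v
    total = 0.0
    while best:
        v = min(best, key=best.get)                    # closest city not yet in the tree
        total += best.pop(v)
        for u in best:                                 # relax the remaining cities through v
            if dist[v][u] < best[u]:
                best[u] = dist[v][u]
    return total

# toy usage: the MST weight is a lower bound on the cost of any tour over these cities
dist = {
    "a": {"b": 1, "c": 4, "d": 3},
    "b": {"a": 1, "c": 2, "d": 5},
    "c": {"a": 4, "b": 2, "d": 6},
    "d": {"a": 3, "b": 5, "c": 6},
}
print(mst_weight(dist, dist.keys()))   # 6.0: edges a-b (1) + b-c (2) + a-d (3)
```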

  49. How complicated a heuristic to choose? [Adapted from R. Shinghal]

  50. Relaxing optimality requirements
• Is f = g + h the best choice if we want to minimize search effort, not solution cost?
• Even if solution cost is important, an admissible f can lead to a non-terminating A*. Can speed be gained by decreasing solution quality?
• It may be hard to find a good admissible heuristic. What happens if we do not require admissibility?
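As one illustration of the trade-off raised in the second bullet, a common device is to inflate the heuristic term; the sketch below is only an illustration built on the earlier a_star function (not a method prescribed by these slides). With an admissible h and weight w ≥ 1, the returned solution costs at most w times the optimum, while the search typically expands fewer nodes.

```python
def weighted_a_star(start, is_goal, successors, weight, h, w=1.5):
    """Run the earlier a_star sketch with the inflated heuristic w*h: with an
    admissible h and w >= 1, the returned cost is at most w times the optimum."""
    return a_star(start, is_goal, successors, weight, lambda s: w * h(s))
```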
