
For Monday

This presentation explores the search methods and heuristic functions used in problem solving, including depth-first search, breadth-first search, best-first search, and simulated annealing, and discusses the advantages and disadvantages of local search algorithms such as hill climbing and genetic algorithms.



  1. For Monday • Chapter 6 • Homework: Chapter 3, exercise 7

  2. Lisp questions?

  3. Program 1

  4. Late Tickets • You have 2 for the semester. • Only good for programs. • Allow you to hand in up to 5 days late IF you have a late ticket left. • Each good for +.05 on final grade if unused. • Must make a note on the program printout. • Only way to turn in late work in this course.

  5. Comparing DFS and BFS • When might we prefer DFS? • When might we prefer BFS?

  6. Improving on DFS • Depth-limited Search • Iterative Deepening • Wasted work??? • What kinds of problems lend themselves to iterative deepening?
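
A minimal sketch of depth-limited search wrapped in iterative deepening, assuming hypothetical `is_goal(state)` and `successors(state)` helpers supplied by the caller:

```python
def depth_limited_search(state, is_goal, successors, limit, path=None):
    """Depth-first search that gives up below a fixed depth limit."""
    path = path or [state]
    if is_goal(state):
        return path
    if limit == 0:
        return None
    for child in successors(state):
        result = depth_limited_search(child, is_goal, successors,
                                      limit - 1, path + [child])
        if result is not None:
            return result
    return None

def iterative_deepening(start, is_goal, successors, max_depth=50):
    """Run depth-limited search with limits 0, 1, 2, ... until a goal appears."""
    for limit in range(max_depth + 1):
        result = depth_limited_search(start, is_goal, successors, limit)
        if result is not None:
            return result
    return None
```

The re-expansion of shallow levels is the "wasted work": for branching factor b > 1 it is only a constant-factor overhead, which is why iterative deepening suits problems with an unknown solution depth and a state space too large to store.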

  7. Bi-directional Search • What advantages are there to bi-directional search? • What do we have to have to use bi-directional search?
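
One way to make the requirements concrete: a sketch of bidirectional breadth-first search, assuming hashable states and a hypothetical `predecessors(state)` function, since searching backward from the goal requires being able to invert the operators:

```python
from collections import deque

def bidirectional_search(start, goal, successors, predecessors):
    """BFS from both ends at once; stop when the two frontiers meet."""
    if start == goal:
        return [start]
    parents_f, parents_b = {start: None}, {goal: None}
    frontier_f, frontier_b = deque([start]), deque([goal])

    def expand(frontier, parents, other, neighbors):
        state = frontier.popleft()
        for nxt in neighbors(state):
            if nxt not in parents:
                parents[nxt] = state
                frontier.append(nxt)
                if nxt in other:              # the two searches have met
                    return nxt
        return None

    while frontier_f and frontier_b:
        # Expand the smaller frontier to keep the two searches balanced.
        if len(frontier_f) <= len(frontier_b):
            meet = expand(frontier_f, parents_f, parents_b, successors)
        else:
            meet = expand(frontier_b, parents_b, parents_f, predecessors)
        if meet is not None:
            # Splice the forward half-path onto the backward half-path.
            path, s = [], meet
            while s is not None:
                path.append(s)
                s = parents_f[s]
            path.reverse()
            s = parents_b[meet]
            while s is not None:
                path.append(s)
                s = parents_b[s]
            return path
    return None
```

The advantage: two searches of depth d/2 explore roughly 2·b^(d/2) nodes instead of b^d. The price: an explicit goal state and invertible operators.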

  8. Repeated States • Problem? • How can we avoid them? • Do not return to the parent state (or the current state itself) • Do not create a path with cycles (check all the way back to the root) • Do not generate any state that has already been generated -- how feasible is this??
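
A sketch of the third (strongest) option, assuming hashable states and hypothetical `is_goal`/`successors` helpers; feasibility hinges on keeping every generated state in memory:

```python
from collections import deque

def graph_search(start, is_goal, successors):
    """BFS that never re-generates a state it has already seen."""
    frontier = deque([start])
    reached = {start}                  # every state ever generated
    while frontier:
        state = frontier.popleft()
        if is_goal(state):
            return state
        for child in successors(state):
            if child not in reached:   # skip repeated states entirely
                reached.add(child)
                frontier.append(child)
    return None
```

The `reached` set trades memory for time: hashed lookups are cheap, but on large state spaces the set itself can exhaust memory, which is why the weaker cycle checks are sometimes preferred.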

  9. Informed Search • So far we’ve looked at search methods that require no knowledge of the problem • However, these can be very inefficient • Now we’re going to look at search methods that take advantage of the knowledge we have about a problem to reach a solution more efficiently

  10. Best First Search • At each step, expand the most promising node • Requires some way to estimate which node is the “most promising” • We need some kind of evaluation function • Order the nodes based on the evaluation function
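
A minimal sketch, assuming hypothetical `is_goal`/`successors` helpers and an `evaluate(state)` function where lower values are more promising; a priority queue keeps the frontier ordered:

```python
import heapq
import itertools

def best_first_search(start, is_goal, successors, evaluate):
    """Always expand the frontier node with the best (lowest) evaluation."""
    counter = itertools.count()    # tie-breaker so heapq never compares states
    frontier = [(evaluate(start), next(counter), start)]
    seen = {start}
    while frontier:
        _, _, state = heapq.heappop(frontier)
        if is_goal(state):
            return state
        for child in successors(state):
            if child not in seen:
                seen.add(child)
                heapq.heappush(frontier,
                               (evaluate(child), next(counter), child))
    return None
```

Different choices of `evaluate` give different algorithms: h(n) alone gives greedy search (next slide), and g(n) + h(n) gives A*.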

  11. Greedy Search • A heuristic function, h(n), estimates the distance from the current state to the closest goal state. • The function must be 0 for all goal states • Example: • Straight-line distance from the current location to the goal location in a route-finding problem
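
For instance, a straight-line-distance heuristic for route finding might look like the following; the `COORDS` table and location names are made up for illustration:

```python
import math

# Hypothetical map coordinates for a route-finding problem.
COORDS = {"A": (0.0, 0.0), "B": (3.0, 4.0), "GOAL": (6.0, 8.0)}

def straight_line_h(state, goal="GOAL"):
    """Euclidean distance to the goal; exactly 0 at the goal itself."""
    (x1, y1), (x2, y2) = COORDS[state], COORDS[goal]
    return math.hypot(x2 - x1, y2 - y1)

assert straight_line_h("GOAL") == 0.0   # required of any heuristic
```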

  12. Heuristics Don’t Solve It All • NP-complete problems still have worst-case exponential time complexity • A good heuristic function can: • Find a solution for an average problem efficiently • Find a reasonably good (but not optimal) solution efficiently

  13. Beam Search • Variation on greedy search • Limit the queue to the best n nodes (n is the beam width) • Expand all of those nodes • Select the best n of the remaining nodes • And so on • May not produce a solution
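
A sketch with the same hypothetical `h`/`successors`/`is_goal` helpers as above; pruning to the best n candidates is precisely what can discard the only path to a solution:

```python
def beam_search(start, is_goal, successors, h, beam_width):
    """Greedy search that keeps only the best beam_width nodes per level."""
    beam = [start]
    seen = {start}
    while beam:
        candidates = []
        for state in beam:
            if is_goal(state):
                return state
            for child in successors(state):
                if child not in seen:
                    seen.add(child)
                    candidates.append(child)
        # Keep only the n most promising nodes (lowest h); drop the rest.
        beam = sorted(candidates, key=h)[:beam_width]
    return None                        # beam emptied: no solution found
```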

  14. Focus on Total Path Cost • Uniform cost search uses g(n)--the path cost so far • Greedy search uses h(n)--the estimated path cost to the goal • What we’d like to use instead is f(n) = g(n) + h(n) to estimate the total path cost

  15. Admissible Heuristic • An admissible heuristic is one that never overestimates the cost to reach the goal. • It is always less than or equal to the actual cost. • If we have such a heuristic, we can prove that best first search using f(n) is both complete and optimal. • A* Search
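
A sketch of A*, assuming `successors(state)` yields hypothetical `(child, step_cost)` pairs and `h` is an admissible heuristic:

```python
import heapq
import itertools

def a_star(start, is_goal, successors, h):
    """Best-first search ordered by f(n) = g(n) + h(n)."""
    counter = itertools.count()        # tie-breaker for equal f values
    g = {start: 0}                     # best path cost found so far
    frontier = [(h(start), next(counter), start)]
    while frontier:
        _, _, state = heapq.heappop(frontier)
        if is_goal(state):
            return state, g[state]     # with admissible h, this is optimal
        for child, cost in successors(state):
            new_g = g[state] + cost
            if child not in g or new_g < g[child]:   # found a cheaper path
                g[child] = new_g
                heapq.heappush(frontier,
                               (new_g + h(child), next(counter), child))
    return None
```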

  16. 8-Puzzle Heuristic Functions • Number of tiles out of place • Manhattan Distance • Which is better? • Experiment • Effective branching factor
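
Both heuristics are short to state in code; here states are flat tuples with 0 standing for the blank:

```python
def misplaced_tiles(state, goal):
    """h1: number of tiles out of place (the blank doesn't count)."""
    return sum(1 for s, g in zip(state, goal) if s != 0 and s != g)

def manhattan_distance(state, goal, width=3):
    """h2: sum over tiles of row distance plus column distance."""
    total = 0
    for idx, tile in enumerate(state):
        if tile == 0:
            continue
        goal_idx = goal.index(tile)
        total += abs(idx // width - goal_idx // width)   # rows apart
        total += abs(idx % width - goal_idx % width)     # columns apart
    return total

goal = (1, 2, 3, 4, 5, 6, 7, 8, 0)
state = (1, 2, 3, 4, 5, 6, 7, 0, 8)
print(misplaced_tiles(state, goal))      # 1
print(manhattan_distance(state, goal))   # 1
```

Manhattan distance dominates misplaced tiles (h2 ≥ h1 on every state), and a dominating admissible heuristic never expands more nodes, which shows up experimentally as a smaller effective branching factor.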

  17. Inventing Heuristics • Relax the problem • Cost of solving a subproblem • Learn weights for features of the problem

  18. SMA* • Make best possible use of available memory • “Forget” the worst nodes in the search tree when we need space • Record information about the quality of sub-trees in each node (so we avoid going back to bad choices)

  19. Local Search • Works from the “current state” • No focus on path • Also useful for optimization problems

  20. Local Search • Advantages? • Disadvantages?

  21. Hill-Climbing • Also called gradient ascent (gradient descent when minimizing) • Greedy local search • Move from the current state to a state with a better overall value • Issues: • Local maxima • Ridges • Plateaux
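
A sketch of steepest-ascent hill climbing, assuming hypothetical `successors(state)` and `value(state)` helpers where higher values are better:

```python
def hill_climbing(start, successors, value):
    """Move to the best neighbor until no neighbor improves on the
    current state. Can stall on local maxima, ridges, and plateaux."""
    current = start
    while True:
        neighbors = successors(current)
        if not neighbors:
            return current
        best = max(neighbors, key=value)
        if value(best) <= value(current):   # no uphill move left
            return current
        current = best
```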

  22. Variations on Hill Climbing • Stochastic hill climbing • First-choice hill climbing • Random-restart hill climbing
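
Of these, random-restart is the simplest to sketch: run the climber from random starting points and keep the best result (this reuses `hill_climbing` from above; `random_state` is a hypothetical generator of random initial states):

```python
def random_restart_hill_climbing(random_state, successors, value, restarts=25):
    """Run hill climbing from several random starts; return the best peak."""
    best = None
    for _ in range(restarts):
        result = hill_climbing(random_state(), successors, value)
        if best is None or value(result) > value(best):
            best = result
    return best
```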

  23. Evaluation of Hill Climbing

  24. Simulated Annealing • Similar to hill climbing, but-- • We select a random successor • If that successor improves things, we take it • If not, we may still take it, with some probability • The probability gradually goes down over time
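
A sketch, assuming a hypothetical `random_successor(state)` helper and an objective `value` to maximize; the geometric cooling schedule here is just one of many possible choices:

```python
import math
import random

def simulated_annealing(start, random_successor, value,
                        t0=1.0, cooling=0.995, t_min=1e-4):
    """Accept improvements always; accept worsening moves with
    probability exp(delta / T), which shrinks as T cools."""
    current, t = start, t0
    while t > t_min:
        nxt = random_successor(current)
        delta = value(nxt) - value(current)   # > 0 means improvement
        if delta > 0 or random.random() < math.exp(delta / t):
            current = nxt
        t *= cooling                          # lower the temperature
    return current
```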

  25. Local Beam Search • Variant of hill-climbing where multiple states and successors are maintained
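
A sketch with the same hypothetical `successors`/`value` helpers; unlike k independent hill climbs, all successors compete in one shared pool, so the k slots migrate toward the most promising regions:

```python
def local_beam_search(starts, successors, value, steps=100):
    """Keep k states; pool all their successors and keep the k best."""
    k = len(starts)
    states = list(starts)
    for _ in range(steps):
        pool = [s for state in states for s in successors(state)]
        if not pool:
            break
        states = sorted(pool, key=value, reverse=True)[:k]
    return max(states, key=value)
```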

  26. Genetic Algorithms • Have a population of k states (or individuals) • Have a fitness function that evaluates the states • Create new individuals by randomly selecting pairs and mating them using a randomly selected crossover point. • More fit individuals are selected with higher probability. • Apply random mutation. • Keep top k individuals for next generation.
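
A sketch, assuming individuals are sequences of length ≥ 2 (tuples work), `fitness` returns non-negative scores, and `mutate` is a hypothetical problem-specific operator:

```python
import random

def genetic_algorithm(population, fitness, mutate,
                      generations=100, mutation_rate=0.1):
    """Fitness-proportional selection, one-point crossover, random
    mutation, and survival of the k fittest each generation."""
    k = len(population)
    for _ in range(generations):
        weights = [fitness(ind) for ind in population]   # must be >= 0
        children = []
        for _ in range(k):
            mom, dad = random.choices(population, weights=weights, k=2)
            cut = random.randrange(1, len(mom))          # crossover point
            child = mom[:cut] + dad[cut:]
            if random.random() < mutation_rate:
                child = mutate(child)
            children.append(child)
        # Keep the k fittest of parents + children for the next round.
        population = sorted(population + children, key=fitness,
                            reverse=True)[:k]
    return max(population, key=fitness)
```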

  27. Other Issues • What issues arise from continuous spaces? • What issues do online search and unknown environments create?
