
Teaching Stochastic Local Search


Presentation Transcript


  1. Teaching Stochastic Local Search
  Todd W. Neller

  2. Motivation
  • Intro AI course has many topics, little time
  • Best learning is experiential, but experience takes time!
  • "One must learn by doing the thing; for though you think you know it, you have no certainty, until you try." – Sophocles
  • How to introduce stochastic local search:
    • Simply
    • Concisely
    • Experientially

  3. The Drunken Topographer
  • The drunken topographer analogy
  • Formalize the problem:
    • Longitude, latitude → state
    • Altitude → energy (objective function)
    • Random step → next-state generation
  • Goal: find the lowest energy!

  4. State Interface
  • step – take a stochastic local step in the state space
  • undo – revert one step (never two!)
  • energy – measure state "badness"
  • clone – copy the state for future reference (see the sketch below)
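A minimal Java sketch of such an interface, assuming a Cloneable-based design. The method names follow the slide; the exact signatures in the course code accompanying the paper may differ.

```java
// A minimal sketch of the State interface described on the slide.
public interface State extends Cloneable {
    void step();      // take a stochastic local step in the state space
    void undo();      // revert the single most recent step (never two!)
    double energy();  // measure state "badness" (lower is better)
    State clone();    // copy the state for future reference
}
```

The undo contract (one step only, never two) keeps implementations simple: a state need only remember its immediately previous configuration.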

  5. Simple Hill Descent

  6. Simple Hill Descent
  • For a fixed number of iterations:
    • [Report the state every 100,000 iterations]
    • Step to the next state.
    • If the step is not uphill, accept it and check whether it is the best (minimal-energy) state yet.
    • Otherwise, reject (undo) the step.
  • Return the best state (see the sketch below)
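A Java sketch of this loop, written against the State interface above. The 100,000-step reporting interval comes from the slide; the class and method names are illustrative.

```java
// Sketch: simple hill descent over the State interface above.
public class HillDescender {
    public static State descend(State state, long iterations) {
        State bestState = state.clone();
        double energy = state.energy();
        double bestEnergy = energy;
        for (long i = 0; i < iterations; i++) {
            if (i % 100000 == 0)
                System.out.println("Iteration " + i + ": energy " + energy);
            state.step();                      // step to the next state
            double nextEnergy = state.energy();
            if (nextEnergy <= energy) {        // not uphill: accept
                energy = nextEnergy;
                if (nextEnergy < bestEnergy) { // best (minimal energy) yet?
                    bestEnergy = nextEnergy;
                    bestState = state.clone();
                }
            } else {
                state.undo();                  // uphill: reject the step
            }
        }
        return bestState;                      // return the best state found
    }
}
```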

  7. Observing Local Minima
  • The Rastrigin function:
    • a sinusoidally perturbed parabolic bowl
    • energy(x, y) = x² + y² − cos(18x) − cos(18y) + 2
  • Initialize at (10, 10)
  • x, y Gaussian step distribution with σ = 0.05
  • Apply simple hill descent (a sketch of such a state follows)
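A sketch of a State implementing this experiment. The energy function, start point, and σ come straight from the slide; the class name and use of java.util.Random are illustrative.

```java
import java.util.Random;

// Sketch: the sinusoidally perturbed parabolic bowl from the slide,
// with Gaussian steps (sigma = 0.05) starting from (10, 10).
public class RastriginState implements State {
    private static final double SIGMA = 0.05;
    private final Random random = new Random();
    private double x = 10.0, y = 10.0;   // current position
    private double prevX = x, prevY = y; // remembered for undo

    public void step() {
        prevX = x; prevY = y;
        x += SIGMA * random.nextGaussian();
        y += SIGMA * random.nextGaussian();
    }

    public void undo() { x = prevX; y = prevY; }

    public double energy() {
        return x * x + y * y - Math.cos(18 * x) - Math.cos(18 * y) + 2;
    }

    public RastriginState clone() {
        try {
            return (RastriginState) super.clone();
        } catch (CloneNotSupportedException e) {
            throw new AssertionError(e); // unreachable: State is Cloneable
        }
    }
}
```

Running HillDescender.descend(new RastriginState(), 10000000L) makes the lesson vivid: the walker typically settles into one of the sinusoidal dimples far from the global minimum at the origin.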

  8. Uphill Steps
  • Allow uphill steps with some probability (see the sketch after this list)
  • Experiment with the acceptance rate:
    • 0.0 → hill descent
    • 1.0 → random walk
    • 0.1 → close approximation
    • 0.01 → closer approximation
    • 0.001 → local minima
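One way to realize this is to replace the strict rejection test in the hill-descent loop with a coin flip. The class below is an illustrative sketch, not part of the original materials.

```java
import java.util.Random;

// Sketch: acceptance test with a tunable uphill-acceptance rate.
// acceptRate = 0.0 reproduces hill descent; 1.0 gives a random walk.
public class FixedRateAcceptor {
    private final Random random = new Random();
    private final double acceptRate;

    public FixedRateAcceptor(double acceptRate) {
        this.acceptRate = acceptRate;
    }

    // Accept all non-uphill steps; accept uphill steps with
    // probability acceptRate.
    public boolean accept(double energy, double nextEnergy) {
        return nextEnergy <= energy || random.nextDouble() < acceptRate;
    }
}
```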

  9. Discerning Uphill Steps
  • Not all uphill steps are equal
  • Introduce the Boltzmann distribution: accept an uphill step of size ΔE with probability exp(−ΔE/T) at temperature T
  • What is the behavior at temperature extremes?
    • High temperature: random walk ("super drunk")
    • Low temperature: hill descent ("dead-tired drunk")
  • Vary from high to low temperature… (see the sketch below)
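A sketch of Boltzmann acceptance (the standard Metropolis-style test), again with illustrative names:

```java
import java.util.Random;

// Sketch: Boltzmann acceptance. An uphill step of size dE is accepted
// with probability exp(-dE / T), so small uphill steps are accepted
// far more readily than large ones.
public class BoltzmannAcceptor {
    private final Random random = new Random();

    public boolean accept(double energy, double nextEnergy, double temperature) {
        double dE = nextEnergy - energy;
        if (dE <= 0) return true; // downhill or level: always accept
        return random.nextDouble() < Math.exp(-dE / temperature);
    }
}
```

As T grows, exp(−ΔE/T) approaches 1 for any ΔE and the search becomes a random walk; as T approaches 0, the probability vanishes and the search reduces to hill descent, matching the two extremes above.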

  10. Simulated Annealing
  • Why simulated annealing?
    • Simple
    • Powerful
  • However, annealing (cooling) schedules are a "black art".
  • Side note: decision-theoretic simulated annealing
  • There is no substitute for experience. (A sketch combining the pieces above follows.)
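Combining the sketches above yields a compact annealer. The geometric cooling schedule here (multiply T by a decay rate each iteration) and its parameters are illustrative choices, not a recommendation; picking them well is exactly the "black art" the slide mentions.

```java
// Sketch: simulated annealing = the hill-descent loop plus Boltzmann
// acceptance and a cooling schedule (geometric cooling shown here).
public class SimulatedAnnealer {
    public static State anneal(State state, long iterations,
                               double initialTemp, double decayRate) {
        BoltzmannAcceptor acceptor = new BoltzmannAcceptor();
        State bestState = state.clone();
        double energy = state.energy();
        double bestEnergy = energy;
        double temperature = initialTemp;
        for (long i = 0; i < iterations; i++) {
            state.step();
            double nextEnergy = state.energy();
            if (acceptor.accept(energy, nextEnergy, temperature)) {
                energy = nextEnergy;
                if (nextEnergy < bestEnergy) {
                    bestEnergy = nextEnergy;
                    bestState = state.clone();
                }
            } else {
                state.undo();
            }
            temperature *= decayRate; // cool: drift from random walk
        }                             // toward hill descent
        return bestState;
    }
}
```

For example, SimulatedAnnealer.anneal(new RastriginState(), 1000000L, 1.0, 0.999995) cools from T = 1 to a near-zero temperature over a million steps.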

  11. Applet Annealing Experiences

  12. Next Steps
  • Homework: assign one or more simple combinatorial optimization problems (e.g., TSP, n-queens)
  • Optional labs:
    • Project group formation problem
    • Pizza ordering problem

  13. Web Resources
  • http://cs.gettysburg.edu/~tneller/resources/sls/index.html
    • FLAIRS'05 paper
    • Relevant code
    • Links to demo applets
    • Readings on SA and SLS in general
    • Suggested syllabus

  14. In Conclusion
  • "We can only possess what we experience. Truth, to be understood, must be lived." – Charlie Peacock
  • Further exploration: the Hoos and Stützle text
  • Other SLS distillations are possible. May you and your students benefit from our good experiences!
