
Particle Filters



  1. Particle Filters

  2. Outline • Introduction to particle filters • Recursive Bayesian estimation • Bayesian Importance sampling • Sequential Importance sampling (SIS) • Sampling Importance resampling (SIR) • Improvements to SIR • On-line Markov chain Monte Carlo • Basic Particle Filter algorithm • Example for robot localization • Conclusions

  3. But what if the distribution in our problem is not Gaussian?

  4. Motivation for particle filters

  5. Key Idea of Particle Filters • Idea: place more samples where we expect the solution to be

  6. Motion Model Reminder • Density of samples represents the expected probability of robot location

  7. Global Localization of Robot with Sonar • http://www.cs.washington.edu/ai/Mobile_Robotics/mcl/animations/global-floor.gif • This is the lost-robot problem

  8. Particles are used for probability density function approximation

  9. Function Approximation • Particle sets can be used to approximate functions • The more particles fall into an interval, the higher the probability of that interval • How to draw samples from a function/distribution?
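The sampling question on this slide can be sketched with rejection sampling: draw candidates uniformly, accept each with probability proportional to the target density, and the accepted particles pile up in high-probability intervals. The bimodal `target_pdf` and the bound `max_pdf` below are illustrative assumptions, not values from the slides.

```python
import numpy as np

def target_pdf(x):
    # Unnormalized bimodal target density (illustrative choice).
    return np.exp(-0.5 * (x - 1.0) ** 2) + 0.5 * np.exp(-0.5 * ((x + 2.0) / 0.7) ** 2)

def rejection_sample(n, lo=-6.0, hi=6.0, max_pdf=1.2, rng=None):
    """Draw n samples from target_pdf via rejection sampling."""
    if rng is None:
        rng = np.random.default_rng(0)
    samples = np.empty(0)
    while samples.size < n:
        x = rng.uniform(lo, hi, size=n)          # uniform candidates
        u = rng.uniform(0.0, max_pdf, size=n)    # acceptance threshold
        samples = np.concatenate([samples, x[u < target_pdf(x)]])
    return samples[:n]

particles = rejection_sample(5000)
# More particles fall into high-density intervals, so the histogram
# of particle positions approximates the shape of target_pdf.
counts, edges = np.histogram(particles, bins=24, range=(-6, 6))
```

The histogram counts answer the slide's question directly: the interval around the mode at x = 1 collects far more particles than intervals in the tails.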

  10. Importance Sampling Principle

  11. Importance Sampling Principle • Weight: w = f / g • f is often called the target • g is often called the proposal • Pre-condition: f(x) > 0 ⇒ g(x) > 0

  12. Importance sampling: another example of calculating sample weights • How do we formally calculate the f/g value?

  13. Importance Sampling: Formulas for f, g, and f/g
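The f/g computation can be sketched numerically with self-normalized importance sampling. Below, the target f is an unnormalized Gaussian centered at 2, and the proposal g is a wide zero-mean Gaussian that satisfies the pre-condition f(x) > 0 ⇒ g(x) > 0; both densities are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def f(x):
    # Target density, known only up to a constant: proportional to N(2, 1).
    return np.exp(-0.5 * (x - 2.0) ** 2)

def g(x):
    # Proposal density: N(0, 3). Positive wherever f is positive.
    return np.exp(-0.5 * (x / 3.0) ** 2) / (3.0 * np.sqrt(2.0 * np.pi))

x = rng.normal(0.0, 3.0, size=100_000)   # samples drawn from the proposal g
w = f(x) / g(x)                          # importance weights w = f / g
w = w / w.sum()                          # self-normalize the weights
mean_est = np.sum(w * x)                 # estimate of E_f[x]; true value is 2
```

Self-normalizing means f need not be a normalized density, which is exactly the situation in Bayesian filtering, where the posterior is known only up to its normalizing constant.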

  14. History of the Monte Carlo Idea and especially Particle Filters • First attempts: simulations of growing polymers • M. N. Rosenbluth and A. W. Rosenbluth, “Monte Carlo calculation of the average extension of molecular chains,” Journal of Chemical Physics, vol. 23, no. 2, pp. 356–359, 1956. • First application in signal processing: 1993 • N. J. Gordon, D. J. Salmond, and A. F. M. Smith, “Novel approach to nonlinear/non-Gaussian Bayesian state estimation,” IEE Proceedings-F, vol. 140, no. 2, pp. 107–113, 1993. • Books • A. Doucet, N. de Freitas, and N. Gordon, Eds., Sequential Monte Carlo Methods in Practice, Springer, 2001. • B. Ristic, S. Arulampalam, N. Gordon, Beyond the Kalman Filter: Particle Filters for Tracking Applications, Artech House Publishers, 2004. • Tutorials • M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, “A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking,” IEEE Transactions on Signal Processing, vol. 50, no. 2, pp. 174–188, 2002.

  15. What is the problem that we want to solve? • Tracking the state of a system as it evolves over time • Observations arrive sequentially and are noisy or ambiguous • We want the best possible estimate of the hidden variables

  16. Solution: Sequential Update • Storing and processing all incoming measurements is inconvenient and may be impossible • Recursive filtering: • Predict the next state pdf from the current estimate • Update the prediction using sequentially arriving new measurements • Optimal Bayesian solution: recursively calculating the exact posterior density • These considerations lead to the various particle filters

  17. Particle Filters • Sequential Monte Carlo methods for on-line learning within a Bayesian framework. • Known as • Particle filters • Sequential sampling-importance resampling (SIR) • Bootstrap filters • Condensation trackers • Interacting particle approximations • Survival of the fittest

  18. Particle Filter characteristics

  19. Approaches to Particle Filters: Metaphors

  20. Particle filters • Sequential and Monte Carlo properties • Represent belief by sets of samples, or particles • Particles carry nonnegative weights called importance factors • The updating procedure is sequential importance sampling with re-sampling
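The re-sampling step mentioned above is often implemented with systematic (low-variance) resampling; a minimal sketch, with a made-up weight vector:

```python
import numpy as np

def systematic_resample(weights, rng=None):
    """Systematic (low-variance) resampling: return indices of surviving particles."""
    if rng is None:
        rng = np.random.default_rng(0)
    n = len(weights)
    # One random offset, then n evenly spaced positions in [0, 1).
    positions = (rng.uniform() + np.arange(n)) / n
    cumulative = np.cumsum(weights)
    cumulative[-1] = 1.0                 # guard against floating-point rounding
    return np.searchsorted(cumulative, positions)

weights = np.array([0.1, 0.1, 0.6, 0.1, 0.1])
idx = systematic_resample(weights)
# The high-weight particle (index 2) is duplicated in proportion to its
# weight; after resampling, all particles carry equal weight 1/n again.
```

A particle with weight w is copied either floor(n·w) or ceil(n·w) times, which gives this scheme lower variance than plain multinomial resampling.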

  21. Tracking in 1D: the blue trajectory is the target. The best of 10 particles is in red.

  22. A short, more formal introduction to Particle Filters and Monte Carlo Localization

  23. Proximity Sensor Model Reminder

  24. Particle filtering ideas • Recursive Bayesian filter by Monte Carlo sampling • The idea: represent the posterior density by a set of random particles with associated weights • Compute estimates based on these samples and weights • Figure: posterior density over the sample space

  25. Particle filtering ideas • Particle filters are based on recursive generation of random measures that approximate the distributions of the unknowns • Random measures: particles and importance weights • As new observations become available, the particles and the weights are propagated by exploiting Bayes' theorem • Figure: posterior density over the sample space

  26. Mathematical tools needed for Particle Filters • Recall the law of total probability and Bayes' rule

  27. Recursive Bayesian estimation (I) • Recursive filter: • System model: • Measurement model: • Information available:

  28. Recursive Bayesian estimation (II) • Seek: • i = 0: filtering • i > 0: prediction • i < 0: smoothing • Prediction: • since:

  29. Recursive Bayesian estimation (III) • Update: • where: • since:
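The prediction and update recursions referenced on slides 27–29 can be written, in the standard notation of the Arulampalam et al. tutorial cited above (state x_k, measurements z_{1:k}), as:

```latex
% Prediction (Chapman--Kolmogorov), using the Markov property of the state:
p(x_k \mid z_{1:k-1}) = \int p(x_k \mid x_{k-1})\, p(x_{k-1} \mid z_{1:k-1})\, dx_{k-1}

% Update (Bayes' rule), using conditional independence of the measurement:
p(x_k \mid z_{1:k}) = \frac{p(z_k \mid x_k)\, p(x_k \mid z_{1:k-1})}{p(z_k \mid z_{1:k-1})},
\qquad
p(z_k \mid z_{1:k-1}) = \int p(z_k \mid x_k)\, p(x_k \mid z_{1:k-1})\, dx_k
```

The prediction integral is rarely tractable in closed form, which is precisely what motivates the sample-based approximations of the following slides.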

  30. Bayes Filters (second pass) • Estimating the system state from noisy observations • System state dynamics • Observation dynamics • We are interested in the belief, or posterior density

  31. From above, constructing two steps of Bayes Filters • Predict: • Update:

  32. Assumptions: Markov Process • Predict: • Update:

  33. Bayes Filter • How to use it? What else to know? • Motion Model • Perceptual Model • Start from an initial belief (prior):
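The predict/update cycle above can be sketched as a discrete Bayes filter on a 1D grid of positions. The motion model, the door locations, and the sensor probabilities below are made-up values for illustration:

```python
import numpy as np

n_cells = 10
belief = np.full(n_cells, 1.0 / n_cells)      # start from a uniform prior

def predict(belief):
    # Motion model p(x_t | x_{t-1}, u): intended move is +1 cell,
    # with some probability of staying put or overshooting.
    return (0.8 * np.roll(belief, 1)
            + 0.1 * belief
            + 0.1 * np.roll(belief, 2))

def update(belief, likelihood):
    posterior = likelihood * belief           # Bayes' rule, unnormalized
    return posterior / posterior.sum()        # normalize to a valid pdf

# Perceptual model p(z | x): the sensor reports "door"; doors sit at
# cells 0, 3, and 7 (hypothetical map).
p_z_given_x = np.where(np.isin(np.arange(n_cells), [0, 3, 7]), 0.9, 0.1)

belief = predict(belief)
belief = update(belief, p_z_given_x)
```

After one cycle the belief concentrates on the door cells, which is the histogram-filter analogue of what the particle filter does with samples.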

  34. Particle Filters: Compare Gaussian and Particle Filters

  35. Example 1: theoretical PDF

  36. Example 1: theoretical PDF • Step 0: initialization • Step 1: updating

  37. Example 2: Particle Filter • Step 0: initialization; each particle has the same weight • Step 1: updating weights; weights are proportional to p(z|x)

  38. Example 1 (continued) • Step 2: predicting • Step 3: updating • Step 4: predicting

  39. Robot Motion

  40. Example 2: Particle Filter

  41. Example 2: Particle Filter • Step 2: predicting; predict the new locations of particles • Step 3: updating weights; weights are proportional to p(z|x) • Step 4: predicting; predict the new locations of particles • Particles are more concentrated in the region where the person is more likely to be
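The predict/update/resample steps of these examples can be sketched as a complete 1D tracking loop. The motion, the noise levels, and the target trajectory below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1000
true_x = 0.0
particles = rng.normal(0.0, 2.0, size=n)     # initial particle cloud near 0

for t in range(20):
    true_x += 1.0                            # target moves +1 per time step
    z = true_x + rng.normal(0.0, 0.5)        # noisy observation of the target

    # Predict: propagate each particle through the (noisy) motion model.
    particles += 1.0 + rng.normal(0.0, 0.3, size=n)

    # Update: weights proportional to the likelihood p(z | x).
    w = np.exp(-0.5 * ((z - particles) / 0.5) ** 2)
    w /= w.sum()

    # Resample: particles concentrate where the target is more likely to be.
    particles = particles[rng.choice(n, size=n, p=w)]

estimate = particles.mean()                  # should track true_x = 20
```

The resampling step is what keeps the particle cloud concentrated in the high-probability region, exactly as the slide describes.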

  42. Robot Motion

  43. Compare Particle Filter with Bayes Filter with Known Distribution • Updating • Example 1 • Example 2 • Predicting • Example 1 • Example 2

  44. Classical approximations • Analytical methods: • Extended Kalman filter • Gaussian sums (Alspach et al. 1971) • These perform poorly in numerous cases of interest • Numerical methods: • point-mass approximations • splines (Bucy 1971, de Figueiro 1974) • These are very complex to implement and not flexible

  45. Monte Carlo Localization

  46. Mobile Robot Localization • Each particle is a potential pose of the robot • Proposal distribution is the motion model of the robot (prediction step) • The observation model is used to compute the importance weight (correction step)
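One prediction/correction iteration of the scheme above can be sketched as follows. Here each particle is a pose (x, y, theta) in a hypothetical 10 x 10 map, and the sensor is a simplified range finder that, for illustration, reads the distance along the x-axis to a wall at x = 10; all map and noise values are assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500
# Each particle is a candidate robot pose (x, y, theta).
poses = np.column_stack([rng.uniform(0, 10, n),
                         rng.uniform(0, 10, n),
                         rng.uniform(-np.pi, np.pi, n)])

# Prediction step: sample new poses from the motion model
# (commanded motion: drive forward 1 m, with odometry noise).
d = 1.0 + rng.normal(0, 0.1, n)
poses[:, 0] += d * np.cos(poses[:, 2])
poses[:, 1] += d * np.sin(poses[:, 2])
poses[:, 2] += rng.normal(0, 0.05, n)

# Correction step: importance weight from the observation model.
# The (hypothetical) sensor reports range z = 4.0 to the wall at x = 10;
# heading is ignored here to keep the sketch short.
z = 4.0
expected_range = 10.0 - poses[:, 0]
w = np.exp(-0.5 * ((z - expected_range) / 0.5) ** 2)
w /= w.sum()

# Resample: poses inconsistent with the measurement die out.
poses = poses[rng.choice(n, size=n, p=w)]
```

After resampling the cloud collapses onto poses with x near 6, the only region consistent with the range reading; repeated iterations with real sensor models resolve the remaining ambiguity in y and theta.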


  48. Sample-based Localization (sonar)
