
Particle Filters Pieter Abbeel UC Berkeley EECS



  1. Particle Filters. Pieter Abbeel, UC Berkeley EECS. Many slides adapted from Thrun, Burgard, and Fox, Probabilistic Robotics.

  2. Motivation
  • For continuous spaces there are often no analytical formulas for Bayes filter updates.
  • Solution 1: Histogram filters (not covered in this lecture):
    • Partition the state space and keep track of the probability of each partition.
    • Challenges: What are the dynamics for the partitioned model? What is the measurement model? Often a very fine resolution is required to get reasonable results.
  • Solution 2: Particle filters:
    • Represent the belief by random samples.
    • Can use the actual dynamics and measurement models.
    • Naturally allocate computational resources where required (~ adaptive resolution).
    • Also known as: Monte Carlo filter, survival of the fittest, condensation, bootstrap filter.

  3. Sample-based Localization (sonar)

  4. Problem to be Solved
  • Given a sample-based representation of Bel(xt) = P(xt | z1, …, zt, u1, …, ut), find a sample-based representation of Bel(xt+1) = P(xt+1 | z1, …, zt, zt+1, u1, …, ut+1).

  5. Dynamics Update
  • Given a sample-based representation of Bel(xt) = P(xt | z1, …, zt, u1, …, ut), find a sample-based representation of P(xt+1 | z1, …, zt, u1, …, ut+1).
  • Solution: for i = 1, 2, …, N, sample x(i)t+1 from P(Xt+1 | Xt = x(i)t).
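The dynamics update can be sketched in a few lines of Python. The slides leave the motion model P(Xt+1 | Xt) abstract, so a random walk with additive Gaussian noise driven by a control input `u` is assumed here purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def dynamics_update(particles, u, motion_noise_std=0.1):
    """Sample x(i)_{t+1} ~ P(X_{t+1} | X_t = x(i)_t) for every particle.

    Assumed motion model (illustration only):
    x_{t+1} = x_t + u + Gaussian noise.
    """
    noise = rng.normal(0.0, motion_noise_std, size=particles.shape)
    return particles + u + noise

# N = 1000 samples of a 2-D state, drawn from a standard-normal prior
particles = rng.normal(0.0, 1.0, size=(1000, 2))
particles = dynamics_update(particles, u=np.array([0.5, 0.0]))
```

Note that each particle is propagated independently, so the sample set after the update is again an unweighted representation of the predicted belief.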

  6. Sampling Intermezzo

  7. Observation Update
  • Given a sample-based representation of P(xt+1 | z1, …, zt), find a sample-based representation of P(xt+1 | z1, …, zt, zt+1) = C · P(xt+1 | z1, …, zt) · P(zt+1 | xt+1).
  • Solution: for i = 1, 2, …, N, set w(i)t+1 = w(i)t · P(zt+1 | Xt+1 = x(i)t+1).
  • The distribution is then represented by the weighted set of samples.
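A minimal sketch of the observation update, assuming a Gaussian sensor model that observes the first state coordinate (the slides keep P(zt+1 | xt+1) abstract). The constant C never needs to be computed: it cancels when the weights are renormalized.

```python
import numpy as np

def observation_update(weights, particles, z, meas_noise_std=0.5):
    """Reweight each particle by the measurement likelihood P(z | x).

    Assumed sensor model (illustration only): z is a noisy Gaussian
    reading of the first state coordinate.
    """
    likelihood = np.exp(-0.5 * ((z - particles[:, 0]) / meas_noise_std) ** 2)
    new_weights = weights * likelihood
    return new_weights / new_weights.sum()  # renormalization absorbs C
```

Particles consistent with the measurement end up with larger weights; particles far from it are downweighted but not moved.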

  8. Sequential Importance Sampling (SIS) Particle Filter
  • Sample x(1)1, x(2)1, …, x(N)1 from P(X1)
  • Set w(i)1 = 1 for all i = 1, …, N
  • For t = 1, 2, …
    • Dynamics update: for i = 1, 2, …, N, sample x(i)t+1 from P(Xt+1 | Xt = x(i)t)
    • Observation update: for i = 1, 2, …, N, set w(i)t+1 = w(i)t · P(zt+1 | Xt+1 = x(i)t+1)
  • At any time t, the distribution is represented by the weighted set of samples { <x(i)t, w(i)t> ; i = 1, …, N }
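The full SIS loop can be sketched as follows for a 1-D state, again assuming an illustrative Gaussian random-walk motion model and a direct noisy observation of the state (both are placeholders for the abstract models in the slides):

```python
import numpy as np

rng = np.random.default_rng(1)

def sis_filter(observations, n_particles=500, motion_std=0.1, meas_std=0.5):
    """Sequential Importance Sampling: propagate and reweight, no resampling."""
    # Sample x(i)_1 from the prior P(X1); start with uniform weights.
    particles = rng.normal(0.0, 1.0, size=n_particles)
    weights = np.full(n_particles, 1.0 / n_particles)
    for z in observations:
        # Dynamics update: sample x(i)_{t+1} ~ P(X_{t+1} | X_t = x(i)_t).
        particles = particles + rng.normal(0.0, motion_std, size=n_particles)
        # Observation update: w(i)_{t+1} = w(i)_t * P(z_{t+1} | x(i)_{t+1}).
        weights = weights * np.exp(-0.5 * ((z - particles) / meas_std) ** 2)
        weights = weights / weights.sum()
    return particles, weights
```

Because there is no resampling, most weights shrink toward zero over time; that degeneracy is exactly the issue raised on the next slide.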

  9. SIS Particle Filter: Major Issue
  • The resulting samples are only weighted by the evidence; the samples themselves are never affected by the evidence.
  • As a result, the filter fails to concentrate particles/computation in the high-probability regions of the distribution P(xt | z1, …, zt).

  10. Sequential Importance Resampling (SIR)
  • At any time t, the distribution is represented by the weighted set of samples { <x(i)t, w(i)t> ; i = 1, …, N }.
  • Resampling: draw N times from this set of particles, where the probability of drawing each particle is given by its importance weight.
  • This focuses more particles/computation on the parts of the state space with high probability mass.
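The resampling step can be sketched with a small (hypothetical) helper. After resampling, the weights are reset to uniform, since the selection step has already accounted for them:

```python
import numpy as np

rng = np.random.default_rng(0)

def resample(particles, weights):
    """Draw N particles with replacement, P(draw i) = w(i); reset weights."""
    n = len(particles)
    idx = rng.choice(n, size=n, p=weights)
    return particles[idx], np.full(n, 1.0 / n)
```

High-weight particles are typically duplicated several times and low-weight particles are likely to disappear, which is what concentrates computation in the high-probability regions.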

  11. Algorithm particle_filter(St-1, ut, zt):
  • St = ∅, η = 0
  • For i = 1, …, N:  (generate new samples)
    • Sample an index j(i) from the discrete distribution given by the weights in St-1
    • Sample x(i)t from P(xt | xt-1, ut) using x(j(i))t-1 and ut
    • Compute the importance weight w(i)t = P(zt | x(i)t)
    • Update the normalization factor: η = η + w(i)t
    • Insert <x(i)t, w(i)t> into St
  • For i = 1, …, N:
    • Normalize the weights: w(i)t = w(i)t / η
  • Return St
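One step of this algorithm might look as follows in Python, with illustrative Gaussian motion and sensor models standing in for the abstract P(xt | xt-1, ut) and P(zt | xt):

```python
import numpy as np

rng = np.random.default_rng(2)

def particle_filter_step(particles, weights, u, z,
                         motion_std=0.1, meas_std=0.5):
    """One SIR step: resample, propagate, reweight, normalize.

    Assumed models (illustration only): x_t = x_{t-1} + u + noise,
    and z_t is a noisy Gaussian reading of x_t.
    """
    n = len(particles)
    # Sample index j(i) from the discrete distribution given by w_{t-1}.
    idx = rng.choice(n, size=n, p=weights)
    # Sample x(i)_t from P(x_t | x_{t-1} = x(j(i))_{t-1}, u_t).
    particles = particles[idx] + u + rng.normal(0.0, motion_std, size=n)
    # Importance weight w(i)_t = P(z_t | x(i)_t), then normalize by eta.
    w = np.exp(-0.5 * ((z - particles) / meas_std) ** 2)
    return particles, w / w.sum()
```

Dividing by `w.sum()` plays the role of the normalization factor η accumulated in the pseudocode above.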

  12. Particle Filters

  13. Sensor Information: Importance Sampling

  14. Robot Motion

  15. Sensor Information: Importance Sampling

  16. Robot Motion

  17. Summary – Particle Filters
  • Particle filters are an implementation of recursive Bayesian filtering.
  • They represent the posterior by a set of weighted samples.
  • They can model non-Gaussian distributions.
  • A proposal distribution is used to draw new samples.
  • Weights account for the difference between the proposal and the target distribution.

  18. Summary – PF Localization
  • In the context of localization, the particles are propagated according to the motion model.
  • They are then weighted according to the likelihood of the observations.
  • In a resampling step, new particles are drawn with probability proportional to the likelihood of the observation.
