
Particle Filter/Monte Carlo Localization


Presentation Transcript


  1. Advanced Mobile Robotics Probabilistic Robotics: Particle Filter/Monte Carlo Localization Dr. Jizhong Xiao Department of Electrical Engineering CUNY City College jxiao@ccny.cuny.edu

  2. Probabilistic Robotics Bayes Filter Implementations Particle filters Monte Carlo Localization

  3. Sample-based Localization (sonar)

  4. Particle Filter Definition: • A particle filter is a Bayes-filter implementation that samples the robot's workspace, weighting each sample by a function derived from the belief distribution of the previous stage. Basic principle: • Set of state hypotheses ("particles") • Survival of the fittest

  5. Why Particle Filters • Represent belief by random samples • Estimation of non-Gaussian, nonlinear processes • Particle filtering is a non-parametric inference algorithm: - suited to tracking non-linear dynamics - efficiently represents non-Gaussian distributions

  6. Function Approximation • Particle sets can be used to approximate functions • The more particles fall into an interval, the higher the probability of that interval
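As a rough illustration of this idea, the sketch below (in Python, with a standard normal as a stand-in target, which is an assumption for illustration) estimates an interval's probability mass by the fraction of samples that fall into it:

```python
import numpy as np

# A minimal sketch: approximate a density with samples. The fraction of
# samples falling into an interval approximates that interval's mass.
rng = np.random.default_rng(0)
samples = rng.normal(loc=0.0, scale=1.0, size=10_000)  # stand-in target: N(0, 1)

# Estimated probability that x lies in [0.5, 1.5]
p_interval = np.mean((samples >= 0.5) & (samples <= 1.5))
print(f"sample estimate: {p_interval:.3f}")  # analytic value is about 0.242
```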

  7. Particle Filter Projection

  8. Rejection Sampling • Let us assume that f(x)<1 for all x • Sample x from a uniform distribution • Sample c from [0,1] • if f(x) > c keep the sample • otherwise reject the sample
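A minimal Python sketch of this procedure; the target f below (unnormalized, bounded by 1, defined on [0, 1]) is a made-up example:

```python
import numpy as np

def rejection_sample(f, n, rng):
    # Assumes f(x) < 1 for all x in [0, 1], as on the slide.
    kept = []
    while len(kept) < n:
        x = rng.uniform(0.0, 1.0)   # sample x from a uniform distribution
        c = rng.uniform(0.0, 1.0)   # sample c from [0, 1]
        if f(x) > c:                # keep the sample ...
            kept.append(x)
        # ... otherwise reject it and draw again
    return np.array(kept)

# Made-up example target: a bump centred at 0.5, everywhere below 1
rng = np.random.default_rng(0)
samples = rejection_sample(lambda x: np.exp(-50.0 * (x - 0.5) ** 2), 1000, rng)
```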

  9. Importance Sampling Principle • We can even use a different distribution g to generate samples from f • By introducing an importance weight w, we can account for the difference between g and f • w(x) = f(x) / g(x) • f is often called the target • g is often called the proposal

  10. Importance Sampling Samples often cannot be drawn conveniently from the target distribution f. Instead, the importance sampler draws samples from a proposal distribution g, which has a simpler form. A sample of f is obtained by attaching the importance factor f/g as a weight to each sample x.
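A small Python sketch of this idea; the Gaussian target f and the wider Gaussian proposal g below are illustrative assumptions. The weighted samples approximate f, so the estimate of E_f[x^2] should come out near 1:

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):   # target density (assumed here to be N(0, 1) for illustration)
    return np.exp(-0.5 * x**2) / np.sqrt(2.0 * np.pi)

def g(x):   # proposal density N(0, 2^2): wider, and easy to sample from
    return np.exp(-0.5 * (x / 2.0) ** 2) / (2.0 * np.sqrt(2.0 * np.pi))

x = rng.normal(0.0, 2.0, size=10_000)   # draw samples from the proposal g
w = f(x) / g(x)                          # importance weights w = f / g
w /= w.sum()                             # normalize the weights

# The weighted samples now stand in for samples of f:
print(np.sum(w * x**2))                  # E_f[x^2] = 1 for N(0, 1)
```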

  11. Particle Filter Basics • Sample: randomly select M particles based on their weights (the same particle may be picked multiple times) • Predict: move particles according to the deterministic dynamics • Measure: get a likelihood for each new sample by making a prediction about the image's local appearance and comparing; then update the weight on the particle accordingly

  12. Particle Filter Basics • Particle filters represent a distribution by a set of samples drawn from the posterior distribution. • The more densely a sub-region of the state space is populated by samples, the more likely it is that the true state falls into this region. • Such a representation is approximate, but it is nonparametric and can therefore represent a much broader space of distributions than Gaussians. • Particle weights are given by the measurement model. • Re-sampling redistributes particles approximately according to the posterior. • The re-sampling step is a probabilistic implementation of the Darwinian idea of survival of the fittest: it refocuses the particle set on regions in state space with high posterior probability. • By doing so, it focuses the computational resources of the filter algorithm on the regions in the state space where they matter the most.

  13. Importance Sampling with Resampling: Landmark Detection Example

  14. Distributions

  15. Distributions Wanted: samples distributed according to p(x| z1, z2, z3)

  16. This is Easy! We can draw samples from p(x|zl) by adding noise to the detection parameters.

  17. Importance Sampling with Resampling

  18. Importance Sampling with Resampling (figure: weighted samples; after resampling)

  19. Particle Filter Algorithm • Prediction (Action): draw x^i_{t-1} from Bel(x_{t-1}), then draw x^i_t from p(x_t | x^i_{t-1}, u_{t-1}) • Correction (Measurement): the importance factor for x^i_t is w^i_t = p(z_t | x^i_t)

  20. Particle Filter Algorithm • Each particle is a hypothesis as to what the true world state may be at time t • Sampling from the state transition distribution • The importance factor incorporates the measurement into the particle set • Re-sampling: importance sampling

  21. Particle Filter Algorithm • Line 4 – hypothetical state: sampling from the state transition distribution; the set of particles obtained after M iterations is the filter's representation of the predicted belief • Line 5 – importance factor: incorporates the measurement into the particle set; the set of weighted particles represents the Bayes filter posterior • Lines 8 to 11 – importance sampling: the particle distribution changes by incorporating the importance weights into the re-sampling process • Survival of the fittest: after the resampling step, the particle set is refocused on the regions in state space with higher posterior probability, distributed approximately according to the posterior Bel(x_t) ∝ p(z_t | x_t) Bel⁻(x_t), where Bel⁻ is the predicted belief

  22. Particle Filter Algorithm (from Fox's slides)
Algorithm particle_filter(S_{t-1}, u_{t-1}, z_t):
  S_t = ∅, α = 0
  For i = 1 … n:   (generate new samples)
    Sample index j(i) from the discrete distribution given by w_{t-1}
    Sample x^i_t from p(x_t | x_{t-1}, u_{t-1}) using x^{j(i)}_{t-1} and u_{t-1}
    Compute importance weight w^i_t = p(z_t | x^i_t)
    Update normalization factor α = α + w^i_t
    Insert (x^i_t, w^i_t) into S_t
  For i = 1 … n:
    Normalize weights: w^i_t = w^i_t / α
  Return S_t
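A possible Python rendering of one pass of this loop. The motion_sample and meas_likelihood callables stand in for the motion model p(x_t | x_{t-1}, u_{t-1}) and the measurement model p(z_t | x_t); both are assumptions the caller must supply:

```python
import numpy as np

def particle_filter(particles, weights, u, z, motion_sample, meas_likelihood, rng):
    n = len(particles)
    # Sample index j(i) from the discrete distribution given by the weights
    idx = rng.choice(n, size=n, p=weights)
    # Sample each new particle from the motion model p(x_t | x_{t-1}, u_{t-1})
    new_particles = np.array([motion_sample(particles[j], u, rng) for j in idx])
    # Importance weight: measurement likelihood p(z_t | x_t)
    new_weights = np.array([meas_likelihood(z, x) for x in new_particles])
    new_weights /= new_weights.sum()  # normalize (the factor alpha above)
    return new_particles, new_weights
```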

  23. Resampling • Given: a set S of weighted samples • Wanted: random samples, where the probability of drawing x_i is given by w_i • Typically done n times with replacement to generate the new sample set S'

  24. Resampling Algorithm
Algorithm systematic_resampling(S, n):
  S' = ∅; c_1 = w^1
  For i = 2 … n: c_i = c_{i-1} + w^i   (generate CDF)
  u_1 ~ U(0, 1/n]; i = 1   (initialize threshold)
  For j = 1 … n:   (draw samples)
    While u_j > c_i: i = i + 1   (skip until next threshold is reached)
    Insert (x^i, 1/n) into S'
    u_{j+1} = u_j + 1/n   (increment threshold)
  Return S'
Also called stochastic universal sampling.
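A sketch of the same algorithm in Python, assuming the weights are already normalized:

```python
import numpy as np

def systematic_resampling(particles, weights, rng):
    # Weights are assumed normalized (they sum to 1).
    n = len(particles)
    cdf = np.cumsum(weights)            # generate the CDF
    u = rng.uniform(0.0, 1.0 / n)       # single random number: initialize threshold
    new_particles = []
    i = 0
    for _ in range(n):                  # draw n samples
        while u > cdf[i]:               # skip until next threshold is reached
            i += 1
        new_particles.append(particles[i])
        u += 1.0 / n                    # increment threshold
    return np.array(new_particles)      # each resampled particle gets weight 1/n
```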

  25. Particle Filters Pose particles are drawn at random and uniformly over the entire pose space.

  26. Sensor Information: Importance Sampling After the robot senses the door, MCL assigns an importance factor to each particle.

  27. Robot Motion Incorporating the robot motion and resampling leads to a new particle set with uniform importance weights, but with an increased number of particles near the three likely places.

  28. Sensor Information: Importance Sampling A new measurement assigns non-uniform importance weights to the particle set; most of the cumulative probability mass is now centered on the second door.

  29. Robot Motion Further motion leads to another re-sampling step, and a step in which a new particle set is generated according to the motion model

  30. Practical Considerations Particles represent a discrete approximation of continuous beliefs. 1. Density Extraction (Estimation) • extracting a continuous density from samples Density estimation methods: • Gaussian approximation – unimodal distributions • Histogram – multi-modal distributions; the probability of each bin is obtained by summing the weights of the particles that fall in its range • Kernel density estimation – multi-modal, smooth; each particle serves as a kernel: a Gaussian kernel is placed at each particle, and the overall density is a mixture of the kernel densities
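A minimal Python sketch of the kernel density estimation option; the bandwidth value and the example particle set below are arbitrary assumptions:

```python
import numpy as np

def kde_density(x, particles, weights, bandwidth=0.1):
    # Place a Gaussian kernel at each particle; the overall density is
    # the weight-mixture of the kernels, evaluated at the query point x.
    diffs = (x - particles) / bandwidth
    kernels = np.exp(-0.5 * diffs**2) / (bandwidth * np.sqrt(2.0 * np.pi))
    return np.sum(weights * kernels)

# Usage: evaluate the continuous belief at a query point
particles = np.array([0.9, 1.0, 1.1, 3.0])
weights = np.array([0.3, 0.3, 0.3, 0.1])
print(kde_density(1.0, particles, weights))
```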

  31. Different Ways of Extracting Densities from Particles • Gaussian approximation • Histogram approximation • Kernel density estimate

  32. Properties of Particle Filters Sampling Variance • Variation due to random sampling • The sampling variance decreases with the number of samples • A higher number of samples results in more accurate approximations with less variability • If enough samples are chosen, the robot's sample-based belief stays "close enough" to the true belief.

  33. Drawbacks • In order to explore a significant part of the state space, the number of particles must be very large, which induces complexity problems ill-suited to real-time implementation. • PF methods are very sensitive to inconsistent measurements or high measurement errors.

  34. Importance Sampling with Resampling • Resampling too often increases the risk of losing diversity; resampling too infrequently wastes many samples in regions of low probability. Reducing sampling error: 1. Variance reduction • Reduce the frequency of resampling • Maintain the importance weights in memory and update them as w^i_t = p(z_t | x^i_t) · w^i_{t-1}, resetting w^i_t = 1 whenever resampling takes place 2. Low variance sampling • Computes a single random number; any such number points to exactly one particle • Instead of choosing M random numbers and selecting particles accordingly, this procedure computes a single random number and selects samples according to it, still with a probability proportional to the sample weight

  35. Low variance resampling • Draw a random number r in the interval (0, 1/M] • Select particles by repeatedly adding the fixed amount 1/M to r and choosing the particle that corresponds to the resulting number

  36. Low Variance Resampling Procedure Principle: • Selection involves a sequential stochastic process • Select samples w.r.t. a single random number r, but with a probability proportional to the sample weight

  37. Advantages of Particle Filters: • can deal with non-linearities • can deal with non-Gaussian noise • mostly parallelizable • easy to implement • PFs focus adaptively on probable regions of state-space

  38. Summary • Particle filters are an implementation of recursive Bayesian filtering • They represent the posterior by a set of weighted samples. • In the context of localization, the particles are propagated according to the motion model. • They are then weighted according to the likelihood of the observations. • In a re-sampling step, new particles are drawn with a probability proportional to the likelihood of the observation

  39. Monte Carlo Localization • One of the most popular particle filter methods for robot localization • Reference: • Dieter Fox, Wolfram Burgard, Frank Dellaert, Sebastian Thrun, “Monte Carlo Localization: Efficient Position Estimation for Mobile Robots”, Proc. 16th National Conference on Artificial Intelligence, AAAI’99, July 1999

  40. MCL in action “Monte Carlo” Localization -- refers to the resampling of the distribution each time a new observation is integrated

  41. Monte Carlo Localization • The probability density function is represented by samples randomly drawn from it • It can also represent multi-modal distributions, and thus localize the robot globally • Considerably reduces the amount of memory required and can integrate measurements at a higher rate • The state is not discretized, so the method is more accurate than grid-based methods • Easy to implement

  42. Robot modeling • map m and location x • sensor model p(z | x, m) • the slide illustrates two potential observations z (shown as images), with p(z | x, m) = 0.75 for one and p(z | x, m) = 0.05 for the other

  43. Robot modeling • map m and location x • sensor model p(z | x, m) • action model p(x_new | x_old, u, m) • example likelihoods for two potential observations z, as on the previous slide: p(z | x, m) = 0.75 and p(z | x, m) = 0.05 • "probabilistic kinematics" – encoder uncertainty • red lines indicate the commanded action • the cloud indicates the likelihood of various final states

  44. Robot modeling: how-to • p(z | x, m) sensor model; p(x_new | x_old, u, m) action model • (0) Theoretical modeling: model the physics of the sensor/actuators (with error estimates) • (1) Empirical modeling: measure lots of sensing/action results and create a model from them – take N measurements, find the mean m and standard deviation s, and then use a Gaussian model, or some other easily-manipulated model, e.g. p(x) = 1 if |x − m| ≤ s and 0 otherwise, or p(x) = 1 − |x − m|/s if |x − m| ≤ s and 0 otherwise • (2) Make something up...
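A short Python sketch of option (1), empirical modeling with the Gaussian choice; the sonar readings below are hypothetical:

```python
import numpy as np

def fit_gaussian_sensor_model(measurements):
    # Take N measurements, find the mean m and standard deviation s,
    # then use a Gaussian model for p(z | x, m), as the slide suggests.
    m = np.mean(measurements)
    s = np.std(measurements)
    def p(z):
        return np.exp(-0.5 * ((z - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))
    return p

# Hypothetical example: repeated sonar readings of a wall about 2 m away
readings = np.array([1.97, 2.03, 2.01, 1.99, 2.05, 1.95])
sensor_model = fit_gaussian_sensor_model(readings)
print(sensor_model(2.0))   # likelihood of reading 2.0 m
```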

  45. Motion Model Reminder Start

  46. Proximity Sensor Model Reminder Sonar sensor Laser sensor

  47. Monte Carlo Localization Start by assuming p(x0) is the uniform distribution. Take K samples of x0 and weight each with an importance factor of 1/K.

  48. Monte Carlo Localization Start by assuming p(x0) is the uniform distribution. Take K samples of x0 and weight each with an importance factor of 1/K. Get the current sensor observation z1. For each sample point x0, multiply the importance factor by p(z1 | x0, m).
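Putting slides 47 and 48 together, a minimal Python sketch of the initialization and the first measurement update; the pose bounds and the toy sensor model are placeholder assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
K = 1000

# Assume p(x0) uniform: take K pose samples (x, y, theta) over an
# assumed workspace, each with an importance factor of 1/K.
particles = rng.uniform(low=[0.0, 0.0, -np.pi],
                        high=[10.0, 10.0, np.pi],
                        size=(K, 3))
weights = np.full(K, 1.0 / K)

def p_z_given_x(z, x):                      # toy stand-in for p(z1 | x0, m)
    return np.exp(-0.5 * (z - x[0]) ** 2)   # e.g. a range reading along x

# Get the current sensor observation z1, then multiply each particle's
# importance factor by p(z1 | x0, m) and re-normalize.
z1 = 3.0
weights *= np.array([p_z_given_x(z1, x) for x in particles])
weights /= weights.sum()
```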
