Markov Localization & Bayes Filtering
Presentation Transcript

  1. Markov Localization & Bayes Filtering: Kalman Filters, Discrete Filters, Particle Filters. Slides adapted from Thrun et al., Probabilistic Robotics

  2. Markov Localization The robot doesn’t know where it is. Thus, a reasonable initial belief of its position is a uniform distribution.

  3. Markov Localization A sensor reading is made (USE SENSOR MODEL), indicating a door at certain locations (USE MAP). This sensor reading is integrated with the prior belief to update our belief (USE BAYES).

  4. Markov Localization The robot is moving (USE MOTION MODEL), which adds noise.

  5. Markov Localization A new sensor reading (USE SENSOR MODEL) indicates a door at certain locations (USE MAP). This sensor reading is integrated with the prior belief to update our belief (USE BAYES).

  6. Markov Localization The robot is moving (USE MOTION MODEL), which adds noise. …
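The measure-then-move cycle of slides 2-6 can be sketched on a 1-D cyclic grid. The map, sensor accuracy, and motion-noise values below are illustrative assumptions, not numbers from the slides:

```python
# Minimal sketch of 1-D grid Markov localization (hypothetical map and
# noise values; the slides give no concrete numbers).

def measure(bel, p_hit, doors):
    # Sensor update: multiply belief by P(z=door | x), then normalize (Bayes).
    new = [b * (p_hit if d else 1.0 - p_hit) for b, d in zip(bel, doors)]
    s = sum(new)
    return [b / s for b in new]

def move(bel, step, p_exact=0.8, p_under=0.1, p_over=0.1):
    # Motion update: convolve belief with a noisy shift kernel (cyclic world).
    n = len(bel)
    return [p_exact * bel[(i - step) % n]
            + p_under * bel[(i - step + 1) % n]
            + p_over * bel[(i - step - 1) % n]
            for i in range(n)]

n = 10
doors = [i in (0, 3, 4) for i in range(n)]   # door locations (the map)
bel = [1.0 / n] * n                          # uniform prior: robot is lost

bel = measure(bel, 0.9, doors)  # sees a door: mass concentrates at 0, 3, 4
bel = move(bel, 1)              # moves one cell: belief shifts and blurs
bel = measure(bel, 0.9, doors)  # sees a door again
# Belief now peaks at cell 4: only a start at the door in cell 3,
# followed by a one-cell move to the door in cell 4, fits both readings.
```

The two updates together are exactly the prediction/correction structure that the later Bayes filter slides make formal.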

  7. Bayes Formula P(x | z) = P(z | x) P(x) / P(z), i.e. posterior = likelihood × prior / evidence.

  8. Bayes Rule with Background Knowledge P(x | z, y) = P(z | x, y) P(x | y) / P(z | y)

  9. Normalization Algorithm: for all x compute aux(x) = P(z | x) P(x); set η = 1 / Σx aux(x); then P(x | z) = η aux(x).
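The normalization step can be written out directly; the prior and likelihood numbers below are assumed values for illustration:

```python
# Sketch of the normalization algorithm above: compute
# aux(x) = P(z|x) P(x) for every x, then eta = 1 / sum(aux),
# so the posterior P(x|z) = eta * aux(x) sums to one.

prior = {"door": 0.3, "wall": 0.7}        # P(x), assumed values
likelihood = {"door": 0.6, "wall": 0.2}   # P(z|x) for one observation z

aux = {x: likelihood[x] * prior[x] for x in prior}
eta = 1.0 / sum(aux.values())             # eta = 1 / P(z)
posterior = {x: eta * v for x, v in aux.items()}
# posterior["door"] = 0.18 / 0.32 = 0.5625
```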

  10. Recursive Bayesian Updating Markov assumption: zn is independent of z1, ..., zn-1 if we know x. Then P(x | z1, ..., zn) = η P(zn | x) P(x | z1, ..., zn-1).

  11. Putting observations and actions together: Bayes Filters • Given: • Stream of observations z and action data u: • Sensor model P(z | x). • Action model P(x | u, x’). • Prior probability of the system state P(x). • Wanted: • Estimate of the state X of a dynamical system. • The posterior of the state is also called the Belief: Bel(xt) = P(xt | u1, z1, ..., ut, zt)

  12. Graphical Representation and Markov Assumption Underlying Assumptions • Static world • Independent noise • Perfect model, no approximation errors

  13. Bayes Filters (z = observation, u = action, x = state) Derivation of the recursive update:
Bel(xt) = P(xt | u1, z1, ..., ut, zt)
(Bayes) = η P(zt | xt, u1, ..., ut) P(xt | u1, ..., ut)
(Markov) = η P(zt | xt) P(xt | u1, ..., ut)
(Total prob.) = η P(zt | xt) ∫ P(xt | u1, ..., ut, xt-1) P(xt-1 | u1, ..., ut) dxt-1
(Markov) = η P(zt | xt) ∫ P(xt | ut, xt-1) P(xt-1 | u1, z1, ..., zt-1) dxt-1
(Markov) = η P(zt | xt) ∫ P(xt | ut, xt-1) Bel(xt-1) dxt-1

  14. Bayes Filters • Prediction: Bel(xt) = ∫ P(xt | ut, xt-1) Bel(xt-1) dxt-1 • Correction: Bel(xt) = η P(zt | xt) Bel(xt)

  15. Bayes Filter Algorithm • Algorithm Bayes_filter( Bel(x), d ): • η = 0 • If d is a perceptual data item z then • For all x do: Bel’(x) = P(z | x) Bel(x); η = η + Bel’(x) • For all x do: Bel’(x) = η⁻¹ Bel’(x) • Else if d is an action data item u then • For all x do: Bel’(x) = ∫ P(x | u, x’) Bel(x’) dx’ • Return Bel’(x)
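For a finite state space, the algorithm above can be transcribed almost line by line. This is a sketch; the `sensor_model` and `action_model` callables are assumed inputs, and the two-state usage example is invented for illustration:

```python
# Transcription of Bayes_filter(Bel(x), d) for a finite state space.

def bayes_filter(bel, d, kind, sensor_model=None, action_model=None):
    """bel: dict mapping state -> probability.
    kind "z": d is an observation, sensor_model(d, x) = P(d | x).
    kind "u": d is an action, action_model(x, d, xp) = P(x | d, xp)."""
    if kind == "z":                        # perceptual data item
        new = {x: sensor_model(d, x) * p for x, p in bel.items()}
        eta = sum(new.values())            # normalizer eta
        return {x: p / eta for x, p in new.items()}
    else:                                  # action data item
        return {x: sum(action_model(x, d, xp) * p for xp, p in bel.items())
                for x in bel}

# Usage: two states, a sensor that is right 80% of the time.
bel = {"a": 0.5, "b": 0.5}
sensor = lambda z, x: 0.8 if x == z else 0.2
bel = bayes_filter(bel, "a", "z", sensor_model=sensor)
# After observing "a": bel["a"] = 0.4 / 0.5 = 0.8
```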

  16. Bayes Filters are Familiar! • Kalman filters • Particle filters • Hidden Markov models • Dynamic Bayesian networks • Partially Observable Markov Decision Processes (POMDPs)

  17. Probabilistic Robotics Bayes Filter Implementations Gaussian filters

  18. Gaussians • Univariate: x ~ N(μ, σ²), p(x) = (2πσ²)^(-1/2) exp(-(x-μ)²/(2σ²)) • Linear transform of Gaussians: X ~ N(μ, σ²) implies aX + b ~ N(aμ + b, a²σ²)

  19. Multivariate Gaussians • We stay in the “Gaussian world” as long as we start with Gaussians and perform only linear transformations.

  20. Discrete Kalman Filter Estimates the state x of a discrete-time controlled process that is governed by the linear stochastic difference equation xt = At xt-1 + Bt ut + εt, with a measurement zt = Ct xt + δt.

  21. Linear Gaussian Systems: Initialization • Initial belief is normally distributed: Bel(x0) = N(x0; μ0, Σ0)

  22. Linear Gaussian Systems: Dynamics • Dynamics are a linear function of state and control plus additive Gaussian noise: xt = At xt-1 + Bt ut + εt, with εt ~ N(0, Rt)

  23. Linear Gaussian Systems: Observations • Observations are a linear function of state plus additive Gaussian noise: zt = Ct xt + δt, with δt ~ N(0, Qt)

  24. Kalman Filter Algorithm • Algorithm Kalman_filter( μt-1, Σt-1, ut, zt ): • Prediction: μ̄t = At μt-1 + Bt ut; Σ̄t = At Σt-1 Atᵀ + Rt • Correction: Kt = Σ̄t Ctᵀ (Ct Σ̄t Ctᵀ + Qt)⁻¹; μt = μ̄t + Kt (zt - Ct μ̄t); Σt = (I - Kt Ct) Σ̄t • Return μt, Σt
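In one dimension the matrices collapse to scalars, which makes the prediction/correction steps above easy to follow. The model parameters a, b, c and noise variances r, q below are assumed values, not from the slides:

```python
# One-dimensional instance of Kalman_filter(mu, sigma, u, z) above,
# with scalars in place of the matrices A, B, C and the noise
# covariances R, Q (all parameter values are illustrative).

def kalman_filter(mu, sigma, u, z, a=1.0, b=1.0, c=1.0, r=0.5, q=1.0):
    # Prediction: push the belief through the linear dynamics.
    mu_bar = a * mu + b * u
    sigma_bar = a * sigma * a + r
    # Correction: blend prediction and measurement via the Kalman gain.
    k = sigma_bar * c / (c * sigma_bar * c + q)
    mu_new = mu_bar + k * (z - c * mu_bar)
    sigma_new = (1.0 - k * c) * sigma_bar
    return mu_new, sigma_new

mu, sigma = 0.0, 10.0   # broad initial belief
mu, sigma = kalman_filter(mu, sigma, u=1.0, z=1.2)
# mu moves most of the way toward the measurement (prior was very
# uncertain), and sigma shrinks from 10.0 to about 0.913.
```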

  25. Kalman Filter Summary • Highly efficient: polynomial in measurement dimensionality k and state dimensionality n: O(k^2.376 + n^2) • Optimal for linear Gaussian systems! • Most robotics systems are nonlinear!

  26. Nonlinear Dynamic Systems • Most realistic robotic problems involve nonlinear functions

  27. Linearity Assumption Revisited

  28. Non-linear Function

  29. EKF Linearization (1)

  30. EKF Linearization (2)

  31. EKF Linearization (3)

  32. EKF Linearization: First Order Taylor Series Expansion • Prediction: g(ut, xt-1) ≈ g(ut, μt-1) + Gt (xt-1 - μt-1), where Gt is the Jacobian ∂g/∂xt-1 evaluated at μt-1 • Correction: h(xt) ≈ h(μ̄t) + Ht (xt - μ̄t), where Ht is the Jacobian ∂h/∂xt evaluated at μ̄t

  33. EKF Algorithm • Extended_Kalman_filter( μt-1, Σt-1, ut, zt ): • Prediction: μ̄t = g(ut, μt-1); Σ̄t = Gt Σt-1 Gtᵀ + Rt • Correction: Kt = Σ̄t Htᵀ (Ht Σ̄t Htᵀ + Qt)⁻¹; μt = μ̄t + Kt (zt - h(μ̄t)); Σt = (I - Kt Ht) Σ̄t • Return μt, Σt
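A 1-D sketch shows how the EKF differs from the plain Kalman filter: the nonlinear models g and h are used directly for the mean, while their derivatives G and H stand in for A and C in the covariance and gain. The particular g, h, and noise values here are invented for illustration:

```python
import math

# 1-D EKF sketch: nonlinear motion g and measurement h are linearized
# at the current mean via their scalar derivatives G and H
# (the models and noise variances r, q are illustrative assumptions).

def ekf(mu, sigma, u, z, r=0.1, q=0.5):
    g = lambda x: x + u * math.cos(x)   # nonlinear motion model (assumed)
    G = 1.0 - u * math.sin(mu)          # dg/dx evaluated at mu
    mu_bar = g(mu)                      # prediction uses g itself...
    sigma_bar = G * sigma * G + r       # ...covariance uses the Jacobian
    h = lambda x: x ** 2                # nonlinear measurement model (assumed)
    H = 2.0 * mu_bar                    # dh/dx evaluated at mu_bar
    k = sigma_bar * H / (H * sigma_bar * H + q)
    mu_new = mu_bar + k * (z - h(mu_bar))       # innovation uses h itself
    sigma_new = (1.0 - k * H) * sigma_bar
    return mu_new, sigma_new

mu, sigma = ekf(0.5, 1.0, u=0.2, z=0.5)   # variance shrinks after correction
```

Note that because G and H are only valid near the linearization point, large nonlinearities make this approximation poor, which is exactly the divergence risk the EKF summary slide warns about.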

  34. Localization “Using sensory information to locate the robot in its environment is the most fundamental problem to providing a mobile robot with autonomous capabilities.” [Cox ’91] • Given • Map of the environment. • Sequence of sensor measurements. • Wanted • Estimate of the robot’s position. • Problem classes • Position tracking • Global localization • Kidnapped robot problem (recovery)

  35. Landmark-based Localization

  36. EKF Summary • Highly efficient: polynomial in measurement dimensionality k and state dimensionality n: O(k^2.376 + n^2) • Not optimal! • Can diverge if nonlinearities are large! • Works surprisingly well even when all assumptions are violated!

  37. Kalman Filter-based System • [Arras et al. 98]: • Laser range-finder and vision • High precision (<1cm accuracy) [Courtesy of Kai Arras]

  38. Multi-hypothesis Tracking

  39. Localization With MHT • Belief is represented by multiple hypotheses • Each hypothesis is tracked by a Kalman filter • Additional problems: • Data association: Which observation corresponds to which hypothesis? • Hypothesis management: When to add / delete hypotheses? • Huge body of literature on target tracking, motion correspondence etc.

  40. MHT: Implemented System (2) Courtesy of P. Jensfelt and S. Kristensen

  41. Probabilistic Robotics Bayes Filter Implementations Discrete filters

  42. Piecewise Constant

  43. Discrete Bayes Filter Algorithm • Algorithm Discrete_Bayes_filter( Bel(x), d ): • η = 0 • If d is a perceptual data item z then • For all x do: Bel’(x) = P(z | x) Bel(x); η = η + Bel’(x) • For all x do: Bel’(x) = η⁻¹ Bel’(x) • Else if d is an action data item u then • For all x do: Bel’(x) = Σx’ P(x | u, x’) Bel(x’) • Return Bel’(x)

  44. Grid-based Localization

  45. Sonars and Occupancy Grid Map

  46. Probabilistic Robotics Bayes Filter Implementations Particle filters

  47. Sample-based Localization (sonar)

  48. Particle Filters • Represent belief by random samples • Estimation of non-Gaussian, nonlinear processes • Also known as: Monte Carlo filter, survival of the fittest, Condensation, bootstrap filter, particle filter • Filtering: [Rubin, 88], [Gordon et al., 93], [Kitagawa 96] • Computer vision: [Isard and Blake 96, 98] • Dynamic Bayesian Networks: [Kanazawa et al., 95]

  49. Importance Sampling Weight samples: w = f / g, where f is the target distribution and g is the proposal distribution the samples were actually drawn from.
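The weighting rule w = f / g can be checked numerically. In this sketch the target f is a standard normal, the proposal g is uniform on [-5, 5] (both chosen here for illustration), and the weighted samples recover an expectation under f even though they were drawn from g:

```python
import math, random

# Importance sampling sketch: draw from proposal g, weight each sample
# by w = f(x) / g(x); weighted averages then approximate expectations
# under the target f (the densities below are illustrative choices).

random.seed(0)

def f(x):   # target: standard normal density
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

def g(x):   # proposal: uniform density on [-5, 5]
    return 0.1 if -5.0 <= x <= 5.0 else 0.0

samples = [random.uniform(-5.0, 5.0) for _ in range(100000)]
weights = [f(x) / g(x) for x in samples]

# Estimate E_f[x^2] from proposal samples; should be close to Var(f) = 1.
est = sum(w * x * x for w, x in zip(weights, samples)) / len(samples)
```

In a particle filter the same idea appears with f as the posterior and g as the prediction distribution the particles were propagated through, with resampling keeping the weights from degenerating.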