
Introduction to Mobile Robotics


Presentation Transcript

1. Introduction to Mobile Robotics Bayes Filter Implementations Gaussian filters

2. Bayes Filter Reminder • Prediction: bel̄(xt) = ∫ p(xt | ut, xt-1) bel(xt-1) dxt-1 • Correction: bel(xt) = η p(zt | xt) bel̄(xt)
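The prediction and correction steps above can be sketched as a minimal histogram (discrete) Bayes filter; the state space, motion matrix, and sensor likelihood below are illustrative values, not from the slides.

```python
import numpy as np

def bayes_filter_step(belief, motion_model, sensor_likelihood):
    """One prediction/correction cycle of a discrete (histogram) Bayes filter.

    belief: 1-D array over discrete states, sums to 1
    motion_model: M[i, j] = p(x_t = i | x_{t-1} = j)
    sensor_likelihood: L[i] = p(z_t | x_t = i)
    """
    predicted = motion_model @ belief          # prediction: sum over x_{t-1}
    corrected = sensor_likelihood * predicted  # correction: weight by p(z | x)
    return corrected / corrected.sum()         # normalization (the eta term)

# Toy example: 3 cells, robot usually moves one cell to the right.
M = np.array([[0.1, 0.0, 0.0],
              [0.8, 0.1, 0.0],
              [0.1, 0.9, 1.0]])
L = np.array([0.1, 0.7, 0.2])                  # sensor favors cell 1
bel = bayes_filter_step(np.array([1.0, 0.0, 0.0]), M, L)
```

Both steps preserve a valid probability distribution; the Gaussian filters that follow replace the histogram with a mean and covariance.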

3. Gaussians • Univariate: p(x) ~ N(μ, σ²): p(x) = 1/(√(2π) σ) exp(-(x - μ)² / (2σ²)) • Multivariate: p(x) ~ N(μ, Σ): p(x) = 1/((2π)^(d/2) |Σ|^(1/2)) exp(-½ (x - μ)ᵀ Σ⁻¹ (x - μ))

4. Properties of Gaussians

5. Multivariate Gaussians • We stay in the “Gaussian world” as long as we start with Gaussians and perform only linear transformations.
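A quick numerical check of this closure property (all matrix values below are arbitrary illustrative choices): if x ~ N(μ, Σ) and y = Ax + b, then y ~ N(Aμ + b, AΣAᵀ).

```python
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([1.0, 2.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
A = np.array([[1.0, 1.0],
              [0.0, 2.0]])
b = np.array([0.5, -1.0])

# Closed form: a linear map of a Gaussian is again Gaussian.
mu_y = A @ mu + b
Sigma_y = A @ Sigma @ A.T

# Empirical check by sampling and transforming.
x = rng.multivariate_normal(mu, Sigma, size=200_000)
y = x @ A.T + b
```

The sample mean and covariance of `y` match the closed-form `mu_y` and `Sigma_y`, which is exactly why Kalman-type filters can propagate only a mean and a covariance.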

6. Discrete Kalman Filter • Estimates the state x of a discrete-time controlled process that is governed by the linear stochastic difference equation xt = At xt-1 + Bt ut + εt, with a measurement zt = Ct xt + δt

7. Components of a Kalman Filter • At: Matrix (n×n) that describes how the state evolves from t-1 to t without controls or noise. • Bt: Matrix (n×l) that describes how the control ut changes the state from t-1 to t. • Ct: Matrix (k×n) that describes how to map the state xt to an observation zt. • εt, δt: Random variables representing the process and measurement noise; assumed independent and normally distributed with covariances Rt and Qt, respectively.

8. Kalman Filter Updates in 1D

9. Kalman Filter Updates in 1D

10. Kalman Filter Updates in 1D
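The 1-D updates illustrated in slides 8-10 reduce to two short formulas, sketched here (function and variable names are mine, not from the slides): a measurement pulls the mean toward the observation and shrinks the variance, while motion shifts the mean and grows the variance.

```python
def update_1d(mu, var, z, var_z):
    """Fuse belief N(mu, var) with a measurement z of variance var_z."""
    K = var / (var + var_z)                 # 1-D Kalman gain
    return mu + K * (z - mu), (1 - K) * var

def predict_1d(mu, var, u, var_u):
    """Shift the belief by control u and add motion noise var_u."""
    return mu + u, var + var_u

mu, var = update_1d(0.0, 1.0, 2.0, 1.0)     # equal confidence: average to 1.0
mu, var = predict_1d(mu, var, 1.0, 0.2)     # move right by 1, variance grows
```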

12. Linear Gaussian Systems: Initialization • Initial belief is normally distributed: bel(x0) = N(x0; μ0, Σ0)

13. Linear Gaussian Systems: Dynamics • Dynamics are a linear function of state and control plus additive noise: xt = At xt-1 + Bt ut + εt • p(xt | ut, xt-1) = N(xt; At xt-1 + Bt ut, Rt)

14. Linear Gaussian Systems: Dynamics

15. Linear Gaussian Systems: Observations • Observations are a linear function of the state plus additive noise: zt = Ct xt + δt • p(zt | xt) = N(zt; Ct xt, Qt)

16. Linear Gaussian Systems: Observations

17. Kalman Filter Algorithm • Algorithm Kalman_filter(μt-1, Σt-1, ut, zt): • Prediction: μ̄t = At μt-1 + Bt ut; Σ̄t = At Σt-1 Atᵀ + Rt • Correction: Kt = Σ̄t Ctᵀ (Ct Σ̄t Ctᵀ + Qt)⁻¹; μt = μ̄t + Kt (zt - Ct μ̄t); Σt = (I - Kt Ct) Σ̄t • Return μt, Σt
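One full prediction-correction step can be sketched in NumPy as follows (a minimal illustration; the 1-D example values at the end are mine, not from the slides):

```python
import numpy as np

def kalman_step(mu, Sigma, u, z, A, B, C, R, Q):
    # Prediction
    mu_bar = A @ mu + B @ u
    Sigma_bar = A @ Sigma @ A.T + R
    # Correction
    S = C @ Sigma_bar @ C.T + Q                  # innovation covariance
    K = Sigma_bar @ C.T @ np.linalg.inv(S)       # Kalman gain
    mu_new = mu_bar + K @ (z - C @ mu_bar)
    Sigma_new = (np.eye(len(mu)) - K @ C) @ Sigma_bar
    return mu_new, Sigma_new

# 1-D example: move right by u = 1, then observe z = 1.2.
mu, Sigma = kalman_step(
    mu=np.array([0.0]), Sigma=np.array([[1.0]]),
    u=np.array([1.0]), z=np.array([1.2]),
    A=np.array([[1.0]]), B=np.array([[1.0]]), C=np.array([[1.0]]),
    R=np.array([[0.1]]), Q=np.array([[0.5]]),
)
```

The gain K weighs the prediction against the measurement: a large measurement noise Q would leave the mean near the predicted 1.0, a small Q would pull it toward 1.2.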

18. The Prediction-Correction-Cycle: Prediction

19. The Prediction-Correction-Cycle: Correction

20. The Prediction-Correction-Cycle: Prediction and Correction

21. Kalman Filter Summary • Highly efficient: polynomial in measurement dimensionality k and state dimensionality n: O(k^2.376 + n^2) • Optimal for linear Gaussian systems! • Most robotics systems are nonlinear!

22. Nonlinear Dynamic Systems • Most realistic robotic problems involve nonlinear functions: xt = g(ut, xt-1) and zt = h(xt)

23. Linearity Assumption Revisited

24. Non-linear Function

25. EKF Linearization (1)

26. EKF Linearization (2)

27. EKF Linearization (3)

28. EKF Linearization: First Order Taylor Series Expansion • Prediction: g(ut, xt-1) ≈ g(ut, μt-1) + Gt (xt-1 - μt-1), with Gt = ∂g(ut, μt-1)/∂xt-1 • Correction: h(xt) ≈ h(μ̄t) + Ht (xt - μ̄t), with Ht = ∂h(μ̄t)/∂xt
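The quality of the first-order expansion can be checked numerically; the scalar function below (g = sin) and the evaluation points are an illustrative choice of mine, not from the slides.

```python
import numpy as np

mu = 0.5                       # linearization point (the current mean)
g = np.sin                     # illustrative nonlinear motion function
G = np.cos(mu)                 # its Jacobian (derivative) at the mean

x = 0.6                        # a state close to the mean
approx = g(mu) + G * (x - mu)  # first-order Taylor approximation
err = abs(g(x) - approx)       # small because x is near mu
```

The approximation degrades as x moves away from μ, which is exactly when the EKF can diverge (slide 40).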

29. EKF Algorithm • Extended_Kalman_filter(μt-1, Σt-1, ut, zt): • Prediction: μ̄t = g(ut, μt-1); Σ̄t = Gt Σt-1 Gtᵀ + Rt • Correction: Kt = Σ̄t Htᵀ (Ht Σ̄t Htᵀ + Qt)⁻¹; μt = μ̄t + Kt (zt - h(μ̄t)); Σt = (I - Kt Ht) Σ̄t • Return μt, Σt
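A sketch of one EKF step with user-supplied g, h and their Jacobians (the function names and the scalar example with a quadratic measurement h(x) = x² are my illustrative choices, not from the slides):

```python
import numpy as np

def ekf_step(mu, Sigma, u, z, g, G, h, H, R, Q):
    # Prediction through the nonlinear motion model
    mu_bar = g(u, mu)
    Gt = G(u, mu)                                # Jacobian of g at the mean
    Sigma_bar = Gt @ Sigma @ Gt.T + R
    # Correction through the linearized measurement model
    Ht = H(mu_bar)                               # Jacobian of h at mu_bar
    S = Ht @ Sigma_bar @ Ht.T + Q
    K = Sigma_bar @ Ht.T @ np.linalg.inv(S)
    mu_new = mu_bar + K @ (z - h(mu_bar))
    Sigma_new = (np.eye(len(mu)) - K @ Ht) @ Sigma_bar
    return mu_new, Sigma_new

# Scalar example: trivial motion, quadratic measurement z = x^2 + noise.
mu, Sigma = ekf_step(
    mu=np.array([1.0]), Sigma=np.array([[0.5]]),
    u=np.array([0.0]), z=np.array([1.44]),       # true state was 1.2
    g=lambda u, x: x + u,
    G=lambda u, x: np.array([[1.0]]),
    h=lambda x: x**2,
    H=lambda x: np.array([[2.0 * x[0]]]),
    R=np.array([[0.01]]), Q=np.array([[0.1]]),
)
```

The only difference from the linear Kalman step is that g and h are applied directly to the mean, while their Jacobians Gt and Ht stand in for At and Ct in the covariance updates.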

30. Localization “Using sensory information to locate the robot in its environment is the most fundamental problem to providing a mobile robot with autonomous capabilities.” [Cox ’91] • Given • Map of the environment. • Sequence of sensor measurements. • Wanted • Estimate of the robot’s position. • Problem classes • Position tracking • Global localization • Kidnapped robot problem (recovery)

31. Landmark-based Localization

32. EKF_localization(μt-1, Σt-1, ut, zt, m): Prediction: • Jacobian of g w.r.t. location • Jacobian of g w.r.t. control • Motion noise • Predicted mean • Predicted covariance

33. EKF_localization(μt-1, Σt-1, ut, zt, m): Correction: • Predicted measurement mean • Jacobian of h w.r.t. location • Predicted measurement covariance • Kalman gain • Updated mean • Updated covariance

34. EKF Prediction Step

35. EKF Observation Prediction Step

36. EKF Correction Step

37. Estimation Sequence (1)

38. Estimation Sequence (2)

39. Comparison to Ground Truth

40. EKF Summary • Highly efficient: polynomial in measurement dimensionality k and state dimensionality n: O(k^2.376 + n^2) • Not optimal! • Can diverge if nonlinearities are large! • Works surprisingly well even when all assumptions are violated!

41. UKF Sigma-Point Estimate (2): EKF vs. UKF

42. UKF Sigma-Point Estimate (3): EKF vs. UKF

43. Unscented Transform Intuition • Sample a set of 2N+1 points from an N-dimensional Gaussian distribution • For each sample, compute its mapping via f(·) • Recover a Gaussian approximation from the mapped samples

44. Unscented Transform • Sigma points: χ⁰ = μ; χⁱ = μ ± (√((N + λ) Σ))ᵢ for i = 1, …, 2N • Weights: w⁰ = λ/(N + λ); wⁱ = 1/(2(N + λ)) • Pass sigma points through the nonlinear function: Υⁱ = f(χⁱ) • Recover mean and covariance: μ' = Σᵢ wⁱ Υⁱ; Σ' = Σᵢ wⁱ (Υⁱ - μ')(Υⁱ - μ')ᵀ
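The steps above can be sketched as follows (λ = 2 and a single shared weight set are a simplified illustrative choice of mine; practical UKFs use separate mean and covariance weights with α and β parameters):

```python
import numpy as np

def sigma_points(mu, Sigma, lam=2.0):
    """2N+1 sigma points and weights for N(mu, Sigma)."""
    n = len(mu)
    L = np.linalg.cholesky((n + lam) * Sigma)    # matrix square root
    pts = [mu] + [mu + L[:, i] for i in range(n)] + [mu - L[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    w[0] = lam / (n + lam)
    return np.array(pts), w

def unscented_transform(pts, w, f):
    """Push sigma points through f and recover mean and covariance."""
    Y = np.array([f(p) for p in pts])
    mean = w @ Y
    diff = Y - mean
    cov = (w[:, None] * diff).T @ diff
    return mean, cov

# For a linear f the transform is exact: here f(x) = x0 + x1.
mu = np.array([1.0, 2.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
pts, w = sigma_points(mu, Sigma)
mean, cov = unscented_transform(pts, w, lambda x: np.array([x[0] + x[1]]))
```

Unlike the EKF, no Jacobian is needed: the nonlinearity is captured by where the mapped sigma points land.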

45. Unscented Prediction • Construct an (N+M)-dimensional Gaussian ⟨xt-1, ut-1⟩ from the previous state distribution and the controls • For each of the 2(N+M)+1 samples ⟨x, u⟩(i), compute its mapping via g(x, u) • Recover a Gaussian approximation from the mapped samples

46. UKF_localization(μt-1, Σt-1, ut, zt, m): Prediction: • Motion noise • Measurement noise • Augmented state mean • Augmented covariance • Sigma points • Prediction of sigma points • Predicted mean • Predicted covariance

47. Unscented Correction • Sample from the predicted state and the observation noise to obtain the expected measurement h(xt) • Compute the cross-covariance matrix of measurements and states, and perform a Kalman update

48. UKF_localization(μt-1, Σt-1, ut, zt, m): Correction: • Measurement sigma points • Predicted measurement mean • Predicted measurement covariance • Cross-covariance • Kalman gain • Updated mean • Updated covariance

49. EKF_localization(μt-1, Σt-1, ut, zt, m): Correction: • Predicted measurement mean • Jacobian of h w.r.t. location • Predicted measurement covariance • Kalman gain • Updated mean • Updated covariance
