
The “  ” Paige in Kalman Filtering


Presentation Transcript


  1. The “” Paige in Kalman Filtering K. E. Schubert

  2. Kalman’s Interest • State Space (Matrix Representation) • Discrete Time (difference equations) • Optimal Control • Starting at x0, go to xG • Minimize or maximize some quantity (time, energy, etc.)
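
A minimal sketch of the state-space, discrete-time view this slide lists: a difference equation driven from x0 toward a goal xG while a quadratic cost is accumulated. The matrices A and B, the feedback law, and the cost weights are made-up placeholders, not values from the talk.

```python
import numpy as np

# Hypothetical 2-state discrete-time model x_{k+1} = A x_k + B u_k.
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])        # state transition
B = np.array([[0.0],
              [0.1]])             # control input

x = np.array([1.0, 0.0])          # start at x0
x_goal = np.array([0.0, 0.0])     # target state xG

cost = 0.0
for k in range(50):
    # A simple (not optimal) feedback law, just to drive x toward x_goal.
    u = np.array([-0.5 * (x[0] - x_goal[0]) - 0.5 * x[1]])
    cost += float(x @ x + u @ u)  # quadratic cost to be minimized
    x = A @ x + B @ u             # difference equation: advance one step

print("final state:", x, "accumulated cost:", cost)
```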

  3. Why Filtering? • State (xi) is not directly known • Must observe it through a minimal set of measurements • Observer equation • Want to reconstruct the state vector

  4. Random Variables • y = ax + b • Process and observation noise • Independent, white Gaussian noise
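
The sketch below simulates the setup the last two slides describe: the state evolves with process noise and is seen only through the observer equation with observation noise, both independent white Gaussian. All matrices (A, C, Q, R) are illustrative placeholders, not the systems used in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder model: only the structure matches the slides.
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])        # state transition
C = np.array([[1.0, 0.0]])        # observer equation: y_i = C x_i + v_i
Q = 0.01 * np.eye(2)              # process-noise covariance (Rw)
R = np.array([[0.1]])             # observation-noise covariance (Rv)

x = np.array([1.0, 0.0])
ys = []
for i in range(100):
    w = rng.multivariate_normal(np.zeros(2), Q)  # white Gaussian process noise
    v = rng.multivariate_normal(np.zeros(1), R)  # white Gaussian observation noise
    x = A @ x + w                 # the state itself is never seen directly...
    ys.append(C @ x + v)          # ...only the noisy measurement y_i
```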

  5. Complete Problem • Control and estimation are independent • Concerned only with observer • Obtain estimate:

  6. Predictor-Corrector • Predict (Time Update) → Correct (Measurement Update) with the incoming measurements

  7. To Err Is Kalman! • How accurate is the estimate? • What is its distribution?

  8. Predictor-Corrector • Predict (Time Update) → Correct (Measurement Update) with the incoming measurements

  9. Predict • The state is not a random variable; you just don’t know it • Eigenvalues must be < 1 (for convergence) • The distribution does affect the error covariance
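
For reference, here is the textbook time-update (predict) step this slide annotates; a generic sketch, not the talk's specific implementation.

```python
import numpy as np

def predict(x_hat, P, A, Q):
    """Time update: propagate the estimate and its error covariance one step.

    Generic textbook form; Q is the process-noise covariance (Rw).
    """
    x_pred = A @ x_hat            # predicted (a priori) state estimate
    P_pred = A @ P @ A.T + Q      # predicted error covariance
    return x_pred, P_pred
```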

  10. Correct • Kalman gain • Innovations (what’s new) • Oblique projection
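
And the matching measurement-update (correct) step, showing the innovation and the Kalman gain. Again a generic textbook sketch; the talk's oblique-projection view yields the same gain.

```python
import numpy as np

def correct(x_pred, P_pred, y, C, R):
    """Measurement update: fold the new measurement into the prediction."""
    innovation = y - C @ x_pred                 # "what's new" in the measurement
    S = C @ P_pred @ C.T + R                    # innovation covariance
    K = P_pred @ C.T @ np.linalg.inv(S)         # Kalman gain
    x_hat = x_pred + K @ innovation             # corrected (a posteriori) estimate
    P = (np.eye(len(x_pred)) - K @ C) @ P_pred  # updated error covariance
    return x_hat, P
```

Alternating predict and correct over the measurement sequence is exactly the cycle pictured on slides 6 and 8.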

  11. System 1 (Basic Example) • x ∈ R^2 • Companion form • Numerics and stability are nice but not perfect
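
For readers unfamiliar with companion form, a small helper that builds such a matrix from polynomial coefficients. The coefficients below are hypothetical; the transcript does not give System 1's actual values.

```python
import numpy as np

def companion(coeffs):
    """Companion-form matrix for the monic polynomial
    x^n + coeffs[0] x^(n-1) + ... + coeffs[n-1]."""
    n = len(coeffs)
    A = np.zeros((n, n))
    A[1:, :-1] = np.eye(n - 1)     # ones on the sub-diagonal
    A[0, :] = -np.asarray(coeffs)  # first row carries the negated coefficients
    return A

A1 = companion([0.5, -0.06])       # hypothetical 2x2 (x in R^2) example
print(np.linalg.eigvals(A1))       # roots of the characteristic polynomial (0.1 and -0.6)
```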

  12. System 1

  13. System 1

  14. System 1

  15. System 1

  16. System 1 (Again) • x ∈ R^2 • Companion form • Numerics and stability are nice but not perfect

  17. System 1

  18. System 1

  19. System 1

  20. System 1

  21. System 2 (Stiffness) • x ∈ R^2 • Large eigenvalue spread • Condition number around 10^9 • Large sampling time (big steps)
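
A quick way to see how a condition number near 10^9 arises from a large eigenvalue spread; the diagonal matrix below is a stand-in chosen for illustration, not the actual System 2.

```python
import numpy as np

# Stand-in 2x2 system with a huge eigenvalue spread (not the actual System 2).
A = np.diag([1.0, 1.0e-9])

print(np.linalg.eigvals(A))   # eigenvalues spread over nine orders of magnitude
print(np.linalg.cond(A))      # condition number ~1e9 (largest/smallest singular value)
```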

  22. System 2

  23. System 2

  24. Trouble in Paradise • Inversion in the Kalman gain is slow and generally not stable • A is usually in companion form, which is numerically unstable (Laub) • Covariances are symmetric positive definite • Finite-precision calculations cause P to become unsymmetric and then lose positivity
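
One standard guard against the symmetry and positivity loss mentioned here is the Joseph-form update, sketched below; it is a common remedy, shown for contrast with the square-root approach the talk pursues next.

```python
import numpy as np

def correct_joseph(x_pred, P_pred, y, C, R):
    """Measurement update in Joseph form.

    Keeps P symmetric positive (semi)definite in finite precision.
    """
    S = C @ P_pred @ C.T + R
    K = P_pred @ C.T @ np.linalg.inv(S)
    x_hat = x_pred + K @ (y - C @ x_pred)
    I_KC = np.eye(len(x_pred)) - K @ C
    P = I_KC @ P_pred @ I_KC.T + K @ R @ K.T   # symmetric by construction
    P = 0.5 * (P + P.T)                        # explicit re-symmetrization as an extra safeguard
    return x_hat, P
```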

  25. Square Root Filters • Kailath suggested propagating the square root rather than the whole covariance • Not really a square root, actually a Cholesky factor • r^T r = R • Use on Rw, Rv, P
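
A minimal sketch of propagating a Cholesky factor instead of the covariance, using the generic QR "array" form of the time update with the slide's r^T r = R convention. This illustrates the square-root idea in general, not Paige's specific filter.

```python
import numpy as np

def sqrt_time_update(S, A, Sq):
    """Propagate a Cholesky factor instead of the full covariance.

    S and Sq are upper-triangular factors with S.T @ S = P and Sq.T @ Sq = Q
    (the r^T r = R convention from the slide).  One QR factorization of a
    stacked array replaces P <- A P A^T + Q, so the covariance is never
    formed explicitly and symmetry/positivity cannot be lost.
    """
    M = np.vstack([S @ A.T,       # contributes A P A^T = (S A^T)^T (S A^T)
                   Sq])           # contributes Q = Sq^T Sq
    _, S_new = np.linalg.qr(M)    # upper-triangular factor of the stacked array
    return S_new                  # S_new.T @ S_new equals A P A^T + Q
```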

  26. Our Square Roots

  27. State Error

  28. Observations

  29. Measurement Equation

  30. Measurement Update • Then, by definition

  31. Updating for Free?

  32. Error Part 2

  33. Time Updating

  34. Paige’s Filter

  35. System 3 (Fun Problem) • x ∈ R^20 • A known difficult matrix, scaled to be stable

  36. System 3

  37. System 3

  38. System 3

  39. System 3

  40. Conclusions • Called Paige’s filter, but it was really developed by Paige and Saunders • O(n^3) and about 60% faster than the regular square-root filter • Current interests: faster algorithms, special structures, robustness
