Explore the fundamentals of Kalman Filtering, including state space representation, optimal control, and observer equations. Learn the importance of measurements in reconstructing state vectors and the innovations in Kalman Gain. Delve into system examples, challenges in stability, and the advancements in Square Root Filters. Discover Paige's involvement and the latest interests in Kalman Filtering techniques.
The "Paige" in Kalman Filtering
K. E. Schubert
Kalman's Interest
• State space (matrix representation)
• Discrete time (difference equations)
• Optimal control: starting at x_0, go to x_G
• Minimize or maximize some quantity (time, energy, etc.)
Why Filtering?
• The state x_i is not directly known
• It must be observed through measurements
• Observer equation (see below)
• Want to reconstruct the state vector
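The model behind these bullets, written in standard linear state-space notation (the symbols A, B, C, w_i, v_i are the conventional choices assumed here, not necessarily the deck's own), is

  x_{i+1} = A x_i + B u_i + w_i    (state difference equation)
  y_i = C x_i + v_i                (observer equation)

where w_i and v_i are the process and observation noise discussed on the next slide.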
Random Variables
• y = ax + b
• Process and observation noise
• Independent, white Gaussian noise
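Assuming the usual white-noise statistics (the covariance names R_w and R_v match the square-root filter slide later in the deck):

  E[w_i] = 0,  E[v_i] = 0,
  E[w_i w_j^T] = R_w δ_{ij},  E[v_i v_j^T] = R_v δ_{ij},  E[w_i v_j^T] = 0.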
Complete Problem
• Control and estimation are independent
• Concerned only with the observer
• Obtain the estimate (see below)
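A standard way to write the estimate being sought (the notation is assumed, not the author's) is the conditional mean

  \hat{x}_{i|i} = E[ x_i | y_0, y_1, ..., y_i ],

with \hat{x}_{i|i-1} denoting the corresponding one-step prediction.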
Predictor-Corrector
• Predict (Time Update)
• Correct (Measurement Update): incorporates the measurements
To Err Is Kalman!
• How accurate is the estimate?
• What is its distribution?
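In the same assumed notation, accuracy is measured by the error covariance

  P_{i|i} = E[ (x_i - \hat{x}_{i|i})(x_i - \hat{x}_{i|i})^T ],

and for the Gaussian model above the estimation error is zero-mean Gaussian with exactly this covariance, which answers the distribution question as well.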
Predict (Time Update)
• The state is not a random variable; you simply don't know it
• Eigenvalues must be less than 1 in magnitude for convergence
• The distribution does affect the error covariance
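A standard form of the time update, under the model assumed above:

  \hat{x}_{i+1|i} = A \hat{x}_{i|i} + B u_i
  P_{i+1|i} = A P_{i|i} A^T + R_w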
Correct (Measurement Update)
• Kalman gain
• Innovations (What's New)
• Oblique projection
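A standard form of the correction step, under the same assumptions: with innovation e_i = y_i - C \hat{x}_{i|i-1} and innovation covariance R_{e,i} = C P_{i|i-1} C^T + R_v,

  K_i = P_{i|i-1} C^T R_{e,i}^{-1}            (Kalman gain)
  \hat{x}_{i|i} = \hat{x}_{i|i-1} + K_i e_i
  P_{i|i} = (I - K_i C) P_{i|i-1}

A minimal NumPy sketch of one correct-then-predict cycle, assuming the linear model above; every name in it (kalman_step, the argument order) is illustrative rather than taken from the deck:

```python
import numpy as np

def kalman_step(x_pred, P_pred, y, A, B, u, C, Rw, Rv):
    """One conventional correct-then-predict cycle.

    x_pred, P_pred : predicted state and covariance at time i
    y              : measurement at time i
    Returns the filtered quantities at time i and the predictions for i+1.
    """
    # Correct (measurement update)
    e  = y - C @ x_pred                      # innovation: what's new in y
    Re = C @ P_pred @ C.T + Rv               # innovation covariance
    K  = P_pred @ C.T @ np.linalg.inv(Re)    # Kalman gain (explicit inverse: the slow,
                                             # potentially unstable step criticized later)
    x_filt = x_pred + K @ e
    P_filt = (np.eye(x_pred.size) - K @ C) @ P_pred

    # Predict (time update)
    x_next = A @ x_filt + B @ u
    P_next = A @ P_filt @ A.T + Rw
    return x_filt, P_filt, x_next, P_next
```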
System 1 (Basic Example)
• x ∈ ℝ^2
• Companion form
• Numerics and stability: nice, but not perfect
System 1 (Again)
• x ∈ ℝ^2
• Companion form
• Numerics and stability: nice, but not perfect
System 2 (Stiffness)
• x ∈ ℝ^2
• Large eigenvalue spread: condition number around 10^9
• Large sampling time (big steps)
Trouble in Paradise
• The inversion in the Kalman gain is slow and generally not stable
• A is usually in companion form, which is numerically unstable (Laub)
• Covariances are symmetric positive definite
• Calculations cause P to become unsymmetric and then lose positivity
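Written out in the same assumed notation, the conventional covariance update is the difference

  P_{i|i} = P_{i|i-1} - K_i C P_{i|i-1},

and it is this subtraction, carried out in finite precision, that lets the computed P drift from symmetry and, when P is ill-conditioned, lose positive definiteness.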
Square Root Filters
• Kailath suggested propagating the square root rather than the whole covariance
• Not really a square root; actually a Cholesky factor: r^T r = R
• Used for R_w, R_v, and P
Measurement Update
• Then, by definition:
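A standard array form of the square-root measurement update (an assumption here; it uses the common convention P = P^{1/2}(P^{1/2})^T, the transpose orientation of the slide's r^T r = R): choose any orthogonal Θ that lower-triangularizes the pre-array,

  [ R_v^{1/2}   C P_{i|i-1}^{1/2} ]        [ R_{e,i}^{1/2}   0            ]
  [ 0           P_{i|i-1}^{1/2}   ]  Θ  =  [ \bar{K}_i       P_{i|i}^{1/2} ]

Multiplying each side by its transpose shows R_{e,i} = C P_{i|i-1} C^T + R_v, \bar{K}_i = P_{i|i-1} C^T R_{e,i}^{-T/2}, and that the (2,2) block is a factor of P_{i|i}; the gain is K_i = \bar{K}_i R_{e,i}^{-1/2}. The updated covariance is produced directly as a triangular factor, so symmetry and positivity are preserved by construction.

A minimal NumPy sketch of this update, using a QR factorization to apply Θ implicitly; all names are illustrative, and this is one common realization of a square-root filter rather than necessarily the deck's own algorithm:

```python
import numpy as np

def sqrt_measurement_update(x_pred, S_pred, y, C, Rv_sqrt):
    """Square-root (Cholesky-factor) measurement update via QR.

    S_pred  : factor of the predicted covariance, P_pred = S_pred @ S_pred.T
    Rv_sqrt : factor of the measurement noise,    Rv     = Rv_sqrt @ Rv_sqrt.T
    Returns the filtered state and a factor S_filt with P_filt = S_filt @ S_filt.T.
    """
    n, m = x_pred.size, y.size

    # Pre-array [[Rv^1/2, C S_pred], [0, S_pred]]
    pre = np.block([
        [Rv_sqrt,          C @ S_pred],
        [np.zeros((n, m)), S_pred],
    ])

    # QR of the transpose lower-triangularizes the pre-array:
    # pre.T = Q R  =>  pre @ Q = R.T, with Q orthogonal and R.T lower triangular.
    _, R = np.linalg.qr(pre.T)
    post = R.T

    Re_sqrt = post[:m, :m]    # factor of the innovation covariance
    Kbar    = post[m:, :m]    # P_pred C^T Re^{-T/2}
    S_filt  = post[m:, m:]    # factor of the filtered covariance (stays triangular)

    # K = Kbar Re^{-1/2}, so K e = Kbar (Re^{1/2})^{-1} e
    e = y - C @ x_pred
    x_filt = x_pred + Kbar @ np.linalg.solve(Re_sqrt, e)
    return x_filt, S_filt
```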
System 3 (Fun Problem)
• x ∈ ℝ^20
• A known difficult matrix, scaled to be stable
Conclusions
• Called Paige's filter, but really Paige and Saunders developed it
• O(n^3), and about 60% faster than the regular square-root filter
• Current interests: faster algorithms, special structures, robustness