
ASEN 5070: Statistical Orbit Determination I Fall 2013 Professor Brandon A. Jones






Presentation Transcript


  1. ASEN 5070: Statistical Orbit Determination I Fall 2013 Professor Brandon A. Jones Professor George H. Born Lecture 35: Uncertainty Quantification and Smoothing

  2. Announcements
  • Homework 11 due on Friday, Dec. 6
  • Lecture quiz due by 5pm on Wednesday after Thanksgiving

  3. Uncertainty Quantification

  4. The Probability Ellipsoid
  [Figure: views of the error ellipsoid, shown with view(-37.5°, 0°) and the standard MATLAB view]
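As a concrete illustration of how an error ellipsoid is constructed from a covariance matrix (not part of the original slides), the sketch below uses an eigendecomposition: the ellipsoid's axes lie along the eigenvectors of P, with semi-axis lengths proportional to the square roots of the eigenvalues. The covariance values here are made up for illustration.

```python
import numpy as np

def error_ellipsoid(P, sigma=3.0, n=20):
    """Points on the sigma-scaled error ellipsoid of a 3x3 covariance P.

    The axes lie along the eigenvectors of P, with semi-axis lengths
    sigma * sqrt(eigenvalue); returns a (3, n*n) array of surface points.
    """
    vals, vecs = np.linalg.eigh(P)            # symmetric eigendecomposition
    u = np.linspace(0.0, 2.0 * np.pi, n)
    v = np.linspace(0.0, np.pi, n)
    # Sample a unit sphere, scale by the semi-axes, rotate into the P frame
    x = np.outer(np.cos(u), np.sin(v))
    y = np.outer(np.sin(u), np.sin(v))
    z = np.outer(np.ones_like(u), np.cos(v))
    sphere = np.stack([x.ravel(), y.ravel(), z.ravel()])
    return vecs @ np.diag(sigma * np.sqrt(vals)) @ sphere

P = np.diag([4.0, 1.0, 0.25])   # made-up position covariance, km^2
pts = error_ellipsoid(P)        # feed to any 3-D surface plot to visualize
```

The returned points can be reshaped and passed to a 3-D surface plot to reproduce views like the one on the slide.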

  5. Why is the ellipsoid useful?

  6. Why is the ellipsoid useful?

  7. Long-Duration Propagation

  8. Hot Topic of Research
  • Problem first identified in 1996: Junkins, et al., "Non-Gaussian Error Propagation in Orbital Mechanics," Journal of the Astronautical Sciences, Vol. 44, No. 4, 1996, pp. 541–563
  • Early studies were motivated by the need to perform regular collision risk assessment for the ISS
  • Multiple methods exist for nonlinear propagation:
    • Monte Carlo
    • State transition tensors (STTs)
    • Gaussian mixtures
    • Polynomial chaos / separation of variables
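As a minimal illustration of the Monte Carlo approach listed above (not from the lecture; the dynamics and numbers are hypothetical), the sketch below pushes Gaussian samples in semi-major axis through Kepler's mean-motion relation and shows that the mapped distribution is skewed, i.e., non-Gaussian.

```python
import numpy as np

rng = np.random.default_rng(0)

# Gaussian uncertainty in semi-major axis a (hypothetical units and sigma)
a0 = rng.normal(1.0, 0.05, size=100_000)

# Nonlinear map: after time t the mean anomaly is M = t * a**(-3/2)
# (mean motion n ~ a**(-3/2) by Kepler's third law; constants set to 1)
t = 500.0
M = t * a0 ** (-1.5)

# The Gaussian input maps to a skewed, non-Gaussian output distribution
skew = np.mean(((M - M.mean()) / M.std()) ** 3)
print(f"sample skewness: {skew:.3f}")
```

A nonzero skewness is one simple symptom of the non-Gaussian error propagation that Junkins et al. identified; a full orbit propagation produces the familiar "banana"-shaped distributions.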

  9. State Transition Tensors
  • The STM represents a 2nd-order tensor, generated via the first derivative of the force model
  • Accuracy is improved with the inclusion of higher-order effects, i.e., keeping higher-order terms of the Taylor expansion
  • An STT maintains higher-order derivatives for mapping of the a priori p.d.f.
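A one-dimensional sketch of the idea (illustrative only; the flow map f is hypothetical, and the "tensors" are approximated by finite differences): keeping the second-order Taylor term reduces the mapping error relative to the STM-only mapping.

```python
import numpy as np

def f(x):
    """Hypothetical 1-D nonlinear flow map standing in for orbit propagation."""
    return x + np.sin(x)

x0, dx0 = 1.0, 0.3        # reference state and initial deviation
h = 1e-4                  # finite-difference step

# First-order term (the STM) and the second-order STT term, via differences
phi = (f(x0 + h) - f(x0 - h)) / (2.0 * h)
psi = (f(x0 + h) - 2.0 * f(x0) + f(x0 - h)) / h ** 2

truth = f(x0 + dx0) - f(x0)             # exact mapped deviation
first = phi * dx0                       # STM-only (linear) mapping
second = first + 0.5 * psi * dx0 ** 2   # keep the 2nd-order Taylor term

print(abs(truth - first), abs(truth - second))
```

In an orbit problem, phi becomes the STM and psi a third-order array contracted twice with the deviation vector; the structure of the comparison is the same.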

  10. Example STT Propagation Fujimoto, et al., 2011

  11. Gaussian Mixtures Horwood, et al., JGCD, Nov.-Dec., 2011

  12. Gaussian Mixtures • Under what constraints is this a probability density function?
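The standard answer to the slide's question: a Gaussian mixture is a valid probability density function when every weight is nonnegative and the weights sum to one (each Gaussian component is already a p.d.f.). The sketch below, with made-up mixture parameters, checks this numerically.

```python
import numpy as np

# A Gaussian mixture p(x) = sum_i w_i * N(x; mu_i, sigma_i^2) is a valid
# p.d.f. when every w_i >= 0 and the weights sum to one.
weights = np.array([0.2, 0.5, 0.3])   # hypothetical mixture parameters
means = np.array([-1.0, 0.0, 2.0])
sigmas = np.array([0.5, 1.0, 0.8])

def gm_pdf(x):
    """Evaluate the mixture density at the points in x."""
    comps = np.exp(-0.5 * ((x[:, None] - means) / sigmas) ** 2)
    comps /= sigmas * np.sqrt(2.0 * np.pi)
    return comps @ weights

# Numerical sanity check: the mixture should integrate to ~1
x = np.linspace(-10.0, 10.0, 20001)
dx = x[1] - x[0]
area = np.sum(gm_pdf(x)) * dx
print(area)
```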

  13. Polynomial Chaos (PC)
  • Based on Wiener's Homogeneous Chaos (1938)
  • Generates an approximate solution to a stochastic ODE
  • More commonly used in structures, CFD, applied physics, and other fields
  • We are applying it to orbital mechanics
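A minimal sketch of a polynomial chaos expansion for a scalar function of a standard normal variable (illustrative; f = exp is a stand-in for a model output and is not from the lecture): the coefficients come from Gauss-Hermite quadrature, and the mean and variance then follow directly from the coefficients by orthogonality.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

# Expand y = f(xi), xi ~ N(0, 1), in probabilists' Hermite polynomials:
#   y ~= sum_k c_k He_k(xi),  with  c_k = E[f(xi) He_k(xi)] / k!
f = np.exp          # stand-in for a model output; hypothetical choice
order = 8

nodes, wts = He.hermegauss(40)      # Gauss-Hermite rule, weight exp(-x^2/2)
wts = wts / np.sqrt(2.0 * np.pi)    # normalize to the standard normal density

coeffs = np.array([
    np.sum(wts * f(nodes) * He.hermeval(nodes, np.eye(order + 1)[k]))
    / math.factorial(k)
    for k in range(order + 1)
])

# Orthogonality (E[He_j He_k] = k! delta_jk) gives the moments directly:
mean_pce = coeffs[0]
var_pce = sum(math.factorial(k) * coeffs[k] ** 2 for k in range(1, order + 1))
print(mean_pce, var_pce)   # close to the exact e**0.5 and e**2 - e
```

In the orbit-propagation setting, f would be the ODE solution at some epoch, which is why PC needs only on the order of 10²s of ODE evaluations rather than Monte Carlo's 10⁵.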

  14. Example PCE Result for a Molniya Orbit
  • Use a polynomial surrogate to approximate the p.d.f.
  • PC requires ~100–200 ODE evaluations
  • Monte Carlo requires more than 100,000 evaluations
  Image: Jones, et al., 2013

  15. Forthcoming Use of PC in Spacecraft Operations
  • Part of the NASA/GSFC-based navigation team for the Magnetospheric Multiscale (MMS) mission
  • Leveraging CU-developed methods and applications of uncertainty quantification
  • Applying polynomial chaos (PC) to the estimation of collision probabilities, including post-maneuver uncertainty quantification

  16. Uncertainty Quantification (UQ)
  • For more information on general UQ: ASEN 6519 – Uncertainty Quantification, Spring 2014; e-mail the instructor (Alireza.Doostan@colorado.edu) about prerequisites
  • For more details on use in astrodynamics: ASEN 6519 – Orbital Debris, Fall 2014 (planned)

  17. Homework 11

  18. Homework 11
  • Leverage code from HW10
  • New data set generated with a different force model; otherwise, same format, data noise, etc.
  • Process the observations in your existing filter; do not add J3 to your filter model!
  • Observe the effects of such errors on the OD
  • Add process noise to improve state estimation accuracy
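For the process-noise step, a common SNC form is sketched below (an illustration, not the assignment's required formulation): unmodeled accelerations are treated as white noise of variance sigma2, a filter tuning parameter, acting on a [position; velocity] state over the step dt.

```python
import numpy as np

def snc_Q(dt, sigma2, dim=3):
    """SNC process-noise matrix for a [position; velocity] state (sketch).

    Treats unmodeled accelerations (e.g., a neglected J3) as white noise
    of variance sigma2, integrated over the time step dt.
    """
    I = np.eye(dim)
    return sigma2 * np.block([
        [dt ** 3 / 3.0 * I, dt ** 2 / 2.0 * I],
        [dt ** 2 / 2.0 * I, dt * I],
    ])

# The CKF time update then becomes: P_bar = Phi @ P @ Phi.T + snc_Q(dt, sigma2)
Q = snc_Q(10.0, 1e-12)
print(Q.shape)
```

Tuning sigma2 trades off trusting the (wrong) force model against trusting the observations; too small and the J3 error dominates, too large and the estimate chases noise.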

  19. Sample Post-Fit Residual Plot

  20. Sample State Error Plot

  21. Fixed Interval Smoothing

  22. Motivation
  • The batch processor provides an estimate based on the full span of data
  • Without process noise, the sequential processors are equivalent to the batch at the end of the fit span; when including process noise, we lose this equivalence between the batch and any of the sequential processors
  • Is there some way to update the estimated state using information gained from future observations?

  23. Smoothing
  • Smoothing is a method by which a state estimate (and optionally, the covariance) may be constructed using observations both before and after the epoch
  • Step 1: Process all observations using a CKF with process noise (SNC, DMC, etc.)
  • Step 2: Start with the last observation processed and smooth backward through the observations

  24. Notation
  • As presented in the book, the most common source of confusion for the smoothing algorithm is the notation
  [Figure: an annotated estimate symbol, with one index marking the time of the current estimate and another marking that the value/vector/matrix is based on observations up to and including a given time]

  25. Smoothing visualization
  • Process observations forward in time: [figure]
  • If you were to process them backward in time (given everything needed to do that): [figure]

  26. Smoothing visualization
  • Process observations forward in time: [figure]
  • If you were to process them backward in time (given everything needed to do that): [figure]

  27. Smoothing visualization
  • Smoothing does not actually combine the forward and backward passes, but thinking of it that way helps conceptualize what smoothing does
  • Smoothing results in a much more consistent solution over time, and it yields an optimal estimate that uses all of the observations

  28. Smoothing
  • Caveats:
    • If you use process noise or some other way to increase the covariance, the optimal estimate at any time really only pays attention to nearby observations
    • While this is good, it also means smoothing doesn't always have a big effect
    • Smoothing shouldn't remove the white noise found on the signals; it's not a "cleaning" function, it's a "use all the data for your estimate" function

  29. Smoothing of State Estimate
  • First, we use: [equation not transcribed]
  • If Q = 0: [equation not transcribed]
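The equations on this slide were not captured in the transcript. A hedged reconstruction of the standard fixed-interval smoother relation, in notation consistent with the slide that follows (subscript = time of the estimate, superscript = last observation used), is:

```latex
\hat{\mathbf{x}}_k^{\,\ell} = \hat{\mathbf{x}}_k
  + S_k \left( \hat{\mathbf{x}}_{k+1}^{\,\ell} - \bar{\mathbf{x}}_{k+1} \right),
\qquad
S_k = P_k \, \Phi^T(t_{k+1}, t_k) \, \bar{P}_{k+1}^{-1}
```

If $Q = 0$, then $\bar{P}_{k+1} = \Phi P_k \Phi^T$, so $S_k = \Phi^{-1}(t_{k+1}, t_k) = \Phi(t_k, t_{k+1})$ and the smoothed state reduces to the mapped estimate, $\hat{\mathbf{x}}_k^{\,\ell} = \Phi(t_k, t_{k+1})\, \hat{\mathbf{x}}_{k+1}^{\,\ell}$, consistent with the batch/sequential equivalence noted in the motivation slide.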

  30. Smoothing of State Estimate
  • Hence, in the CKF, we store: [list not transcribed]
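A sketch of the backward pass over a completed forward CKF run, for both the state and (optionally) the covariance. The function and variable names are illustrative, and it assumes the forward pass stored the filtered and predicted states/covariances and the STMs at each step.

```python
import numpy as np

def smooth(x_hats, Ps, x_bars, P_bars, Phis):
    """Fixed-interval smoothing: backward pass over a forward CKF run (sketch).

    Quantities stored during the forward pass (names are illustrative):
      x_hats[k], Ps[k]     -- filtered estimate and covariance at t_k
      x_bars[k], P_bars[k] -- predicted (time-updated) values at t_{k+1}
      Phis[k]              -- state transition matrix Phi(t_{k+1}, t_k)
    """
    n = len(x_hats)
    x_sm, P_sm = [None] * n, [None] * n
    x_sm[-1], P_sm[-1] = x_hats[-1], Ps[-1]   # initialize at the last epoch
    for k in range(n - 2, -1, -1):            # sweep backward in time
        S = Ps[k] @ Phis[k].T @ np.linalg.inv(P_bars[k])
        x_sm[k] = x_hats[k] + S @ (x_sm[k + 1] - x_bars[k])
        P_sm[k] = Ps[k] + S @ (P_sm[k + 1] - P_bars[k]) @ S.T
    return x_sm, P_sm
```

The lists named here are exactly the quantities the slide says to store during the CKF run; no observations are reprocessed in the backward pass.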

  31. Smoothing of Covariance • Optionally, we may smooth the state error covariance matrix
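The covariance-smoothing equation was not captured in the transcript; the standard fixed-interval form, using the same gain $S_k$ as the state recursion, is (hedged reconstruction):

```latex
P_k^{\,\ell} = P_k + S_k \left( P_{k+1}^{\,\ell} - \bar{P}_{k+1} \right) S_k^T
```

Because $P_{k+1}^{\,\ell} \preceq \bar{P}_{k+1}$, the smoothed covariance is never larger than the filtered one, which is why smoothing yields the more consistent solution noted earlier.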
