
Retrieval Theory


Presentation Transcript


  1. Retrieval Theory Mar 23, 2008 Vijay Natraj

  2. The Inverse Modeling Problem
  Optimize the values of an ensemble of variables (state vector x) using observations. Two sources of information are combined via Bayes’ theorem:
  • A priori estimate xa, with error ea
  • Measurement vector y, related to the state by the forward model y = F(x) + e
  The result is the “MAP solution”, also called the “optimal estimate” or the “retrieval”.
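  The toy setup below is a minimal numerical sketch of these ingredients, assuming a linear forward model F(x) = Kx; the dimensions, values, and NumPy usage are illustrative and not from the slides:

```python
# Minimal sketch of the inverse-problem ingredients: a priori xa, forward
# model F(x) = K x, and a noisy measurement y = F(x) + e.
import numpy as np

rng = np.random.default_rng(0)

n, m = 3, 5                          # state vector size, measurement vector size
K = rng.normal(size=(m, n))          # toy linear forward-model sensitivity
x_true = np.array([1.0, 2.0, 0.5])   # "true" state (unknown in practice)

x_a = np.array([0.8, 2.2, 0.4])      # a priori estimate xa
e = rng.normal(scale=0.05, size=m)   # measurement/model noise e
y = K @ x_true + e                   # measurement vector y = F(x) + e

print("a priori xa:  ", x_a)
print("measurement y:", y)
```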

  3. Applications for Atmospheric Concentration
  • Retrieve atmospheric concentrations (x) from observed atmospheric radiances (y), using a radiative transfer (RT) model as forward model
  • Invert sources (x) from observed atmospheric concentrations (y), using a chemical transport model (CTM) as forward model
  • Construct a continuous field of concentrations (x) by assimilation of sparse observations (y), using a forecast model (initial-value CTM) as forward model

  4. Optimal Estimation
  • Forward problem typically not linear
  • No analytical solution to express the state vector in terms of the measurement vector
  • Approximate solution by linearizing the forward model about a reference state x0: y - F(x0) ≈ K (x - x0)
  • K = ∂F/∂x: weighting function (Jacobian) matrix
  • K describes the sensitivity of the measurement to the state (see the finite-difference sketch below)
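  As a sketch of that linearization step (not part of the original slides), the Jacobian K can be approximated by finite differences of the forward model; the forward_model and jacobian functions here are illustrative stand-ins, not an RT model:

```python
# Minimal sketch: approximate the weighting function (Jacobian) matrix K by
# finite differences of a toy nonlinear forward model about a reference x0.
import numpy as np

def forward_model(x):
    """Toy nonlinear forward model F(x), standing in for an RT model."""
    return np.array([x[0] + 0.5 * x[1] ** 2,
                     np.exp(0.3 * x[0]) * x[2],
                     x[0] * x[1] + x[2]])

def jacobian(F, x0, dx=1e-6):
    """Finite-difference Jacobian K[i, j] = dF_i / dx_j evaluated at x0."""
    y0 = F(x0)
    K = np.zeros((y0.size, x0.size))
    for j in range(x0.size):
        x_pert = x0.copy()
        x_pert[j] += dx
        K[:, j] = (F(x_pert) - y0) / dx
    return K

x0 = np.array([1.0, 2.0, 0.5])   # reference state
K = jacobian(forward_model, x0)
print(K)   # column j: sensitivity of each measurement to state element j
```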

  5. Optimal Estimation
  Causes of non-unique solutions:
  • m > n (more measurements than unknowns)
  • Amplification of measurement and/or model noise
  • Poor sensitivity of measured radiances to one or more state vector elements (ill-posed problem)
  Need to use additional constraints to select an acceptable solution (e.g., the a priori)

  6. Bayes’ Theorem
  P(x, y) dx dy is the probability that x lies in [x, x + dx] and y lies in [y, y + dy]. Writing the joint pdf both ways, P(x, y) = P(y|x) P(x) = P(x|y) P(y), gives Bayes’ theorem:
  P(x|y) = P(y|x) P(x) / P(y)
  where P(x) is the a priori pdf, P(y|x) is the observation pdf, P(x|y) is the a posteriori pdf, and P(y) is a normalizing factor (unimportant).
  The maximum a posteriori (MAP) solution for x given y is defined by ∂P(x|y)/∂x = 0; solve for x.
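  A minimal numerical illustration of this update for a scalar state, assuming Gaussian a priori and observation pdfs; the grid and numbers are illustrative, not from the slides:

```python
# Minimal sketch of Bayes' theorem for a scalar state x with a Gaussian
# a priori pdf and a Gaussian observation pdf (forward model y = x + e).
import numpy as np

def gaussian(z, mean, sigma):
    return np.exp(-0.5 * ((z - mean) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

x = np.linspace(-5.0, 15.0, 2001)    # grid of candidate states
dx = x[1] - x[0]

prior = gaussian(x, mean=3.0, sigma=2.0)        # P(x): a priori pdf
likelihood = gaussian(5.0, mean=x, sigma=1.0)   # P(y|x) for an observed y = 5

posterior = prior * likelihood                  # numerator of Bayes' theorem
posterior /= posterior.sum() * dx               # normalize by P(y)

x_map = x[np.argmax(posterior)]                 # MAP solution: maximum of P(x|y)
print("MAP estimate:", x_map)   # ~4.6: variance-weighted blend of prior (3) and measurement (5)
```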

  7. Gaussian PDFs
  Scalar x: P(x) = (2π)^(-1/2) σa^-1 exp[ -(x - xa)^2 / (2 σa^2) ]
  Vector x: P(x) = (2π)^(-n/2) |Sa|^(-1/2) exp[ -(1/2) (x - xa)^T Sa^-1 (x - xa) ]
  where Sa is the a priori error covariance matrix describing error statistics on (x - xa).
  In log space: -2 ln P(x) = (x - xa)^T Sa^-1 (x - xa) + constant
  Similarly, for the observation pdf: -2 ln P(y|x) = (y - F(x))^T Se^-1 (y - F(x)) + constant, where Se is the observation error covariance matrix.
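  As a small numerical check (not from the slides), the quadratic log-space term can be evaluated directly; the chi2_prior helper and the covariance values are illustrative:

```python
# Minimal sketch: evaluate the log-space quadratic term of the vector
# Gaussian a priori pdf, (x - xa)^T Sa^-1 (x - xa).
import numpy as np

def chi2_prior(x, x_a, S_a):
    """Quadratic term of -2 ln P(x) for a Gaussian a priori pdf."""
    d = x - x_a
    return d @ np.linalg.solve(S_a, d)

x_a = np.array([1.0, 2.0, 0.5])          # a priori state
S_a = np.diag([0.2, 0.5, 0.1]) ** 2      # a priori error covariance (1-sigma errors 0.2, 0.5, 0.1)

x_test = np.array([1.1, 1.8, 0.6])
print("prior quadratic term:", chi2_prior(x_test, x_a, S_a))
```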

  8. Maximum A Posteriori (MAP) Solution
  Bayes’ theorem: P(x|y) ∝ P(y|x) P(x), where P(y|x) (the measurements) provides the top-down constraint and P(x) (the a priori) the bottom-up constraint.
  Maximizing P(x|y) is equivalent to minimizing the cost function J:
  J(x) = (x - xa)^T Sa^-1 (x - xa) + (y - F(x))^T Se^-1 (y - F(x))
  MAP solution: solve for ∇x J(x̂) = 0
  Analytical solution (for a linear forward model y = Kx + e): x̂ = xa + G (y - K xa), with gain matrix G = Sa K^T (K Sa K^T + Se)^-1
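  A minimal sketch of that analytical solution with a toy linear forward model; the dimensions, covariances, and noise level are illustrative assumptions, not values from the presentation:

```python
# Minimal sketch of the analytical MAP solution for a linear forward model:
# x_hat = xa + G (y - K xa), with gain matrix G = Sa K^T (K Sa K^T + Se)^-1.
import numpy as np

rng = np.random.default_rng(1)

n, m = 3, 5
K = rng.normal(size=(m, n))              # Jacobian (weighting function) matrix
x_true = np.array([1.0, 2.0, 0.5])
x_a = np.array([0.8, 2.2, 0.4])          # a priori state
S_a = 0.3 ** 2 * np.eye(n)               # a priori error covariance
S_e = 0.05 ** 2 * np.eye(m)              # observation error covariance

y = K @ x_true + rng.multivariate_normal(np.zeros(m), S_e)

G = S_a @ K.T @ np.linalg.inv(K @ S_a @ K.T + S_e)   # gain matrix
x_hat = x_a + G @ (y - K @ x_a)                      # MAP retrieval

print("retrieved state:", x_hat)
print("true state:     ", x_true)
```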

  9. Averaging Kernel
  The averaging kernel A = ∂x̂/∂x = GK describes the sensitivity of the retrieval to the true state, and hence the smoothing of the solution:
  x̂ = A x + (I - A) xa + G e
  x̂ - x = (A - I)(x - xa) + G e
  where (A - I)(x - xa) is the smoothing error and G e is the retrieval (noise) error.
  MAP retrieval gives A = GK as part of the retrieval. G = ∂x̂/∂y is the sensitivity of the retrieval to the measurement.
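  Continuing the same kind of toy linear retrieval (illustrative numbers, not from the slides), the sketch below forms A = GK and checks that the smoothing and noise error terms reproduce x̂ - x:

```python
# Minimal sketch: averaging kernel A = G K and the error decomposition
# x_hat - x = (A - I)(x - xa) + G e for a toy linear retrieval.
import numpy as np

rng = np.random.default_rng(2)

n, m = 3, 5
K = rng.normal(size=(m, n))
S_a = 0.3 ** 2 * np.eye(n)
S_e = 0.05 ** 2 * np.eye(m)

G = S_a @ K.T @ np.linalg.inv(K @ S_a @ K.T + S_e)   # gain matrix
A = G @ K                                            # averaging kernel

x_true = np.array([1.0, 2.0, 0.5])
x_a = np.array([0.8, 2.2, 0.4])
e = rng.multivariate_normal(np.zeros(m), S_e)
y = K @ x_true + e

x_hat = x_a + G @ (y - K @ x_a)

smoothing_error = (A - np.eye(n)) @ (x_true - x_a)
noise_error = G @ e
print(np.allclose(x_hat - x_true, smoothing_error + noise_error))   # True
```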

  10. Degrees of Freedom
  DFS = tr(A): the number of unknowns that can be independently retrieved from the measurement
  DFS = n: the measurement completely defines the state (A = I)
  DFS = 0: no information in the measurement (the retrieval returns the a priori)
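  As a last sketch (same illustrative toy retrieval as above, not from the slides), the degrees of freedom for signal follow directly from the averaging kernel:

```python
# Minimal sketch: degrees of freedom for signal, DFS = trace(A), for the
# same kind of toy linear retrieval used above.
import numpy as np

rng = np.random.default_rng(3)

n, m = 3, 5
K = rng.normal(size=(m, n))
S_a = 0.3 ** 2 * np.eye(n)
S_e = 0.05 ** 2 * np.eye(m)

G = S_a @ K.T @ np.linalg.inv(K @ S_a @ K.T + S_e)
A = G @ K                      # averaging kernel

dfs = np.trace(A)              # between 0 (no information) and n (state fully determined)
print(f"DFS = {dfs:.2f} out of n = {n}")
```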
