This document examines the inverse modeling problem of optimizing the values of an ensemble of variables, termed the state vector (x), using observational data. It presents the Maximum A Posteriori (MAP) solution and illustrates its application to retrieving atmospheric concentrations and sources using radiative transfer and chemical transport models as forward models. The central role of Bayes' theorem is examined, along with the challenges posed by non-linear forward problems, limited measurement sensitivity, and the need for additional constraints to avoid non-unique solutions. Practical examples underscore the implications for atmospheric science.
Retrieval Theory
Vijay Natraj
Mar 23, 2008
The Inverse Modeling Problem
Optimize the values of an ensemble of variables (state vector x) using observations:
• A priori estimate: xa + ea
• Measurement vector: y
• Forward model: y = F(x) + e
• Bayes' theorem then yields the "MAP solution", also called the "optimal estimate" or "retrieval"
Applications for Atmospheric Concentration • Retrieve atmospheric concentrations (x) from observed atmospheric radiances (y) using a radiative transfer (RT) model as forward model • Invert sources (x) from observed atmospheric concentrations (y) using a chemical transport model (CTM) as forward model • Construct a continuous field of concentrations (x) by assimilation of sparse observations (y) using a forecast model (initial-value CTM) as forward model
Optimal Estimation
• The forward problem is typically not linear, so there is no analytical solution expressing the state vector in terms of the measurement vector
• Approximate solution: linearize the forward model about a reference state x0: y ≈ F(x0) + K(x - x0) + e
• K = ∂F/∂x is the weighting function (Jacobian) matrix
• K describes the sensitivity of the measurement to the state
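The linearization step above can be sketched numerically. Below is a minimal finite-difference approximation of the Jacobian K about a reference state x0, using a toy two-variable forward model; the function `F`, the step size `dx`, and all numerical values are illustrative assumptions, not part of the original slides.

```python
import numpy as np

def jacobian_fd(F, x0, dx=1e-6):
    """Finite-difference Jacobian K = dF/dx about the reference state x0."""
    y0 = F(x0)
    K = np.zeros((y0.size, x0.size))
    for j in range(x0.size):
        xp = x0.copy()
        xp[j] += dx                 # perturb one state element at a time
        K[:, j] = (F(xp) - y0) / dx
    return K

# Toy nonlinear forward model (hypothetical, for illustration only)
F = lambda x: np.array([x[0]**2 + x[1], np.sin(x[1])])
x0 = np.array([1.0, 0.0])
K = jacobian_fd(F, x0)  # analytic Jacobian at x0 is [[2, 1], [0, 1]]
```

In a real retrieval the Jacobian is usually supplied analytically by the radiative transfer or chemical transport model; finite differences are shown here only because they make the definition of K concrete.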
Optimal Estimation
• Causes of non-unique solutions:
• m &lt; n (fewer measurements than unknowns)
• Amplification of measurement and/or model noise
• Poor sensitivity of the measured radiances to one or more state vector elements (ill-posed problem)
• Additional constraints (e.g., the a priori) are needed to select an acceptable solution
Bayes' Theorem
P(x, y) dx dy is the joint probability that x lies in [x, x + dx] and y lies in [y, y + dy].
Bayes' theorem relates the a posteriori pdf P(x|y) to the observation pdf P(y|x) and the a priori pdf P(x):
P(x|y) = P(y|x) P(x) / P(y)
where P(y) is a normalizing factor (unimportant for the maximization).
The maximum a posteriori (MAP) solution for x given y is defined by
∂P(x|y)/∂x = 0   (solve for x)
Gaussian PDFs
Scalar x:
P(x) = (1 / (σa √(2π))) exp[ -(x - xa)^2 / (2 σa^2) ]
Vector x:
P(x) = (2π)^(-n/2) |Sa|^(-1/2) exp[ -(1/2) (x - xa)^T Sa^-1 (x - xa) ]
where Sa is the a priori error covariance matrix describing the error statistics on (x - xa).
In log space:
-2 ln P(x) = (x - xa)^T Sa^-1 (x - xa) + constant
Similarly, for the observation pdf with error covariance Se:
-2 ln P(y|x) = (y - F(x))^T Se^-1 (y - F(x)) + constant
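The vector Gaussian in log space reduces to a quadratic form, which is cheap to evaluate. A minimal sketch, with toy values for xa and Sa (both hypothetical, chosen only for illustration):

```python
import numpy as np

# Toy a priori state and a priori error covariance (hypothetical values)
xa = np.array([1.0, 2.0])
Sa = np.array([[0.5, 0.1],
               [0.1, 0.3]])

def neg2_log_prior(x, xa, Sa):
    """-2 ln P(x) up to an additive constant: (x - xa)^T Sa^-1 (x - xa)."""
    d = x - xa
    # solve(Sa, d) avoids forming Sa^-1 explicitly
    return d @ np.linalg.solve(Sa, d)

x = np.array([1.2, 1.9])
val = neg2_log_prior(x, xa, Sa)
```

Using `np.linalg.solve` rather than an explicit inverse is the standard numerically stable way to evaluate this quadratic form.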
Maximum A Posteriori (MAP) Solution
Bayes' theorem: P(x|y) ∝ P(y|x) P(x)
• P(y|x): bottom-up constraint (observations)
• P(x): top-down constraint (a priori)
Maximizing P(x|y) is equivalent to minimizing the cost function J:
J(x) = (x - xa)^T Sa^-1 (x - xa) + (y - F(x))^T Se^-1 (y - F(x))
MAP solution: solve ∇x J(x) = 0
Analytical solution (linear forward model y = Kx + e):
x̂ = xa + G (y - K xa)
with gain matrix
G = Sa K^T (K Sa K^T + Se)^-1
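For a linear forward model, the analytical MAP solution is a few lines of linear algebra. A minimal numerical sketch with numpy; the matrices K, Sa, Se and the true state are toy values invented for illustration, not from the original slides:

```python
import numpy as np

# Toy linear problem y = K x + e (all values hypothetical)
K  = np.array([[1.0, 0.5],
               [0.2, 1.0]])
Sa = np.eye(2)            # a priori error covariance
Se = np.eye(2) * 0.01     # measurement error covariance
xa = np.array([0.0, 0.0])           # a priori state
x_true = np.array([1.0, 2.0])       # "true" state for the sketch
y = K @ x_true                      # noise-free measurement

# Gain matrix G = Sa K^T (K Sa K^T + Se)^-1
G = Sa @ K.T @ np.linalg.inv(K @ Sa @ K.T + Se)
# MAP retrieval x̂ = xa + G (y - K xa)
x_hat = xa + G @ (y - K @ xa)
```

With small measurement error (Se) relative to the prior spread (Sa), the retrieval is pulled strongly toward the measurement and x_hat lands close to the true state.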
Averaging Kernel
The averaging kernel A = GK describes the sensitivity of the retrieval to the true state, and hence the smoothing of the solution:
x̂ - x = (A - I)(x - xa) + G e
where (A - I)(x - xa) is the smoothing error and G e is the retrieval error.
A MAP retrieval gives A as part of the retrieval; the gain matrix G is the sensitivity of the retrieval to the measurement.
Degrees of Freedom
DFS = tr(A): the number of unknowns that can be independently retrieved from the measurement
• DFS = n: measurement completely defines the state
• DFS = 0: no information in the measurement
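The averaging kernel and its trace can be computed directly from the gain matrix. A short sketch with the same kind of toy, hypothetical matrices used to illustrate the MAP solution:

```python
import numpy as np

# Toy setup (hypothetical values, for illustration only)
K  = np.array([[1.0, 0.5],
               [0.2, 1.0]])
Sa = np.eye(2)            # a priori error covariance
Se = np.eye(2) * 0.01     # measurement error covariance

G = Sa @ K.T @ np.linalg.inv(K @ Sa @ K.T + Se)  # gain matrix
A = G @ K                                        # averaging kernel
dfs = np.trace(A)                                # degrees of freedom for signal
```

Because the measurement noise here is small relative to the prior, the trace comes out just below n = 2: the measurement constrains nearly, but not exactly, both state elements.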