Dynamical Climate Reconstruction

Presentation Transcript

  1. Dynamical Climate Reconstruction. Greg Hakim, University of Washington; with Sebastien Dirren, Helga Huntley, Angie Pendergrass, David Battisti, Gerard Roe

  2. Plan • Motivation: fusing observations & models • State estimation theory • Results for a simple model • Results for a less simple model • Optimal networks • Plans for the future

  3. Motivation • Range of approaches to climate reconstruction. • Observations: • time-series analysis; multivariate regression • no link to dynamics • Models • spatial and temporal consistency • no link to observations • State estimation (this talk) • few attempts thus far • stationary statistics

  4. Goals • Test new method • Reconstruct last 1-2K years • Unique dataset for climate variability? • E.g. hurricane variability. • E.g. rational regional downscaling (hydro). • Test network design ideas • Where to take highest impact new obs?

  5. Medieval warm period temperature anomalies IPCC Chapter 6

  6. Climate variability: a qualitative approach. GRIP δ18O (temperature); GISP2 K+ (Siberian High); north Swedish tree-line limit shift; sea surface temperature from planktonic foraminifera; hematite-stained grains in sediment cores (ice rafting); varve thickness (westerlies); cave speleothem isotopes (precipitation). Mayewski et al., 2004

  7. Statistical reconstructions • “Multivariate statistical calibration of multiproxy network” (Mann et al. 1998) • Requires stationary spatial patterns of variability Mann et al. 1998

  8. Paleoclimate modeling IPCC Chapter 6

  9. An attempt at fusion: multivariate regression. Data Assimilation Through Upscaling and Nudging (DATUN); Jones and Widmann 2003

  10. Fusion Hierarchy • Nudging: no error estimates • Statistical interpolation • 3DVAR • 4DVAR (operational NWP; fixed statistics) • Kalman filters • Kalman smoothers (today's talk). The curse of dimensionality looms large in geoscience.

  11. State Estimation Primer

  12. Gaussian Update: analysis = background + weighted observation increment, x_a = x_b + K (y − H x_b), where y is the new observation information and K is the Kalman gain matrix; the analysis error covariance is smaller ('<') than the background's.
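In matrix form, the update on this slide is the standard Kalman analysis step. A minimal NumPy sketch (all variable names are ours):

```python
import numpy as np

def kalman_update(xb, B, y, H, R):
    """Update a background state xb (error covariance B) with observations y.

    H maps state space to observation space; R is the observation error
    covariance. Returns the analysis state and analysis error covariance.
    """
    # Kalman gain: weights the innovation by relative uncertainty
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    xa = xb + K @ (y - H @ xb)          # analysis = background + weighted obs
    A = (np.eye(len(xb)) - K @ H) @ B   # analysis covariance, 'smaller' than B
    return xa, A
```

With a scalar state, equal background and observation variances split the difference: the analysis lands halfway between background and observation, and the analysis variance is half the background variance.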

  13. Gaussian PDFs

  14. Ensemble Kalman Filter Crux: use an ensemble of fully non-linear forecasts to model the statistics of the background (expected value and covariance matrix). Advantages • No a priori assumption about covariance; state-dependent corrections. • Ensemble forecasts proceed immediately without perturbations.

  15. Summary of Ensemble Kalman Filter (EnKF) Algorithm (1) Ensemble forecast provides background estimate & statistics (B) for new analyses. (2) Ensemble analysis with new observations. (3) Ensemble forecast to arbitrary future time.
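Step (2) can be sketched with a perturbed-observation EnKF analysis, one common variant (the talk does not specify which EnKF flavour is used; this sketch assumes one):

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_analysis(X, y, H, R):
    """Perturbed-observation EnKF analysis step (illustrative sketch).

    X : (n, m) ensemble of m state vectors of dimension n.
    The sample covariance of the ensemble plays the role of B.
    """
    n, m = X.shape
    xb = X.mean(axis=1, keepdims=True)
    Xp = X - xb                                   # ensemble perturbations
    B = Xp @ Xp.T / (m - 1)                       # sample background covariance
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)  # Kalman gain from the sample
    # each member assimilates its own perturbed copy of the observations
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=m).T
    return X + K @ (Y - H @ X)
```

With a near-exact observation, the updated ensemble collapses toward the observed value and its spread shrinks, reflecting the reduced analysis error.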

  16. Paleo-assimilation: dynamical climate reconstruction • Observations often time-averaged (e.g. gauge precipitation; wind; ice cores). • Sparse networks. • Issue: how to combine averaged observations with instantaneous model states?

  17. Issue with Traditional Approach Problem: Conventional Kalman filtering requires covariance relationships between time-averaged observations and instantaneous states. High-frequency noise in the instantaneous states contaminates the update. Solution: Only update the time-averaged state.

  18. Algorithm 1. Time average of the background 2. Compute model estimate of the time-averaged obs 3. Compute perturbations from the time mean 4. Update the time mean with the existing EnKF 5. Add the updated mean and the unmodified perturbations 6. Propagate model states 7. Recycle with the new background states
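The steps above can be sketched as follows, with any EnKF analysis routine plugged in for step 4 (array shapes and names are our assumptions):

```python
import numpy as np

def time_averaged_update(X_traj, y_avg, H, R, analysis_fn):
    """Sketch of the time-averaged assimilation cycle described above.

    X_traj      : (T, n, m) ensemble trajectories over the averaging window.
    y_avg       : time-averaged observations.
    analysis_fn : analysis_fn(Xbar, y, H, R) -> updated time-mean ensemble
                  (any EnKF implementation).
    Only the time mean is updated; deviations from it pass through unchanged,
    so high-frequency noise never contaminates the update.
    """
    Xbar = X_traj.mean(axis=0)               # 1. time average of background
    # 2. the model estimate of the time-averaged obs would be H @ Xbar
    Xprime = X_traj - Xbar                   # 3. perturbations from time mean
    Xbar_a = analysis_fn(Xbar, y_avg, H, R)  # 4. update time mean with EnKF
    return Xbar_a + Xprime                   # 5. add unmodified perturbations
```

A quick property check: shifting the time mean leaves the deviations from it untouched, exactly as steps 3 and 5 require.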

  19. Illustrative Example (Dirren & Hakim 2005) • Model (adapted from Lorenz & Emanuel 1998): linear combination of fast ("high-freq.") & slow ("low-freq.") processes • LE ~ a scalar discretized around a latitude circle. • LE has elements of atmospheric dynamics: chaotic behavior, linear waves, damping, forcing
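The base Lorenz & Emanuel (1998) model has a well-known form; a minimal sketch of it (not of the fast/slow combination actually used in the talk) is:

```python
import numpy as np

def lorenz96_tendency(x, F=8.0):
    """Tendency of the Lorenz & Emanuel (1998) model: a scalar field on a
    latitude circle with an advection-like nonlinearity, linear damping (-x),
    and constant forcing F.  dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F.
    """
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def step_rk4(x, dt=0.05, F=8.0):
    """Advance one step with fourth-order Runge-Kutta."""
    k1 = lorenz96_tendency(x, F)
    k2 = lorenz96_tendency(x + 0.5 * dt * k1, F)
    k3 = lorenz96_tendency(x + 0.5 * dt * k2, F)
    k4 = lorenz96_tendency(x + dt * k3, F)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
```

The uniform state x_i = F is a (unstable) fixed point: advection cancels and damping balances forcing, which gives a quick sanity check on the implementation.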

  20. [Figure: RMS error of instantaneous states; dashed line = climatology] Instantaneous states have large errors (comparable to climatology), due to lack of observational constraint.

  21. [Figure: improvement percentage of RMS errors over all means, as a function of the averaging time of the state variable; observation and climatology uncertainty marked] The update constrains the signal at higher frequencies than the observations themselves!

  22. A less simple model (Helga Huntley, U. Delaware) • QG "climate model": radiative relaxation to an assumed temperature field; mountain in center of domain • Truth simulation: rigorous error calculations • 100 observations (50 surface & 50 tropopause), Gaussian errors, range of time averages

  23. Snapshot

  24. Correlation Patterns as a Function of Averaging Time (tau)

  25. Observation Locations

  26. Average Spatial RMS Error

  27. Average Spatial RMS Error Ensemble used for control

  28. Implications • State is well constrained by few, noisy, obs. • Forecast error saturates at climatology for tau ~ 30. • For longer averaging times, the model adds little. • Equally good results can be obtained by assimilating the observations with an ensemble drawn from climatology (no model runs required)!

  29. Observation error Experiments

  30. Changing σo (Observation Error) • Previously: σo = 0.27 for all τ. • Now: σo ≈ σc/3 (a third of the control error).

  31. Observing Network Design (Helga Huntley, U. Delaware)

  32. Optimal Observation Locations • Rather than use random networks, can we devise a strategy to optimally site new observations? • Yes: choose locations with the largest impact on a metric of interest. • New theory based on ensemble sensitivity (Hakim & Torn 2005; Ancell & Hakim 2007; Torn and Hakim 2007) • Here, metric = projection coefficient for first EOF.

  33. Ensemble Sensitivity • Given metric J, find the observation that most reduces uncertainty (ensemble variance). • Find a second observation conditional on the first. • Sketch of theory (let x denote the state): from the analysis covariance, changes in the metric given changes in the state, δJ ≈ (∂J/∂x)ᵀ δx + O(δx²), and hence the metric variance.

  34. Sensitivity + State Estimation • Estimate the variance change for the i'th observation. • Kalman filter theory gives the analysis covariance Ai. • Given the variance change at each point, find the largest value.

  35. Ensemble Sensitivity (cont’d) • If H chooses a specific location xi, this all simplifies very nicely: • For the first observation: • For the second observation, given assimilation of the first observation: • Etc.
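Under these simplifications, the greedy siting loop can be sketched as follows. This is a hypothetical implementation in the spirit of the ensemble-sensitivity approach, using a serial square-root conditioning step; details beyond the slides are our assumptions:

```python
import numpy as np

def greedy_obs_selection(Xp, Jp, obs_err_var, n_obs):
    """Greedy observation siting by ensemble sensitivity (illustrative sketch).

    Xp : (n, m) ensemble perturbations of the state (mean removed).
    Jp : (m,)  ensemble perturbations of the metric J (mean removed).
    Each step picks the state point whose (noisy) observation would most
    reduce var(J), then conditions both ensembles on that observation
    before choosing the next point.
    """
    n, m = Xp.shape
    Xp, Jp = Xp.copy(), Jp.copy()
    chosen = []
    for _ in range(n_obs):
        cov_xJ = Xp @ Jp / (m - 1)                  # cov(x_i, J) for every i
        var_x = (Xp ** 2).sum(axis=1) / (m - 1)
        # variance of J removed by observing point i with error obs_err_var
        reduction = cov_xJ ** 2 / (var_x + obs_err_var)
        i = int(np.argmax(reduction))
        chosen.append(i)
        denom = var_x[i] + obs_err_var
        alpha = 1.0 / (1.0 + np.sqrt(obs_err_var / denom))  # sqrt-filter factor
        xi = Xp[i].copy()                           # perturbations at point i
        K = (Xp @ xi / (m - 1)) / denom             # regression of state on x_i
        Xp -= alpha * np.outer(K, xi)               # condition state ensemble
        Jp -= alpha * (cov_xJ[i] / denom) * xi      # condition metric ensemble
    return chosen
```

If the metric is dominated by one state component, that component's location should be selected first, which gives a simple sanity check.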

  36. Ensemble Sensitivity (cont'd) • In fact, with some more calculations, one can find a nice recursive formula, which requires the evaluation of just k+3 lines (1 covariance vector + (k+6) entry-wise mults/divs/adds/subs) for the kth point.

  37. Results for tau = 20 First EOF

  38. Results for tau = 20 • The ten most sensitive locations (without accounting for prior assimilations) • σo = 0.10

  39. Results for tau = 20 • The four most sensitive locations, accounting for previously found pts.

  40. Results for tau = 20; σo = 0.10 Note the decreasing effect on the variance.

  41. Control Case: No Assimilation Avg error = 5.4484

  42. 100 Random Observation Locations Avg Error: Anal = 1.0427, Fcst = 3.6403

  43. 4 Random Observation Locations Avg Error: Anal = 5.5644, Fcst = 5.6279

  44. 4 Optimal Observation Locations Avg Error: Anal = 2.0545, Fcst = 4.8808

  45. Summary [Figure: percent of control error] Assimilating just the 4 chosen locations yields a significant portion of the error reduction in J achieved with 100 obs.