
Bayesian methods for calibrating and comparing process-based vegetation models


Presentation Transcript


  1. Bayesian methods for calibrating and comparing process-based vegetation models Marcel van Oijen (CEH-Edinburgh)

  2. Contents • Process-based modelling of forests and uncertainties • Bayes’ Theorem (BT) • Bayesian Calibration (BC) of process-based models • Bayesian Model Comparison (BMC) • BC & BMC in NitroEurope • Examples of BC & BMC in other sciences • BC & BMC as tools to develop theory • References, Summary, Discussion

  3. 1. Introduction: Process-based modelling of forests and uncertainties

  4. 1.1 Forest growth in Europe Project RECOGNITION (FAIRCT98-4124): 15 partner countries across Europe, 22 sites. Previous observations: forests across Europe have started to grow faster in the 20th century. Causes? Future trend? RECOGNITION approach: empirical methods + process-based modelling. Modelling groups in the UK, Sweden and Finland (2), coordinated by CEH-Edinburgh

  5. 1.2 Forest growth in Europe CONCLUSION 20th century: growth accelerated by N-deposition. CONCLUSION 21st century: growth likely to be accelerated by climate change and increasing [CO2]. [Figure: % change in NPP vs latitude at the 22 sites (HOG … TRI) under environmental change 2000-2080, simulated with EFM; separate curves for CO2, climate and N-deposition, plus their cumulative effects; baseline: NPP before growth rate increase (1920)]

  6. 1.3 Reality check! In every study using systems analysis and simulation: model parameters, inputs and structure are uncertain. How reliable is the European forest study? • Sufficient data for model parameterization? • Sufficient data for model input? • Would another model have given different results? How to deal with uncertainties optimally?

  7. 1.4 Forest models and uncertainty [Figure: model schematic; Levy et al., 2004]

  8. 1.4 Forest models and uncertainty [Figure: NdepUE (kg C kg-1 N) as simulated by three models: BGC, CENTURY and HYBRID; Levy et al., 2004]

  9. 1.5 Model-data fusion Uncertainties are everywhere: models (environmental inputs, parameters, structure) and data. Uncertainties can be expressed as probability distributions (pdf’s). We need methods that: • Quantify all uncertainties • Show how to reduce them • Efficiently transfer information: data → models → model application. Calculating with uncertainties (pdf’s) = Probability Theory

  10. 2. Bayes’ Theorem

  11. 2.1 Dealing with uncertainty: Medical diagnostics A flu epidemic occurs: one percent of people is ill. A diagnostic test is 99% reliable: P(pos|dis) = 0.99, P(pos|hlth) = 0.01, P(dis) = 0.01. Bayes’ Theorem: P(dis|pos) = P(pos|dis) P(dis) / P(pos) • Test result is positive (bad news!) • What is P(diseased|test positive)? • 0.50 • 0.98 • 0.99

  12. 2.1 Dealing with uncertainty: Medical diagnostics A flu epidemic occurs: one percent of people is ill. A diagnostic test is 99% reliable: P(pos|dis) = 0.99, P(pos|hlth) = 0.01, P(dis) = 0.01. Bayes’ Theorem: P(dis|pos) = P(pos|dis) P(dis) / P(pos) = P(pos|dis) P(dis) / [ P(pos|dis) P(dis) + P(pos|hlth) P(hlth) ] • Test result is positive (bad news!) • What is P(diseased|test positive)? • 0.50 • 0.98 • 0.99

  13. 2.1 Dealing with uncertainty: Medical diagnostics A flu epidemic occurs: one percent of people is ill. A diagnostic test is 99% reliable: P(pos|dis) = 0.99, P(pos|hlth) = 0.01, P(dis) = 0.01. Bayes’ Theorem: P(dis|pos) = P(pos|dis) P(dis) / P(pos) = P(pos|dis) P(dis) / [ P(pos|dis) P(dis) + P(pos|hlth) P(hlth) ] = (0.99 × 0.01) / (0.99 × 0.01 + 0.01 × 0.99) = 0.50 • Test result is positive (bad news!) • What is P(diseased|test positive)? Answer: 0.50
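The slide’s arithmetic is easy to check in code. A minimal sketch in Python (our illustration, not part of the original deck), with all numbers taken from the slide:

p_dis = 0.01             # prior: 1% of people are ill
p_pos_given_dis = 0.99   # P(pos|dis): test detects the disease 99% of the time
p_pos_given_hlth = 0.01  # P(pos|hlth): false-positive rate

# Evidence P(pos), by the law of total probability
p_pos = p_pos_given_dis * p_dis + p_pos_given_hlth * (1 - p_dis)

# Bayes' Theorem: P(dis|pos) = P(pos|dis) P(dis) / P(pos)
p_dis_given_pos = p_pos_given_dis * p_dis / p_pos
print(p_dis_given_pos)   # 0.5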

  14. 2.2 Bayesian updating of probabilities Bayes’ Theorem: Prior probability → Posterior prob. Medical diagnostics: P(disease) → P(disease|test result) Model parameterization: P(params) → P(params|data) Model selection: P(models) → P(model|data) SPAM-killer: P(SPAM) → P(SPAM|E-mail header) Weather forecasting: … Climate change prediction: … Oil field discovery: … GHG-emission estimation: … Jurisprudence: … …

  15. 2.2 Bayesian updating of probabilities Bayes’ Theorem: Prior probability → Posterior prob. Model parameterization: P(params) → P(params|data) Model selection: P(models) → P(model|data) Application of Bayes’ Theorem to process-based models (not analytically solvable): Markov Chain Monte-Carlo (Metropolis algorithm)

  16. 2.3 What and why? • We want to use data and models to explain and predict ecosystem behaviour • Data as well as model inputs, parameters and outputs are uncertain • No prediction is complete without quantifying the uncertainty. No explanation is complete without analysing the uncertainty • Uncertainties can be expressed as probability density functions (pdf’s) • Probability theory tells us how to work with pdf’s: Bayes’ Theorem (BT) tells us how a pdf changes when new information arrives • BT: Prior pdf → Posterior pdf • BT: Posterior = Prior × Likelihood / Evidence • BT: P(θ|D) = P(θ) P(D|θ) / P(D) • BT: P(θ|D) ∝ P(θ) P(D|θ)

  17. 3. Bayesian Calibration (BC) of process-based models

  18. 3.1 Process-based forest models [Diagram: model with inputs (environmental scenarios, initial values, parameters) and outputs (height, NPP, soil C)]

  19. 3.2 Process-based forest model BASFOR: 40+ parameters, 12+ output variables

  20. 3.3 BASFOR: outputs [Panels: carbon in trees (standing + thinned), volume (standing), carbon in soil]

  21. 3.4 BASFOR: parameter uncertainty

  22. 3.5 BASFOR: prior output uncertainty [Panels: carbon in trees (standing + thinned), volume (standing), carbon in soil]

  23. 3.6 Data Dodd Wood (R. Matthews, Forest Research) [Panels: carbon in trees (standing + thinned), volume (standing), carbon in soil]

  24. 3.7 Using data in Bayesian calibration of BASFOR Prior pdf + Data → Bayesian calibration → Posterior pdf

  25. 3.8 Bayesian calibration: posterior uncertainty [Panels: carbon in trees (standing + thinned), volume (standing), carbon in soil]

  26. 3.9 How does BC work again? P(θ|D) = P(θ) P(D|θ) / P(D) ∝ P(θ) P(D|f(θ)) “Posterior distribution of parameters” = “Prior distribution of parameters” × “Likelihood” of the data, given the mismatch with model output. f = the model, e.g. BASFOR
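To make the likelihood term concrete: a short Python sketch of P(D|f(θ)), assuming independent Gaussian measurement errors N(0, σ) on each data point (the “error function” that appears on slide 42); the function name and signature are our own:

import numpy as np

def log_likelihood(data, model_output, sigma):
    # log P(D | f(theta)) under independent N(0, sigma) measurement errors
    mismatch = np.asarray(data) - np.asarray(model_output)
    return np.sum(-0.5 * (mismatch / sigma) ** 2
                  - np.log(sigma * np.sqrt(2.0 * np.pi)))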

  27. Bayesian calibration in action! Bayes’ Theorem: P(θ|D) ∝ P(θ) P(D|f(θ)) [Animation: parameter probability distribution, model output, data]

  28. 3.10 Calculating the posterior using MCMC P(θ|D) ∝ P(θ) P(D|f(θ)) 1. Start anywhere in parameter space: p1..39(i=0) 2. Randomly choose p(i+1) = p(i) + δ 3. IF [ P(p(i+1)) P(D|f(p(i+1))) ] / [ P(p(i)) P(D|f(p(i))) ] > Random[0,1] THEN accept p(i+1), ELSE reject p(i+1) and keep p(i); i = i+1 4. IF i < 10^4 GOTO 2 (Metropolis et al., 1953). Result: a sample of 10^4–10^5 parameter vectors from the posterior distribution P(θ|D) for the parameters [MCMC trace plots]
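The recipe translates almost line for line into code. A minimal Python sketch of the Metropolis sampler follows; it compares log-probabilities rather than the slide’s raw ratio (numerically safer, same acceptance rule), and log_prior and log_lik are assumed to be supplied by the user, e.g. the likelihood sketch above:

import numpy as np

def metropolis(log_prior, log_lik, theta0, step, n_iter=10_000, seed=None):
    # Sample from P(theta|D) ∝ P(theta) P(D|f(theta)) by Metropolis MCMC.
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    log_p = log_prior(theta) + log_lik(theta)
    chain = np.empty((n_iter, theta.size))
    for i in range(n_iter):
        proposal = theta + step * rng.standard_normal(theta.size)  # p(i) + delta
        log_p_new = log_prior(proposal) + log_lik(proposal)
        # Accept if the posterior ratio exceeds a Uniform[0,1] draw
        if log_p_new - log_p > np.log(rng.uniform()):
            theta, log_p = proposal, log_p_new
        chain[i] = theta  # on rejection, p(i) is kept (and recorded again)
    return chain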

  29. 3.11 MCMC in action BC3D.AVI

  30. 3.12 Using data in Bayesian calibration of BASFOR Prior pdf + Data → Bayesian calibration → Posterior pdf

  31. 3.13 Parameter correlations [Matrix plot: correlations among the 39 parameters, 39 × 39]

  32. 3.14 Continued calibration when new data become available Prior pdf + New data → Bayesian calibration → Posterior pdf

  33. 3.14 Continued calibration when new data become available The posterior pdf from the previous calibration serves as the prior pdf for the next: Prior pdf + New data → Bayesian calibration → Posterior pdf
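One way to implement this “posterior becomes the new prior” step, sketched in Python under our own design choices (not prescribed by the deck): summarise the old MCMC sample by a multivariate normal and reuse it as log_prior in the sampler above.

import numpy as np
from scipy import stats

def posterior_as_prior(chain):
    # Fit a Gaussian to an MCMC sample so it can act as the next prior.
    mean = chain.mean(axis=0)
    cov = np.cov(chain, rowvar=False)
    return stats.multivariate_normal(mean, cov, allow_singular=True).logpdf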

  34. 3.15 Bayesian projects at CEH-Edinburgh • Parameterization and uncertainty quantification of the 3-PG model of forest growth & C-stock (Genevieve Patenaude, Ronnie Milne, M. van Oijen) • Selection of forest models • Data assimilation using forest EC data (David Cameron, Mat Williams, M. van Oijen) • Risk of frost damage in grassland • Uncertainty in UK C-sequestration (Marcel van Oijen, Jonathan Rougier, Ron Smith, Tommy Brown, Amanda Thomson) • Uncertainty in earth system resilience (Clare Britton & David Cameron) [Inset: [CO2] vs time]

  35. 3.16 BASFOR: forest C-sequestration 2005-2076 [Maps: change in potential C-seq.; uncertainty in change of potential C-seq.; change in annual mean temperature (UKCIP)] • Uncertainty due to model parameters only, NOT uncertainty in inputs / upscaling

  36. 3.17 Integrating RS-data (Patenaude et al.) [Diagram: 3-PG model + RS-data (hyper-spectral, LiDAR, SAR) → BC]

  37. 3.18 What kind of measurements would have reduced uncertainty the most?

  38. 3.19 Prior predictive uncertainty & height-data [Panels: biomass and height; prior predictive uncertainty, with height data from Skogaby]

  39. 3.20 Prior & posterior uncertainty: use of height data [Panels: biomass and height; prior predictive uncertainty vs posterior uncertainty (using the Skogaby height data)]

  40. 3.20 Prior & posterior uncertainty: use of height data [Panels: biomass and height; posterior uncertainty using hypothetical height data]

  41. 3.20 Prior & posterior uncertainty: use of height data [Panels: biomass and height; posterior uncertainty using height data vs precision height data]

  42. 3.21 Summary for BC procedure [Flow: Prior P(θ) + Model f + Data D ± σ → MCMC, with “error function” e.g. N(0, σ) defining the likelihood P(D|f(θ)) → samples of θ and of f(θ) (10^4–10^5) → Posterior P(θ|D)] Products: calibrated parameters with covariances; uncertainty of model output; sensitivity analysis of model parameters (PCC)
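Wiring the earlier sketches together, a hypothetical end-to-end calibration could look as follows; run_model, data, sigma, log_prior and theta_start are placeholders for the user’s own model, data and prior:

log_lik = lambda theta: log_likelihood(data, run_model(theta), sigma)
chain = metropolis(log_prior, log_lik, theta_start, step=0.05, n_iter=100_000)
theta_mean = chain.mean(axis=0)            # calibrated parameters ...
theta_cov = np.cov(chain, rowvar=False)    # ... with covariances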

  43. 3.22 Summary for BC vs tuning Model tuning: 1. Define parameter ranges (permitted values). 2. Select parameter values that give model output closest (r2, RMSE, …) to the data. 3. Do the model study with the tuned parameters (i.e. no model output uncertainty). Bayesian calibration: 1. Define parameter pdf’s. 2. Define data pdf’s (probable measurement errors). 3. Use Bayes’ Theorem to calculate the posterior parameter pdf. 4. Do all future model runs with samples from the parameter pdf (i.e. quantify uncertainty of model results). BC can use data to reduce parameter uncertainty for any process-based model

  44. 4. Bayesian Model Comparison (BMC)

  45. 4.1 RECOGNITION revisited: model uncertainty [Figure: EFM predictions of % change in NPP vs latitude]

  46. 4.1 RECOGNITION revisited: model uncertainty [Panels: % change in NPP vs latitude at the 22 sites (HOG … TRI) for four models: EFM, Q, EFIMOD and FinnFor]

  47. 4.2 Bayesian comparison of two models Bayes’ Theorem for model probabilities: P(M|D) = P(M) P(D|M) / P(D). The “integrated likelihood” P(D|Mi) can be approximated from the MCMC sample of outputs for model Mi (*). With equal prior probabilities P(M1) = P(M2) = ½: P(M2|D) / P(M1|D) = P(D|M2) / P(D|M1). The “Bayes factor” P(D|M2) / P(D|M1) quantifies how the data D change the odds of M2 over M1. (*) harmonic mean of likelihoods in the MCMC sample (Kass & Raftery, 1995)
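The footnoted harmonic-mean approximation (Kass & Raftery, 1995) is straightforward to compute from the log-likelihoods recorded during MCMC; a small sketch in log space, with names of our own choosing:

import numpy as np
from scipy.special import logsumexp

def log_integrated_likelihood(log_liks):
    # Harmonic-mean estimate of log P(D|M): P(D|M) ≈ n / sum_i(1/L_i)
    log_liks = np.asarray(log_liks)
    return np.log(len(log_liks)) - logsumexp(-log_liks)

# Log Bayes factor of M2 over M1:
# log_BF = log_integrated_likelihood(ll_m2) - log_integrated_likelihood(ll_m1)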

  48. 4.3 BMC: Tuomi et al. 2007
