Uncertainty analysis and Model Validation

Presentation Transcript


  1. Uncertainty analysis and Model Validation

  2. Final Project: Summary of Results & Conclusions

  3. In a real-world problem we need to establish model-specific calibration criteria and define targets, including their associated error. [Figure: calibration targets with error bars; each target pairs a calibration value (e.g., 20.24 m) with an associated error (e.g., 0.80 m); one target has a smaller associated error, the other a relatively large one.]
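
A minimal sketch of how such a target check might be scripted; except for the 20.24 m value and 0.80 m error shown above, all names and numbers below are illustrative, not from the slides:

```python
# Check whether simulated heads fall within each calibration target's
# associated error band (target value +/- associated error).
targets = [
    # (name, calibration value in m, associated error in m)
    ("T1", 20.24, 0.80),  # target with a small associated error
    ("T2", 20.24, 5.00),  # hypothetical target with a larger associated error
]
simulated = {"T1": 20.90, "T2": 23.10}  # hypothetical simulated heads (m)

for name, value, error in targets:
    residual = simulated[name] - value
    status = "met" if abs(residual) <= error else "NOT met"
    print(f"{name}: residual {residual:+.2f} m vs. allowed +/-{error:.2f} m -> {status}")
```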

  4. Smith Creek Valley (Thomas et al., 1989). Calibration objectives:
     1. Heads within 10 ft of measured heads; allows for measurement error and interpolation error.
     2. Absolute mean residual between measured and simulated heads close to zero (0.22 ft) and standard deviation minimal (4.5 ft); see the sketch after this list.
     3. Head difference between layers 1 & 2 within 2 ft of field values.
     4. Distribution of ET and ET rates match field estimates.
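
A minimal sketch of the statistics in objective 2, computed for hypothetical measured and simulated heads (the head values below are illustrative):

```python
import numpy as np

# Objective 2 statistics: absolute mean residual (ARM) between measured
# and simulated heads, and the standard deviation of the residuals.
measured  = np.array([6012.3, 6008.1, 5999.7, 6021.4])  # hypothetical heads (ft)
simulated = np.array([6013.0, 6007.2, 6001.1, 6020.8])  # hypothetical heads (ft)

residuals = simulated - measured
arm = np.abs(residuals).mean()  # should be close to zero (0.22 ft in the study)
std = residuals.std()           # should be minimal (4.5 ft in the study)
print(f"ARM = {arm:.2f} ft, standard deviation = {std:.2f} ft")
```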

  5. Also need to identify calibration parameters and their reasonable ranges.

  6. Calibration and prediction statistics by group (ARM = absolute mean residual):

     Group   Calib. ARM h   Calib. ARM ET (×10^7)   Pred. ARM h (targets)   Pred. ARM h (pumping wells)
     1       0.92           1.38                    1.60                    4.16
     2       0.73           1.11                    1.99                    3.03
     3       0.69           0.51                    0.95                    1.76
     4       1.34           1.27                    1.46                    2.57
     5       1.56           0.89                    2.79                    1.43
     6       1.29           0.16                    2.58                    2.92

  7. Calibration to Fluxes. When the recharge rate (R) is a calibration parameter, calibrating to fluxes can help in estimating K and/or R.

  8. In this example, flux information helps calibrate K. By Darcy's law, q = KI, where I is the hydraulic gradient between heads H1 and H2; with q known from flux measurements, K can be back-calculated. [Figure: flow between heads H1 and H2 with K unknown.]
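
A minimal sketch of this back-calculation using Darcy's law, q = K·I; the heads, spacing, and flux below are hypothetical:

```python
# Back-calculate K from a measured flux and the hydraulic gradient
# between two observed heads (Darcy's law: q = K * I).
h1, h2 = 104.6, 101.2  # hypothetical heads H1, H2 (ft)
L = 5280.0             # hypothetical distance between the head observations (ft)
q = 0.013              # hypothetical measured specific discharge (ft/day)

I = (h1 - h2) / L      # hydraulic gradient (dimensionless)
K = q / I              # estimated hydraulic conductivity (ft/day)
print(f"I = {I:.5f}, K = {K:.1f} ft/day")
```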

  9. In this example, discharge information helps calibrate R. [Figure: model domain with the recharge rate R unknown.]

  10. In our example, total recharge is known/assumed to be 7.14 × 10^8 ft³/year, and discharge = recharge: all water discharges to the playa. Calibration to ET merely fine-tunes the discharge rates within the playa area.
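
A minimal sketch of the corresponding water-balance check; the per-cell ET values below are hypothetical and chosen to sum to the assumed recharge:

```python
# Water-balance check: with discharge = recharge and all discharge
# through the playa, simulated playa ET should sum to the assumed recharge.
TOTAL_RECHARGE = 7.14e8  # ft^3/yr (known/assumed, from the slide)

playa_et = [2.10e8, 1.90e8, 1.60e8, 1.54e8]  # hypothetical per-cell ET (ft^3/yr)

total_et = sum(playa_et)
imbalance = total_et - TOTAL_RECHARGE
print(f"total ET = {total_et:.3e} ft^3/yr, "
      f"imbalance = {imbalance:+.2e} ft^3/yr "
      f"({100 * imbalance / TOTAL_RECHARGE:+.3f}% of recharge)")
```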

  11. Calibration and prediction statistics by group, repeated from slide 6:

     Group   Calib. ARM h   Calib. ARM ET (×10^7)   Pred. ARM h (targets)   Pred. ARM h (pumping wells)
     1       0.92           1.38                    1.60                    4.16
     2       0.73           1.11                    1.99                    3.03
     3       0.69           0.51                    0.95                    1.76
     4       1.34           1.27                    1.46                    2.57
     5       1.56           0.89                    2.79                    1.43
     6       1.29           0.16                    2.58                    2.92

  12. [Figure: includes results from 2000, 2001, 2003.]

  13. [Figure: includes results from 2000, 2001, 2003.]

  14. Particle Tracking: predicted values and discharge locations for particles P1–P7, by calibration group (the "Truth" row gives the true values):

     Group   P1           P2           P3              P4           P5            P6           P7
     1       2320 PW1     3970 PW2     2310 playa      1920 PW4     1500 PW4      4810 PW2     684 PW2
     2       74,000 PW2   39,000 PW2   393,000 playa   3.93E6 PW4   252 PW4       1084 playa   1576 playa
     3       1.21E6 PW1   2.15E6 PW2   3.90E6 playa    1110 PW4     1.58E6 playa  1860 playa   893 playa
     4       1200 PW1     1900 PW2     6.7E6 PW3       290 PW4      2800 PW5      760 PW1      1200 PW2
     5       1295 PW1     3160 PW2     503 playa       986 PW4      605 PW4       316 PW1      3100 PW2
     6       3100 PW1     982 PW1      4.9E5 playa     603 PW4      1450 PW4      2000 PW1     1380 PW2
     Truth   802 PW1      1913 playa   620 playa       310 PW4      1933 PW5      690 playa    2009 PW2

  15. Observations
     • Predicted ARM > calibrated ARM.
     • Predicted ARM at pumping wells > predicted ARM at nodes with targets.
     • Flow predictions are more robust (consistent among different calibrated models) than transport (particle tracking) predictions.

  16. Conclusions
     • Calibrations are non-unique.
     • A good calibration (even if ARM = 0) does not ensure that the model will make good predictions.
     • You can never have enough field data.
     • Modelers need to maintain a healthy skepticism about their results.
     • An uncertainty analysis is needed to accompany calibration results and predictions.

  17. Uncertainty in the Calibration. Involves uncertainty in:
     • Targets
     • Parameter values
     • The conceptual model, including boundary conditions, zonation, geometry, etc.

  18. Ways to analyze uncertainty in the calibration:
     • Sensitivity analysis (sketched below)
     • Use of an inverse model (automated calibration) to quantify uncertainties and optimize the calibration
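
A minimal sketch of a one-at-a-time sensitivity analysis; run_model is a hypothetical stand-in for a forward model run (e.g., one MODFLOW simulation), and all parameter values are illustrative:

```python
# One-at-a-time sensitivity analysis: perturb each calibration parameter
# by +/-10% and record the change in a simulated head of interest.

def run_model(K, R):
    """Hypothetical stand-in for a forward model run: returns a simulated
    head (ft) for conductivity K (ft/day) and recharge R (ft/yr).
    The response below is a toy expression, not a real flow solution."""
    return 6000.0 + 50.0 * R / K

base = {"K": 20.0, "R": 0.5}  # illustrative base-case parameter values
h_base = run_model(**base)

for name in base:
    for factor in (0.9, 1.1):
        trial = dict(base, **{name: base[name] * factor})
        dh = run_model(**trial) - h_base
        print(f"{name} x {factor:.1f}: head change {dh:+.4f} ft")
```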

  19. Uncertainty in the Prediction
     • Reflects uncertainty in the calibration.
     • Involves uncertainty in how parameter values (e.g., recharge) will vary in the future.

  20. Ways to quantify uncertainty in the prediction:
     • Sensitivity analysis
     • Stochastic simulation

  21. MADE site – Feehley and Zheng, 2000, WRR 36(9).

  22. A Monte Carlo analysis considers 100 or more realizations.
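
A minimal sketch of such a Monte Carlo loop, assuming conductivity is sampled from a lognormal distribution and run_model stands in for one forward model run per realization (both are assumptions, not from the slide):

```python
import numpy as np

rng = np.random.default_rng(0)

def run_model(K):
    """Hypothetical stand-in for one forward model run: returns a
    predicted head (ft) for conductivity K (ft/day)."""
    return 6000.0 + 25.0 / K

# Sample K once per realization, run the model for each sample, and
# summarize the spread of the resulting prediction.
n = 100  # "100 or more realizations"
K_samples = rng.lognormal(mean=np.log(20.0), sigma=0.5, size=n)
heads = np.array([run_model(K) for K in K_samples])

print(f"predicted head: mean {heads.mean():.2f} ft, std {heads.std():.2f} ft, "
      f"95% interval [{np.percentile(heads, 2.5):.2f}, "
      f"{np.percentile(heads, 97.5):.2f}] ft")
```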

  23. Stochastic modeling option in GW Vistas

  24. Ways to quantify uncertainty in the prediction:
     • Sensitivity analysis
     • Scenario analysis
     • Stochastic simulation

  25. Model Validation. How do we “validate” a model so that we have confidence that it will make accurate predictions?

  26. Modeling Chronology
     1960s: Flow models are great!
     1970s: Contaminant transport models are great!
     1975:  What about uncertainty of flow models?
     1980s: Contaminant transport models don't work (because of failure to account for heterogeneity).
     1990s: Are models reliable? Concerns over reliability of predictions arose from efforts to model a geologic repository for high-level radioactive waste.

  27. “The objective of model validation is to determine how well the mathematical representation of the processes describes the actual system behavior in terms of the degree of correlation between model calculations and actual measured data” (NRC, 1990)

  28. What constitutes validation? (code vs. model)
     NRC study (1990): model validation is not possible.
     Oreskes et al. (1994), paper in Science:
     • Calibration = forced empirical adequacy
     • Verification = assertion of truth (possible in a closed system, e.g., testing of codes)
     • Validation = establishment of legitimacy (does not contain obvious errors); confirmation; confidence building

  29. How to build confidence in a model:
     • Calibration (history matching): steady-state calibration(s) and transient calibration
     • “Verification”: requires an independent set of field data
     • Post-audit: requires waiting for the prediction to occur
     • Use of models as interactive management tools

  30. HAPPY MODELING!

  31. Have a good summer!
