UFP workshop 25/26 August 2008 Utrecht
Expert Judgement and uncertainty

Dr. Jeroen P. van der Sluijs, Copernicus Institute for Sustainable Development and Innovation, Utrecht University

Presentation Transcript


  1. UFP workshop 25/26 August 2008 Utrecht Expert Judgement and uncertainty Dr. Jeroen P. van der Sluijs Copernicus Institute for Sustainable Development and Innovation, Utrecht University & Centre d'Economie et d'Ethique pour l'Environnement et le Développement, Université de Versailles Saint-Quentin-en-Yvelines, France

  2. Expert Elicitation Workshop • Day 1: causal pathways, mechanisms connecting UFP exposure to health effects • Judgement on level of evidence • Uncertainty • Day 2: quantification of Exposure Response Functions • Elicitation of Probability Density Functions (a sketch of this step follows below) • Uncertainty
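A minimal sketch of what the elicitation of Probability Density Functions can amount to in practice: fitting a lognormal PDF to an expert's elicited median and 95th percentile for an exposure-response coefficient. The lognormal choice, the function name and the numbers are illustrative assumptions, not part of the workshop protocol.

```python
# Fit a lognormal PDF to elicited quantiles (illustrative sketch only;
# the distribution family and the numbers are assumptions, not the protocol's).
import numpy as np
from scipy import stats

def lognormal_from_quantiles(median, p95):
    """Return a lognormal distribution matching an elicited median and 95th percentile."""
    mu = np.log(median)                 # scale parameter of the lognormal
    z95 = stats.norm.ppf(0.95)          # standard-normal 95th percentile, ~1.645
    sigma = (np.log(p95) - mu) / z95    # spread implied by the elicited 95th percentile
    return stats.lognorm(s=sigma, scale=np.exp(mu))

# Hypothetical elicited values: median relative risk 1.05, 95th percentile 1.25.
dist = lognormal_from_quantiles(1.05, 1.25)
print(dist.ppf([0.05, 0.50, 0.95]))     # feed back the implied 5/50/95 quantiles
```

Feeding the implied quantiles back to the expert is a common consistency check: if the implied 5th percentile looks wrong to them, the fitted distribution is revised.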

  3. Three framings of uncertainty
  'Deficit view' • Uncertainty is provisional • Reduce uncertainty, make ever more complex models • Tools: quantification, Monte Carlo, Bayesian belief networks (a minimal Monte Carlo sketch follows this slide)
  'Evidence evaluation view' • Comparative evaluation of research results • Tools: scientific consensus building; multidisciplinary expert panels • Focus on robust findings
  'Complex systems view / post-normal view' • Uncertainty is intrinsic to complex systems • Uncertainty can be a result of the production of knowledge • Acknowledge that not all uncertainties can be quantified • Openly deal with deeper dimensions of uncertainty (problem framing, indeterminacy, ignorance, assumptions, value loadings, institutional dimensions) • Tools: Knowledge Quality Assessment • Deliberative, negotiated management of risk
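To make the 'quantification / Monte Carlo' toolkit named under the deficit view concrete, here is a minimal sketch of Monte Carlo propagation of parameter uncertainty through a toy exposure-response model; all distributions and values are hypothetical.

```python
# Monte Carlo propagation of input uncertainty through a toy model.
# All distributions and parameter values are hypothetical illustrations.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

exposure = rng.normal(loc=10.0, scale=2.0, size=n)           # uncertain exposure level
slope = rng.lognormal(mean=np.log(0.01), sigma=0.5, size=n)  # uncertain response slope

effect = slope * exposure                                    # toy response model
print("5/50/95 percentiles:", np.percentile(effect, [5, 50, 95]))
```

Note that such a calculation only propagates the uncertainties that were quantified; the deeper dimensions listed under the post-normal view remain outside it.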

  4. Dimensions of uncertainty • Technical (inexactness) • Methodological (unreliability) • Epistemological (ignorance) • Societal (limited social robustness)

  5. Locations of uncertainties: • Context: system boundary, definitional vagueness • Expert judgement: estimates, variability in (implicit) assumptions • Model: causal structure, technical model, model parameters, model inputs • Data: measurements, monitoring data, survey data • Outputs: indicators, statements

  6. Expert Elicitation • To systematically make explicit and usable the unwritten knowledge in the heads of experts, including insight into the limitations, strengths and weaknesses of that knowledge

  7. Rating scale for judgements on causality A level of confidence characterizes, on the basis of expert judgement, the uncertainty as to whether a causal pathway is correct (IPCC, 2005)
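For orientation, such a confidence scale can be kept as a simple lookup table. The level names and indicative odds below follow my recollection of the IPCC (2005) guidance note, not a quotation from it, and should be checked against the protocol handout.

```python
# Confidence scale as a lookup table. The level names and indicative odds
# are my recollection of the IPCC (2005) guidance note, not quoted from it.
CONFIDENCE_SCALE = {
    "very high": "at least 9 in 10 chance of being correct",
    "high":      "about 8 in 10 chance",
    "medium":    "about 5 in 10 chance",
    "low":       "about 2 in 10 chance",
    "very low":  "less than 1 in 10 chance",
}

for level, odds in CONFIDENCE_SCALE.items():
    print(f"{level:>9}: {odds}")
```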

  8. Procedure • First round: individual expert judgements • Document key studies and main arguments underpinning your rating • Show (variability of) individual results to the group (a sketch of this feedback step follows below) • Group discussion focussing on reasons for disagreement / lines of argument • Opportunity to revise initial individual judgements
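A minimal sketch of the feedback step: tallying hypothetical first-round confidence ratings and showing their spread back to the group. The expert names, ratings and the five-level scale are invented for illustration.

```python
# Show the spread of first-round ratings back to the group.
# Expert names, ratings and the five-level scale are invented examples.
from collections import Counter

ratings = {"expert_A": 4, "expert_B": 2, "expert_C": 4, "expert_D": 3}

counts = Counter(ratings.values())
for level in range(1, 6):                       # 1 = very low ... 5 = very high
    print(f"confidence level {level}: {'#' * counts.get(level, 0)}")
```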

  9. Pitfalls in expert elicitation • Overconfidence • Representativeness • Anchoring • Bounded rationality • Availability / lamp posting • Implicit assumptions • Motivational bias • Possibility of strategic answers • Interests with regard to outcome of analysis

  10. Overconfidence Experts tend to over-estimate their ability to make quantitative judgements. This is difficult to guard against, but a general awareness of the tendency can be important (a small calibration sketch follows below).
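A small sketch of how overconfidence becomes visible in calibration data: if an expert's 90% intervals were well calibrated, roughly 9 out of 10 true values would fall inside them. The intervals and true values below are invented for illustration.

```python
# Calibration check for elicited 90% intervals (invented example data).
# A well-calibrated expert would capture the truth ~90% of the time.
intervals = [(2, 8, 9), (10, 20, 15), (0.1, 0.5, 0.7), (30, 40, 33),
             (1, 3, 4), (5, 6, 5.5), (100, 200, 260), (0.2, 0.9, 0.95),
             (7, 9, 8), (40, 60, 75)]           # (low, high, true value)

hits = sum(low <= truth <= high for low, high, truth in intervals)
print(f"hit rate: {hits / len(intervals):.0%} (well calibrated would be ~90%)")
```

A hit rate well below 90%, as in this invented example, is the typical signature of overconfident interval estimates.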

  11. Representativeness Tendency to place too much confidence in a single piece of (familiar) information that is considered reliable (reliability does not imply representativeness), ignoring a larger body of more generalized information or other sources of information

  12. Anchoring Assessments are often unduly weighted toward the conventional value, the first value given, or the findings of previous assessments.

  13. Bounded rationality Everyone has their own “blinkers”, i.e. a limited view on reality. Try to involve as many different viewpoints / disciplines as possible. If you suspect that another scholar in your field would disagree with your estimate, please mention that.

  14. Availability bias • The tendency to give too much weight to readily available data or recent experience (which may not be representative of the required data) • Lamp-posting

  15. Implicit assumptions • A subject's responses are typically conditional on various unstated assumptions. • The effect of these assumptions is often to constrain the degree of uncertainty reflected in the resulting estimate of a quantity. • Stating assumptions explicitly can help reflect more of a subject's total uncertainty.

  16. Motivational bias • By their judgements, experts can influence the outcome of a research project. • This could lead to strategic answers that promote an interest instead of reporting what the expert truly believes.

  17. (Remaining) Program day 1 • Explanation of the first questions of the protocol • Initial rating of confidence in overall causal links between UFP and health end-points • Coffee break • Feedback of individual ratings • Group discussion • Final rating • Lunch • Same procedure for individual causal pathways
