
Introduction to assessment performance

This article provides an introduction to assessment performance, covering concepts, perspectives, and examples. It discusses the importance of evaluation, quality assurance/quality control, uncertainty, and model performance. The text emphasizes the properties of good assessment and the link between assessment outputs and outcomes.



Presentation Transcript


  1. Mikko Pohjola, THL Introduction to assessment performance

  2. Contents • Concepts & setting • Common perspectives (& examples) • Quality assurance/quality control • Uncertainty • Model performance • Properties of good assessment • Summary & discussion

  3. Setting • Decision making under uncertainty • Input information • Assessment information • News, gossip, hearsay • Processing (decision making) • Cognition • Communication • Output • Decision -> Action -> Outcome

  4. Setting • Assessment performance is about • Information • …in use • Making of… • How good is it?

  5. Concepts • Some basic concepts: • Performance = goodness! • Assessment, Management • Model • Process (making/using), Product • Output, Outcome • Assessor, Decision/Policy maker, Stakeholder • Participant, User

  6. Concepts • Why evaluation of assessment performance? • Efficient use of resources? • Value of work done? • Importance/meaning of information? • Implications of information? • Actual impacts of information? • … • …because funder, customer, user, boss, peer, stakeholder etc. wants/needs to know!

  7. Roles and interests

  8. General RA/RM framework • Process, product, use

  9. Common perspectives & examples • Quality assurance/quality control • Focus on assessment process • An “engineering” perspective • Uncertainty • Focus on assessment output • A scientist's perspective??? • Model performance • Focus on modelling and model • Combines QA/QC and uncertainty perspectives • A modeller's perspective

  10. Quality assurance/quality control • Principle: • Good process guarantees good outputs/outcomes! • Question: • How should an assessment process be conducted? • Examples: • Ten steps by Jakeman et al. (2006) • IDEA framework (Briggs, 2008) • (Over)appreciation of randomized controlled trials (RCTs)

  11. Ten iterative steps in development and evaluation of environmental models Jakeman et al.: Ten iterative steps in development and evaluation of environmental models. Environmental Modelling & Software, Issue 5, May 2006, Pages 602-614.

  12. IDEA framework (INTARESE) Briggs: A framework for integrated environmental health impact assessment of systemic risks. Environmental Health 2008, 7:61.

  13. Uncertainty • Principle: • Performance is an intrinsic property of an information product! • Question: • How good is the answer provided by the assessment?

  14. Uncertainty • Examples: • Statistical uncertainty analysis • Mean, variance, confidence limits, distributions, … • Cf. D. Lindley: Philosophy of Statistics, 2000 • Sources of uncertainty • E.g. model, parameter & scenario uncertainty (as applied e.g. by the U.S. EPA) • Extensive approaches • E.g. inclusion of qualitative aspects, sources of uncertainty as in NUSAP (www.nusap.net)
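
A rough illustration of the statistical uncertainty perspective listed above: a Monte Carlo propagation of assumed parameter distributions into an output distribution, summarised with a mean, variance, and a 95% interval. This is a minimal sketch; the parameter names, distributions, and values are illustrative placeholders, not figures from any real assessment.

```python
# Minimal sketch of statistical uncertainty analysis for an assessment output.
# The exposure and potency distributions below are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(seed=1)
n = 10_000  # number of Monte Carlo samples

# Assumed parameter uncertainty, expressed as probability distributions
exposure = rng.lognormal(mean=0.0, sigma=0.5, size=n)  # e.g. intake, arbitrary units
potency = rng.normal(loc=0.02, scale=0.005, size=n)    # e.g. response per unit intake

risk = exposure * potency  # propagate parameter uncertainty into the output of interest

print(f"mean:         {risk.mean():.4f}")
print(f"variance:     {risk.var():.6f}")
print(f"95% interval: {np.percentile(risk, 2.5):.4f} .. {np.percentile(risk, 97.5):.4f}")
```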

  15. NUSAP • N: numeral • U: unit • S: spread • A: assessment (qualitative judgment) • P: pedigree (historical path leading to result)
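
As a sketch of how a NUSAP-qualified result might be recorded in practice, the snippet below stores the five NUSAP dimensions alongside a single output number. The field layout and example values are hypothetical and only illustrate the notational scheme described above.

```python
# Minimal sketch: a NUSAP-qualified quantity as a simple data structure.
# Example values are hypothetical, not taken from any actual assessment.
from dataclasses import dataclass, field

@dataclass
class NusapEntry:
    numeral: float        # N: the reported number
    unit: str             # U: its unit
    spread: str           # S: quantitative spread, e.g. a confidence interval
    assessment: str       # A: qualitative judgment of reliability
    pedigree: list[str] = field(default_factory=list)  # P: how the number came about

result = NusapEntry(
    numeral=120.0,
    unit="cases per year",
    spread="95% CI: 80 .. 170",
    assessment="fair; based on a single exposure study",
    pedigree=["measured exposure data", "dose-response from literature", "expert judgment"],
)
print(result)
```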

  16. NUSAP - pedigree Jeroen van der Sluijs: NUSAP - some examples. Presentation. Available: http://tinyurl.com/5uwln2r

  17. Model performance • Principle: • The model is the essence of the assessment! • Question: • How good is the model? • Examples: • Verification, validation (reliability, usability, …) • Outcome-oriented approach by Matthews et al. (2011)
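
To make the model-performance perspective concrete, here is a minimal sketch of quantitative model validation: comparing predictions against observations with a few standard agreement metrics. The observed and predicted arrays are hypothetical, not output of any particular model.

```python
# Minimal sketch of model validation: simple agreement metrics between
# hypothetical observations and hypothetical model predictions.
import numpy as np

observed  = np.array([1.2, 2.4, 3.1, 4.0, 5.3])  # illustrative measurements
predicted = np.array([1.0, 2.6, 2.9, 4.4, 5.0])  # illustrative model output

bias = np.mean(predicted - observed)                  # systematic over/underestimation
rmse = np.sqrt(np.mean((predicted - observed) ** 2))  # overall error magnitude
r = np.corrcoef(observed, predicted)[0, 1]            # linear agreement

print(f"bias: {bias:+.3f}, RMSE: {rmse:.3f}, r: {r:.3f}")
```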

  18. Outcome-oriented modelling approach Matthews et al.: Raising the bar? – The challenges of evaluating the outcomes of environmental modelling and software. Environmental Modelling & Software, March 2011, Pages 247-257.

  19. Summary of common perspectives • Assessment process and product addressed in many ways • Use of results mostly not considered • The link between outputs and outcomes (cf. Matthews et al. 2011) • Evaluation often a separate process • Expert processes of making assessments and using their results • Expert processes of evaluating performance • Alternative perspectives?

  20. Properties of good assessment

  21. Properties of good assessment • Ex post (after assessment) evaluation • Ex ante (before/during assessment) evaluation • Guidance of design and execution • Links process and output with use • Thereby also linking them to outcomes

  22. Example: what makes a good hammer?

  23. Example: what makes a good hammer? • How is the hammer made? By whom? • What properties does the hammer have? • What do you want to do with the hammer? • How does the hammer help you do it?

  24. Summary • Consideration of (intended) use is essential • Consideration of process and product in light of use • Consider the instrumental value of information • Cf. absolute value (a common science view) • Cf. ad hoc solutions (a common practice view) • Contextuality, situatedness, practicality, … • In policy support, information is a tool (a means to an end) • A model is a tool for producing information • How does this relate to the previous lectures about DA and the DA study plan exercise?

  25. Discussion example: swine flu vaccination • Because of the urgency, the swine flu vaccine was procured in Finland without thorough testing. • When narcolepsy cases were identified, the decision made without testing was seen as a major mistake. • Was it a mistake? • How should we evaluate the situation to find an answer? • How did the decision-maker assess the situation? • How should she have assessed the situation?

  26. Swine flu example: issues in performance? • What are the critical issues in the assessment performance? Possibilities include e.g. • The assessment truthfully estimates the total health impact of swine flu. • The assessment truthfully estimates the health impact of a vaccination campaign. • Only tested vaccines are assessed. • The assessment does not underestimate potential side effects of the vaccine, whether tested or not. • Something else, what?

  27. Swine flu example: follow-up as a part of assessment performance? • What methods can identify whether something unexpected starts to happen after the decision? • Should these already be addressed in the assessment before the decision? • How can this be done? • Does this improve the assessment performance?
