‘Lessons learned in designing effective assessments’

Presentation Transcript


  1. ‘Lessons learned in designing effective assessments’ European Environment Agency (EEA) Anita Künitzer http://www.eea.eu.int

  2. Domingo Jiménez-Beltrán, Executive Director, EEA. The EEA's Mission ... is to deliver timely, targeted, relevant and reliable information to policy-makers and the public for the development and implementation of sound environmental policies in the European Union and other EEA member countries.

  3. EEA member countries: the 15 EU member states plus Iceland, Liechtenstein and Norway; EEA candidate countries; Stability Pact countries; TACIS countries.

  4. Production process of an assessment • Identification of users (politicians, scientists, school children, ...) • Policy issues to be addressed: what should the assessment achieve? • Process of the assessment • Launch: at which policy event?

  5. Designing effective assessments: the role of participation, science and governance, and focus. Workshop co-organised by the European Environment Agency and the Global Environmental Assessment Project, 2001.

  6. Integrated Environmental Assessment An interdisciplinary process of structuring knowledge elements from various scientific disciplines in such a manner that all relevant aspects of a complex societal problem are considered in their mutual coherence for the benefit of (sustainable) decision-making.

  7. The Policy Cycle: policy preparation → policy formulation → policy execution → policy evaluation, and back to policy preparation.

  8. Frameworks for a scientific assessment process to inform policy makers • Example: IPCC (Intergovernmental Panel on Climate Change) • Involves only expert scientists in defined disciplines, no political stakeholders • Produces lengthy reports • Example: CLRTAP (Convention on Long-Range Transboundary Air Pollution) • Less clear science-policy distinction • Few formal reports

  9. Effective assessments • What is effective? • Cost-effectiveness • Improvements in the natural environment • Fulfilling political objectives • Attributes of effective assessments: • Credibility • Salience • Legitimacy

  10. Credibility • Lack of credibility: • Assessment based on shoddy methods • Assessment ignores important empirical evidence • Assessment draws inappropriate conclusions from data • Gaining credibility: • Through the process by which the information is created (example: data obtained by good laboratory practice) • Through the credentials or other characteristics of the producers of the assessment (example: assessment done by well-known, highly regarded scientists)

  11. Salience or relevance • Lack of salience: • The produced report is never referred to and never heard from again • The assessment addresses questions whose answers do not interest the user • Gaining salience: • The assessment addresses the particular concerns of a user • The user is aware of the assessment • The user considers the assessment relevant to current policy

  12. Legitimacy: a measure of the political acceptability or perceived fairness of an assessment to a user • Lack of legitimacy: • In ‘global’ assessments, inputs from less powerful countries are not included or their interests are ignored • Gaining legitimacy: • Users' and participants' interests, concerns, views and perspectives have been taken into account • The assessment process has been a fair one

  13. Assessment design (1) • Historical context of the assessment • Characteristics of the issue area • Position of the issue on the political agenda • Characteristics of the intended user • Interest in the issue and/or assessment • User capacity to understand the results • User openness to different sources of advice

  14. Assessment design (2) • Assessment characteristics • Participation: who is involved in the assessment process? • Science and governance: how are assessments conducted with respect to the interactions between scientific experts and policy makers? • Focus: how broadly (multidisciplinary) or narrowly (technically) focussed should the assessment be? How consensus-based should it be?

  15. Conceptual framework for considering effective assessments: ultimate determinants work through proximate pathways to produce assessment effectiveness • Ultimate determinants: historical context (issue characteristics, linkage, attention cycle); user characteristics (concern, capacity, openness); assessment characteristics (science/governance, participation, focus) • Proximate pathways: salience, credibility, legitimacy • Outcome: effectiveness

  16. Participation: critical issues • The capacity of partners, clients and/or users to participate in the assessment (travel costs, administrative capacity, time for the assessment itself). • Are scientists participating in their individual capacity (good for scientific credibility) or are they accountable to governments? • Encourage participation of the stakeholders for whom the assessment is designed (NGOs, the policy-making community, country representatives) to make them interested in the final report. • The process of participation might be more important than the content: inclusion as an author or attendance at a meeting increases the legitimacy of an assessment. • A broad review, by several international organisations, of an assessment done by a few scientists can increase its legitimacy.

  17. Science and governance: critical issues • Assessments in issue areas that are scientifically controversial should be undertaken by institutions accountable to the scientific community, to minimise credibility concerns. • While scientists value the credibility of assessments, politicians value salience; assessments in more mature scientific areas might therefore be better undertaken by organisations more focussed on policy needs. • Including policy recommendations in a scientific assessment can be dangerous; here, assessments by ‘boundary organisations’ that are accountable to both science and policy may be the solution.

  18. Focus: critical issues • Successful assessments avoid addressing controversial issues. • Broadly focussed assessments include more relevant factors, increase the audience and might be more relevant to decision makers. • Most assessments to date have been too simple, excluding too many factors and causal chains. • Assessments should be kept comprehensive despite all the interactions; periodic separate thematic assessments could be produced instead of one big comprehensive assessment.

  19. Use MDIAR to analyse the information provision process. MDIAR stands for: M: Monitoring → D: Data → I: Information → A: Assessment → R: Reporting.
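
The MDIAR chain can be read as a simple processing pipeline in which each stage adds interpretation to the output of the previous one. A minimal sketch follows; every stage function and all numbers are hypothetical placeholders for illustration, not EEA tooling:

```python
# Sketch of the MDIAR information-provision chain.
# All stage functions and numbers are hypothetical placeholders.

def monitoring() -> list[float]:
    """M: collect raw measurements, e.g. from monitoring stations."""
    return [41.0, 39.5, 44.2, 38.8]  # invented sample values

def data(measurements: list[float]) -> dict:
    """D: validate and aggregate the raw measurements into a data set."""
    return {"n": len(measurements), "mean": sum(measurements) / len(measurements)}

def information(dataset: dict) -> str:
    """I: turn the data into policy-relevant information (an indicator)."""
    return f"mean concentration {dataset['mean']:.1f} ug/m3 over {dataset['n']} samples"

def assessment(info: str) -> str:
    """A: interpret the information against a policy question."""
    return info + " - above the illustrative 40 ug/m3 limit value"

def reporting(assessed: str) -> str:
    """R: package the assessment for the intended user."""
    return "REPORT: " + assessed

# Each stage consumes the previous stage's output: M -> D -> I -> A -> R.
print(reporting(assessment(information(data(monitoring())))))
```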

  20. The DPSIR framework • Drivers (e.g. transport and industry) • Pressures (e.g. polluting emissions) • State (e.g. air, water, soil quality) • Impact (e.g. ill health, biodiversity loss, economic damage) • Responses (e.g. clean production, public transport, regulations, taxes, information, etc.)
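
As a sketch only, the DPSIR categories and their causal ordering can be captured in a plain data structure; the example entries are the ones given on the slide, while the code itself is just an illustration:

```python
# Sketch of the DPSIR framework as an ordered causal chain,
# populated with the examples given on the slide.

DPSIR = {
    "Drivers":   ["transport", "industry"],
    "Pressures": ["polluting emissions"],
    "State":     ["air quality", "water quality", "soil quality"],
    "Impact":    ["ill health", "biodiversity loss", "economic damage"],
    "Responses": ["clean production", "public transport",
                  "regulations", "taxes", "information"],
}

# Drivers -> Pressures -> State -> Impact -> Responses; in the full
# framework, Responses feed back on all the other elements.
stages = list(DPSIR)
for upstream, downstream in zip(stages, stages[1:]):
    print(f"{upstream} ({', '.join(DPSIR[upstream])}) -> {downstream}")
```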

  21. A performance indicator: emissions of ozone precursors in the EU15, shown against the reduction target.
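
A performance indicator of this kind is usually read by comparing the observed series against a linear path from a base year to the target; the sketch below shows that arithmetic with invented numbers that are not the actual EU15 data:

```python
# Distance-to-target sketch for an emissions indicator.
# All numbers are invented for illustration; they are NOT the EU15 series.

BASE_YEAR, TARGET_YEAR = 1990, 2010
BASE_VALUE = 100.0    # emissions index in the base year
TARGET_VALUE = 70.0   # illustrative target: -30 % by the target year

def target_path(year: int) -> float:
    """Linear path from the base-year value to the target-year value."""
    frac = (year - BASE_YEAR) / (TARGET_YEAR - BASE_YEAR)
    return BASE_VALUE + frac * (TARGET_VALUE - BASE_VALUE)

observed = {1995: 92.0, 2000: 88.0}  # hypothetical observations
for year, value in observed.items():
    gap = value - target_path(year)
    status = "on track" if gap <= 0 else "off track"
    print(f"{year}: observed {value:.0f} vs. path {target_path(year):.1f} "
          f"-> {status} ({gap:+.1f})")
```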

  22. The link between indicators and the policy process – distinguishing the differences and improving relevance. Within the universe of DPSIR descriptive system indicators lies the universe of policy-relevant indicators: (1) indicators linked to policy intentions or public expectations; (2) indicators linked to stated objectives; (3) indicators linked to quantitative targets.
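
One way to make the three-way distinction operational is a simple classification of indicators by their policy linkage; the category labels follow the slide, while the enum and the example indicators below are purely hypothetical:

```python
from enum import Enum

class PolicyLink(Enum):
    """Degree of policy linkage of an indicator (labels from the slide)."""
    POLICY_INTENTIONS = "linked to policy intentions or public expectations"
    STATED_OBJECTIVES = "linked to stated objectives"
    QUANTITATIVE_TARGETS = "linked to quantitative targets"

# Hypothetical tagging of a few indicators (examples are invented):
indicators = {
    "public concern about air quality": PolicyLink.POLICY_INTENTIONS,
    "share of renewable energy": PolicyLink.STATED_OBJECTIVES,
    "emissions of ozone precursors vs. EU15 target": PolicyLink.QUANTITATIVE_TARGETS,
}
for name, link in indicators.items():
    print(f"{name}: {link.value}")
```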

  23. What are scenarios? Scenarios are archetypal descriptions of alternative images of the future, created from mental maps or models that reflect different perspectives on past, present and future developments.

  24. Measuring is Not Knowing: The Marine Environment and the Precautionary Principle. ‘The enormous number of papers in the marine environment means that huge amounts of data are available, but …we have reached a sort of plateau in …the understanding of what the information is telling us …. We… seem not to be able to do very much about it or with it. This is what led to the precautionary principle, after all – we do not know whether, in our studied ecosystem, a loss of diversity would matter, and it might’. (Marine Pollution Bulletin, Vol. 34, No. 9, pp. 680–681, 1997)

  25. Precautionary principle in assessments • Levels of proof: assessments for public policy-making need lower levels of proof than normal good science • Multidisciplinary approaches: improve the quality of an assessment by considering aspects of the problem from different perspectives • Early warnings: successful prevention of environmental impacts and the associated costs requires early warnings

  26. Levels of proof - some illustrations • Beyond all reasonable doubt • Reasonable certainty • Balance of probabilities/evidence • Strong possibility • Scientific suspicion of risk • Negligible/insignificant

  27. Organisation through Interest Groups, on each of the 33 servers across Europe.

  28. CIRCLE Library Service

  29. Data flows in EIONET: from the national layer through the European layer into the EEA warehouse, supporting data access and visualisation, the Reports Information Retrieval System, and the Reference Centre Concept (E2RC).
