
Joint Meeting of DG REGIO Evaluation Network and ESF Evaluation Partnership Gdansk 8 July 2011

This joint meeting discusses the theoretical foundations and current state of international thinking on evaluation use and usability. Topics include critical distinctions in evaluation use, design considerations, and factors influencing use. The aim is to emphasize the benefits of evaluation as a resource for policy learning and development.



Presentation Transcript


  1. Joint Meeting of DG REGIO Evaluation Network and ESF Evaluation Partnership, Gdansk, 8 July 2011. Evaluation use and usability: theoretical foundations and current state of international thinking. Murray Saunders, IOCE, Lancaster University

  2. Mapping the territory • Drawing on international sources (IOCE, World Bank, SE Asia CoE, DG REGIO) • Critical distinctions in evaluation use • The use of evaluation outputs: what counts as use, and what counts as usability • Some difficult questions: designing evaluations, factors influencing use, and questions produced through RUFDATA

  3. Why a re-emphasis on the uses and impact of evaluation as a resource for policy learning and development? The urge to 'sense make' in increasingly complex environments: evaluations tell us what is going on and what works? Social and political imperatives (issues of transparency, resources, legitimacy and equity): evaluations contribute to public debate? Methodological debate, with difficulties and uncertainties in addressing 'end points' (attribution, causality, alignment and design): evaluations provide authoritative evidence for policy and developmental strategy? Evaluations cost time and money: moving away from evaluations as expensive ritual, or as compliance, toward evaluations as 'use objects'?

  4. The claim is that, used effectively, evaluation offers a range of specific benefits or outcomes: • stronger public sector planning • more efficient deployment of resources • improved management and implementation of programmes or other policy interventions • stronger ownership and partnership amongst actors with a stake in programmes • greater understanding of the factors determining the success of programmes • broader scope to assess the value and costs of interventions (The use of evaluation in the management of EU programmes in Poland, Martin Ferry and Karol Olejniczak, Warsaw 2008, Ernst & Young)

  5. Some distinctions of use and mis-use • Instrumental: when decision makers use the evaluation findings to modify the object of the evaluation in some way • Conceptual: when the evaluation findings help program staff understand the program in a new way • Enlightenment: when the evaluation findings add knowledge to the field and thus may be used by anyone, not just those involved with the program or its evaluation • Process use: cognitive, behavioural, program and organizational changes resulting from engagement in the evaluation process and learning to think evaluatively • Persuasive or symbolic: persuading important stakeholders that the program or organization values accountability, or when an evaluator is hired to evaluate a program to legitimize a decision already made prior to the commissioning of the evaluation (Dreolin N. Fleischer and Christina A. Christie, American Journal of Evaluation 2009, 30: 158)

  6. The idea of 'process use': effects of undertaking evaluations. Refers to the unintended or intended effects of the process of carrying out an evaluation: formative evaluation in action (Patton now talks about developmental evaluation, i.e. continuous adaptation) • Foregrounding new issues for managing an intervention • Drawing attention to 'hot spots' or problem areas • Forcing attention on difficult areas • Providing a 'voice' for the powerless • Drawing attention to time-lines • Making participants think about 'audience' and 'users' • Policing role

  7. Turning to the uses and usability of the outputs from an evaluation

  8. What forms do evaluation outputs take? • A report: data and analyses (narrative analysis, statistical analysis, comparisons, modeling); evidence of process, outputs, results, effects, outcomes • Analysis: descriptive (what) • Analysis: diagnostic (how) • Analysis: prescriptive (what should be) • Cases of good practice • Scenarios • Recipients' experience

  9. By their 'use' we are referring to the capacity of outputs to contribute to development (policy, practice, strategic management). This involves enabling: • Changes in policies • Changes in practices • Changes in systems and protocols • Changes in thinking • Changes in culture. In order to have developmental impact, an evaluation output must contribute to decisions about sustainable changes/improvements.

  10. Stages of use of an evaluation output: a use audit for stakeholders • 0. No awareness • 1. Awareness: little concern or understanding of implications • 2. Informational: awareness plus interest in knowing more about implications for policy or practice • 3. Personal: beginning to analyse potential implications for policy and practice and impacts on planning (contribution) • 4. Management: attention on difficulties in the processes and tasks involved in developing new practices/policies on the basis of evaluation outputs • 5. Consequence: attention on the impact on stakeholders of new practices/policies, their relevance, evaluation and implied changes derived from evaluation outputs • 6. Collaboration: co-ordinating and co-operating with others in using new practices/implementing policies • 7. Refocusing: attention now on adaptation, major changes, alternatives to original ideas, creativity

  11. Use as 'engagement': engagement practices, ordered from less to more use • Distributive or dissemination practice: report, executive summary, article • Presentational practice: seminars, presentations, active workshops, embodiments • Interactional practice: working alongside colleagues; analysis of situational enabling and constraining factors for change (with decision makers/users)

  12. Differences in meaning and practice between use and usability • Usability refers to the design of an evaluation: I suggest there are seven design decisions that can critically affect its usability • Use refers to characteristics of the organisational context and its capacity to respond to evaluation outputs • Both dimensions are important in explaining 'high or low use environments'

  13. Towards a framework: what counts as use? "Use refers to the way in which the outputs of an evaluation act as a resource for onward practice, policy or decision making."

  14. This requires 'bridging' or boundary-crossing practices. Evaluation output as a bridging artefact or tool: • Providing examples of interesting practice • Suggesting ways of moving from A to B (theories of change or engagement strategies) • Connecting with existing practices (en-grooved practice and how to un-block it) • Designing evocative resources for change management • Moving to decisions on continuation, funding or new policy • Validity and authenticity of the evidence are foregrounded

  15. Towards a strategy to maximise use • Embed the output in decision-making cycles (clear knowledge of when decisions take place and who makes them) • Clear understanding of organisational memory (how evaluations might accumulate and how this output connects) • Analyse the capacity of an organisation to respond • Systemic processes (feeding into structures that are able to identify and act on implications) • Organisations that are lightly bureaucratised (complex adaptive systems) are better placed to respond to 'tricky' or awkward evaluations • Strongly connect the evaluation to power structures • Evaluations that are congruent: recommendations from the evaluation need to build on what is already in place; avoid suggestions which require total change unless there are resources to back them up

  16. Towards a 'utilisation' framework: what counts as usability? "Usability refers to the way an evaluation design shapes the extent to which its outputs can be used."

  17. Designing evaluations for usability and use: critical questions produced through RUFDATA • Reasons and Purposes [planning, managing, learning, developing, accountability]: potential users know why an evaluation is taking place and have an intention to use it • Uses [providing and learning from examples of good practice, staff development, strategic planning, PR, provision of data for management control, planning and milestones]: rehearsing use environments in real time with real people by identifying a list of specific practices, for example: • Tabling the report at a meeting to assess its implications • Deciding on what those implications might be and acting on them • Doing so within an agreed timeline • Undertaking staff development activities on the basis of the findings • Publicising and disseminating more widely, etc.

  18. Designing evaluations for usability and use: critical questions produced through RUFDATA • Foci [activities, aspects, emphases to be evaluated; should connect to the priority areas for evaluation]: create a need to know with key stakeholders by careful selection of foci (co-construction?) • Data and Evidence [numerical, qualitative, observational, case accounts]: render evidence and data sets in ways that the non-technical stakeholder can 'read' them • Audience [community of practice, commissioners, yourselves]: discriminate between different audiences by style, form and content of output • Timing [coincidence with decision-making cycles, life cycle of projects]: make sure the evaluation output and deliverable deadlines within a proposal coincide with other decision-making cycles • Agency [yourselves, external evaluators, a combination]: involve as wide a group as possible in design

  19. Factors affecting evaluation use • Evaluation process: external/internal, inclusive/hierarchic, qualitative/quantitative, methodology used • Product quality: relevance, context specific, reality reflected, communication/engagement strategy • Evaluative culture and organisational context: findings attached to further funding, value given to evaluation, organisational 'insertion', external influence (pressure/independence) • User attitude: forward looking to improvement, involvement
