
Evaluating Experiences in Transitional Justice and Reconciliation Cape Town, South Africa April 2-4, 2007


Presentation Transcript


  1. Evaluating Experiences in Transitional Justice and Reconciliation Cape Town, South Africa April 2-4, 2007

  2. day 1

  3. Objectives of Workshop • Discuss issues, challenges & opportunities for evaluating projects and programs that promote transitional justice and reconciliation. • Introduce evaluation tools & methodologies that might assist in identifying and tracking outcomes. • Explore interest in further developing evaluation approaches for transitional justice and reconciliation.

  4. Why this workshop, why now? • Stellenbosch conference on empirical research methodologies, Nov. 2002 • Prospects for using evaluation findings as a research tool • Evaluation community: growing methodological orthodoxy; shortcomings of methods for evaluating conflict prevention and post-conflict interventions • IDRC's approach to evaluation mirrors its approach to development research (rigour/validity, action-oriented, focus on ownership and participation, prioritizes capacity building) • 'Evaluative thinking': evaluation as an analytical way of thinking

  5. Review of key concepts: • What do we understand by? • Monitoring • Evaluation • Results • Research

  6. M&E – what’s the difference? Monitoring: • Ongoing, continuous • Internal activity • Responsibility of project staff and management • Continuous feedback to improve the programme & report on performance. Evaluation: • Periodic and time-bound ('episodic assessment') • Can be internal, often external • Responsibility of the evaluator, with staff and management • Periodic feedback.

  7. What are “Results”? For the Evaluator: • Outputs, Outcomes, Impacts, Effects. For the Researcher: • When we talk about “research results”, we actually mean “research findings”.

  8. How do Evaluation and (Applied) Research Differ? Many similarities: • Both rely on social science methods; • Examine multiple facets of a problem, often using multimethod approaches; • Collect and analyse data; • Utilize theory to inform work.

  9. What’s the Difference between Evaluation and (Applied) Research? Distinctions: • Evaluation uses universally accepted standards (Utility, Propriety, Feasibility, Accuracy) • Evaluation always assesses the performance of the person or entity under investigation • Audience: evaluation has a client who wants to know something.

  10. overview of outcome mapping

  11. before we start, be aware... • OM is not a panacea • New vocabulary • OM depends on context, needs and realities

  12. the outcome mapping story

  13. history of outcome mapping • mid-1990s: need to demonstrate results • 1998: met Barry Kibel and Outcome Engineering • methodological collaboration with FRAO & NEPED • 2000: publication of manual in English • presenting, training & using OM globally • 2006: www.outcomemapping.ca • …. towards the future

  14. what’s in a name?

  15. key evaluation challenges • measuring development results of research • establishing cause & effect in an open system • timing • encouraging iterative learning • clarifying values

  16. connecting research to well-being (8-15 years) [Systems-map slide: traces the chain from donors and funders through NARO researchers, research managers and support staff, ministries, regional/international centres, universities, extension groups, NGOs, the private sector, and farmer organizations to the users (farmers and their families), linked by flows of money, policies, new knowledge, technologies, and advocacy.]

  17. the problem with « impact » Impact implies: • Cause & effect • Positive, intended results • Focus on ultimate effects • Credits a single contributor • The story ends when the program obtains success. Development implies: • An open system • Unexpected positive & negative results occur • Upstream effects are important • Multiple actors create results & need credit • The change process never ends.

  18. focus of outcome mapping Behavioural Changes

  19. where is the map? • OM is a guide to the journey we take with our partners. We co-create the map. • It focuses on the intention, on what happens along the way, and on the learning. • The map is not the territory but the route taken.

  20. recommended reading Liberia case study & OM manual foreword by Michael Quinn Patton and introduction (pages vii-ix and 1-15)

  21. day 2

  22. the core of outcome mapping

  23. what is outcome mapping? • A methodology for planning and assessing the social effects & internal performance of projects, programs, & organizations

  24. a flexible, multiple-use tool • Planning • Monitoring • Evaluation

  25. What are we trying to accomplish and how? What do we want to know? What do we want to learn?

  26. outcome mapping key messages

  27. looking at the bigger picture • Seeing yourself as part of an interconnected web of relationships and systems

  28. recognizing that change is… • Continuous • Complex • Non-linear • Multidirectional • Not controllable

  29. embrace constant change “It’s not possible to see the same river twice.”

  30. keeping your eyes wide open • Being attentive along the journey is as important as the destination

  31. focus on direct partners • Key concept is « boundary partners » • The individuals, groups, and organizations you work with directly and with whom you anticipate opportunities for influence

  32. spheres of influence [Diagram: the project/program at the centre, its boundary partners around it, and the rest of the world beyond.]

  33. boundary partners have boundary partners [Diagram: the program, its boundary partners, and its boundary partners' own boundary partners.]

  34. [Example diagram: the Swayamsiddha program (BAIF, supported by IDRC and CIDA) works through state NGOs as boundary partners, who in turn reach families, primary health centres (PHCs), banks, community leaders, self-help groups (SHGs), and police.]
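Outcome Mapping prescribes no software, but the nesting shown in the last two slides ("boundary partners have boundary partners") can be pictured as a simple recursive structure. The sketch below is a hypothetical illustration in Python; the Actor type and the partner names are assumptions made for the example, not part of the OM manual.

```python
# Minimal sketch of "boundary partners have boundary partners".
# The Actor type and the partner names are illustrative only;
# Outcome Mapping itself prescribes no data model or tooling.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Actor:
    name: str
    boundary_partners: List["Actor"] = field(default_factory=list)

    def show_sphere(self, depth: int = 0) -> None:
        """Print this actor and the partners it works with directly, nested."""
        print("  " * depth + self.name)
        for bp in self.boundary_partners:
            bp.show_sphere(depth + 1)


# A program influences its direct partners, who in turn influence theirs.
families = Actor("Families and self-help groups")
state_ngo = Actor("State NGO", boundary_partners=[families])
program = Actor("Program", boundary_partners=[state_ngo])

program.show_sphere()
# Program
#   State NGO
#     Families and self-help groups
```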

  35. moving from stakeholders...

  36. ...to boundary partners [Diagram: the project and the boundary partners it works with directly.]

  37. why behaviour changes? • To stress that development is done by and for people • To illustrate that although a program can influence the achievement of outcomes, it cannot control them because ultimate responsibility rests with the people affected

  38. contribution not attribution • your influence on a better world • you can influence but not control change in your partners

  39. uses of outcome mapping

  40. principles of use • Flexible: modular, to be adapted to use & context • Complementary: can be used with other methodologies • Participatory: seeks dialogue and collaboration with partners • Evaluative thinking: fosters a culture of reflection and results-oriented thinking, and promotes social & organizational learning

  41. primary uses • PLANNING: articulate goals & define activities • MONITORING: assess program performance & partners’ outcomes • EVALUATION: design & conduct a use-oriented evaluation

  42. assessing development results Behaviour Changes

  43. assessing internal performance Behaviour Changes Program

  44. assessing influence Behaviour Changes Program

  45. …within their context Behaviour Changes Program

  46. progress markers • A graduated set of statements describing a progression of changed behaviours in the boundary partner • Describe changes in actions, activities and relationships leading to the ideal outcome; they show the story of change • Articulate the complexity of the change process • Can be monitored & observed • Permit ongoing assessment of the partner’s progress (including unintended results)

  47. progress markers are graduated • move from easier to harder-to-achieve changes in behaviour • describe the change process of a single boundary partner • are more complete than a single indicator

  48. how many progress markers? Suggestion: about 15 in total over the life of the program, with most in the “like to see” range (for example: expect to see, 4; like to see, 8; love to see, 3). • Remember, more PMs = more data points to monitor

  49. progress markers = ladder of change • Outcome challenge (the top of the ladder) • Love to see: truly transformative changes, set quite high • Like to see: more active learning and engagement • Expect to see: early response to the program’s basic activities
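For teams that keep their monitoring data digitally, the graduated marker set above reduces to a very simple tally. The sketch below is a hypothetical Python illustration using the 4/8/3 split suggested on the previous slide; the marker wording, the ProgressMarker type, and the observation data are placeholders, not content from the OM manual.

```python
# Minimal sketch of a graduated progress-marker set for one boundary partner.
# Marker wording and observations are illustrative placeholders; only the
# suggested 4 / 8 / 3 split across the three levels comes from the slides.
from dataclasses import dataclass


@dataclass
class ProgressMarker:
    level: str        # "expect to see", "like to see", or "love to see"
    statement: str
    observed: bool = False


markers = (
    [ProgressMarker("expect to see", f"early response {i}") for i in range(1, 5)]
    + [ProgressMarker("like to see", f"active engagement {i}") for i in range(1, 9)]
    + [ProgressMarker("love to see", f"transformative change {i}") for i in range(1, 4)]
)  # 4 + 8 + 3 = 15 markers in total

# After a monitoring visit, record which behaviours have been observed.
markers[0].observed = True
markers[4].observed = True

for level in ("expect to see", "like to see", "love to see"):
    group = [m for m in markers if m.level == level]
    done = sum(m.observed for m in group)
    print(f"{level}: {done}/{len(group)} markers observed")
```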

  50. X
