Lessons Learned from Evaluation: A platform for sharing knowledge
Mike Spilsbury, Catrina Perch, Seg Norgbey, Ganesh Rauniyar and Cristina Battaglino
Common views on lessons learned
‘Lessons’ are:
• Often of variable quality and generally underutilised
• Often “platitudes borne of a felt need to demonstrate engagement in the ‘knowledge society’ or to satisfy evaluation requirements”
• ‘Lessons learned’ should more accurately be regarded as ‘lessons to be learned’
Key Problems with Lessons
1. Lessons are often poorly formulated (low quality)
2. Processes to promote dissemination and uptake of lessons are often weak
3. Lessons are often considered individually, so patterns across lessons are not recognised
Improving Lessons: Developing a platform for sharing lessons
EOU embarked on a process to:
• Screen all existing lessons, applying minimum quality standards based on lesson definitions
• Classify lessons in a ‘problem tree’ framework
• Use the framework as a tool for enhancing the uptake and dissemination of, and access to, UNEP evaluation lessons
Screening existing lessons
What constitutes a lesson? A quality lesson must:
• Concisely capture the context from which it is derived
• Be applicable in a different context (generic), with a clear ‘application domain’ and identified target users
• Suggest a prescription and guide action
Low quality lessons were identified and removed from the UNEP lessons database.
Screening existing lessons
• UNEP EOU had a database of approximately three hundred ‘lessons’ from evaluations (1999 to date)
• Nearly 50% of the lessons analysed failed to satisfy the criteria
• This led us to specify the attributes of quality lessons in TORs, evaluation guidelines and evaluation quality control processes
• The whole process took 4 evaluation professionals × 2 weeks = 2 person-months of effort (spread over 5 months)
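The four quality criteria above amount to a simple checklist filter. As a minimal sketch (in Python, with entirely hypothetical field names; EOU's actual database schema is not described here), the screening step could look like this:

```python
# Illustrative sketch only: the Lesson fields and criteria flags are
# hypothetical, not the actual structure of the UNEP EOU lessons database.
from dataclasses import dataclass

@dataclass
class Lesson:
    text: str
    has_context: bool             # concisely captures its originating context
    is_generic: bool              # applicable in a different context
    has_application_domain: bool  # clear domain and identified target users
    is_prescriptive: bool         # suggests a prescription and guides action

def meets_quality_criteria(lesson: Lesson) -> bool:
    """A lesson is retained only if it satisfies all four criteria."""
    return (lesson.has_context
            and lesson.is_generic
            and lesson.has_application_domain
            and lesson.is_prescriptive)

def screen(lessons: list[Lesson]) -> list[Lesson]:
    """Drop low-quality lessons, mirroring the manual screening step."""
    return [l for l in lessons if meets_quality_criteria(l)]
```

In practice the EOU screening was a manual, judgement-based exercise; the point of the sketch is only that each lesson was assessed against every criterion, and any single failure excluded it from the database.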
Developing a framework of lessons
• A ‘problem tree’ approach was adopted. Since the bulk of UNEP’s lessons are derived from evaluations of projects or sub-programmes, a generic or ‘central’ problem was defined as:
• “UNEP projects and programmes have sub-optimal impact”
‘Central’ and ‘cornerstone’ problems for clustering lessons
Lessons were debated and ‘underlying’ problems were identified (or inferred). The problems were then clustered and organised into a hierarchy of causality.
Lesson 112
“It is critical that the internal logic of the project be very clearly spelled out in the project document and that the strategic linkages between outcomes and objectives are made very clear. Those implementing or supervising a project are frequently completely different people from those who developed the project. The Project Document needs to be a self-explanatory, stand-alone document.”
Mid-Term Evaluation of the UNEP/UNDP/GEF Project “Botswana, Kenya and Mali: Management of Indigenous Vegetation for the Rehabilitation of Degraded Lands in Arid Zones of Africa” (GF/2740-03-4618)
Advantages of the Lessons Framework
• Multiple lessons can be clustered around commonly occurring issues (or ‘root causes’), which provides ‘triangulation’ for commonly articulated lessons
• Lessons can be associated with more than one issue or problem, rather than being forced into a mutually exclusive (‘taxonomic’) classification
• Aids identification of commonly occurring problems across a project / programme portfolio
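The non-exclusive clustering described above is essentially a many-to-many mapping between problems and lessons, layered over a causal hierarchy. A minimal sketch (the `ProblemTree` class, problem names other than the central problem, and lesson identifiers are invented for illustration, not UNEP's actual system):

```python
# Hypothetical sketch of the 'problem tree' clustering, not UNEP's software.
from collections import defaultdict

class ProblemTree:
    def __init__(self, central_problem: str):
        self.parent = {central_problem: None}  # causal hierarchy of problems
        self.lessons = defaultdict(set)        # problem -> set of lesson ids

    def add_problem(self, problem: str, parent: str) -> None:
        """Attach an underlying problem beneath its cause in the hierarchy."""
        self.parent[problem] = parent

    def link(self, lesson_id: str, *problems: str) -> None:
        """A lesson may be associated with several problems (non-exclusive),
        unlike a mutually exclusive 'taxonomic' classification."""
        for p in problems:
            self.lessons[p].add(lesson_id)

    def common_problems(self) -> list[str]:
        """Problems ranked by number of associated lessons: clusters of many
        lessons point to commonly occurring issues, and independent lessons
        converging on one problem provide 'triangulation'."""
        return sorted(self.lessons,
                      key=lambda p: len(self.lessons[p]), reverse=True)
```

For example, the central problem “UNEP projects and programmes have sub-optimal impact” would sit at the root, with (hypothetical) cornerstone problems such as ‘weak project design’ beneath it; Lesson 112 above could then be linked simultaneously to design-stage and supervision-stage problems.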
Application of the lessons framework
• The framework is NOT a definitive statement on causality
• EOU regards the categorisation of lessons in the framework as much less important than the process of discussion and debate that the categorisation prompts
• Classifying lessons within the framework together with key intended users provides an interactive means of promoting their uptake or ‘influence’, as each new lesson is examined in the context of all others
• The ‘lessons platform’ is currently being applied in a pilot phase
Findings and Conclusions
• Many lessons failed to meet the criteria developed for high quality lessons and were deleted from the EOU evaluation lessons database
• This prompted the unit to specify more clearly the requirements for drafting lessons in our standard evaluation guidelines, TORs and evaluation quality control processes
• The lessons screening process reduced the volume of information and significantly raised the overall quality and relevance of our lessons
• The framework of lessons learned from evaluation provides a means to visualise all lessons at once and to see how different lessons relate to common problems and to one another
• It is intended to be a user-friendly way of presenting and storing information on lessons from evaluation
Findings and Conclusions
• We regard the framework of lessons from evaluation as a useful ‘platform’ for both collating and disseminating lessons; it provides a tool for discussing evaluation lessons with intended users
• The problem-oriented nature of the lessons framework provides an intuitive and interactive ‘user interface’ to the databases of lessons commonly collated by evaluation units