
KM Impact Challenge: Preliminary analysis and emerging lessons

Marie-Ange Binagwaho, Knowledge-Driven Microenterprise Development Project. January 27, 2011.



Presentation Transcript


  1. KM Impact Challenge: Preliminary analysis and emerging lessons. Marie-Ange Binagwaho, Knowledge-Driven Microenterprise Development Project. January 27, 2011

  2. KM Impact Challenge • An initiative under the Assessing and Learning mandate of the Knowledge-Driven Microenterprise Development (KDMD) Project • KMIC has partnered with KM4Dev • The Impact Alliance provides facilitation support for the challenge process.

  3. Why a KM Impact Challenge? • Knowledge management approaches such as communities of practice, learning networks and online knowledge repositories have emerged during the last decade, with marked expansion over the last five years • While many believe that more knowledge-driven programmatic approaches are inherently better, most organizations lack the evidence to clearly and compellingly demonstrate the impact of their KM investments • We are all asking similar questions around the impact of KM and learning, but there are many different answers • This is a key moment to expand the knowledge base, reflect on how people are addressing these questions, and identify successful examples

  4. The KM Impact Challenge site has been visited 2,669 times from 100 countries/territories. [Chart: Top 10 countries visiting the KMIC]

  5. Collection of case stories. From 1 December 2010 to 30 January 2011, we are collecting case stories to capture diverse experiences of KM assessment through the following questions: • Organization • Organization Type • Sector • Describe the KM initiative • Describe the approaches utilized to measure / assess this KM initiative • What was the purpose or motivation for assessing this KM initiative? • What were the most important lessons learned about the assessment process? • What would you do differently next time? • What advice would you give to others based on your experience? • What do you think are the main unanswered questions or challenges related to this field of work?

  6. Geographic Sources of Case Stories: 25 approved case stories • Africa: 10 • Asia: 2 • Australia and the Pacific: 1 • Europe: 1 • Latin America and the Caribbean: 4 • North America (excluding Mexico): 7. KMIC has 174 members and is followed on Facebook and Twitter

  7. Types of KM initiatives • Training courses (2) • Community of Practice / networks (3) • Internships (1) • Learning events (2) • Community knowledge /learning centres (4) • Social media and new technologies (3) • Huddle and other collaboration platforms (4) • Technology transfer (3) • Management performance / organizational culture (3)

  8. Sectors represented • Agriculture (5) • Health (5) • Education (5) • Technology (4) • Environmental (2) • Gender (2) • Governance (1) • Policy Research (1)

  9. Examples of tools and methods mentioned • Situational analysis • Needs / Demand assessment • Organizational Capacity Assessment (OCA) • Gantt Chart • Tangible / intangible benefits log • Balanced Scorecard • Rapid Assessment and Prioritization of Protected Area Management (RAPPAM) methodology • Social Network Mapping and Analysis • Gender Analysis Matrix (GAM) • Most Significant Change (2)

  10. Understanding assessment approaches • Surveys • 11 case stories describe some type of survey, raising issues such as: • The importance of survey design and asking the right questions • The need for a multifaceted approach to data collection: online surveys alone are not sufficient and should be complemented with phone or other mechanisms • Indicators • 5 case stories mention indicators • These cases highlight the importance of building assessment and monitoring processes in from the project design stage, so as to identify indicators that can be followed through on and that generate useful data • Social media • 2 case stories describe innovative uses of new technologies to collect and compile assessment data • Another highlights the weakness of web stats for monitoring access and usage from a common IP address

  11. Challenges identified • Too much data is confusing • A lack of time and resources to document an experience • The initial objective of involving all stakeholders is not always easy to achieve • We can systematically measure how information is accessed and shared, but not how it is applied and used • Many suggested approaches require trained facilitators • A lack of ownership of assessment: “efforts follow a request from their donors, and not a true felt need”

  12. Success factors • Actionable feedback • The strongest cases described the process of collecting actionable data and using this feedback to inform the ongoing project process • “..clearly articulating what participants will get out of participating in these assessment processes. While contributing to the “greater good” of building and strengthening the field is generally seen as valuable, participants understand how contributing ….enables them to have a say in the type of information they will obtain.” (Making Cents) • Building trust is essential to both successful knowledge sharing and a successful assessment process • “participation needs to be seen on a continuum; in light of criticism directed at participatory processes that fail to connect with decision-making processes, a focus on organizational utilization is well justified and essential for realizing the potential of such methods” (Help Channel Burundi-Ethnocorder)

  13. Emerging lessons • Stronger stories are those that incorporated M&E from the beginning of the process and / or used M&E assessment to generate actionable data that informed project development • Assessment processes designed to promote learning and improvement are stronger than those that respond to reporting requirements • Simplicity is key; too much data makes things confusing • There is some ambiguity between KM and M&E • Both require information exchange • Both are strengthened by functional feedback loops

  14. Examples of case stories • Ethnocorder: An Innovation in Mobile Data Collection and Use - Burundi • Breaking the Walls of a KM Classroom with YouTube - India

  15. Ethnocorder: An Innovation in Mobile Data Collection and Use - Burundi • An innovative assessment tool for mobile data collection of multimedia content, combining technological availability with knowledge management theory • Not only supports collection of qualitative data and stories, but enables video clips to be used as cues for surveys and focus groups • Real-time tagging of video responses supports quantification of data • Promotes an organizational culture of dialogue, innovation and learning • www.ethnocorder.com

  16. Breaking the Walls of a KM Classroom with YouTube - India • Supports the development of KM practitioners on diverse elements of KM via an open YouTube content repository • Captures wisdom and insights from leading KM thinkers • Positive branding of the channel provides an incentive for thought leaders to participate • Assessed in terms of efficiency / activities (hits, views and measures) and effectiveness (business outcomes) using a combination of web stats, surveys and verbal conversations • Popularity metrics are also used to measure discussions on social networking sites, highlighting the importance of the channel's brand value • http://www.youtube.com/user/eclerxservices

  17. Other activities on the KM Impact Challenge site • The library contains approximately 50 documents, including practical manuals and toolkits, project reports, academic papers, corporate guidelines and overview papers that address the issue of KM impact from numerous perspectives • The blog includes expert interviews, synthesis of and feedback on key discussions, and news of upcoming events • Participation in key international events, e.g. Share-Fair, Addis Ababa…

  18. Brief reflections on the challenge to date • People reflected more on their own lessons learned than on lessons specific to the assessment process • People are still using traditional M&E approaches, though there are some emerging innovations such as the use of social media • KM is a cross-cutting theme involving different sectors • Most assessments are donor driven, not internally driven • Being limited to the English language is a challenge for some participants (that said, the first case stories were submitted from Latin America by Spanish speakers) • Consistent support and feedback is being given to case story authors throughout the submission process, resulting in solid case stories

  19. Next steps • Case story submission closes 30 January • Evaluation by the Technical Advisory Group based on the following criteria • Clarity: Does it clearly illustrate how a knowledge and learning initiative or KM investment was assessed, monitored or evaluated? • Analysis: Does the case story identify key lessons or the relative strengths and weaknesses of the tool/approach used? • Creativity: Does the case story describe an innovative approach or a novel adaptation of an established methodology? • Replicability: Does the case story present practical approaches, lessons learned or advice that can be used by others to improve their knowledge management assessment practice? • Sense-making and synthesis to extract lessons learned • UnConference May 2011

  20. Thank you http://kdid.org/kmic
