
Common Impact Indicators in Extension Community Development Programming


Presentation Transcript


  1. Common Impact Indicators in Extension Community Development Programming Tim Borich, Iowa State University Scott Chazdon, University of Minnesota Mary Simon Leuci, University of Missouri Scott Loveridge, North Central Regional Center for Rural Development

  2. Outline of Presentation
  • Origins, purpose, and goals of the common metrics
  • Outcomes to date
  • How do we collect the information?
  • What goes in and what does not?
  • Definitions
  • Case studies
    • North Dakota
    • Iowa
    • Missouri
    • Minnesota

  3. How Did We Get Here?
  • Federal Extension funding flat or declining
  • Requirement for 25% of Federal funds in multi-state efforts
  • NC Extension Directors asked each program area to develop common indicators for multi-state programming to:
    • Help document the 25% effort
    • Communicate better with policy makers: "Don't tell me stories."

  4. Now in Year Four of the Effort
  • The initial set of indicators has been tweaked to deal with:
    • Lack of clarity about a measure
    • Difficulty of obtaining a measure
  • The current set of measures is now field tested and found to be feasible, with most states able to report on most metrics.

  5. 20-Year Trend in Smith-Lever Funding
  Source: Association of Public and Land-Grant Universities, 2012. http://www.land-grant.org/docs/FY2013/SL.pdf

  6. Background Justification – Budget
  • The APLU graph understates the problem
    • Skilled labor costs vs. general inflation
  • Impacts of Federal budgets…
    • No replacement when colleagues resign or retire
    • Need to cover more territory
    • No raises, or below-inflation raises
    • Furloughs
    • Layoffs
    • Reduced operating funds
  • Ability-to-pay vs. needs-based programming
  • Ultimate results? What if we continue to lose ground to inflation?

  7. Background Justification – Best Practices
  • Understanding your program
  • Experiment and find what works
  • Marketing your program
  • "Final Exam"
  • Community Design Team experience

  8. How Are Indicators Used?
  • Within your state or service area
    • Telling the story of Extension for state and local officials
    • Helping Extension professionals refine their program efforts: which programs generate the most impact for my time?
  • Across states
    • Helping program leaders refine their program mix: which programs generate the most impact at the state level? Which programs from other states should we try to pick up?
  • Nationally
    • Communicating with NIFA
    • Communicating with Congress

  9. Annual Impacts Report

  10. Attribution Principle
  • Don't need press clippings or sworn statements.
  • Need a knowledgeable individual from the target community (not employed by Extension) who can vouch for the impact.
  • The "but for" concept: would the impact have occurred without Extension?

  11. North Central States Impact Indicators 2012
  • Educational contacts
    • Persons who received educational services via face-to-face or live distance-enabled sessions.
    • Persons participating more than once should be counted more than once (see the sketch below this list).
  • Number of racial minority contacts
    • Contacts (as above) who self-report non-white racial status.
  • Number of Hispanic contacts
    • Contacts (as above) who self-report as Hispanic or Latino.
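
  A minimal sketch of the contact-counting rule above: contacts are counted per attendance, not per unique person, so repeat participants are counted each time. The record fields are illustrative assumptions, not part of any actual reporting system.

    # Hedged sketch: contacts are attendances, not unique people.
    # Field names below are illustrative only.
    attendances = [
        {"name": "A. Smith", "race": "White", "hispanic": False},
        {"name": "A. Smith", "race": "White", "hispanic": False},  # second session, counted again
        {"name": "B. Lee", "race": "Asian", "hispanic": False},
        {"name": "C. Ortiz", "race": "White", "hispanic": True},
    ]

    educational_contacts = len(attendances)
    racial_minority_contacts = sum(1 for a in attendances if a["race"] != "White")
    hispanic_contacts = sum(1 for a in attendances if a["hispanic"])

    print(educational_contacts, racial_minority_contacts, hispanic_contacts)  # 4 1 1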

  12. North Central States Impact Indicators 2012
  • Number of participants reporting new leadership roles and opportunities undertaken
    • New leadership roles may be formal (e.g., board member) or informal (e.g., advocate, group leader). Use the attribution principle.
  • Number of business plans developed
    • Includes formal business plans and informal strategic changes. Use the attribution principle.
  • Number of community or organizational plans developed
    • Includes formally adopted plans by official agencies as well as strategies. Use the attribution principle.

  13. North Central States Impact Indicators 2012
  • Number of community and organizational policies and plans adopted or implemented
    • Includes plans (as above) wholly or partially adopted or implemented. Use the attribution principle.
  • Number of businesses created
    • New business start-ups or firms that moved into the area. Use the attribution principle.

  14. North Central States Impact Indicators 2012
  • Number of jobs created
    • New jobs in the area as a result of programs. Use the attribution principle.
  • Number of jobs retained
    • Existing jobs that were at risk and were protected by programs. Use the attribution principle.
  • Dollar value of volunteer hours leveraged to deliver programs
    • Based on the Independent Sector value for your state (http://www.independentsector.org/volunteer_time); see the sketch below this list.
    • Count hours provided by individuals in executing the program (include volunteer hours required for certification).
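
  A minimal sketch of the dollar-value calculation above: volunteer hours multiplied by your state's Independent Sector hourly rate. The rate shown is a placeholder, not a published figure; look up the current value for your state at the link above.

    # Hedged sketch: dollar value of volunteer hours = hours x state hourly rate.
    # The rate below is a placeholder; use the Independent Sector value for your state.
    STATE_HOURLY_RATE = 24.14  # placeholder, not an actual published rate

    def volunteer_hours_value(hours, hourly_rate=STATE_HOURLY_RATE):
        """Dollar value of volunteer hours at the given hourly rate."""
        return round(hours * hourly_rate, 2)

    # Example: 350 volunteer hours, including hours required for certification
    print(volunteer_hours_value(350))  # 8449.0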

  15. North Central States Impact Indicators 2012
  • Number of volunteer hours for community-generated work
    • Count hours indirectly generated by programs.
    • Example: a person receiving training recruits additional volunteers. Use the attribution principle.
  • Dollar value of volunteer hours generated by the organization and/or community as a result of the program
    • Based on the Independent Sector value for your state (http://www.independentsector.org/volunteer_time).

  16. North Central States Impact Indicators 2012
  • Dollar value of efficiencies and savings
    • Count savings through improved processes and approaches due to programs.
  • Dollar value of grants and resources leveraged/generated by communities
  • Dollar value of resources leveraged by businesses
    • Includes loans and investments. Use the attribution principle.

  17. NDSU – Measuring Impact
  • Keep it simple
  • Don't try to measure everything
  • Measure what you can
  • Be able to defend what you measure
  • Begin with the end in mind!
  • Ask yourself: why, and for whom?

  18. NDSU – Cultural Change
  • Make it easy
    • Provide evaluation tools and training
    • Send out the indicators matrix four times per year
  • Make it valuable
    • Encourage use through the required annual impact report
    • Encourage conversation about impacts with supervisors during annual reviews

  19. Iowa State University CED Evaluation and Reporting
  • Reporting: the impact system is web-based
  • Attribution, news media, evaluation data, secondary data
  • Regional indicators plus data recorded by case and community
  • Community cases can be updated
  • PRI (it's not Public Radio International)
  • Omission/under-reporting is still our biggest problem

  20. Univ. of MO – Measuring Impact
  • Focus on in-depth, multi-year programs
  • Build in follow-up as part of the program
  • Use appropriate follow-up methods: often not a survey; use ripple effect mapping
  • Be able to defend what you report: attribution
    • Basic criterion ("but for"): if Extension had not done this work with us, we would not have started x and therefore been on this path that helped us ……
  • Jobs (see the sketch after this list):
    • Don't count temp jobs
    • National data sources don't distinguish between full-time and part-time, so both are OK to count
    • For a single-proprietor new business, count as one job
    • Recognize that nonprofits and governments also create or retain jobs
  • Be sure to question a big number, or a number that doesn't ring true; seek to understand
  • Ask key contacts in the community/organization to copy you on key email follow-ups and links to news articles
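
  A hedged sketch of the job-counting rules in the list above, assuming a made-up report record format (the field names are illustrative, not part of any actual reporting system):

    # Hedged sketch of the job-counting rules; record fields are illustrative.
    def countable_jobs(reports):
        """Skip temporary jobs, count full- and part-time jobs alike, and count
        a single-proprietor new business as one job. Nonprofit and government
        employers count the same as businesses."""
        total = 0
        for r in reports:
            if r.get("temporary"):           # don't count temp jobs
                continue
            if r.get("single_proprietor"):   # new sole-proprietor business = one job
                total += 1
            else:
                total += r.get("jobs", 0)    # full-time and part-time both count
        return total

    # Example
    reports = [
        {"employer": "Main Street Bakery", "jobs": 3},
        {"employer": "Festival setup crew", "jobs": 12, "temporary": True},
        {"employer": "New sole-proprietor consultancy", "single_proprietor": True},
    ]
    print(countable_jobs(reports))  # 4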

  21. Minnesota – Link to Ongoing Evaluation When Possible

  22. Summary
  • Implementing indicators takes patience; fully implementing them in a state will probably take several years
  • Payoffs for the system can be great in terms of:
    • Communicating our relevance to the public and to key policy makers
    • Helping us assess how we can improve our work
