
Some Practical Tips for Measuring Financial Success Dr. Angela Lyons University of Illinois






Presentation Transcript


  1. Some Practical Tips for Measuring Financial Success Program Evaluation II: Creating Your Evaluation Toolkit Presented by Dr. Angela Lyons, University of Illinois, October 2009

  2. Identifying the “ideal” approach to program evaluation • Evaluation methods and measures vary widely across programs and academic disciplines. • Wide variation in financial outcomes across programs. • Significant differences in financial needs across consumers. • Some participants unable to implement certain financial behaviors.

  3. Then you need to decide…. “To evaluate or not to evaluate, that is the question?”

  4. Questions to ask yourself …. • At the end of the day, what are you trying to show? • What is the purpose of the evaluation? • Who will use the information – and how? • What information do you want to collect? • Who is your target audience? • What is your primary delivery method? • What are your available resources (i.e., time, money, and staff)? • What is your timeline? • What is your expertise and evaluation capacity? • Who are your partners, funders, and stakeholders?

  5. Common survey methods used to collect impact data • Post evaluation only • Retrospective pre-test (RPT) • Pre and post evaluation • Follow-up • Stages to Change (TTM) • Control groups and longitudinal studies Key question to ask: What is the length of your program?

  6. Post evaluation only When to use: Short programs that are less than 2 hours Advantages: • Only need to survey group once. • Good for limited-resource audiences and groups that are transient. • Relatively inexpensive and less time intensive. • Can document participants’ levels of knowledge, skills, and planned behaviors at the end of the program. Disadvantages: • With no pre-assessment, it’s difficult to document potential and actual changes in knowledge, attitudes, and behavior.

  7. Retrospective pre-tests (RPTs): the post-then-pre evaluation When to use: Any program, but typically 2 hours or less Advantages: • Only need to survey group once. • Good for limited-resource audiences and groups that are transient. • Controls for “response shift bias.” • Can document “relative” change. Disadvantages: • Potential for respondent bias (social desirability factor). • Self-assessment measures are subjective.
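To make the “relative” change an RPT documents concrete, here is a minimal sketch of how the paired before/after self-ratings from a single post-program survey might be scored. The 1–5 confidence scale and the sample ratings are invented for illustration; they are not from the program above.

```python
# Hypothetical sketch: scoring a retrospective pre-test (RPT).
# Each participant rates "before the program" and "after the program"
# on the same 1-5 scale in ONE post-program survey; relative change
# is the after rating minus the before rating.

def rpt_change(responses):
    """Mean relative change across participants for one survey item."""
    deltas = [after - before for before, after in responses]
    return sum(deltas) / len(deltas)

# (before, after) self-ratings on a 1 (low) to 5 (high) confidence scale
ratings = [(2, 4), (3, 4), (1, 3), (4, 4)]
print(rpt_change(ratings))  # 1.25
```

Because both ratings are collected at the same sitting, participants apply the same internal yardstick to both, which is how the RPT controls for response-shift bias.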

  8. Pre and post evaluations When to use: Programs that are 2 hours or longer Advantages: • Can compare pre and post responses and document changes in knowledge, attitudes, and behavior. • Can be used to document immediate changes in knowledge, skills and planned behaviors following the program. Disadvantages: • More time intensive. • Identification numbers are needed to match pre and post surveys. • May be difficult to show actual behavior change. • May be difficult to show that the intervention caused the change. • Doesn’t account for other possible reasons for change. • Transient populations may lead to low unmatched evaluations.
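The ID-matching step noted in the disadvantages can be sketched as follows. The participant IDs and knowledge scores are hypothetical; the point is that records without a match on both sides are simply dropped, which is why transient populations shrink the matched sample.

```python
# Hypothetical sketch: pairing pre- and post-surveys by participant ID.
# Only IDs that appear in BOTH surveys can be used to measure change.

def match_scores(pre, post):
    """Return {participant_id: (pre_score, post_score)} for matched IDs only."""
    return {pid: (pre[pid], post[pid]) for pid in pre.keys() & post.keys()}

pre = {"101": 6, "102": 4, "103": 7}    # knowledge scores out of 10
post = {"101": 9, "103": 8, "104": 10}  # "102" left early; "104" arrived late
matched = match_scores(pre, post)
change = {pid: after - before for pid, (before, after) in matched.items()}
# matched -> {'101': (6, 9), '103': (7, 8)}; change -> {'101': 3, '103': 1}
```

Two of four participants are lost to attrition here, which illustrates the “low matched evaluations” problem with transient audiences.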

  9. Follow-ups When to use: • Program is comprehensive enough to potentially result in intermediate and long-term impact. • Must have adequate resources and evaluation capacity. • Usually administered three to six months after the program. • Can document changes in actual financial behaviors, ability to achieve financial goals, and overall financial position.

  10. Stages to Change (TTM) When to use: Programs that have multiple sessions Advantages: • Can document intermediate and long-term change. • Easier to measure actual behavior change and to control for other factors that may lead to change over time. • Can identify stage at which individual is ready and able to change behavior. • Behaviors can be recorded at the beginning, middle, and end of the program so that changes in actual behavior can be observed. Disadvantages: • Time and resource intensive. • May require additional progress reporting and long-term follow-up. • Can only be used with multi-session programs.

  11. Train-the-trainer evaluations • Similar to pre and post evaluation, but more content specific. • Covers subject material in more detail to ensure that trainers have an adequate level of knowledge to teach the program to others. • Can be used to document changes in both the instructors’ teaching skills and personal financial behaviors. • Follow-ups can document how the curriculum materials are being used and identify additional programming needs.

  12. Designing the evaluation instrument:Key survey content • General reactions to the session • Changes in knowledge • Changes in motivation, confidence, and abilities • Intended changes in behavior • Actual changes in behavior • Future programming needs and preferences • Demographics • Qualitative / open-ended responses

  13. General reactions to the session Please rate the instructor(s), materials, and the overall program by checking the box that best applies.

  14. Measuring changes in knowledge Testing Knowledge Please circle your answer to each of the following statements.

  15. Measuring changes in knowledge (cont.) • Format can be True/False or multiple choice. • True/False is a reliable indicator for low-literacy audiences and youth. • The more questions you ask, the greater the reliability. • May include a “don’t know” option to control for guessing. • Post-test: 10 questions (established standard) • Pre- and post-test: 10-20 questions • Train-the-trainer: 10-25 questions
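A hedged sketch of how such a True/False test might be scored when a “don’t know” option is offered: “don’t know” (DK) is counted as incorrect but tallied separately, so guessing and genuine uncertainty can be distinguished. The answer key and responses below are invented.

```python
# Hypothetical sketch: scoring a True/False knowledge test with a
# "don't know" (DK) option to discourage guessing.

def score_test(answers, key):
    """Count correct answers and DK responses for one participant."""
    correct = sum(1 for a, k in zip(answers, key) if a == k)
    dont_know = answers.count("DK")
    return {"correct": correct, "dont_know": dont_know, "total": len(key)}

key = ["T", "F", "T", "T", "F"]
print(score_test(["T", "F", "DK", "T", "T"], key))
# {'correct': 3, 'dont_know': 1, 'total': 5}
```

Reporting DK counts alongside correct counts also shows where the curriculum left participants unsure rather than misinformed.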

  16. Changes in motivation, confidence, and abilities Building Skills/Confidence Indicators Please check the box that best describes your confidence to do the following:

  17. Changes in motivation, confidence, and abilities (cont.) Recording Participants’ Attitudes Please check the box that best describes how much you agree with the following statements.

  18. Intended changes in behavior Please indicate how often you plan to do each of the following financial practices. There is no “right” or “wrong” answer. (Choose only one)

  19. Actual changes in behavior Please indicate how often you are currently doing each of the following financial practices. There is no “right” or “wrong” answer. (Choose only one)
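Because the intended-behavior and actual-behavior items use the same response scale, the two can be compared directly. The sketch below assumes a 4-point frequency scale; the labels and sample responses are invented for illustration.

```python
# Hypothetical sketch: comparing intended vs. actual behavior frequencies
# for one financial practice, on an assumed 4-point scale.
SCALE = {"never": 1, "sometimes": 2, "often": 3, "always": 4}

def mean_rating(responses):
    """Average frequency rating across participants for one practice."""
    return sum(SCALE[r] for r in responses) / len(responses)

intended = ["often", "always", "sometimes", "often"]   # "plan to do"
actual = ["sometimes", "often", "sometimes", "often"]  # "currently doing"
gap = mean_rating(intended) - mean_rating(actual)      # 3.0 - 2.5 = 0.5
# A positive gap means intentions outpace current behavior.
```

A persistent gap between intended and actual frequencies is one signal that a follow-up survey is worth the extra resources.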

  20. Capturing behavior change in follow-ups and “stages” Financial Progress Indicators Please check the box that best describes how your financial position has changed since completing the program. Then, indicate specifically how your financial position has changed.

  21. Capturing behavior change in follow-ups and “stages” Progress Reporting Please record your financial position based on your current progress in the program.

  22. A few words about train-the-trainer programs…. • Testing knowledge • Building teaching skills • Shaping personal skills • Taking action for teaching • Taking action for personal financial success • Follow-ups

  23. Qualitative / Open-Ended Questions (common examples) “Post Evaluation Only” and “Pre and Post Evaluation” • What did you like the most about this program? • What did you like the least about this program? • How could this program be improved? • Would you recommend this program to others? “Stages to Change Evaluation” • What has made it easier for you to improve your financial practices? • What has prevented you from improving your financial practices? • With respect to the overall program, what did you like the most? • What did you like the least? • How could this program be improved? • Have you shared what you learned with others? • Would you recommend this program to others?

  24. Qualitative / Open-Ended Questions (cont.) “Train-the-Trainer Evaluation” • What was the most helpful information you received during this training program? • How could this training program be improved? • How do you plan to share this information with your target audience(s)? • What information and materials from this training do you plan to share with your target audience(s)? • Will you share what you learned with other instructors and colleagues? • Would you recommend this training program to other instructors and colleagues?

  25. Demographic Questions • Age • Gender • Race, Ethnicity, and Language • Marital Status • Education • Employment • Family Structure • Health Status • Income, Assets, and Debts • Region/Location • Financial Experience • Students/Youth • Instructors/Educators

  26. Rigor vs. Reality Longitudinal data? Control groups? Randomized experiments?

  27. NEFE Financial Education Evaluation Toolkit®http://www2.nefe.org/eval/intro.html

  28. NEFE Financial Education Evaluation Toolkit® • Database • Post evaluation only with option for follow-up • Pre and post evaluation with option for follow-up • Stages to Change Evaluation • Train-the-Trainer • Testing Knowledge • Building Skills • Taking Charge • Manual • How-to-guide for grass-roots level organizations • Examples (survey instruments, executive summary, reports) • Guidance on how to organize and present impact data

  29. Manual http://www2.nefe.org/eval/manual.html

  30. Manual http://www2.nefe.org/eval/manual.html Part I: Financial Education Overview Part II: Understanding Program Evaluation Part III: The Evaluation Planning Process Part IV: Using the Evaluation Database Part V: Reporting Program Impact

  31. Part I: Financial Education Overview

  32. Part II: Understanding Program Evaluation

  33. Part III: The Evaluation Planning Process

  34. Part IV: Using the Evaluation Database

  35. Part V: Reporting Program Impact

  36. Appendix: Sample Evaluation Instruments

  37. Database http://www2.nefe.org/eval/index.php

  38. Step 1: Program Info and Follow-up

  39. Step 2a: Knowledge Indicators

  40. Step 2b: Customizing Questions

  41. Step 3: Confidence and Behavior Indicators

  42. Step 4a: Selecting Statements

  43. Step 4b: Customizing Statements

  44. Step 5: Qualitative Info

  45. Step 6: Demographics
