
Looking for Results: Principles for Evaluating Student Success Initiatives


Presentation Transcript


  1. Looking for Results: Principles for Evaluating Student Success Initiatives Presenter: Rick Voorhees

  2. Goals for Today’s Work Together • You’ll be able to • Identify key questions for evaluating interventions • Distinguish between different types of evaluation • Make the link between evaluation and Continuous Quality Improvement • Identify four components of evaluation • Visualize how a logic model can be a powerful tool for understanding interventions

  3. The BIG Question What types of learners change in what evident ways with which influences and resources?

  4. Chances Are . . . • Everything else equal, what are the chances (probability) that • Males and females progress and graduate at the same rate? • Racial and ethnic groups progress and graduate at the same rate? • Financial aid recipients and non-recipients progress and graduate at the same rate? • Students referred to developmental education and those who aren’t progress and graduate at the same rate?
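
The questions on this slide can be checked directly against disaggregated cohort data. The sketch below is a minimal, hypothetical illustration (the column names, the tiny sample, and the choice of a chi-square test are assumptions, not part of the presentation) of how to compare graduation rates between two groups:

```python
# Hypothetical sketch: do two student groups graduate at the same rate?
# Column names and the sample records are invented for illustration.
import pandas as pd
from scipy.stats import chi2_contingency

cohort = pd.DataFrame({
    "gender":    ["F", "F", "M", "M", "F", "M", "F", "M"],
    "graduated": [1,   0,   1,   0,   1,   0,   1,   1],
})

# Graduation rate for each group
print(cohort.groupby("gender")["graduated"].mean())

# Chi-square test of independence: a small p-value suggests the groups
# do not graduate at the same rate, everything else equal.
table = pd.crosstab(cohort["gender"], cohort["graduated"])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.3f}")
```

The same comparison can be repeated for race/ethnicity, financial aid status, or developmental education referral by swapping the grouping column.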

  5. Formative Program Evaluation Formative evaluation (sometimes referred to as internal evaluation) is a method for judging the worth of a program while the program activities are forming (in progress). This part of the evaluation focuses on the process and permits faculty, staff, and students to monitor how well goals are being met. The main purpose is to catch deficiencies so that the program can readjust while it is in progress. Source: Performance, Learning, Leadership, & Knowledge (n.d.) Retrieved March 25, 2013 at http://www.nwlink.com/~donclark/hrd/isd/types_of_evaluations.html

  6. Summative Program Evaluation Summative evaluation (sometimes referred to as external evaluation) is a method of judging the worth of a program at the end of the program activities (summation). The focus is on the outcome. Note: All evaluations can be summative (i.e., have the potential to serve a summative function), but only some have the additional capability of serving formative functions (Scriven, 1967). Source: Performance, Learning, Leadership, & Knowledge (n.d.) Retrieved March 25, 2013 at http://www.nwlink.com/~donclark/hrd/isd/types_of_evaluations.html

  7. The Basics of Continuous Quality Improvement: the Plan → Do → Check → Act (PDCA) cycle. Graphic Source: www.anzca.edu.au/fpm/resources/educational-documents/guidelines-on-continuous-quality-improvement.html

  8. Four Components of a Culture of Inquiry • Component One: “What’s Wrong” • Use disaggregated longitudinal cohort data to determine: • Which student groups are less successful than others (i.e., identify gaps in student success) • Which high-enrollment courses have the lowest success rates • Component Two: “Why” • Collect, analyze, and use data from other sources (focus groups, surveys, literature reviews) to identify the underlying factors (barriers or challenges) impeding student success. • Component Three: “Intervention” • Use data from Component Two to design new interventions, or revise current ones, to effectively address the underlying factors impeding student success. Review and consider changes to existing practices and policies that affect those factors. • Component Four: “Evaluation and Modification” • Collect, analyze, and use evaluation data to answer: • To what extent did the intervention (including policy changes) effectively address the underlying factors? • To what extent did the interventions increase student success? Source: K.P. Gonzalez. Using Data to Increase Student Success: A Focus on Diagnosis. Retrieved March 12, 2013 at http://www.achievingthedream.org/sites/default/files/resources/ATD_Focus_Diagnosis.pdf
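
Component One depends on being able to disaggregate longitudinal cohort data. The sketch below is a hypothetical illustration of the second bullet (finding high-enrollment courses with the lowest success rates); the file name, column names, the “C or better” success definition, and the 100-student enrollment cutoff are all assumptions:

```python
# Hypothetical sketch: surface high-enrollment courses with low success rates.
import pandas as pd

enrollments = pd.read_csv("course_enrollments.csv")   # hypothetical extract
enrollments["success"] = enrollments["grade"].isin(["A", "B", "C"])

summary = (
    enrollments.groupby("course")
    .agg(enrolled=("success", "size"), success_rate=("success", "mean"))
)

# Keep only high-enrollment courses, then list the lowest success rates first.
high_enrollment = summary[summary["enrolled"] >= 100]
print(high_enrollment.sort_values("success_rate").head(10))
```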

  9. True Experimental Design Pretest-Posttest Control Group Design: students are assigned at random to Group 1 (receives the intervention) or Group 2 (control), and post-test measurements for the two groups are compared to identify true differences. Source: Campbell, D.T. & Stanley, J. (1963). Experimental and Quasi-Experimental Designs for Research. Wadsworth Publishing
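
A small simulation can make the design concrete. Everything in the sketch below (group sizes, score scales, the size of the intervention effect) is invented to illustrate random assignment followed by a comparison of pretest-to-posttest gains; it is not data from the presentation:

```python
# Hypothetical pretest-posttest control group design with random assignment.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(42)
students = rng.permutation(200)          # random assignment of 200 students
group1, group2 = students[:100], students[100:]

pretest = rng.normal(70, 10, size=200)
posttest = pretest + rng.normal(5, 5, size=200)
posttest[group1] += 3                    # assumed intervention effect for Group 1

# Because assignment was random, a gap in gain scores can be read as a true difference.
gain = posttest - pretest
t, p = ttest_ind(gain[group1], gain[group2])
print(f"t={t:.2f}, p={p:.3f}")
```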

  10. Quasi-Experimental Design Posttest-Only Control Group Design: Group 1 receives the intervention while Group 2 (which could be a “historical cohort”) does not, and post-test measurements for the two groups are compared to identify true differences. Source: Campbell, D.T. & Stanley, J. (1963). Experimental and Quasi-Experimental Designs for Research. Wadsworth Publishing
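
In the posttest-only version there is no random assignment, so the comparison is often a current cohort against a historical cohort. The counts below are invented, and a two-proportion test is just one reasonable choice for a binary outcome such as persistence:

```python
# Hypothetical posttest-only comparison: intervention cohort vs. historical cohort.
from statsmodels.stats.proportion import proportions_ztest

persisted   = [412, 365]   # students persisting: intervention cohort, historical cohort
cohort_size = [600, 610]   # cohort sizes

# Without random assignment, a gap may reflect cohort differences rather than
# the intervention, so the result is suggestive rather than a "true difference."
z, p = proportions_ztest(persisted, cohort_size)
print(f"z={z:.2f}, p={p:.3f}")
```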

  11. Percentage of Students Persisting By Enrollment in a Student Success Class Source: Voorhees & Lee. (n.d.) Basics of Longitudinal Cohort Analysis. Retrieved April 15, 2012 at http://achievingthedream.org/sites/default/files/resources/ATD_Longitudinal-Cohort-Analysis.pdf

  12. SSBTN Template: Macro-Level Cohort, Micro-Level Cohorts, and Intervention Levels (diagram).

  13. Cause or Correlation? Attribution of cause and effect is difficult when working with a highly complex institution that has multiple programs and initiatives, as well as with students from a wide range of backgrounds and current environmental influences.
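
One common way to hedge against over-attributing cause is to adjust the comparison for observed background differences. The sketch below assumes a hypothetical extract with invented variable names and uses a logistic regression so that the intervention flag is estimated alongside other student characteristics; unobserved influences can still confound the estimate:

```python
# Hypothetical sketch: adjust an intervention/outcome association for covariates.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("student_cohort.csv")   # hypothetical extract; columns are invented
model = smf.logit(
    "persisted ~ success_class + pell_recipient + dev_ed_referral + age",
    data=df,
).fit()
print(model.summary())   # the success_class coefficient is adjusted, not causal proof
```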

  14. Sources of Evaluative Data • Administrative data systems • Focus groups • Faculty journaling • Student journaling • External surveys • Institutionally-developed surveys • Interactions with college services • Matching external databases

  15. Academic Terms, Administrative Data Systems, and What We Know About Students (timeline diagram: the academic term from census date to end-of-term, annotated “Most of What We Do Know About Students Happens Here,” “Most of What We Don’t Know About Students Happens Here,” and “We Could Do So Very Much Better Here”)

  16. Logic Modeling: Informing Planning and Evaluation. Planned Work (assumptions, resources and inputs, activities) leads to Intended Results (outputs, outcomes, impact).

  17. Logic Model Elements

  18. Why Use a Logic Model? 1. Program Design and Planning: serves as a planning tool to develop program strategy and enhance the ability to clearly explain and illustrate program concepts and approach to all college stakeholders 2. Program Implementation: forms the core for a focused management plan that helps identify and collect the data needed to monitor and improve programming 3. Program Evaluation and Strategic Reporting: presents program information and progress toward goals in ways that inform, advocate for a particular program approach, and teach program stakeholders

  19. Logic Models Aren’t Automatically Linear INPUTS (program investments: what we invest) → OUTPUTS (activities: what we do; participation: who we reach) → OUTCOMES (short, medium, and long-term: what results)

  20. Logic Model Worksheet: Assumptions, Inputs (your beginnings), Activities (your work), Outputs (your results), Outcomes, Impact
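
Because the worksheet is a set of labeled boxes, it can also be kept as a simple data structure so each element is filled in and reported consistently. The sketch below, including the example entries for a student success class, is illustrative only:

```python
# Hypothetical logic model worksheet as a data structure.
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    assumptions: list = field(default_factory=list)
    inputs:      list = field(default_factory=list)   # your beginnings
    activities:  list = field(default_factory=list)   # your work
    outputs:     list = field(default_factory=list)   # your results
    outcomes:    list = field(default_factory=list)
    impact:      list = field(default_factory=list)

worksheet = LogicModel(
    assumptions=["First-year students benefit from structured support"],
    inputs=["Advising staff time", "Course development funds"],
    activities=["Offer a student success class each term"],
    outputs=["Sections offered", "Students enrolled"],
    outcomes=["Higher term-to-term persistence"],
    impact=["Improved graduation rates"],
)
print(worksheet)
```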
