
Outcomes-Based, Evidence-Based Program Review






Presentation Transcript


  1. Outcomes-Based, Evidence-Based Program Review. Cyd Jenefsky, Ph.D., Professor & Associate VP of Academic Affairs, JFK University (jenefsky@jfku.edu). WASC Assessment Leadership Academy, August 5, 2011.

  2. Today’s Session:
  • Overview of OB-EB program review:
    • Conceptual Framework
    • Program Assessment : Program Review
    • Major Components
    • Key Qualities
    • Use of Results
  • Designing for Purpose

  3. Basics: What Is It?
  • Cyclical process for evaluating programs to make decisions about improvements
  • Comprehensive audit, inquiry, reflection & analysis:
    • quality, effectiveness, currency, assets, risks, sustainability/viability, mission alignment
  • Can be standardized, theme-based, customized, or a combination
  • Uses evidence from multiple sources as the basis of inquiry, evaluation, and follow-up planning and budgeting

  4. Conceptual Frame-Shift
  • From audit to inquiry
  • From a traditional input-based, process-focused model to an outcomes-based model
  • From description & advocacy to evidence-based analyses and planning
  • Heightened attention to improving the quality of student learning
  • From a focus on conducting effective program review to using the results effectively

  5. [Diagram: a comprehensive learning ecosystem encompassing inputs, resources, processes, outputs, and outcomes]

  6. [Diagram: assessment of student learning (multiple years) feeding into program review]

  7. [Diagram: program review drawing on multiple evidence sources: assessment of student learning, retention & graduation rates, employer satisfaction, student satisfaction, program reputation, admission requirements, advising, licensure rates, faculty, curriculum design & currency, and enrollment trends]

  8. Main components of the PR process:
  • Self-Study
  • External Review
  • Internal Review
  • Recommendations for Improvement
  • MOU = Final Results
  • Action Plan: Implementation

  9. Self-Study, External Review, and Internal Review: analysis of evidence of program quality and viability

  10. Clearly identify the evidence you want programs to analyze.
  • Use guiding questions:
    • for analysis of evidence
    • to structure the self-study inquiry and report
    • to structure feedback reports from internal and external reviewers

  11. What data/evidence do you need to see to understand program quality?

  12. Evidence of Quality and Effectiveness (institution-specific):
  • Curriculum (content, sequencing, currency, depth, rigor, breadth, scheduling, syllabi, delivery modalities, etc.)
  • Student learning and success
  • Co-curricular integration
  • Assessment practices
  • Learning environment & support
  • Student profile
  • Faculty
  • Research, recognition, awards

  13. Evidence of Viability/Sustainability (incl. administrative cost-effectiveness):
  • Demand for program (enrollment)
  • Student support services
  • Faculty
  • Staff
  • Information & technology resources
  • Facilities
  • Financial resources

  14. Sample results to analyze:
  • Student learning results (academic, co-curricular)
  • Retention/graduation rates
  • Licensure and placement rates
  • Enrollment trends
  • Student satisfaction with support services
  • Alumni achievements
  • Faculty grants, research, scholarship, awards
  • Feedback from external constituencies (e.g., employers, peer reviewers, community agencies, specialized accreditors, professional organizations)

  15. Identify key questions: What do you want to know about multiple years’ worth of learning results?

  16. Main components of the PR process:
  • Self-Study
  • External Review
  • Internal Review
  • Recommendations for Improvement
  • MOU
  • Action Plan

  17. Recommendations for Improvement
  • Significant actions
  • Evidence-based
  • Feasible
  • Aligned with priorities
  • Include improvements in student learning

  18. Exercise and Sports Science (2005)
  • Better lab and office space.
  • More grants.
  • Move offices closer to Sports Management.
  • Retreat to discuss goals.
  • Hire sports psychologist.
  • Increase gender and racial/ethnic diversity.
  • Prioritize health promotion.
  • Reduce number of upper-division electives.
  • Encourage community outreach.
  • Increase collaboration w/ other departments and programs.

  19. Theology & Religious Studies (2004)
  • Merge separate degree programs into one.
  • Work to bridge the divide between T&RS.
  • Revise and shorten student learning outcomes.
  • Start long-term planning around curriculum.
  • Introduce more dynamic courses.
  • Review the graduate program.
  • Greater mentorship of junior faculty.
  • Co-chairs and department sub-committees.

  20. MOU
  • Agreement between program/division and institution (or department & division)
  • Clear expectations
  • Timeline
  • Resources committed
  • Used for planning & budgeting at multiple levels in the institution
  • Accountability for implementation

  21. Sample MOUs from SFSU

  22. Action Plan: Implementation
  • Concrete actions
  • Persons responsible
  • Timeline
  • Resource procurement
  • Tracking progress and impact

  23. Action Plan Template and Sample Action Plan from CSU Fresno

  24. MOU + Action Plan = PR RESULTS

  25. Clearly Identify All Players’ Roles
  • IUPUI: http://planning.iupui.edu/39.html
  • SFSU Guide: http://air.sfsu.edu/prog_review/sixth.html
  • CSUN Program Review Distribution of Tasks: www.csun.edu/assessment/docs/ProgramReview
  • USD Guidelines for PR
  • JFKU PR Guide (on your Blackboard site)

  26. Coordinate with specialized accreditation cycles

  27. Summary of 7 Key Qualities of Outcomes-Based, Evidence-Based Program Review

  28. 1. Meaningful collective inquiry
  • Research questions that matter
  • Appropriate methodologies
  • Thematic, customized, uniform
  • Open to surprise
  • Multiple stakeholders involved, including students!

  29. 2. Stakeholder engagement
  • Faculty, support staff, students, peer reviewers, employers, community, external experts
  • Engaged in inquiry, evaluation, and decisions for improvement

  30. 3. Outcomes-based
  • Focused on results
  • Quality and effectiveness determined by evidence of outcomes achieved

  31. 4. Evidence-based
  • All evaluative claims about strengths, weaknesses, and proposed improvements supported by relevant qualitative and/or quantitative evidence
  • Evidence cited throughout
  • PR results used as evidence for follow-up planning and budgeting

  32. 5. Meaningful analysis
  • Reasoned analysis of evidence (vs. advocacy)
  • Significance for future program planning, including improving student learning
  • Responsive to changes in the external environment

  33. 6. Usable results
  • Yields information relevant to program improvement, including student learning
  • Avoids busy-work unlikely to be used
  • Informs an evidence-based action plan (with timeline) for improving program/student performance

  34. 7. Results used
  • Planning and budgeting at various levels throughout the institution:
    • Program
    • Department
    • Division/College
    • University-wide
  • Intentional capacity-building for constituencies to use results effectively

  35. Program-Level USE of Results: examples
  • Refine learning or service outcomes
  • Refocus curricula to reflect changes in the discipline or profession
  • Cut redundant classes & redesign other courses to align better with program learning outcomes
  • Re-sequence courses for better scaffolding
  • Coordinate course assignments in the major
  • Develop an undergraduate research initiative

  36. Program-Level USE of Results (cont’d)
  • Strengthen or streamline assessment practices
  • Enable a digital learning community for students
  • Reassign faculty/staff or request new lines
  • Design a professional development program to improve online teaching, learning, and assessment
  • Develop an applied learning initiative with employers and the community

  37. USE of Results Beyond the Program: examples
  • Identify and respond to trends across programs
  • Coordinate developmental courses
  • Writing across the curriculum
  • Clear flow-charts & paths to completion
  • Enhance intra-institutional synergies to promote student success:
    • e.g., collaborate with student services & general education to improve student retention and performance
  • Create a college-wide first-year program
  • Reallocate resources to academic support services

  38. USE of Results Beyond the Program (cont’d)
  • Create professional development programs across units
  • Better align department, college, and institutional goals
  • Develop cross-disciplinary service-learning collaborations
  • Grant resources university-wide to integrate adjunct faculty into program assessment
  • Reallocate resources to growing programs
  • Allocate resources to an undergrad research initiative
  • Improve e-Portfolio software
  • Revise the program review process to produce more usable results

  39. Design with Purpose: How do you want to use program review in your program, department, or institution? Identify the purposes you want it to serve.

  40. PURPOSES:
  • Improve student learning
  • Improve curriculum, pedagogy, learning environment, program administration, advising, etc.
  • Identify trends across program/department units
  • Set program/division/institutional goals
  • Inform program/division/institutional planning
  • Inform decision-making and resource re/allocation at multiple levels
  • Re/align programs with strategic priorities and plans
  • Respond to environmental changes
  • Report program performance to stakeholders

  41. General PURPOSE: To generate useful, usable results that inform planning and budgeting at various levels of the institution in order to improve the quality, effectiveness, and sustainability of programs.

  42. Evaluation of the Program Review Process
  • Is it effective? Is it achieving its intended purposes?
  • Criteria for evaluation:
    • Your intended purposes
    • WASC Sr. and ACCJC rubrics
    • Criteria for evaluating PR design
  • Feedback from multiple users/stakeholders
  • Plan for improvement
  • Accountability for implementation

  43. How well does your current program review process serve your purposes?

  44. Reflection Questions
  • Are the purposes of your PR process clear? Are they articulated in the guidelines?
  • What gaps or adjustments are needed in your current design to achieve those purposes, including using results more effectively?
  • In what parts of your program review process could you integrate a focus on student learning and assessment?
  • What kinds of professional development are needed to support this focus?
  • What kind of professional development is needed to improve evidence-based decision-making and the use of results to inform planning?
