
Assessing for Program Improvement

Presentation Transcript


  1. Assessing for Program Improvement Victor M. H. Borden Associate Professor of Psychology (IUPUI) Associate Vice President for University Planning, Institutional Research, and Accountability (IU) Presented at the University of Arizona February 11, 2009

  2. Overview • What I think you think I might talk about • What I think you need to think about

  3. How to Assess Programs for Improvement • A range of methods from simple to complex • The Core Idea • Simple models • Quality improvement models • Program review • More complex models

  4. The Core Idea: The Planning-Evaluation-Improvement Cycle • Plan → Implement → Assess → Improve • Compare the quality cycle: Plan → Do → Check → Act

  5. Toward a Spiral of Improvement (adapted from Norman Jackson) • 1. Think about program issues • 2. Engage with the problem: how do we improve the program? • 3. Develop resources/strategies to improve • 4. Implement changes (experiment) • 5. Evaluate impact on outcomes (Did it work as intended? How did people respond? What were the results?) • 6. Plan to improve • Then either move on to something else or go back to the drawing board

  6. Core Evaluation Cycle Questions

  7. Why the Fixation on Outcomes? • We haven’t paid sufficient systematic attention to this in the past • We look at inputs (resources) and processes (curricula and programs) fairly systematically • We tend to look at outcomes one student at a time • The link to accountability

  8. Simple Models of Assessment • Advantages • Easy to communicate, use, and learn from • Can be built into everyday work • Help build and maintain a culture of evidence • Models • The evaluation cycle (or spiral) • The assessment matrix template

  9. The Assessment Matrix
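
  The matrix itself is not reproduced in the transcript. As an illustration only, the sketch below models one common form of the template (outcome, measures, criterion, findings, actions) as a small Python structure; the column names and the sample orientation outcome are assumptions, not taken from the slides.

  ```python
  # Illustrative only: a minimal assessment-matrix row, assuming a common
  # column set (outcome, measures, criterion, findings, actions).
  from dataclasses import dataclass

  @dataclass
  class MatrixRow:
      outcome: str            # intended program or learning outcome
      measures: list[str]     # evidence sources used to assess the outcome
      criterion: str          # threshold that counts as "good enough"
      findings: str = ""      # what the evidence actually showed
      actions: str = ""       # planned improvements based on the findings

  # One row of a hypothetical matrix for a new-student orientation program
  row = MatrixRow(
      outcome="Students can locate key academic support services",
      measures=["post-orientation quiz", "first-semester survey"],
      criterion="80% of respondents identify at least three services",
  )

  def needs_follow_up(r: MatrixRow) -> bool:
      """A row stays 'open' until findings and follow-up actions are recorded."""
      return not (r.findings and r.actions)

  print(needs_follow_up(row))  # True: evidence not yet collected or acted upon
  ```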

  10. The Limits of Simple Models • Often overly simplistic relative to problems • Actual measures can be misguided • Implementation can be inconsistent across units • Not always easy to link outcome measures to “responsible” processes • doing the right thing vs. doing it right

  11. Quality Improvement Models • Advantages • Focus on process provides best chances for identifying points of improvement • Collaborative teams empower staff and help improve communication across units • Formulaic method and external staff support help guide and keep on track • Sample methods • Penn State’s Fast Track • U of Wisconsin Accelerated Improvement

  12. PSU Fast Track

  13. Team #526 -- Food Sciences Measures Task Force, College of Agricultural Sciences, December 2002 • Objective: Develop and implement a centralized system for collecting and reporting key performance indicators and departmental reports. • Action Plan: 1. Evaluate current processes and data sources for gathering data for performance indicators and department reports. 2. With information from #1, work with the Department Head to define "key performance indicators" from the Department's Strategic Plan. 3. Working with the Department Head, define "departmental reports and other measures". 4. Develop feasible solutions for a collection process for the data/information defined in #2 and #3. 5. Present solutions to the Sponsor with cost/benefit analyses. 6. Sponsor to disseminate the solution and plan to faculty. 7. Assist in implementing the solutions, including initial collection of information (a test cycle for the new process), revising the process, flowcharting and writing procedures, and training stakeholders. • Results Achieved to Date: The team was disbanded at the end of 2003 after completing action plan items 1-7. Most of the expected outcomes were achieved for a fundamental centralized data collection system. Given resource and budget constraints, the sponsor decided not to pursue further automation of the centralized data system at the time.

  14. http://www.wisc.edu/improve/improvement/accel.html

  15. UWisc Accelerated Improvement

  16. Limits of QI Models • Academicians wary of “business” models • Focus on process emphasizes doing it right over doing the right thing • Can be episodic rather than continuous

  17. Program Review • Program self-study, site visit by “peers” • Common method for academic programs • Increasing use for administrative programs • Fits well with accreditation framework • Guidelines shape tone and tenor • Content standards • Review team composition • Flexibility accommodates range of inquiry orientations

  18. Limits of Program Review • Expensive and time-consuming • Can be done with little participation • Or with a lot • Results not always directly useful for change • Memorandum of understanding helpful • Episodic nature not responsive to changing environment

  19. More Complex Models • Advantages • Handle true complexity • Provide in-depth insight into context • Academicians respect the scholarship (although not necessarily the particular approach) • Examples (from WMU’s evaluation center) • CIPP • Constructivist Evaluation

  20. More Complex Models

  21. The CIPP Model • Contractual Agreements • Context Evaluation • Input Evaluation • Process Evaluation • Impact Evaluation • Effectiveness Evaluation • Transportability Evaluation • Sustainability Evaluation • Metaevaluation • The Final Synthesis Report

  22. Constructivist Evaluation • Guba & Lincoln (2001) • Two-stage process • Discovery - effort to describe “what’s going on here,” the “here” being the evaluand and its context • Assimilation - effort to incorporate new discoveries into the existing construction or constructions …so that the new construction will fit, work, demonstrate relevance, and exhibit modifiability.

  23. Limits of Complex Models • Too complex to be practical • Expensive • They require an… • “evaluation unit as a staff operation at a high level of the organization in order to help insulate the unit from inappropriate internal influences and enhance its influence on decision making.”

  24. Take Home Points • There are many approaches to assessing for improvement • Virtually any method of inquiry can be accommodated • The point of all of them is to determine how well you are doing things and how they might be done better; and to then try doing better and to see if that improves the outcomes • Each can be done well or poorly

  25. Doing Assessment Well • Being “data-” or “evidence-driven” is not, in and of itself, a good thing • e.g., “selective use” of evidence to support a foregone conclusion • Torture numbers long enough and they’ll confess to anything • Effective use of data requires sharing diverse and often divergent perspectives • It’s not what the data say, it’s what you say about the data • Some disagreement and dissent is important to learning and innovation

  26. Further Heresy • Building effective programs requires some level of irrationality and disorder • To learn from what we do requires that we unlearn some things that we often don’t want to unlearn • As if doing this by ourselves were not difficult enough, we must do this together

  27. From Data- to Learning-Driven • Data-driven implies… • Rational, systematic testing of ideas through inspection of facts • sequential, often individual decision-making process • Learning-driven implies… • Going beyond what we already know and can do to gain new competencies • Deconstruction and reconstruction of ideas and beliefs • Becoming irrational to become re-rational

  28. Single- and Double-Loop Learning • Learning is the detection and correction of error (unintended consequences) • “Governing Variables” are those things that we feel are important to keep within acceptable limits • “Action Strategy” is what we do or plan to do to keep the governing variables within limits • “Consequences” are the intended and unintended outputs and outcomes • Intended: confirms our theory in use • Unintended: suggests an error in our theory in use

  29. Single-Loop Learning • Governing variables not called into question • Adjustments made to action strategies at best • Defense mechanisms can readily arise to maintain single-loop learning • [Diagram: Governing Variables → Action Strategies → Consequences]

  30. Double-Loop Learning • Questioning the role of the framing and learning systems which underlie actual goals and strategies • Reflection is fundamental • Basic assumptions are confronted • Hypotheses publicly tested • Falsification is sought • Ego is laid aside • [Diagram: Governing Variables → Action Strategies → Consequences]
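
  For readers who find a worked example clearer than the diagram, the sketch below is a rough, illustrative rendering of the distinction: single-loop learning only adjusts the action strategy, while double-loop learning also revisits the governing variable. The course pass-rate scenario, function names, and numbers are hypothetical, not from the slides.

  ```python
  # Illustrative sketch of single- vs. double-loop learning.
  # The pass-rate scenario and all names/values are hypothetical.

  def reconsider_goal(current_target: float) -> float:
      # Placeholder for the reflection/public-testing step described on the slide;
      # here it simply returns an adjusted, hypothetical goal.
      return 0.75

  def single_loop(target: float, observed: float, strategy: str) -> tuple[float, str]:
      """Single loop: the governing variable (target) stays fixed; only the action strategy changes."""
      if observed < target:
          strategy = "add supplemental instruction"   # tweak what we do
      return target, strategy

  def double_loop(target: float, observed: float, strategy: str) -> tuple[float, str]:
      """Double loop: the governing variable itself is questioned before changing tactics."""
      if observed < target:
          target = reconsider_goal(target)            # confront the underlying assumption
          strategy = "redesign the course around the revised goal"
      return target, strategy

  print(single_loop(0.90, 0.70, "lecture only"))  # target unchanged, strategy tweaked
  print(double_loop(0.90, 0.70, "lecture only"))  # target revisited as well
  ```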

  31. Model I and II Org Learning • Single- and double-loop learning at the organizational level • Model I: Organizational members subscribe to a common theory in use • Organizational policies and practices inhibit change • Model II: Governing values, policies, and practices promote double-loop learning

  32. A Model I Learning Organization • Governing Variables • Toe the line • Win at all costs • Suppress negative feelings • Emphasize rationality • Action Strategies • Control environment and task unilaterally • Protect self and others unilaterally • Discourage inquiry • Consequences • Defensive relationships • Low freedom of choice • Reduced production of valid information • Little public testing of ideas

  33. A Model II Learning Organization • Governing Variables • Valid information is most important • Free and informed choice • Shared internal commitment • Action Strategies • Shared control • Participation in design and implementation of action • Consequences • Minimally defensive relationships • High freedom of choice • Public testing of ideas

  34. Participatory Action Research/Inquiry • Systematic inquiry process • Can use any of aforementioned methods • Stakeholder empowerment through active and on-going participation • Dialog throughout process promotes collaboration • Active learning and discovery fostered by critical reflection process • Action plans create shared responsibility for doing something with the results • Follow-up to action (checking results) maintains relationships and commitments

  35. Participatory Action Research/Inquiry • Quotes from Handbook of Action Research by Peter Reason • http://www.bath.ac.uk/~mnspwr/Papers/HandbookIntroduction.htm • The aim of participatory action research is to change practices, social structures, and social media which maintain irrationality, injustice, and unsatisfying forms of existence. • (Robin McTaggart) • Participatory research is a process through which members of an oppressed group or community identify a problem, collect and analyse information, and act upon the problem in order to find solutions and to promote social and political transformation. • Daniel Selener • We must keep on trying to understand better, change and reenchant our plural world. • Orlando Fals Borda

  36. Participatory Action Research • Who does what? • Decides what actions are taken? • Is responsible for effective implementation? • Can devise appropriate evaluation protocols? • Has access to or can collect appropriate evidence? • Reviews the results and decides what to do? • What can be done to get these people to work together and in concert?

  37. Example: Evaluation of New Student Orientation • Research Question and Evaluation Focus • reassessment of goals; incoming students’ needs; impacts on knowledge, attitudes, and behaviors • Data Collection • focus groups and questionnaires; sought perspectives of all major stakeholders • Data Reporting and Feedback • meetings with orientation leaders and faculty stakeholders • Development of Action Plans • facilitation of dialogue and data-driven proposals • Action • implementation of proposed changes • Assessment • on-going formative evaluation; re-administration of process and outcome instruments

  38. Example: Indiana Project on Academic Success (IPAS) • Research-based inquiry for enhancing academic success • Four-stage method • Assessment • Organizing • Action Inquiry • Evaluation • Supported by use of state and institutional student tracking records

  39. Stage 1: Assessment • Compare campus assessment information to statewide assessment results; identify possible challenges • Collect additional information from campus sources, such as prior reports and studies and focus group interviews • Organize teams of administrators, faculty, professional staff, and students to identify critical challenges on the campus • Prioritize the challenges, identifying two or three that merit special attention at a campus level
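
  As a concrete, entirely hypothetical illustration of the first bullet above, the sketch below compares campus rates with statewide figures to flag candidate challenges for the campus team. The metrics, values, column names, and the five-point gap threshold are assumptions for illustration, not part of IPAS.

  ```python
  # Hypothetical sketch: flag metrics where the campus trails the statewide figure.
  # Metrics, values, columns, and the 5-percentage-point threshold are illustrative only.
  import pandas as pd

  campus = pd.DataFrame(
      {"metric": ["1st-year retention", "2-year persistence", "gateway course pass rate"],
       "campus_pct": [68.0, 55.0, 71.0]}
  )
  statewide = pd.DataFrame(
      {"metric": ["1st-year retention", "2-year persistence", "gateway course pass rate"],
       "state_pct": [74.0, 58.0, 70.0]}
  )

  merged = campus.merge(statewide, on="metric")
  merged["gap"] = merged["campus_pct"] - merged["state_pct"]

  # Candidate challenges for the campus team to prioritize (campus trails by 5+ points)
  challenges = merged[merged["gap"] <= -5.0]
  print(challenges[["metric", "campus_pct", "state_pct", "gap"]])
  ```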

  40. Stage 2: Organizing • Coordinate the assessment and inquiry process with campus-level planning and budgeting; integrate the challenges with strategic plans; coordinate budgeting to provide necessary support. • Appoint workgroups to address critical, campus-wide challenges; consider providing release time to team leaders to work on tasks for the campus. • Coordinate the inquiry process (activities of the workgroups) with campus planning and budgeting.

  41. Stage 3: Action Inquiry • Build an Understanding of the Challenge • What solutions have been tried in the past, and how well did they work? What aspects of the challenge have not been adequately addressed? What aspects of the challenge require more study? Develop hypotheses about the causes for the challenges using data to test the hypotheses. Do the explanations hold up to the evidence? • Look Internally and Externally for Solutions • Talk with people on campus about how they have addressed related challenges. Consider best practices for retention and how they might be adapted to meet local needs. Visit other campuses that have tried out different approaches to the problem. How well would these alternatives address the challenge at your campus? • Assess Possible Solutions • Consider alternatives in relation to the understanding of the problem developed in Stage 3, step 1. Will the solutions address the challenge at your campus? How can the solution be pilot tested? If you tried out the solution, how would you know if it worked? What information would you need to know how well it worked?

  42. Stage 3: Action Inquiry (cont.) • Develop Action Plans • Action plans should address the implementation of solutions that should be pilot tested. Consider solutions that can be implemented by current staff. If there are additional costs, develop budgets for consideration internally and externally. (Remember, seeking additional funds can slow down the change process.) Develop action plans with time frames for implementation and evaluation

  43. Stage 4: Evaluate • Implement Pilot Test and Evaluate • Provide feedback to workgroups and campus coordinating team. Use evaluation results to refine the solution. Also, evaluation can be used as a basis for seeking additional funding from internal and external sources, if needed

  44. Building Trust: Lowering Resistance to Change • Do… evaluate program effectiveness; provide incentive for using information (regardless of results); raise expectations regarding quality and use of evidence; be patient with the learning curve; raise expectations for learning (for students and colleagues) • Don’t… evaluate individual effectiveness; tie resource allocation directly to results; beat people over the head with findings; confuse anecdotes with evidence; keep changing direction based on initial findings; lower expectations for learning

  45. What’s the Point? • Assessment and evaluation are means not ends • Other important ingredients include: • Bringing the right people together • A climate of trust and experimentation • Incentives and support • It’s not rocket science • An imprecise answer to the right question is much more useful than a precise answer to the wrong question
