Institutional Effectiveness at Southern Methodist University


Presentation Transcript


  1. Institutional Effectiveness at Southern Methodist University Focusing Efforts, Building Sustainability, and Mitigating Risk January 25th, 2006 J. Joseph Hoey, Ed.D.

  2. SACS • Southern Association of Colleges and Schools, Commission on Colleges • Quasi-Federal Agency • Recognized by the U.S. Dept. of Education • Title IV funds gatekeeper • It’s us! Peer evaluators from all over the South • Extensive training program provided

  3. Principles of Accreditation: The SACS Standpoint • The Commission on Colleges adheres to the following fundamental characteristics of accreditation: • Accreditation requires institutional commitment and engagement. • Accreditation requires an institutional commitment to student learning and achievement. • Accreditation acknowledges an institution’s prerogative to articulate its mission within the recognized context of higher education and its responsibility to show that it is accomplishing its mission. • SACS Principles of Accreditation, p.4 (excerpt)

  4. SACS Reaffirmation: New Ballgame • New Standards: Principles of Accreditation • Old Criteria: 354 “must” statements • New Principles: 73 broader standards • New Vision • Retrospective: Compliance Report • Prospective: Quality Enhancement Plan (QEP) • New Focus • “Institutional Effectiveness” a cornerstone • Need to demonstrate we do what we say we do

  5. SACS Reaffirmation: New Ballgame • New Process • Compliance Report submission • Off-site (online) review of compliance • Focused report • Quality Enhancement Plan development • On-site review and visit • Lingering compliance issues • Quality Enhancement Plan • Follow-up report • SACS reaffirmation decision • Commission may request follow-up reports • Five-year mid-cycle review

  6. SACS: Most Frequently Encountered Areas for Recommendations • Planning, Institutional Effectiveness and Assessment • Documentary evidence of an integrated system of planning, assessment, and institutional effectiveness is required • Evidence from multiple annual cycles is expected • Financials • Faculty Credentials • Consortia and Agreements • Quality Enhancement Plan

  7. Quality Enhancement Plan (QEP) • A major plan (project) demonstrating the institution’s commitment to student learning • The focus and scope of the plan are up to the institution; however, it must: • have broad institutional involvement • have resources identified to ensure completion • be well conceived and justified • have thoughtful implementation plans • Good example: Georgia Tech’s QEP, www.assessment.gatech.edu/SACS/QEP

  8. Four Basic Questions To Ask • What markets are we trying to serve? • What services must be in place to fully serve those markets? • What is the institutional “branding” that will enable our university to appeal to those markets? • How will we know if our institution is effective in serving those markets? Source: Society for College and University Planning (2006)

  9. Educational Effectiveness: The New Edge • How will we know if our institution is effective in serving our intended markets? • The past decade has seen tremendous growth in our knowledge of how people learn and the relationship between learning, technology, and space programming. Are we competitive? How do we know? • Private institutions are increasingly using assessment information to determine their effectiveness and increase their market edge. See http://www.elon.edu/e-web/news/nsse/2005/.

  10. Where to Start: Focusing Efforts and Mitigating Risk • Conduct an audit to determine current strengths and areas of risk. • Plan and implement processes to enculturate the SACS Principles of Accreditation and minimize institutional risk. • Keep momentum going for institutional effectiveness: if it’s not integrated into culture and processes, it will not be sustainable. • Start now: the process with the longest required lead time, and the most culturally difficult, is linking planning, institutional effectiveness, and assessment.

  11. Principles of Accreditation: Research-Based Planning and Evaluation Process • Core Requirement 2.5: “The institution engages in ongoing, integrated, and institution-wide research-based planning and evaluation processes that incorporate a systematic review of programs and services that (a) results in continuing improvement and (b) demonstrates that the institution is effectively accomplishing its mission.”

  12. Accreditation: Importance of Linkages • Clear demonstration of linkage between consideration of assessment results, educational planning, and budget allocation process is vital. • Good examples: • Charleston Southern University http://www.csuniv.edu/version3/fs/institutionaleffectiveness.asp • CSU Sacramento http://www.csus.edu/cup/planning/03_Institutional_Planning_Final.pdf

  13. Principles of Accreditation: Mission • Core Requirement 2.4: “The institution has a clearly defined and published mission statement specific to the institution and appropriate to an institution of higher education, addressing teaching and learning and, where applicable, research and public service.” • Comprehensive Standard 3.1.1: “The institution has a clear and comprehensive mission statement that guides it; is approved by the governing board; is periodically reviewed by the board; and is communicated to the institution’s constituencies.”

  14. Strategic Planning: Evaluation of the Mission • Does your mission statement… • Specify the fundamental reasons for your institution’s existence? • Establish the scope of your institution? • Identify your institution’s unique characteristics? • Provide a consistent message to all constituents? • Provide overall policy direction to the institution? • Direct your short-term as well as your strategic planning initiatives? • Need to be re-examined? Source: Society for College and University Planning (2006)

  15. College of William and Mary: Institutional Effectiveness Calendar

  16. Matrix to Tie Budgeting, Planning and Assessing Source: Society for College and University Planning, 2006

  17. How to Build Sustainability: Process Integration • Make assessment part of core infrastructure and processes: • planning, budgeting, resource allocation, and reporting cycles; • socialization processes; • strategic planning process; • program review; • promotion/tenure/reward structure; • professional development opportunities; and • administrative services.

  18. Integrating Assessment and Annual Reporting • Unit-based assessment update becomes part of the annual report at the school/college (and then institution) level. • Annual updates on broad institutional performance indicators, critical success factors, and student learning are included. • Examine the relationship of annual reports to metrics established in the strategic plan to help determine institutional progress.

  19. Degree Program Level: BS in HTS

  20. Integrating Assessment, Budgeting and Resource Allocation • SACS Standard: Budget is based on sound educational planning. • Expectations for linkage to assessment results are clearly established • in budget letter to units, • as part of a compact planning process, or • through other direct communication channels. • Teachable moment in creating a culture of evidence: With new initiatives, recommendations, or increases over base, ask for evidence of student learning. “Show me the data before I show you the money.”

  21. Integrating Assessment and Socialization Processes • Generational turnover means that institutional memory is being lost • New faculty orientation • Presentations to new chairs • New student orientation

  22. Integrating Assessment and Reward Structure • Crucial administrative role: act as catalyst for innovations • Provide recognition and rewards for innovative projects • Provide competitive grants for innovative teaching and learning projects that include assessment

  23. Integrating Assessment and Professional Development • Provide faculty and staff with the necessary tools to accomplish assessment as part of their jobs: • communication and recognition of best practices around campus will reinforce a culture of reflection and action • web links to assessment efforts appropriate to each discipline • keep information provided simple and understandable: for example, see the CMU “Toolkit” developed by faculty: http://www.provost.cmich.edu/assessment/toolkit/toolkit.htm • campus forums/assessment seminars/discussion groups • sponsorship of faculty attendance at assessment-related conferences

  24. Integrating Assessment and Funded Projects • Funding agencies (NSF, NIH, Sloan, etc.) require solid project evaluation as part of proposal funding. • Ideal opportunity to involve faculty in assessment within their area of interest and demonstrate project results in terms of student learning.

  25. Integrating Assessment and Promotion/Tenure Review • Teaching awards • Teaching portfolios • Educational research projects counted in P&T process – c.f. Boyer’s Scholarship of Teaching • Pursuit of extramurally-funded research projects related to educational improvement

  26. Integrating Student Involvement in Assessment • Engagement: Get students involved in a positive way (Palomba and Banta, 1999). • Responsibility: students need to understand a clear set of expectations for their role in assessment. • Resources: they need relevant information and support. • Rewards: they may need some sort of incentive for their participation. • Most helpful: evidence of faculty commitment to assessment.

  27. Integrating Assessment and Program Review • Identify key issues that will focus review (Wergin, 2003) • Incorporate evidence of student learning: Assessment is at the heart • More frequent cycle (e.g., 5 years) • Use annual assessment results to support meaningful trend analysis • Institutional oversight committee • Reviews results • Discusses implications • Makes recommendations • Feeds into planning processes

  28. Integrating Assessment and Administrative Services • The Principles of Accreditation also require all administrative and support units to continuously assess outcomes of operations and demonstrate usage of evaluation results for improvement. • Seek appropriate outcome measures, professional service level standards, and benchmarks (e.g., CAS, NACUBO, APPA).

  29. Integrating Assessment and Strategic Planning • SWOT Analysis: • Assessment data: basis for analysis of internal strengths and weaknesses; • Feedback from alumni, employers: part of the basis for considering external opportunities and threats; • At the institutional level, assessment data are used to: • Chart progress towards campus planning goals, • Inform the institution of the needs and expectations of internal and external clients and stakeholders, • Illuminate trends in institutional effectiveness, and • Thereby inform the process of establishing/modifying strategic direction for the institution.

  30. Principles of Accreditation: Assessment • Comprehensive Standard 3.3.1: “The institution identifies expected outcomes for its educational programs and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results.” • Comprehensive Standard 3.4.1: “The institution demonstrates that each educational program for which academic credit is awarded (a) is approved by the faculty and the administration, and (b) establishes and evaluates program and learning outcomes.”

  31. Developing Program-Level Faculty Expectations for Student Learning Approach: Top-down or bottom-up: • Top Down: As a group of scholars, decide which faculty expectations for students in the program are most important. • Bottom Up: Identify recurring course-level faculty expectations of core and capstone courses in the program, and use this list to develop overarching program-level expectations. • Combination: Combine both approaches (Palomba and Palomba, 2001)

  32. Elements of a Program-Level Institutional Effectiveness Plan • Unit statement of mission or purpose • Faculty expectations for student learning and operational objectives • Educational practices and experiences • Assessment measures and standards for achievement • Implementation plan: state who is responsible • Dissemination and evaluation of results • Action: Documented use of results to maintain and enhance program excellence
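  The elements listed on slide 32 amount to a simple record structure. Purely as an illustrative sketch, and not part of the original presentation, the hypothetical Python definitions below show one way a program might capture these elements for its own tracking; every class and field name here is an assumption, not SACS terminology or an official template.

    from dataclasses import dataclass, field
    from typing import List

    # Hypothetical sketch only: names and structure are illustrative.

    @dataclass
    class AssessmentMeasure:
        description: str        # e.g., "rubric rating of capstone portfolios"
        standard: str           # standard for achievement, e.g., "mean score >= 3 of 4"
        responsible_party: str  # who implements the measure (implementation plan)

    @dataclass
    class ProgramEffectivenessPlan:
        program: str
        mission: str                                                     # unit statement of mission or purpose
        learning_expectations: List[str] = field(default_factory=list)   # faculty expectations for student learning
        operational_objectives: List[str] = field(default_factory=list)  # operational objectives
        practices: List[str] = field(default_factory=list)               # educational practices and experiences
        measures: List[AssessmentMeasure] = field(default_factory=list)  # measures and standards for achievement
        dissemination: str = ""                                          # dissemination and evaluation of results
        actions_taken: List[str] = field(default_factory=list)           # documented use of results

  Keeping the plan in a structured form like this (a spreadsheet or database would serve equally well) makes it easier to roll unit-level updates into the annual reporting cycle described on slide 18.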

  33. Differentiate Faculty Expectations from Operational Objectives • Faculty Expectation: • Student work will demonstrate competent use of a diagnostic problem-solving model. • Program Operational Objective: • The program will admit 10% more students next year, with 5% higher overall SAT scores.

  34. Preparing Faculty Expectations for Student Learning: A Tasteful Context…

  35. Describing Expectations for Apple Pie… • What are our expectations for a great apple pie and what criteria would we use to ascertain if our expectations are met? • What would be some good methods for us to check and see if our expectations are being met? • What point(s) would be a good time to do so? • What would we call success? What would it look like?

  36. Describing Our Pie Expectations… • What’s one criterion for a good pie? • It’s baked all the way through. • How can you tell if it’s baked? • Direct method: taste test • Indirect method: color of crust. • When would we check this out? • End of baking cycle. • What would success look like? • First bite melts in mouth (direct measurement). • Crust is golden brown color (indirect indicator).

  37. From Pie to Students: Describing Faculty Expectations • Think in terms of end results. What should students be able to know, do, or value when they finish the program? • How would we know they know? What criteria would we use to form a judgment? • What method would we use to see if our expectations have been met? • When would we conduct measurements? • What level of performance would we call a success?

  38. Faculty Expectation Example: Master’s in Music Composition • Expectation: Master’s composition graduates will synthesize the musical language of 20th Century composers in their work. • Some possible sources of evidence: student portfolios, course portfolios that include student work • Possible ways to measure: 3 faculty raters use a simple rubric designed for this purpose to rate student work collected in student and/or course portfolios • When to measure: end of 1st year; master’s performance of their works at the end of the 2nd year. • Possible standard for success: successful synthesis (defined in the rubric) of harmonic, rhythmic, formal, and scalar materials into the student’s compositional vocabulary. Evidence may include student work from selected courses and culminating experiences (e.g., exam/performance). • Possible uses of results: an instructor redesigns an assignment, the faculty restructures the sequencing of courses (or adds a new course), the dean allocates resources for software (or a new faculty line)
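  As a purely illustrative sketch of the measurement step above (assuming a 1–4 rubric scale and a success threshold of 3.0, neither of which appears in the original slide), the short Python example below shows how three raters’ scores for one student portfolio might be summarized and checked against a standard for success.

    from statistics import mean

    # Hypothetical example: rubric criteria follow slide 38; the 1-4 scale
    # and the 3.0 threshold are assumptions for illustration only.
    RUBRIC_CRITERIA = ("harmonic", "rhythmic", "formal", "scalar")
    SUCCESS_THRESHOLD = 3.0

    def summarize_ratings(ratings):
        """ratings maps each criterion to the three raters' scores; return per-criterion means."""
        return {criterion: mean(scores) for criterion, scores in ratings.items()}

    def meets_standard(summary, threshold=SUCCESS_THRESHOLD):
        """True only if every rubric criterion reaches the assumed threshold."""
        return all(score >= threshold for score in summary.values())

    # One student's portfolio, scored by three faculty raters
    portfolio_ratings = {
        "harmonic": [3, 4, 3],
        "rhythmic": [4, 3, 3],
        "formal":   [3, 3, 2],
        "scalar":   [4, 4, 3],
    }
    summary = summarize_ratings(portfolio_ratings)
    print(summary, "meets standard:", meets_standard(summary))

  In this made-up data the “formal” criterion falls below the assumed threshold, which is exactly the kind of result that would trigger the possible uses of results listed above (redesigning an assignment, resequencing courses, or reallocating resources).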

  39. Summary • SACS Principles: New Ballgame • Educational Effectiveness: the New Edge • Linkage: A Basic Expectation • Now: audit current compliance, focus on needed areas, create sustainable systems, and mitigate risk • Sustainability achieved through process integration • Develop and implement program-level assessment plans, based on well-crafted faculty expectations for student learning • Use assessment results to inform planning and budget requests, and to prompt actions to further the excellence of student learning in our academic programs.
