
Twin Peaks of Assessment: Institutional Effectiveness and Student Learning




Presentation Transcript


  1. Twin Peaks of Assessment: Institutional Effectiveness and Student Learning Dr. Jo Allen, Senior Vice President & Provost Widener University

  2. Overview of Presentation • Operational Terms • Drivers of assessment • Assessment of institutional effectiveness: What Accreditors want to know & what the institution wants to know • Assessment of student learning outcomes: What Accreditors want to know and what the institution wants to know • Questions and concerns

  3. Assessment: An Operational Definition Assessment is the process of asking and answering questions that seek to align our stated intentions with documentable realities. As such, in higher education, it deals with courses, programs, policies, procedures, and operations.

  4. Evaluation: An Operational Definition Evaluation focuses on individual performance in the sense of task performance or job completion and quality, typically resulting in merit raises, plans for future improvement, or—in less satisfying cases—probation and possibly firing.

  5. Assessment vs. Evaluation • Assessment focuses on the work to be done, the outcomes, and the impact on others—typically, the aggregate situation, not just the individuals. • Evaluation focuses on the work of the individuals—their contributions, effectiveness, creativity, responsibility, engagement, or whatever factors the organization deems most desirable.

  6. Assessment vs. Evaluation Assessment focuses on the work to be done, the outcomes, and the impact on others—not on the individuals doing the work. Evaluation focuses on the work of the individuals—their contributions, effectiveness, creativity, responsibility, engagement, or whatever factors the organization deems most desirable.

  7. Assessment of Institutional Effectiveness vs. Student Learning • Institutional effectiveness = the results of operational processes, policies, duties and sites—and their success in working together—to support the management of the academy [Standard 7] • Student learning = the results of curricular and co-curricular experiences designed to provide students with knowledge and skills [Standard 14]

  8. What or who is driving assessment? Accreditors… • who determine the reputable from non-reputable institutions and programs • who ensure that institutional practices support the viability and sustainability of the institution and its offerings • who represent disciplinary and institutional interests

  9. Assessment drivers (cont’d.) • The public: “Ivory Tower,” liberal bias, ratings/rankings? • Legislators: responsive to citizens’ concerns about quality, costs, biases… or? • Prospective faculty: Quality and meaningful contributions to students’ lives? • Prospective parents: real learning and preparation for careers—worth the money? • Prospective students: How will I measure up? And what kind of job can I get when I graduate? • Funding agencies/foundations: evidence of an institution’s or faculty’s commitment to learning and knowledge and evidence of [prior] success?

  10. Make no mistake….

  11. Institutional Effectiveness What Accreditors Want to Know

  12. Institutional Effectiveness: What Accreditors Want to Know • Can you verify the effectiveness of operational contributors to a sustainable educational experience? • Do you use data and other findings to improve the quality of your educational and operational offerings? • Do you use those findings to align resources (financial, staff, curricular, co-curricular) to enhance desired outcomes?

  13. What sensibilities point to institutional effectiveness?

  14. What sensibilities point to institutional effectiveness? • A well-articulated set of processes for critical functions • A clear line of responsibility and accountability for critical functions • An alignment of the importance of the function and sufficient resources (staff, budget, training, etc.) to support the function • Evidence of institution-wide knowledge of those critical functions, processes, and lines of responsibility

  15. What kinds of evidence point to institutional effectiveness?

  16. What kinds of evidence point to institutional effectiveness? • Well-managed budgets • Accreditation and governmental compliance • Clearly defined and supported shared governance (board, president, administration, faculty, staff, and students) • Articulated communication pathways and strategies [transparency] • Consensus on mission, strategic plan, goals, priorities, etc. • Student (and other constituencies’) satisfaction

  17. Sites of Institutional Effectiveness

  18. Sites of Institutional Effectiveness • Processes [existence and transparency] • Enrollment: Admissions, financial aid, registration • Curricular: Advising, progress toward degree completion • Budgeting: operations/salaries; capital; bond ratings and ratios; endowment management; benefits; etc. • Planning: strategic planning, compact planning, curricular planning, etc. • Judicial: education/training, communication, sanctions, etc. • Residence Life: housing selection, training for RAs, conflict resolution/mediation • Advancement: fund-raising, alumni relations, public relations, government/corporate relations, community relations, etc.

  19. Sites of Institutional Effectiveness • Units/Offices of operations (samples) • Advancement • Admissions • Bursar • Registrar • Athletics • Deans (school/college) • Center for Advising, Academic Support, etc. • Campus Safety • Institutional Research • IT • Maintenance

  20. Measures of Institutional Effectiveness

  21. How do we measure institutional effectiveness? • Tangible data: Audited budget statements, handbooks, enrollment data, institutional data • Records/reports of activities and/or compliance • Self-studies pointing to documented evidence • Surveys of satisfaction, usage, attitudes, confidence, etc. • Disciplinary accreditation reports

  22. The Assessment Cycle: Key Questions the Institution Should Be Asking/Answering About Institutional Effectiveness • What services, programs, or benefits should our offices provide? • For what purposes or with what intended results? • What evidence do we have that they provide these outcomes? • How can we use that information to improve or celebrate successes? • Do the improvements we make work?

  23. Where do we seek improvement [and what evidence will help us]? • Set Measurable Goals: We will raise the number of students who choose our institution as their first choice to 95% by 2012. • Demonstrate Transparency: All faculty committees will be invited to participate in the next planning meeting. • Respond to Unacceptable Findings: Students (39%) still report feeling unsafe in the mezzanine of the University Gallery. We will hire 4 new security staff, add 10 Saf-Tee lights…

  24. The Iterative Assessment Cycle for Institutional Effectiveness [cycle diagram: Mission/Purposes → Objectives/Goals → Outcomes → Implement Methods to Gather Evidence → Gather Evidence → Interpret Evidence → Make decisions to improve programs, services, or benefits; contribute to institutional experience; inform institutional decision-making, planning, budgeting, policy, and public accountability → back to Mission/Purposes]

  25. Student Learning Outcomes Assessment

  26. Student Learning Outcomes Assessment What Accreditors Want to Know

  27. Student Learning Outcomes: What accreditors want to know… • Have you articulated your institutional, general education, and disciplinary/course-based learning objectives? • Are the objectives documented? Where? • Are the objectives measurable? • Have you actually conducted the assessment to see if students have learned what you expect them to learn? • Did you use your results to maintain or improve your educational offerings?

  28. Learning Outcomes? • Civic engagement • Diversity appreciation • Communication skills • Professional responsibility • Ethics • Critical thinking • Collaborative learning • Leadership • Mathematical or quantitative competence • Technological competence • Scientific competence • Research skills • Cultural competence • Interdisciplinary competence • Civic responsibility • Global competence • Economic/financial competence • Social justice

  29. Measurable Objectives/Outcomes? • Yes or No Evidence of… • The degree to which… • Alignment evidence…

  30. Sites of Evidence? • Essays/Theses • Portfolios (faculty or external readers evaluated) • Quizzes • Oral presentations • Homework assignments • Lab experiments • Tests • Journal entries • Projects • Demonstrations

  31. Conducted the Assessment?

  32. Analyze, Interpret, Reflect? What does it all mean?

  33. DO?

  34. Do! • Alter the curriculum content • Alter the teaching methodology • Alter the assignments • Alter the schedule • Alter the course rotation • Alter the students

  35. Reassess: Did the alterations help? • Better? • Smarter? • Faster? • Safer? • More involvement? • More effective? • More efficient? • More sustainable? • More replicable?

  36. Student Learning Assessment: Key Questions for the Institution • What should our students know or be able to do by the time they graduate? • What evidence do we have that they know and can do these things? • How can we use information to improve or celebrate successes? • Do the improvements we make work?

  37. The Iterative Assessment Cycle [cycle diagram: Mission/Purposes → Objectives/Goals → Outcomes → Implement Methods to Gather Evidence → Gather Evidence → Interpret Evidence → Make decisions to improve programs; enhance student learning and development; inform institutional decision-making, planning, budgeting, policy, and public accountability → back to Mission/Purposes]

  38. What are educators seeking? • Evidence of students’ skill level (basic competency to mastery) • based on faculty-articulated standards of quality and judgments • applied to all students’ work evenly • indicative of aggregate evaluations of performance or knowledge • informative for course or program improvements

  39. Middle States… • No prescription for your learning objectives and outcomes • No prescription for how you measure • No prescription for what you do as a result

  40. Middle States • Evidence of learning outcomes • Evidence of measures • Evidence of analysis and action

  41. Assessment • Standard 7: How is the institution doing? • Standard 14: What and how much are the students learning?

  42. Assessment of Institutional Effectiveness & Student Learning Outcomes: What is similar? • A commitment to doing the very best job possible under whatever conditions exist • A commitment to recognizing ways that altering those conditions can affect the outcomes (e.g., labs, field placements, time of meeting, style of teaching) • A commitment to recognizing that altering the outcomes can affect the conditions (e.g., student success in particular studies attracts more students of certain kinds)

  43. Ultimately…. We hold ourselves and our colleagues accountable for articulating the intentions of our work and then measuring the realities, resulting in designing and implementing strategies for improvement over time. • How are we doing? • How can we do better?

  44. THE END. Questions? Comments?
