OUTCOMES ASSESSMENT: Linking Learning, Assessment and Program Improvement



  1. OUTCOMES ASSESSMENT: Linking Learning, Assessment and Program Improvement James O. Carey Associate Professor Emeritus School of Information University of South Florida ALA Annual Meeting, June 27, 2011

  2. Overview • What is outcomes assessment? • Why do outcomes assessment? • What are the required elements of outcomes assessment? • How is an outcomes assessment process implemented? • Practical tips for successful outcomes assessment • Summary and Conclusions

  3. What is Outcomes Assessment? Three characteristics: • Identifying desired outcomes • Assessing progress on outcomes • Using the results of assessment for improvement Roughly equivalent to: • Institutional effectiveness • Accountability • Continuous improvement • Quality assurance • Formative evaluation

  4. What is Outcomes Assessment? A continuous process . . . instead of an event. [Cycle diagram: needs → planning → program → assessment]

  5. What is Outcomes Assessment? • At the institutional level • Institutional effectiveness • Accountability • For example: • The university's goal is to graduate 80% of entering freshmen within 5 years • A three-year assessment indicates a 5-year graduation rate of 63% and points to financial problems as the primary cause • The Office of Student Affairs will create a student services task force on alternative paths to financial viability

  6. What is Outcomes Assessment? • At the program level • Accountability • Outcomes assessment • For example: • A departmental objective is relevant job placement for 85% of graduates within 1 year of graduation • Surveys of alumni indicate 70% have found relevant placements in the first year • An initiative is planned to contact relevant professional constituencies and to develop structured involvement with regional employers in the program

  7. What is Outcomes Assessment? • At the program level • Student learning outcomes assessment • Another example: • A library school wants graduates to be able to write an action research plan for a given problem scenario. • On the comprehensive exam, 18% of students fail the action research question, and analysis indicates that most took the course with an adjunct • The curriculum committee standardizes the syllabus for the research course and the pass rate increases

  8. What is Outcomes Assessment? • At the classroom level • Student learning outcomes assessment • For example: • The instructor wants students to learn how to analyze service needs for a specified patron group • On the final project students consistently confuse patron needs with programming alternatives • The instructor develops new case studies on analyzing service needs, and performance on the final project improves to an acceptable level

  9. What is Outcomes Assessment? To summarize: • Has three basic components • Is part of a whole family of accountability methodologies • Goes by many names • Is used in many organizations in both public and private sectors • Is used at many levels within organizations for multiple accountability purposes

  10. Our Purpose Today • Institutional and programmatic learning outcomes assessment • For accreditation • Mandated, unit-level accountability • Shared responsibility Rather than • Course-level outcomes assessment • For course improvement • Elective, individual accountability • Personal responsibility

  11. Our Purpose Today Focus on: • Program level accreditation requirements • Systematic planning and evaluation • Student learning outcomes assessment

  12. Why Do Outcomes Assessment? • A systematic process of outcomes assessment is currently required for accreditation by: • ALA Committee on Accreditation • Parallel professional association accreditation (e.g., NCATE) • All eight regional higher education accreditation organizations • Most state boards of regents and departments of education • A proven methodology for getting the best results from the effort and resources invested

  13. Overview: Elements of Outcomes Assessment

  14. Student Learning Outcomes • What are they? • Statements describing knowledge and skills that students are expected to master by completion of their program of studies • Synonyms (sort of): • Core competencies • Learning objectives • Where do we get them? • Parent institution’s mission, goals, and strategic objectives • Unit-level mission, goals, and objectives

  15. Student Learning Outcomes • Where do we get them? (continued) • Professional standards • ALA/COA Standards for Accreditation (2008) (see Standards I and II) • ALA Task Force “Core Competencies” (2009) • ALA divisions and other library/information professional associations • Expert faculty members • Syllabi from core courses • Program advisory boards, alumni, employers, practitioners, students • Exemplary LIS programs • Futurists

  16. Student Learning Outcomes • What do they look like? • Declarative sentences describing what students will know and be able to do • Description of a single skill or set of closely related skills that can be assessed at the same time • Performance of the skill(s) should be observable, or result in an observable product • Performance of the skill(s) should be measurable; i.e., one can determine when it has been done successfully

  17. Student Learning Outcomes • How many should we have? • Don't go overboard! • Remember, if you write it you will need to assess it and report on it. • Usually 2-5 outcomes for each core content area are sufficient. • Write outcomes at a high intellectual level (analysis and problem solving) that subsumes multiple sub-skills • For example: "Students use strategic planning processes to guide the direction and progress of an organization" vs. "Students list the steps in a strategic planning process"

  18. Student Learning Outcomes Bad ones: • Students appreciate the value of professional organizations • Students become familiar with needs assessment for collection development Good ones: • Students join relevant professional organizations (observable and measurable) • Students plan a simulated needs assessment for a given collection development problem (observable and measurable)

  19. Student Learning Outcomes Bad ones: • Students know the scholarly literature in the LIS field • Students understand principles of fair use and how to apply them Good ones: • Students select scholarly literature appropriate for analyzing a current issue in LIS (observable and measurable) • Students describe principles of fair use and write policy for applications in an information center (observable and measurable)

  20. Student Learning Outcomes Bad ones: • Students find sources of outside funding for libraries and information centers • Students learn about cataloging tools and bibliographic utilities Good ones: • Students select a source of outside funding and write a proposal for support of a project (higher-level skill) • ?

  21. Student Learning Outcomes Bad ones: • Students list the features of an effective reference interview • Students describe functional areas within libraries or information centers that offer opportunities for applied research Good ones: • ? • ?

  22. Develop Measures of Outcomes • If learning outcomes have been written well, logical measures are often implied • For example: Outcome: Students will identify and assess the specific information needs of user groups in the community and use that information to write a collection development policy Measure: Write a collection development policy for the user groups in the following community scenario

  23. Develop Measures of Outcomes • Characteristics of good measures • Valid; that is, actually measures what it claims to measure • Reliable; that is, will yield consistent scores • Applied uniformly across all students or sampled across students • Objective or require consensus of more than one judge/evaluator/rater
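
Reliability is the characteristic on this slide that programs most often neglect to check. A minimal sketch of an inter-rater reliability check follows; it is not from the original presentation, and the two raters' rubric scores are invented sample data. It computes simple percent agreement and Cohen's kappa (agreement corrected for chance) for two faculty raters scoring the same ten student products.

```python
from collections import Counter

def percent_agreement(rater_a, rater_b):
    """Fraction of student products on which two raters gave the same score."""
    matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
    return matches / len(rater_a)

def cohens_kappa(rater_a, rater_b):
    """Agreement between two raters, corrected for chance agreement."""
    n = len(rater_a)
    observed = percent_agreement(rater_a, rater_b)
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Expected chance agreement: sum over score categories of
    # p(rater A gives category) * p(rater B gives category)
    expected = sum((counts_a[c] / n) * (counts_b[c] / n)
                   for c in set(rater_a) | set(rater_b))
    return (observed - expected) / (1 - expected)

# Invented example: two faculty raters scoring 10 capstone papers on a 1-5 rubric
rater_a = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5]
rater_b = [4, 5, 3, 3, 2, 5, 4, 4, 4, 5]

print(f"Percent agreement: {percent_agreement(rater_a, rater_b):.2f}")  # 0.80
print(f"Cohen's kappa:     {cohens_kappa(rater_a, rater_b):.2f}")       # 0.71
```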

  24. Develop Measures of Outcomes • Two general types of measures: • Direct measures (primary data): "Describe the selection and configuration of technological resources required to solve the communications problems depicted in the following case study." (requires a product) • Indirect measures (supplemental data): "How would you rate your ability to select and configure technological resources to solve communications problems?" (elicits an opinion) • Not adequate for an entry-level professional • Adequate for an entry-level professional • Above average for an entry-level professional • Equal to an experienced professional

  25. Develop Measures of Outcomes • Examples of direct measures • Comprehensive examination w/rubric • Portfolio w/rubric • Products from capstone course w/rubric • Observation scale from fieldwork or internship • Standardized tests (local, state, or national) • Common course examinations • Licensure examinations • All examples must conform to characteristics of good measures

  26. Develop Measures of Outcomes • Examples of indirect measures • Exit interviews • Focus groups with students, alumni, supervisors, and employers • Surveys of students, alumni, supervisors, and employers • Reviews by advisory boards or councils • Case studies of cohort groups

  27. Develop Measures of Outcomes • What about students' grades in classes, seminars, capstone courses, fieldwork, and internships? NO WAY! • Grades cannot satisfy the characteristics of good measures

  28. Develop Measures of Outcomes • Set performance expectations • Once measures have been established, set the levels of performance that will be considered acceptable or "passing" • This is an internal "gatekeeping" function for student progress • For example: • Yes or no; right or wrong; pass or fail • 80% correct, 90% correct, 95% correct • Average rating of 4 on a 5-point scale to pass • Students must perform at the 4.5 level on critical criteria #1 and #2, but can pass with an overall rating of 4.0 averaged across all 5 criteria
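
The last rule above mixes a per-criterion floor with an overall average, which is easy to mis-apply by hand. Below is a minimal sketch of that gatekeeping logic, assuming a five-criterion rubric whose first two criteria are the critical ones; the function name and the sample ratings are hypothetical.

```python
def passes(ratings, critical=(0, 1), critical_floor=4.5, overall_floor=4.0):
    """Compound rule from the slide: each critical criterion must score at or
    above 4.5, and the average across all criteria must be at or above 4.0."""
    if any(ratings[i] < critical_floor for i in critical):
        return False
    return sum(ratings) / len(ratings) >= overall_floor

# Invented example: two students' ratings on five rubric criteria
print(passes([4.5, 5.0, 3.5, 4.0, 3.5]))  # True: criticals pass, average is 4.1
print(passes([4.0, 5.0, 4.5, 4.5, 4.5]))  # False: criterion #1 is below 4.5
```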

  29. Develop Measures of Outcomes • Practical considerations • Not a simple task • Requires a level of sophistication in testing and measurement • Requires a committee of the willing and/or a layer of administration for: • Design and development of direct measures • Design and development of indirect measures • Formative testing and revision of both direct and indirect measures

  30. Assess Learning Outcomes • Developing and carrying out an assessment plan • Practical considerations • Requires a committee of the willing and/or a layer of administration for: • Policy, procedures, and calendar for administration of measures • Policy, procedures, and calendar for grading • Policy, procedures, and calendar for notifying successful students and notifying and managing unsuccessful students • Procedures and calendar for recording, summarizing, and reporting results

  31. Assess Learning Outcomes • More practical considerations • It is difficult to measure all student learning outcomes with a single instrument in a single event • Use multiple measures, for example: • Comprehensive exam and products from capstone course • Capstone course products and portfolio • Portfolio and fieldwork observations • Sample across outcomes and students, for example: • Measure several outcomes in each comprehensive exam and rotate outcomes across exams each semester
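
The rotation idea in the last bullet can be implemented as a simple round-robin schedule. The sketch below is hypothetical (the outcome labels and the choice of three outcomes per exam are invented); it cycles through the pool of outcomes so that each exam covers a different subset and every outcome is measured before any repeats.

```python
from itertools import cycle, islice

def rotation(outcomes, per_exam, n_exams):
    """Round-robin: each exam measures the next `per_exam` outcomes in the
    cycle, so every outcome comes up once before any outcome repeats."""
    pool = cycle(outcomes)
    return [list(islice(pool, per_exam)) for _ in range(n_exams)]

outcomes = ["SLO-1", "SLO-2", "SLO-3", "SLO-4", "SLO-5", "SLO-6"]
for semester, slos in zip(["Fall", "Spring", "Summer", "Fall"],
                          rotation(outcomes, per_exam=3, n_exams=4)):
    print(semester, slos)
# Fall   ['SLO-1', 'SLO-2', 'SLO-3']
# Spring ['SLO-4', 'SLO-5', 'SLO-6']
# Summer ['SLO-1', 'SLO-2', 'SLO-3']
# Fall   ['SLO-4', 'SLO-5', 'SLO-6']
```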

  32. Organize and Interpret Results • Purposes for organizing and interpreting results • Confirm satisfactory performance • Detect performance problems • Detect faulty assessment instruments and/or procedures • Discover opportunities for programmatic expansion, reorganization, additions, cuts, and changes in overall direction • Address accountability expectations for the parent institution and for accreditation • Inform programmatic improvement

  33. Organize and Interpret Results • Set accountability expectations • How do we know when the program is meeting its obligations to its students, its institution, and its profession? • For example: • 90% of alumni will report "adequate" or better preparation on 90% of learning outcomes • 85% of our students will achieve an average rating of 4.5 or above on a 5-point scale on their capstone projects • 95% of students will earn a "pass" on student learning outcome #6 on the comprehensive exam
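
Each of these accountability expectations reduces to counting how many students clear a threshold. A minimal sketch of the second example (85% of students averaging 4.5 or above on capstone projects) follows; the cohort scores are invented.

```python
def meets_expectation(scores, score_floor=4.5, required_fraction=0.85):
    """True when at least 85% of students average 4.5+ on capstone projects."""
    passing = sum(1 for s in scores if s >= score_floor)
    return passing / len(scores) >= required_fraction

# Invented capstone averages for a cohort of 8 students
cohort = [4.8, 4.6, 4.9, 4.5, 4.2, 4.7, 4.6, 4.8]
print(meets_expectation(cohort))  # True: 7 of 8 students (87.5%) clear 4.5
```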

  34. Organize and Interpret Results • Methods for organizing and interpreting results • Matrix analysis is most typical • Display student learning outcomes by measurement items and fill in results at the intersection, as in the example below
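
The example matrix from the original slide is not reproduced here; the following hypothetical sketch shows the same idea in code, cross-tabulating outcomes against measures with the percent of students passing at each intersection (all names and numbers are invented).

```python
# Rows are student learning outcomes, columns are measures; each cell is
# the percent of students who passed that outcome on that measure
# ("--" means the outcome is not measured by that instrument).
results = {
    "SLO-1": {"Comp exam": 92, "Capstone rubric": 88, "Portfolio": None},
    "SLO-2": {"Comp exam": 74, "Capstone rubric": None, "Portfolio": 81},
    "SLO-3": {"Comp exam": None, "Capstone rubric": 95, "Portfolio": 90},
}

measures = ["Comp exam", "Capstone rubric", "Portfolio"]
print(" " * 8 + "".join(f"{m:>18}" for m in measures))
for slo, row in results.items():
    cells = ""
    for m in measures:
        cell = "--" if row[m] is None else f"{row[m]}%"
        cells += f"{cell:>18}"
    print(f"{slo:<8}" + cells)
```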

  35. Use Results for Improvement • Remember—no need to improve what is working well!!! • The dependent variable in this investigation is student learning • The independent variables are many! • Teaching and learning are parts of a complex system with multiple interacting components • Data can point to problems, but cause and effect relationships are difficult to establish

  36. Simplified Model of Variables in Teaching and Learning (see Figure 1 in handout). [Diagram: course content, learner characteristics, and learning environment shape student learning; outcomes assessment feeds improvement back into the system. Essentials for learning: motivation, learning guidance, active student participation, content integration.]

  37. Use Results for Improvement • Where would one look among all of the variables for opportunities for improving student performance? • Begin with assessment data • Look for gaps between performance expectations and actual performance • Look for gaps between accountability expectations and actual performance • Sharpen understanding of performance problems with qualitative data (See Table 1 in handout.)
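
Finding the gaps themselves is mechanical once expectations and results are tabulated; the interpretation is the hard part. A minimal sketch (invented numbers) that flags outcomes where actual performance falls short of the expectation, worst first:

```python
# Hypothetical gap analysis: expected vs. actual pass rates per outcome.
expected = {"SLO-1": 0.90, "SLO-2": 0.85, "SLO-3": 0.95}
actual   = {"SLO-1": 0.92, "SLO-2": 0.71, "SLO-3": 0.94}

gaps = {slo: expected[slo] - actual[slo] for slo in expected}
# Flag outcomes falling short of expectations, worst first, for follow-up
# with qualitative data (interviews, focus groups, syllabus review).
for slo, gap in sorted(gaps.items(), key=lambda kv: kv[1], reverse=True):
    if gap > 0:
        print(f"{slo}: {gap:.0%} below expectation")
# SLO-2: 14% below expectation
# SLO-3: 1% below expectation
```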

  38. Practical Tips for Implementation • Establish authority for outcomes assessment: a person or committee charged with managing the process and delegating assessment responsibilities • Establish an annual assessment calendar (See Table 2 in handout.) • Establish a uniform reporting format (See Table 3 in handout.)

  39. Summary and Conclusions • For good teachers, learning outcomes assessment is intuitive; good teachers are always improving what they do based on the results of what they have done in the past. • The challenges: • Infuse the logic of that "good teacher" intuition school-wide or department-wide • Create sustaining policies and administrative structures • Make outcomes assessment "how we do it" rather than "what we do for accreditation"

  40. This PowerPoint presentation along with resource links for outcomes assessment will be available at: http://shell.cas.usf.edu/~jcarey/Oa/
