
Measuring the Impact of Learning Outcomes


Presentation Transcript


  1. An Integrated Model for Measuring the Impact of Course, Program, and Institutional Learning Outcomes

  2. Measuring the Impact of Learning Outcomes
  “Good evidence, then, is obviously related to the questions the college has investigated and it can be replicated, making it reliable. Good evidence is representative of what is, not just an isolated case, and it is information upon which an institution can take action to improve. It is, in short, relevant, verifiable, representative, and actionable.”
  Guide to Evaluating Institutions, ACCJC, August 2008

  3. Measuring the Impact of Learning Outcomes
  How do we collectively examine disparate learning outcomes assessment efforts? (representativeness and replicability)
  How do we provide evidence about these disparate efforts? (verifiable)
  How can we use evidence to improve upon learning outcomes practices? Upon district practices? (actionable)

  4. Measuring the Impact of Learning Outcomes
  Assessment often stands alone; relationships are often not explored between:
  • Courses From Different Programs
  • Diverse Programs (within Instruction, and between Instruction and Student Services)
  • Course/Program Learning Outcomes and Their Contribution to Institutional Learning Outcomes
  • Types of Learning Outcomes (SLOs & AUOs)
  • Differing Methods of Assessment
  • Learning Outcomes and College Goals

  5. Measuring the Impact of Learning Outcomes: Statistical Analysis
  How do we statistically assess impact?
  • Tests of statistical significance are not necessary:
    • Greatly influenced by sample size (e.g., means of 10.0 and 10.1 can reach significance with a large enough sample)
    • Do not speak to the magnitude of the difference
    • Not well understood – even by ‘experts’
  • Effect Size as a measure of practical significance (see the sketch below):
    • Unstandardized
    • Standardized (d, r)
    • Cohen’s conventions: d = .20 – small; .50 – moderate; .80 – large
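
The following is a minimal sketch, not part of the original presentation: it assumes Python with NumPy and SciPy, uses simulated scores, and contrasts a p-value, which shrinks as the sample grows, with Cohen's d, which reflects the magnitude of the difference regardless of sample size.

```python
# Illustration with simulated data (not data from the district's assessments).
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def cohens_d(a, b):
    """Standardized effect size: mean difference divided by the pooled SD."""
    n1, n2 = len(a), len(b)
    pooled_var = ((n1 - 1) * np.var(a, ddof=1) + (n2 - 1) * np.var(b, ddof=1)) / (n1 + n2 - 2)
    return (np.mean(a) - np.mean(b)) / np.sqrt(pooled_var)

for n in (30, 10_000):                              # small vs. very large sample
    pre = rng.normal(loc=10.0, scale=2.0, size=n)   # simulated pre-assessment scores
    post = rng.normal(loc=10.1, scale=2.0, size=n)  # simulated post-assessment scores
    t_stat, p_value = stats.ttest_ind(post, pre)
    print(f"n = {n:>6}: p = {p_value:.4f}, d = {cohens_d(post, pre):.3f}")

# The point: with a large enough sample, even a trivial 0.1-point gap tends to
# reach statistical significance, while d stays near .05, far below the .20
# that Cohen's conventions label "small".
```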

  6. Statistical Significance

  7. Effect Size (unstandardized)

  8. Effect Size (standardized)
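
Slides 7 and 8 presented these two forms of effect size as charts. As a rough, hypothetical illustration of the distinction (Python standard library only; the rubric scores below are invented): the unstandardized effect size is the raw difference in the measure's own units, while d and r rescale that difference by the pooled standard deviation so outcomes measured on different instruments can be compared.

```python
# Hypothetical rubric scores (0-4 scale); not data from the presentation.
import math
from statistics import mean, stdev

pre = [2.1, 2.4, 1.9, 2.8, 2.2, 2.5]     # pre-assessment scores
post = [2.9, 3.1, 2.6, 3.4, 2.8, 3.3]    # post-assessment scores

raw_difference = mean(post) - mean(pre)   # unstandardized: rubric points gained

pooled_sd = math.sqrt((stdev(pre) ** 2 + stdev(post) ** 2) / 2)   # equal-n pooled SD
d = raw_difference / pooled_sd            # standardized: Cohen's d
r = d / math.sqrt(d ** 2 + 4)             # d converted to Pearson r (equal-n case)

print(f"unstandardized = {raw_difference:.2f} rubric points")
print(f"d = {d:.2f}, r = {r:.2f}")
```

The unstandardized value is the easiest to explain to a program ("students gained about 0.7 rubric points"), while the standardized values are what allow disparate assessments to be combined in the meta-analysis described on the next slide.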

  9. Meta-Analysis
  • Combines research results across all identified studies
  • Combines results using Effect Size (i.e., d) – generates an Average d (see the sketch below)
  • Average d may vary across study characteristics (i.e., Moderators), such as:
    • Number of Assessment Cycles
    • Type of Assessment Instrument
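
Below is a minimal sketch of the averaging step described above, not the office's actual workflow: it assumes Python, and the effect-size records and moderator values are invented purely to show how an Average d can be computed overall and then broken out by a moderator.

```python
# Each record is one assessed outcome with its effect size (d), its sample
# size, and two moderators named in the presentation. All values are invented.
from collections import defaultdict

records = [
    {"d": 0.45, "n": 28,  "cycles": 1, "instrument": "rubric"},
    {"d": 0.62, "n": 40,  "cycles": 2, "instrument": "rubric"},
    {"d": 0.20, "n": 120, "cycles": 1, "instrument": "multiple choice"},
    {"d": 0.81, "n": 35,  "cycles": 3, "instrument": "likert scale"},
]

def average_d(group):
    """Simple and sample-size-weighted mean effect size for a set of results."""
    simple = sum(r["d"] for r in group) / len(group)
    weighted = sum(r["d"] * r["n"] for r in group) / sum(r["n"] for r in group)
    return simple, weighted

# Overall Average d across every assessed outcome
overall_simple, overall_weighted = average_d(records)
print(f"overall: simple mean d = {overall_simple:.2f}, n-weighted d = {overall_weighted:.2f}")

# Average d broken out by a moderator (here, the type of instrument)
by_instrument = defaultdict(list)
for r in records:
    by_instrument[r["instrument"]].append(r)
for instrument, group in by_instrument.items():
    simple, weighted = average_d(group)
    print(f"{instrument}: simple mean d = {simple:.2f}, n-weighted d = {weighted:.2f}")
```

A full meta-analysis would normally weight each d by the inverse of its sampling variance; the simple sample-size weighting here is only to keep the sketch short.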

  10. Moderators (Grouping Variables)
  • Term – term in which data collection occurred
  • Academic Year (AcadYr) – academic year (SU, FA, SP) in which data collection occurred
  • Learning Outcome (LearnOutcome):
    • AUO – Administrative Unit Outcome
    • SLO – Student Learning Outcome
  • Researcher – person who conducted the research
  • Program – department
  • Course

  11. Effect Size by Academic Year

  12. Effect Size by Learning Outcome Type (AUO or SLO)

  13. Effect Size by Program

  14. Effect Size by Course

  15. Moderators (Continued) (Grouping Variables – Measure)
  • Criterion – how the program set its criterion for success:
    • Population – a measure obtained from the population (e.g.: critical thinking, Nursing Licensing Exam, etc.)
    • Pre-assessment score – the pre-assessment score in a pre-post assessment
    • Program – the program set a specific criterion (e.g.: PE, Communication Studies, etc.)
  • PrePost – whether the assessment was a post-assessment only or a pre-post assessment
  • Standardized – whether the measure used to assess the outcome was standardized or unstandardized
  • Instrument:
    • Rubric
    • True/False
    • Multiple choice
    • Matching
    • Likert Scale
    • Anchored Scale
  • Unduplicated – used to identify the number of students impacted by learning outcomes processes; from 2004-2005 to 2008-2009, 6,714 students were impacted by learning outcomes
  • Assessor:
    • Self-Assessment (e.g.: student self-assessed)
    • External Evaluation (e.g.: student evaluated by teacher)

  16. Effect Size by Criterion Type

  17. Effect Size by Pre-Post or Post-Assessment

  18. Effect Size by whether the Instrument was Standardized or Unstandardized

  19. Effect Size by Measurement Type

  20. Effect Size by whether Student Self-Assessed Learning or Learning was Assessed by External Evaluator (e.g.: teacher)

  21. Moderators (Continued) (Grouping Variables – Planning)
  • EndsPolicy1: (1) Instructional and Student Services, (2) Comprehensive Education Program, (3) Collaborative Partnerships, (4) Continuous Improvement, (5) Learning Outcomes, (6) Core Competencies
  • EndsPolicy2: (1) Foundation Skills Courses, (2) Occupational Programs, (3) Transfer Level Programs, (4) Outreach, (5) Student Services, (6) Library and Learning Support Services
  • EndsPolicy3: (1) College Facilities, (2) Technology, (3) Human Resources
  • EndsPolicy4: (1) Balanced Budget, (2) Achievement of Planned Enrollment Growth, (3) External Funding, (4) Bond Reserve
  • EndsPolicy5: (1) Qualified Personnel and Professional Development, (2) Commitment to Diversity, (3) Employment Agreement, (4) Professional Ethics

  22. Effect Size by Ends Policy 1: Learning Centered College

  23. Effect Size by Ends Policy 2: Institutional Effectiveness

  24. Moderators (Continued) (Grouping Variables – Learning Outcomes)
  • KSA – Knowledge, Skill, or Ability (e.g.: critical thinking, speech, confidence, self-efficacy, etc.)
  • KSACateg – KSA categorized by Core Competency and Instruction:
    • Communication – Students will demonstrate effective communication and comprehension skills.
    • Critical Thinking – Students will demonstrate critical thinking skills in problem solving across the disciplines and in daily life.
    • Community/Global Awareness and Responsibility – Students will demonstrate knowledge of significant social, cultural, environmental, and aesthetic perspectives.
    • Personal, Academic, and Career Development – Students will assess their own knowledge, skills, and abilities; set personal, educational, and career goals; work independently and in group settings; and identify lifestyle choices that promote self-reliance, financial literacy, and physical, mental, and social health.
    • Instruction – Students will demonstrate KSA specific to the course subject (e.g.: theory identification, research methods, etc.).
  • Cycles – number of cycles the program has assessed the outcome
  • Level:
    • Course
    • Program
    • Institution

  25. Effect Size by Knowledge, Skill, or Ability

  26. Effect Size by Number of Times Program has Assessed Outcome

  27. Effect Size by Course and Program Level Learning Outcomes

  28. Limitations
  • Database only includes results from outcomes evaluated by the Office of Institutional Research
  • How do we connect information to student success (e.g.: goals, course success, transfer, etc.)?

  29. Resources
  • Microsoft Excel Spreadsheets that calculate effect size statistics: http://www.stat-help.com/spreadsheets.html
  • Effect Size Calculator with Confidence Intervals: http://www.cemcentre.org/renderpage.asp?linkID=30325017
