
Program Assessment: Designing and Documenting Learning


Presentation Transcript


  1. Program Assessment: Designing and Documenting Learning • March 29, 2010 • Forum Session, 5-6 p.m. • League for Innovation in the Community College Conference • Presenter: Robin Nickel, Ph.D. • Welcome!

  2. Agenda • Explore a process and tool for designing program assessment • Examine sample programs • Identify criteria for program assessment • Connect program assessment to performance-based curriculum

  3. Quote of the Day • “As long as you’re going to be thinking anyway, think big.” • --Donald Trump, quoted in Sky Magazine, March 2010

  4. Assessment of Learning • Informed judgments about achievement of intended learning outcomes • Based on evidence • Assessments are valid, reliable, and fair • Assessment is built into the plan for teaching • Data-driven evaluation that is actively used for continual improvement of teaching and learning

  5. Why do we have to do this? What are the drivers?

  6. Drivers • Accreditation (NCA, AQIP, industry) • Increased accountability to improve teaching and learning

  7. Drivers • Carl D. Perkins IV legislation and funding • Objectively measure student attainment of industry recognized skills upon graduation

  8. Assessment in Community and Technical Colleges • Focus on industry-aligned skill sets • Students know they are acquiring skills that have value and portability • Increases the odds that students will be successful in their employment • Ensures that your college is educating students to meet the needs of employers

  9. Questions about Outcomes • Who cares about outcomes? • Which outcome is which? • How are outcomes related? • What is the desired outcome of outcomes? • What does this mean for students and instructors?

  10. Questions about Outcomes • Who cares about outcomes? → Stakeholders • Which outcome is which? → Types of outcomes • How are outcomes related? → Relationship of institutional, program, and course level outcomes • What is the desired outcome of outcomes? → Learning, assessment, documentation, etc. • What does this mean for students and instructors? → Impact on course design, learning, and assessment

  11. Who are our stakeholders? • Accrediting agencies and funding sources…

  12. Assessment of Learning “Assessment of student learning is a participatory, iterative process that: • Provides data/information you need on your students’ learning • Engages you and others in analyzing and using this data/information to confirm and improve teaching and learning • Produces evidence that students are learning the outcomes you intended • Guides you in making educational and institutional improvements • Evaluates whether changes made improve/impact student learning, and documents the learning and your efforts.” NCA Higher Learning Commission From “Student Learning, Assessment and Accreditation: Criteria and Contexts”, presented at Making a Difference in Student Learning: Assessment as a Core Strategy, a workshop from the Higher Learning Commission, July 26-28, 2006. 

  13. Assessment of Learning HLC/AQIP asks 5 fundamental questions: • How are your stated learning outcomes appropriate to your mission, programs, students, and degrees? • How do you ensure shared responsibility for student learning & assessment of student learning? • What evidence do you have that students achieve your stated learning outcomes? • In what ways do you analyze and use evidence of student learning? • How do you evaluate and improve the effectiveness of your efforts to assess and improve student learning? From “Student Learning, Assessment and Accreditation: Criteria and Contexts”, presented at Making a Difference in Student Learning: Assessment as a Core Strategy, a workshop from the Higher Learning Commission, July 26-28, 2006.

  14. Curriculum and SACS • Core Requirement 2.5 Review of Programs • Systematic • Results in continuing improvement • Demonstrates you accomplish mission

  15. Curriculum and SACS • Core Requirements • 2.7.3 General Education component • 2.12 Quality Enhancement Plan relates to student learning

  16. Curriculum and SACS • Comprehensive Standards • 3.3.1 “institution identifies expected outcomes … assesses whether it achieves the outcomes; and provides evidence of improvement based on analysis of those results.” • 3.4.1 “establish and evaluate program and learning outcomes”

  17. Curriculum and SACS • Comprehensive Standards • 3.5.11 “general education core… provide evidence that graduates have attained these competencies”

  18. Curriculum and SACS • Federal Requirements • 4.2 “The institution maintains a curriculum that is directly related and appropriate to its purpose and goals….”

  19. Benefits of Program Assessment

  20. What Does That Mean for Me?

  21. Increased Accountability for Direct Measurement of Learning • Direct Measures: data that provide evidence that students have achieved the outcomes, such as performance assessments with rubrics, portfolios, artifacts, performances, and outcome-referenced tests • Indirect Measures: data based on inferences about why achievement is high or low, such as enrollment, retention rates, course completion, student/graduate/employer satisfaction surveys, and placement statistics
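  (A minimal sketch, not part of the original slides, of how a direct measure might be tallied: the Python below assumes a hypothetical 1-4 rubric scale and a proficiency threshold of 3, and simply reports what share of students attained one program outcome.)

  # Sketch only: turning rubric scores (a direct measure) into evidence
  # of program-outcome attainment. Scale, threshold, and outcome name
  # are illustrative assumptions, not prescribed values.
  PROFICIENCY_THRESHOLD = 3  # on a hypothetical 1-4 rubric scale

  def outcome_attainment(scores):
      """Return the share of students whose rubric score meets the threshold.

      scores: dict mapping student id -> rubric score for one program outcome.
      """
      if not scores:
          return 0.0
      attained = sum(1 for s in scores.values() if s >= PROFICIENCY_THRESHOLD)
      return attained / len(scores)

  # Example: scores for the outcome "Provide quality patient care"
  scores = {"S001": 4, "S002": 3, "S003": 2, "S004": 3}
  print(f"Outcome attainment: {outcome_attainment(scores):.0%}")  # prints 75%

  (By contrast, a retention rate or a satisfaction survey says nothing about which outcomes were demonstrated, which is what makes it an indirect measure.)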

  22. Sound Assessment • Fair: learners informed of expectations up front; feedback to learners • Reliable: performance assessment based on consistent rubrics, scoring guides, and rating scales • Valid: outcomes based on standards (industry); measures intended outcomes; measures application and critical thinking

  23. Summative/Formative Assessment (diagram: Formative Assessment, Summative Assessment, Continuous Improvement)

  24. Program Outcomes • Measurable, observable, and field-specific skills – major outcomes • Typically 5-7 per program (a guideline, not a rule) • Originate from: • Current DACUMs • Accrediting Agencies • National (or other) Skill Standards • Advisory Committees • Threaded through courses (see the sketch after slide 25) • Performance verified with summative assessment of skill performance

  25. Radiography Program Outcomes • Carry out the production and evaluation of radiographic images • Adhere to quality management processes in radiography • Apply computer skills in the radiographic clinical setting • Practice radiation safety principles • Provide quality patient care • Model professional and ethical behavior consistent with the A.R.R.T. Code of Ethics • Apply critical thinking and problem solving skills in the practice of diagnostic radiography
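  (To illustrate the point from slide 24 that program outcomes are threaded through courses and verified with a summative assessment, here is a minimal sketch in Python. The course numbers and the way each outcome records where its summative assessment occurs are hypothetical illustrations, not part of the WIDS tool or the original slides.)

  # Sketch only: check that each program outcome is threaded through
  # multiple courses and has a summative assessment somewhere.
  # Course numbers below are made up for illustration.
  curriculum = {
      "Provide quality patient care": {
          "courses": ["RAD-101", "RAD-210", "RAD-230"],
          "summative_assessment_in": "RAD-230",
      },
      "Practice radiation safety principles": {
          "courses": ["RAD-110", "RAD-210"],
          "summative_assessment_in": None,  # gap: not yet verified summatively
      },
  }

  for outcome, info in curriculum.items():
      threaded = len(info["courses"]) >= 2
      verified = info["summative_assessment_in"] is not None
      status = "OK" if (threaded and verified) else "REVIEW"
      print(f"{status}: {outcome} ({', '.join(info['courses'])})")

  (A mapping like this is the kind of outcome-to-course threading that program documentation and reporting matrices capture in report form.)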

  26. Options • Your own assessment • External assessment • Indirect assessments

  27. New Accountability for Direct Measurement of Learning (diagram: Balance of College Assessments and External Assessments; labels: College Assessment, External Assessment, Indirect, 2012-2013)

  28. Program Assessment • is valid and reliable • is approved by industry (advisory committee or standards) • meets quality guidelines for 3rd party assessment if no rubric is used • such as…

  29. See Handout

  30. 2. External Assessment • Third-party exam (e.g., NCLEX, Barber/Cosmetology) • Required for job placement, or minimal/no cost to student • Valid and reliable • Linked to technical skills required on the job

  31. Considerations for selecting 3rd Party External Assessments • Can we obtain the results/data? • Are cost and administration feasible? • Does it add value or control entry to the student/occupation? • Is it valid, reliable, reputable, recognized? • Is it summative, cumulative or a partial measurement of skills?

  32. Concerns: • Only measures part of the program outcomes (“All Code-No Application”) • Does not indicate job performance • Does it help employers? • Focuses only on technical skills; what about academic skills and soft skills?

  33. Indirect Assessment • GPA • Course Completion • Teacher-developed exams that do not meet the larger state or system standards

  34. What is WIDS? Curriculum design for any discipline or delivery mode, at the program and course level • Proven process • Software tool • Expertise

  35. Exit Learning Outcomes (diagram: Program Outcomes, Core Abilities, and Gen Ed Outcomes, labeled as institutionally-defined or instructor-defined)

  36. What Can You Create With WIDS? • Program Documentation • Outcome assessment charts • Program profiles • Official Course Documentation • Syllabi • Assessments (rubrics and checklists) • Learning Plans/Teaching Plans • Reporting Matrices

  37. Official PROGRAM Documentation

  38. See Handout Then… let’s see how they are generated!

  39. Questions? Contact: Presenter: Robin Nickel, Ph.D. (608) 849-2411 nickelr@wids.org
