Developing Assessment Plans

Presentation Transcript

  1. Developing Assessment Plans • Mi-Suk Shim, Ph.D. • DIIA • Spring 2006

  2. Outline of Workshop • Review of previous workshop • Assessment methods overview & resources • Syllabus, exam, & assignment analysis for each course • Assessment map, matrix, & assessment plan at program level

  3. SACS Criteria CS 3.3.1 Institutional Effectiveness: The institution identifies expected outcomes for its educational programs and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results.

  4. UT SCHEDULE • Program Educational Objectives and Program Outcomes • Complete by end of Spring 2006 and document • Assessment Plan • Complete by end of Spring 2006 and document • Progress Toward Completion of One Assessment Cycle and Closing the Loop • Complete documentation by Spring Break 2007

  5. Learning Outcomes • Statements that describe what students are expected to know, think, and be able to do by the time of graduation

  6. Learning Outcomes Students will DO WHAT (how)

  7. Matrix

  8. Assessment Plan • University of Texas at Austin Academic Unit Assessment Plans Format (tentative version) • I. School and Degree Program (school name and college; degrees awarded; contact person; date) • II. Program Mission Statement • III. Program Educational Objectives • IV. Program Learning Outcomes • V. Strategies, Methods, and Level of Competence • VI. Implementation Plan • VII. Assessment of Results • VIII. Evaluation of Results • IX. Recommendations • X. Actions

  9. Assessment Methods • Multiple methods & sources recommended (increase validity) • One method does NOT fit ALL (each has pros & cons) • Practicality? Time, effort, money • Do not have to measure everything or everybody (sampling) • Capitalize on what you are already doing • Quantity of data does not equate to Quality

  10. Direct vs. Indirect • Direct measures: assess student knowledge or skills, that is, student learning outcomes • Indirect measures: assess students’ learning experiences or perceptions of their learning

  11. Inventory of Assessment Methods • Direct (Required): class assignments (paper, presentation, report…), capstone project, performance project, direct observation, portfolios, external examiner, standardized exam, locally developed exam, certification and licensure exams, simulations, theses/senior papers • Indirect (Supplemental): surveys (student survey, alumni survey, employer survey, national survey), interview, focus group, case study

  12. Guiding Questions for Methods • Does the method measure your learning outcomes? • Does it measure them accurately? • Does it provide useful information (implications for educational evaluation and improvement)? If you answered YES to all of the above, the method can be used to demonstrate Institutional Effectiveness

  13. Level of competence • Your decision • What do you consider a success? Example: 90% of students will meet “acceptable” level of competence using a rubric
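The competence check described above is simple arithmetic: count the students at or above the chosen rubric level and compare the proportion to the target. A minimal sketch, assuming a hypothetical set of rubric levels and the 90% target from the slide (all names here are illustrative, not part of the workshop materials):

```python
def meets_target(ratings, passing_levels=frozenset({"acceptable", "good", "excellent"}),
                 target=0.90):
    """Return True if the share of students rated at the 'acceptable'
    rubric level or above meets the program-defined target."""
    passed = sum(1 for r in ratings if r in passing_levels)
    return passed / len(ratings) >= target

# Illustrative data: 9 of 10 students at "acceptable" or above.
ratings = ["acceptable", "good", "marginal", "excellent", "acceptable",
           "good", "acceptable", "excellent", "acceptable", "good"]
print(meets_target(ratings))  # 9/10 = 0.90 meets the 90% target -> True
```

The passing levels and the 90% threshold are the faculty's decision, as the slide notes; both are parameters here for that reason.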

  14. Resources • UT SACS website https://www.utexas.edu/provost/planning/assessment/sacs/resources.html • Gloria Rogers’s materials from the October workshop (handouts) • DIIA Instructional Assessment Resources (IAR) website http://www.utexas.edu/academic/diia/assessment/iar/how_to/methods/index.php

  15. Where to Start? • Course related: course descriptions, syllabi, course objectives, course assignments, course exams • Other activities: student exit survey, alumni survey, employer survey, national standardized exams • The key is to “make use of existing sources”

  16. What Can Individual Faculty Do? • Syllabus analysis • Exam analysis • Assignment analysis • For more detailed information: http://www.utexas.edu/academic/mec/research/workshopsummary.html

  17. Syllabus analysis • Identify course objectives • Document those objectives in a table • Faculty complete table for each of their courses

  18. Syllabus Analysis • table

  19. Exam analysis • Identify test items that match course objectives • Calculate overall student performance for each item • Calculate the average performance for items assessing same objective • Determine the level of competence
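The four exam-analysis steps above reduce to a per-item tabulation followed by a per-objective average. A sketch of that calculation, assuming a made-up item-to-objective mapping, made-up proportion-correct scores, and an illustrative 0.75 competence level (none of these figures come from the workshop):

```python
from statistics import mean

# Step 1: which exam items match which course objective (assumed mapping).
item_objectives = {"Q1": "Obj1", "Q2": "Obj1", "Q3": "Obj2", "Q4": "Obj2"}

# Step 2: overall student performance per item, as proportion correct (assumed data).
item_scores = {"Q1": 0.85, "Q2": 0.75, "Q3": 0.90, "Q4": 0.70}

# Step 3: group item scores by objective and average them.
by_objective = {}
for item, obj in item_objectives.items():
    by_objective.setdefault(obj, []).append(item_scores[item])

# Step 4: compare each objective's average against the chosen level of competence.
for obj, scores in sorted(by_objective.items()):
    avg = mean(scores)
    status = "met" if avg >= 0.75 else "not met"
    print(f"{obj}: average {avg:.2f} ({status})")
# prints:
# Obj1: average 0.80 (met)
# Obj2: average 0.80 (met)
```

Keeping the item-to-objective mapping separate from the scores means the same tabulation works for any exam once faculty supply the mapping from the syllabus analysis.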

  20. Exam Analysis • Table

  21. Assignment Analysis • Identify assignment components that match course objectives • Assess student performance on each component using rubrics • Determine the level of competence

  22. Assignment Analysis • Table

  23. Rubrics • Scoring guidelines • A set of categories that describes the important components of the work being assessed

  24. Rubrics • Scale • Descriptors: criteria (with indicators), the things to look for; standards, a description of the degree of performance at each level • Type: holistic or analytic
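An analytic rubric, as described above, pairs each criterion with descriptors at each scale point and scores the criteria separately. A minimal sketch with hypothetical criteria and a four-point scale (the criterion names, descriptors, and scale are illustrative, not from the workshop handouts):

```python
# Analytic rubric: one set of descriptors per criterion,
# keyed by scale point (4 = exemplary ... 1 = beginning).
rubric = {
    "organization": {4: "clear, logical flow", 3: "mostly organized",
                     2: "some structure", 1: "disorganized"},
    "evidence":     {4: "strong, relevant support", 3: "adequate support",
                     2: "limited support", 1: "little or no support"},
}

def score_work(ratings):
    """Total an analytic rubric: one rated level per criterion, summed.
    Rejects criteria or levels not defined in the rubric."""
    for criterion, level in ratings.items():
        if criterion not in rubric or level not in rubric[criterion]:
            raise ValueError(f"invalid rating: {criterion}={level}")
    return sum(ratings.values())

paper = {"organization": 3, "evidence": 4}
print(score_work(paper))  # -> 7 out of a possible 8
```

A holistic rubric, by contrast, would assign a single overall level to the work rather than summing per-criterion scores; the data structure would then be one set of descriptors for the whole piece.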

  25. Resources for Rubrics • Sample handouts from Relearning by Design Inc. http://www.relearning.org/resources/PDF/rubric_sampler.pdf • DIIA workshop material http://www.utexas.edu/academic/mec/research/pdf/rubricshandout.pdf

  26. Compile Info at Program Level: Assessment Map

  27. Assessment Map: focused

  28. Matrix Example page

  29. Where, When, Who • Where: context for assessment (sample) • When: time of data collection • Who: person responsible for data collection; who interprets the results?

  30. Results/Recommendations/Actions • State in the future tense • What do you expect as results?

  31. Assessment Plan example

  32. Further Assistance • Dr. Neal Armstrong, Vice Provost for Faculty Affairs • Office: MAI 201 • Email: neal_armstrong@mail.utexas.edu • Phone: (512) 232-3305; (512) 471-4716