
Using Assessment Data for Continuous Improvement in General Education


Presentation Transcript


  1. Using Assessment Data for Continuous Improvement in General Education AAC&U Conference Atlanta, Georgia February 19, 2005

  2. Presented by… Linda Pomerantz-Zhang, Ph.D. lpomerantz@csudh.edu C. Edward Zoerner, Jr., Ph.D. ezoerner@csudh.edu Sue Fellwock-Schaar, Ed.D. sschaar@csudh.edu California State University, Dominguez Hills

  3. Is your institution collecting data for your General Education Program?

  4. What are you hoping to gain from this session on assessment?

  5. CSUDH Profile • One of 24 CSU campuses, 4 in the LA metro area • 13,000 students, 61% undergraduate • Small but growing number of first-time freshmen (5,900 students are upper-division transfers) • Approximately 90% of incoming freshmen need remediation in math, English, or both • High degree of cultural and racial diversity • ECLP grant to improve critical student skills in writing, reading, and critical reasoning

  6. Background to GE Assessment Activities… • Administrators unable to encourage faculty to initiate & sustain program review for GE • System-wide pressure to reduce size of GE package & streamline curriculum • Fear of “turf wars” • Inconsistent oversight from departmental level on up • General distrust of administration motives • 15 years of unsuccessful administrative efforts to reduce size of GE & BA/BS

  7. And then came… • Growing faculty concern about student achievement levels • Creation of Student Learning Outcomes Assessment Committee • Creation of half-time Assessment Coordinator (faculty) • WASC focus on student learning outcomes and “culture of evidence” • ECLP grant aimed at improving student writing • GE Syllabus Analysis sponsored by ECLP • Embedded Assessment initiatives • Willingness to review GE in 5-year cycle

  8. The tensions… Who controls the process? (Administration vs. Faculty) How are the data used? (Course improvement vs. faculty/program faultfinding)

  9. The Review Procedure—Year 1

  10. Departments were notified in Spring 2003 that materials should be gathered in Fall 2003 and submitted in February 2004

  11. Questions we asked in planning our GE Assessment… • What types of assessment data should/could be collected? • What types of data analyses should be conducted, and who should conduct them? • Who should see the collected data and analyses? • How should assessment data be utilized, and by whom? • What types of obstacles may be encountered in utilizing the data? • How can the tension between assessment and faculty fears be managed? • How can an operative feedback loop be ensured?

  12. Initial procedures in Fall 2003… Forms for the department review coordinators (chair or other appointee) were distributed at the initial meeting • Overview of the process • Course Assessment Record form for each Area A Objective • Coordinator Report forms • Instructor Forms to be attached to graded student work

  13. Data collected from faculty • Types: syllabi; student work with grades; sample exams with grades • Analyzed for: alignment with GE objectives; student-centered, measurable course objectives; appropriate materials; academic rigor; commensurability across sections

  14. Data collected from departments… • Description of how assessment methods were selected and aligned with GE objectives • Description of how the department ensured commensurability of course requirements and student learning across sections • Description of how the results of the review will be used to improve the instructional program • Recommendations for changes in the Area A learning outcome objectives • Feedback about the GE review process

  15. Collected from the Registrar • Grades of all courses under review, by section but without faculty names attached

  16. • Review Team (4-5 members) met to analyze documentation and data in March 2004 • A rubric was developed and review procedures were designed • The Team reviewed its findings with the Administrator working with us (LPZ) • The Administrator wrote a letter regarding the Team's findings to the GE Committee

  17. • GE Committee reviewed the findings of the Review Team • GE Chair wrote a letter to the Department and attached the letter from the Team (only the letter from the GE Committee went to the Deans and Provost) • The Team requested a preliminary update in the Fall and a more complete update in Spring 2005

  18. Department’s Original Response • Fear that some would employ results to argue for only one semester of Freshman Comp • General mild resentment over process; some duplication of labor • Mild hope that some good could come from the review

  19. Morals… • Make the process as non-threatening as possible; people will see it as threatening despite efforts to the contrary • Ask about local assessment practices and consider using (parts of) them before implementing a system-wide practice

  20. Original Department Actions and Results… • Chair attended a meeting to learn about the process and forms • Chair attempted to gather the required data and provide analysis • Chair underestimated the time and energy required and turned in an initially inadequate report

  21. Moral… • If possible, provide a model response to guide those writing reports

  22. Departmental next steps… • Director of Composition assumed responsibility for data accumulation and analysis • The report conformed much more closely to the desired parameters

  23. Moral… • Get the right person for the job!

  24. Principal responses from Review Team… • Concern over course objectives in Freshman Comp II • Concern about syllabus uniformity and match to catalog description, especially in Freshman Comp II • Concern about grading practices—higher than expected percentages in A and B range • Concern about the way individual instructors mark papers

  25. Departmental response… • Director of Comp and Chair review syllabi more carefully; syllabi are returned to faculty for revision as needed • Memo on “Review of Objectives” and “Learning Outcomes” given to all instructors • Director of Comp held a Grading Standards Workshop • Memos to and conferences with instructors regarding grading standards • Rebuttal to the Review Team’s concern regarding instructors’ marking of papers

  26. Morals… • Seemingly large problems have (potentially) simple solutions • Reviewers need to be careful about commenting on things outside their professional expertise • Dialog can be fruitful

  27. Short-term results… • Generally improved syllabi, with greater clarity and more explicit focus on analytic writing • Little change in grade distribution

  28. Morals… • Things can improve as a result of the assessment process • Attempts at improvement must be sustained

  29. Longer-range results… • Heightened awareness of the need to simplify the Basic Skills Objectives in the Catalogue • General departmental satisfaction with the Freshman Composition sequence • Understanding of the need to move students to analytical writing more quickly • Changes in the way the pre-Freshman course is run

  30. Longer-range results… (cont.) • Understanding that the department needs to remain vigilant against grade inflation • Department needs to consider a more systematic mechanism of review to ensure that course offerings uniformly meet the desired objectives

  31. Things we learned about the process… • Although not perfect, the process was workable • Rubrics were not a perfect fit but a reasonable guide • Provide rubrics BEFORE departments begin the review • Inter-rater training is important • Consider what kind of report you will give to departments—quantitative or narrative

  32. Things we learned about the process… (cont.) • Five years is too long to wait for recommended changes to be reviewed • Departments that used coordinators generally turned in stronger portfolios • Need to increase awareness that departments are expected to conduct their own internal review and evaluation as part of the process

  33. Things we learned about the process… (cont.) • Start early enough to catch exceptions to typical scheduling • Plan for contingencies regarding team members • Consider ways to have direct contact with students or to gather information from them • Mixed results are likely, but waiting until everyone is ready is not in the best interest of the students
