
Annual Assessment Reports: Their Format and Uses



  1. Annual Assessment Reports: Their Format and Uses Fred Trapp Bob Pacheco Mary Allen

  2. Annual Assessment Reports: Why?

  3. Useful to the Department • Ensure continuity in assessment when passing the baton • Orient new faculty to program assessment • Provide a historical record of changes and their rationale • Inform subsequent assessments of the same outcomes, such as using the same rubric or calibration exemplars • Accessible warehouse of information to use for program review or accreditation

  4. Program Review Includes: • Review of the currency of the outcomes • Review of the curriculum map • Review of annual assessment findings and associated program changes • Discussion of the evidence of impact of previous program changes

  5. Provide feedback to departments: Recognize their successes and identify areas of concern

  6. Example: Learning Outcomes: What Feedback Would You Give? • Students understand economic theories. • Students can write and speak effectively. • Students complete a dissertation. • Students pass the licensure exam. • Students can conduct literature reviews.

  7. Example: Curriculum Map

  8. Example: Evidence • The department chair assessed three outcomes this year using exit interviews with graduating students. She found that students satisfactorily mastered all three outcomes.

  9. Example: Evidence • We assessed students’ mastery of research methods by calculating the average grade in the Research Methods class. The average grade was 2.96. This is good.

  10. Example: Evidence and Assessment Process • We selected 50 capstone projects at random, calibrated 10 faculty on the use of an analytic rubric with four dimensions, and assessed the quality of students’ writing skills. Inter-rater reliability estimates ranged from .87 to .93.
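The reliability estimates in the example above could be produced, for instance, as pairwise correlations between calibrated raters' rubric scores. A minimal sketch of that idea, with hypothetical scores from two raters (the data and the correlation approach here are invented for illustration, not the department's actual method):

```python
from statistics import mean

def pearson(x, y):
    """Pearson correlation between two raters' scores on the same projects."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical rubric scores (1-4) from two calibrated raters
# on the same ten capstone projects.
rater_a = [3, 4, 2, 3, 4, 3, 2, 4, 3, 3]
rater_b = [3, 4, 2, 3, 3, 3, 2, 4, 3, 4]

r = pearson(rater_a, rater_b)
print(f"inter-rater reliability estimate: {r:.2f}")
```

In practice a department would repeat this for each pair of raters and each rubric dimension, which is how a range such as .87 to .93 would arise.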

  11. Example: Conclusion • We decided that we’d be satisfied if at least 80% of the students scored at level 3 or higher on each dimension of the rubric. We were satisfied with students’ control of syntax & mechanics and their use of sources, but we were disappointed with their synthesis of ideas and the overall organization of their writing.
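The 80%-at-level-3 criterion above is easy to check mechanically once scores are tabulated per rubric dimension. A minimal sketch with hypothetical data (the dimension names and numbers are invented for illustration):

```python
# Hypothetical rubric scores (1-4) per dimension for a sample of students.
scores = {
    "syntax & mechanics": [3, 4, 3, 3, 4, 3, 4, 3, 3, 4],
    "use of sources":     [4, 3, 3, 4, 3, 3, 3, 4, 4, 3],
    "synthesis of ideas": [2, 3, 2, 3, 2, 3, 2, 2, 3, 3],
    "organization":       [3, 2, 2, 3, 2, 3, 3, 2, 2, 3],
}

TARGET = 0.80  # criterion: at least 80% of students at level 3 or higher

for dimension, vals in scores.items():
    share = sum(v >= 3 for v in vals) / len(vals)
    verdict = "met" if share >= TARGET else "not met"
    print(f"{dimension}: {share:.0%} at level 3+ ({verdict})")
```

With data like this, the first two dimensions meet the criterion and the last two do not, mirroring the mixed result described in the example.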

  12. Example: Closing the Loop • We learned that students are not explaining theories at the level we expected. We decided to close the loop by expanding the emphasis on theory in each survey course (310, 312, 315, 316, and 321) and by placing more emphasis on theory in exams in these courses. Faculty will devote more time to theory during class meetings, and at least 20% of students’ grades will now reflect their ability to explain theories.

  13. Tie Assessment to Budgeting • Does closing the loop require additional budget for the program? • Have a procedure for allocating funds for well-based budget requests.

  14. Useful to Campus • Identify common outcomes with weak results that point to the need for a campus solution, such as establishing a Writing Center, expanding ESL assistance, or offering faculty a WAC workshop. • Warehoused annual reports and program reviews are easily accessed for integration into WASC self-studies and for visiting teams to review prior to their visit. Annual reports demonstrate that sustainable, effective assessment is in place.

  15. Guiding Questions How can the report-writing experience: Help faculty explore the student learning process? Determine the extent to which the curriculum is working? Identify where time, energy, and/or money can be allocated for continuous improvement in learning? Exploit the writing process and dialogue about results to gain broader institutional learning experiences? ** Help meet our quality assurance pledge to the community? **Adriana Kezar, ed. Organizational Learning in Higher Education. New Directions for Higher Education, No. 131, Fall 2005. Jossey-Bass.

  16. What Might Be Included? • Assessment focus: course, program, general ed, etc. • What outcomes were assessed? • How and when were they assessed? • Who was assessed? • What were the results? • Who reviewed the results and made sense of them, and what conclusions were reached? • What are the implications for practice and/or policy or future assessment work?

  17. Report as Quality Assurance Promising Vehicles for Expanding Information to the Public Brief narrative report from annual assessment reports Simple statistical reports on learning outcomes or surveys Best practices stories supported by assessment Peter Ewell. Accreditation & the Provision of Additional Information to the Public about Institutional and Program Performance, CHEA, May 2004

  18. Report as Quality Assurance • National Institute for Learning Outcomes Assessment (NILOA) • 2010 Webscan report Exploring the Landscape: What Institutional Websites Reveal About Student Learning Outcomes Assessment Activities • 2010 Connecting State Policies on Assessment with Institutional Assessment Activity • 2011 Providing Evidence of Student Learning: A Transparency Framework • Eavesdropping • www.learningoutcomeassessment.org

  19. CC of Baltimore County (MD) Course-level Reporting Projects are at least three semesters long Individual and high-impact courses (all sections) included Project proposal by a faculty group Course improvements based on data analysis Reassessment expected Results/report shared across the college and web-posted Eavesdropping http://www.ccbcmd.edu/loa/CrseAssess.html Two-page executive summaries available

  20. CC of Baltimore County (MD) Course-level Reporting CHEM 108 An initial “failure” turned to success and collaboration with a four-year school HLTH 101 Addressing an achievement gap with professional development and increased communication with students CRJU 101 and 202 Statewide group assessment development effort and creativity in the interventions used

  21. Hocking College (OH) Program learning outcomes data collected in a student e-portfolio Direct internal and external evidence (1 to 10 measures) Indirect evidence (1 to 4 measures) Evidence drawn from samples of student work for faculty to apply an agreed-upon holistic rubric Eight general education outcomes (student success skills) Discipline-specific exit competencies or outcomes

  22. Program-level Reports • Hocking College example reports and analysis • Culinary Arts Technology (on cloud) • Forestry Management Technology (on cloud & college web) • Nursing Technology (on cloud) • JFK University example report • Counseling Psychology (on cloud)

  23. Mesa Community College (AZ) • General education studies completed 2007-08; 2005-06 • Numeracy • Scientific inquiry • Problem solving/critical thinking (2008-09) • Information literacy • Workplace skills (CTE) (2009-10) • General education studies completed 2006-07; 2004-05 • Arts & humanities • Cultural diversity • Oral communication • Written communication

  24. Mesa College (AZ) General Education Reports Eavesdropping http://www.mesacc.edu/about/orp/assessment/index.html Annual reports and summaries available Nine years of history and experience 14 years of assessment work

  25. Truman State University (MO) Various Reports Eavesdropping Assessment work began in 1970 http://assessment.truman.edu/ Assessment Almanac: a compilation of results from each year's assessment work (versions from 1997 to 2009 are posted) General education outcomes are assessed in the context of the major field of study Portfolio Project: required of all seniors to show best work, assessed by faculty for the nature & quality of the liberal arts and sciences learning outcomes (versions from 1997 to 2008 are posted)

  26. Archiving • Why archive? • Compliance vs. institutional learning • Where to keep the completed report? • Decentralized vs. central office • Options for how to keep the reports? • Paper • Electronic templates • Interactive database

  27. Putting the Reports to “Work” • To all affected participants • Campus committees • Curriculum, assessment, resource allocation group, unit (department) leadership, general academic and college leadership • Campus fairs, brown-bag lunches, poster sessions for information sharing • Faculty professional development programs • Accreditation self-study committee work groups • Local governing board presentation • College web site for the public

  28. Criteria to Evaluate Reports

  29. Reports as Institutional Learning Feedback & recognition Feedback rubric for annual assessment reports Conversations and action Collection and analysis of evidence Implementation of findings Recognition (achievement & excellence) Eavesdropping https://www4.nau.edu/assessment

  30. Reports as Institutional Learning Seal of Assessment Achievement Academic programs earning this recognition have demonstrated in their annual report that learning outcomes have been assessed through two or more methods, and findings have been discussed among the faculty.

  31. Reports as Institutional Learning Seal of Assessment Excellence Academic programs earning this recognition have demonstrated • a thorough implementation of assessment plan(s) • the reporting of meaningful assessment data • the discussion of findings among faculty and perhaps students • the use of findings to showcase student achievements and to make curricular adjustments.

  32. Quality Assurance to the Public Voluntary System of Accountability APLU & AASCU (520 public institutions, awarding 70% of bachelor's degrees in the US each year) College Profile (includes learning outcomes & links to campus) Proactive initiative to document learning gains and average institutional scores (choice of 3 national instruments) Proactive initiative to illustrate unique campus learning outcomes assessment work Promoting a learning institution Eavesdropping http://www.collegeportraits.org/ CSU Pomona http://www.collegeportraits.org/CA/CPP/learning_outcomes

  33. Quality Assurance to the Public National Association of Independent Colleges and Universities Assessment programs on campus tied to institution's mission Eavesdropping http://www.naicu.edu/special_initiatives/accountability/Student_Assessment/id.514/default.asp Pepperdine University http://services.pepperdine.edu/oie/learning-outcomes/learning-outcomes-overview.aspx

  34. Where Can I Go? Resource: Filesanywhere.com http://www.filesanywhere.com/fs/v.aspx?v=8a69668b5c6773a96f6d

  35. Contacts • Mary Allen (independent consultant) • mallen@csub.edu • Robert Pacheco • Rpacheco@barstow.edu • Fred Trapp (Cambridge West Partnership) • fredtrapp@gmail.com

  36. The End Questions and Comments
