
Tips for Writing SACSCOC Non-Academic Program Assessment Reports




1. Tips for Writing SACSCOC Non-Academic Program Assessment Reports
Office of Planning, Institutional Research, and Assessment (PIRA)
Fall 2014

2. Relation Between Existing Assessment and SACSCOC Reports
• Ideally, you already evaluate your unit's effectiveness.
• Program Assessment Reports should describe these activities using SACSCOC guidelines and terminology.
• Data or other findings that measure operational and/or student learning outcomes should be included, as should interpretation of those findings.
• Initiatives to improve should also be included.

3. Relation Between Existing Assessment and SACSCOC Reports
• But don't create a special data collection process for SACSCOC; just summarize existing processes.
• Save yourself time and unnecessary work by adapting your existing annual report to the SACSCOC Program Assessment Report template.

4. Types of Non-Academic Units
• Administrative support services
• Academic and student support services
• Research
• Community/public service

  5. Larger Administrative Units . . . may prefer to submit a Program Assessment Report (PAR) for each office within the division, particularly if outcomes are not the same across those offices.

6. Ensure that Reviewers Will See Clear Evidence that You Have . . .
• defined your desired mission, program outcomes or objectives, and related measures,
• collected and evaluated results from ongoing assessment (multiple years),
• undertaken actions to continuously improve outcomes.
• Help reviewers find key components quickly and easily.
[Assessment cycle diagram: Define Outcomes & Measures → Collect Findings → Evaluate Results → Implement Change (Improve)]

7. Use the PIRA Checklist to Ensure Key Elements Are Included:
• mission and program outcomes (objectives)
• operational and/or student learning outcomes (at least 2) and related measures (at least 2 each, of which 1 should be a direct measure)
• assessment findings: results of measures from multiple years (if feasible)
• discussion of results: review of findings, including whether performance meets expectations
• discussion of changes: initiatives to improve the program and whether continuous improvement has occurred
• clear narrative and organization to make compliance obvious (does everything make sense?)

8. When Writing Your Mission Statement You Should . . .
• tie it to the UM Mission ("The University of Miami's mission is to educate and nurture students, to create knowledge, and to provide service to our community and beyond. Committed to excellence and proud of the diversity of our University family, we strive to develop future leaders of our nation and the world.") and to your strategic plan
• describe program outcomes/objectives (e.g., purpose of the unit, type of support for students, including any research or service components)

9. When Writing Operational Outcomes You Should . . .
• describe reasonable expectations in measurable terms (efficiency, accuracy, effectiveness, comprehensiveness, etc.)
• include at least 2 outcomes
• make outcomes easy to identify (e.g., use bolding and numbering) and clearly stated (follow the expected structure)

10. An Operational Outcome Should . . .
• Focus on a current service or process
• Be under the control or responsibility of the unit
• Be measurable
• Lend itself to improvements
• Be singular, not "bundled"
• Be meaningful and not trivial
• Not lead to a "yes/no" answer
Source: Mary Harrington, Univ of Mississippi

11. Possible Operational Outcomes
• Efficiency: The Registrar's Office processes transcript requests in a timely manner.
• Accuracy: Purchasing accurately processes purchase orders.
• Effectiveness: Human Resources provides effective new employee orientation services.
• Comprehensiveness: Financial Aid provides comprehensive customer service.
Source: Mary Harrington, Univ of Mississippi
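To show the expected structure from slide 9 (numbered, clearly labeled outcomes with nested measures, at least one direct), the Registrar example above could be written up as in the sketch below. The wording and both measures are illustrative assumptions, not SACSCOC-required language.

Outcome 1: The Registrar's Office processes transcript requests in a timely manner.
  Measure 1a (direct): Average number of business days from receipt of a transcript request to mailing, tracked each fiscal year.
  Measure 1b (indirect): Percentage of requesters who rate turnaround time "satisfactory" or better on the annual service survey.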

12. A Student Learning Outcome (SLO)* Should . . . (*if appropriate for your area)
• start with words like "Students…," "Graduates…," "We want students to…"
• include verbs or phrases like "will demonstrate…" or "should have ability to…"
• include words like "…mastery of…" or "…a capacity for…"
• describe expected competence (e.g., practical skills, communication, leadership, multi-cultural awareness)

13. Possible Learning Outcomes (not necessarily SLOs) for Non-Academic Units
• Library: Students will have basic information literacy skills.
• Career Services: Students will be able to create an effective resume.
• Information Technology: Staff will know how to use the student information system.
• Human Resources: New employees will be familiar with the benefits package.
Source: Mary Harrington, Univ of Mississippi
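As one invented illustration (not UM language), the Career Services example above could be paired with two measures, with the rubric review serving as the direct measure of learning and the survey as the indirect one:

SLO 1: Students who complete a resume workshop will be able to create an effective resume.
  Measure 1a (direct): Percentage of a sample of post-workshop resumes scoring "acceptable" or higher on a staff-developed rubric.
  Measure 1b (indirect): Percentage of workshop participants reporting confidence in their resume on the exit survey.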

14. Examples of Measures
• Research: number of grants, total funding, number of peer-reviewed publications, conference presentations
• Administrative support: timeliness in processing orders, budget growth (or savings), complaint tracking/resolution, public safety improvements, audits
• Academic/student support: number of students counseled, job placements, scholarship awards, seminar participation, leadership training participation
• Community/public service: number of patients seen, community event participation, annual volunteer commitments

15. Do You Have Survey Data?
• Often, non-academic units use survey data for their assessment.
• Surveys are indirect measures of student learning, but they are direct measures of customer (client, employee, patient, student) experience.
Source: Mary Harrington, Univ of Mississippi

16. When Writing Assessment Findings, You Should . . .
• ensure each measure has corresponding findings (and no findings without an earlier measure)
• insert the corresponding outcome/measure as a heading for each set of results
• ensure multiple years of data, or insert an explanation that data are not provided because the program is new or the measures were revised, for example: "As part of the major three-year continuous improvement update of our program assessment report in FY 2013, we decided to start using customer satisfaction surveys in conjunction with service requests. Because this is a new measure, we have data for only FY 2014, but we will continue to update the data in upcoming years to monitor continuous improvement."

17. When Writing Assessment Findings . . .
• if a measure yields narrative rather than numeric data, provide a summary plus sample evaluations, or insert a statement explaining the evidence
• ensure results are presented clearly (tables help; see the sample layout after this slide)
• decide whether an appendix of findings, survey instrument, etc. will be necessary (usually not)
• Common error: Programs simply state that they evaluate outcomes, or they omit measure(s).
• Solution: Provide evidence of assessment activity (a table or text summary of findings).
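As an illustration of a table reviewers can scan quickly, findings for the hypothetical satisfaction survey quoted in slide 16 might be laid out as below, with the outcome/measure as the heading and multiple years shown. Every figure is invented for the example, not actual UM data.

Outcome 1, Measure 1b: Customer satisfaction survey (% rating service "satisfied" or "very satisfied"; figures invented for illustration)

  Fiscal Year   Responses   % Satisfied   Target
  FY 2014       212         78%           85%
  FY 2015       198         84%           85%
  FY 2016       240         88%           85%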

18. When Writing the Discussion Section, You Should Provide . . .
• a statement as to why these particular assessment instruments were used
• an analysis of the assessment findings
  • How are periodic reviews used for improvements?
  • How does the use of assessment results improve your services?
  • What changes have been implemented or will be developed to improve your operational and/or student learning outcomes?
• evidence of improvement
  • general trends
  • specifically in response to improvement initiatives

19. Avoid These Common Errors in Writing Your Discussion Section
• When describing initiatives to improve outcomes: the report simply lists initiatives.
  Solution: Include brief commentary on which outcome will benefit.
• When describing continuous improvement: the report does not include any evidence of improvement over time.
  Solution: At least discuss efforts to improve outcomes.

20. Format/Organization/Wording: Help SACSCOC Reviewers Find What They Need
• Add bold, indents, and/or underlines to assist reviewers
• Nest measures under related outcomes
• Label/nest outcomes and measures in the Findings section
• Include discussion of improvements/changes in the Discussion section, not in the Outcomes or Findings sections
• Remove the yellow template instructions
• Delete extraneous text and data (clarity is more important than length)
• Expand acronyms (e.g., RSMAS, PRISM, UMHC)
• Spell-check; fix typos

21. Tips for Writing an Efficient Report
• Study the resources and template before starting
• Use existing assessments, available documentation, and your current reports whenever possible (saves time and effort)
• Consider starting with measures and then writing outcomes to go with them, rather than following the traditional order (outcomes first)

22. Questions for PIRA?
Contact:
Dr. David E. Wiles
Executive Director, Assessment and Accreditation
Institutional Accreditation Liaison
(305) 284-3276
