
ASSESSMENT for Administrative Departments and Educational Support Units September/October 2002


Presentation Transcript


  1. ASSESSMENT for Administrative Departments and Educational Support Units, September/October 2002

  2. Overview
  • What's New?
    • Latest developments in assessment
    • Review new deadlines
  • What's Next?
    • Finish old business (complete 01-02 reports)
    • Prepare for next assessment cycle
  • Questions / Resources

  3. Timing of Assessment Cycle
  • Fall too busy – move cycle to spring
  • Finish current (01-02) assessment cycle now
    • Columns/Steps 4 & 5 of current assessment reports due Oct. 15, 2002
  • Slight reprieve
    • Delay start of next cycle (03-04) until spring
    • Steps 1-3 due spring 2003
    • Steps 4 & 5 due spring 2004

  4. Planning and Assessment Portal
  • Created a single point of entry for strategic planning and assessment activities
  • Access through ERAU Online / Blackboard
  • Overview of planning and assessment processes
  • Directory of assessment and strategic planning units
  • University Planning and Assessment Policy (APPM 4.3)

  5. Other Changes
  • "Best practices" adopted from institutions with successful assessment processes:
    • Use committees to guide ongoing assessment
    • Set up a "peer review" process for unit reports
    • Annual summary of university-wide assessment
    • Incentives for doing assessment well
    • Establish centralized support services

  6. Assessment Committees
  • Committee structure pushes ownership of the assessment process further down into ERAU's "foundation" (faculty/staff)
  • 4 Campus Assessment Committees (CACs): DB, EC, Prescott, and University Admin (UA)
  • Faculty & staff serve on each CAC (UA – staff only); each CAC is co-chaired by a faculty member and a staff member
  • CAC co-chairs = University Assessment Committee (UAC)

  7. CAC Duties
  • In broad terms, the CAC will…
    • Work with individual units to develop effective assessment reports
    • Facilitate communication across departments and programs
    • Be a vehicle for feedback to / from the Chancellor / VPs
    • Nominate exemplary assessment reports

  8. CAC Duties
  • Work with individual units to develop effective assessment reports
    • Guidance on the 5-Step model / assessment process
    • Peer review of assessment reports at two points in time, using a checklist of assessment guidelines:
      • upon draft submission of Steps 1-3 at the start of the assessment cycle
      • upon submission of Steps 4 & 5 at the end of the assessment cycle
    • Peer review is to ensure the PROCESS; interpretation of results and the decision about what to do with them is YOUR call – this is NOT a prescriptive review!

  9. CAC Duties
  • Facilitate communication across departments and programs about assessment and continuous improvement
    • Encourage sharing of best practices and useful assessment measures
    • Avoid duplication of efforts
    • Summarize campus assessment activities

  10. CAC Duties
  • Nominate exemplary assessment reports as part of the incentive program
    • Submit "best assessment report" nominees to the UAC
    • UAC will pick the "best reports"
    • "Best reports" selected by the UAC will receive $500 to use toward assessment-related activities

  11. CAC Members
  • CAC-UA: Susan Erdman, Allison Kish, Shirley Waterhouse, Kathy Welch
  • CAC-DB: Tim Brady, Rich Clarke, Tina Frederick, Tom Hilburn, Jim Libbey, Linda Manning, Terry Mularkey, Notis Pagiavlas, Paula Reed

  12. UAC Duties
  • Unify campus-level assessment activities
  • Facilitate communication re: assessment and continuous improvement across campuses and UA; encourage sharing of best practices
  • Produce an annual summary of assessment activities at the university level
  • Serve as a vehicle for feedback to / from the Cabinet
  • Vote on "best assessment reports" nominated by the CACs

  13. Institutional Research Duties
  • Institutional Research (IR) moves to a supporting role
  • Assessment coordinator is available to assist with:
    • development of assessment measures
    • administration of surveys for assessment
    • identifying existing sources of data (IR and external)
    • research on assessment techniques
  • IR website has survey data and a project calendar
  • IR houses archived assessment reports

  14. Complete Current Assessment
  • Close out the 01-02 assessment report
    • Download the current report from the assessment website
    • Steps 1-3 were submitted last October
    • Word format this cycle; web-based next cycle
  • Complete Steps 4 & 5
    • Summary of Data Collected
    • Use of Results
  • Submit completed reports to the Assessment Coordinator by Oct. 15, 2002

  15. Complete Current Assessment
  • Steps 4 & 5 are straightforward IF…
    • assessment data were actually gathered
    • the data provided information to determine whether outcomes were actually met
    • thought was put into how the various results might be used

  16. Possible Scenarios
  • "Winning" scenarios
    • Criteria for success were met
    • Criteria for success were NOT met, and results were used to make improvements (even better?)
  • "Problematic" scenarios
    • No use of results is shown
    • Insufficient data, without offering a "fix"

  17. Winning Scenarios
  • Criteria for success were met
    • Step 4: Summarize the assessment data collected
    • Step 5: State that criteria were met and indicate the future of the intended outcome
      • no further action required – retire the outcome from the next assessment cycle
      • re-assess next cycle using different criteria / measures

  18. Winning Scenarios
  • Criteria were NOT met / results were used
    • Step 4: Summarize the assessment data collected
    • Step 5: State that criteria were not met and explain how results have been used to make improvements
      • changes made to the program
      • change criteria (criteria too strict?)
      • use a different assessment method (corroborate)
    • Step 5: Indicate the future of the outcome
      • no further action required (?)
      • assess again using different criteria / assessment method?
      • sparked a new initiative for the strategic plan / re-assess at a later date?

  19. Problematic Scenario Solution
  • Potential problem: Haven't yet used results; can't "close the loop" by the end of the assessment cycle
  • Solution:
    • Not a problem IF new / strategic initiatives must be undertaken in order to make improvements – write these into the strategic plan and reference the assessment report
    • Otherwise, careful wording is needed to put "will" into the past tense: hold meetings and make plans prior to submission of the report so that decided actions can be stated in the past tense

  20. Problematic Scenario Solution
  • Problematic: "We are planning a retreat to discuss results" …or… "We will…"
  • Preferred: "Assessment results revealed insufficient student access to the internet. See new initiative regarding additional workstations in 03-04 strategic plan" …or… "We have met and have agreed that these are the actions to take… (outline a plan)"

  21. Problematic Scenario Solution
  • Potential problem: Insufficient data
  • Solution: Explain why (be specific about the nature of the problem) and state what will be done differently next time to obtain data

  22. Problematic Scenario Solution
  • Problematic: "No data available" …or… "Sample size too small"
  • Preferred: "Survey administration was delayed; no data were collected. The same outcome/criteria will be carried over to the next cycle, when the survey is to be administered" …or… "The sample size (n=3) was too small to determine whether the criterion for success was met. The outcome is carried over to the next assessment cycle; three years of survey data will be combined to ensure a sufficient sample size"

  23. Use of Results
  • Responses to unmet outcomes / objectives
  • Academic programs
    • "What is taught"
      • closer alignment of coursework with the "world of work"
      • change in the sequence of courses
      • additional courses required for degree completion
    • "How it is taught"
      • methodology / technology
      • active participation
  • Administrative departments
    • varies widely – improvements in services or processes

  24. Preparing for Next Cycle
  • There is an expectation that the assessment process will evolve and mature
  • As you close out the current cycle (Oct. 15, 2002):
    • complete Steps 4 & 5 – no need for re-writes now
    • use the troubleshooting tips if useful
    • start thinking about the Steps 1-3 that will guide your assessment activities in the next cycle
  • Some areas that could use improvement follow…

  25. Preparing for Next Cycle
  • Step 1: Expanded Statement of Institutional Purpose
    • Check your mission statement – it may need to be revised to reflect changing services, new clients, etc.
  • Step 2: Intended Administrative Objectives
    • Objectives should have a broad focus
  • Step 3: Criteria for Success & Means of Assessment
    • State specific criteria (set a target)
    • Give a detailed description of the assessment method
    • Include sub-scores

  26. Common Step 2 Problems
  • Administrative objective is too specific
    • Example: "Increase student satisfaction with service X by 10%"
  • Problems:
    • The criterion in Step 3 is made redundant: "The increase in students who are satisfied with service X will be 10% greater than in the last administration"
    • Can't really set more than one criterion for success
  • Instead: "Students will be satisfied with service X"

  27. Common Step 3 Problems
  • Criteria for success are not specific enough
    • Examples: "turnaround time" or "data accuracy"
  • Problems:
    • No targets are set to indicate whether criteria were met successfully or not
    • Little chance of using results to "close the loop"
  • Instead: "An audit of 100 randomly selected files will reveal no more than 5% with missing data" (a sketch of such a check follows below)
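To show how concrete a target like this can be, here is a minimal sketch of the audit criterion above as a script. It is illustrative only: the record structure, field names, and sample data are assumptions, and only the 100-file sample and 5% threshold come from the slide.

```python
import random

REQUIRED_FIELDS = ["name", "id", "major"]  # hypothetical fields, for illustration only

def audit(records, sample_size=100, max_missing_rate=0.05):
    """Randomly sample records and check the missing-data criterion."""
    sample = random.sample(records, min(sample_size, len(records)))
    missing = [r for r in sample
               if any(r.get(f) in (None, "") for f in REQUIRED_FIELDS)]
    rate = len(missing) / len(sample)
    return rate, rate <= max_missing_rate  # criterion met if no more than 5% missing

# Invented data: 1,000 records, roughly 3% with a missing field.
records = [{"name": f"Student {i}", "id": i,
            "major": "" if i % 33 == 0 else "Aeronautics"}
           for i in range(1000)]
rate, met = audit(records)
print(f"Missing-data rate in sample: {rate:.1%}; criterion met: {met}")
```

Because the criterion names a sample size and a numeric threshold, the check is mechanical; that is what makes it possible to say unambiguously in Step 5 whether it was met.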

  28. Common Step 3 Problems
  • No reference to assessment method(s)
  • Problems:
    • No guidance for assessment activities
    • No documentation of how data were gathered
  • Instead: "A survey of web users will be administered in the spring by Institutional Research"

  29. Suggestion
  • Consider using subscale scores
    • It is easier to formulate a specific response for use of results from subscale scores than from overall scores only
  • Example of subscale use: "Overall, at least 75% of students responding to the Student Satisfaction Survey will agree or strongly agree that the services of Dept. X are satisfactory, and on none of the 4 specific services mentioned will 25% or more of students give ratings of poor or fair." (A sketch of checking such a compound criterion follows below.)
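As a minimal sketch of why subscale scores make results more usable, the check below evaluates the compound criterion from the example. The service names and percentages are invented; only the 75% overall and 25% per-service thresholds come from the slide.

```python
# Invented survey results: overall agreement and per-service "poor/fair" rates.
overall_agree_pct = 78.0          # % agree / strongly agree overall (hypothetical)
poor_fair_pct = {                 # % poor / fair ratings per service (hypothetical)
    "advising": 12.0,
    "registration": 31.0,         # this subscale exceeds the 25% threshold
    "billing": 9.0,
    "help desk": 18.0,
}

overall_met = overall_agree_pct >= 75.0
subscales_met = all(pct < 25.0 for pct in poor_fair_pct.values())

print(f"Overall criterion met: {overall_met}")
for service, pct in poor_fair_pct.items():
    flag = "ok" if pct < 25.0 else "needs attention"
    print(f"  {service}: {pct:.0f}% poor/fair ({flag})")
print(f"Compound criterion met: {overall_met and subscales_met}")
```

Even though the overall score passes, the per-service breakdown singles out one service for improvement, which is exactly the kind of specific "use of results" the slide is recommending.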

  30. Suggestion
  • Consider qualitative assessment (focus groups, etc.)
    • instead of a survey
    • to clarify survey results

  31. Review
  • This is a learning process
  • Incorporate existing assessment; don't duplicate efforts (IR, ABET, CAA, ACBSP, grants)
  • Establish a non-threatening, non-accusatory environment; use results only for improvement
  • USE the results

  32. Resources
  • Campus Assessment Committee
  • Institutional Research
    • Assessment support office
    • Provides logistical means for conducting and processing surveys
    • Website contains survey data and a calendar of projects
    • URL: http://irweb.erau.edu
    • Assessment coordinator, Tiffany Phagan – phone 386-226-6224 or email phagant@erau.edu
  • Assessment Website
    • Log onto ERAU Online -> Faculty/Staff -> Strategic Planning / Assessment
    • Forms, training materials
    • Archived assessment reports

  33. Questions and Discussion
