
Quality Review Process

Presentation Transcript


  1. Quality Review Process IRC Meeting April 23, 2014

  2. Overview • Survey Results - QRP • Survey Results - QRPDS • Updates & Revisions • Existing Tools • Next Steps

  3. Survey participants • YOU! • 39 respondents • 20 research & evaluation staff • 10 college administrators • 7 unknown • 2 other • 15 colleges • 55% did not participate in 2011 QRP review

  4. Satisfaction with QRP [chart slide; visible data labels: 37% (n=14), 13% (n=5)]

  5. “Is the QRP, as is, useful for your college?” [chart slide; visible data labels: 29% (n=9), 23% (n=7)]

  6. Useful? Yes. 6 colleges • “The model is useful, but the mechanism for use is difficult” • “Provides structure to have a formal assessment of program outcomes” • “Our IR staff is small in number and we rely on state data for metrics” • “QRP is regarded as valuable campus-wide” • “Allows us to look at overall health of the program” • “Provides a framework for continuous improvement and comparison across the WTCS” • “Helps make sure all the steps are taken when a program undergoes an evaluation”

  7. Useful? Not sure. 5 colleges • “I do think having state benchmarks is important to help narrow the analysis” • “Some indicators are useful, however some indicators should be college specific” • “QRPDS can be confusing to use” • “Data is too unreliable” • “It has been hard to get some of the deans to take it seriously and to complete them in a timely manner”

  8. Useful? No. 9 colleges • “QRP system has been unreliable and confusing” • “QRP data is not always reliable” • “Requires additional work to comply with the QRP requirement” • “WTCS QRP is redundant and simply adds unnecessary work for staff without aiding in the quality improvement process” • “We have made customizations to the QRP at our college and are not dependent on the WTCS for data or to provide guidance in the process” • “We rely on our internal data and reporting for program performance” • “There may be more important data that a program needs to address for program improvement that is not included” • “The data system is very useful. The QRP Process is not.” • “We enter data into QRPDS because we have to, but we rarely look at what has been entered there”

  9. Original QRP Mission Statements • “QRP helps improve overall organizational performance, practices, and capabilities” • 39% Agree or Strongly Agree (n=15) • 42% Disagree or Strongly Disagree (n=16) • “QRP facilitates communication and sharing of best practices among colleges” • 26% Agree or Strongly Agree (n=10) • 42% Disagree or Strongly Disagree (n=16) • “QRP serves as a tool for understanding and managing performance” • 42% Agree or Strongly Agree (n=16) • 26% Disagree or Strongly Disagree (n=10) • “QRP develops a guide for further planning and training” • 32% Agree or Strongly Agree (n=12) • 39% Disagree or Strongly Disagree (n=15)

  10. Current responses to 2011 QRP Review Feedback • >50% agree or strongly agree with the following statements: • “We should include several years of data in one scorecard” (95%) • “We have concerns about the reliability and validity of the data” (86%) • “Evaluation handbook & user guides need to be updated” (78%) • “We need to revise our approach to calculating thresholds and targets” (78%) • “It would be nice to update the list of indicators” (78%) • “Current indicator definitions are confusing” (76%) • “We should have a clearinghouse of best practices” (63%) • “We would like more training & direction” (54%)

  11. Satisfaction with QRPDS

  12. Keep or Replace? • Other: • “Need to know what new system would look like and how it would function” • “What is the purpose of an online file sharing system?” • “Identify which components are being used and improve on these”

  13. Features of New “QRPDS” • Other: • “Share scorecard data only” • “Reporting and documentation templates” • “Only do this if there is a true purpose for it and someone at the WTCS will utilize it as well”

  14. Updates & changes • QRP • Confirm validity of data • Update scorecards • Multiple years of data • Reassess indicators • Identify appropriate benchmarks • Update manual and user guides • Develop clearinghouse of best practices • QRPDS • Replace! (or eliminate entirely…)

  15. Existing tools • Local Evaluation Frameworks • 14 colleges have a local evaluation process • 10 colleges conduct local evaluations every year • Other Statewide Initiatives • AQIP • Student Success • Performance Based Funding • TAACCCT grants • Other?

  16. Next steps • Form work group & set meetings • Identify valuable approach that will not duplicate effort of other initiatives • Revise indicators • Identify appropriate benchmarks & thresholds • Update scorecards • Multiple years of data • New indicators, benchmarks, & thresholds • Update/rewrite manual • Replace or eliminate QRPDS

  17. Thank you • Thank you for completing the survey!

  18. QUESTIONS? • Leah Childress • (608) 266-1840 • leah.childress@wtcsystem.edu
