
Steps to Review and Improve Your Quality Assurance System

This article provides an overview of the steps involved in reviewing and improving a Quality Assurance System (QAS): reviewing assessment instruments, determining the reliability of data, examining validity evidence, and seeking external and expert review.


Presentation Transcript


  1. Steps to Review and Improve Your Quality Assurance System • Christina O’Connor and Kristen Smith • The University of North Carolina at Greensboro

  2. What is a Quality Assurance System (QAS)? • Systematic collection of data • Includes multiple measures • Measures are relevant, verifiable, representative, cumulative, and actionable • Provides empirical evidence (Council for the Accreditation of Educator Preparation, 2013)

  3. Why is the QAS important? • The QAS is where you collect evidence for all standards • Gaps in the QAS can negatively impact evidence of meeting other standards

  4. Think-Pair-Share • Think about potential barriers to a high quality QAS • Discuss with an elbow partner • Share with the group

  5. Reviewing your QAS • Look for indicators of quality as defined by CAEP • Is your process systematic? • Are there multiple measures in place for each component? • Are the measures relevant, verifiable, representative, cumulative and actionable? • Is there empirical evidence?

  6. Four Steps • Step 1- Review assessment instruments • Step 2- Determine reliability of data • Step 3- Examine validity evidence • Step 4- External and expert review

  7. Action Plan • Work in pairs or small groups • Create an action plan for how you will implement these steps at your institution to review your QAS • Identify Key Collaborators

  8. Step 1- Review assessment instruments • Why? • If instruments are not sound, the data they provide may not be high quality • “Bad” instrumentation can compromise the integrity of the QAS • Who? • Insider/outsider perspectives • Teacher educator preparation specialist • Assessment specialist or expert

  9. Step 1- Review assessment instruments • What? (Rudner, 1994) • Instrument’s intended use or purpose • Instrument construction (e.g., item types, formats, organization) • Content validity • Instrument administration & data collection procedures • Results reporting and/or data sharing • Data use
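A minimal sketch of how an EPP might log an instrument review against the Rudner (1994) areas listed on this slide. It is not the presenters' procedure; the two reviewers, the 1-4 rating scale, the ratings themselves, and the "flag below 3" threshold are all invented for illustration.

```python
# Hypothetical review log for one assessment instrument, organized by the
# Rudner (1994) review areas named on the slide.
from statistics import mean

RUDNER_AREAS = [
    "intended use or purpose",
    "construction (item types, formats, organization)",
    "content validity",
    "administration & data collection procedures",
    "results reporting / data sharing",
    "data use",
]

# Two reviewers (e.g., an insider and an outside assessment specialist) rate
# each area from 1 (major concerns) to 4 (no concerns). Ratings are invented.
ratings = {
    "intended use or purpose": [4, 3],
    "construction (item types, formats, organization)": [2, 2],
    "content validity": [3, 2],
    "administration & data collection procedures": [4, 4],
    "results reporting / data sharing": [3, 3],
    "data use": [2, 3],
}

# Flag areas averaging below 3 as priorities for revision before the next cycle.
for area in RUDNER_AREAS:
    avg = mean(ratings[area])
    flag = "  <-- revise" if avg < 3 else ""
    print(f"{area:<50} {avg:.1f}{flag}")
```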

  10. Action Plan • Work in pairs or small groups • Create an action plan for how you will implement Step 1 at your institution to review your QAS • Identify Key Collaborators

  11. Step 2- Determine reliability of data • Why? • If data are unreliable, should we be using them to make decisions about our programs or our candidates? • Reliability of data is a necessary precursor to validity (Crocker & Algina, 1986) • CAEP standard 5.2 • Although CAEP does not require reliability evidence for scores on proprietary instruments, reliability is sample specific, so EPPs should still examine the reliability of their data (Traub & Rowley, 1991)

  12. Step 2- Determine reliability of data • Who? • Assessment or measurement expert • Assessment, measurement, educational research, or statistics graduate students • What? • Use software such as SPSS, SAS, or R • Cronbach’s alpha reliability coefficient for internal consistency • Inter-rater reliability when a candidate is evaluated by two or more raters • Exact agreement % • Intraclass correlation (ICC) • Generalizability theory (G-theory) • Instruments are neither valid nor reliable - DATA ARE!
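A minimal sketch of two of the reliability indices named on this slide, Cronbach's alpha and exact agreement percent, computed with NumPy rather than SPSS, SAS, or R. The rubric scores and rater scores are invented for illustration; ICC and G-theory analyses involve a crossed candidates-by-raters design and are usually run through an ANOVA-based routine in one of those packages.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_candidates x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def exact_agreement(rater_a, rater_b) -> float:
    """Percent of candidates on whom two raters assigned the identical score."""
    rater_a, rater_b = np.asarray(rater_a), np.asarray(rater_b)
    return 100 * np.mean(rater_a == rater_b)

# Hypothetical data: 6 candidates scored on a 4-item rubric (1-4 scale).
rubric = np.array([[3, 3, 4, 3],
                   [2, 2, 3, 2],
                   [4, 4, 4, 3],
                   [1, 2, 2, 1],
                   [3, 2, 3, 3],
                   [4, 3, 4, 4]])
print(f"Cronbach's alpha: {cronbach_alpha(rubric):.2f}")

# Hypothetical data: the same 6 candidates scored by two clinical supervisors.
agreement = exact_agreement([3, 2, 4, 1, 3, 4], [3, 2, 3, 1, 3, 4])
print(f"Exact agreement: {agreement:.0f}%")
```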

  13. Action Plan • Work in pairs or small groups • Create an action plan for how you will implement Step 2 at your institution to review your QAS • Identify Key Collaborators

  14. Step 3- Examine validity evidence • Why? • If we lack validity evidence, how do we know we are making accurate inferences about our programs or our candidates? • Even if data are reliable, they may not necessarily be valid • Validity evidence helps support the accuracy of the conclusions that you draw from scores on the assessment instrument

  15. Step 3- Examine validity evidence • Who? • Assessment or measurement expert • Assessment, measurement, educational research, or statistics graduate students • Faculty or other Subject Matter Experts (SMEs)

  16. Step 3- Examine validity evidence • What? • Numerous “types” of validity evidence should be considered, from content to predictive validity (Cronbach & Meehl, 1955) • Content validity • Lawshe method • Convergent/divergent validity • Associations or correlations • Predictive validity • Utility of assessment scores to predict a dependent variable of interest • Instruments are neither valid nor reliable - DATA ARE!
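A minimal sketch of the three kinds of validity evidence named on this slide: a Lawshe content validity ratio for one item, a convergent correlation between two measures of the same construct, and a simple predictive check. The panel counts, rubric scores, observation scores, and admission/licensure values are all invented for illustration, and a full analysis would use a proper regression model rather than a single fitted line.

```python
import numpy as np

def lawshe_cvr(n_essential: int, n_panelists: int) -> float:
    """Lawshe content validity ratio for one item:
    CVR = (n_e - N/2) / (N/2), where n_e = panelists rating the item 'essential'."""
    half = n_panelists / 2
    return (n_essential - half) / half

# Content validity: 8 of 10 SMEs rated this rubric item "essential".
print(f"CVR: {lawshe_cvr(8, 10):.2f}")  # 0.60

# Convergent validity: correlation between an EPP dispositions rubric total and
# an external observation score intended to tap the same construct.
rubric_total = np.array([12, 9, 15, 7, 11, 14, 10, 13])
observation  = np.array([3.4, 2.8, 3.9, 2.5, 3.1, 3.8, 2.9, 3.6])
print(f"Convergent r: {np.corrcoef(rubric_total, observation)[0, 1]:.2f}")

# Predictive validity: do admission scores predict a later licensure score?
admission = np.array([151, 148, 160, 144, 156, 158, 149, 153])
licensure = np.array([172, 168, 181, 163, 177, 179, 169, 174])
slope, intercept = np.polyfit(admission, licensure, 1)  # least-squares line
print(f"Predicted licensure score for admission score 150: "
      f"{slope * 150 + intercept:.1f}")
```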

  17. Action Plan • Work in pairs or small groups • Create an action plan for how you will implement Step 3 at your institution to review your QAS • Identify Key Collaborators

  18. Step 4- External and expert review • Why? • The more frequently programs consult with assessment experts, the higher quality their assessment processes tend to be (Fulcher & Bashkov, 2012) • Aligns with CAEP standard 5.5. • Who? • Assessment specialist or expert • Individual external to your EPP

  19. Step 4- External and expert review • What? • Share your assessment processes and results with an assessment expert and an external reviewer in the same way they would be shared with your faculty • The assessment expert and external reviewer can objectively evaluate assessment results and the system for sharing results with program faculty • Reviewers should be able to create actionable steps for program improvement based on the results shared with them

  20. Action Plan • Work in pairs or small groups • Create an action plan for how you will implement Step 4 at your institution to review your QAS • Identify Key Collaborators

  21. Wrap Up Identify one person you will share this action plan with when you get back to your institution

  22. Contact Us • Christina O’Connor, PhD, Director of Professional Education Preparation, Policy, and Accountability, UNCG School of Education, ckoconno@uncg.edu • Kristen Smith, PhD, Director of Assessment, UNCG School of Education, k_smith8@uncg.edu

  23. References
Banta, T. W., & Blaich, C. (2011). Closing the assessment loop. Change: The Magazine of Higher Learning, 43(1), 22-27. doi: 10.1080/00091383.2011.538642
Council for the Accreditation of Educator Preparation. (2013). 2013 CAEP standards. Retrieved from http://caepnet.files.wordpress.com/2013/05/annualreport_final.pdf
Crocker, L., & Algina, J. (1986). Introduction to classical and modern test theory. Orlando, FL: Holt, Rinehart and Winston.
Cronbach, L. J., & Meehl, P. E. (1955). Construct validity in psychological tests. Psychological Bulletin, 52, 281-302.
Fulcher, K. H., & Bashkov, B. M. (2012, November-December). Do we practice what we preach? The accountability of an assessment office. Assessment Update, 24(6), 5-7, 14.
Fulcher, K. H., & Orem, C. D. (2010). Evolving from quantity to quality: A new yardstick for assessment. Research and Practice in Assessment, 5, 13-17.
Messick, S. J. (1989). Meaning and values in test validation: The science and ethics of assessment. Educational Researcher, 18, 5-11.
Rodgers, M., Grays, M. P., Fulcher, K. H., & Jurich, D. P. (2013). Improving academic program assessment: A mixed methods study. Innovative Higher Education, 38(5), 383-395.
Rudner, L. M. (1994). Questions to ask when evaluating tests. Practical Assessment, Research & Evaluation, 4(2). Retrieved from http://PAREonline.net/getvn.asp?v=4&n=2
Traub, R. E., & Rowley, G. L. (1991). Understanding reliability. Educational Measurement: Issues and Practice, 10, 37-45.
