
Assessment 101



Presentation Transcript


  1. Assessment 101 Daniel Martinez, PhD Associate Director of Institutional Research Riverside Community College District

  2. Disclaimer • Views expressed in this presentation are mine and do not represent the positions of RCCD, CCCAA, or the CCCCO • Questions should be checked against the “Standards” document and/or through consultation with the CCCCO’s matriculation office • Standards, policies and procedures for the evaluation of assessment instruments used in the California Community Colleges (4th edition, revised March 2001)

  3. The Placement Process • A standardized test is NOT required • Interviews • Holistic scoring processes • Attitude surveys • Career aptitude inventories • Transcripts • Measures of performance

  4. 2nd Party Tests • Approved tests (computerized and P&P) in use across 52 CCCs: COMPASS, CELSA, ACCUPLACER (CPT), Companion, MDTP, CTEP, Critical Mass, CA Chemistry Diagnostic Test • Alternative: locally developed or managed instruments

  5. The Postmodern Context of Assessment • Hermeneutic Circle: Why did the psychic cross the road? • Options for new (i.e., unknown) students: • Start at the beginning • Start NOT at the beginning (demonstrate knowledge)

  6. The 6 Pieces of the Placement Process • Validity • Freedom from Bias • Reliability • Standard Error of Measurement (SEM) • Disproportionate Impact • Standardization

  9. Validity • Content • Criterion or Consequential

  10. Content Validity • Placement acts like a prerequisite • Faculty participation is necessary • Match between items on a test and prerequisite skills for a class

  11. Criterion/Consequential Validity • Evidence that students can be placed into different levels of courses • Either is permissible to meet this standard • Rule of thumb: • Criterion: test has not been used for placement • Consequential: test has been used for placement

  12. Criterion Validity • Various measures can be used • Student ratings • Faculty ratings • Mid-term test scores • Final grades • The infamous “.35” • Professional Judgment

  13. Consequential Validity • At a minimum: student evaluation and faculty evaluation • 75% agreement criterion

  14. Reliability Reliability = Consistency • Internal consistency • Stability • Test-retest • Equivalent form
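Internal consistency is typically summarized with Cronbach's alpha. A self-contained sketch under illustrative data (the formula is alpha = k/(k−1) · (1 − Σ item variances / total-score variance)):

```python
from statistics import pvariance

def cronbach_alpha(items):
    """items: one inner list of scores per test item, aligned by student."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-student totals
    item_var = sum(pvariance(scores) for scores in items)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Hypothetical responses: 4 items scored for 6 students (illustrative only)
items = [
    [3, 4, 5, 2, 4, 5],
    [2, 4, 5, 3, 4, 4],
    [3, 5, 4, 2, 5, 5],
    [2, 3, 5, 2, 4, 4],
]
alpha = cronbach_alpha(items)
print(f"Cronbach's alpha = {alpha:.2f}")
```

Test-retest and equivalent-form reliability are instead estimated by correlating two administrations (or two forms) of the test for the same students.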

  15. Standard Error of Measurement (SEM) The standard deviation of test scores that would have been obtained from a single student had that student been tested multiple times. SEM = S√(1 − r) • S = standard deviation • r = reliability coefficient
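The formula is a one-liner; a quick sketch with illustrative values (a standard deviation of 8 points and a reliability of .90):

```python
from math import sqrt

def standard_error_of_measurement(s, r):
    """SEM = S * sqrt(1 - r): S = SD of scores, r = reliability coefficient."""
    return s * sqrt(1 - r)

sem = standard_error_of_measurement(8.0, 0.90)
print(f"SEM = {sem:.2f}")  # SEM = 2.53
```

Note that as reliability approaches 1, the SEM shrinks toward 0: a perfectly reliable test would reproduce the same score on every administration.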

  16. Disproportionate Impact Occurs when the percentage of students from a particular group directed to a particular placement differs significantly from that group's representation in the population being tested • 80/20 Rule
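Assuming the “80/20 Rule” on the slide refers to the common four-fifths (80%) rule, the check is: a group shows potential disproportionate impact when its placement rate is below 80% of the most favorably placed group's rate. A minimal sketch with hypothetical rates:

```python
def disproportionate_impact(placement_rates, threshold=0.80):
    """Flag groups whose rate of placement into a course level is below
    `threshold` times the highest group's rate (four-fifths rule)."""
    highest = max(placement_rates.values())
    return {g: rate / highest < threshold for g, rate in placement_rates.items()}

# Hypothetical rates of placement into transfer-level English by group
rates = {"Group A": 0.60, "Group B": 0.55, "Group C": 0.42}
flags = disproportionate_impact(rates)
print(flags)  # Group C: 0.42 / 0.60 = 0.70 < 0.80, so it is flagged
```

A flag is a trigger for investigation and monitoring, not by itself proof of bias.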

  17. Standardization • Consistent administration of test • Any accommodations need to be documented

  18. Multiple Measures • If a test is used, more than one measure needs to be used for placement. • If another test is used, the correlation between the two tests must be below .80 • No specific criteria as to what other measures can be

  19. College Responsibilities • 2nd Party Test: Validity (Content, Criterion/Consequential), DI, Standardization (Bias, reliability, SEM) • Developed/Managed: Validity, Bias, Reliability, SEM, DI, Standardization

  20. Developed/Managed For an INITIAL submission • Validity • Content • Criterion (Professional judgment) • Bias • DI – how it will be monitored

  21. Direct Performance • The same 6 criteria are required • Reliability uses ratings between judges • Correlation r = .70 • Agreement = 90% (within 1 point on a 6-point scale) • Description of how differences are resolved

  22. Thank You! Are there any questions?

  23. CCCAA Perspective Steven Jones, MEd Counselor/Assessment Center Coordinator Reedley College

  24. Resources Available to Colleges • CCCAA Website WWW.CCCAA.NET • Site Visit Advisory Program • Free (almost) • Pre-site visit assistance

  25. Thank You! Are there any questions?
