
Using Surveys to Evaluate Cut-Scores on Placement Tests



Presentation Transcript


  1. Using Surveys to Evaluate Cut-Scores on Placement Tests NCTA National Conference Scottsdale, AZ September 2003

  2. Presenters • Darlene Nold, Director of Testing Community College of Denver, CO • Ramzi Munder, Assistant Director of Testing Community College of Denver, CO • Karla Bohman, Basic Skills Curriculum Specialist San Juan College, Farmington, NM

  3. CCD Demographics • FTE for fall 2002: 3,071 • Hispanic 32%; African-American 16%; Asian 6%; Native American 2%; White 41%; Other 3% • Mandatory assessment AND mandatory placement state-wide • Number of first-time degree-seekers: 1,123 • Number of students taking Accuplacer: 877 • Percent of students needing basic skills instruction based on cut-scores: • English: 60% Reading: 66% Math: 87%

  4. SJC Demographics • FTE for fall 2002: 3,606 (census) • Native American 24%; Hispanic 10%; White 64%; Other 2% • Mandatory assessment and working toward mandatory placement. • Number of first-time students: 1,654 • Number of first-time students who took all or some parts of the Accuplacer: 832 • Percent of students needing basic skills instruction based on cut-scores: • English: 63% Reading: 56% Math: 97%

  5. Why Evaluate Cut-Scores? • To ensure proper placement into courses • To make sure students have the minimum skills to be successful • To ensure test content matches the curriculum • To raise awareness and increase understanding about testing issues & student success • To fulfill ethical testing standards • To provide data to policy makers to make informed decisions (mandatory placement)

  6. Methods for Evaluating Cut-Scores • Compare the content of the test with the curriculum of the course • Survey what other colleges use for cut-scores • Compare the new test with the old test • Have faculty take the test as if they were students • Test students at the END of the course • Perform a series of statistical analyses comparing test scores with student performance (grades) in the course (see the sketch below) • Survey faculty and students
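A minimal sketch of the statistical method named above, assuming hypothetical records, field names, and cut value (not CCD's or SJC's actual data or analysis): compare the C-or-better rate for students below versus at/above a candidate cut-score.

```python
# Hedged sketch: does a candidate cut-score separate successful students?
# StudentRecord, the sample data, and the cut value are all hypothetical.
from dataclasses import dataclass

@dataclass
class StudentRecord:
    placement_score: int   # e.g., an Accuplacer scaled score
    final_grade: str       # letter grade earned in the course

PASSING = {"A", "B", "C"}  # "success" defined here as C or better

def _success_rate(group):
    passed = sum(r.final_grade in PASSING for r in group)
    return passed / len(group) if group else float("nan")

def success_rates(records, cut_score):
    """C-or-better rates for students below and at/above the cut-score."""
    below = [r for r in records if r.placement_score < cut_score]
    above = [r for r in records if r.placement_score >= cut_score]
    return _success_rate(below), _success_rate(above)

sample = [
    StudentRecord(52, "D"), StudentRecord(58, "C"), StudentRecord(47, "F"),
    StudentRecord(71, "B"), StudentRecord(84, "A"), StudentRecord(66, "C"),
]
low, high = success_rates(sample, cut_score=60)
# A well-placed cut should show a clearly higher rate at/above the cut.
print(f"below cut: {low:.0%}  at/above cut: {high:.0%}")
```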

  7. Using Survey Instruments to Evaluate Cut-Scores • Survey faculty about students’ level of preparedness and their likelihood of success in their classes • Survey students about their perceived preparedness, the difficulty of assignments, and whether they expect to earn a passing grade in the class

  8. Faculty Question: How well is each student prepared to learn the material presented in this course? • Under prepared (Does not possess adequate skills to comprehend the new materials presented in this class.) • Adequately prepared (Possesses the skills to complete the course successfully with a normal amount of study and tutoring.) • Over prepared (Has skill levels more appropriate for a higher level course.)

  9. Faculty Survey Sample Form • Teacher Name • Course Name • One row per student with the columns: Student Name | ID | Under Prep | Adequately Prep | Over Prep | No Show

  10. Faculty Question at SJC: How likely is each student to earn a “C” or better in this course? • YES = will be successful based on ability and a normal amount of tutoring and support • NO = will not be successful based on ability and a normal amount of tutoring and support

  11. Student Question: How prepared were you for the work in this course? • Under Prepared • Adequately Prepared • Over Prepared

  12. Student Question: Do you find the assignments in this course to be: • Too difficult • Within your capabilities • Too easy

  13. Student Question: • CCD – What grade do you believe you will earn in this class? (A, B, C, D, F) • SJC – Do you think you will get a C or better in this class? (YES or NO)
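Slides 8–13 define categorical responses that can be tabulated against placement scores. A hedged sketch of one such cross-tabulation, with illustrative score bands, scores, and ratings (not either college's data):

```python
# Hedged sketch: count faculty preparedness ratings within placement-score
# bands. Band edges, ratings, and scores below are illustrative only.
from collections import Counter, defaultdict

def crosstab(responses, band_edges):
    """responses: (placement_score, rating) pairs -> {band_floor: Counter}."""
    table = defaultdict(Counter)
    for score, rating in responses:
        # Assign the score to the highest band whose lower edge it meets.
        band = max(edge for edge in band_edges if score >= edge)
        table[band][rating] += 1
    return table

responses = [
    (45, "under"), (52, "under"), (58, "adequate"),
    (63, "adequate"), (71, "adequate"), (88, "over"),
]
# If a cut-score of 50 is well placed, "under" ratings should cluster
# in the band below it.
for band, counts in sorted(crosstab(responses, [0, 50, 70]).items()):
    print(f"scores >= {band}: {dict(counts)}")
```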

  14. Survey Delivery & Return Method • Announced survey study in college on-line newsletter and campus meetings • Faculty and Student surveys combined in one package • Package sent via internal campus mail to each instructor • Detailed instructions accompanied each survey package (letter from VP) • Support offered as needed • Surveys returned via internal campus mail or in person

  15. Two Placement Survey Studies • CCD – Fall 2002: administered the 10th week of a 15-week term; developmental through first-year college-level reading, English & math; 55% response rate from faculty; 36% response rate from students • SJC – Fall 2002: administered the 5th week of a 16-week term; developmental through first-year college-level reading, English & math; nearly 100% response rate from faculty; 65–70% response rate from students

  16. SJC Survey Results: Faculty

  17. SJC Survey Results: Faculty

  18. SJC Survey Results: Student

  19. SJC Survey Results: Student

  20. SJC Survey Results: Student

  21. Grades Compared with Survey Results

  22. Test Scores Compared with Survey Results

  23. Faculty Survey Compared with Student Survey Results
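The charts behind slides 16–23 are not reproduced in this transcript. As one hedged illustration of the faculty-versus-student comparison on the slide above, paired ratings could be scored with percent agreement and Cohen's kappa; the pairs below are hypothetical, not the survey data:

```python
# Hedged sketch: percent agreement and Cohen's kappa between each
# instructor's rating of a student and that student's self-rating.
from collections import Counter

def agreement_and_kappa(pairs):
    """pairs: (faculty_rating, student_rating) tuples."""
    n = len(pairs)
    observed = sum(f == s for f, s in pairs) / n
    f_counts = Counter(f for f, _ in pairs)
    s_counts = Counter(s for _, s in pairs)
    # Chance agreement expected from each side's marginal rating rates.
    expected = sum(
        (f_counts[c] / n) * (s_counts[c] / n)
        for c in set(f_counts) | set(s_counts)
    )
    return observed, (observed - expected) / (1 - expected)

pairs = [  # (faculty rating, student self-rating), hypothetical
    ("adequate", "adequate"), ("under", "adequate"),
    ("adequate", "adequate"), ("over", "over"),
    ("under", "under"), ("adequate", "over"),
]
agreement, kappa = agreement_and_kappa(pairs)
print(f"agreement = {agreement:.0%}, kappa = {kappa:.2f}")
```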

  24. Challenges • Wording of questions • Data integrity • Manpower • Pulling data from the student database • Cultural issues • Return rate • Time • Money

  25. Factors Affecting the Evaluation of Cut-Off Scores • Restriction of Range • Defining Success • Reliability & Validity of Grading System
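Restriction of range deserves a concrete illustration: because only students at or above the cut enroll in the course, correlations between test scores and grades are computed on a truncated sample and understate the full-range relationship. A small simulation on purely synthetic data (the effect size is illustrative only):

```python
# Hedged simulation of restriction of range: synthetic scores and grades
# with a known relationship; the correlation is then recomputed using only
# students at/above a cut of 70, mimicking who actually enrolls.
import random
import statistics

random.seed(0)

def pearson(xs, ys):
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

scores = [random.gauss(70, 12) for _ in range(2000)]
grades = [0.05 * s + random.gauss(0, 0.8) for s in scores]

kept = [(s, g) for s, g in zip(scores, grades) if s >= 70]
full_r = pearson(scores, grades)
cut_r = pearson([s for s, _ in kept], [g for _, g in kept])
# The truncated sample shows a weaker correlation than the full range.
print(f"full-range r = {full_r:.2f}, above-cut r = {cut_r:.2f}")
```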

  26. Withdrawals? Incompletes? • What do you do with grades that indicate withdrawal from the class or non-completion of the course? • What about other types of grades? • CCD uses SP for Satisfactory Progress • SJC uses RR for Reregister
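One hedged way to operationalize these decisions before any analysis, assuming an illustrative recoding policy (not what CCD or SJC actually chose): map each grade code to success, non-success, or excluded.

```python
# Hedged sketch: recode institution-specific grade codes before analysis.
# The policy choices below (e.g., excluding W and I, counting SP and RR
# as non-success) are illustrative assumptions, not CCD's or SJC's rules.
GRADE_POLICY = {
    "A": "success", "B": "success", "C": "success",
    "D": "non-success", "F": "non-success",
    "SP": "non-success",  # CCD: Satisfactory Progress (course not completed)
    "RR": "non-success",  # SJC: Reregister
    "W": "exclude",       # withdrawal: outcome arguably never observed
    "I": "exclude",       # incomplete: outcome unknown at analysis time
}

def classify(grades):
    """Map raw grade codes to outcomes, dropping excluded/unknown codes."""
    outcomes = [GRADE_POLICY.get(g, "exclude") for g in grades]
    return [o for o in outcomes if o != "exclude"]

print(classify(["A", "W", "SP", "C", "I", "F"]))
# -> ['success', 'non-success', 'success', 'non-success']
```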

  27. What did faculty gain? • Knowledge of the placement tool • Understanding of the relationship between the tool and the curriculum • Insight into curriculum questions and issues • Understanding of the importance of evaluating cut-scores on a regular basis • Confirmation about their perceptions of students and students’ perceptions of themselves • Awareness of further areas of inquiry related to grading systems, testing, etc.

  28. What does this do for SJC? • Validates our testing process • Develops a shared understanding about what methods can be used to modify cut scores • Creates much needed floors for lowest developmental courses and refines cut scores at other levels • Provides further areas of inquiry related to testing, grading systems, curriculum, and student tracking • Establishes a benchmark for future self-study

  29. What Did We Learn? • This is a continual process • This study was essential in the modification of cut-scores • There will be a large curricular impact • There are not always clear answers • Faculty’s perceptions were more accurate than students’ self-perceptions • Faculty were more open to this process than we imagined • There are many other studies that can be done with this data

  30. Thank you!
