Using Surveys to Evaluate Cut-Scores on Placement Tests NCTA National Conference Scottsdale, AZ September 2003
Presenters • Darlene Nold, Director of Testing Community College of Denver, CO • Ramzi Munder, Assistant Director of Testing Community College of Denver, CO • Karla Bohman, Basic Skills Curriculum Specialist San Juan College, Farmington, NM
CCD Demographics • FTE for fall 2002: 3,071 • Hispanic 32%; African-American 16%; Asian 6%; Native American 2%; White 41%; Other 3% • Mandatory assessment AND mandatory placement state-wide • Number of first-time degree-seekers: 1,123 • Number of students taking Accuplacer: 877 • Percent of students needing basic skills instruction based on cut-scores: • English: 60% Reading: 66% Math: 87%
SJC Demographics • FTE for fall 2002: 3,606 (census) • Native American 24%; Hispanic 10%; White 64%; Other 2% • Mandatory assessment and working toward mandatory placement. • Number of first-time students: 1,654 • Number of first-time students who took all or some parts of the Accuplacer: 832 • Percent of students needing basic skills instruction based on cut-scores: • English: 63% Reading: 56% Math: 97%
Why Evaluate Cut-Scores? • To ensure proper placement into courses • To make sure students have the minimum skills to be successful • To ensure test content matches the curriculum • To raise awareness and increase understanding about testing issues & student success • To fulfill ethical testing standards • To provide data to policy makers to make informed decisions (mandatory placement)
Methods for Evaluating Cut-Scores • Compare the content of the test with the curriculum in the course • Survey what other colleges use for cut-scores • Compare the new test with the old test • Faculty take the test as if they were students • Test students at the END of the course • Perform a series of statistical analyses comparing test scores and student performance (grades) in the course • Survey faculty and students
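The statistical method above can be sketched with a toy example: compare course success rates for students who placed below versus at/above a candidate cut score. All of the data and the cut score of 80 below are hypothetical illustrations, not figures from either college.

```python
# A minimal sketch of comparing test scores with course outcomes.
# Scores, outcomes, and the cut score of 80 are all hypothetical.

def success_rates_by_cut(scores, passed, cut):
    """Return (success rate below cut, success rate at/above cut).

    `passed` holds 1 if the student earned a C or better, else 0.
    """
    below = [p for s, p in zip(scores, passed) if s < cut]
    above = [p for s, p in zip(scores, passed) if s >= cut]
    rate = lambda group: sum(group) / len(group) if group else 0.0
    return rate(below), rate(above)

# Hypothetical placement scores and whether each student earned a C or better
scores = [62, 71, 78, 81, 85, 90, 95, 68, 74, 88]
passed = [0, 0, 1, 1, 1, 1, 1, 0, 1, 1]

low, high = success_rates_by_cut(scores, passed, cut=80)
print(f"Success below cut: {low:.0%}, at/above cut: {high:.0%}")
```

A large gap between the two rates suggests the cut score separates prepared from under-prepared students; in practice this would be run on real grade data and supplemented with correlational analyses.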
Using Survey Instruments to Evaluate Cut-Scores • Survey faculty about students' level of preparedness and their likelihood of success in their classes • Survey students about their own sense of preparedness, assignment difficulty, and whether they expect to earn a passing grade in the class
Faculty Question: How well is each student prepared to learn the material presented in this course? • Under prepared (Does not possess adequate skills to comprehend the new materials presented in this class.) • Adequately prepared (Possesses the skills to complete the course successfully with a normal amount of study and tutoring.) • Over prepared (Has skill levels more appropriate for a higher level course.)
Faculty Survey Sample Form • Teacher Name • Course Name • Column headings for each student: Name | ID | Under Prep | Adequately Prep | Over Prep | No Show
Faculty Question at SJC: How likely is each student to earn a “C” or better in this course? • YES = will be successful based on ability and a normal amount of tutoring and support • NO = will not be successful based on ability and a normal amount of tutoring and support
Student Question: How prepared were you for the work in this course? • Under Prepared • Adequately Prepared • Over Prepared
Student Question: Do you find the assignments in this course to be: • Too difficult • Within your capabilities • Too easy
Student Question: CCD – What grade do you believe you will earn in this class? (A, B, C, D, F) SJC – Do you think you will get a C or better in this class? (YES or NO)
Survey Delivery & Return Method • Announced survey study in college on-line newsletter and campus meetings • Faculty and Student surveys combined in one package • Package sent via internal campus mail to each instructor • Detailed instructions accompanied each survey package (letter from VP) • Support offered as needed • Surveys returned via internal campus mail or in person
Two Placement Survey Studies • CCD – Fall 2002: administered the 10th week of a 15-week term; developmental through first-year college-level reading, English & math; 55% response rate from faculty; 36% response rate from students • SJC – Fall 2002: administered the 5th week of a 16-week term; developmental through first-year college-level reading, English & math; nearly 100% response rate from faculty; 65–70% response rate from students
Challenges • Wording of questions • Data integrity • Manpower • Pulling data from the student database • Cultural issues • Return rate • Time • Money
Factors Affecting the Evaluation of Cut-Off Scores • Restriction of Range • Defining Success • Reliability & Validity of Grading System
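Restriction of range, the first factor above, can be illustrated with simulated data: because only students scoring above the cut enroll in the course, the correlation between test scores and grades computed on that truncated sample understates the true relationship. The numbers below (means, standard deviations, the cut at 75) are arbitrary assumptions for demonstration only.

```python
import random

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

random.seed(1)
# Simulated placement scores and course performance linearly related to them
scores = [random.gauss(75, 12) for _ in range(2000)]
grades = [0.6 * s + random.gauss(0, 10) for s in scores]

full_r = pearson(scores, grades)

# Restricted sample: only students who placed at/above the (hypothetical) cut
kept = [(s, g) for s, g in zip(scores, grades) if s >= 75]
restricted_r = pearson([s for s, _ in kept], [g for _, g in kept])

print(f"Full-range r = {full_r:.2f}, restricted-range r = {restricted_r:.2f}")
```

The restricted correlation comes out noticeably lower than the full-range one even though the underlying relationship is identical, which is why score-grade correlations from enrolled students alone can make a placement test look weaker than it is.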
Withdrawals? Incompletes? • What do you do with grades that indicate withdrawal from the class and/or non-completion of the course? • What about other types of grades? • CCD uses SP for Satisfactory Progress • SJC uses RR for Reregister
What did faculty gain? • Knowledge of the placement tool • Understanding of the relationship between the tool and the curriculum • Insight into curriculum questions and issues • Understanding of the importance of evaluating cut scores on a regular basis • Confirmation of their perceptions of students and students' perceptions of themselves • Awareness of further areas of inquiry related to grading systems, testing, et cetera
What does this do for SJC? • Validates our testing process • Develops a shared understanding about what methods can be used to modify cut scores • Creates much needed floors for lowest developmental courses and refines cut scores at other levels • Provides further areas of inquiry related to testing, grading systems, curriculum, and student tracking • Establishes a benchmark for future self-study
What Did We Learn? • This is a continual process • This study was essential in the modification of cut scores • There will be a large curricular impact • There are not always clear answers • Faculty's perceptions were more accurate than students' • Faculty were more open to this process than we imagined • There are many other studies that can be done with this data