
CBAS RESEARCH: Leveraging Benchmark Assessment to Promote Student Learning and Achievement


Presentation Transcript


  1. CBAS RESEARCH: Leveraging Benchmark Assessment to Promote Student Learning and Achievement Dr. Gilbert Andrada, CT Department of Education Joshua Wilson, University of Connecticut Kelly O’Shea, University of Connecticut Transition to the Connecticut State Standards and System of Assessments Third Annual Connecticut Assessment Forum Crowne Plaza, Cromwell/Rocky Hill, CT August 2012 PowerPoint available at: http://www.education.uconn.edu/assessment/

  2. Relationship between the use of benchmark assessments and summative assessments in math and reading Kelly O’Shea, University of Connecticut Transition to the Connecticut State Standards and System of Assessments Third Annual Connecticut Assessment Forum Crowne Plaza, Cromwell/Rocky Hill, CT August 2012 PowerPoint available at: http://www.education.uconn.edu/assessment/

  3. Overview • Sample description • How strongly correlated are the CBAS and CMT? • Can the CBAS identify high and low performing students? • How do students who take the CBAS compare to students who do not take the CBAS? • Conclusions and Implications

  4. Sample description • CMT: 2009, 2010, 2011 • CBAS: 2009-2010, 2010-2011 • Only the results for the 2011 CMT and the 2010-2011 CBAS are presented here. • Math CMT: 144,520 students • Math CBAS: 15,444 students • Reading CMT: 144,520 students • Reading CBAS: 12,344 students

  5. How strongly correlated are the CMT and CBAS? • Correlations evaluate the strength of the relationship between two variables (the 2011 CMT and the 2010-2011 CBAS). • Partial correlations measure the strength of the relationship between those two variables after accounting for a third variable (the 2010 CMT) that is correlated with both. • Low: how a student performs on the CBAS is not at all related to how they perform on the CMT • Moderate: how a student performs on the CBAS is somewhat related to how they perform on the CMT • High: how a student performs on the CBAS is strongly related to how they perform on the CMT
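To make the partial-correlation idea concrete, here is a minimal sketch in Python on simulated data. The variable names and values are illustrative stand-ins, not the actual CBAS/CMT records; the point is only the mechanics: residualize both variables on the control variable (the prior-year score), then correlate the residuals.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 500
cmt_2010 = rng.normal(250, 50, n)                        # prior-year CMT (simulated)
cbas = 0.4 * cmt_2010 + rng.normal(0, 30, n)             # benchmark score (simulated)
cmt_2011 = 0.8 * cmt_2010 + 0.2 * cbas + rng.normal(0, 20, n)

# Zero-order correlation between the CBAS and the 2011 CMT
r, _ = stats.pearsonr(cbas, cmt_2011)

# Partial correlation: residualize both on the 2010 CMT, then correlate
def residuals(y, x):
    fit = stats.linregress(x, y)
    return y - (fit.intercept + fit.slope * x)

r_partial, _ = stats.pearsonr(residuals(cbas, cmt_2010),
                              residuals(cmt_2011, cmt_2010))
print(f"zero-order r = {r:.2f}, partial r = {r_partial:.2f}")
```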

  6. Correlations - math • Correlation between math CBAS and the 2011 math CMT (after accounting for the 2010 CMT): .42 (moderate)

  7. Correlations - math [Figure: path diagram of correlations among the 2010 CMT, 2011 CMT, and math CBAS (r = .91, .61, .50); partial correlation = .42]

  8. Correlations - reading Correlation between reading CBAS and the 2011 reading CMT (after accounting for the 2010 CMT): .35 (moderate)

  9. Correlations - reading [Figure: path diagram of correlations among the 2010 CMT, 2011 CMT, and reading CBAS (r = .88, .59, .66); partial correlation = .35]

  10. How strongly correlated are the CMT and CBAS? • The CBAS is moderately correlated with the CMT in both math and reading; generally, students who perform well on one will perform well on the other.

  11. Can the CBAS identify high and low performing students? The purpose of a t-test is to determine whether differences between group means are statistically significant. • The differences in mean CMT scores were evaluated. • Students were sorted into three groups based on their CBAS scores for the t-tests: • Students who did not take the CBAS • Students who took the CBAS and scored below the average CBAS score • Students who took the CBAS and scored at or above the average CBAS score
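The three-group comparison described above can be sketched as a set of pairwise independent-samples t-tests. The group sizes and score distributions below are simulated stand-ins, not the study data; Welch's correction is used as a defensive default here, though the original analysis may have used pooled-variance t-tests.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical CMT scale scores for the three groups
no_cbas    = rng.normal(250, 40, 300)   # did not take the CBAS
cbas_below = rng.normal(230, 40, 150)   # took the CBAS, scored below the mean
cbas_above = rng.normal(270, 40, 150)   # took the CBAS, scored at/above the mean

# Pairwise independent-samples t-tests between the three groups
pairs = [("above vs none",  cbas_above, no_cbas),
         ("above vs below", cbas_above, cbas_below),
         ("none vs below",  no_cbas,    cbas_below)]
for label, a, b in pairs:
    t, p = stats.ttest_ind(a, b, equal_var=False)   # Welch's t-test
    print(f"{label}: t = {t:.2f}, p = {p:.4g}")
```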

  12. T-tests - math • Students who took the CBAS and scored at or above average had mean CMT scores that were statistically significantly higher than students who did not take the CBAS as well as students who took the CBAS and scored below average. • Students who didn’t take the CBAS had mean CMT scores that were statistically significantly higher than students who took the CBAS and scored below average.

  13. Group means - math

  14. T-tests - reading • Students who took the CBAS and scored at or above average had mean CMT scores that were statistically significantly higher than students who did not take the CBAS as well as students who took the CBAS and scored below average. • Students who didn’t take the CBAS had mean CMT scores that were statistically significantly higher than students who took the CBAS and scored below average.

  15. Group means - reading

  16. Can the CBAS identify high and low performing students? • Based on the t-tests, there were statistically significant differences in mean CMT scores between all three groups in both reading and math. • The group that took the CBAS and scored at or above average had the highest mean CMT scores • The group that did not take the CBAS had roughly average CMT scores • The group that took the CBAS and scored below average had below-average CMT scores • These results complement the correlations by confirming that the CBAS can identify high- and low-performing students on the CMT.

  17. How do students who take the CBAS compare to students who do not take the CBAS? • Gain Score: Difference between 2010 CMT and 2011 CMT scores on the vertical scale • This allows us to examine the growth during the year for students who took the CBAS vs. those who did not. • Three ANOVAs were performed to determine if the mean gain scores were statistically significantly different between CBAS-users and non-CBAS-users: • By subject (math/reading) • By subject and grade • By subject, grade, and achievement (proficiency) levels
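A hedged sketch of the gain-score computation and the first (by-subject) comparison, using simulated vertical-scale scores. With only two groups a one-way ANOVA is equivalent to a t-test, but the ANOVA call mirrors the analysis named above; the means and sample sizes are invented for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Hypothetical vertical-scale CMT scores for CBAS users and non-users
pre_users, post_users = rng.normal(400, 50, 400), rng.normal(425, 50, 400)
pre_non,   post_non   = rng.normal(400, 50, 400), rng.normal(424, 50, 400)

gain_users = post_users - pre_users    # gain = 2011 CMT minus 2010 CMT
gain_non   = post_non - pre_non

# One-way ANOVA comparing mean gain scores between the two groups
F, p = stats.f_oneway(gain_users, gain_non)
print(f"users: M = {gain_users.mean():.2f}; non-users: M = {gain_non.mean():.2f}")
print(f"F = {F:.2f}, p = {p:.4f}")
```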

  18. Analysis of gain scores - math • In math, there were statistically significant differences between the gain scores for students who took the CBAS (Mean=25.29, SD=23.32) and students who did not take the CBAS (Mean=24.59, SD=23.29).

  19. Analysis of gain scores by grade - math • Gain scores decrease across grades. • In grades 3-5, non-CBAS-users have higher gain scores than CBAS-users; in grades 6-7, CBAS-users have higher gain scores. • The differences in grades 5 and 7, however, were not statistically significant.

  20. Analysis of gain scores by grade and achievement level - math • Grade 3 • Non-CBAS-Users had statistically significantly higher gain scores in math at all achievement levels except Basic. • Achievement Levels (asterisks mark levels with statistically significant differences): • 1 = Below Basic** • 2 = Basic • 3 = Proficient* • 4 = Goal** • 5 = Advanced**

  21. Analysis of gain scores by grade and achievement level - math • Grade 4 • Non-CBAS-Users had statistically significantly higher gain scores in math across all achievement levels except Below Basic. • Achievement Levels: • 1 = Below Basic • 2 = Basic** • 3 = Proficient* • 4 = Goal* • 5 = Advanced**

  22. Analysis of gain scores by grade and achievement level - math • Grade 5 • Non-CBAS-Users had statistically significantly higher gain scores in math at Proficient and Advanced levels. • Achievement Levels: • 1 = Below Basic • 2 = Basic • 3 = Proficient* • 4 = Goal • 5 = Advanced*

  23. Analysis of gain scores by grade and achievement level - math • Grade 6 • CBAS-Users had statistically significantly higher gain scores at the Advanced level. • Achievement Levels: • 1 = Below Basic • 2 = Basic • 3 = Proficient • 4 = Goal • 5 = Advanced*

  24. Analysis of gain scores by grade and achievement level - math • Grade 7 • Non-CBAS-Users had statistically significantly higher gain scores in math at Basic level. • Achievement Levels: • 1 = Below Basic • 2 = Basic* • 3 = Proficient • 4 = Goal • 5 = Advanced

  25. Analysis of gain scores - reading • In reading, there were not statistically significant differences between the gain scores for students who took the CBAS (Mean=21.14, SD=26.10) and students who did not take the CBAS (Mean=21.22, SD=25.97).

  26. Analysis of gain scores by grade - reading • Gain scores generally decrease across grades (with the exception of 5th grade). • Only in 3rd grade did CBAS-users have higher mean gain scores than non-CBAS-users. • However, the differences were not statistically significant in 5th and 7th grades.

  27. Analysis of gain scores by grade and achievement level - reading • Grade 3 • CBAS-Users had statistically significantly higher gain scores in reading at the Below Basic level. • Achievement Levels: • 1 = Below Basic* • 2 = Basic • 3 = Proficient • 4 = Goal • 5 = Advanced

  28. Analysis of gain scores by grade and achievement level - reading • Grade 4 • Non-CBAS-Users had statistically significantly higher gain scores in reading at Goal level. • Achievement Levels: • 1 = Below Basic • 2 = Basic • 3 = Proficient • 4 = Goal* • 5 = Advanced

  29. Analysis of gain scores by grade and achievement level - reading • Grade 5 • No statistically significant differences between mean gain scores in reading. • Achievement Levels: • 1 = Below Basic • 2 = Basic • 3 = Proficient • 4 = Goal • 5 = Advanced

  30. Analysis of gain scores by grade and achievement level - reading • Grade 6 • Non-CBAS-Users had statistically significantly higher gain scores at Goal and Advanced levels. • Achievement Levels: • 1 = Below Basic • 2 = Basic • 3 = Proficient • 4 = Goal** • 5 = Advanced**

  31. Analysis of gain scores by grade and achievement level - reading • Grade 7 • CBAS-Users had statistically significantly higher gain scores in reading at Basic level • Non-CBAS-Users had statistically significantly higher gain scores in reading at Proficient level • Achievement Levels: • 1 = Below Basic • 2 = Basic* • 3 = Proficient* • 4 = Goal • 5 = Advanced

  32. How do students who take the CBAS compare to students who do not take the CBAS? • In math, CBAS-users in 6th grade had statistically significantly higher mean gain scores than non-CBAS-users. • In Grade 6 at the Advanced level, CBAS-users had statistically significantly higher mean gain scores • In reading, CBAS-users in 3rd grade had statistically significantly higher mean gain scores than non-CBAS-users. • In Grade 3 at the Below Basic level, CBAS-users had statistically significantly higher mean gain scores • In Grade 7 at the Basic level, CBAS-users had statistically significantly higher mean gain scores

  33. Conclusions • The CBAS and CMT are moderately correlated, indicating that they most likely measure similar content and/or skills • The CBAS appears to be a good predictor of how students will perform on the CMT • Students who score below average on the CBAS also generally score below average on the CMT • Students who score above average on the CBAS also generally score above average on the CMT • Taking the CBAS does not necessarily improve gains on the CMT (with exceptions at certain grade and achievement levels)

  34. Implications • The CBAS can be used to identify high- and low-performing students • Tips for moving the CBAS beyond a predictive purpose (Perie et al., 2007): • CBAS results should be provided and used right away • The strengths and weaknesses of students (or groups of students) should be examined and instruction should be adjusted accordingly • There should be strong support and professional development for teachers to interpret and use CBAS data to effectively modify classroom instruction • Tests should be administered so that they fit within the curriculum rather than interrupt it (students shouldn't be tested on content not yet taught) Perie, M., Marion, S., Gong, B., & Wurtzel, J. (2007). The role of interim assessments in a comprehensive assessment system: A policy brief (Report). National Center for the Improvement of Educational Assessment, The Aspen Institute, and Achieve, Inc. Retrieved from http://www.achieve.org/files/TheRoleofInterimAssessments.pdf

  35. Contact information Kelly O’Shea University of Connecticut kelly.o’shea@uconn.edu Thank you!

  36. CBAS Writing Research Using CBAS-Write to Identify Struggling Writers and Improve Writing Skills Joshua Wilson, University of Connecticut Transition to the Connecticut State Standards and System of Assessments Third Annual Connecticut Assessment Forum Crowne Plaza, Cromwell/Rocky Hill, CT August 2012

  37. Section Overview • Importance of Writing Ability • Features of CBAS-Write • Description of Sample • Using CBAS-Write to Identify Struggling Writers • Using CBAS-Write to Improve Writing Skill • Wrap-Up

  38. Importance of Writing Ability • Writing skills contribute to strong academic and critical thinking skills (Graham & Hebert, 2010). • Weaker writers perform worse in school and earn lower grades (Graham & Perin, 2007). • Writing skill is one of the best predictors of college success (Geiser & Studley, 2001; Norris et al., 2006; ACT, 2005). • Weak writing skills are a strong predictor of college dropout (McGuire, 1986). • Surveys of human resource directors at large companies and state governments describe writing ability as a gatekeeper: weak writing skills prevent applicants from being hired and employees from being promoted (National Commission on Writing, 2004, 2005).

  39. Importance of Writing Ability • However, studies reveal that the majority of U.S. students spend very little instructional time practicing writing skills. • Teachers report devoting only about 9 minutes a day to instruction in planning and revising strategies (Cutler & Graham, 2008). • Cognitive theories of writing place heavy emphasis on the role of individual practice in promoting mastery (e.g., Kellogg, 2008; Kellogg & Whiteford, 2009). • Cognitive processes: planning, translating, reviewing (e.g., Hayes & Flower, 1981); knowledge and memory (Bereiter & Scardamalia, 1987); metacognition (Wong, 1999). • Particularly troubling is the minimal time spent practicing extended texts that require multiple phases of planning, drafting, revising, and editing. • In secondary classrooms (see Applebee & Langer, 2006; Kiuhara, Graham, & Hawken, 2009), writing assignments frequently consist of writing that requires little analysis, interpretation, or actual composing (i.e., short answers, worksheets, summarizing). • In the National Commission on Writing (2008) survey of teens, nearly 80% reported that the average length of their writing assignments is a page or less, and a majority believed teachers should give them more time to write to help them improve their writing abilities.

  40. Features of CBAS-Write • Voluntary • Computer-based* • Automated scoring • PEG = “Project Essay Grade”** • 6 Trait Scoring • Highly reliable (close to 100% reliability) • Individualized feedback • Revise and resubmit multiple times • Links to skill-based tutorials • Teachers receive individual-student reports and class reports *https://www.cbaswrite.com/Home/Welcome **Page, E. B. (1994). Computer grading of student prose, using modern concepts and software. The Journal of Experimental Education, 62(2), 127-142.

  41. Source: http://www.cbaswrite.com/Home/Features#AutoScoring

  42. Additional Info on 6 Trait Scoring • Overall Development: How well the writer communicates with the reader and shows awareness of the audience, task, and purpose. • Organization: The writer's ability to develop a logical plan of organization and maintain control throughout the paper. • Support: The use of appropriate reasons, details, and examples that enhance and develop the presentation. • Sentence Structure: Correct usage, variety, and completeness of sentences. • Word Choice: Use of specific vocabulary and vividness of language. • Mechanics: The correct and effective use of spelling, punctuation, and capitalization. • Total Score: Sum of scores for each of the six traits (range 6-36)

  43. Description of Sample: AY 2010-2011 • Full sample: 9,003 students who submitted over 40,000 writing samples; grades 3-12; 29 districts; 62 schools; DRGs A-H represented • Subsample: 284 students who each completed a minimum of 8 successive drafts (first draft + 7 revisions); all grades but 4 and 12; 12 districts; 15 schools; all DRGs but ‘A’

  44. Subsample Demographics

  45. CBAS Use 1: Identifying Struggling Writers • Can CBAS-Write aid teachers in identifying struggling writers? • Step 1: Identify a pattern of scores across the six traits that is typical of struggling writers at initial draft. • Method: Utilize cluster analysis to identify the profile of scores for different levels of writing performance • Sample: subsample of 284 students who had completed a minimum of 8 drafts (initial draft + 7 revisions)
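A minimal sketch of this clustering step, assuming a students-by-traits matrix of initial-draft scores. The simulated scores below will not reproduce the three profiles reported on the next slide; they only show the mechanics of fitting a three-cluster k-means solution and inspecting each cluster's trait profile.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
# Hypothetical initial-draft scores: 284 students x 6 traits (1-6 scale)
traits = np.clip(rng.normal(3.5, 1.0, size=(284, 6)).round(), 1, 6)

# Fit a three-cluster k-means solution on the trait scores
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(traits)
for c in range(3):
    n = (km.labels_ == c).sum()
    profile = km.cluster_centers_[c].round(1)   # mean score on each trait
    print(f"cluster {c}: n = {n}, trait profile = {profile}")
```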

  46. CBAS Use 1: Identifying Struggling Writers • Results of k-means cluster analysis: three-cluster solution • Low: n = 30; Average: n = 186; Advanced: n = 63

  47. CBAS Use 1: Identifying Struggling Writers • Range and average Total Score for each cluster at initial draft • Note. Total Score range = 6-36. Low: n = 30; Average: n = 186; Advanced: n = 63.

  48. [Figure: initial-draft trait-score profiles for the three clusters (Advanced n = 63, Average n = 186, Low n = 30)]

  49. CBAS Use 1: Identifying Struggling Writers • Key points: • Struggling writers tend to score low (average ~2 out of 6) across all traits • Typically there is very little variation in a student’s scores across traits • Struggling writers particularly stand out for their weak Mechanics scores • Struggling writers tend to score below 18 on the Total Score variable • Identification at initial draft is reliable and valid only to a certain degree
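A simple screening rule built from these key points might look like the following sketch. The Total Score cutoff of 18 comes from the slide; the spread cutoff used to capture "very little variation across traits" is an illustrative assumption, not a value from the study.

```python
import numpy as np

def flag_struggling(trait_scores, total_cutoff=18, max_spread=1.5):
    """trait_scores: the six trait scores (1-6 each) at initial draft."""
    scores = np.asarray(trait_scores, dtype=float)
    total = scores.sum()                     # Total Score, range 6-36
    spread = scores.max() - scores.min()     # flat profiles have low spread
    # Cutoff of 18 is from the slide; max_spread is an assumed value
    return total < total_cutoff and spread <= max_spread

print(flag_struggling([2, 2, 2, 3, 2, 2]))  # True: flat, low profile
print(flag_struggling([4, 5, 4, 5, 4, 5]))  # False: total above the cutoff
```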

  50. CBAS Use 1: Identifying Struggling Writers • Step 2: Determine how reliable initial-draft classification is by analyzing changes in cluster membership over 7 successive drafts. • Method: • Use the profile of scores that identified struggling writers at initial draft (draft 1) and see how many students fit this profile at draft 2, draft 3, draft 4…draft 8 • Examine whether there are subclasses of students within those initially identified as struggling writers
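A sketch of this stability check, again on simulated data: fit the clusters on the initial draft, identify the lowest-scoring cluster, then see what share of initially "Low" students would be assigned to that same cluster at each later draft. The data shape and cluster labeling are assumptions for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
# Hypothetical trait scores: 284 students x 8 drafts x 6 traits
drafts = np.clip(rng.normal(3.5, 1.0, size=(284, 8, 6)), 1, 6)

# Fit the three clusters on the initial draft, as in the sketch above
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(drafts[:, 0, :])
low = int(np.argmin(km.cluster_centers_.sum(axis=1)))  # lowest-scoring cluster

# Track how many initially "Low" students stay in that cluster per draft
struggling = km.labels_ == low
for d in range(1, 8):
    stay = (km.predict(drafts[struggling, d, :]) == low).mean()
    print(f"draft {d + 1}: {stay:.0%} remain in the Low cluster")
```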
