This presentation discusses the implementation of online reporting systems to support assessment programs and educational decisions. It explores the background and challenges of reporting scores, the rise of interactive online reporting systems, and the importance of clear and valid score reports. The study includes feedback from educators on a dynamic reporting system and suggestions for improvements.
Influencing Education: Implementing Online Reporting Systems to Support Assessment Programs and Educational Decisions • Linette McJunkin, CA&L • Sharon Slater, ETS • CCSSO National Conference on Student Assessment • San Diego, CA • June 24, 2015
What we should have called it • Feedback From The Field: Educator Perspectives On A Dynamic Reporting System
Overview • Background on the ongoing challenge to report scores in meaningful ways • Description of the current study • Demonstration of an online reporting system • Overall findings by reporting system feature • Next steps
Background • Accessing information in digital format is increasingly common – banking, movie tickets, health and fitness data • Increasing use of online testing in the K-12 setting • Rise in interactive online reporting systems • In 2008, 12 states were using interactive online reporting (Knupp & Ansley, 2008) • Five years later, 21 states had interactive online reporting (Shin et al., 2013) • This past year, at least 26 states had access to reporting systems through either PARCC or Smarter Balanced
Background, cont. • Score reports have historically been unclear. • In 1996, Hambleton & Slater found many misinterpretations of score reports by asking educators content-related questions. • In 1998, the National Education Goals Panel suggested that consistency in score report content may help reduce misinterpretation. • This year, Achieve provided sample reports to help better communicate student assessment results: http://www.achieve.org/samplestudentreports
Background, cont. • The latest revision of the Standards (2014) has multiple recommendations for reporting. Standard 12.18 In educational settings, score reports should be accompanied by a clear presentation of information on how to interpret the scores, including the degree of measurement error associated with each score or classification level, and by supplementary information related to group summary scores. In addition, dates of test administration and relevant norming studies should be included in score reports.
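For context, the "degree of measurement error" the standard asks for is most often reported as the standard error of measurement from classical test theory. A minimal formulation, assuming the familiar reliability-based definition rather than any particular program's method:

```latex
% Standard error of measurement (SEM) under classical test theory,
% where \sigma_x is the standard deviation of observed scores and
% \rho_{xx'} is the reliability of the scores:
\mathrm{SEM} = \sigma_x \sqrt{1 - \rho_{xx'}}
% An approximate 95% band around an observed score x:
x \pm 1.96\,\mathrm{SEM}
```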
Validity Argument • A test itself is not “valid” – the use of the test results is what is valid (or not). • Zapata-Rivera & Katz (2014) state that the validity of test results is a product of the interpretation and use of those results. • Score reports and supporting materials provide the basis for valid interpretation and use. • The score reports are essentially the test “product” – not the test itself – and require sufficient time and resources to design.
Validity Argument, cont. • How much time is spent on the design of score reports? • How does that compare to time spent on item development? Specifications? Analysis? Technical documentation? • Does score report design typically occur before or after test design? • Who is involved in the design of score reports? How many different perspectives are involved?
Validity Argument, cont. • Accuracy and appropriateness of user interpretation of score reports (Hattie, 2009) have been the focus of recent research, with emphasis on the process: • Determining the intended user(s), the information they need, and the decisions they will make • Evaluating the report with assessment experts and the intended user(s) – accounting for user characteristics • Providing training and support on how to interpret score reports
Current Study • The goals of our study were to: • Learn more about how educators interpret assessment results as presented via an interactive reporting system • Gain feedback on the reporting system itself, in terms of usability and understanding • One-on-one interviews with: • Teachers (17) • Principals (2) • District administrators (2) • Convenience sample from primarily suburban schools (5 districts in PA, 4 in KS)
Interview Protocol • Brief demographic information section • Whether (and how) reporting systems are currently used • Initial impressions of the reporting system • Specific content-related questions on various screens within the system • Overview by scale score and proficiency classification (e.g., How many students are advanced? What is the average scale score?) • Detailed class roster view (e.g., How did Alicia Paulson do on the test? Can you explain what the percentile means?) • Strengths and weaknesses by standard (e.g., In which area is the class performance strongest?)
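To make the protocol questions concrete, here is a minimal sketch of the summary computations such a reporting screen would perform. The roster structure, field names, proficiency labels, and the two added student names are hypothetical, not taken from the system we demonstrated.

```python
# Sketch of the summaries probed by the protocol questions.
# Roster structure, field names, and proficiency labels are hypothetical.
from statistics import mean

roster = [
    {"name": "Alicia Paulson", "scale_score": 512, "level": "Proficient"},
    {"name": "Ben Adams",      "scale_score": 548, "level": "Advanced"},
    {"name": "Cara Diaz",      "scale_score": 471, "level": "Basic"},
]

# Overview screen: average scale score and count per proficiency level
avg_score = mean(s["scale_score"] for s in roster)
n_advanced = sum(1 for s in roster if s["level"] == "Advanced")

# Percentile: share of a reference group scoring below a given score.
# Operational systems would use a norming sample, not the class itself.
def percentile(score, reference_scores):
    below = sum(1 for r in reference_scores if r < score)
    return 100 * below / len(reference_scores)

class_scores = [s["scale_score"] for s in roster]
print(f"Average scale score: {avg_score:.1f}")    # 510.3
print(f"Students at Advanced: {n_advanced}")      # 1
print(f"Alicia's within-class percentile: {percentile(512, class_scores):.0f}")
```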
Feedback-Based Improvements • Average Scale Score screen: • Add the ability to see student names when a user selects one portion of the pie chart • Include a PRINT button at the top right of the screen • Move the legend to the top of the screen so it is more prominent • Create meaningful comparison groups, and let the user define them • Proficiency screen: • Change the title to "Roster View" • Add the ability to sort by various columns of data (total score, subscore, alphabetical), as sketched below
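The requested column sorting is a small amount of logic over the same roster data; a minimal sketch with hypothetical column names:

```python
# Sorting the Roster View by a user-selected column.
# Column names ("name", "total_score", "subscore") are hypothetical.
def sort_roster(roster, by="total_score", descending=True):
    if by == "alphabetical":
        return sorted(roster, key=lambda s: s["name"].lower())
    return sorted(roster, key=lambda s: s[by], reverse=descending)

roster = [
    {"name": "Paulson, Alicia", "total_score": 512, "subscore": 14},
    {"name": "Adams, Ben",      "total_score": 548, "subscore": 18},
]
print(sort_roster(roster, by="subscore"))  # Adams first (18 > 14)
```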
Feedback-Based Improvements, cont. • Strengths and Weaknesses screen: • Remove the "spider web" graphic • Provide as many options as possible for comparison groups in the tree chart • System Overall: • Need something simple, tied to the grade book • Single sign-on, all data in one place
Primary Findings: Teachers • Focused on subscore results to tailor individual or small-group remediation • Understood standard error, but did not want to see it – and certainly did not want to have to explain it to a parent
Primary Findings: Administrators • Focused on proficiency classifications – percent in the Below Basic and Basic categories • Also interested in subscore results to plan professional development, pair teachers with mentors who performed better in a certain area, etc.
Next Steps • Larger, more representative sample of educators • Gather formal, specific feedback on the system from states and districts where it is operational • Replicate the study with parents/guardians • More research needed on how score reports can support appropriate uses of subscore results