
Port Phillip/Bayside Network: Using student achievement data to inform improved teaching programs



1. Port Phillip/Bayside Network: Using student achievement data to inform improved teaching programs. Presented by Philip Holmes-Smith (School Research Evaluation and Measurement Services), www.sreams.com.au

2. Overview of the workshop
• Some definitions and some facts about tests
• Diagnostic vs. Summative Testing
• The reliability of summative (standardised) tests
• Using other summative tests to monitor student progress
• Using and interpreting NAPLAN Data
• Assessment of Learning
  - Using SPA
• Assessment for Learning (Using tests diagnostically to improve learning)
  - The Writing Criteria Report and the Student Response Report
  - The Item Analysis Report

  3. 1(a). Diagnostic vs. Summative Tests

4. Diagnostic Testing
• Assessment tools such as the Maths Online Interview, the Marie Clay Inventory and Probe, together with teacher assessments such as teacher questioning in class, teacher observations and student work (including portfolios), are all examples of “diagnostic” data.
• Research shows that our most effective teachers, in terms of improving the learning outcomes of students, constantly use diagnostic information to inform their teaching.
• If a teacher uses diagnostic information about what each student can and can’t do to inform their teaching for each student, Hattie (2003) shows that this has the single biggest impact on improving student learning outcomes.

5. Summative (Standardised) Testing
• AIM/NAPLAN and other tests like TORCH, PAT-R, PAT-Maths, SA Spelling and On-Demand Adaptive Tests are referred to as “summative” tests.
• Summative testing is essential to monitor the effectiveness of your teaching. (We will look at ways of doing this later.)
• Research shows that summative tests do not lead to improved learning outcomes. As the saying goes: “You don’t fatten a pig by weighing it.”
• So, although it is essential, keep summative testing to a minimum.

  6. 1(b). The Reliability of Summative Tests

7–9. Three Questions (presented as a progressive build)
• Do you believe that your students’ NAPLAN results accurately reflect their level of performance?
• If we acknowledge that the odd student will have a lucky guessing day or a horror day, what about the majority?
  - Have your weakest students received a low score?
  - Have your average students received a score at about the expected level?
  - Have your best students received a high score?
• Think about your students who received high and low scores:
  - Are your low scores too low?
  - Are your high scores too high?

10. Is this reading score reliable?

11. Summary Statements about Scores
• Low scores (i.e. more than 0.5 VELS levels below expected) indicate poor performance, but the actual values should be considered as indicative only.
• High scores (i.e. more than 0.5 VELS levels above expected) indicate good performance, but the actual values should be considered as indicative only.
• Average scores indicate roughly expected levels of performance, but the actual values should be considered as indicative only.
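To make the banding rule concrete, here is a minimal Python sketch (not from the presentation): any score more than 0.5 VELS levels from the expected level is flagged high or low, and everything else counts as roughly expected. The `band` helper and the expected level in the example are illustrative assumptions.

```python
# A minimal sketch (not from the presentation) of the banding rule above:
# more than 0.5 VELS levels from the expected level counts as high or low,
# and the actual values are treated as indicative only. The expected level
# used in the example is illustrative, not an official figure.

def band(score: float, expected: float, tolerance: float = 0.5) -> str:
    """Classify a VELS score relative to the expected level for the year."""
    if score < expected - tolerance:
        return "low"      # indicative of poor performance
    if score > expected + tolerance:
        return "high"     # indicative of good performance
    return "average"      # roughly the expected level of performance

print(band(2.8, expected=3.5))  # -> low
print(band(4.2, expected=3.5))  # -> high
```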

12. Item difficulties for She’s Crying on the TORCH scale

13. Converting raw test scores (She’s Crying) to TORCH scale scores
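The conversion itself is a table lookup: each raw score on a test form maps to a point on the common TORCH scale. A sketch of the mechanics follows; the real mapping lives in the norm tables of the TORCH manual, so the `RAW_TO_TORCH` values below are placeholders, not actual entries for She’s Crying.

```python
# A sketch of the lookup behind converting a raw score to a TORCH scale score.
# Real conversions come from the norm tables in the TORCH manual; the values
# in RAW_TO_TORCH are placeholders, not real entries for She's Crying.

RAW_TO_TORCH = {5: 21.0, 10: 30.5, 15: 38.2, 20: 46.9}  # hypothetical table

def to_scale_score(raw: int) -> float:
    """Look up the TORCH scale score for a raw test score."""
    try:
        return RAW_TO_TORCH[raw]
    except KeyError:
        raise ValueError(f"no norm-table entry for raw score {raw}") from None

print(to_scale_score(10))  # -> 30.5
```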

14. Test difficulties of the TORCH tests on the TORCH scale, together with year-level mean scores

  15. Different norm tables for different tests

16. Test difficulties of the PAT-Maths tests on the PAT-Maths scale, together with year-level mean scores (Year 1 to Year 10). Source: ACER, 2006.

17. Summative Testing and Triangulation
• Even if you give the right test to the right student, sometimes the test score does not reflect the true ability of the student: every measurement is associated with some error.
• To overcome this we should aim to get at least three independent measures, what researchers call TRIANGULATION (a sketch follows this list). These may include:
  - Teacher judgment
  - NAPLAN results
  - Other pen & paper summative tests (e.g. TORCH, PAT-R, PAT-Maths)
  - On-line summative tests (e.g. On-Demand ‘Adaptive’ testing, Assessment of English)
• BUT remember, more summative testing does not lead to improved learning outcomes, so keep summative testing to a minimum.
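One hedged way to picture triangulation in code: once the independent measures are expressed on a common scale, check whether they agree before trusting any single one. The `triangulate` helper and the 0.5-level agreement threshold are illustrative assumptions, not part of any tool named above.

```python
# A hedged sketch of triangulation (not part of any tool named above): express
# each independent measure on a common scale, then check agreement before
# trusting the combined estimate. The 0.5-level threshold is an assumption.

from statistics import mean

def triangulate(measures: dict[str, float], max_spread: float = 0.5) -> str:
    levels = list(measures.values())
    spread = max(levels) - min(levels)
    if spread > max_spread:
        return f"measures disagree (spread {spread:.1f}); follow up before trusting any one score"
    return f"consistent estimate: about level {mean(levels):.1f}"

# Three independent measures, all already converted to VELS-style levels.
print(triangulate({"teacher judgment": 3.4, "NAPLAN": 3.6, "TORCH": 3.5}))
```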

18. Things to look for in a summative test
• It needs to have a single developmental scale that shows increasing levels of achievement over all the year levels at your school.
• It needs to have “norms”, or expected levels, for each year level (e.g. the national norm for Yr 3 students on TORCH is an average of 34.7).
• It needs to be able to demonstrate growth from one year to the next (e.g. during Yr 4, the average student grows from a score of 34.7 in Yr 3 to an expected score of 41.4 in Yr 4, a gain of 6.7 scale points). A worked sketch follows this list.
• As a bonus, the test could also provide diagnostic information.
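Using the norms quoted on the slide, a student’s observed gain can be compared with the norm-expected gain of 41.4 - 34.7 = 6.7 scale points. A small sketch, with invented student scores:

```python
# A worked sketch of the growth check described above, using the TORCH norms
# quoted on the slide (Yr 3 mean = 34.7, Yr 4 mean = 41.4). The student's
# scores are invented for illustration.

NORMS = {3: 34.7, 4: 41.4}  # TORCH year-level mean scale scores (from the slide)

def growth_vs_norm(score_then: float, score_now: float,
                   yr_then: int, yr_now: int) -> float:
    """Student's gain minus the norm-expected gain.
    Positive means the student grew faster than the average student."""
    expected_gain = NORMS[yr_now] - NORMS[yr_then]  # 41.4 - 34.7 = 6.7
    return round((score_now - score_then) - expected_gain, 1)

print(growth_vs_norm(33.0, 42.0, yr_then=3, yr_now=4))  # gained 9.0 vs 6.7 -> 2.3
```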

19. Norms for Year 3 to Year 10 on the TORCH scale

20. My Recommended Summative Tests (Pen & Paper)
• Reading comprehension
  - TORCH and TORCH Plus
  - Progressive Achievement Test – Reading (PAT-R, 4th edition)
• Mathematics
  - Progressive Achievement Test – Mathematics (PAT-Maths, 3rd edition), combined with the I Can Do Maths test
• Spelling
  - South Australian Spelling

21. My Recommended Summative Tests (On-Line)
• On-Demand reading comprehension: the 30-item “On-Demand” Adaptive Reading test
• On-Demand spelling: the 30-item “On-Demand” Adaptive Spelling test
• On-Demand writing conventions: the 30-item “On-Demand” Adaptive Writing test
• Assessment of English in the Early Years (2010), which includes comprehension, spelling, writing, and speaking & listening
• On-Demand mathematics (Number, Measurement, Chance & Data and Space): the 60-item “On-Demand” Adaptive General Mathematics test
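“Adaptive” here means the test adjusts item difficulty to the student as they answer. The toy sketch below illustrates that principle only; it is not the algorithm behind the VCAA On-Demand tests, and the simulated student is invented.

```python
# A toy illustration of the adaptive principle only; this is NOT the algorithm
# behind the VCAA On-Demand tests. Item difficulty steps up after a correct
# answer and down after an incorrect one, in shrinking steps.

import math
import random

def run_adaptive_test(answer_item, n_items: int = 30, start: float = 0.0) -> float:
    ability = start
    for i in range(1, n_items + 1):
        step = 1.0 / i                  # smaller adjustments as evidence accumulates
        correct = answer_item(ability)  # administer an item pitched at the estimate
        ability += step if correct else -step
    return ability

# Simulated student with true ability 1.2 on a logit-style scale (invented).
random.seed(0)
student = lambda d: random.random() < 1.0 / (1.0 + math.exp(d - 1.2))
print(round(run_adaptive_test(student), 2))  # estimate should roughly track 1.2
```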

22. 2(a). Assessment of Learning: Using SPA

  23. The NAPLAN Data Service Main Menu

  24. The Student Achievement Level Report Menu

  25. The Student Achievement Level Report

  26. The Student Achievement Level Report Menu

  27. The Student Achievement Level Report

  28. Extracting Outcome Level Data for Further Analysis

  29. Cut-points for colour coding
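As a rough illustration of how cut-points drive the colour coding, here is a sketch; the cut-points, colours and expected level below are placeholders rather than SPA’s actual settings.

```python
# A sketch of colour-coding scores against cut-points, in the spirit of the
# SPA report above. The cut-points, colours and expected level are
# placeholders; schools configure their own.

CUT_POINTS = [
    (-0.5, "red"),     # more than 0.5 levels below expected
    (0.5, "yellow"),   # within 0.5 levels of expected
]

def colour_for(score: float, expected: float) -> str:
    gap = score - expected
    for upper_bound, colour in CUT_POINTS:
        if gap <= upper_bound:
            return colour
    return "green"     # more than 0.5 levels above expected

print(colour_for(2.8, expected=3.5))  # -> red
print(colour_for(3.6, expected=3.5))  # -> yellow
print(colour_for(4.3, expected=3.5))  # -> green
```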

30. Working with the extracted data (see SPA)

31. 2(b). Assessment for Learning: Using the tests diagnostically to improve student learning

  32. The NAPLAN Data Service Main Menu

  33. The Writing Criteria Report Menu

34. Writing – The marking rubric
• The NAPLAN marking rubric comprises 10 criteria, namely:
  - Audience
  - Text structure
  - Ideas
  - Character and setting
  - Vocabulary
  - Cohesion
  - Paragraphing
  - Sentence structure
  - Punctuation
  - Spelling
• The marking rubric can be downloaded from http://www.naplan.edu.au/verve/_resources/napmarkguide08.pdf or www.sreams.com.au.
• This document also contains some annotated marked examples.
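For analysis it can help to hold the ten criteria in one structure and sanity-check that every criterion has been scored. The helper below is hypothetical, not part of the NAPLAN Data Service, and the illustrative scores ignore the real per-criterion score ranges in the marking guide.

```python
# A hypothetical container for the ten criteria (not part of the NAPLAN Data
# Service). Each criterion has its own score range in the marking guide; the
# illustrative scores below ignore those ranges.

CRITERIA = [
    "Audience", "Text structure", "Ideas", "Character and setting", "Vocabulary",
    "Cohesion", "Paragraphing", "Sentence structure", "Punctuation", "Spelling",
]

def check_complete(scores: dict[str, int]) -> int:
    """Ensure every criterion was marked, then return the total writing score."""
    missing = [c for c in CRITERIA if c not in scores]
    if missing:
        raise ValueError("No score recorded for: " + ", ".join(missing))
    return sum(scores.values())

student = {c: 2 for c in CRITERIA}   # illustrative scores only
print(check_complete(student))       # -> 20
```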

  35. Writing – The marking rubric (cont)

  36. Writing – Annotated marked example

37. The Writing Criteria Report
• Across the State, less than 20% received a score of “1” and about 40% received a score of “3”.
• However, in this school, nearly 40% received a score of “1” and under 20% received a score of “3”.
• This school could benefit from some lessons on how to develop ideas in a Narrative. (A sketch of this school-vs-state comparison follows.)
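The comparison the report makes can be reproduced from raw criterion scores. In this sketch (not part of SPA), the school’s scores on the Ideas criterion are invented, and the state distribution roughly echoes the figures quoted above.

```python
# A sketch (not part of SPA) of the school-vs-state comparison being made
# above. The school's scores on the Ideas criterion are invented; the state
# percentages roughly echo the figures quoted on the slide.

from collections import Counter

def score_distribution(scores: list[int]) -> dict[int, float]:
    """Percentage of students at each score point."""
    counts = Counter(scores)
    return {s: 100.0 * counts[s] / len(scores) for s in sorted(counts)}

school_ideas = [1, 1, 1, 1, 2, 2, 2, 2, 2, 3]   # ten hypothetical students
state = {1: 18.0, 2: 42.0, 3: 40.0}             # approximates the slide's figures

school = score_distribution(school_ideas)
for score in sorted(state):
    print(f"score {score}: school {school.get(score, 0.0):.0f}% "
          f"vs state {state[score]:.0f}%")
# Far more 1s and far fewer 3s than the state: target lessons on developing ideas.
```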

  38. The Student Response Report Menu

  39. The Student Response Report – Writing (by criteria)

  40. The NAPLAN Data Service Main Menu

  41. The Item Analysis Report Menu

  42. The Item Analysis Report

43. Year 5 Reading item: Correct = ? Most common incorrect = ?

44. Year 5 Reading item: Correct = D (93%); most common incorrect = A, B, C (2-3% each)

45. Year 5 Reading item: Correct = ? Most common incorrect = ?

46. Year 5 Reading item: Correct = B (56%); most common incorrect = C (34%)
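The underlying item analysis is a simple tally: the percentage of students choosing each option, with the most popular distractor flagged because it often points at a shared misconception. This sketch is not the NAPLAN Data Service report itself; the responses are generated to match the percentages quoted for slide 46.

```python
# A sketch (not the NAPLAN Data Service itself) of the item analysis behind
# slides 43-46: tally how many students chose each option, then flag the most
# popular distractor. Responses are generated to match slide 46's percentages.

from collections import Counter

def analyse_item(responses: list[str], correct: str) -> None:
    counts = Counter(responses)
    n = len(responses)
    for option in sorted(counts):
        flag = " (correct)" if option == correct else ""
        print(f"{option}: {100 * counts[option] / n:.0f}%{flag}")
    distractors = {o: c for o, c in counts.items() if o != correct}
    print("most common incorrect:", max(distractors, key=distractors.get))

# Like slide 46: B is correct (56%) but C attracts a third of students (34%),
# which points to a specific misconception worth teaching to.
analyse_item(["B"] * 56 + ["C"] * 34 + ["A"] * 6 + ["D"] * 4, correct="B")
```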
