
Data Analysis: Using Data to Inform Instruction

Presentation Transcript


  1. Data Analysis: Using Data to Inform Instruction Dr. Tracey Severns

  2. Introductions – Who am I? Background Check • Teacher • Vice Principal & Principal • Superintendent • Researcher • Presenter • Student

  3. Teachers/Leaders need to be educated consumers and users of data in order to: • Evaluate progress and performance • Establish goals and mobilize efforts • Leverage resources • Inform practice • Guide decision-making • Market results

  4. We’re going to: • Gain insight into the uses of data • Learn how to “unpack” the scores • Learn to ask questions of the data • Discuss how to use data to improve teaching and learning

  5. When working with data, use three reference points. • How are we doing compared to standard? (Proficiency) • How are we doing compared to ourselves? (Progress) • How are we doing compared to others? (Relative performance)
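
A minimal sketch of the three reference points in Python, using invented numbers (the cut score, prior-year mean, and state mean are all assumptions for illustration):

```python
# Hypothetical figures for illustration only.
school_mean_this_year = 74.2   # this year's mean scale score (assumed)
school_mean_last_year = 71.8   # last year's mean scale score (assumed)
proficiency_cut       = 70.0   # the standard (assumed cut score)
state_mean_this_year  = 76.5   # comparison group (assumed state mean)

proficiency = school_mean_this_year - proficiency_cut        # vs. the standard
progress    = school_mean_this_year - school_mean_last_year  # vs. ourselves
relative    = school_mean_this_year - state_mean_this_year   # vs. others

print(f"Proficiency: {proficiency:+.1f} points relative to the cut score")
print(f"Progress:    {progress:+.1f} points year over year")
print(f"Relative:    {relative:+.1f} points vs. the state mean")
```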

  6. According to Just for the Kids (www.just4kids.org): • Students who have been continuously enrolled in the school for at least one year. • Schools that have an equal or greater number of economically disadvantaged students and students with limited English proficiency.

  7. As Victoria Bernhardt says… “Disaggregation is not a problem-solving strategy, it is a problem-finding strategy.”

  8. Possible disaggregates include • Gender • Enrollment in special programs • Ethnicity • School, class, level • SES • Years in the district • Course sequence
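
A minimal sketch of disaggregation with pandas, using a small hypothetical roster (all column names and values are invented for illustration):

```python
import pandas as pd

# Hypothetical roster: one row per student, with a score and a few of the
# disaggregation fields named on the slide (all values invented).
df = pd.DataFrame({
    "score":   [480, 510, 430, 500, 455, 520, 470, 440],
    "gender":  ["F", "M", "F", "M", "F", "M", "F", "M"],
    "program": ["GenEd", "GenEd", "SpecEd", "ELL", "GenEd", "GenEd", "SpecEd", "ELL"],
    "ses":     ["FRL", "Non-FRL", "FRL", "FRL", "Non-FRL", "Non-FRL", "FRL", "FRL"],
})

# Problem-finding, not problem-solving: look for gaps between subgroups.
for dim in ["gender", "program", "ses"]:
    print(df.groupby(dim)["score"].agg(["count", "mean"]).round(1), "\n")
```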

  9. Simpson’s Paradox • Has nothing to do with Homer. • Beware of changes in groups over time whenever the aggregate data show one pattern and the disaggregated, subgroup data show the opposite.

  10. Consider this… SAT Scores 2002: Mean = 480. SAT Scores 2007: Mean = 478. At a BOE meeting, people demand to know, “Why are SAT scores dropping?” But are they?

  11. Let’s examine the data
      SAT Scores 2002: 500, 500, 500, 500, 500, 500, 500, 500, 400, 400 (Mean = 480)
      SAT Scores 2007: 510, 510, 510, 510, 510, 510, 430, 430, 430, 430 (Mean = 478)

  12. Take a look at the scores. In 2002, the 500s represent scores of white students and 400s represent scores of black students. In 2007, the 510s represent scores of white students and 430s represent scores of black students.

  13. So what happened? White students’ scores went up 10 points and black students’ scores went up 30 points. But in 2002, 80% of test-takers were white and 20% were black; in 2007, 60% were white and 40% were black.

  14. And so… • Although the SAT scores for both groups increased, the overall mean decreased because there was a higher percentage of minority students taking the test. • Thus, beware of shifts in subgroup proportion and performance over time.
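
The weighted-mean arithmetic behind slides 10-14, as a quick check in Python:

```python
# 2002: 80% of test-takers score 500, 20% score 400.
mean_2002 = 0.8 * 500 + 0.2 * 400   # 480.0
# 2007: both groups improve (510 and 430), but the mix shifts to 60/40.
mean_2007 = 0.6 * 510 + 0.4 * 430   # 478.0

# Both subgroups went up, yet the overall mean went down.
print(mean_2002, mean_2007)
```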

  15. Simpson’s Paradox at work…
      Ethnic Group    1981   2005   Gain
      White            519    529    +10
      Black            412    433    +21
      Asian            474    511    +37
      Mexican          438    453    +15
      Puerto Rican     437    460    +23
      Am Indian        471    489    +18
      All Students     504    508     +4

  16. Why are our scores dropping?

  17. They’re not. We’re doing better!

  18. Beware of Means… The mean is only one measure of central tendency. Often, the mode or median is a more accurate descriptor of the group’s performance.
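
A small illustration of why the mean can mislead, using an invented set of scores with one low outlier:

```python
from statistics import mean, median, mode

# Invented class scores with one very low outlier.
scores = [85, 85, 85, 82, 80, 78, 75, 20]

print(f"mean   = {mean(scores):.1f}")  # 73.8 -- pulled down by the single outlier
print(f"median = {median(scores)}")    # 81.0 -- middle of the distribution
print(f"mode   = {mode(scores)}")      # 85  -- most common score
```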

  19. When a teacher tries to teach something to the entire class at the same time, “chances are, one-third of the kids already know it; one-third will get it; and the remaining third won’t. So two-thirds of the children are wasting their time.” - Lillian Katz

  20. Monitoring Student Progress Research has found that student achievement increases a full standard deviation when teachers test first, then teach, then test. When you pre-test, you determine the degree of mastery of prerequisite skills. You need to know where students are in order to teach them effectively.
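
One simple way to express a gain “in standard deviations” (not necessarily the exact metric the cited research used) is the mean gain divided by the pre-test standard deviation; a sketch with invented pre/post scores:

```python
from statistics import mean, stdev

# Invented pre-test and post-test scores for the same ten students.
pre  = [52, 48, 61, 55, 43, 58, 50, 47, 63, 45]
post = [59, 55, 68, 62, 50, 65, 57, 54, 70, 52]

gain = mean(post) - mean(pre)
effect_size = gain / stdev(pre)   # gain expressed in pre-test standard deviations

print(f"mean gain   = {gain:.1f} points")
print(f"effect size = {effect_size:.2f}")   # roughly one pre-test SD for these numbers
```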

  21. Look beneath the surface • Enter the data in a spreadsheet • Total each row (student performance) • Total each column (item analysis)
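
The same row/column totals in code rather than a spreadsheet, with a hypothetical 1/0 item matrix (names and data invented):

```python
import pandas as pd

# Hypothetical item matrix: 1 = correct, 0 = incorrect.
items = pd.DataFrame(
    {"Q1": [1, 1, 0, 1], "Q2": [0, 1, 0, 0], "Q3": [1, 1, 1, 1], "Q4": [0, 0, 0, 1]},
    index=["Ana", "Ben", "Cara", "Dev"],
)

student_totals = items.sum(axis=1)   # total each row: student performance
item_totals    = items.sum(axis=0)   # total each column: item analysis

print(student_totals)
print(item_totals / len(items))      # proportion correct per question
```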

  22. Data Analysis • What trends do you find in the data? • To what would you attribute the results? • What questions come to mind when you review the data? • What recommendations would you make to improve student performance?

  23. Assessing Student Progress Close the gaps between the • Written curriculum • Taught curriculum • Assessed curriculum

  24. In other words… • Did we teach what we said we would? • Did they learn what they were supposed to learn?

  25. Time for a Test Create a question to assess mastery of the following standard: SWBAT (Students Will Be Able To) understand and use percents.

  26. SWBAT understand and use % • 50% of 40 • 34% of 67 • 26 is 40% of what number? • In a town election, 5985 people voted. This is 63% of the town’s registered voters. How many people are registered to vote?

  27. SWBAT understand and use % • You deposit $1200 into a savings account that earns 3% interest compounded annually. Find the balance of the account after 2 years. • Describe how to find your percent increase in height from last year to this year. Show how to find this percent using a proportion.
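
A quick arithmetic check of the sample items above:

```python
# Checking the arithmetic for the sample items above (rounded for display).
print(round(0.50 * 40, 2))          # 50% of 40 -> 20.0
print(round(0.34 * 67, 2))          # 34% of 67 -> 22.78
print(round(26 / 0.40, 2))          # 26 is 40% of what number? -> 65.0
print(round(5985 / 0.63, 2))        # 5985 voters are 63% of registrants -> 9500.0
print(round(1200 * 1.03 ** 2, 2))   # $1200 at 3% compounded annually for 2 years -> 1273.08
```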

  28. Common Quarterly Assessments What evidence do we need to demonstrate mastery of the standard? 1. Create common tests 2. Schedule assessments in advance 3. Analyze student performance 4. Re-teach/implement interventions 5. Retest

  29. District Writing Assessment Ensures an annual assessment of writing skills for all students in grades 3-9. Allows teachers to target weaknesses and track student progress over time. Provides the best PD the district can offer

  30. Step 1: Collection 1. Plot and analyze the data 2. Look for patterns by question type (multiple choice vs. constructed response) 3. Examine the frequency of “distracters” 4. Look for patterns by skill/content area 5. Look for patterns by student subgroups
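
A minimal sketch of step 3 (distracter frequency) for one multiple-choice item, with invented responses (the answer key letter is assumed):

```python
from collections import Counter

# Invented responses to one multiple-choice item; the key ("C") is assumed.
responses = ["C", "B", "C", "D", "B", "C", "B", "A", "B", "C"]
key = "C"

counts = Counter(responses)
print(counts)                                                       # which distracter drew the most students?
print(f"proportion correct: {counts[key] / len(responses):.2f}")    # 0.40
```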

  31. Debrief the Test Ask the kids! • Did the test match your expectations? • How did you prepare for the test? • What would you do differently next time? • What should I do differently next time?

  32. Step 2 - Examination Examine the patterns • Allocate adequate time • Involve multiple people/perspectives • Proceed systematically

  33. For example: For criterion-referenced tests • All students by content area (LA) • All students by skill/content • All students by question type • Subgroups by skill/content • Subgroups by question type • Individual students by benchmarks
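
One way to produce these breakdowns is a pivot table over item-level results; a sketch with hypothetical columns and values:

```python
import pandas as pd

# Hypothetical item-level results: one row per student per question
# (column names, subgroups, and skills all invented for illustration).
results = pd.DataFrame({
    "student":  ["Ana", "Ana", "Ben", "Ben", "Cara", "Cara", "Dev", "Dev"],
    "subgroup": ["ELL", "ELL", "GenEd", "GenEd", "GenEd", "GenEd", "ELL", "ELL"],
    "skill":    ["number sense", "geometry"] * 4,
    "qtype":    ["MC", "CR", "MC", "MC", "CR", "CR", "MC", "CR"],
    "correct":  [1, 0, 1, 1, 0, 1, 1, 0],
})

# All students by skill, subgroups by skill, and subgroups by question type.
print(results.pivot_table(values="correct", index="skill", aggfunc="mean"))
print(results.pivot_table(values="correct", index="skill", columns="subgroup", aggfunc="mean"))
print(results.pivot_table(values="correct", index="qtype", columns="subgroup", aggfunc="mean"))
```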

  34. Step 3 - Interpretation 1. Turn numbers into pictures! 2. Summarize observations 3. Describe patterns of strengths 4. Identify areas of concern 5. Generate hypotheses/potential causes 6. Prioritize problems/areas of concern
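
A minimal “numbers into pictures” sketch with matplotlib, using invented percent-correct values by skill area:

```python
import matplotlib.pyplot as plt

# Invented percent-correct values by skill area.
skills = ["Number sense", "Geometry", "Measurement", "Data analysis"]
pct_correct = [78, 62, 55, 71]

plt.bar(skills, pct_correct)
plt.axhline(70, linestyle="--", color="gray", label="target (assumed)")
plt.ylabel("Percent correct")
plt.title("Class performance by skill area")
plt.legend()
plt.savefig("skill_summary.png")
```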

  35. Step 4 – Take Action Establish Goals. Get SMARTS
      S – Specific and clearly articulated
      M – Measurable
      A – Attainable
      R – Results-oriented
      T – Time-bound
      S – Supported by all stakeholders

  36. Develop an Action Plan. • What do I want to know? • What data do I need? • Where and when can I obtain the data? • What will I do with the data? • Who needs to know the information?

  37. Does it really matter? Marzano and Stronge’s research: it takes three consecutive years with a highly effective teacher for students to catch up from one year with an ineffective teacher.

  38. The end! Thank you for coming! If you have any questions, contact me at: tseverns@mtoliveboe.org
