
Data Informed Decision Making that Improves Teaching and Learning


Presentation Transcript


  1. OAA Assistant Principal’s Meeting January 25, 2006 Data Informed Decision Making that Improves Teaching and Learning

  2. Why are educators so fired up about data? Superintendents ask • How do we know if teachers are teaching our curriculum? • How do we maximize the value of dollars spent for assessment and data management? • Are all of our students achieving at acceptable levels?

  3. Professional learning communities ask • What is it we want our students to know and be able to do? • How will we know when they have learned it? • What will we do when students are not learning?

  4. Creating some common language about data in schools. What are the major systems? How are they related? What have districts done? Where do we want to go?

  5. 4 Major Data & Technology Systems in Schools • Assessment systems • Student information systems • Data analysis systems • Data warehouse

  6. Data analysis process. From Matt Stein, Making Sense of the Data: Overview of the K-12 Data Management and Analysis Market, Eduventures, Inc., Nov. 2003.

  7. What is a Student Information System? • Registers new students • Demographic information (address, emergency contacts, etc.) • Attendance • Scheduling of classes • Achievement data • Examples include: CIMS, Skyward, Chancery, Pentamation, Zangle, etc. It does not keep track of what is going on in classrooms.
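
To make the list above concrete, here is a minimal, hypothetical sketch (in Python) of the kind of record an SIS keeps. The field names and sample values are illustrative only and are not taken from CIMS, Skyward, or any other product mentioned.

```python
from dataclasses import dataclass, field

@dataclass
class StudentRecord:
    """Hypothetical SIS record: demographics, attendance, scheduling, achievement."""
    student_id: str
    name: str
    address: str
    emergency_contacts: list = field(default_factory=list)
    attendance: dict = field(default_factory=dict)    # date -> "present" / "absent" / "tardy"
    schedule: list = field(default_factory=list)      # course/section identifiers
    achievement: dict = field(default_factory=dict)   # assessment name -> score

# Register a new (fictional) student, log attendance, and record one score.
becky = StudentRecord(student_id="000123", name="Becky", address="123 Main St")
becky.attendance["2006-01-25"] = "present"
becky.schedule.append("Grade 4 - Room 12")
becky.achievement["MEAP Math (Gr. 4)"] = 541
```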

  8. What is an Assessment System? A tool for gathering achievement information. • Some deliver item banks: Benchmark by NCS Pearson, MAP by the Northwest Evaluation Association • Some deliver intact tests: Assess2Learn by Riverside, EdVision by Scantron, Homeroom by Princeton Review • Most are web-based. It assesses what is going on in classrooms.

  9. Who needs what data? A single assessment cannot meet all needs. • Large grain size (administrators, public, legislators): evaluation, accountability, long-range planning. E.g., What percent met standards on 4th grade MEAP math? Are students doing better this year than they were doing last year? • Fine grain size (teachers, parents, students): diagnosis, prescription, placement, short-range planning, very specific achievement information. E.g., Who understood this concept? Why is Becky having trouble reading?

  10. What is a “data analysis system?” • The vendor maps your data to their system • Predefines the kinds of analyses staff will do • Allows users to create answers to questions • Lots of nice graphs, lists, etc. Examples: AMS by TurnLeaf, SAMS by Executive Intelligence, QSP, STARS by SchoolCity, Pinnacle by Excelsior, Inform by Pearson. FileMaker lets districts invent their own system. D’Tool and TestWiz are “sort of” data analysis systems.

  11. What is a data warehouse? • It brings all the various sets of data together • Financial data • Personnel data • Building infrastructure data • Student demographic information • Student program information • Student achievement information • Example: Center for Educational Performance and Information’s Michigan Education Information System. (80% of work is data cleansing.)
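
As a rough picture of what “bringing the data together” and “data cleansing” mean in practice, the sketch below joins a hypothetical demographic extract to a hypothetical MEAP math extract on a shared student ID. All table and column names are invented for this example; they are not CEPI’s actual schema.

```python
import pandas as pd

# Hypothetical extracts from two source systems; column names are illustrative only.
demographics = pd.DataFrame({
    "student_id": ["000123", "000124", "000125"],
    "grade": [4, 4, 4],
    "gender": ["F", "M", " f "],      # a dirty value that needs cleansing
})
meap_math = pd.DataFrame({
    "student_id": ["000123", "000124"],
    "meap_math_scale_score": [541, 503],
})

# Data cleansing: most of the warehouse effort is normalizing values like these.
demographics["gender"] = demographics["gender"].str.strip().str.upper()

# The warehouse join: one row per student, demographics and achievement side by side.
warehouse = demographics.merge(meap_math, on="student_id", how="left")
print(warehouse)
```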

  12. What’s in CEPI’s data warehouse? • School Code Master • School Infrastructure Database (SID) • Single Record Student Database (SRSD) • Financial Information Database (FID) • Registry of Educational Personnel (REP) • Student Test and Achievement Repository (STAR): MEAP, ACT, SAT

  13. Why some things aren’t in a warehouse… • Easier to ignore • Hoarding • Not sure what it is or how to measure it • Overlooked • Stray

  14. How are these things related? You can have a Student Info System and nothing else. You can have an assessment system and nothing else (but most assessment systems “depend” on data from the SIS). There is no point in having a data analysis system unless you have data. If you have a SIS & an assessment system, you’ll probably want a data analysis system. The State of Michigan is creating a data warehouse. A data analysis system could also use data from the warehouse. A data analysis system can bring the pieces together without a warehouse.

  15. Oakland Schools Board of Education agreed to spend up to $1,600,000 in 2005-06 to make Pearson Benchmark “Lite” & Inform available to all districts.

  16. What we are trying to do: Provide Technology that Will Help • Improve teaching and increase learning for all • Useful reports for teachers, principals and district administration • Common assessments tied to GLCEs • Item banks tied to GLCEs • Multiple district on-ramps

  17. Project Planning Process • Fall 2003 – Meetings with focus groups • Fall 2004 – Create RFP • Oct 2004 – Meeting with Assessment, Curriculum and Technology directors from Oakland districts to discuss requirements • Dec 2004 – RFP sent out to bid • Jan 2005 – 10 responses received • May 2005 – Committee selects products • July 2005 – Oakland Schools BOE approval

  18. Oakland & LEA Members Only (N = 15). Items are arranged by “Importance” rating.

  19. Major Parts of Each System. “Pearson Benchmark” Student Assessment System (an assessment portfolio “for learning”): Curriculum Framework (GLCEs), Items, Tests, Administer tests, Score & Report. “Pearson Inform” Data Analysis Tool (an electronic CA-60 “of learning”): Interface with SIS, Import external tests, Import Benchmark tests, Select & Analyze Groups, Graphs, Drill Down.

  20. Measure, Manage and Maximize Student Achievement

  21. Benchmark Test Results: By Test. This view displays one or all tests that the selected student population has taken. Student scores are plotted across a proficiency scale. The view displays the percentage of students who scored within the range of each level on the proficiency scale.
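
The “percentage of students within each level” idea can be sketched in a few lines. The scores and cut points below are invented for illustration; Benchmark’s actual proficiency scales are configured in the product.

```python
import pandas as pd

# Hypothetical percent-correct scores and cut points for a four-level scale.
scores = pd.Series([38, 55, 61, 72, 78, 84, 90, 93], name="percent_correct")
levels = pd.cut(
    scores,
    bins=[0, 50, 70, 85, 100],
    labels=["Not Proficient", "Partially Proficient", "Proficient", "Advanced"],
)

# Percentage of students whose scores fall within each level of the scale.
print((levels.value_counts(normalize=True).sort_index() * 100).round(1))
```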

  22. Benchmark Test Results: By Standard. This view displays each assessed standard and graphs the percentage of students who mastered and did not master the standard on each assessment. Selecting a single test displays detailed results by standard for that test. Selecting all tests displays student performance on the standards over time.

  23. Benchmark Test Results: By Individual - View Mastery Details. This view displays all mastery records for the given student, sorted by standard. This represents a detailed running record of a student’s mastery across all benchmark tests.

  24. Benchmark Test Results: Item Analysis. This view displays each test question and the percentage of students in the current sample who responded with each option (A, B, C, etc.). The bar graph displays the percentage of students who answered each question correctly and incorrectly.
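
Behind an item-analysis view like this are two simple tabulations. The sketch below, using made-up responses and a made-up answer key, shows both calculations the slide describes: the percent of students choosing each option, and the percent answering each question correctly.

```python
import pandas as pd

# Hypothetical responses: one row per student, one column per test question.
responses = pd.DataFrame({
    "Q1": ["A", "A", "B", "A", "C"],
    "Q2": ["D", "B", "B", "B", "B"],
})
answer_key = {"Q1": "A", "Q2": "B"}

for item, key in answer_key.items():
    option_pcts = (responses[item].value_counts(normalize=True) * 100).round(1)
    pct_correct = (responses[item] == key).mean() * 100
    print(f"{item}: distribution of responses (%)\n{option_pcts}")
    print(f"{item}: {pct_correct:.0f}% answered correctly\n")
```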

  25. Benchmark Test Results: Item Analysis (continued). Click on the question number to see the question itself. Click on the icon next to the question number to see a breakdown of the item’s performance by demographic category.

  26. Benchmark Test Results: Frequency Distribution. This view plots a line-dot graph based on the test frequency distribution, and calculates the range, mean, standard deviation, and standard error. In addition to this baseline data, you can choose to plot up to four graphs for particular demographic groups. The sample displays the distribution of female scores compared to the overall baseline. The view also displays how the scores fall along the selected proficiency scale.
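
The four statistics this view reports are standard ones; here is a small sketch using made-up scores. One assumption: “standard error” is computed here as the standard error of the mean, which the product may define differently.

```python
import math
import statistics

# Hypothetical scores from one benchmark administration.
scores = [38, 55, 61, 72, 78, 84, 90, 93]

score_range = max(scores) - min(scores)
mean = statistics.mean(scores)
std_dev = statistics.stdev(scores)              # sample standard deviation
std_error = std_dev / math.sqrt(len(scores))    # standard error of the mean (assumed)

print(f"range={score_range}  mean={mean:.1f}  sd={std_dev:.1f}  se={std_error:.1f}")
```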

  27. Pearson Benchmark: Benchmark “Lite” ends here.

  28. SCoPE Science Kindergarten
  I Constructing New Scientific Knowledge
  I.1 Constructing New Scientific Knowledge
  I.1.E.1 Generate questions about the world based on observation.
  I.1.E.1.01 Generate questions about the physical characteristics of plants or animals based on observation.
  I.1.E.2 Develop solutions to problems through reasoning, observation, and investigations.
  I.1.E.2.01 Create clues to help identify physical objects.
  I.1.E.2.02 Develop solutions to problems of waste management through reasoning.
  I.1.E.3 Manipulate simple devices that aid observations and data collection.
  I.1.E.4 Use simple measurement devices to make measurements in scientific investigations.
  I.1.E.5 Develop strategies and skills for information gathering and problem solving.
  I.1.E.6 Construct charts and graphs and prepare summaries of observation.
  I.1.E.6.01 Construct graphs based on observations of the physical characteristics of animals or plants.
  I.1.E.6.02 Construct a chart classifying objects based upon physical attributes/properties.

  29. What Attaches Where?
  Science: Sequence of Study, Grade Level Overview (K-11)
  Kindergarten: Units of Study (documents in their entirety), Grade Level Overview (K only)
  I Constructing New Scientific Knowledge
  I.1 Constructing New Scientific Knowledge
  I.1.E.1 Generate questions about the world based on observation. - Test Items
  I.1.E.1.01 Generate questions about the physical characteristics of plants or animals based on observation. - Lesson Plans, Test Items
  Resources to be attached (hyperlinked) are shown in blue text.
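
One way to read “what attaches where”: resources hang at specific levels of the GLCE outline, and a lookup for an expectation can also walk up to whatever is attached to its ancestors. The short sketch below illustrates that idea with the codes from the previous slide; the parent links and attachments are invented for illustration and are not the product’s data model.

```python
# Illustrative parent links and attachments for part of the Kindergarten outline above.
parents = {
    "I.1": "I",
    "I.1.E.1": "I.1",
    "I.1.E.1.01": "I.1.E.1",
}
attachments = {
    "I.1.E.1": ["Test Items"],
    "I.1.E.1.01": ["Lesson Plans", "Test Items"],
}

def resources_for(code):
    """Collect resources attached to a GLCE code and to each of its ancestors."""
    found = []
    while code is not None:
        found.extend(attachments.get(code, []))
        code = parents.get(code)
    return list(dict.fromkeys(found))   # de-duplicate, keep order

print(resources_for("I.1.E.1.01"))      # ['Lesson Plans', 'Test Items']
```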

  30. Pearson School Systems *** School District Self-Guided Product Tour  Please see comments in Notes Section, using “Notes Page” view.

  31. Principal’s Dashboard

  32. All users can run queries and reports(Teachers, principals, counselors, etc.)

  33. All tests are also broken down by Concepts (“Strands”)

  34. Parent’s / Student’s Dashboard

  35. Quick Start We suggest you start simple with Pearson Benchmark… that means giving your first few tests using “Answer Key Only”

  36. Quick Start Why Answer Key Only? • You get up and running in the shortest amount of time • You get up and running with the least amount of up-front setup • You get access to content-based reports • You don’t have to put items into Benchmark • You can use the paper tests that you have been using all along • You’ve minimized your “degrees of freedom,” which will maximize your chance for success!

  37. Quick Start Why NOT Answer Key Only? • You won’t get reports that include the actual test item.

  38. Quick Start Steps for AKO tests • Tell Benchmark how many items there are on the test • Tell Benchmark what the answers are • Tell Benchmark how the items relate to the curriculum • Assign/print/administer test • Scan answer sheets • Emerge from your office, victorious!
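
As a rough picture of what those steps produce, the sketch below scores two hypothetical answer sheets against an answer key and rolls results up by GLCE. The codes, responses, and scoring logic are invented for illustration; this is not how Benchmark itself processes scan sheets.

```python
# Illustrative only: Benchmark handles the actual scanning and scoring. This sketch
# shows what an "Answer Key Only" setup boils down to: an answer key (step 2), an
# item-to-GLCE map (step 3), and scanned responses (step 5).
answer_key = ["A", "B", "D", "C"]
item_to_glce = ["I.1.E.1", "I.1.E.1", "I.1.E.6", "I.1.E.6"]   # hypothetical alignment

scanned = {                      # hypothetical responses, one list per student
    "000123": ["A", "B", "D", "A"],
    "000124": ["A", "C", "D", "C"],
}

for student, responses in scanned.items():
    correct = [given == key for given, key in zip(responses, answer_key)]
    total_pct = 100 * sum(correct) / len(answer_key)
    by_glce = {}                 # GLCE code -> (items correct, items mapped)
    for glce, ok in zip(item_to_glce, correct):
        hits, n = by_glce.get(glce, (0, 0))
        by_glce[glce] = (hits + int(ok), n + 1)
    print(student, f"{total_pct:.0f}% correct", by_glce)
```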

  39. OS Support Oakland Schools’ Continuing Role in Pearson Benchmark We’re here for you…help is just a phone call away!

  40. Oakland Schools’ Continuing Role in Inform • Create a structure for naming/filing queries for principals and teachers • Create a consistent set of queries for each • Teach all principals to run their own queries • Get additional test data into Inform

  41. Professional Development for LEAs • Using data to inform instruction • Using Benchmark & Inform for grouping and differentiation • Using the Benchmark with Common Assessments • Using the Benchmark for Classroom Assessments • Administrator use of Inform • SIP Planning using both products

  42. OS Support: A “modularized” notion of PD • Stage setting (planning) • System administration training • AKO use • Curriculum management • Test item input • Test construction • Online test delivery • Reports • Test diagnostics • Others?

  43. Current Status

  44. Early successes • Lake Orion High School • 5 departments • 14 courses • 36 teachers (about 25%) • 72 sections • Over 2200 scan sheets

  45. Phase I (Sept-Nov) • Meet individually with department heads • Review exams with course teams • Create answer keys • Verify data • Distribute results to participating teachers • Review detailed results with participating teachers • All-staff professional development (11-11-05)

  46. Impact of Phase I • Improved dialogue between participating teams • Discussion and modification of the course assessment schedule • Question issues • Assessment design • Increased participation • Improved teacher comfort level with common assessment procedures

  47. Phase 2 (Nov-Jan) • Try online testing • Try using rubrics • Additional course benchmarks • Build new tests • Identify & train department experts
