
Data Interpretation I Workshop





Presentation Transcript


  1. Data Interpretation I Workshop 2008 Writing Assessment for Learning

  2. Purposes for the Day
  • Bring context and meaning to the writing assessment project results;
  • Initiate reflection and discussion among school staff members related to the writing assessment results;
  • Encourage school personnel to judiciously review and utilize different comparators when judging writing assessment results;
  • Model processes that can be used at the school- and division-level for building understanding of the data among school staff and the broader community; and,
  • Provide an opportunity to discuss and plan around the data.

  3. Agenda
  • Understanding data: sources, categories & uses
  • Provincial Writing Assessment
    • Conceptual framework
    • Comparators
    • Student performance data
    • Opportunity-to-learn data
    • Standards and cut scores
    • Predicting
  • Categories of data
  • Action planning: linking data, goals and intervention
  • Closure

  4. Synectics • Please complete the following statement: “Data use in schools is like . . . because . . .” Data use in schools is like molasses because it is slow and gets slower as it gets colder. Data use in schools is like molasses because it is sticky and can make a big mess!

  5. A Data-Rich Environment Wellman & Lipton (2004) state: Schools and school districts are rich in data. It is important that the data a group explores are broad enough to offer a rich and deep view of the present state, but not so complex that the process becomes overwhelming and unmanageable. Wellman, B. & Lipton, L. (2004). Data driven dialogue. Mira Via, LLC.

  6. International Data Sources • Programme for International Student Assessment (PISA)

  7. National Data Sources • Pan-Canadian Achievement Program (PCAP) • Canadian Test of Basic Skills (CTBS) • Canadian Achievement Tests (CAT3)

  8. Provincial Data Sources • Assessment for Learning (AFL) • Opportunity to Learn Measures • Performance Measures • Departmentals

  9. Division Data Sources • Division-level rubrics • Division benchmark assessments

  10. Local Data Sources • Cumulative folders • Teacher-designed evaluations • Portfolios • Routine assessment data

  11. Nature of Assessment Data: a continuum from definitive to indicative. Individual, classroom and school data (student evaluations) sit at the definitive end; division, provincial, national and international data (system evaluations) sit at the indicative end. From Saskatchewan Learning, Understanding the numbers.

  12. Depth and Specificity of Knowledge: individual, classroom and school assessments provide in-depth knowledge of specific students; division, provincial, national and international assessments provide in-depth knowledge of systems but little knowledge of specific students. From Saskatchewan Learning. (2006). Understanding the numbers.

  13. Using a Variety of Data Sources • Thinking about the data sources available, their nature and the depth of knowledge they provide, how might the information in each impact the decisions you make? • What can you do with this data? • What is its impact on classrooms?

  14. Using a Variety of Data Sources

  15. Using a Variety of Data Sources

  16. Using a Variety of Data Sources Please refer to the “Using a Variety of Data Sources” template on p. 3 in your handout package as a guide for your discussion.

  17. Assessment for Learning is a Snapshot • Results from a large-scale assessment are a snapshot of student performance. • The results are not definitive. They do not tell the whole story. They need to be considered along with other sources of information available at the school. • The results are more reliable when larger numbers of students participate and when aggregated at the provincial and division level, and should be considered cautiously at the school level. Individual student mastery of learning is best determined through effective and ongoing classroom-based assessment. (Saskatchewan Learning, 2008)

  18. Provincial Writing Assessment: Conceptual Framework – p. 4 & 5 • Colourful Thoughts • As you read through the information on the Provincial Writing Assessment, use highlighters or sticky notes to think about your reading: Wow! I agree with this. Hmm! I wonder. . . Yikes! Adapted from Harvey, S. & Goudvis, A. Strategies that work, 2007.

  19. Comparators: Types of Referencing – p. 6
  • Criterion-referenced: Comparing how students perform relative to curriculum objectives, level attribution criteria (rubrics) and the level of difficulty inherent in the assessment tasks. If low percentages of students are succeeding with respect to specific criteria identified in rubrics, this may be an area for further investigation, and for planning intervention to improve student writing. (Detailed rubrics, OTL rubrics and test items can be sourced at www.education.gov.sk.ca)
  • Standards-referenced: Comparing how students performed relative to a set of professionally or socially constructed standards. Results can be compared to these standards to help identify key areas for investigation and intervention. (Figure .2b, .3c, .4a, .6b, .7b and .8b.)

  20. Comparators: Types of Referencing
  • Experience- or self-referenced: Comparing how students perform relative to the assessment data gathered by teachers during the school year. Where discrepancies occur, further investigation or intervention might be considered. It is recommended that several sources of data be considered in planning. (E.g., comparing these results to current school data, or to the standards set by the panel.)
  • Norm-referenced: Comparing how students in a school performed relative to the performance of students in the division, region or project. Note cautions around small groups of students. Norm-referenced comparisons contribute very little to determining how to use the assessment information to make improvements. (E.g., tables comparing the school, division and province.)
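The caution about small groups of students comes down to simple arithmetic: in a group of n students, each student accounts for 100/n percentage points of the result, so school-level percentages built on a handful of students swing dramatically with a single result. A minimal sketch (the group sizes below are purely illustrative):

```python
# Illustrative arithmetic for the small-groups caution: one student's
# result moves a group's percentage by 100/n percentage points.
def one_student_swing(n):
    """Percentage-point change caused by a single student's result."""
    return 100.0 / n

for n in (10, 30, 100, 1000):
    print(f"n={n}: one student = {one_student_swing(n):.1f} points")
```

A class of 10 swings 10 points per student, while a division of 1,000 swings only 0.1; this is the arithmetic behind the advice that results are more reliable when aggregated at the division and provincial levels.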

  21. Comparators: Types of Referencing • Longitudinal-referenced: Comparing how students perform relative to earlier years’ performance of students. Viewed across several years, assessment results and other evidence can identify trends and improvements. (This data will not appear until the next administration of this assessment.)

  22. Opportunity-to-Learn Elements as Reported by Students
  • Propensity to learn: using resources to explore models, generate ideas and assist the writing process; motivation, attitude and confidence; participation, perseverance and completion; reflection
  • Knowledge and use of before, during and after writing strategies
  • Home support for writing and learning: encouragement and interaction; access to resources and assistance

  23. Opportunity-to-Learn Elements as Reported by Teachers
  • Availability and use of resources: teacher as key resource; teacher as writer; use of curriculum; educational qualifications; professional development; time; student resources
  • Classroom instruction and learning: planning focuses on outcomes; expectations and criteria are clearly outlined; variety of assessment techniques; writing strategies explicitly taught and emphasized; adaptation

  24. Student Performance Outcome Results
  • Demonstration of the writing process: pre-writing; drafting; revision
  • Quality of writing product:
    • Messaging and content: focus; understanding and support; genre
    • Organization and coherence: introduction, conclusion, coherence
    • Language use: language and word choices; syntax and mechanics

  25. Standards To help make meaningful longitudinal comparisons in future years, three main processes will be implemented.
  • Assessment items will be developed for each assessment cycle using a consistent table of specifications.
  • The assessment items will undergo field-testing, one purpose of which is to inform the comparability of the two assessments.
  • Standards will be set for each of the assessment items, so that any differences in difficulty between two assessments are accounted for by varying the standards for the two assessments.

  26. Opportunity-to-Learn and Performance Standards • In order to establish Opportunity-to-Learn and Performance standards for the 2008 Writing Assessment, three panels were convened (one from each assessed grade), consisting of teachers from a variety of settings and post-secondary academics including Education faculty. • The panelists studied each genre from the 2008 assessment in significant detail and established expectations for writing process, narrative products and expository products as well as opportunity to learn.

  27. Thresholds of Adequacy and Proficiency

  28. Thresholds of Adequacy and Proficiency
  • Threshold of Adequacy: 1.87 (Adequate)
  • Threshold of Proficiency: 3.92 (Proficient & Beyond)

  29. Cut Scores • On page 4 of the detailed reports you will find the cut scores detailing the percentage correct required for students to be classified at one of the two levels (adequate or proficient).
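As a sketch of how the thresholds on slide 28 sort students into categories, the hypothetical snippet below classifies mean rubric scores against the reported values of 1.87 (adequacy) and 3.92 (proficiency). The function name and sample scores are illustrative, not taken from the actual scoring procedure:

```python
# Hypothetical classification against the thresholds reported on slide 28.
ADEQUACY_THRESHOLD = 1.87     # "Adequate" begins here
PROFICIENCY_THRESHOLD = 3.92  # "Proficient & Beyond" begins here

def classify(score):
    """Return the performance category for a mean rubric score."""
    if score >= PROFICIENCY_THRESHOLD:
        return "Proficient & Beyond"
    if score >= ADEQUACY_THRESHOLD:
        return "Adequate"
    return "Below Adequate"

for s in (1.5, 2.4, 4.6):
    print(s, classify(s))
```

The same two thresholds applied to a whole class yield the percentages discussed in the prediction activity that follows.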

  30. Predicting Card Stack and Shuffle
  • Individually: As you refer to the cut scores on page 4, create a stack of cards with some of your predictions about student outcomes in narrative and expository writing – consider each separately.
  • Writing process: prewriting, drafting, revising
  • Writing product: message, organization and language choices
  • E.g., I predict 85% of our Gr. 8s will meet the adequate standard or higher in Propensity to Learn and, of those, 20% will be proficient or higher, because our students are very comfortable with writer's workshop processes, which we have emphasized for the last three years.
  • E.g., I predict 90% of our Gr. 5s will score adequate or higher on demonstration of writing process in narrative writing because of our whole-school emphasis on writing, especially with respect to narrative writing.

  31. Predicting Card Stack and Shuffle
  • As you complete each card, place it in the center of the table.
  • As a group, shuffle the cards.
  • In turn, each group member picks a card to read aloud to the table group. The group engages in dialogue or discussion about the items.
  • Guiding questions:
    • With what parts of this prediction do you agree? Why?
    • With what parts of this prediction do you disagree? Why?
    • To what extent is this prediction generalizable to all the classrooms in your school?

  32. Predictions • Considering all of the predictions, are there any themes or patterns emerging upon which you can all agree? • Why might this be?

  33. Comparisons The completed tables are on page 7. • What are you noticing about the data? • What surprised you? • Which of your predictions were confirmed? • Which of your predictions were not confirmed? • Consider your assumptions as you discuss the results. Wellman, B. & Lipton, L. (2004). Data driven dialogue. Mira Via, LLC.

  34. Examining the Report • Take a few minutes to look through the entire AFL report. Use the chart below to guide your thinking and conversation.

  35. Please return at 12:40. I’d trade, but peanut butter sticks to my tongue stud.

  36. Local Level Sources of Data While international, national and provincial sources of data can provide direction for school initiatives, the data collected at the local level is what provides the most detailed information regarding the students in classrooms.

  37. Four Major Categories of Data: Demographics – p. 7
  • Local data: descriptive information such as enrollment, attendance, gender, ethnicity, grade level, etc. Other data can be disaggregated by demographic variables.
  • AFL Opportunity-to-Learn data: family/home support for student writing (encouragement and interaction; access to resources)
  Bernhardt, V. L. (2004). Data analysis for continuous school improvement, 2nd Edition. Larchmont, NY: Eye on Education.
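Disaggregating other data by demographic variables can be sketched in a few lines. The records and field names below are hypothetical stand-ins for whatever local data a school actually keeps:

```python
from collections import defaultdict

# Hypothetical student records: demographic fields plus a writing score.
# Field names and values are illustrative, not from the AFL data files.
students = [
    {"grade": 5, "gender": "F", "score": 3.1},
    {"grade": 5, "gender": "M", "score": 2.2},
    {"grade": 8, "gender": "F", "score": 4.0},
    {"grade": 8, "gender": "M", "score": 3.5},
]

def disaggregate(records, by):
    """Group scores by one demographic variable; return mean per group."""
    groups = defaultdict(list)
    for record in records:
        groups[record[by]].append(record["score"])
    return {key: sum(vals) / len(vals) for key, vals in groups.items()}

by_grade = disaggregate(students, "grade")   # mean score per grade level
by_gender = disaggregate(students, "gender") # mean score per gender
```

The same pattern works for any of the demographic variables listed above (attendance, ethnicity, and so on), which is what lets a school compare subgroup results within its own data.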

  38. Four Major Categories of Data: Student Learning
  • Local data: describes outcomes in terms of standardized test results, grade averages, etc.
  • AFL readiness-related Opportunity-to-Learn data: using resources to explore writing; student knowledge and use of writing strategies (before, during, after)
  • Student performance outcomes: Writing 5, 8, 11 – narrative and expository (writing process; writing product)
  Bernhardt, V. L. (2004). Data analysis for continuous school improvement, 2nd Edition. Larchmont, NY: Eye on Education.

  39. Four Major Categories of Data: Perceptions
  • Local data: provides information regarding what students, parents, staff and community think about school programs and processes. This data is important because people act in congruence with what they believe.
  • AFL readiness-related Opportunity-to-Learn data: commitment to learn (using resources; motivation & attitude; confidence; participation; perseverance & completion; reflection); knowledge and use of writing strategies
  Bernhardt, V. L. (2004). Data analysis for continuous school improvement, 2nd Edition. Larchmont, NY: Eye on Education.

  40. Four Major Categories of Data: School Processes
  • Local data: what the system and teachers are doing to get the results they are getting. Includes programs, assessments, instructional strategies and classroom practices.
  • AFL classroom-related Opportunity-to-Learn data: instruction and learning (planning and reflection; expectations and assessment; focus on writing strategies; adaptations); availability and use of resources (teacher; time; resources for students and teachers)
  Bernhardt, V. L. (2004). Data analysis for continuous school improvement, 2nd Edition. Larchmont, NY: Eye on Education.

  41. What Data are Useful and Available? – p. 8 • Think about the goals/priorities set within your school or school division regarding student writing. • Using the supplied template, begin to catalogue the data you already have and the data you need in order to better address the goals that have been set. • An example follows on the next slide.

  42. Goal: Students will consciously use writing strategies for all genres. Bernhardt, V. L. (2004). Data analysis for continuous school improvement, 2nd Edition. Larchmont, NY: Eye on Education.

  43. Designing Interventions • Assumptions must be examined because our interventions will be based on them. • We must strive to correctly identify the causal factors. • Don’t fall in love with any theory until you have other data. • Use a strength-based approach to interventions.

  44. Team Action Plan • Please turn to page 9 in your handout package. • What are some areas of strength indicated within your data? • What are some areas for improvement indicated within your data? • Please consider all aspects of the report including the Opportunity to Learn Measures.

  45. Fishbone Analysis: Strengths – p. 10
  • At your table, analyze one strength and consider all contributing factors that led to that strength.
  • Example (strength: writing process): all classrooms using Writers’ Workshop; majority of PD focused on writing; PLC read Strategies that Work; teachers explicitly teaching pre-writing strategy in all subjects

  46. Fishbone Analysis: Area for Improvement – p. 11
  • Identify one area for improvement.
  • What elements from your area of strength could contribute to improvement in this area?
  • E.g., we did well in the process of writing because all teachers are explicitly teaching pre-writing across the curriculum with every writing activity.
  • So, we need to explicitly teach how to write introductions, conclusions, and transitions in writing in all subject areas.

  47. Setting a Goal – p. 12
  • Based on your previous discussions regarding strengths and areas for improvement, write a goal statement your team will work on over the coming year.
  • E.g., for the 2010 AFL in Writing, all students will score at level 4 or above with respect to their use of before, during and after writing strategies.
  • Write your goal on the provided bubble map. This is a template – add more bubbles if you need them! You do not have to fill in all the bubbles.
  • Brainstorm possible strategies for meeting that goal. You may need to use different strategies at different grade levels.
