
District Determined Measures Diman Regional Vocational School




Presentation Transcript


  1. District Determined Measures: Diman Regional Vocational School, Dr. Deborah Brady

  2. Goals for Today • By the end of this session: • You will understand what needs to be done and be able to explain it to your colleagues • You will have tools to begin that work in your district • Please email questions or confusions at any time during this process: dbrady3702@msn.com • Materials from this presentation: http://tinyurl.com/lumqld7

  3. The Steps Necessary to Get Ready for the June Report and After

  4. Living Likert • Take a magic marker • Review all 6 stages • After considering each one, go to your “stage” • Consider: • What are your school’s barriers? • What are your district’s strengths?

  5. The DESE Requirements: purpose, timeline, requirements, direct and indirect assessments

  6. District Determined Measures • Definition: DDMs are “Measures of student learning, growth, and achievement related to the Curriculum Frameworks, that are comparable across grade or subject level district-wide” • Types of measures: portfolio assessments; approved commercial assessments; district-developed pre- and post-unit and course assessments; capstone projects

  7. Timeline for Piloting and Full Implementation • 2013-2014: District-wide training, development of assessments, and piloting • June 2014 report: all educators in the district have 2 DDMs to be implemented fully in SY2015 • 2014-2015: All DDMs are implemented; scores are divided into High, Moderate, and Low and stored locally • 2015-2016: Second-year data is collected, and all educators receive an Impact Rating, based on 2 years of data for two DDMs, that is sent to DESE

  8. District Determined Measures Regulations • Every educator will need data from at least 2 different measures • Trends must be measured over a period of at least 2 years • One measure must be taken from state-wide testing data such as MCAS, if available (grades 4-8 ELA and Math SGP for classroom educators) • One measure must not be MCAS; it must be a District Determined Measure, which can include local assessments, Galileo, and normed assessments (DRA, MAP, SAT)

  9. NEW!Determining Educator Impact on Each DDM • Evaluator and educator meet. Evaluator determines whether students demonstrated high, moderate, or low growth on each DDM. • Evaluator shares the resulting designations of student growth with educator. • Educators confirm rosters. • Must be on roster by 10/1 and remain on roster through last day of testing. • Must be present for 90% of instructional time.

  10. Performance & Impact Ratings • Performance Rating (Exemplary, Proficient, Needs Improvement, Unsatisfactory): obtained through data collected from observations, walk-throughs, and artifacts • Impact Rating (High, Moderate, Low): based on trends and patterns in student learning, growth, and achievement over a period of at least 2 years, with data gathered from DDMs and state-wide testing

  11. NEW! Determining a Student Impact Rating • Introduces the application of professional judgment to determine the Student Impact Rating • Evaluator assigns rating using professional judgment. • Evaluator considers designations of high, moderate, or low student growth from at least two measures in each of at least two years. • If rating is low, evaluator meets with educator to discuss • If rating is moderate or high, evaluator/educator decide if meeting is necessary.

  12. Student Impact Rating Determines Plan Duration for PST (not future employment) [Chart: Impact Rating on Student Performance, Massachusetts Department of Elementary and Secondary Education]

  13. NEW! Intersection of Ratings • Reinforces the independent nature of the two ratings. • Exemplary or Proficient matched with Moderate or High = 2-Year Self-Directed Growth Plan • Exemplary/Moderate and Exemplary/High = recognition and rewards, including leadership roles, promotions, additional compensation, public commendation, and other acknowledgements. • Proficient/Moderate and Proficient/High = eligible for additional roles, responsibilities, and compensation. • Exemplary or Proficient matched with Low = 1-Year Self-Directed Growth Plan • Evaluator’s supervisor confirms the rating. • Educator and evaluator analyze the discrepancy. • May impact Educator Plan goals. • Student Impact Rating informs the self-assessment and goal-setting processes.
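The plan-duration rule on this slide can be sketched as a small decision function. This is a hypothetical illustration, not DESE's implementation: the function name and rating strings are invented, and the rule shown covers only the Exemplary/Proficient cases the slide describes.

```python
# Hypothetical sketch of the slide's rule: for educators rated Exemplary or
# Proficient, a Moderate or High Student Impact Rating yields a 2-year
# Self-Directed Growth Plan, and a Low rating yields a 1-year plan
# (with the evaluator's supervisor confirming the rating).

def growth_plan_years(performance: str, impact: str) -> int:
    if performance not in ("Exemplary", "Proficient"):
        raise ValueError("This rule covers only Exemplary/Proficient educators")
    return 2 if impact in ("Moderate", "High") else 1

print(growth_plan_years("Proficient", "High"))  # 2
print(growth_plan_years("Exemplary", "Low"))    # 1
```

Ratings outside this matrix (Needs Improvement, Unsatisfactory) follow different plan rules not shown on this slide.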

  14. Indirect Measures • Indirect measures of student learning, growth, or achievement provide information about students from means other than student work. • These measures may include student record information (e.g., grades, attendance or tardiness records, or other data related to student growth or achievement such as high school graduation or college enrollment rates). • To be considered for use as DDMs, a link (relationship) between indirect measures and student growth or achievement must be established. • ESE recommends that at least one of the measures used to determine each educator’s student impact rating be a direct measure.

  15. Indirect Measure Examples • Consider SST Process for a team: • High school SST team example • RTI team example • High school guidance example • Subgroups of students can be studied (School Psychologist group example) • Social-emotional growth is appropriate (Autistic/Behavioral Program example) • Number of times each student says hello to a non-classroom adult on his or her way to gym or class (Direct) • Number of days (or classes) a student with school anxiety participates • Assess level of participation in a class (Direct) • Improve applications to college • IEP goals can be used as long as they are measuring growth (academic or social-emotional)

  16. Turn and Talk: time to debrief and review the “rules”

  17. Using the 6-phase overview, what are your priorities?

  18. Assessment Quality Requirements and Definitions from DESE: alignment, rigor, comparability, “substantial,” modifications

  19. What are the requirements? • 1. Is the measure aligned to content? • Does it assess what is most important for students to learn and be able to do? • Does it assess what the educators intend to teach? • Bottom line: the “substantial” content of the course • At least 2 standards • ELA: reading/writing • Math: unit exam • Not necessarily a “final” exam (unless it’s a high-quality exam)

  20. 2. Is the measure informative? • Do the results of the measure inform educators about curriculum, instruction, and practice? • Does it provide valuable information to educators about their students? • Does it provide valuable information to schools and districts about their educators? Bottom Line: Time to analyze is essential

  21. Two Considerations for Local DDMs: 1. Comparable across schools • Example: teachers with the same job (e.g., all 5th grade teachers) • Where possible, measures are identical • Easier to compare identical measures • Do identical measures provide meaningful information about all students? • Exceptions: When might assessments not be identical? • Different content (different sections of Algebra I) • Differences in untested skills (reading and writing on a math test for ELL students) • Other accommodations (fewer questions for students who need more time) • NOTE: Roster verification and group size will be considered by DESE

  22. Five Considerations (DESE) • Measure growth • Employ a common administration procedure  • Use a common scoring process • Translate these assessments to an Impact Rating (High-Moderate-Low) • Assure comparability of assessments within the school (rigor, validity).

  23. Two Forms to Adapt to Your School’s Standards • Handout—DDM Proposal form • Simple Excel List for June report

  24. June Report Form (Not Yet Released): Educators Linked with DDMs

  25. Handout Sample

  26. Turn and Talk: If you took an inventory of assessments, what are your next steps?

  27. Calculating Growth Scores: defining growth, measuring growth, calculating growth for a classroom, for a district

  28. Sample Student GROWTH SCORES from the MCAS • TEACHER GROWTH SCORES are developed from student growth scores • Sample scaled score / SGP pairs: 244 / 25 SGP; 230 / 35 SGP; 225 / 92 SGP

  29. Approaches to Measuring Student Growth • Pre-Test/Post Test • Repeated Measures • Holistic Evaluation • Post-Test Only

  30. Pre/Post Test • Description: • The same or similar assessments administered at the beginning and at the end of the course or year • Example: Grade 10 ELA writing assessment aligned to College and Career Readiness Standards at beginning and end of year with the passages changed • Measuring Growth: • Difference between pre- and post-test. • Considerations: • Do all students have an equal chance of demonstrating growth?

  31. Pre-Post Analysis: Cut Scores for L-M-H Growth

  32. Determining Growth with Pre- and Post-Assessments • Cut scores need to be locally determined for local assessments • Standardized assessments use the “Body of Work” protocol, which translates easily to local assessments • First, determine the difference between pre- and post-scores for all students in a grade or course • Then determine what Low, Moderate, and High growth is (local cut scores) • Use the top and bottom 10% to begin as a test case • Check against the Body of Work protocol • Then all scores are reapportioned to each teacher • The MEDIAN score for each teacher determines that teacher’s growth score
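The steps on this slide can be sketched in code: compute each student's gain, set local cut scores from the top and bottom 10% of gains district-wide, then reapportion gains to each teacher and take the median. This is a minimal illustration with invented sample data and an invented cut-score rule; real districts would set cut scores through the Body of Work review, not a fixed percentile.

```python
# Minimal sketch of the pre/post growth calculation described on the slide.
# Teacher names, scores, and the percentile-based cut rule are invented.
import statistics

# (teacher, pre score, post score) for every student in the course
students = [
    ("Smith", 40, 70), ("Smith", 55, 60), ("Smith", 30, 75),
    ("Jones", 50, 55), ("Jones", 45, 80), ("Jones", 60, 62),
]

# Step 1: difference between pre- and post-scores for all students
gains = sorted(post - pre for _, pre, post in students)

# Step 2: local cut scores, top and bottom 10% as a starting test case
low_cut = gains[max(0, int(len(gains) * 0.10) - 1)]
high_cut = gains[min(len(gains) - 1, int(len(gains) * 0.90))]

def growth_band(gain):
    """Classify one gain score against the local cut scores."""
    if gain <= low_cut:
        return "Low"
    if gain >= high_cut:
        return "High"
    return "Moderate"

# Step 3: reapportion gains to each teacher; the MEDIAN gain drives the band
for teacher in sorted({name for name, _, _ in students}):
    teacher_gains = [post - pre for name, pre, post in students if name == teacher]
    median_gain = statistics.median(teacher_gains)
    print(teacher, median_gain, growth_band(median_gain))
```

With only six sample students the cut scores are crude; the slide's point stands, though: the same district-wide gain distribution and cut scores are applied to every teacher's roster, and only the median gain per teacher feeds the H-M-L designation.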

  33. Further Measures Beyond Pre- and Post-Tests: repeated measures, holistic rubrics, post-test only

  34. Repeated Measures • Description: • Multiple assessments given throughout the year. • Example: running records, attendance, mile run • Measuring Growth: • Graphically • Ranging from the sophisticated to simple • Considerations: • Less pressure on each administration. • Authentic Tasks

  35. Repeated Measures Example [Chart: running record, # of errors by date of administration]
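A simple way to turn a repeated measure like the running record above into a growth number is to fit a trend line to the scores. This sketch uses invented error counts and a plain least-squares slope; it is one of the "sophisticated to simple" graphical options the previous slide mentions, not a DESE-prescribed method.

```python
# Hypothetical sketch: trend on a repeated measure (running record errors).
# A negative slope means errors are falling, i.e., reading accuracy is growing.
administrations = [1, 2, 3, 4, 5]   # order of administration across the year
errors = [12, 10, 9, 6, 4]          # invented error counts per running record

n = len(administrations)
mean_x = sum(administrations) / n
mean_y = sum(errors) / n

# Ordinary least-squares slope: sum((x - x̄)(y - ȳ)) / sum((x - x̄)²)
slope = sum((x - mean_x) * (y - mean_y)
            for x, y in zip(administrations, errors)) \
        / sum((x - mean_x) ** 2 for x in administrations)

print(f"Trend: {slope:.1f} errors per administration")
```

A district would still need local cut scores to translate slopes into High, Moderate, and Low growth, just as with pre/post gains.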

  36. Holistic • Description: • Assess growth across student work collected throughout the year. • Example: Tennessee Arts Growth Measure System • Measuring Growth: • Growth Rubric (see example) • Considerations: • Option for multifaceted performance assessments • Rating can be challenging & time consuming

  37. Holistic Example Example taken from Austin, a first grader from Anser Charter School in Boise, Idaho.  Used with permission from Expeditionary Learning. Learn more about this and other examples at http://elschools.org/student-work/butterfly-drafts

  38. Post-Test Only • Description: • A single assessment or data that is paired with other information • Example: AP exam • Measuring Growth, where possible: • Use a baseline • Assume equal beginning • Or use your own historical data • Considerations: • May be only option for some indirect measures • What is the quality of the baseline information?

  39. Post-Test Only: A Challenge to Tabulate Growth • Portfolios • Capstone Projects • AP scores • If you have local histories, you can use them • For example, for a C student in pre-Calculus, an AP score of 1 might be low growth, a 2 or 3 might be moderate, and a 4 or 5 might be high. This is a local determination of growth.
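The slide's AP example amounts to a lookup table keyed by the student's prior performance. The sketch below illustrates that idea; the table values come from the slide's "C student" example, while the other rows, the function name, and the grade categories are invented placeholders a district would replace with its own historical data.

```python
# Hypothetical sketch of a local post-test-only growth determination:
# prior course grade + AP exam score -> growth band. Only the "C" row
# reflects the slide's example; the "B" row is an invented placeholder.
LOCAL_GROWTH_TABLE = {
    "C": {1: "Low", 2: "Moderate", 3: "Moderate", 4: "High", 5: "High"},
    "B": {1: "Low", 2: "Low", 3: "Moderate", 4: "Moderate", 5: "High"},
}

def growth_from_ap(prior_grade, ap_score):
    """Look up the locally determined growth band for one student."""
    return LOCAL_GROWTH_TABLE[prior_grade][ap_score]

print(growth_from_ap("C", 3))  # Moderate, per the slide's example
```

The point of the table is that "growth" on a single post-test is always relative to a baseline, here, the district's own history of how similar students have scored.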

  40. Turn and Talk: Discuss the calculations, security, storage, and fairness of determining local cut scores.

  41. Tools That May Be Helpful: What is important? What does a good assessment look like?

  42. Core Curriculum Objectives(http://www.doe.mass.edu/edeval/ddm/example/)

  43. ELA-Literacy, English 9-12 • Assessment: Hudson High School Portfolio Assessment for English Language Arts and Social Studies • Publisher website/sample: https://wested.app.box.com/s/pt3e203fcjfg9z8r02si • Description: Designed to be a measure of student growth over time in high school ELA and social science courses. Students select work samples to include and upload them to an electronic site. Includes guiding questions for students and scoring criteria. • Scoring: A portfolio rubric that can be adapted for use in all high school ELA and social science courses; generalized grading criteria. Could be aligned to a number of CCOs, depending on the specification of assignments. • More examples: http://www.doe.mass.edu/edeval/ddm/example/

  44. Click this link to see the Blueprint of each assessment http://www.workforcereadysystem.org/index.shtml

  45. Take a sample test • You can take a ten-question sample test once you find the BLUEPRINT for your exam. • To take a sample test: select the Blueprint, then select “EXPERIENCE IT”; you will have to register. • Each assessment provides instructions for giving the assessment. • The on-line assessments may have videos and may require students to put things in sequence. • The questions are not just multiple choice.

  46. Customer Service • Demonstrate professional development skills in a simulated customer service or employment situation. Examples may include: • Job interview • Customer service scenario • Communications • Decision making, problem solving, and/or critical thinking

  47. Safety: Could be a common assessment for areas without a specific assessment

  48. Additional Testing Examples • Massachusetts Model Curriculum Units and Curriculum Embedded Performance Assessment examples: www.doe.mass.edu (you must sign up for them) • Other juried sites that may be helpful: • Most core areas: Engage New York, www.engageny.org • ELA: Achieve, at www.achievethecore.org • Math, ELA, and Literacy: PARCC Online, www.parcconline.org • Rubrics for writing by grade level and writing type: http://www.doe.k12.de.us/aab/English_Language_Arts/writing_rubrics.shtml

  49. MA Model Curricula and Rubrics (CEPAs)
