
Assessment Workshop I: Creating and Evaluating High Quality Assessments


Presentation Transcript


  1. Assessment Workshop I: Creating and Evaluating High Quality Assessments. Dr. Deborah Brady

  2. Do Now Good Morning! • Please make sure you sign in (table at left with printer) • Look over the 2 handouts: the PowerPoint and the Agenda/Handout • Sit with team members if possible • Please set your cell phones to vibrate • Coffee and… are at the table at the back; help yourselves; thank you. Central Mass Readiness Center and Tahanto District

  3. Norms • Processing Partners • Movement • Exit Slips • Planning next class • Questions, resources • Deborah Brady dbrady3702@msn.com

  4. Information Overload Ahead!

  5. Agenda • Introductions: Overview • Break at about 10, lunch at about 11:30, session ends at about 3:00 • Morning presentation (with frequent processing breaks) and afternoon time for beginning to plan • High-Quality Assessments (DESE criteria) • Tools to evaluate assessments • Tools to track all educators’ DDMs • Quality Tracking Tool • Educator Alignment Tool • Measuring Student Growth • Direct measures • Local alternatives to determine growth • Pre-/Post, Holistic Rubrics, Measures over time, Post-test only • “Standardization” is an alternative, but not required • Indirect measures • Piloting, preparing for full implementation in SY 2015 • TIME to work

  6. Where Are You in This Journey? “Living” Likert Scale

  7. Carousel Walk/Living Likert Scale 1) CAROUSEL WALK • Take a walk past each of the phases in this process • Put a check to the left of each area that you have addressed (even partially) • Put an ! next to each bullet/category that you have some concerns about • Put a ? next to any area that seems problematic or is unfamiliar to you • Add a + if you see something missing that is a concern 2) LIVING LIKERT SCALE • After your walk, stand by the stage of DDM development where you (and your team, school, or district) are.

  8. Potential as a Transformative Process: When Curriculum, Instruction, or Assessment is changed… (Elmore, Instructional Rounds: “the task predicts performance”)

  9. District Determined Measures • Definition: DDMs are defined as “Measures of student learning, growth, and achievement related to the Curriculum Frameworks, that are comparable across grade or subject level district-wide” • Types of measures: portfolio assessments; approved commercial assessments; district-developed pre- and post- unit and course assessments; capstone projects

  10. The Role of DDMs To provide educators with an opportunity to: Understand student knowledge and learning patterns more clearly Broaden the range of what knowledge and skills are assessed and how learning is assessed Improve educator practice and student learning Provide educators with feedback about their performance with respect to professional practice and student achievement Provide evidence of an educator’s impact on student learning Bottom Line: Time to do this is critically important!

  11. District Determined Measures Regulations: • Every educator will need data from at least 2 different measures • Trends must be measured over the course of at least 2 years • One measure must be taken from state-wide testing data such as MCAS if available (grades 4-8 ELA and Math SGP for classroom educators) • One measure must be taken from at least one District Determined Measure, which can include Galileo or normed assessments (DRA, MAP, SAT)

  12. The Development of DDMs: Timeline • 2013-2014: District-wide training, development of assessments, and pilot • 2014-2015: All educators must have 2 DDMs in place and collect the first year’s data • 2015-2016: Second-year data is collected and all educators receive an impact rating that is sent to DESE

  13. Impact Rating: Performance & Impact Ratings • Performance Rating: obtained through data collected from observations, walk-throughs, and artifacts (4 Standards plus 2 Goals); scale: Exemplary, Proficient, Needs Improvement, Unsatisfactory • Impact Rating: based on trends and patterns in student learning, growth, and achievement over a period of at least 2 years of data gathered from DDMs and state-wide testing; scale: High, Moderate, Low

  14. Student Impact Rating Determines Plan Duration for PST (not future employment) [Figure: Impact Rating on Student Performance, Massachusetts Department of Elementary and Secondary Education]

  15. What kinds of assessments will work for administrators, guidance, nurses, school psychologists? • Use School-wide Growth Measures • Use MCAS growth measures and extend them to all educators in a school • Use “indirect measures” such as dropout rates, attendance, etc., as measures • Use Student Learning Objectives (SLOs) • Or create measures • Pre- and post-tests are generally required to measure growth except with normed assessments

  16. Indirect Measures • Indirect measures of student learning, growth, or achievement provide information about students from means other than student work. • These measures may include student record information (e.g., grades, attendance or tardiness records, or other data related to student growth or achievement such as high school graduation or college enrollment rates). • To be considered for use as DDMs, a link (relationship) between indirect measures and student growth or achievement must be established. • For some educators such as district administrators and guidance counselors, it may be appropriate to use one indirect measure of student learning along with other direct measures; • ESE recommends that at least one of the measures used to determine each educator’s student impact rating be a direct measure.

  17. Indirect Measure Examples • Consider Student Support Team (SST) Process for a team • High school SST team example—increase in-depth studies • Child Study Team example—make the process consistent district-wide • RTI team example—follow the referral process • High school guidance example • Subgroups of students can be studied (School Psychologist group example)—school anxiety • Social-emotional growth is appropriate (Autistic/Behavioral Program example)—saying hello • Number of times each student says hello to a non-classroom adult on his or her way to gym or class • Number of days (or classes) a student with school anxiety participates • Assess level of participation in a class • “Spot-check,” for example every Friday for 15 minutes • Increase applications to college • IEP goals can be used as long as they are measuring growth (academic or social-emotional)

  18. Growth Scores for Educators Will Need to Be Tabulated for All Locally Developed Assessments [Figure: MCAS SGP (for students) in this example; student 4503699 shown with scaled scores and growth percentiles of 244 / 25 SGP, 230 / 35 SGP, and 225 / 92 SGP]

  19. What are the requirements? • 1. Is the measure aligned to content? • Does it assess what is most important for students to learn and be able to do? • Does it assess what the educators intend to teach? • Bottom Line: “substantial” content of course • At least 2 standards • ELA: reading/writing • Math: Unit exam • Not necessarily a “final” exam (unless it’s a high quality exam)

  20. 2. Is the measure informative? • Do the results of the measure inform educators about curriculum, instruction, and practice? • Does it provide valuable information to educators about their students? • Does it provide valuable information to schools and districts about their educators? Bottom Line: Time to analyze is essential

  21. Five Considerations (DESE) • Measure growth • Employ a common administration procedure  • Use a common scoring process • Translate these assessments to an Impact Rating • Assure comparability of assessments (rigor, validity).

  22. Comparability • Comparable within a grade, subject, or course across schools within a district • Identical measures are recommended across a grade, department, course • Comparable across grade or subject level district-wide • Impact Ratings should have a consistent meaning across educators; therefore, DDMs should not have significantly different levels of rigor

  23. Two Considerations for Local DDMs: 1. Comparable across schools • Where possible, measures are identical • Easier to compare identical measures • Do identical measures provide meaningful information about all students? • Exceptions: When might assessments not be identical? • Different content (different sections of Algebra I) • Differences in untested skills (reading and writing on a math test for ELL students) • Other accommodations (fewer questions for students who need more time) • NOTE: Roster verification and group size will be considerations by DESE

  24. “Common Sense” • The purpose of DDMs is to assess Teacher Impact • The student scores, the Low, Moderate, and High growth rankings are totally internal • DESE (in two years) will see • MEPIDS and • L, M or H next to a MEPID • The important part of this process needs to be the focus: • Your discussions about student learning with colleagues • Your discussions about student learning with your evaluator • An ongoing process

  25. Writing to Text and PARCC: The Next Step? • The 2011 MA Frameworks’ shifts to the Common Core • Complex texts • Complex tasks • Multiple texts • Increased writing • A Giant Step? Increase in cognitive load • Mass Model Units—PBL with Performance-Based Assessments (CEPAs) • PARCC assessments require matching multiple texts

  26. 2. Comparable across the District • Aligned to your curriculum (comparable content) K-12 in all disciplines • Appropriate for your students • Aligned to your district’s content • Informative, useful to teachers and administrators • “Substantial” Assessments (comparable rigor): • “Substantial” units with multiple standards and/or concepts assessed. (DESE began talking about finals/midterms as preferable recently) See Core Curriculum Objectives (CCOs) on DESE website if you are concerned http://www.doe.mass.edu/edeval/ddm/example/ • Quarterly, benchmarks, mid-terms, and common end of year exams • NOTE: All of this data stays in your district. Only HML goes to DESE with a MEPID for each educator.

  27. Approaches to Measuring Student Growth • Pre-Test/Post Test • Repeated Measures • Holistic Evaluation • Post-Test Only

  28. Pre/Post Test • Description: • The same or similar assessments administered at the beginning and at the end of the course or year • Example: Grade 10 ELA writing assessment aligned to College and Career Readiness Standards at beginning and end of year with the passages changed • Measuring Growth: • Difference between pre- and post-test. • Considerations: • Do all students have an equal chance of demonstrating growth?
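  Where a district tabulates pre/post growth locally, the arithmetic is simply the difference between the two administrations, later bucketed into the Low/Moderate/High bands used for impact ratings. A minimal sketch of that idea follows; the cut points, sample scores, and function names are hypothetical placeholders, since DESE leaves the actual thresholds and scaling to each district.

```python
# Minimal sketch: pre/post growth tabulation with hypothetical district cut points.
# The thresholds (low_cut, high_cut) are illustrative only; each district sets its own.

def growth_score(pre: float, post: float) -> float:
    """Raw growth is the difference between the post-test and pre-test scores."""
    return post - pre

def impact_band(growth: float, low_cut: float = 5.0, high_cut: float = 15.0) -> str:
    """Bucket a growth score into the Low / Moderate / High bands used for impact ratings."""
    if growth < low_cut:
        return "Low"
    if growth < high_cut:
        return "Moderate"
    return "High"

# Example: three students' pre/post scores on a locally developed assessment (made-up data).
students = [("A", 40, 62), ("B", 55, 58), ("C", 30, 41)]
for name, pre, post in students:
    g = growth_score(pre, post)
    print(name, g, impact_band(g))
```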

  29. Repeated Measures • Description: • Multiple assessments given throughout the year. • Example: running records, attendance, mile run • Measuring Growth: • Graphically • Ranging from the sophisticated to simple • Considerations: • Less pressure on each administration. • Authentic Tasks

  30. Repeated Measures Example [Chart: running record, number of errors plotted against date of administration]
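  For a repeated measure such as the running record above, growth is usually read off the trend across administrations. One simple way to summarize that trend is a least-squares slope over the series; the sketch below is only illustrative, with made-up error counts and an administration index standing in for the dates.

```python
# Minimal sketch: summarizing a repeated measure (e.g., running-record error counts)
# by the slope of a least-squares line across administrations. Data are made up.
from statistics import mean

errors = [12, 11, 9, 8, 6, 5]          # errors at each administration, in date order
x = list(range(len(errors)))           # administration index stands in for the date

x_bar, y_bar = mean(x), mean(errors)
slope = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, errors)) / \
        sum((xi - x_bar) ** 2 for xi in x)

# A negative slope means errors are dropping over time, i.e., the student is growing.
print(f"average change per administration: {slope:.2f} errors")
```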

  31. Holistic • Description: • Assess growth across student work collected throughout the year. • Example: Tennessee Arts Growth Measure System • Measuring Growth: • Growth Rubric (see example) • Considerations: • Option for multifaceted performance assessments • Rating can be challenging & time consuming

  32. Holistic Example Example taken from Austin, a first grader from Anser Charter School in Boise, Idaho.  Used with permission from Expeditionary Learning. Learn more about this and other examples at http://elschools.org/student-work/butterfly-drafts

  33. Post-Test Only • Description: • A single assessment or data that is paired with other information • Example: AP exam • Measuring Growth, where possible: • Use a baseline • Assume equal beginning • Considerations: • May be only option for some indirect measures • What is the quality of the baseline information?

  34. MCAS Has 2 Holistic Rubrics

  35. Post-Test Only: A challenge to tabulate growth • Portfolios • Measuring achievement v. growth • Unit Assessments • Looking at growth across a series • Capstone Projects • May be a very strong measure of achievement

  36. Selecting DDMs: “Borrow, Buy, or Build” • PRIORITY: Use the Quality Tool to assess each potential DDM to pilot this year for your school (one district final copy on a computer) • CCOs will help if this is a district-developed tool • If there is additional time, use the Educator Alignment Tool to begin to look at developing 2 assessments for all educators for next year

  37. “Tools” to Support the Process • For determining what is important (Core Curriculum Objectives) • For determining adequacy for use as DDM (Quality Tool) • “Shifts” of Common Core examples and rubrics • For making sure each educator has 2 DDMs (Educator Alignment) • For assessing rigor (Cognitive Complexity Rubric, CEPA Rubric)

  38. Assessment Quality Checklist Tool (tracker and checklist) • Grade and Subject or Course _____________________ • Potential DDM Name _____________________________ • Potential DDM Source: developed within district; from another district (indicate which one); commercial (indicate publisher) • Type of assessment: on-demand (specific time for administration); performance/project; portfolio; hybrid; other • Item types: selected response (multiple choice); constructed response (written, oral); performance/portfolio; two or more; other • Alignment to Curriculum: well-aligned; moderately aligned; poorly aligned; not yet aligned • Alignment to Intended Rigor: well-aligned; moderately aligned; poorly aligned; not yet aligned

  39. MCAS and PARCC: The Curriculum/Assessment Shifts • MCAS: ORQs; Math: application of concepts; ELA: ONLY comprehension, not writing quality; personal narrative, persuasive essay, literary analysis of any novel; MC questions, some application; emphasis on content • PARCC (shifts to the Common Core): MC at a MUCH HIGHER cognitive level; all writing is assessed as writing (unlike ORQs); NEW text types, writing at a far higher level: narratives, informational text, arguments; Math: processes, depth of understanding, beyond application; emphasis on content plus literacy in ELA, math, social sciences, science, technology

  40. Critically Important! 1) Rigor and 2) Alignment to Curriculum • Rigorous • Aligned to district curriculum • Shifted to new expectations (shifted from MCAS expectations) • Consider PARCC: this is a district decision; gradual increments or giant steps? • 2011 Massachusetts Frameworks / Common Core shifts: complex texts; complex tasks; writing to text; shift in the persuasive essay (formal argument); shift in narrative (more substantial and linked to content); shift in informational text (organization, substantiation) • Math, Science, History/SS frameworks

  41. Understanding the Research Simulation Task • Students begin by reading an anchor text that introduces the topic. • EBSR (evidence-based selected response) and TECR (technology-enhanced constructed response) items ask students to gather key details about the passage to support their understanding. • Students read two additional sources and answer a few questions about each text to learn more about the topic, so they are ready to write the final essay and to show their reading comprehension. • Finally, students mirror the research process by synthesizing their understandings into a piece of writing that uses textual evidence from the sources.

  42. Grade 10 Prose Constructed-Response Item Use what you have learned from reading “Daedalus and Icarus” by Ovid and “To a Friend Whose Work Has Come to Triumph” by Anne Sexton to write an essay that provides an analysis of how Sexton transforms Daedalus and Icarus. As a starting point, you may want to consider what is emphasized, absent, or different in the two texts, but feel free to develop your own focus for analysis. Develop your essay by providing textual evidence from both texts. Be sure to follow the conventions of standard English. Thus, both comprehension of the 2 texts and the author’s craft are being assessed along with the ability of the student to craft a clear argument with substantiation from two texts.

  43. Texts Worth Reading? • Range: Example of assessing reading across the disciplines and helping to satisfy the 70%-30% split of informational text to literature at the 9-11 grade band (Note: Although the split is 70%-30% in grades 9-11, disciplines such as social studies and science focus almost solely on informational text. English Language Arts Teachers will have more of a 50%-50% split between informational and literary text, with informational text including literary non-fiction such as memoirs and biographies.) • Quality: The texts in this set about Abigail Adams represent content-rich nonfiction on a topic that is historically significant. • Complexity: Quantitatively and qualitatively, the passages have been validated and deemed suitable for use at grade 11.

  44. Text Types, their Shifts, Rubrics for each

  45. Shifted Analytical Writing • Claims • Evidence • Use of textual evidence • Multiple perspectives

  46. Template for the Argument from They Say/I Say

  47. Templates to scaffold a smoothly written analysis or argument (James Burke) • I Say: I make a claim for the whole argument; I explain what “they say”; I am responsible for organizing the claims, the evidence, and my explanations; I am responsible for making links between/among the sources using transitional sentences and transitional words (In contrast,… Like… Somewhat similar to…) • They Say: what others say about this claim and topic; quoted appropriately; cited appropriately; worked into the whole essay smoothly

  48. Shifted Informational/Explanatory Writing • Conveys information accurately • Serves one or more of the following purposes: • Increases a reader’s knowledge about the subject • Helps readers understand a procedure or process • Provides readers with an enhanced comprehension of a concept (Common Core Appendix A, p. 23)

  49. Shifted Narrative Examples • In the service of information • Science—read an article and retell the story from the perspective of the scientist who disagreed with the evidence • Math—look at the solution to this problem, which contains some errors; create a dialogue, as a peer discussion, in which you tell the student what is good about his work and what he needs to do to improve it • History—read a newspaper article from Lincoln’s time written by one of his rivals; write a narrative of a meeting between this rival and President Lincoln in which Lincoln answers some of the rival’s objections to his policy, based upon the information in the Gettysburg Address
