Presentation Transcript


  1. ALS Design, Build, Review: Using PDE’s Online Tools to Implement the SLO Process. SAS Portal: www.pdesas.org

  2. Navigate to the homeroom page: RIA Homeroom site.

  3. Log in, or register for the site if you are not yet a user. Pause until the entire room is registered, or pair up with a partner.

  4. Home Page for information: Open ALS

  5. The ALS box expands…

  6. Assessment Literacy Series -Orientation Module-

  7. What is Assessment Literacy? A process of designing, building, and reviewing performance measures for use in evaluating student achievement.

  8. Series Objectives • Develop high-quality performance measures for use within the greater educator effectiveness system. • Increase assessment literacy of participants. • Gain a deeper understanding of academic content standards.

  9. Assessment Life Cycle

  10. Overview of the Development Process The Assessment Literacy training contains six (6) modularized components within three (3) action strands: • DESIGN • M1: Design & Purpose Statement • M2: Test Specifications & Blueprints • BUILD • M3: Item Specifications • M4: Scoring Keys & Rubrics • M5: Operational Forms & Administrative Guidelines • REVIEW • M6: Quality Assurance & Form Reviews

  11. Development Process Purpose Statement • Explain why measures are made, what they will measure, and how the results will be used. Targeted Standards • Select content standards associated with the performance measure.

  12. Development Process (cont.) Test Specifications and Blueprints • Outline the types of items being used, point values, and depth of knowledge (DoK) distribution. • Guide item development by focusing items/tasks on discrete aspects of the content standards. Item/Task Specifications • Articulate the type and characteristics of items/tasks that align to the test specifications and blueprints.

  13. Development Process (cont.) Scoring Keys & Rubrics • Provide answers for multiple choice items and detailed scoring rubrics for constructed response tasks. Operational Forms • Organize and sequence items/tasks within an outlined format that includes item tags (to uniquely identify each item/task).
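
The slide above mentions item tags only in passing. As a purely illustrative sketch (Python), a team might encode subject, grade, targeted standard, and a sequence number into each tag so that every item/task on an operational form can be uniquely identified; the tag format and the standard code below are assumptions, not a PDE convention.

```python
# Purely illustrative: one way a team might build unique item tags that encode
# subject, grade, targeted standard, and a sequence number.
# The tag format and the standard code are assumptions, not a PDE convention.

def make_item_tag(subject: str, grade: int, standard: str, seq: int) -> str:
    """Return a human-readable tag such as 'SS-G07-CC.8.5.A-003'."""
    return f"{subject}-G{grade:02d}-{standard}-{seq:03d}"

print(make_item_tag("SS", 7, "CC.8.5.A", 3))  # -> SS-G07-CC.8.5.A-003
```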

  14. Development Process (cont.) Administrative Guidelines • Provide detailed instructions for how to prepare, administer, and collect results. Quality Assurance & Form Reviews • Review all operational forms, scoring keys/rubrics, and guidelines to ensure pre-established quality expectations have been met. • Determine the overall rigor and alignment at the item/task-level and for the overall operational form.

  15. Helpful Materials Guides • Handout #1 – Purpose Statement Examples • Handout #2 – Targeted Standards Example • Handout #3 – Test Specifications & Blueprint Example • Handout #4 – Depth of Knowledge (DoK) Chart • Handout #5 – Item Examples • Handout #6 – Scoring Key Example • Handout #7 – Rubric Examples • Handout #8 – Operational Form Example • Handout #9 – Administrative Guidelines Example • Handout #10 – Quality Assurance Checklist

  16. Helpful Materials (cont.) Templates • Template #1 – Purpose Statement • Template #2 – Targeted Standards • Template #3 – Test Specifications & Blueprint • Template #4 – Scoring Key • Template #5 – Operational Form • Template #6 – Administrative Guidelines

  17. Helpful Materials (cont.) Stuff “Smart Book” • Provides techniques and technical criteria for developing measures. Performance Rigor Checklist • A quick screening tool used to examine item/task quality.

  18. Principles of Well-developed Measures Measures must: • Be built to achieve the designed purpose • Produce results that are used for the intended purpose • Align to targeted content standards • Contain a balance between depth and breadth of targeted content • Be standardized, rigorous, and fair • Be sensitive to testing time and objectivity • Be valid and reliable

  19. Evaluating Measures Quality Assurance Rubrics • Three (3) versions: Teacher, District, and State or Vendor • Comprehensive tools that evaluate the technical process used in developing performance measures. • Organized into strands and provide an overall rating.

  20. Decision-Making Approaches • Build Consensus Approach • All opinions expressed • Ideas of all participants gathered and synthesized • Group agreement reached • Team Leader Approach • Guiding questions • Group discussion about the pros and cons • Create a “best fit” course of action

  21. Think-Pair-Share • 5 minutes to think about the statement below and write down your thoughts • 5 minutes to share your ideas with a partner • 10 minutes to share with your group “The Assessment Literacy process will help me as an educator by _______________.”

  22. Summary & Next Steps Summary Orientation Module • Introduced the Assessment Literacy process used to create high quality measures of student achievement. Next Steps Module 1: Design & Purpose Statement • Designing performance measures, including creating purpose statements and selecting academic content standards.

  23. Assessment Literacy Series -Module 1- Design & Purpose Statement

  24. Objectives Participants will: • Develop a Purpose Statement that states the “what”, “how”, and “why” aspects of the performance measure being developed. • Determine the targeted content standards to be measured.

  25. Helpful Tools • Participants may wish to reference the following: Guides • Handout #1 – Purpose Statement Examples • Handout #2 – Targeted Standards Example Templates • Template #1 – Purpose Statement • Template #2 – Targeted Standards Other “Stuff” • Content standards associated with the applicable grade-levels/spans; Common Core Standards • Textbooks, teacher guides

  26. Outline of Module 1 Module 1: Design & Purpose Statement Purpose Statement Targeted Standards

  27. Purpose Statement

  28. Purpose Statement [Handout #1] • Statement outlining what the performance measure is measuring • Statement about how the results (scores) can be used • Statement of why the performance measure was developed

  29. Purpose Statement (WHAT) The Social Studies assessment is intended to measure student proficiency of grade-level expectations in the sequence of the district’s curriculum. (HOW) This grade-level assessment is provided to all students as a post-test. (WHY) Scores are reported to the district and used as a part of a comprehensive teacher evaluation program.

  30. Process Steps [Template #1] • Each team member will work independently to create a statement about the performance measure in terms of the content standards it will purport to measure. • Build consensus by focusing on each aspect of the statement: What, How, Why • Draft a sentence reflecting the group’s consensus for each aspect and review as a group. • Merge each sentence to create a single paragraph “statement”. Again, review to ensure the statement reflects the group’s intent. • Finalize the statement and double-check for editorial soundness.

  31. QA Checklist • Statement is clear and concise; Free of technical jargon. • Statement identifies what the performance measure is designed to measure; Grade-level/subject area/course. • Statement articulates how the information from the performance measure is intended for use; Provides insight about what the scores mean.

  32. Targeted Content Standards

  33. Targeted Standards [Handout #2] Choosing targeted standards means: • Selecting certain standards for use with the performance measure being developed. • Identifying standards representing the “big ideas” within the content area.

  34. Targeted Standards… • are a refined list of the content standards. • represent the essential knowledge and skills that students are expected to acquire. • are the standards that educators will spend the most time on. • create transparency for families and the community about what is most important for student success. • become the identified content standards used to create the measures.

  35. Selection Criteria • Endurance - Will this standard provide students with knowledge and skills that will be of value beyond a single test date? • Leverage - Does this standard provide knowledge and skills that will be of value in multiple disciplines? • Readiness for the next level of learning - Will this standard provide students with essential knowledge and skills that are necessary for success in the next level of instruction?

  36. Targeted Standards Example [Handout #2]

  37. Process Steps [Template #2] • First, copy and paste the content standards into the table (or use an Excel spreadsheet) and then each team member will work independently to apply the three criteria (endurance, leverage, readiness) to his/her targeted content standards. • Build consensus about which identified content standards should be on the team’s draft list. Remember to balance the number of distinct standards selected and the time needed to sufficiently measure them. Place a checkmark in the draft box if the team agrees. • Compare the draft of targeted standards to the “conceptual” blueprint indicating what is likely going to be emphasized on the performance measure. • Review the list of targeted standards and look for gaps and redundancies. Apply the checklist to each standard. • Finalize the list. Place checkmarks in the final box of the standards that will be used to guide the blueprint development.
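
As a purely illustrative companion to these process steps, the sketch below (Python) shows the kind of tally a team member might keep while independently applying the three selection criteria. The standard codes, the ratings, and the two-of-three cutoff are hypothetical; Template #2 remains the actual working document.

```python
# A minimal, hypothetical sketch of the tally a team member might keep while
# applying the three selection criteria (endurance, leverage, readiness).
# Standard codes, ratings, and the cutoff below are illustrative only.

from dataclasses import dataclass

@dataclass
class StandardRating:
    code: str          # content-standard identifier (hypothetical)
    endurance: bool    # value beyond a single test date?
    leverage: bool     # value in multiple disciplines?
    readiness: bool    # needed for the next level of learning?

    def criteria_met(self) -> int:
        return sum([self.endurance, self.leverage, self.readiness])

# One team member's independent ratings (illustrative data).
ratings = [
    StandardRating("CC.8.5.A", endurance=True,  leverage=True,  readiness=True),
    StandardRating("CC.8.5.B", endurance=True,  leverage=False, readiness=True),
    StandardRating("CC.8.5.C", endurance=False, leverage=False, readiness=True),
]

# Candidates for the team's draft list: here, standards meeting at least
# two of the three criteria (the cutoff is a team decision, not an ALS rule).
draft_list = [r.code for r in ratings if r.criteria_met() >= 2]
print("Draft targeted standards:", draft_list)
```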

  38. QA Checklist • Do my targeted standards have endurance? • Do my targeted standards have leverage? • Do my targeted standards prepare students for the next level of learning?

  39. Summary & Next Steps Summary Module 1: Design & Purpose Statement • Developed a purpose statement and chose the targeted standards that the performance measure will be based upon. Next Steps Module 2: Test Specifications and Blueprints • Given the purpose and targeted standards, create a specification table and associated blueprint that will guide item/task development.

  40. Assessment Literacy Series -Module 2- Test Specifications & Blueprints

  41. Objectives Participants will: 1. Develop test specifications that articulate: • Number of items by type • Item point value • Depth of Knowledge (DoK) levels 2. Develop a blueprint that designates: • Items per content standard across DoK levels

  42. Helpful Tools Participants may wish to reference the following: Guides • Handout #3 – Test Specifications and Blueprint Example • Handout #4 – Depth of Knowledge (DoK) Chart Templates • Template #3 – Test Specifications and Blueprint

  43. Outline of Module 2 Module 2: Test Specifications & Blueprints – covers Content Standards, Depth of Knowledge (DoK), Item Type, Test Specifications, Blueprints, and Process Steps

  44. Specification Tables

  45. Test Specifications When developing test specifications consider: • Sufficient sampling of targeted content standards • Aim for a 3:1 items per standard ratio • Developmental readiness of test-takers • Type of items • Multiple Choice (MC) • Short Constructed Response (SCR) • Extended Constructed Response (ECR)/Complex Performance tasks • Time burden imposed on both educators and students

  46. Test Specifications (cont.) When developing test specifications consider: • Cognitive load • Aim for a balance of DoK levels • Objectivity of scoring • Each constructed response item/task will need a well-developed rubric • Weight of items (point values) • Measures should consist of 25-35 total points; 35-50 points for high school • Item cognitive demand level/DoK level • Measures should reflect a variety of DoK levels as represented in the targeted content standards
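
As a rough illustration of these considerations, the sketch below (Python) checks a small, hypothetical draft specification against the guidance quoted above: the total point range, roughly three items per targeted standard (from the previous slide), and a spread of DoK levels. The item list is made-up example data, not part of the ALS materials.

```python
# Rough check of a draft test specification against the guidance above:
# total points, items per targeted standard (~3:1), and DoK spread.
# The item list is a hypothetical excerpt; a full form would carry more items.

from collections import Counter

# Each entry: (targeted standard, item type, point value, DoK level)
items = [
    ("STD-1", "MC",  1, 1), ("STD-1", "MC",  1, 2), ("STD-1", "SCR", 2, 3),
    ("STD-2", "MC",  1, 1), ("STD-2", "MC",  1, 2), ("STD-2", "SCR", 3, 3),
    ("STD-3", "MC",  1, 2), ("STD-3", "SCR", 2, 2), ("STD-3", "ECR", 4, 3),
]

total_points = sum(points for _, _, points, _ in items)
items_per_standard = Counter(std for std, _, _, _ in items)
dok_distribution = Counter(dok for _, _, _, dok in items)

print(f"Total points: {total_points} "
      f"({'within' if 25 <= total_points <= 35 else 'outside'} the 25-35 target)")
for std, count in items_per_standard.items():
    print(f"{std}: {count} item(s){'' if count >= 3 else '  <- fewer than 3'}")
print("DoK spread:", dict(dok_distribution))
```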

  47. Test Specifications Example [Handout #3]

  48. Test Specifications Example (cont.) [Handout #3]

  49. Multiple Choice Items • Stem (question) with four (4) answer choices • Typically worth one (1) point towards the overall score • Generally require about one (1) minute to answer Pros • Easy to administer • Objective scoring Cons • Students can guess the correct answer • No information can be gathered on the process the student used to reach the answer

  50. Short Constructed Response Items • Require students to apply knowledge, skills, and critical thinking abilities to real-world performance tasks • Entail students "constructing" or developing their own answers in the form of a few sentences, a graphic organizer, or a drawing/diagram with explanation • Worth 1-3 points Pros • Allow for partial credit • Provide more details about a student’s cognitive process • Reduce the likelihood of guessing Cons • Greater scoring subjectivity • Require more time to administer and score
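
To make the time and point guidance concrete, here is a back-of-the-envelope estimate (Python) for a hypothetical item mix. The one minute per multiple-choice item comes from the slide above; the per-item minutes for constructed-response tasks and the ECR point value are assumptions for illustration only.

```python
# Back-of-the-envelope testing-time and point totals for a hypothetical mix.
# ~1 minute per MC item is from the slides; SCR/ECR minutes and the ECR
# point value are assumptions for illustration only.

item_counts      = {"MC": 20, "SCR": 3, "ECR": 1}   # hypothetical mix
minutes_per_item = {"MC": 1,  "SCR": 5, "ECR": 10}  # SCR/ECR times assumed
points_per_item  = {"MC": 1,  "SCR": 3, "ECR": 4}   # SCR worth 1-3 pts; ECR assumed

total_minutes = sum(item_counts[t] * minutes_per_item[t] for t in item_counts)
total_points  = sum(item_counts[t] * points_per_item[t]  for t in item_counts)

print(f"Estimated administration time: {total_minutes} minutes")   # 45
print(f"Total points: {total_points} (guideline: 25-35; 35-50 for high school)")  # 33
```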
