
Hitting a Moving Target: A Discussion of Ten Alignment Studies for AA-AAS

National Conference on Student Assessment, Los Angeles, CA, June 23, 2009.


Presentation Transcript


  1. Hitting a Moving Target: A Discussion of Ten Alignment Studies for AA-AAS. National Conference on Student Assessment, Los Angeles, CA, June 23, 2009

  2. Introduction

  3. Hitting a Moving Target: A Discussion of Ten Alignment Studies for AA-AAS. Panel of States: Toni Bowen, Georgia Department of Education; Linda Turner, South Dakota Department of Education; Charlene Turner, Wyoming Department of Education; Bill Herrera, Wyoming Department of Education. SALLSA Research Team: Sue Bechard, Measured Progress; Patricia Almond, University of Oregon; Meagan Karvonen, Western Carolina University; Shawnee Wakeman, University of North Carolina at Charlotte

  4. Topics for this session • Introduction to the alignment study model • Results of 10 studies • Case studies of states in the SALLSA project • Panel of states: Experiences and advice • Discussion • Questions and Reflections

  5. State Academic Learning Links with Self-Evaluation for Alternate Assessment (based on Alternate Academic Achievement Standards)

  6. Links for Academic Learning (LAL): Alignment Model for AA-AAS. Flowers, Wakeman, Browder, & Karvonen (Spring 2009). Links for Academic Learning (LAL): A conceptual model for investigating alignment of alternate assessments based on alternate achievement standards. Educational Measurement: Issues and Practice, 28(1), 25-37.

  7. LAL Model of Alignment of educational components specific to AA-AAS: Academic Content Standards • Prioritized/Extended Academic Content Standards • Enacted Curriculum • Professional Development • Alternate Assessment

  8. 8 Criteria in the LAL model: Academic content 1. The content is academic 2. Content is referenced by grade level 3. Link with grade-level content and level of performance 4. The content differs in range, balance, and depth of knowledge (DOK) 5. Differentiation of content across grade levels or grade bands 6. Expected achievement of students is grade-referenced academic content

  9. 8 Criteria in the LAL model: Focus on SWD 7. Barriers to performance 8. Instructional program promotes learning in the general curriculum • Professional development • Program quality • Curriculum Indicators Survey

  10. SALLSA Project Phases: Phase 1 (information gathering & LAL study) → share LAL results, self-evaluation, and planning → implement Phase 2 plan → Phase 2 continues → evaluate changes (finish case study) → reporting and reflection

  11. 10 Studies

  12. What evidence was rated for alignment? • Content standards, extended (if applicable) • Performance tasks, a rating scale, & portfolios • Teacher perceptions, instructional materials, and test documentation

  13. Content Areas Studied Using the LAL Model

  14. Sample of Alignment Studies for this Investigation: 10 LAL studies representing 9 states • Portfolios = 4 • Performance tasks = 4 • Combination AAs = 2 (rating scale plus evidence; performance tasks and portfolios) • 7 studies with extended content standards

  15. Sample of Alignment Studies C = Combination AA, L = Levels of entry

  16. Criterion 1: ACADEMIC: The content is academic and includes the major domains/strands of the content areas as reflected in national standards as defined by the National Council of Teachers of English (NCTE), the National Council of Teachers of Mathematics (NCTM), and the National Research Council. Measurement of Alignment Criterion: 1a. Number and percentage of items and extended standards that are rated academic. 1b. Number and percentage of nonacademic items and extended standards that are rated foundational.

  17. Percent of Extended Standards Rated Academic: Range = 95 to 100%

  18. Percent of Tasks/Items/Entries Rated Academic: Range = 59 to 100%

  19. Criterion 4: RANGE, BALANCE, & DOK: The content differs from grade level in range, balance, and DOK, but matches high expectations set for students with significant cognitive disabilities. Measurement of Alignment Criterion: 4a. Categorical Concurrence, a measure of the extent to which the same categories of content appear in the standards and in the assessment (at least 6 items per standard). 4b. Depth-of-Knowledge Consistency, a measure of the degree to which the knowledge elicited from the student by the assessment is at or above the DOK demand found in the standards.
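As a minimal illustration (not part of the LAL or SALLSA tooling), the two Criterion 4 measures reduce to simple counts over panelist ratings. The data, field names, and standard codes below are hypothetical; the six-items-per-standard threshold follows the categorical-concurrence rule stated on the slide.

```python
from collections import Counter

def categorical_concurrence(items, min_items_per_standard=6):
    """For each standard, report whether the assessment has at least
    min_items_per_standard items aligned to it."""
    counts = Counter(item["standard"] for item in items)
    return {std: n >= min_items_per_standard for std, n in counts.items()}

def dok_consistency(items, standard_dok):
    """Percent of items whose rated DOK is at or above the DOK demand
    of the standard they are aligned to."""
    at_or_above = sum(
        1 for item in items if item["dok"] >= standard_dok[item["standard"]]
    )
    return 100.0 * at_or_above / len(items)

# Hypothetical ratings from an alignment panel
items = [
    {"standard": "ELA.3.1", "dok": 2},
    {"standard": "ELA.3.1", "dok": 1},
    {"standard": "ELA.3.2", "dok": 3},
    {"standard": "ELA.3.2", "dok": 2},
]
standard_dok = {"ELA.3.1": 2, "ELA.3.2": 2}

print(categorical_concurrence(items))       # both standards fall short of 6 items
print(dok_consistency(items, standard_dok)) # 75.0
```

In practice each item is rated by several panelists and the studies report ranges and medians of these percentages across grades, as the following slides do.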

  20. LAL Depth of Knowledge Codes

  21. Median Percent of Items At or Above DOK of ELA Standards (P = Portfolio/Evidence, T = Performance Task, C = Combination )

  22. Items At or Above DOK of ELA Standards

  23. Percent of ELA Items At/Above DOK of Standards * These studies had complete data for each grade.

  24. Special ELA Case Study: E-T07L (multiple levels/forms)

  25. Variation in ELA Depth of Knowledge Ratings—Percent of Observations in Each Category

  26. Criterion 5: DIFFERENTIATION & AGE APPROPRIATE: There is some differentiation in content across grade levels. Measurement of Alignment Criterion: 5a. Description of how the content changes across grade levels (e.g., broader, deeper, prerequisite, new, or identical). 5b. Frequency and percentage of age-appropriate AA-AAS items.

  27. Math Extensions—Differentiation across grades

  28. Math item/task/entry—Differentiation across grades

  29. Items/Tasks/Entries Percent Math Age/Grade Appropriate

  30. SALLSA Case Studies Preliminary Data: Implementation and Use of Findings

  31. Purpose • Overall: describe SALLSA implementation and impact; a component of project evaluation; cross-case analyses that examine contextual influences on how states approached the project and used the findings • Today: preliminary findings only; description of states’ participation in Phases 1 and 2 (detailed for 2 states, summary for 2 states)

  32. Data Sources • Background documents • “Pre” interview • Final LAL and CIS reports • Transcripts/notes from the dissemination call and from other calls and meetings • “Post” interview (in progress) • Research Team communication • Researcher log

  33. Case #1: Georgia. The system: • Portfolio-based system, aligned to grade-level standards • Specific number of entries; a combination of predetermined standards and standards chosen from a range • New AA in 2007 (standards in transition) • Alignment addressed via scoring (fidelity to standard)

  34. Case #1: Georgia. Going into the project: • Already approved through peer review (commended for alignment) • Strengths of GAA: flexibility in designing tasks; functional contexts; focus on progress • Concerns/questions: foundational vs. academic; covering the full range of standards; degree of alignment at lower vs. upper grade levels; interpretation of results (in context); having teachers be more explicit about how they are making the link

  35. Case #1: Georgia LAL findings Crit 1: + Crit 2: + Crit 3: - (below 90% on CC and PC) Crit 4: ~ (generally good; a few problems with range and balance) Crit 5: + age appropriate, - differentiation Crit 6: ~ (+ new learning, achievement vs. PQIs; - accuracy, independence, generalization) Crit 7: + Crit 8: ++

  36. Case #1: Georgia Phase 2

  37. Case #2: Wyoming. The system: • Both standardized performance events and a portfolio; overall scores based on a combination of the two • Aligned to WY Academic Content Standards and Academic Benchmarks (extended standards) • Some content assessed every year; other content rotates on a 3-year cycle • Reading, writing, math = redesigned; science = new

  38. Case #2: Wyoming. Going into the project: • Needed to redo the alignment study for the newly redesigned AA • Interested in LAL because of its connection to the instructional program • Wondered about grade-level appropriateness and instructional materials • New blueprint viewed as a strength: covering all standards over a 3-year cycle • Some emerging evidence of teachers expanding breadth of coverage

  39. Case #2: Wyoming LAL findings Crit 1: + Crit 2: + Crit 3: ~ (strong in math & sci, weak in ELA because of grain size) Crit 4: ~ (generally good; -DOK b/c of restricted range on AA) Crit 5: + age appropriate for SPE, a few problems with PSWs, ~ differentiation (good EXS and science AA, needs improvement in reading comp, some math strands) Crit 6: ~ (+ accuracy, independence, new learning; generalization better in PSWs than SPEs. Independence lower in PSWs) Crit 7: ~ (strive for greater independence, use of appropriate supports like assistive technologies) Crit 8: ~ (good guidance; could use more examples on how to use blueprints to design aligned instructional tasks)

  40. Case #2: Wyoming Phase 2

  41. What else did states do in Phase 2? • Modified state-sponsored professional development • Developed a structure for a state advisory board on AA; convened two meetings • Reviewed LAL and CIS reports • Improved accessibility of information to a range of stakeholders: published more AA-related materials on the website; created a test blueprint for the AA • Strengthened connections between the AA and the rest of the state’s assessment system • Revised the scoring rubric to give more credit when students respond independently and correctly without prompting

  42. Phase 2 observations • Phase 2 meetings with interpretation are important for moving the systems forward: they keep the report from sitting on a shelf, and action follows quickly • States focused on interpretability by questioning each other about their findings, which helped them develop their thinking about their results • Brainstorming about how to interpret the results for different audiences • Interpretability leads to use in their states • Built stronger bonds across state partners; helped them broaden their perspectives by seeing what each other’s systems looked like

  43. General Interpretations • State expectations and value statements drove their actions • AA not marginalized within the assessment system • Driven by high expectations for students, teachers, and assessment quality • Context matters: state politics and SEA priorities can promote or hinder continuous improvement • Alignment findings were not the sole source of improvement; they were integrated into a broad set of priorities • Assessment system maturity plays a (nonlinear) role

  44. Panel of States: Toni Bowen, Georgia Department of Education; Linda Turner, South Dakota Department of Education; Charlene Turner, Wyoming Department of Education; Bill Herrera, Wyoming Department of Education • High points to share based on their own experience • Advice for other states about planning, conducting, and using an AA-AAS alignment study

  45. Self-Study Guide

  46. Alignment State Self-Study Guide (handout) • Purpose and potential uses • Principles: four main and two exemplars • Descriptions • Sample key questions and supporting evidence • Checklist

  47. Discussion

  48. Discussion: Alignment for AA-AAS: where have we been and where are we going? We’ve come a long way. • Academic or not? • Test blueprint? • Looking only for DOK, range, balance, and categorical concurrence (when many times we already knew the answers) • Bias within assessments: can all students participate?

  49. Current ongoing considerations • The link of items to standards and of extended standards to general education standards • The evaluative criteria for DOK, range, balance, etc., and the role the assessment format plays in those decisions • Differentiation across grades/grade bands in performance and content expectations (items and standards) • The criteria for scoring, to support accurate and appropriate inferences about student performance

  50. Future considerations based upon the alignment work of SALLSA and UNCC • Can we develop a common language across states for the contents of the AA-AAS system? For example, how can we better define an item (e.g., grain size) to allow for cross-state discussions related to alignment issues? • What do the outcomes of an alignment study mean in terms of state-led professional development? • How can teachers make the connections about the role that alignment plays between the assessment and their classroom instruction? • Once mastery occurs on one standard, how do teachers judge where instruction for the child should be directed next? • How do students’ opportunities to learn play a role in alignment considerations?
