
Student Learning Objectives: Our Last, Best Hope (for improving classroom assessment, instruction, & learning)


Presentation Transcript


1. Student Learning Objectives: Our Last, Best Hope (for improving classroom assessment, instruction, & learning). Scott Marion & Elena Diaz-Bilello*, Center for Assessment. ACEE Meeting, January 17, 2014. *Thanks to our colleagues Charlie DePascale & Jeri Thompson for many contributions to this presentation.

2. Advance Organizer

3. "The business of schools is to invent tasks, activities, and assignments that the students find engaging and that bring them into profound interactions with content and processes they will need to master to be judged well educated." Schlechty, P.C. (2001). Shaking Up the Schoolhouse. San Francisco: Jossey-Bass.

4. What Do You Think? • How do teachers think about learning goals for students? Be honest! • They follow the sequence of content and lessons in the text • They create learning goals for lessons and units, but not necessarily sequenced throughout the year • They create long-term learning goals and sequence instruction to help students reach the goals • Some combination of the above

5. Defining Student Learning Objectives • SLOs are content- and grade/course-specific measurable learning objectives that can be used to document student learning over a defined period of time. They include: (1) a learning goal focused on a "big idea" of the discipline; (2) assessment(s) to measure students' learning of the goal; (3) a description of the instruction and materials used to provide students with an opportunity to learn the goal; and (4) targets for students and aggregate targets for teachers. • We focus on 1, 2, & 4 today.
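To make the four components concrete, here is a minimal sketch of an SLO represented as a data structure. The field names and example content are our own illustration, not part of any official SLO template:

```python
from dataclasses import dataclass

@dataclass
class StudentLearningObjective:
    """Illustrative container for the four SLO components listed above.

    Field names are hypothetical; actual SLO templates vary by state and district.
    """
    learning_goal: str      # (1) a "big idea" of the discipline
    assessments: list[str]  # (2) measure(s) of students' learning of the goal
    instruction_plan: str   # (3) instruction/materials providing opportunity to learn
    student_targets: str    # (4a) expected performance for students
    teacher_target: str     # (4b) aggregate target for the teacher

# Example with invented content, loosely based on the art SLO shown later (slide 17):
slo = StudentLearningObjective(
    learning_goal="Critique and compare the techniques and styles of artists from two periods",
    assessments=["mid-semester critique draft", "end-of-semester comparative report"],
    instruction_plan="Semester-long sequence of studio work and seminar discussion",
    student_targets="Most students reach the 'makes connections' level of the progression",
    teacher_target="75% of students meet their individual targets",
)
```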

6. It's About the Learning • Yes, SLOs have gained popularity because of teacher evaluation • SLOs provide a vehicle for thinking about critical and long-term learning goals and how students progress towards those learning goals • We haven't had many large-scale initiatives that have provided this sort of dedicated focus on the learning process for both teachers and students

7. Our last, best hope • SLOs are our last, best hope for reining in large-scale assessment-based accountability. • Since the mid-1980s, large-scale, standardized assessment in K-12 education has increased inexorably on virtually every conceivable metric (time, frequency, cost, consequences, uses, etc.). • Ironically, SLOs represent the first crack in the accountability door that has been closed tight on classroom- and school-based measures of student and teacher performance.

8. The Challenge of the "Learning Goals" • We have not seen evidence that teachers and other educators can generate high-quality SLOs without significant practice and training • Identifying meaningful learning goals appears to be quite difficult • We can draw on work such as Wiggins & McTighe's Understanding by Design and the assessment specifications being developed by both large-scale assessment consortia • But this is still a huge challenge!

9. How People Learn (Bransford, Brown, & Cocking, 1999, NRC) • Learning with understanding is facilitated when new and existing knowledge is structured around major concepts and principles of the discipline. • Learners use what they already know to construct new understandings. • Metacognitive strategies and self-regulatory abilities facilitate learning. • Learners' motivation to learn and sense of self affect what is learned. • Participation in social practice is a fundamental form of learning.

10. Deep Understanding • There is a close relationship between truly understanding a concept and being able to transfer knowledge and use it in new situations • Deep understanding is flexible, connected, and generalizable • Deep understanding is challenging to achieve, so it must be focused on the critical components of the discipline

11. Not all SLOs are the same • Some (many) SLOs are being implemented around the country simply as an accountability or compliance mechanism • We need to ensure SLOs are designed to promote deeper learning • Like Understanding by Design (UbD, Wiggins & McTighe), learning goals for SLOs should lead to enduring understandings: • A core idea of the discipline • Useful (critical) for continued learning in that and other disciplines • Measurable, so progress can be judged along the way

12. Learning Goal Examples • Please refer to the handout for examples of learning goals. We think some of these are really strong and others are less so, but still pretty good. • We've also distributed the rubric, developed as part of the Center for Assessment SLO Toolkit (www.nciea.org), that can be used to evaluate the learning goals. • We don't have time today to go through this in detail, but wanted you to have access to the examples.

13. But There's Too Much Stuff! • Teachers legitimately complain that there is too much content to cover and too many processes involved • There is not enough time to learn all required content and skills deeply • Hopefully newer standards have trimmed the amount of content and skills, but… • We need to establish structures, professional learning opportunities, and permission for teachers to facilitate deeper learning • SLOs may serve as a promising structure

14. The Goal and the Journey • Identifying the long-term learning goal for teachers and students helps provide a sense of the structure of the discipline and can support greater understanding of learning theory for teachers and metacognition for students • But the goal is not enough… • Teachers need a clear understanding of the important markers along the way to the goal

15. Learning Progressions (Knowing What Students Know, 2001) • Learning progressions, based on professional judgment and empirical testing, can be a powerful tool to support teacher learning. • "Progress maps describe skills, understandings, and knowledge in the sequence in which they typically develop: a picture of what it means to 'improve' in an area of learning." (Masters & Forster, 1996) • A criterion-referenced growth model. • Learning progressions or progress maps provide an underlying model of learning to coherently link classroom and large-scale assessments.

16. Learning Progressions • Learning progressions provide a great organizing framework for: • Clarifying the learning goals • Establishing targets for student performance • Identifying the set of assessments that can be used both to monitor progress and to evaluate students on learning goals • Facilitating conversations with and among teachers about student work

17. Example: A semester-long SLO framed as a progression
Upper anchor, the SLO learning goal (what students should know and be able to do after instruction): Students will critique and compare the techniques and styles of two artists located in two time periods. Students will write up their critique and provide their comparative analysis in a report.
The "messy middle," descending from the goal:
• Provides evidence supporting a point of view based on information gathered from the image content, the intended impact of seeing the work of art, and the artist's intended meaning, and interprets the work to communicate a deeper connection with the artists.
• Makes connections between the elements observed, the meaning and purpose perceived, and the information gathered regarding works from each artistic period and movement.
• Compares the elements and multimedia language used in an image by each artist.
• Compares the visual evidence produced using either traditional or new forms of materials and tools used by each artist.
Lower anchor (what students should know and be able to do prior to instruction): Identifies methods of representing space, including isometric and aerial perspectives, used by each artist.

18. Discussion Questions • In what ways is the SLO process different from current practices in the classroom? • What do teachers consider now in planning instruction for the year? • Would a typical teacher be able to describe at the beginning of the year the level or type of performance that would be required to earn a final grade of A, B, or C in the course? • To what extent are those grades tied to specific knowledge and skills, status, growth, or other factors? • What is the current balance between focus on content and students, process and outcomes, and short-term activities and long-term goals?

19. Learning and Assessment • Meaningful assessment scores depend on tight linkages between learning targets and assessment design • As we have discussed, the learning goals should expect students to learn rigorous content and use disciplinary skills to apply this content knowledge • In other words, we want students to develop deep understanding of important knowledge and skills

20. Assessing for Deep Understanding • Students cannot develop deep understanding unless they are provided opportunities to demonstrate it on both learning and assessment tasks. In other words, if low-level assessment items are the focus, it is unlikely that teachers will feel the need to teach students to think deeply. • A major component of 21st-century skills is the ability to solve novel problems, and this requires deep understanding! • Assessment conveys what's important to learn (a signal) as well as providing an opportunity to check on students' understanding and evaluate achievement

21. Developing an SLO: The role of assessment • Where does assessment enter into the process of designing an SLO? • Is assessment the driving force in the development of the SLO? • Or are assessment decisions the outcome of prior decisions on content, knowledge, skills, and required evidence?

22. A "normal" sequence: Begin with Content Standards and Curriculum Materials → Identify Priority Knowledge and Skills → Determine and Describe Desired Performance → Identify the assessment(s) to collect evidence of the Desired Performance

23. What happens when the normal sequence is turned upside down? The same steps, now starting from the assessment: Identify the assessment(s) → Determine and Describe Desired Performance → Identify Priority Knowledge and Skills → Content Standards and Curriculum Materials

24. Beginning with the Assessment • Begin with a state or district assessment → Determine what is measured on the assessment → Develop an SLO • In the best-case scenario, there will be alignment between the content standards, curriculum, instruction, and the assessment • In the worst-case scenario, it will be necessary to force-fit an SLO to a misaligned assessment

25. Beginning with the Assessment? • Advantages: • Increased likelihood of a "quality" assessment • Opportunity for common performance expectations across classrooms • Reduces the burden of test development • Reflects the "reality" of common content standards • Disadvantages: • More likely to result in evidence being gathered from a single assessment • Increased likelihood that the assessment and the SLO will be perceived as external and separate from instruction • Potentially less sensitive to the differentiated needs of students in a particular class

26. How Many Assessments? • If the learning goal is really a big idea of the discipline, it is hard to imagine that it can be validly measured with a single assessment • The number of assessments is likely contingent upon: • The scope of the learning goal • The scope of the assessment(s)

27. How Many Assessments? • We need to tie this back to the evidence model • What type and number of assessments will provide the required evidence to support claims that students have mastered the learning goal? • Does this include claims about generalizability too? • If we have multiple assessments, we need to be very thoughtful about combining the results. A simple average will often not be the best approach (see the sketch below). • What else should we consider?
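As one illustration of the combination problem, the sketch below contrasts a simple average with a weighted composite whose weights reflect each assessment's scope and reliability. The weights, assessment names, and scores are invented for illustration; the slide does not prescribe any particular scheme:

```python
def weighted_composite(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted mean of assessment scores (each on a 0-100 scale)."""
    total_weight = sum(weights[name] for name in scores)
    return sum(scores[name] * weights[name] for name in scores) / total_weight

# Invented results from three assessments tied to one learning goal.
scores = {"performance_task": 82.0, "unit_test": 64.0, "portfolio": 90.0}

# A simple average treats a short unit test the same as a semester-long portfolio.
simple = weighted_composite(scores, {name: 1.0 for name in scores})

# A composite weighted toward the broader, more reliable evidence.
weighted = weighted_composite(
    scores, {"performance_task": 0.4, "unit_test": 0.2, "portfolio": 0.4}
)

print(f"simple average: {simple:.1f}, weighted composite: {weighted:.1f}")
# simple average: 78.7, weighted composite: 81.6
```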

28. Assessment Quality for NTSG? • Some have proposed classifying assessments used to evaluate teachers' contributions to student learning in non-tested subjects and grades (NTSG) according to the following scheme: • Type I: Statewide standards-based assessments • Type II: Commercially available standardized summative and interim assessments • Type III: Locally created assessments • What's wrong with this picture?

29. Technical Quality • Many states and others are beginning to examine the technical quality of measures used in educator evaluation determinations • Most of what we have seen focuses on traditional aspects of assessment quality • This is a good start, but do we have to consider any other dimensions for the measurement of SLOs?

30. Criteria for Quality Assessments • Critical criteria and standards for the quality of educational assessments: • Validity • Reliability • Fairness (inclusivity and equitability) • Additional criteria often include: • Manageability or practicality • Relevance • Transparency • Darling-Hammond and colleagues (2013) recently posited: • Assessment of higher-order cognitive skills • High-fidelity assessment of critical abilities • Standards that are internationally benchmarked • Use of items that are instructionally sensitive and educationally valuable • Assessments that are valid, reliable, and fair

31. Dilemma for the use of SLOs • There are few locally developed assessments that meet the technical criteria for quality assessments • There are few local assessment practices and policies in place that support high-quality assessment • It is not practical to develop and maintain external, standardized assessments (i.e., state assessments) for every course, although some states and large districts have adopted this approach • Even if a high-quality external assessment were available for each course, it would only partially fulfill the assessment needs of a good SLO

32. Solution • Increase general assessment literacy to support informed decision-making regarding assessment • Improve understanding of: • Which aspects of each criterion are critical and must receive attention • What steps to improve technical quality are important and practical for different levels (e.g., state, district, classroom) and assessment formats (e.g., performance tasks, multiple-choice, essay) • The impact of decisions, actions, and inaction related to certain quality criteria

33. Assessment Review Tools to Support Quality SLOs • Many states, districts, and the Center have developed review tools to help users judge the quality of tasks and assessments that are created or selected locally • For any of these tools, guidance is still required to help users understand quality criteria and standards • The vast majority of these tools are publicly available on the developers' websites • Also…

34. Accessing Assessments for SLOs • With the growth of technological tools, there are: • Increased opportunities for sharing assessment instruments and practices • Increased opportunities for sharing information and providing support regarding assessment decisions • Increased opportunities for state-supported assessment tools (in contrast to state-mandated assessments) • Large-scale assessment has reached a tipping point or saturation point. It will become obvious that large-scale assessment alone cannot meet the assessment demands of the Common Core or the Darling-Hammond et al. criteria.

35. Performance Targets • Student targets: how well students are expected to perform on the assessment(s) tied to the learning goal • Teacher targets: how well students are expected to perform in the aggregate to contribute to decisions about educator evaluation
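A minimal sketch of how a teacher-level target can be an aggregate of student-level results; the 75% threshold and the list of results are invented for illustration:

```python
def teacher_target_met(student_results: list[bool], threshold: float = 0.75) -> bool:
    """True if the proportion of students meeting their individual targets
    reaches the aggregate threshold set for the teacher."""
    return sum(student_results) / len(student_results) >= threshold

results = [True, True, False, True, True, False, True, True]  # 6 of 8 students met targets
print(teacher_target_met(results))  # 6/8 = 0.75 -> True
```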

36. Major Approaches for Student Targets • Achievement or Status: focus on the level of achievement attained at the end of the interval (e.g., proficiency, mastery, college-and-career readiness) • Growth: focus on a change in performance over the course of the interval • The interval may be within a year (fall to spring), or • The interval may be across years (spring to spring)

37. Status v. Growth: Is it a false dichotomy? • If not false, then perhaps not quite a true dichotomy • In some cases, status (or achievement) at the end of an interval is a proxy for growth. • Implied growth is reflected in status when there is an assumption that the student had not reached a particular level of achievement prior to the beginning of the interval. • That assumption may be stronger in some cases or courses than others. • That assumption may be stronger for a group of students than for an individual student.

38. The Problem of "Growth" • Many want to set the student and teacher targets using some sort of growth framework (pretest/posttest) • In most cases, this makes very little sense • Yes, I know what the Race to the Top (RTTT) requirements say… They are wrong! • If these requirements were followed literally, value-added and student growth percentile models, which are growth models, could not be used!

39. Pretest/Posttest: Two Options
Option 1: Administer the same test twice
• Advantages: "Easy" to see improvement between tests; only need one test
• Disadvantages: Familiarity with the test items; danger of teaching to the test or learning the test; potential loss of generalizability
Option 2: Administer two different tests
• Advantages: Less exposure of test items; greater opportunity for targeted assessment at each point in time; increased support for claims of generalizability
• Disadvantages: Differences in content or difficulty between tests; need multiple tests

40. Dangers of Using Different Tests When Thinking They Are the "Same" (An example)
Set A: 89 × 94, 97 × 89, 20 × 20, 87 × 69
Set B: 10 × 20, 10 × 10, 30 × 10, 99 × 99
Popham (1978). Criterion-Referenced Measurement. Prentice-Hall.

41. Gain Scores: A simple difference • The Siren Song of Simplicity • Gain scores are inherently appealing • They appear simple to compute and to interpret • The Illusion of Precision • A test score, whether expressed as a number correct or a percentage, seems very precise. • The difference between two scores conveys a sense of precision, truth, and objectivity. • Unfortunately, it's just not true! • For a detailed explanation, see: Marion, S.F., DePascale, C., Domaleski, C., Gong, B., & Diaz-Bilello, E. (2012, May). Considerations for analyzing educators' contributions to student learning in non-tested subjects and grades with a focus on Student Learning Objectives. www.nciea.org
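One way to see why the precision is illusory: measurement error compounds when two scores are differenced. Assuming independent errors on the two administrations (a standard psychometric simplification), the standard error of measurement (SEM) of the gain is:

```latex
\mathrm{SEM}_{\text{gain}} = \sqrt{\mathrm{SEM}_{\text{pre}}^{2} + \mathrm{SEM}_{\text{post}}^{2}}
```

For example, if each test has an SEM of 3 points, the gain score's SEM is √(9 + 9) ≈ 4.2 points, so a measured gain of 4 points cannot be distinguished from no growth at all.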

42. Gain Scores: Two popular approaches • Fixed Gain: Determining a fixed number of points for all students, or a subgroup of students, to gain from pretest to posttest. May be based on a norm such as the average gain, or on a criterion such as the number of points needed to remain at the Proficient level from one grade to the next. • "Half the Distance": Determining an individual "gain target" for each student by cutting in half the gap between the pretest score and a fixed point (e.g., 100%).
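A minimal sketch of the two target-setting rules just described; the 10-point gain, the 100% ceiling, and the pretest scores are invented for illustration:

```python
def fixed_gain_target(pretest: float, gain: float = 10.0) -> float:
    """Fixed gain: every student is expected to gain the same number of points."""
    return min(pretest + gain, 100.0)

def half_the_distance_target(pretest: float, ceiling: float = 100.0) -> float:
    """'Half the distance': close half the gap between the pretest and a fixed point."""
    return pretest + (ceiling - pretest) / 2.0

for pre in (40.0, 70.0, 95.0):
    print(pre, fixed_gain_target(pre), half_the_distance_target(pre))
# 40.0 -> 50.0 vs 70.0; 70.0 -> 80.0 vs 85.0; 95.0 -> 100.0 (capped) vs 97.5
```

Note the opposite pathologies: fixed gain runs into the score ceiling for high pretest scores, while half-the-distance demands the largest absolute gains from the lowest-scoring students.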

43. In case it's not bad enough… • Measurement error influences both the pretest and the posttest when setting and evaluating performance targets. • This could be reflected in: • Where you set the performance target for individual students, or whether you establish a target range rather than a fixed point. • Where you set the bar for the number (percentage) of students expected to meet the target. • When setting performance targets: • It is easier to notice gross changes in performance than fine ones. • It is easier to classify performance far from a benchmark or cut score than performance close to it.

44. Implications of Measurement Error (in addition to all of the other caveats) • You must consider the presence of measurement error on both the pretest and the posttest when setting performance targets. • This could be reflected in: • Where you set the performance target for individual students, or whether you establish a target range rather than a fixed point (see the sketch below). • Where you set the bar for the number (percentage) of students expected to meet the target. • When setting performance targets or scoring an SLO: • It is easier to notice gross changes in performance than fine ones. • It is easier to classify performance far from a benchmark or cut score than performance close to it.
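A minimal sketch of a target range rather than a fixed point: only scores clearly above or below the target, relative to the measurement error, are classified. The target, SEM, band width, and scores are invented for illustration:

```python
def classify(score: float, target: float, sem: float, k: float = 1.0) -> str:
    """Classify a score against a target using a +/- k*SEM uncertainty band."""
    if score >= target + k * sem:
        return "clearly met"
    if score <= target - k * sem:
        return "clearly not met"
    return "within the band: too close to call"

for score in (62.0, 69.0, 78.0):
    print(score, classify(score, target=70.0, sem=3.0))
# 62.0 clearly not met; 69.0 within the band; 78.0 clearly met
```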

45. A "Rough Conditioning" Approach for SLOs • Using prior performance information (e.g., from last year) or some early assessments in the current year, we can group students into 3-4 "performance" groups • SLO targets would then be differentiated according to each student's starting group • There are at least two ways to differentiate targets: • Different levels of achievement (e.g., basic, proficient) • Different proportions of students reaching the same target (e.g., 80% of Level 3 students will achieve the target, 65% of Level 2 students will achieve the goal)
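A minimal sketch of the rough conditioning approach: group students by prior performance, then differentiate targets by starting group. The cut points, labels, and targets are invented for illustration:

```python
def performance_group(prior_score: float) -> str:
    """Assign a starting group from prior-year or early-year evidence (invented cuts)."""
    if prior_score < 40.0:
        return "Level 1"
    if prior_score < 60.0:
        return "Level 2"
    if prior_score < 80.0:
        return "Level 3"
    return "Level 4"

# Option A: different achievement targets per starting group.
achievement_target = {
    "Level 1": "basic",
    "Level 2": "proficient",
    "Level 3": "proficient",
    "Level 4": "advanced",
}

# Option B: same target, different expected proportions per group
# (mirroring the 80% / 65% example above).
expected_proportion = {"Level 1": 0.50, "Level 2": 0.65, "Level 3": 0.80, "Level 4": 0.90}

group = performance_group(55.0)
print(group, achievement_target[group], expected_proportion[group])  # Level 2 proficient 0.65
```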

46. Learning trajectory approach • Using prior performance data or early assessments in the current year that are directly related to the learning trajectory, establish starting points for each student • Evaluate the performance of students on specific assessments anchored to each level of the trajectory and determine the location of students on the trajectory • Options for evaluating movement on the trajectory: • Assign points to movement using a ratio scale • Assign points to movement using a pseudo-continuous scale • Key difference from the rough conditioning approach: the underlying points have content-referenced meaning
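A minimal sketch of the trajectory approach using the ratio-scale option: locate each student on a content-referenced trajectory at the start and end of the interval, then score the movement. The level labels loosely echo the art progression in slide 17; the point scheme is invented for illustration:

```python
# Levels ordered from the lower anchor up to the SLO learning goal (upper anchor).
TRAJECTORY = [
    "identifies methods of representing space",
    "compares materials and tools",
    "compares elements and multimedia language",
    "makes connections across periods",
    "interprets and supports a point of view",
]

def level_index(level: str) -> int:
    """Position of a level on the trajectory (0 = lower anchor)."""
    return TRAJECTORY.index(level)

def movement_points(start: str, end: str) -> int:
    """Ratio-scale option: one point per level moved up the trajectory."""
    return level_index(end) - level_index(start)

print(movement_points("compares materials and tools", "makes connections across periods"))  # 2
```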

47. Failure is not an option • Regardless of how teacher evaluation evolves, the SLO process (teachers making informed decisions about instruction, setting rigorous yet realistic goals for students, and making accurate judgments about student performance) is the essence of education. • If we cannot reach the point where we have confidence in the accuracy of performance information generated at the classroom and school level, then what is the point…
