
Are Data-Based Decisions Promoted Through SLDS Happening at the Teacher Level?


Presentation Transcript


  1. Are Data-Based Decisions Promoted Through SLDS Happening at the Teacher Level?
Mickey Garrison, PhD, Oregon Department of Education
Denise Airola, PhD, Karee Dunn, PhD, Sean Mulvenon, PhD, University of Arkansas & Next Level Evaluation, Inc.

  2. Key Elements Addressed in SLDS Project
• What were the key elements in the design and implementation of the Oregon Direct Access to Achievement (DATA) Project?
  • Explosion of initiatives
  • Integration of initiative elements into the comprehensive Oregon DATA Project
  • Professional development
  • Build capacity and sustainability
• What were the key evaluation considerations?
  • Evaluation questions
  • Implementation and outcome measures

  3. Key Element in Project Design: Responsive to an explosion of initiatives at the state and local level
• Response to Intervention (RTI)
• Positive Behavior Support (PBS)
• Oregon DATA Project / Data Teams
• Marzano / Effective Classroom Strategies
• DuFour / Professional Learning Communities (PLC)
• Effective Behavior and Instructional Support Systems (EBISS)
• Scaling Up Evidence-Based Practices (state level)
• Effective School Practices Network (ESD/county level)
• Coaching

  4. Key Element in Project Design: Integration
Direct Access to Achievement: The Oregon DATA Project
• Data teams
• Data-Driven Decision-Making (CIP & SIP)
• Unwrapping Standards
• Lesson Design
• Effective Teaching Strategies
• Intervention Design
• Progress monitoring
• System of Accountability
(Slide diagram labels: Curriculum, Instruction)

  5. Key Element in Project Design: Professional Development
Living Likert: Read the statement below, then indicate your agreement level with a thumbs-up meter.
"Schools in my state/district are using data to inform their decisions in leadership and instruction."
Scale: Strongly agree to Strongly disagree

  6. Key Element in Project Design: Capacity and Sustainability
Essential steps toward building a PD framework that is focused on capacity and sustainability

  7. How would the educators you work with complete these thoughts?
• I know what I'm doing is working in the classroom because…
• I know this program is helping students because…
• I know our improvement efforts have positively impacted student achievement because…

  8. "A wise man proportions his belief to the evidence." (David Hume)
What evidence do you have to support your perceptions?

  9. Key Considerations for Evaluation
• Evaluation questions
• Implementation
• Outcomes

  10. Key Consideration: Evaluation Questions
• Teacher Impact: Do professional development and support through a job-embedded approach change teachers' DDDM practice compared to that of non-participating teachers?
• Student Impact: Do professional development and support through a job-embedded approach impact student achievement in participating teachers' classrooms compared to non-participating teachers' classrooms?

  11. Key Consideration for Evaluation: Monitoring Implementation
How do we monitor implementation?
• Grant reporting requirements for participants
• Activity log completion by trainers and data team leaders
  • Focus of data team meeting
  • Use of PD content embedded in data team meeting
  • Changes to PD content documented
  • Strengths and concerns
• Observation tool
  • Provides an additional measure of implementation fidelity

  12. Key Consideration for Evaluation: Monitoring Outcomes
Teacher and student measures are based on our 'theory of action'. (Slide diagram; its elements: Strand 3 DDDM content; support for DDDM in data teams and PLCs; improved teacher assessment and DDDM knowledge; change in teacher DDDM efficacy; change in teacher concerns about implementation; change in teachers' instructional practices; improved student achievement.)

  13. Measures Used to Monitor Change in Adults
• Concerns: Stages of Concern Questionnaire (SoCQ)
• Efficacy: Data-Driven Decision Making Efficacy Survey (3DME)
• Knowledge: Knowledge Measure (KM)
• Direct observation: supports what is learned from the three profiles with direct observation of teacher/leader behavior
  • Data Team Observation Protocol

  14. Why Monitor Concerns?
• Concerns: an individual's set of thoughts and feelings related to an innovation (Hall & Hord, 2001)
• Concerns follow a developmental, predictable pattern in individuals faced with change (Conway & Clark, 2003; Fuller & Brown, 1975; Hall & Hord, 2001)
• Stages of Concern Questionnaire (SoCQ) and its short form (SFSoCQ)

  15. Seven Stages of Concern
• Stage 0: Unconcerned
• Stage 1: Informational
• Stage 2: Personal
• Stage 3: Management
• Stage 4: Consequence
• Stage 5: Collaboration
• Stage 6: Refocusing

  16. Why Monitor DDDM Efficacy?
• Efficacy influences motivation: individuals with higher efficacy for a given task are more motivated to engage in and persevere at the task (Bandura & Schunk, 1981; Bouffard-Bouchard, 1990).

  17. How might efficacy for using data to make instructional decisions impact teachers' instruction?
• What would you expect from teachers with high DDDM efficacy?
• What would you expect from teachers with low DDDM efficacy?
Brainstorm 2-3 ways efficacy could impact teachers' behavior or actions in planning and implementing classroom instruction or assessment.

  18. Why measure knowledge? We hypothesized that efficacy and concerns are related to educators' knowledge and skills for DDDM.

  19. Oregon DATA Project Knowledge Measure
• Informs trainers about changes in educators' knowledge and skills related to the interpretation, evaluation, and application of data-related information, particularly state test results
• Provides insight into the relationship between efficacy in DDDM and actual knowledge/skills

  20. Monitoring and Informing Implementation
• DATA Project leaders and trainers were provided with profile reports for the three measures that enabled them to respond to different training/support needs within each region, ESD, and district.
• Direct observations of DATA Teams were used to provide greater detail about implementation progress and challenges.

  21. Pre- and Post-Survey Results for SoCQ, 3DME, and KM, and Recommendations
Pre- and post-results for the 2010-2011 school year, reported for:
• All participants
• Regions
• ESDs
• Districts

  22. Examined change among participants using all three measures (a sketch of the pre-post change computation follows below)
• Changes in Concerns: less resistance
• Changes in Efficacy: relatively higher efficacy
• Changes in Knowledge: relatively more knowledge
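
Pre-post change on the three measures can be summarized per participant and rolled up by entity. Below is a minimal sketch of that computation, assuming a long-format table; the file and column names (participant, entity, measure, wave, score) are hypothetical, not the project's actual data layout.

```python
import pandas as pd

# Hypothetical long-format survey data: one row per participant,
# measure (SoCQ, 3DME, or KM), and wave ("pre" or "post").
df = pd.read_csv("survey_waves.csv")  # columns: participant, entity, measure, wave, score

# Pivot to one row per participant and measure, with pre and post columns.
wide = df.pivot_table(index=["participant", "entity", "measure"],
                      columns="wave", values="score").reset_index()
wide["change"] = wide["post"] - wide["pre"]

# Mean pre-post change by entity (region, ESD, or district) and measure.
print(wide.groupby(["entity", "measure"])["change"].agg(["mean", "std", "count"]))
```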

  23. Each profile represented different considerations for implementation and support for an innovation over time

  24. Disaggregated profiles showed differences in change in implementation concerns within entities. Change in concerns was linked to fidelity of implementation.

  25. There was more variation among districts; ESDs and districts were encouraged to use pre-post results to inform continued implementation.

  26. Key Consideration for Evaluation: Student Outcomes
• A teacher-student connection was not available; the smallest unit of analysis was the school level
• Caveat: variability in teacher-level participation in ODP training within schools
• Split-plot repeated measures design with school as the unit of analysis (a sketch of an equivalent model follows below)
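
As a rough illustration of the split-plot repeated measures design, the sketch below fits the equivalent linear mixed model: a random intercept per school captures the repeated (within-school) factor, year is the within-subjects effect, and ODP participation is the between-subjects effect. The file and column names are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format school file: one row per school and year,
# with the school's mean math z score and program status ("ODP"/"NonODP").
schools = pd.read_csv("school_scores.csv")  # columns: school, year, program, math_z

# Random intercept for school; the year-by-program interaction tests
# whether ODP schools changed differently from NonODP schools over time.
model = smf.mixedlm("math_z ~ C(year) * C(program)", schools, groups="school")
result = model.fit()
print(result.summary())
```

The year-by-program interaction is the quantity of interest here: a significant coefficient corresponds to the gap-closing pattern reported on the next slide.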

  27. Significant increase in math mean school z scores from 2010 to 2011 (a sketch of the z-score construction follows below)
• Math showed a significant interaction between the effect of time and program: the gap between higher performing NonODP schools and ODP schools closed.
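
School mean z scores of the kind reported above are typically built by standardizing student scale scores within year (and usually grade) and then averaging to the school. This is a minimal sketch under those assumptions; the file and column names are hypothetical.

```python
import pandas as pd

# Hypothetical student-level file: school, year, grade, and math scale score.
students = pd.read_csv("student_math.csv")  # columns: school, year, grade, scale_score

# Standardize within year and grade so scores are comparable across cohorts.
grp = students.groupby(["year", "grade"])["scale_score"]
students["z"] = (students["scale_score"] - grp.transform("mean")) / grp.transform("std")

# Average to the school level, the unit of analysis in this evaluation.
school_z = students.groupby(["school", "year"])["z"].mean().reset_index(name="math_z")
print(school_z.head())
```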

  28. Student Outcomes in Terms of Met/Exceeded
• Math cut scores were raised for 2010-11.
• Students in participating ODP schools increased the percent Met or Exceeded at a higher rate than students in NonODP schools.

  29. Evaluation Challenges
• Survey completion rates varied; more reluctance to complete surveys as funding came to a close
• Opportunities for direct observation were limited due to scheduling
• No teacher-to-student link was available
• Limited time to take ODP to full scale within participating schools

  30. Key Elements of Project Design, Implementation and Evaluation
• Integrate your initiatives.
• Connect everything to student learning.
• Use NSDC guidelines when designing your professional development.
• Build capacity and sustainability from the onset.
• Evaluate the effectiveness of your PD on teachers and students. Student achievement is a critical indicator!


  32. For more information:
Mickey Garrison, 541-580-1201, mickeyg@rmgarrison.com
Sean Mulvenon, 479-575-5593, seanm@uark.edu
Denise Airola, 479-575-7397, dairola@uark.edu
Karee Dunn, 479-575-5593, kedunn@uark.edu
