
Oregon DATA Project

Oregon DATA Project: Fall 2010 Baseline Survey Results. Denise Airola, Ph.D.(c), and Karee Dunn, Ph.D. Essential Questions: On average, what are the baselines for the state and each region for concerns, efficacy, and knowledge? What are the implications of the baseline results?



Presentation Transcript


  1. Oregon DATA Project Fall 2010 Baseline Survey Results Denise Airola, Ph.D.(c) Karee Dunn, Ph.D.

  2. Essential Questions • On average, what are the baselines for the state and each region for concerns, efficacy and knowledge? • What are the implications of the baseline results? • Given the baseline results, what are the suggested next steps for state, regional and local leaders?

  3. Evaluation Questions The purpose of this evaluation is to answer key questions about the project: • Teacher Impact—Do professional development and support through a job-embedded approach change teachers’ DDDM efficacy, knowledge, and practice compared to non-participating teachers? • Student Impact—Do professional development and support through a job-embedded approach impact student achievement in participating teachers’ classrooms compared to non-participating teachers’ classrooms?

  4. Key Consideration for Evaluation: Monitoring Outcomes Teacher and student outcome measures are based on a ‘theory of action’

  5. Practice What You Preach—If today’s teachers are expected to engage in DDDM, shouldn’t we do the same? We examined teachers’ concerns about DDDM, DDDM efficacy, and DDDM knowledge.

  6. What do the baseline data tell us? State and regional results for Concerns Efficacy Knowledge

  7. What are Concerns? • Concerns are an individual’s thoughts and feelings regarding an innovation. • Concerns profiles help improve professional development. • The Stages of Concern profile provides a snapshot of an individual’s or a group’s concerns across seven stages: • Stage 0: Awareness • Stage 1: Informational • Stage 2: Personal • Stage 3: Management • Stage 4: Consequence • Stage 5: Collaboration • Stage 6: Refocusing

  8. Interpreting Concern Profiles • Theoretically, concerns follow a predictable pattern. • The two broad categories are Nonuser and User: • Nonuser profile (High Stage 0, 1, or 2) • User profile (High Stage 3, 4, 5, or 6)

  9. Interpreting Concerns Profiles: Nonuser

  10. Interpreting Concerns Profiles: User

  11. Interpreting Concern Profiles • However, people’s feelings and thoughts aren’t always so “neat.” • So, we follow a few guidelines to interpret teachers’ responses. • First, we identify the highest stage that is roughly 10 points greater than the adjacent stages. • Next, we look for a secondary peak (more rare with a User profile).
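The peak-finding guidelines above can be sketched in code. This is an illustrative sketch only, not part of any Oregon DATA Project or published CBAM tooling; the function name and the example scores are made up, while the 10-point guideline and the Nonuser (Stages 0–2) vs. User (Stages 3–6) split follow the slides.

```python
# Illustrative sketch of the profile-interpretation guidelines above:
# given seven Stages of Concern percentile scores, find the highest stage,
# check whether it sits roughly 10 points above its adjacent stages,
# and classify the profile as Nonuser (peak at Stage 0-2) or User (Stage 3-6).

STAGES = ["Awareness", "Informational", "Personal", "Management",
          "Consequence", "Collaboration", "Refocusing"]

def interpret_profile(scores, threshold=10):
    """scores: list of 7 percentile scores, one per Stage 0-6."""
    if len(scores) != 7:
        raise ValueError("Expected one score per stage (7 total).")
    peak = max(range(7), key=lambda s: scores[s])
    # Adjacent stages that actually exist (Stage 0 and 6 have only one neighbor).
    neighbors = [scores[s] for s in (peak - 1, peak + 1) if 0 <= s <= 6]
    clear_peak = all(scores[peak] - n >= threshold for n in neighbors)
    profile = "Nonuser" if peak <= 2 else "User"
    return {"peak_stage": peak, "stage_name": STAGES[peak],
            "clear_peak": clear_peak, "profile": profile}

# Hypothetical Nonuser profile with a high Stage 1 (Informational) peak.
result = interpret_profile([75, 90, 70, 45, 30, 35, 40])
print(result)
```

A real Stages of Concern analysis would also look for secondary peaks; the sketch only reports the single highest stage.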

  12. Nonuser Profiles • High Stage 1 (Informational) Profile • They feel that they do not know enough about the innovation and need more general information. • What is it? What does it do? What does it involve? • Indicates lack of a sense of personal involvement. • High Stage 2 (Personal) Profile • Concerned about the impact on self – personal status, workloads, rewards, etc.

  13. Profiles: Two Considerations FIRST The “one-two split” occurs when Stage 2 is of equal or greater value than Stage 1. • Personal concerns outweigh informational concerns. • Need to help them understand their role in the innovation and how it affects them. • If this is not done prior to providing innovation-specific information, they will be less open to what you have to say.

  14. Nonuser Profiles: Two Considerations

  15. Nonuser Profiles: Two Considerations SECOND “Tailing-up” occurs when Stage 6 is greater than Stage 5. • Indicates resistance to the innovation. • They think they know something that will work better. • A “tailing-up” that is greater than 1% indicates strong resistance, and that they are likely to undermine attempts to incorporate the innovation. • Any “tailing-up” is cause for concern and action.
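The two Nonuser-profile checks described above, the one-two split (Stage 2 equal to or greater than Stage 1) and tailing-up (Stage 6 greater than Stage 5), reduce to simple comparisons. A minimal sketch, assuming scores are a list of seven stage percentiles indexed 0–6; the function names and example profile are invented for illustration.

```python
# Illustrative checks for the two Nonuser-profile considerations;
# function names are made up for this sketch, not from CBAM tooling.

def has_one_two_split(scores):
    """One-two split: Stage 2 (Personal) >= Stage 1 (Informational).
    Personal concerns outweigh informational concerns, so personal
    concerns should be addressed before innovation-specific information."""
    return scores[2] >= scores[1]

def has_tailing_up(scores):
    """Tailing-up: Stage 6 (Refocusing) above Stage 5 (Collaboration),
    a signal of resistance to the innovation."""
    return scores[6] > scores[5]

# Hypothetical nonuser profile showing both warning signs.
profile = [80, 85, 88, 50, 35, 30, 45]
print(has_one_two_split(profile))  # Stage 2 (88) >= Stage 1 (85)
print(has_tailing_up(profile))     # Stage 6 (45) > Stage 5 (30)
```

Both flags would be raised for this profile, suggesting personal concerns and resistance should be addressed before pushing ahead with implementation.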

  16. Nonuser Profiles: Two Considerations

  17. User Profiles • The highest peak is at Stage 3, 4, 5, or 6 (secondary peaks are more rare, but should be attended to). • The interpretation of User profiles is simpler and guided by the definition of the concern.

  18. User Profile • High Stage 3 (Management) Profile • Most concerned with management, time, and logistical issues. • High Stage 4 (Consequence) Profile • Most concerned with impact on students and how to evaluate that impact. • Also concerned with how to increase positive student outcomes.

  19. User Profile • High Stage 5 (Collaboration) Profile • Interested in working with others with regard to use of the innovation. • High Stage 6 (Refocusing) Profile • Interested in possibilities of making changes to the innovation or replacing it with something better. • Have definite ideas about something that they think will work better.

  20. State Summary Results—Adult Indicators Baseline Spring and Fall 2010

  21. Concerns About Implementation:

  22. State Results for All Respondents

  23. State Results for All Respondents • The overall results for the state present a nonuser profile with a fairly severe tailing-up. • To address this, PD must • Further familiarize teachers with what is expected of them and why. • Show how this is a better option than other routes they may consider.

  24. DDDM Efficacy What do teachers and leaders believe about their ability to make decisions using data?

  25. The Little Engine That Could • DDDM Efficacy is a teacher’s perception of his or her ability to successfully engage in DDDM. • Why is efficacy important? • Impacts pedagogical decision making. • Impacts persistence in the face of obstacles. • Like concerns, efficacy is a predictor of teacher behavior. • And like concerns, IT IS TRAINABLE!!!

  26. DDDM Efficacy • Data-Driven Decision Making Efficacy (3DME) Survey • Efficacy for Data Identification and Access • Efficacy for Data Technology Use • Efficacy for Data Interpretation, Evaluation, and Application • Efficacy for Data-Driven Decision Making

  27. Efficacy for Using Data to Drive Decisions

  28. Overall, respondents report more confidence in using technology than in identifying appropriate data or in interpreting, evaluating, and applying data to decisions.

  29. State Results for All Respondents • The overall results for the state present a low efficacy profile. • To address this, PD must • Further support the development of teachers’ understanding of DDDM. • Provide opportunities for success. • Provide models of success.

  30. Knowledge Measure: What about DDDM knowledge and skills? “A wise man proportions his belief to the evidence.” —David Hume

  31. Oregon DATA Project Knowledge Measure • Assesses specific skills and knowledge in DDDM • Provides information regarding Oregon DATA Project participants’ skills and knowledge in DDDM • Provides another lens for investigating the theory of action

  32. Knowledge Measure Strand 1 • Decision making based on data • Assesses participants’ ability to respond to analysis scenarios with appropriate inferences to support decisions. • Assesses participants' use of summative and formative data in making decisions, and the connection of testing results to decisions about instruction.

  33. Knowledge Measure Strand 2 • Interpretation, evaluation, and application of data-related information • Assesses participants' ability to make meaning from data collected and reported for decision making. • Ability to interpret summative and formative test results, including standard OAKS online reports. • Assesses use of critical analysis tools, and the adult factors (cause data) as they may relate to student effect data.

  34. Knowledge—The Content Matters

  35. State Results for All Respondents • The overall results for the state present a low knowledge profile on both scales. • To address this, PD must • Further familiarize teachers with the knowledge and skills necessary to be successful in DDDM. • Ongoing instruction is critical.

  36. How do you explore potentially related factors? The Wagon Wheel: the ultimate in triangulating!

  37. Concerns

  38. State Results for All Respondents • The overall results for the state for concerns, efficacy, and knowledge converge to indicate one critical target for PD in 2011. • Ongoing instruction for the development of DDDM understanding: • Awareness • Knowledge • Skills

  39. Region 1 Multnomah ESD Clackamas ESD Columbia Gorge ESD

  40. Region 1 Review • The regional profile appears to have a masking effect, due to the diversity of outcomes for the ESDs in Region 1. • Implication: The individual ESD profiles must be examined.

  41. Region 2 Northwest Regional ESD Willamette ESD

  42. Region 2 Results • Region 2 results present • A nonuser profile with a fairly severe tailing-up. • A low efficacy and knowledge profile. • The PD goal should be ongoing instruction for the development of understanding: • Awareness • Knowledge • Skills
