Math and Science Partnership Program

Presentation Transcript


  1. Math and Science Partnership Program: Approaches to State Longitudinal Evaluation. March 21, 2011, San Francisco MSP Regional Meeting. Patty O'Driscoll, Public Works, Inc. The instructional practices and assessments discussed or shown do not constitute an endorsement by the U.S. Department of Education.

  2. What is a longitudinal evaluation? An analysis of data collected:
• on the same set of participants (e.g., teachers or students)…
• using a common measure (e.g., survey data, test scores, or percentiles)…
• regularly (e.g., annually) to assess the extent to which outcomes are changing.
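To make the definition concrete, here is a minimal sketch in Python with pandas. The teacher IDs and scores are hypothetical; the point is the shape of the data (one row per participant per annual wave) and the gain computed on the same common measure across waves.

```python
import pandas as pd

# Hypothetical longitudinal data: one row per participant per annual
# wave, with the same measure (a test scale score) collected each year.
waves = pd.DataFrame({
    "teacher_id": ["T01", "T01", "T01", "T02", "T02", "T02"],
    "year":       [2009, 2010, 2011, 2009, 2010, 2011],
    "score":      [310.0, 325.0, 341.0, 298.0, 305.0, 322.0],
})

# Change from the first to the last wave for each participant:
# the core longitudinal question of whether outcomes are moving.
change = (waves.sort_values("year")
               .groupby("teacher_id")["score"]
               .agg(first="first", last="last"))
change["gain"] = change["last"] - change["first"]
print(change)
```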

  3. Examples of Longitudinal Data
• Student achievement data (nationally normed or state test), collected each spring for students of participating teachers
• Teacher survey data (state or locally developed instrument), collected each fall to evaluate summer professional development
• Classroom observations of teachers, collected in participating and non-participating classrooms to assess implementation of instructional practices during the school year

  4. Key Elements of Longitudinal Studies
• Participants: follow the same students or teachers over time; you need consistent definitions of who to include in the study (e.g., the ability to track dosage of PD)
• Measures/Instrument Development: use the same measures/instruments at each wave of data collection
• Data Collection Methods: the timing of data collection must be the same each year (e.g., achievement data collected each fall, not in the fall one year and the spring the next; see the sketch below)
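As a rough illustration of these elements, the sketch below flags incomplete participants and off-schedule waves. The data and the `collect_month` column are assumptions for illustration, not part of any required format.

```python
import pandas as pd

# Hypothetical wave data: teacher_id identifies the participant;
# collect_month records when each wave was actually collected.
waves = pd.DataFrame({
    "teacher_id":    ["T01", "T02", "T01", "T02", "T01"],
    "year":          [2009, 2009, 2010, 2010, 2011],
    "collect_month": [10, 10, 10, 10, 4],  # 10 = fall, 4 = spring
})

# Participants present in every wave (complete longitudinal cases).
n_waves = waves["year"].nunique()
per_teacher = waves.groupby("teacher_id")["year"].nunique()
print("Complete cases:", list(per_teacher[per_teacher == n_waves].index))

# Waves whose timing drifted from the usual collection month, e.g.,
# data collected in the fall one year and the spring the next.
usual = waves["collect_month"].mode()[0]
off = waves.loc[waves["collect_month"] != usual, "year"].unique()
print("Waves collected off-schedule:", list(off))
```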

  5. Multiple Evaluation Lenses
• Statewide perspective (across MSP grants)
• Local partnership impact (within MSP grantees)
• Federal reporting

  6. Statewide Perspective: longitudinal data to track sets of grantees over time (across MSP grants). Useful to track:
• Different approaches/models of PD
• Regional differences and variation of impact by partnership
• Statewide impact of the grant/PD on teachers and students
• Contributions to federal-level reporting

  7. Local Partnership Impact: longitudinal data to track single MSP grants (within an MSP). Useful to track:
• Local research questions
• The PD model and its impact given the local context for implementation
• Partner effectiveness and improvement in implementation
• Note: not for comparison across MSPs

  8. Why Consider Longitudinal Data Collection and Evaluation?
• Program process monitoring: "systematic and continual documentation of key aspects of program performance that assess whether the program is operating as intended" (Rossi, Lipsey, & Freeman)
• Program outcome monitoring: "the continual measurement of intended outcomes of the program"

  9. Sample Questions
• Program process monitoring: Are the duration and intensity of PD consistent over time? Who is participating, and are targets being met? Has the PD content and emphasis changed, and is it relevant to student and teacher needs?
• Program outcome monitoring: What are the long-term achievement trends of students taught by MSP teachers? Are gains sustained? Are MSP practices becoming embedded in the classroom? Are they supporting achievement?

  10. Who's Your Audience?
• MSP program directors: fidelity to program goals and requirements; duration and intensity of PD across MSPs; teacher and student trends
• Policymakers at the local, state, and federal levels: What's working? Is it worth replicating or expanding? What does it mean for funding decisions? What is the payoff?
• The larger community of practitioners: add to the evidence base of what is working and why

  11. Setting up a Longitudinal Evaluation: Lots of Work!
• Takes resources and expertise: funds are needed to collect and analyze the data.
• Takes time and commitment: planning, monitoring data collection, sharing data, and conducting the analysis are time-consuming.
• Takes a long-term vision and a common set of goals and outcomes: requires a multi-year commitment and up-front planning, plus tweaks and adaptations over time.
• Embed reporting and data collection in the RFA, and require MSPs to reserve grant funds for evaluation.
• Establish common goals across grantees.

  12. Setting up a Longitudinal Evaluation: Start planning now…
• The next MSP state grant competition is an opportunity to set up a data collection and evaluation system.
• It is very difficult to build one retroactively, and building mid-stream is also difficult.
• Build prospectively: establish definitions, measures, and reporting requirements before the program begins.
• Ideas: require (or encourage) grantees to collect a common set of data elements, such as attendance and length/duration of PD, teacher knowledge, teacher practice, and student achievement. Pick the most important to measure across MSPs (one possible record layout is sketched below).
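One way to pin down a common set of data elements is a shared record layout that every grantee reports against each year. The dataclass below is a hypothetical sketch; all field names are illustrative, not a prescribed state format.

```python
from dataclasses import dataclass

# Hypothetical annual record combining the common data elements a state
# might require of every grantee (field names are illustrative).
@dataclass
class AnnualTeacherRecord:
    teacher_id: str            # stable ID so the same teacher is tracked over time
    grant_id: str              # which MSP partnership the teacher belongs to
    year: int                  # reporting wave
    pd_hours: float            # attendance and length/duration of PD
    knowledge_score: float     # teacher content-knowledge measure
    practice_rating: float     # classroom-practice measure (e.g., observation rubric)
    student_mean_score: float  # mean achievement of the teacher's students

record = AnnualTeacherRecord("T01", "MSP-07", 2011, 80.0, 72.5, 3.4, 335.2)
print(record)
```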

  13. Setting up a Longitudinal Evaluation: More Ideas for the MSP Application Notice
• State the direction and focus areas of the MSP competition.
• List common measures, and consider whether each is self-reported or collected through observation or performance (e.g., classroom observations, student assessments).
• Consider what you can require versus how to encourage participation in the evaluation: additional points for participating, larger awards for participating, or a sheltered competition (set-aside funds for testing PD models and innovation).

  14. Setting up a Longitudinal Evaluation: Analyzing and Reporting the Data
• Establish research questions up front; they need to shape the data collection strategy, including data elements, the target population, and data collection timing and process.
• The diversity of MSP grantees must be considered: PD model, subject areas, and grade levels covered.
• Data collection and analysis must align with the emphasis of the grant competition.
• Think about trend data across cohorts of MSP grantees to provide a long-term view of PD practices and changes over time, including subgroup analysis (see the sketch below).
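For the cross-cohort trend and subgroup analysis described above, a minimal pandas sketch might look like the following. Cohorts, PD models, and gains are hypothetical.

```python
import pandas as pd

# Hypothetical grantee-level data: achievement gains by cohort and year,
# broken out by PD model for a simple subgroup analysis.
df = pd.DataFrame({
    "cohort":    [2008, 2008, 2009, 2009, 2008, 2009],
    "year":      [2010, 2011, 2010, 2011, 2011, 2011],
    "pd_model":  ["coaching", "coaching", "institute", "institute",
                  "institute", "coaching"],
    "mean_gain": [4.1, 5.3, 2.8, 3.9, 3.2, 4.7],
})

# Mean gain by cohort and PD model, with one column per year:
# a long-term view of how each subgroup is trending.
trend = (df.groupby(["cohort", "pd_model", "year"])["mean_gain"]
           .mean()
           .unstack("year"))
print(trend)
```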

  15. An Example: Central versus Local
• CDE/Public Works, Inc.
• CDE/Federal
• Partnership/local evaluators

  16. Statewide Evaluation Research Questions
• How have the Partnerships ensured that all students have access to, are prepared for, and are encouraged to participate and succeed in challenging and advanced mathematics and science courses?
• How have the Partnerships enhanced the quality of the mathematics and science teacher workforce?
• What evidence-based outcomes from the Partnerships contribute to our understanding of how students effectively learn mathematics and science?

  17. Data Collection
• Qualitative research, including site visits and phone interviews (spring/summer)
• A teacher database incorporating data on all teachers in participating districts and professional development attendance for all participating teachers (ongoing)
• Partner survey (winter)
• Participating teacher survey (spring)
• Student rosters of comparison and treatment teachers for the statewide study, matched to CST data (collected in fall; see the sketch below)
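As an illustration of the last item, the sketch below links fall rosters of treatment and comparison teachers to CST results. Student and teacher IDs and scores are hypothetical.

```python
import pandas as pd

# Hypothetical fall rosters: which students sit in treatment (MSP)
# versus comparison classrooms.
rosters = pd.DataFrame({
    "student_id": ["S1", "S2", "S3", "S4"],
    "teacher_id": ["T01", "T01", "T09", "T09"],
    "condition":  ["treatment", "treatment", "comparison", "comparison"],
})

# Hypothetical CST mathematics scale scores for the same students.
cst = pd.DataFrame({
    "student_id": ["S1", "S2", "S3", "S4"],
    "cst_math":   [352.0, 341.0, 318.0, 327.0],
})

# Link rosters to scores, then compare group means.
linked = rosters.merge(cst, on="student_id", how="inner")
print(linked.groupby("condition")["cst_math"].mean())
```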

  18. Teacher Outcomes Examined in the State Evaluation: number and characteristics of teachers who participate in professional development; satisfaction with training
• Demographics/years teaching
• Hours of training
• Qualifications/assignments
• Treatment vs. comparison student performance on CSTs
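For the treatment-versus-comparison item, one simple way to ask whether a difference in CST performance is larger than chance is a two-sample t-test, sketched below with purely illustrative scores; a real evaluation would also adjust for prior achievement and for students being clustered within classrooms.

```python
from scipy import stats

# Illustrative CST scale scores for students of treatment (MSP)
# and comparison teachers.
treatment  = [352.0, 341.0, 360.0, 338.0, 347.0]
comparison = [318.0, 327.0, 334.0, 322.0, 330.0]

# Welch's t-test does not assume equal variances between groups.
t, p = stats.ttest_ind(treatment, comparison, equal_var=False)
print(f"Welch t = {t:.2f}, p = {p:.4f}")
```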

  19. Student Outcomes Examined in the State Evaluation
• Improved student academic achievement on the state mathematics and science assessments, across the state and at the partnership level

  20. Local Evaluation in California: Two Goals
• Fulfill/support state evaluation requirements
• Fulfill commitments made in the Local Evaluation Plan in response to the RFA
Most important locally: teacher knowledge and instructional strategies, and measuring student knowledge with local assessments. Also, completing an evaluation report to attach to the fall federal report.

  21. Closing Thoughts and Considerations
• Feasibility and cost: getting evaluators on board may be part of the grant requirements, but it can also be built through buy-in and the provision of technical assistance.
• Evaluation and data systems expertise at the state level is crucial as a liaison to the individuals responsible for the evaluation.
• Consider: the payoff from the data can be great, but it takes time and commitment; the evidence and best practices collected are valuable to the field; less reinventing, more using lessons learned.

  22. Contact Information
• Patty O'Driscoll, patty@publicworksinc.org
• Phone: (707) 933-8219
• www.publicworksinc.org
