
Classroom Observation in STEM Evaluation: Logistical and Conceptual Issues

James P. Van Haneghan (Jvanhane@usouthal.edu), University of South Alabama; Susan Pruet and Rhonda Neal Waltman, Mobile Area Education Foundation





Presentation Transcript


  1. Classroom Observation in STEM Evaluation: Logistical and Conceptual Issues James P. Van Haneghan (Jvanhane@usouthal.edu), University of South Alabama; Susan Pruet and Rhonda Neal Waltman, Mobile Area Education Foundation. The authors would like to acknowledge that the completion of this work was supported by NSF award #0918769. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

  2. Our Project • Engaging Youth in Engineering (EYE) Middle School Module Study (for additional EYE info go to www.maef.net) • Develop, revise, and formalize a series of engineering design challenges • Focus on middle school students and their teachers • The modules generally take about one week to complete and involve work in both math and science classes • Science themes are, for the most part, consistent with grade-level curricula (e.g., physical science, biology) • Integrate modules with mathematics as well as science to help students recognize connections between math and science

  3. Our Project • All students work on three modules a year throughout middle school • Students work on modules for about one week in science and mathematics class • Math and science teachers coordinate lessons to work on EYE • Five-year study (development of two additional modules, revision of existing modules, and a three-year longitudinal study of students and teachers in EYE schools)

  4. Example of a Challenge • "Don't Go with the Flow": building barriers to decrease sediment flow • Sixth-grade challenge in the fourth quarter • Projects involve both science and math teachers (sediment flow, ratios, rates, etc.) • Students collect data and use Excel spreadsheets • Addresses a real problem in Mobile: sediment flowing into Big Creek Lake, the water supply for Mobile • Also addresses problems in the Dog River Watershed

  5. Classroom Observation Necessary • Modules take a significant amount of class time (one per quarter, each with one week of instruction) • We need to learn what is and is not working in the classroom • We have a feedback form that we are working with, but we still need to observe as well • Starting a comparative longitudinal study next year

  6. Classroom Observation: Multiple Purposes in Evaluation • Fidelity of implementation • Are the teachers implementing as trained or as envisioned by the developers? • Does the actual curriculum match the designed curriculum? • Quality of what is happening in the classroom • Do the activities in the classroom engender the kinds of outcomes that developers thought they would? • Are the activities like those of other successful reformed teaching programs? • Formative feedback for teachers who are learning something new • Comparison of classrooms using EYE to non-EYE classrooms

  7. Ideal Choice • A valid instrument that has credibility in the larger community of researchers and evaluators • An instrument with training materials available • Evidence of reliability (interrater agreement) • Can serve multiple purposes • Can work with projects across both math and science • To be usable in non-EYE schools, it cannot be too project-specific

  8. Reformed Teaching Observation Protocol (RTOP) Chosen as Starting Point • Had some validity and reliability data (Sawada et al., 2002) • Has gained some credibility with funding agencies (noticed it as a measure on the MSP evaluation site) • Has training materials readily available online that can be accessed and used as a starting point for training observers • Can be used across math and science, so it should work well with engineering design challenges

  9. Make-Up of the RTOP • Background and description of context • Ratings on a 0 to 4 scale of whether certain behaviors, dispositions, and instructional strategies were observed in three areas: lesson design and implementation, content, and classroom culture • 0 means the behavior never occurred • 4 means the behavior is very descriptive of the classroom • Summation of item scores yields an overall score on a 0 to 100 scale • Higher scores mean greater evidence of reform-based pedagogy in math and/or science
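The scoring arithmetic on this slide can be sketched in a few lines. The 25-item count is an assumption (the slide states only the rating scale, the three areas, and the 0 to 100 total), and the function name `rtop_total` is invented for illustration.

```python
# Minimal sketch of RTOP-style scoring: items rated 0-4 sum to a 0-100
# total. The 25-item count is assumed, not stated on the slide.

def rtop_total(item_scores, n_items=25):
    """Sum item ratings (each 0-4) into an overall 0-100 score."""
    if len(item_scores) != n_items:
        raise ValueError(f"expected {n_items} item ratings")
    if any(not 0 <= s <= 4 for s in item_scores):
        raise ValueError("each item is rated on a 0-4 scale")
    return sum(item_scores)

# A class rated 3 on every item scores 75 overall.
print(rtop_total([3] * 25))  # 75
```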

  10. Began Project Using RTOP • New Watershed module as first attempt • Held two formal training sessions • Worked with an experienced evaluator and three retired educators, most familiar with inquiry-based instruction (e.g., talents program) • We had an inquisitive group of observers who asked good questions • We engaged in both live and videotaped observations • Pairs of observers viewed classes, and pairs of observers also reviewed some videotapes

  11. Observation of Watershed Classes • Picked one school and observed all sessions for two science and two math teachers • Also videotaped one class as it worked through the module; two raters scored all of the videos • Overall, we had observers in 20 different class periods, with two observers in 18 of those classes • Quite labor-intensive

  12. Some Initial Results • On the RTOP 100-point scale, the EYE Watershed module classes had an average score of 74.5 (SD = 18.05) • Scores ranged from 33 at the lowest to 85 at the highest • Observers were generally reliable in scoring, with a median Cohen's kappa of .64 (counting scores within 1 point on an item as agreement) and a median intraclass correlation of .70 • However, there was one observed class with very low agreement, suggesting the need for further analysis
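One way to read the "within 1 point counts as agreement" rule is as a weighted Cohen's kappa with a banded 0/1 weight matrix. This is a plausible reconstruction for illustration, not necessarily the authors' exact computation:

```python
from collections import Counter

def banded_kappa(a, b, tol=1):
    """Cohen's kappa with agreement defined as |rating_a - rating_b| <= tol.

    A sketch of the slide's "within 1 point" rule as a weighted kappa
    whose weight matrix is 1 inside the band and 0 outside it.
    """
    n = len(a)
    # observed proportion of rating pairs falling inside the agreement band
    po = sum(abs(x - y) <= tol for x, y in zip(a, b)) / n
    # chance agreement from each rater's marginal rating distribution
    ca, cb = Counter(a), Counter(b)
    pe = sum(ca[x] * cb[y] for x in ca for y in cb if abs(x - y) <= tol) / n**2
    if pe == 1:  # both raters constant and in-band: perfect agreement
        return 1.0
    return (po - pe) / (1 - pe)

# Identical ratings give kappa = 1.0.
print(banded_kappa([0, 1, 2, 3, 4], [0, 1, 2, 3, 4]))  # 1.0
```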

  13. Comments • In the one class where reliability was low, the disagreement may have been a function of teacher preparedness for the lesson • We found that some elements of the RTOP were difficult to score within the modules • The RTOP did not account for parts of the modules that we felt were valuable but were not purely "reformed-teaching" activities • For example, introductory activities like watching a slide show about watershed issues • Taken across the lessons, we found that the Watershed module did engender reform-based teaching as we had hoped • However, we felt there were elements missing in this approach

  14. What Was Missing? • Elements of the flow of events • Use of the CETP observation guide (Lawrenz, Huffman, & Appeldoorn, 2002) • Has segments and codes for what is happening, and its cognitive level, across 5-minute intervals • Can compare to a template of what to expect in lessons • Not as lengthy as the RTOP (one of our research assistants felt that it did not capture as much as the RTOP) • Fidelity of implementation • Covers just one aspect, not the larger picture of how each unit works

  15. What Was Missing? • The ability to capture the quality of the planned lecture elements of the modules; we need a system to capture this • What is the appropriate unit of analysis for suggesting the EYE modules are engendering reformed instruction? • The average RTOP across classes? • The best RTOP?

  16. Can RTOP serve all four purposes we want it to?

  17. Others Have Been Asking the Same Question • The UTeach project in Texas tried to develop its own tools • It ended up with simplified tools to easily capture what was going on in its high school classrooms • We are not sure that this approach will provide what we need • We just received an email from a listserv where someone was asking about an RTOP-like observation instrument for engineering education

  18. Lessons Learned • The RTOP is useful, but we need to work on project-focused observation as well (we will need more than one tool) • We plan to work with elements of the CETP guide to develop a system for looking at activities over lessons • We are developing a system to help observers know what to expect on the days they observe • There is a need for collaboration across projects to develop measures (several projects are focusing on engineering in P-12; two have already done work with the RTOP and have been seeking alternatives)

  19. References
Century, J., Rudnick, M., & Freeman, C. (2010). A framework for measuring fidelity of implementation: A foundation for shared language and accumulation of knowledge. American Journal of Evaluation, 31, 199-218.
Lawrenz, F., Huffman, D., & Appeldoorn, K. (2002). Classroom videotape observation guide. Retrieved May 26, 2010, from http://www.cehd.umn.edu/carei/cetp/PDF/COPVideoguide.pdf
Sawada, D., Piburn, M., Judson, E., Turley, J., Falconer, K., Benford, R., & Bloom, I. (2002). Measuring reform practices in science and mathematics classrooms: The Reformed Teaching Observation Protocol. School Science and Mathematics, 102(6), 245-253.
