
“She Was My Backbone”: Measuring the Impact of Literacy Coaching


Presentation Transcript


  1. “She Was My Backbone”: Measuring the Impact of Literacy Coaching • Kelly Feighan, Research for Better Schools • Dr. Elizabeth Heeren, Memphis City Schools

  2. http://www.ed.gov/programs/strivingreaders/awards.html • Grantee: Memphis City Schools, Memphis, Tennessee • Total Grant Award: $16,074,687 • Memphis' Striving Readers project is designed to test the efficacy of the Memphis Content Literacy Academy (MCLA) professional development model for improving reading achievement and content literacy in high-need urban middle schools serving grades 6-8. All content teachers in Striving Readers treatment schools are eligible for participation in MCLA, which includes: • University coursework (2 years; 12 hours of upper-division college credit) • Support from a site-based literacy coach • Access to differentiated instructional materials

  3. Challenges in Hiring/Training Literacy Coaches • No certification available in the state of Tennessee (or many other states) • Standards for coaching not set (we used IRA and NCTE standards as a guide) • Roles are undefined and differ depending on context

  4. Qualifications We Used for Hiring • 5+ years successful teaching in middle school • Advanced degree (Masters +) • Experience with literacy • Experience with professional development • Various content areas • Principal recommendations

  5. Coaching Cycle

  6. Coaches Report on Their Roles in the Schools • Coaches consistently report their role as: providing support, advocating for teachers, and modeling lessons. • All coaches feel part of the “school family.” • The coaching experience has improved with time. • Main challenge: limited time to observe CAP implementation between assignments.

  7. The data collection tool was designed by the team after one year of coaching experience in the grant. Slight modifications were made after initial use, and the tool has now been in use for three years.

  8. Coaches establish rapport with teachers 95.2% of 62 teacher survey respondents reported “I can confide in my coach.”

  9. Trust between the coach and teacher(s) is critical: • To the provision of CAP implementation support • Pre-conference meeting • CAP Observation • Videotapes for use to train teachers, coaches, evaluators • Co-teaching; modeling • Post observation conference • To the effective and strategic selection of CRC & supplemental resources

  10. “She has been so available for me as a literacy coach… I don’t know what I would have done without her assisting me…we’re just basically working as a great team together.” (MSRP Focus Group Report, 2007)

  11. The Coaching Cycle: • Teacher attends CAP modeling or discussion • Teacher discusses lesson plan with coach • Coach observes teaching rehearsal • Debrief/revise lesson plan • Coach observes performance teaching • Final debrief

  12. Equipping middle and high schools with trained literacy coaches is at least one line of attack to combat “the quiet resignation that seems to pervade education circles…that little if anything can be done” (Joftus, 2002, p. 1).

  13. Content Teachers Support Literacy Strategies “I think literacy is so important because no matter what they do or where they go they are going to run into something they have to read… if they have to apply for a job, it requires them to be able to read…” (2007)

  14. Purpose of our Study

  15. Implementation Questions • What daily tasks do literacy coaches typically perform at the middle school level? • How much of their time is spent involved in substantive tasks that support teacher practice? • To what extent do teachers perceive coaching services as beneficial? • What are some of the challenges that coaches face?

  16. Impact Questions • To what extent has working with a coach improved teachers’ pedagogy? • Do those who received literacy coaching report higher frequency of strategy use? • How prepared do teachers feel to use literacy strategies in their content classes? • $64,000 question: To what extent has literacy coaching for teachers increased students’ academic achievement?

  17. Data Sources • Measuring what coaches do: • Coaches’ daily activity logs (N = 847) • Teacher surveys (three waves: N = 48, 62, and 54) • Teacher focus groups: four waves of 30 sessions • Coach interviews (four waves with six coaches)

  18. Data Sources • Measuring teacher impact: • Baseline and follow-up survey: MCLA completers and control group • Focus group interviews • Program feedback survey (three waves) • Follow-up checklist six months after program ended • Measuring student impact: • TCAP reading (Spring 2007 and Spring 2008) • ITBS reading (Spring 2007 and Spring 2008)

  19. Coach Log Analysis • Coaches’ logs represented from 52% to 86% of their 190-day work year • We entered a total of 5,791 individual records from 847 daily activity logs • Tasks fell into 12 overarching categories, including observing, modeling, helping a teacher prepare for class, and administrative or school-related activities
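To make the log tabulation concrete, here is a minimal sketch of how daily activity records could be rolled up into the 12 overarching task categories. The file name and column names (coach, date, task_category, minutes) are illustrative assumptions, not the actual MSRP log fields.

```python
# Hypothetical sketch: tallying coach activity-log records by task category.
# File and column names are illustrative, not the actual MSRP log fields.
import pandas as pd

# Each row is one activity entry from a coach's daily log (e.g., 5,791 rows).
logs = pd.read_csv("coach_activity_logs.csv")

# Share of logged time in each overarching category (observing, modeling,
# helping a teacher prepare for class, administrative tasks, etc.)
time_by_category = logs.groupby("task_category")["minutes"].sum()
print((time_by_category / time_by_category.sum() * 100).round(1))

# Coverage of the 190-day work year per coach: distinct days logged / 190
coverage = logs.groupby("coach")["date"].nunique() / 190 * 100
print(coverage.round(1))
```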

  20. Time Spent with Teachers • Two months into the 2007-08 school year, almost 60 percent of respondents reported that they had met with their coaches more than four times. • By spring 2008, three-quarters (75.9%) reported that they had met with their coaches more than four times. These figures are corroborated by data in the coaching logs.

  21. Coaching “Dosage” • Analyses of logs showed that all (100%) MCLA completers in two schools received high levels of coaching assistance • Approximately three-quarters (76.9%) received high levels of coaching at the third school • One-third (35.7%) of MCLA completers received high levels of coaching at the fourth school

  22. Teacher Surveys: Fall 2006, Fall 2007, and Spring 2008

  23. Data source: RBS MCLA Feedback Surveys

  24. Data source: RBS MCLA Feedback Surveys

  25. Nature of Coach-Teacher Collaboration: Focus Group Findings, Wave One • Teachers shared universally positive perceptions about coaching support. Stated one teacher: “She’s there. She’s not intrusive. She never comes off as being a judge, a threat. If she comes in, and if she sees something that wasn’t going like it should, she would offer advice, tips, as opposed to ‘Well that wasn’t right’ and leave. She would say ‘Maybe you should try something like this.’ ”

  26. Despite initial growing pains related to scheduling issues, coaches were highly valued: “My coach tends to be hard to find sometimes… But she’s very helpful I’ve always found… I’ve had some struggles… being a first-year teacher, and she took time out to help me plan a different lesson altogether, trying to figure out how to teach them, how to write better sentences… Actually it’s been a great resource even though it does seem she’s stretched a little too thin.”

  27. Second Wave of Focus Groups • Although teachers issued strong praise of coaches, few accepted their coach’s offer to model lessons because they did not feel they needed it. One science teacher stated: “She did ask… but I told her ‘No, no. Just go over what I need to do and I’ll take care of it.’ ” • A few teachers in a mathematics focus group said that although their coach made them feel comfortable, they did not “need” her to model a lesson because “she would explain it so well in class.”

  28. Third Wave of Focus Groups • Strong praise for coaches: very helpful, approachable, and committed to helping teachers succeed • Teachers said coaches “went out of their way” to supply them with needed materials and resources, and cited benefits from observation feedback “She has been so available for me as a literacy coach… I don’t know what I would have done without her assisting me… We’re just basically working as a great team together.”

  29. Third Wave of Focus Groups • Satisfaction with the coach’s accessibility increased “I’m very impressed with [the coach] this year. Last year we were trying to figure each other out (or her role or my role or something), but this year– Well, every time I’d look for her last year, she wasn’t around. I’d ask for something and I couldn’t get it, but this year she is dynamite.”

  30. Fourth Wave of Focus Groups • All nine focus groups held positive views about their coaches and characterized them as very helpful • Advice to others expecting MCLA at their school: avail yourself of the literacy coach’s services “Basically, the literacy coaches are there to help you and sometimes we as teachers, as secondary teachers, we don’t like to open our classrooms up to other people to come in and show us things.”

  31. Focus Group Summary • Across the 30 focus group sessions, most respondents described their coach as someone who "goes that extra mile" to provide assistance, and one who showed understanding and patience • Many teachers shared examples of coaches’ dedication • Initial concerns about accessibility and scheduling conflicts dissipated over time

  32. Coaches’ Challenges • Helping teachers to see that literacy strategies were not “add-ons” • Limited opportunities to see student data in schools with less principal support • Learning to mentor on-the-job • Fitting some literacy activities in with mathematics

  33. Impact on Teachers • This study examines matched baseline and follow-up survey data for 30 MCLA teachers and 34 control group teachers • Teachers were asked how prepared they felt to use, and how frequently they used, 24 literacy strategies

  34. Levels of “Preparedness” and Strategy Use • No baseline differences in mean responses about preparedness on 23 of 24 items • Only one difference emerged: MCLA teachers had a higher mean response (3.80) than control teachers (3.12) on how prepared they felt to have students read aloud for at least five minutes per period (F = 4.82, df = 62, p < .05) • No baseline differences between control and MCLA group with respect to reported frequency of strategy use
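For reference, the group comparison reported above is the kind of one-way ANOVA (F test) that can be run on a single survey item. The sketch below uses scipy with invented ratings; the values are placeholders, not the MCLA or control data.

```python
# Illustrative one-way ANOVA comparing MCLA and control baseline responses
# on a single preparedness item. Ratings below are invented placeholders.
from scipy import stats

mcla_responses = [4, 4, 3, 5, 4, 4, 3, 5]      # hypothetical 1-5 ratings
control_responses = [3, 3, 4, 2, 3, 4, 3, 3]   # hypothetical 1-5 ratings

# With two groups, the one-way ANOVA F test compares the group means.
f_stat, p_value = stats.f_oneway(mcla_responses, control_responses)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```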

  35. Changes Over Time • Paired t tests showed a significant increase in mean responses for both MCLA and control group on most preparedness items • ANOVA results showed significant differences for frequency of strategy use, favoring MCLA group: • Showing relationships with graphic organizers • Establishing purpose for reading text • Modeling use of thinking maps • Using cooperative learning groups
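The within-group change reported here rests on paired (dependent-samples) t tests of the same teachers at baseline and follow-up. The sketch below shows the form of that test with invented values; it is not the study data.

```python
# Illustrative paired t test for baseline-to-follow-up change on one
# preparedness item. Values are invented, not MCLA or control responses.
from scipy import stats

baseline  = [3, 2, 4, 3, 3, 2, 4, 3, 2, 3]   # same teachers, time 1
follow_up = [4, 3, 4, 4, 4, 3, 5, 4, 3, 4]   # same teachers, time 2

t_stat, p_value = stats.ttest_rel(follow_up, baseline)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```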

  36. Six-month Follow-up: Fall 2008 • Surveys distributed to schools no longer participating in MCLA asked teachers if they had engaged in five specific literacy activities in the past week • Forty-two respondents had completed at least one semester of MCLA • 83 percent of these respondents identified an MCLA activity they had used in the past week

  37. Data source: Fall 2008 RBS WIS Checklist

  38. Student Impact • This study includes 3,612 students with baseline and follow-up TCAP test scores • 1,830 students were linked to the 30 MCLA teachers • 1,782 students were linked to the 34 control teachers • Baseline mean number correct was higher in control schools than in MCLA schools (F = 5.44, df = 3411, p < .05), but scale scores were not significantly different

  39. TCAP Baseline Reading Scores

  40. TCAP Follow-up Scores (N=3,520)

  41. ITBS Scores

  42. Overall Student Achievement Results • Although TCAP scores were higher among control students, the magnitude of the difference is very small: two points • ITBS scores followed the same pattern • Initial ANOVAs and linear regression results did not show a positive MCLA impact on test scores; however, more variables must be added to the model • Did MCLA teachers have more challenging students? Were there more behavioral problems in MCLA schools?
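One way to read the "more variables must be added" point: the treatment indicator alone may be confounded with baseline differences between schools, so covariates such as prior scores and school would be added to the regression. The sketch below, using statsmodels, shows the general form; the variable names (tcap_followup, tcap_baseline, mcla, school, sped) are placeholders, not the actual MSRP data fields.

```python
# Illustrative covariate-adjusted model for the follow-up analysis.
# Column names are placeholders, not the actual MSRP variable names.
import pandas as pd
import statsmodels.formula.api as smf

students = pd.read_csv("student_scores.csv")

# Unadjusted model: follow-up score on the treatment indicator only
unadjusted = smf.ols("tcap_followup ~ mcla", data=students).fit()

# Adjusted model: add baseline score, school fixed effects, and a
# student characteristic to account for differences between groups
adjusted = smf.ols(
    "tcap_followup ~ mcla + tcap_baseline + C(school) + sped",
    data=students,
).fit()

print(unadjusted.params["mcla"], adjusted.params["mcla"])
```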

  43. Qualitative Findings about Student Impact • Teachers were generally optimistic that using literacy strategies would improve student achievement “My students have definitely improved a lot. I have a couple of students who haven’t, but I’ve had students who’ve already like, over two years of growth by their mid-year assessment...” “I think some of the strategies have given them– they want to do things. They’re not as apprehensive as they once were, especially when it comes to fluency.”

  44. Qualitative Findings about Student Impact • On average, teachers felt that learning the literacy strategies helped most students to read better; however, several expressed concern that “nonreaders” needed additional help. “Some have improved, but if they are nonreaders, they’re still nonreaders. It did not help. But those that were struggling, it gave them a different avenue to use, a different method, a different strategy.”

  45. Conclusions • Although student-level results showed no effect of MCLA on academic performance in one year, teacher findings suggest an increased use of literacy strategies • Findings mirror those of other educational researchers who have examined the impacts of coaching models on teaching and learning (Murray, Ma, and Mazur, 2008)

  46. Contact Us: • Kelly Feighan: feighan@rbs.org • Elizabeth Heeren: heerenelizabeth@mcsk12.net • Or visit our website at http://www.rbs.org/msrp
