
The Role of Program Evaluation in Replicating an Educational Model


Presentation Transcript


  1. The Role of Program Evaluation in Replicating an Educational Model Tamara M. Walser, Ph.D., UNC-Wilmington Dawn M. Hodges, M.Ed., Hill School of Wilmington Karen S. Wetherill, UNC-Wilmington Emily R. Grace, M.Ed., UNC-Wilmington The University of North Carolina at Wilmington CREATE 2010

  2. Background The purpose of this presentation is to describe the replication of a theory-driven educational model for improving the reading achievement of struggling readers in K-12 public schools. • Hill Reading Achievement Program (HillRAP) • Private school to public school settings • Initial success (Downing, Williams, Lasater, & Bell, 2007; Frome, Bell, & Close, 2005) • Current replication in Carteret County Schools and Brunswick County Schools in Southeastern North Carolina (2008-2011)

  3. Background Replicating an Educational Model • Implementation and evaluation of a model across different sites over time. • Facilitates ability to generalize results (Schafer, 2001). • In addition to deriving knowledge from evaluation, replication is important for building knowledge about evaluation practice (Turnbull, 2002).

  4. Background Theory-Driven Educational Model The theory of a program: “a systematic configuration of stakeholders’ prescriptive assumptions (what action must be taken) and descriptive assumptions (what causal processes are expected to happen) underlying programs, whether explicit or implicit assumptions” (Chen, 2004, p. 136). Theory-Driven Evaluation Uses the program theory to guide the evaluation (Chen, 1990).

  5. Replication of Hill Center Reading Program Model (HillRAP) • Program Model (HillRAP): Components based on Orton-Gillingham and the National Reading Panel; Implementation with a 4:1 ratio, 45-50 minute sessions, and training • Contextual Differences of Implementation: Two sites (Carteret County & Brunswick County); student selection criteria; scheduling; grade levels included • Outcomes: Improved student achievement in reading

  6. Program Model • HillRAP • Components based on • Orton-Gillingham (Ritchey & Goeke, 2006) • National Reading Panel Report (2000) • Implementation • IMSLEC • 4:1 ratio • 45-50 minute sessions (Ritchey & Goeke, 2006)

  7. Components

  8. Implementation

  9. Contextual Differences Carteret County: 8,500 students; 90% White, 9% Black, 2.5% Hispanic, 1.2% Multiracial, <1% Asian or American Indian. Brunswick County: 11,800 students; 86% White, 12% Black, 4% Hispanic, 1% Multiracial, <1% Asian or American Indian.

  10. Contextual Differences

  11. Contextual Differences Carteret County: • Teacher and site-based management • No full-time on-site coordinator • No planning year • 3 implementation years • Gradual increase of students in the program • Small-scale implementation prior to the current project • Students selected by teachers for the program. Brunswick County: • District management • Full-time on-site coordinator • Large student sample size • Planning year • 2 implementation years • Students selected by a team of teachers, administrators, and EC staff using pre-determined criteria as a guide.

  12. Evaluation Questions Outcome • Do students who receive HillRAP instruction improve academic achievement in reading, overall and by student group? Process/Implementation • Is there a relationship between the number of HillRAP instructional hours a student receives and achievement in reading? • Do teachers who receive HillRAP training effectively implement HillRAP in a public school setting?
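
The second question implies a dose-response analysis relating instructional exposure to reading growth. Below is a minimal sketch of that analysis in Python, using invented per-student data and assuming SciPy; the presentation does not specify any analysis tooling.

```python
from scipy import stats

# Hypothetical per-student records (illustrative only): total HillRAP
# instructional hours received and WJ-III standard-score gain (post - pre).
hours = [42.0, 55.5, 61.0, 38.5, 70.0, 48.0]
gains = [3.2, 5.1, 6.4, 2.0, 7.8, 4.5]

# Pearson's r quantifies the linear relationship between instructional
# exposure and reading growth asked about in the process question.
r, p = stats.pearsonr(hours, gains)
print(f"r = {r:.2f}, p = {p:.3f}")
```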

  13. Outcome: Reading Achievement Measures • Woodcock-Johnson III Tests of Achievement (WJ-III): Letter-Word Identification, Reading Fluency, Passage Comprehension, and Word Attack • North Carolina End-of-Grade Assessment (NC EOG) in Reading, 4th-8th Grades

  14. Outcome: Reading Achievement HillRAP Components: Oral Drill, Phonological Awareness, Word Attack, Fluency, Comprehension. Corresponding WJ-III Tests: Letter-Word Identification, Reading Fluency, Passage Comprehension, Word Attack.

  15. Outcome: Reading Achievement, WJ-III Data Collection. Carteret: New HillRAP students pre- and post-tested in each of the 3 project years; continuing HillRAP students tested in the spring of each additional project year. Brunswick: HillRAP students pre- and post-tested in the first year of project implementation; continuing HillRAP students tested in the spring of the second/final year of project implementation.
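
This design pairs each student's post-test score with their own pre-test score. As a hedged illustration (invented scores, SciPy assumed), the paired comparison such a design supports looks like this:

```python
from scipy import stats

# Hypothetical WJ-III Letter-Word Identification standard scores for the
# same six students at pre-test and post-test (not study data).
pre  = [88, 92, 85, 90, 78, 95]
post = [93, 95, 91, 94, 84, 97]

# A paired-samples t-test matches each post score to the same student's
# pre score, fitting the repeated-measures structure of the design.
gains = [b - a for a, b in zip(pre, post)]
t, p = stats.ttest_rel(post, pre)
print(f"mean gain = {sum(gains) / len(gains):.1f}")
print(f"t = {t:.2f}, p = {p:.3f}")
```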

  16. Outcome: Reading Achievement, NC EOG in Reading Data Collection. Carteret: New HillRAP student pre- and post-test scores; continuing HillRAP student scores for each additional project year. Brunswick: New HillRAP student pre- and post-test scores; continuing HillRAP student scores for the second/final year of project implementation.

  17. Outcome: Reading Achievement, NC EOG in Reading Data Collection. Historical Cohort Comparison Group (Brunswick): • Pre- and post-test scores for 5th and 8th grade HillRAP students in the first year of project implementation • Pre- and post-test scores for a historical cohort comparison group of 5th and 8th grade students from the year prior to project implementation
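
The historical cohort design contrasts first-year HillRAP students with comparable students from the year before implementation. Here is a minimal sketch of one way to express that contrast, using invented gain scores and a pooled effect size; the slides do not state which statistics were actually used.

```python
import statistics

# Hypothetical NC EOG reading gains for first-year HillRAP students and
# for the historical cohort comparison group (illustrative values only).
hillrap_gains = [4.1, 3.8, 5.2, 2.9, 4.6, 3.3]
cohort_gains  = [2.2, 3.0, 1.8, 2.5, 3.1, 2.0]

def cohens_d(a, b):
    # Difference in means divided by the pooled standard deviation.
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * statistics.variance(a)
                  + (nb - 1) * statistics.variance(b)) / (na + nb - 2)
    return (statistics.mean(a) - statistics.mean(b)) / pooled_var ** 0.5

print(f"d = {cohens_d(hillrap_gains, cohort_gains):.2f}")
```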

  18. Process/Implementation: Instructional Hours Measures • HillRAP Attendance Record Data • Outcome results from Evaluation Question 1 (reading achievement)

  19. Process/Implementation: Instructional Hours, Student Attendance Record Data Collection. Carteret: Collected by HillRAP teachers using a standard log. Brunswick: Collected by HillRAP teachers using the Hill database.
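
However the sessions are recorded, the exposure measure is the same: per-session minutes totaled into instructional hours per student. A minimal sketch, assuming a hypothetical (student_id, session_minutes) log format:

```python
from collections import defaultdict

# Hypothetical attendance rows: (student_id, session_minutes). Sessions
# run 45-50 minutes under the HillRAP implementation guidelines.
log = [("S01", 45), ("S02", 50), ("S01", 50), ("S02", 45), ("S01", 45)]

# Total minutes per student, then convert to instructional hours, the
# exposure variable in the hours-to-achievement question.
minutes = defaultdict(int)
for student, session in log:
    minutes[student] += session

hours = {s: round(m / 60, 2) for s, m in minutes.items()}
print(hours)  # {'S01': 2.33, 'S02': 1.58}
```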

  20. Process/Implementation: Teacher Effectiveness Measures • HillRAP Teacher Observation Form • Outcome results from Evaluation Question 1 (reading achievement)

  21. Process/Implementation: Teacher Effectiveness, Teacher Observation Data Collection. Carteret: Completed by Hill Center Master Teachers/Trainers 5 times annually. Brunswick: Completed by Hill Center Master Teachers/Trainers 5 times annually.
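
Five observation visits per teacher per year yield a small fidelity series. Below is a minimal sketch of one plausible summary, assuming a hypothetical 1-4 rating scale; the observation form's actual metrics are not described in the slides.

```python
import statistics

# Hypothetical observation-form ratings from the five annual visits for
# one teacher (the 1-4 scale is an assumption, not the real instrument).
observations = [3.2, 3.5, 3.6, 3.8, 3.9]

# Summarize implementation fidelity as the annual mean and the change
# from the first visit to the fifth.
print(f"mean  = {statistics.mean(observations):.2f}")  # mean  = 3.60
print(f"trend = {observations[-1] - observations[0]:+.1f}")  # trend = +0.7
```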

  22. Brunswick County Results: Year 1 (N = 324)

  23. Carteret County Overall Results: Year 1 (N = 94)

  24. Carteret County Overall Results: Year 2 (N = 89)

  25. References

Chen, H. (1990). Theory-driven evaluations. Newbury Park, CA: Sage.

Chen, H. (2004). The roots of theory-driven evaluation: Current views and origins. In M. C. Alkin (Ed.), Evaluation roots: Tracing theorists' views and influences (pp. 132-152). Newbury Park, CA: Sage.

Downing, J., Williams, J., Lasater, B., & Bell, K. (2007, April). The Hill Center Reading Achievement Program in Durham Public Schools: Final report. Research Triangle Park, NC: RTI International.

Frome, P., Bell, K., & Close, K. (2005, November). The Hill Center student achievement study: 1995-2004. Research Triangle Park, NC: RTI International.

National Institute of Child Health and Human Development. (2000). Teaching children to read: An evidence-based assessment of the scientific research literature on reading and its implications for reading instruction. Washington, DC: U.S. Government Printing Office.

Ritchey, K., & Goeke, J. (2006). Orton-Gillingham and Orton-Gillingham-based reading instruction: A review of the literature. Journal of Special Education, 40(3), 171-183.

Schafer, W. D. (2001). Replication: A design principle for field research. Practical Assessment, Research & Evaluation, 7(15). Retrieved from http://PAREonline.net

Turnbull, B. (2002). Program theory building: A strategy for deriving cumulative evaluation knowledge. American Journal of Evaluation, 23(3), 275-290.
