Lessons Learned About Random Assignment Evaluation Implementation in Educational Settings SREE Conference March 4, 2010 Raquel Sanchez and Fannie Tseng
Introduction
• Hands-on overview of our experiences implementing random assignment evaluations in the classroom
• Extension of the list of lessons discussed in the past literature
• Brief description of the random assignment evaluations from which our experiences are drawn
• Discussion of difficulties with implementing random assignment in classroom settings
• Discussion of lessons learned
Past Literature on Random Assignment Implementation
• Gueron (2002)
• Ritter and Holley (2008)
• Raudenbush (2005)
• Burghardt and Jackson (2007)
Overview of Our Random Assignment Evaluation Studies
• Two school-level random assignment studies of the effectiveness of professional development programs focused on developing the reading comprehension skills of English language learners (ELLs)
• One center-level random assignment study of a professional development program targeting caregivers of children ages 0-3
• One student-level random assignment study of a curriculum that combines explicit and implicit approaches to instruction to increase the literacy skills of adult ESL students
Challenges to Implementing RCTs in Educational Settings
• Threats to the integrity of random assignment
  • Crossovers and contamination
  • Recruiting and obtaining buy-in
• Dilution of intervention effectiveness
  • Lack of teacher buy-in
  • Effect of crossovers and contamination (see the sketch after this list)
  • Documenting treatment dosage
• Conflicting interventions on the ground
• Local conditions and circumstances
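To make the dilution mechanics concrete: under standard assumptions, crossovers and contamination shrink the measured intent-to-treat (ITT) effect roughly in proportion to the difference in take-up rates between the two study arms. The sketch below is a hypothetical illustration, not part of the presentation; the function name and all numbers are our own.

```python
# Hypothetical sketch of how crossovers and contamination dilute a
# measured intent-to-treat (ITT) effect. Under standard assumptions,
# ITT = (take-up in treatment arm - take-up in control arm) * complier effect.

def diluted_itt(true_effect, takeup_treatment, takeup_control):
    """Expected ITT estimate when some treatment-group members never
    receive the intervention (no-shows) and some control-group members
    do receive it (crossovers/contamination)."""
    return true_effect * (takeup_treatment - takeup_control)

# Example: a true complier effect of 0.25 SD, 85% treatment take-up,
# and 10% control contamination yield a measured ITT of about 0.19 SD.
print(diluted_itt(0.25, 0.85, 0.10))  # 0.1875
```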
Lessons Learned
• Perform in-person recruiting visits at all levels of school administration to maximize buy-in
• Foster good communication among school staff, program developers, and the research team
• Follow-up data collection requires persistence, patience, and adequate funding
• Keep in touch with assessment data administrators, even in the off-months of the study
• If possible, retain local research staff
• Use conservative statistical power calculations to factor in potential implementation challenges (a sketch follows this list)
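As a concrete illustration of the last point, here is a minimal conservative power calculation for an individual-level design, assuming the statsmodels library; the effect-size, crossover, and attrition figures are hypothetical. Cluster-randomized (school- or center-level) designs would additionally need a design-effect adjustment for intraclass correlation.

```python
# Hypothetical conservative power calculation: shrink the detectable
# effect for expected crossover dilution and inflate the recruitment
# target for expected attrition. All numbers are illustrative assumptions.
from statsmodels.stats.power import TTestIndPower

target_effect = 0.25       # planned effect size (Cohen's d)
crossover_dilution = 0.75  # assume ~25% of the effect lost to crossovers
attrition = 0.15           # assume 15% of the sample lost to follow-up

analysis = TTestIndPower()
n_per_arm = analysis.solve_power(
    effect_size=target_effect * crossover_dilution,  # diluted effect
    alpha=0.05,
    power=0.80,
    ratio=1.0,
)
# Recruit extra participants so the analysis sample still reaches n_per_arm.
n_to_recruit = n_per_arm / (1 - attrition)
print(f"Recruit about {n_to_recruit:.0f} participants per arm")
```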
Lessons Learned (2)
• Throughout the course of the study, establish an identity separate from the program you are evaluating
• Hand out separate evaluation study materials bearing the research organization's logo
• Gift cards help
• Be proactive about developing plans for documenting treatment dosage
• If possible, use more than one source of data
• Accompany impact studies with qualitative studies
Contact Us
• Raquel Sanchez, Ph.D. (raquel@bpacal.com)
• Fannie Tseng, Ph.D. (fannie@bpacal.com)
Berkeley Policy Associates
440 Grand Ave., Suite 500
Oakland, CA 94610-5085
Ph: 510-465-7884
Fax: 510-465-7885
www.berkeleypolicyassociates.com