The Logic Model as Compass: Guiding Program Staff Through the Consequences of Evaluation
Ellen Roscoe Iverson, Carleton College, eiverson@carleton.edu
John A. McLaughlin, Managing for Results, macgroupx@aol.com
Cathryn Manduca, Carleton College, cmanduca@carleton.edu
This project is supported by the National Science Foundation (NSF) Division of Undergraduate Education under Grants No. 0127310, 0127141, 0127257, 0127018, and 0618482. Opinions, findings, and conclusions or recommendations expressed herein are those of the authors and do not necessarily reflect the views of the NSF. On the Cutting Edge is sponsored by the National Association of Geoscience Teachers (NAGT) and is part of the Digital Library for Earth System Education (DLESE).
Overview
• On the Cutting Edge program
• Goals of evaluation
• Logic models
• Evaluation methods
• Results
But first…have you used: • Computer-based interventions? • Pedagogically-based professional development for faculty? • Logic models as part of iterative design?
On the Cutting Edge • Delivered at the national level to geoscience faculty • Combines residential workshops and websites for faculty professional development • Supported by a grant from the National Science Foundation • Began workshops in 2002 • Funded for 3 more years
On the Cutting Edge
Workshops (3 to 5 days)
• Emerging themes (2/year)
• Teaching X (1/year)
• Course Design (online and face-to-face)
• Career Preparation and Management (2/year)
Website
• Instructional materials
• Datasets and tools
• Pedagogical resources
• Tutorials
• Course resources
• Assessment tools
• Bibliographies
• Visualizations
Developing Evaluation Purpose
According to Guskey (2000)*, evaluators of professional development make three mistakes:
• Collect and report only descriptive information (who was involved).
• Focus on attitudes of participants (did they think their time was well spent) rather than on actual changes in participant knowledge or skill.
• Keep evaluations brief and limit opportunities for application.
*Guskey, T.R. (2000). Evaluating Professional Development. Thousand Oaks, Calif.: Corwin Press.
Initial Evaluation Purpose
• Continuously improve the workshops and website.
• Share with others in our community what works and does not work in professional development for its members.
Goals – 5 basic questions • Was the program implemented as planned? • What was the quality of the implementation? • What was the effect of the program on the participants? • What was the impact of the program? • What caused the observed effects and impacts?
Methodologies
Workshops
• Road checks
• End-of-workshop surveys
• Observations and interviews
• Online surveys
• Baseline survey
• Telephone interviews
Website
• Web statistics reports
• Pop-up survey
• Awareness poll
• External heuristic review of website
• Focus groups
• Telephone interviews
• Pilot
• Embedded assessment
• Online discussion artifacts
Implications for the future • Snowball sampling to evaluate website-only participants • Embedded assessment • Repeat baseline survey • Formal leadership program • 100% of participants contribute to the website
For more information: http://serc.carleton.edu/NAGTWorkshops/evaluation.html