This presentation explores the effects of paper-based versus web-based student course assessments on faculty ratings at GGC, tracing the shift in evaluation delivery and its implications for response rates and ratings over time.
Effect on Faculty Ratings of Paper-Based versus Web-Based Student Course Assessments
Dr. Juliana Lancaster, Director of Institutional Effectiveness
Dr. Michael Furick, Assistant Professor of Business
Introducing GGC
• The 35th member of the University System of Georgia
• Opened its doors to an inaugural junior class of 120 students in Fall 2006
• The first freshman class of 337 was admitted in Fall 2007, bringing total enrollment to 787
• First graduation was held on June 28, 2008
• Enrollment has grown to over 3,000 students
• Accredited by SACS in June 2009
Introducing GGC
• GGC’s mission supports access to baccalaureate degrees that meet the economic development needs of the growing and diverse population of the northeast Atlanta metropolitan region.
• GGC offers majors in: Biology, Education, Mathematics, Exercise Science, English, History, Information Technology, Political Science, Psychology, Business Administration, and Criminal Justice/Criminology
Importance of Student Course Evaluations
• Formative Assessment: used by faculty to improve and shape the quality of our own teaching
• Summative Assessment: used to determine overall performance; may be used in personnel decisions
• Programmatic Assessment: used to evaluate the role of a course within a degree program
General Interest Questions
• Do you do paper or web?
• Who gets to see results?
• Who should see results?
Is web really better… or worse?
• Will most students actually complete a web-based evaluation?
• Will the students who complete a web-based evaluation differ from the overall student population?
• Will the evaluation ratings be affected by changes in which students complete evaluations?
Shifting Delivery at GGC

Semester                    Modality   Distribution of Form / Access Information
Fall 2007                   Paper      In class
Spring 2008                 On-line    Passwords distributed by faculty in class
Summer 2008 - Summer 2009   On-line    Passwords distributed by system to student email
Fall 2009                   On-line    Active links to course surveys in student portal
Study Design
• Compared Fall 2007 (paper), Spring 2009 (web with email delivery), and Fall 2009 (web with portal delivery)
• Sampled all evaluations of faculty who have been at GGC continuously since Fall 2007 (N = 78)
• Used only items that have remained (near) constant on the evaluation instrument (N = 11); a sketch of this sampling step follows
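As a minimal sketch of the sampling step (not from the presentation; the DataFrame and all column names such as faculty_id and item_id are hypothetical placeholders), the analysis frame might be built like this in pandas:

```python
import pandas as pd

# Hypothetical evaluation records; all names and values are illustrative.
evals = pd.DataFrame({
    "faculty_id": [1, 1, 1, 2, 2, 3],
    "semester":   ["Fall 2007", "Spring 2009", "Fall 2009",
                   "Fall 2007", "Fall 2009", "Fall 2009"],
    "item_id":    ["Q1", "Q1", "Q1", "Q2", "Q2", "Q1"],
    "rating":     [4, 5, 5, 3, 4, 5],
})

# Keep only faculty with evaluations in every studied semester
# (a rough proxy for continuous service since Fall 2007).
semesters = {"Fall 2007", "Spring 2009", "Fall 2009"}
by_faculty = evals.groupby("faculty_id")["semester"].apply(set)
continuous = by_faculty[by_faculty.apply(lambda s: semesters <= s)].index

# Keep only the (near) constant items; placeholder IDs for the 11 items.
constant_items = {f"Q{i}" for i in range(1, 12)}

sample = evals[
    evals["faculty_id"].isin(continuous)
    & evals["item_id"].isin(constant_items)
]
print(sample)
```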
Questions
• Did overall response rates change?
• Did the profile of respondents change?
• Did ratings change in a consistent way?
Did overall response rates change?
• Response rates for the web-based semesters are lower than for the paper-based semester (a sketch of how such a gap might be tested follows).
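As a rough illustration only (the presentation does not report a statistical method for this comparison, and the counts below are hypothetical placeholders), a response-rate gap of this kind could be checked with a two-proportion z-test:

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical counts: responses received and students enrolled.
responses = [950, 610]    # Fall 2007 (paper), Fall 2009 (web) -- placeholders
enrolled  = [1100, 1050]

# Two-sample test of equal response proportions.
stat, pvalue = proportions_ztest(count=responses, nobs=enrolled)
print(f"Paper rate: {responses[0]/enrolled[0]:.1%}, "
      f"Web rate: {responses[1]/enrolled[1]:.1%}, p = {pvalue:.4f}")
```

With real per-semester enrollment and response counts, the same call indicates whether the observed drop exceeds what chance alone would explain.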
Did the profile of respondents change?
[Figure: Expected Grade by Semester]
Did the profile of respondents change?
• Looked at expected grade as reported by each respondent.
• The overall pattern is highly similar across semesters, suggesting the web format did not attract a disproportionate share of unhappy students.
Did ratings change in a consistent way?
• Overall, ratings have shown a steady upward trend for this set of faculty.
• For 10 of the 11 items analyzed, Fall 2009 mean ratings were significantly higher than Fall 2007 mean ratings (see the sketch after this list).
• For one item, ratings dropped in Spring 2009 but rose again in Fall 2009.
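The presentation does not name the test behind "significantly higher"; as a hedged sketch, a per-item Welch's t-test is one plausible way to compare Fall 2007 and Fall 2009 ratings (the arrays below are made-up placeholders, not study data):

```python
from scipy import stats

# Hypothetical ratings (1-5 scale) for one survey item in each semester.
fall07 = [4.1, 3.8, 4.5, 4.0, 3.9, 4.2]   # placeholder ratings
fall09 = [4.6, 4.4, 4.8, 4.5, 4.3, 4.7]

# Welch's t-test: does not assume equal variances across semesters.
t, p = stats.ttest_ind(fall09, fall07, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")  # p < .05 would mark a significant rise
```

Repeating this test for each of the 11 items would yield the kind of per-item comparison summarized above.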
General Conclusions
• Shifting from paper to online evaluations does cause a reduction in response rates.
• The reduction is spread across expected grades and across the spectrum of ratings.
• Overall, our mean evaluations have risen over time.
• Next step: replicate the study with some paper and some online surveys in Spring 2011.