
Assessment Strategies for Vision 2020 grants


Presentation Transcript


  1. Assessment Strategies for Vision 2020 grants (Gwynn Mettetal)

  2. Goals for this workshop: • Discuss different ways to assess outcomes • Help you decide which methods would be best for your project

  3. Your projects? • Who are you? • What sort of Vision 2020 project are you planning? (the two-sentence version)

  4. Definitions: • Assessment—evidence that your project is making a difference • You MUST assess the effectiveness of your Vision 2020 grant to get continued funding! • Lots of strategies possible • Depends on your goals • Depends on your situation

  5. Types of data

  6. Two major types of data • Quantitative (numbers) • Grades, attendance, ratings on a scale, retention rate • Qualitative (words) • Interviews, essays, open-ended survey questions • Both are fine, just different

  7. Some sources of data • Existing data (easiest, already there) • Student records • Archival data • Student work in course • Conventional sources (easy, but must generate) • Behavioral data—journals, library usage • Perceptual data—surveys, focus groups, interviews • Inventive sources (difficult) • Products or performances

  8. Sources of Data

  9. Ethics • Must treat students respectfully • Must protect privacy • Must “do no harm” • Collecting new data (not coursework) from your own students? • Have someone else collect and hold until grades are in • Can’t force them to participate • Can’t take up too much instruction time • Institutional Review Board (IRB) • If planning to publish

  10. Tips • Add power: compare groups! (see the sketch below) • Before and after • Different course units • This semester and last • Two sections with different methods • Your class to that of another instructor • Be realistic: start small
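A minimal sketch of the simplest before-and-after comparison, assuming you have pre- and post-scores for the same students in the same order; the scores and the scipy dependency are illustrative assumptions, not part of the workshop materials:

```python
# Hypothetical before/after comparison for one class (paired scores).
from scipy import stats

pre = [62, 70, 55, 81, 74, 68, 59, 77]    # scores before trying the new strategy
post = [71, 78, 60, 85, 80, 72, 66, 83]   # the same students' scores afterward

t_stat, p_value = stats.ttest_rel(pre, post)  # paired-samples t-test
print(f"mean before = {sum(pre)/len(pre):.1f}, mean after = {sum(post)/len(post):.1f}")
print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")
```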

  11. Can you trust your data?

  12. Definitions: • Validity—does your evidence (data) mean what you think it means? • Example: • Test scores = deep learning? • What if it's just rote memory? • What if students cheated?

  13. Definitions: • Reliability—would you get the same evidence if you collected it again? Or was this just a fluke? • Example: • Test scores = deep learning? • What if you gave again next week and scores were very different?
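As an illustration of the test-retest idea (not part of the original slides), one quick check is to give the same test twice and correlate the two sets of scores; the numbers below are invented:

```python
# Hypothetical test-retest check: same test, two administrations, same students.
from scipy import stats

week1 = [78, 85, 62, 90, 71, 80, 55, 88]
week2 = [75, 88, 60, 92, 70, 78, 58, 85]

r, p = stats.pearsonr(week1, week2)
print(f"test-retest correlation r = {r:.2f}")  # close to 1.0 suggests consistent (reliable) scores
```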

  14. Dilemma • In general, hard to have both. • Real life is messy (valid, not as reliable) • Experiments are controlled (reliable, not as valid) • Solution is . . .

  15. Triangulate! • Get several different types of data • Different sources: • Instructors, students, advisors, records • Different methods: • Surveys, observations, student work samples • Different times: • Start and end of semester, two different classes, two different semesters

  16. See if the data all point to the same conclusion: course evaluations, final project rubric, comparison to last semester's class

  17. Brainstorming: What data could YOU collect?

  18. What does your data mean?

  19. Analyze data—What did you find? • Qualitative analyses: look for themes in words and behaviors • Example: Theme 1: Students understood more abstract concepts after group discussion. (Follow with quotes from student exams and other evidence.)

  20. What did you find? • Quantitative analyses: simple graphs, tables • Simple statistics: means, correlations, t-tests
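A minimal sketch of the "simple statistics" the slide mentions, comparing final-exam scores in two sections taught with different methods; the section data and the scipy dependency are made-up assumptions for illustration:

```python
# Hypothetical comparison of two sections taught with different methods.
from statistics import mean
from scipy import stats

section_a = [72, 85, 90, 66, 78, 81, 74]  # e.g. discussion-based section
section_b = [68, 74, 79, 70, 72, 65, 71]  # e.g. lecture-based section

print(f"mean A = {mean(section_a):.1f}, mean B = {mean(section_b):.1f}")

t_stat, p_value = stats.ttest_ind(section_a, section_b)  # independent-samples t-test
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```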

  21. What did you find? • Focus on practical significance more than on statistical significance
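One common way to put a number on practical significance is an effect size such as Cohen's d; a rough sketch, reusing the made-up section scores from the previous example:

```python
# Rough Cohen's d: difference in means divided by the pooled standard deviation.
from statistics import mean, stdev

section_a = [72, 85, 90, 66, 78, 81, 74]
section_b = [68, 74, 79, 70, 72, 65, 71]

def cohens_d(a, b):
    na, nb = len(a), len(b)
    pooled_sd = (((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2) / (na + nb - 2)) ** 0.5
    return (mean(a) - mean(b)) / pooled_sd

print(f"Cohen's d = {cohens_d(section_a, section_b):.2f}")  # ~0.2 small, ~0.5 medium, ~0.8 large
```

In a small class, a difference that clearly matters to students may not reach p < .05, and a trivial difference can be "significant" in a large one, so report both the effect and the statistic.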

  22. Brainstorming: What would convince YOU?

  23. How can you USE your data?

  24. Take action based on findings • If evidence was good, keep your old strategy • If evidence was weak, tinker to improve your strategy • Plan to assess again, after working with a new group of students • You will need to show how you used your data to get continued Vision 2020 funding!
