
What “Counts” as Evidence of Student Learning in Program Assessment?



Presentation Transcript


  1. What “Counts” as Evidence of Student Learning in Program Assessment? Sarah Zappe, Research Assistant, Testing and Assessment Specialist, Schreyer Institute for Teaching Excellence

  2. Workshop Goals To provide information and guidance on the processes of: • Identifying sources of evidence of student learning • Mapping evidence to program outcomes • Developing reports for stakeholders

  3. Definition of Assessment “Assessment is an ongoing process aimed at understanding and improving student learning. It involves making expectations explicit and public; setting appropriate criteria and high standards for learning quality; systematically gathering, analyzing, and interpreting evidence to determine how well performance matches those expectations and standards, and using the resulting information to document, explain, and improve performance.” Angelo, T. A. (1995)

  4. Assessment Loop • Diagram of the assessment cycle, beginning with goals and outcomes (Maki, 2001)

  5. Student Learning Outcomes • Measurable and specific goals for what we want our students to know, feel, or be able to do following the program • Knowledge, skills, and attitudes • Drives all other stages of assessment

  6. University Guidelines for the Internal Review of Academic Programs • Background, purpose, and goals • Specify evaluation areas • Data collection plan • Data collection and analysis • Recommendations

  7. Do we already have data that provides evidence of student learning? • Probably, let’s see… • Direct evidence of student learning • Measures of student performance that demonstrate actual learning • What did students learn and NOT learn? • Indirect evidence of student learning • Measures of perception or demographic indicators that imply learning has occurred

  8. Direct Measures of Student Learning • Capstone projects, senior theses, exhibits • Portfolios • Standardized tests • Concept inventories • Employer/internship ratings of students’ performance Middle States Commission (2003)

  9. Limitations of Direct Evidence • No evidence of why students have learned or not learned • Does not indicate “value-added” • Did students already have the knowledge or skills before completing the program?

  10. Indirect Measures of Student Learning • Focus groups/interviews • Employer surveys • Alumni surveys • Registration/course enrollment information • Department or program review data • Job placement indicators • Graduate school placement rates • Comparisons with other institutions Middle States Commission (2003)

  11. Limitations of Indirect Evidence • Do not evaluate student learning per se • Should not be the only means of assessing outcomes

  12. Does all evidence need to be quantitative? • No… • In fact, good practice in assessment suggests collecting multiple types of information • Both direct and indirect • Both qualitative and quantitative

  13. Quantitative Evidence • Represented numerically • Examples • Scores on tests • Survey scales • Advantages • Ease of collection • Ease of analysis • Ease of making calculations and comparisons (across time or between groups) • Generalizability • Limitations • Often doesn’t answer the question of “why”

  14. Qualitative Evidence • Data represented in narrative or prose format • Examples • Interviews • Focus groups • Open-ended questions on surveys • Advantages • Provides very “rich” information • Limitations • More difficult to analyze and to make direct comparisons • Not generalizable • Methods of ensuring reliability are difficult and time-consuming

  15. Brainstorm Activity • Brainstorm existing types of evidence for your program • Direct evidence • Indirect evidence • What is missing but should be collected? • Discuss these with your table

  16. Isn’t sampling somehow cheating? • No, but… • Sampling should be representative of the population • Population: students in your program • Sample should embody important characteristics of the population • Stratified random sample • Avoid convenience or accidental sampling
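
  A minimal sketch of the stratified random sampling mentioned above, assuming a Python workflow; the roster, the stratifying characteristic (class year), and the sampling fraction are all hypothetical and would be chosen to reflect your own program’s population.

    import random

    # Hypothetical roster: each student record carries a stratum label
    # (class year here); a program might instead stratify by major,
    # track, or another characteristic important to the population.
    roster = [
        {"name": "Student A", "year": "junior"},
        {"name": "Student B", "year": "junior"},
        {"name": "Student C", "year": "senior"},
        {"name": "Student D", "year": "senior"},
        {"name": "Student E", "year": "senior"},
        {"name": "Student F", "year": "sophomore"},
    ]

    def stratified_sample(students, stratum_key, fraction, seed=1):
        """Draw a random sample from each stratum in proportion to its size."""
        random.seed(seed)
        strata = {}
        for student in students:
            strata.setdefault(student[stratum_key], []).append(student)
        sample = []
        for group in strata.values():
            k = max(1, round(len(group) * fraction))  # keep at least one per stratum
            sample.extend(random.sample(group, k))
        return sample

    # Sample roughly half of each class year rather than whoever is convenient.
    print(stratified_sample(roster, "year", fraction=0.5))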

  17. Do Grades Count as Evidence? • Yes! But… • Only if they are linked to learning goals • Score/grade alone does not express the content of what students have learned • Need to define what each score means • Match course assessment to outcomes • Syllabi • Test blueprints

  18. Do Grades Count (Cont.) “If the grades of individual students can be traced directly to their respective competencies in a course, the learning achievements of those students are being assessed in a meaningful fashion.” Middle States Commission (2003)

  19. Embedded Course Assessment • Questions or problems relevant to outcomes are embedded within course assessment • Examples • Specific course projects • Capstone projects • Tests with blueprints matched to outcomes • Advantages • No extra time required of students or faculty • Student motivation is greater • Provides both formative and summative data
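
  A minimal sketch of how embedded items and a test blueprint can make grades traceable to outcomes, in the spirit of the Middle States quotation above; the outcome labels, item numbers, and scores below are hypothetical, and a real blueprint would come from your own syllabi and exams.

    # Hypothetical blueprint: each embedded exam item is tagged with the
    # program outcome it is intended to measure.
    blueprint = {
        "Q1": "communication",
        "Q2": "problem solving",
        "Q3": "problem solving",
        "Q4": "ethical reasoning",
    }

    # One student's item scores as (points earned, points possible).
    item_scores = {"Q1": (8, 10), "Q2": (5, 5), "Q3": (3, 5), "Q4": (9, 10)}

    def outcome_summary(blueprint, item_scores):
        """Aggregate item-level scores into percent performance per outcome."""
        totals = {}
        for item, outcome in blueprint.items():
            earned, possible = item_scores[item]
            e, p = totals.get(outcome, (0, 0))
            totals[outcome] = (e + earned, p + possible)
        return {outcome: round(100 * e / p, 1) for outcome, (e, p) in totals.items()}

    print(outcome_summary(blueprint, item_scores))
    # {'communication': 80.0, 'problem solving': 80.0, 'ethical reasoning': 90.0}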

  20. Linking Outcomes • Diagram relating course assessment, program assessment, and institutional assessment (Bakersfield College, 2006)

  21. Activity: Aligning Courses to Program Outcomes • Using the matrix provided, identify sources of evidence and match to your outcomes. • Evidence embedded in courses • Other evidence

  22. How should we decide what to present in our report? • Consider the stakeholders • External stakeholders • Internal audience • Consider a short and a long form • Get feedback • Sample assessment report

  23. Where can we get help if we need it? • Schreyer Institute for Teaching Excellence • http://www.schreyerinstitute.psu.edu • Office of Institutional Planning and Assessment • http://www.psu.edu/president/pia/index.htm

  24. 7 Common Misperceptions about Assessment • We’re doing just fine without it. • We’re already doing it. • We’re far too busy to do it. • The most important things can’t be measured. • We’d need more staff and money. • They’ll use the results against us. • No one will care about or use what we find. Angel (2005)

  25. Mini-Evaluation of Session • Please complete the mini-evaluation form provided so that we can work on improving OUR efforts! • Thank you for your time!
