
Program assessment- where to start?

Assessment of Student Learning: A sustainable model for continuous improvement in program majors. Barbara Masi, Ph.D., Director of Assessment, Arts, Sciences, and Engineering, University of Rochester.





Presentation Transcript


  1. Assessment of Student Learning: A sustainable model for continuous improvement in program majors. Barbara Masi, Ph.D., Director of Assessment, Arts, Sciences, and Engineering, University of Rochester

  2. Program assessment- where to start? • How can department majors create simple, streamlined program assessment plans that measure student learning? Steps in program assessment: • Define program learning outcomes. • Align curriculum and learning outcomes. • Choose assessment methods for each program learning outcome. • Plan for implementation and data review. • Determine whether action is needed as a result of data review. • Take action as needed. • Begin the next continuous improvement cycle.

  3. Let’s first define a few assessment terms and methods as well as how to quickly shape a strong program assessment plan for a sample learning outcome…

  4. Program learning outcomes • Learning outcome- a statement that describes what a learner should know, be able to do, or develop an attitude about, often by the end of a subject or program. It should be stated so that it can be measured. • Some common areas for program learning outcomes: • Knowledge depth and/or breadth areas • Using methods and tools of the discipline • Critical thinking and analytic reasoning • Quantitative reasoning • Research, experimentation • Decision making • Oral and written communication • Self and Society, Global Citizenship • Ethics and Responsibility • Leadership and Teamwork • Aesthetic Understanding and Creativity • Development of Personal Potential

  5. Sample program learning outcomes for a Psychology program • Theory and Content of Psychology: Students will demonstrate familiarity with the major concepts, theoretical perspectives, empirical findings, and historical trends in psychology. • Research Methods in Psychology: Students will understand and apply basic research methods in psychology, including research design, data analysis, and interpretation. • Critical Thinking Skills in Psychology: Students will respect and use critical and creative thinking, skeptical inquiry, and, when possible, the scientific approach to solve problems related to behavior and mental processes. • Application of Psychology: Students will understand and apply psychological principles to personal, social, and organizational issues. • Values in Psychology: Students will be able to weigh evidence, tolerate ambiguity, act ethically, and reflect other values that are the underpinnings of psychology as a science. • Information and Technological Literacy: Students will demonstrate information competence and the ability to use computers and other technology for many purposes. • Communication Skills: Students will be able to communicate effectively in a variety of formats. • From APA

  6. Sample program learning outcomes for a History program Upon successful completion of the History major, students will be able to: Recognize the processes by which societies, cultures, and institutions change over time. Describe particular historical developments and explain their wider context. Critically read, analyze, and synthesize primary and secondary sources. Use methods of narrative and analysis appropriately for communicating historical phenomena. Identify the various contexts that shape the construction and use of historical sources and knowledge. From UC Merced

  7. Sample program learning outcomes for an Arts program The content of Media Arts courses is designed to guide students to: Understand and acquire (through hands-on projects) the principal attributes and mechanics of art technique(s) in the medium of choice. Enhance visual, aural, and physical perception and cognition through the acquisition of art technique. Demonstrate how to communicate critically the aesthetic, historical, cultural, social, and contemporary aspects of the medium (or media) they are studying. Demonstrate appreciation for the multicultural environment that typifies contemporary art production. Demonstrate understanding of the relationship between the physical aspects of works of art and aesthetic principles in the analysis of works and in project creation. Understand principles that guide artistic creativity and be prepared to apply them imaginatively, as well as practically. Express ideas through an art medium. Be able to apply their knowledge of art technique and practice outside of the classroom. Edited from UC Merced

  8. Sample program learning outcomes for a chemistry program 1. A student majoring in chemistry will demonstrate his/her mastery of the four principal disciplines: analytical, organic, inorganic, and physical chemistry. 2. A student majoring in chemistry will demonstrate excellent critical thinking and problem solving abilities. S/he will be able to integrate chemical concepts and ideas learned in lecture courses with skills learned in laboratories to formulate hypotheses, propose and perform experiments, collect data, compile and interpret results, and draw reasonable and logical conclusions. 3. A student majoring in chemistry will demonstrate technical mastery of fundamental wet laboratory skills, use proper laboratory safety protocols, and demonstrate proficiency in using computers to solve chemical problems. 4. A student majoring in chemistry will apply his/her experience and knowledge of the discipline in the successful conduct of at least 100 hours of undergraduate research. 5. A student majoring in chemistry will demonstrate effective scientific communication skills, both written and oral. Students will be able to write reports and present the results of their own scientific work or the work of another scientist. From U. San Diego Chemistry and Biochem Dept.

  9. Assessment Measures • Assessment measure- a measure of student performance for a particular learning outcome using a particular mode of measurement. • Direct measure- an “objective” measure of student performance for a learning outcome. E.g., a test, or a portfolio review by a team of experts. • Indirect measure- a measure that can be an indicator of student performance, but provides an impression of performance rather than an “objective” or expert measure. E.g., a senior survey self-assessment of learning, or jobs data. • External measure- a measure that gathers data from external constituents or from external testing. E.g., alumni, employers, professional associations, and advisory boards are external constituents; the GRE, CLA, and professional society tests are external tests. • Internal measure- a measure that gathers data from internal College constituents or from internal pre-graduation testing. E.g., students are internal constituents; classroom tests are internal tests. • Evaluation measure- a measure of the quality of the “machinery” of a learning system. E.g., a teaching quality survey.

  10. Mixed assessment measures for measuring a learning outcome “Triangulation” of assessment methods means that at least 3 measures are used to measure a learning outcome. At least 1 of the 3 methods should be a direct method. A mix of internal and external measures is also of value, but not always possible. Sample table for measuring graduation writing ability:
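As a hedged illustration of the triangulation rule above (the measure names and dictionary fields are invented for this sketch, not taken from the presentation's sample table), the check can be expressed as:

```python
# Hypothetical sketch: a measure set for one learning outcome is
# "triangulated" if it contains at least 3 measures, at least one
# of which is a direct measure.

def is_triangulated(measures):
    """True if there are >= 3 measures and at least one is direct."""
    return len(measures) >= 3 and any(m["type"] == "direct" for m in measures)

# Illustrative measure set for a writing outcome.
writing_measures = [
    {"name": "senior survey self-assessment", "type": "indirect"},
    {"name": "alumni survey",                 "type": "indirect"},
    {"name": "faculty-scored writing sample", "type": "direct"},
]
```

With only the two indirect surveys, the set would fail the check; adding the direct faculty-scored sample satisfies it.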

  11. Choosing assessment methods: qualitative vs. quantitative • Quantitative vs. qualitative methods- which to choose? Rule of thumb: • Quantitative data are easier and faster to interpret, but you need a student N of 15 or more, and the data have limits to their value. • (e.g., a student provides a numerical rating of learning outcome achievement on a self-assessment senior survey) • Qualitative data are richer and of great value in understanding the student experience of learning for a given learning outcome; however, they are more time-consuming to gather and harder to interpret, and are best suited to a student N < 15. • (e.g., a senior exit interview) • So choose a mix of each.

  12. Assessment methods – a short list

  13. Matrices are key for organizing a program assessment plan! • Three types of assessment plan matrices (using templates provided) are essential for EACH program learning outcome. • Program learning outcome vs. curriculum matrix • For each program learning outcome: assessment method implementation plan • For each program learning outcome: assessment method data report • Initial creation of templates is time-consuming, but you won’t be sorry once they’re done!

  14. Program learning outcome vs. curriculum matrix This matrix shows which curriculum subjects are key to producing a given program learning outcome. If the assessment data gathered for an outcome show a need for improvement, one knows which subjects to target for improvement! One might even note whether an outcome is introduced, reinforced, or mastered in a given subject.
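One minimal way to represent such a matrix in software (all course codes, outcome names, and "I"/"R"/"M" markers below are hypothetical examples, not from any real program):

```python
# Illustrative outcome-vs-curriculum matrix. Markers: "I" = introduced,
# "R" = reinforced, "M" = mastered. All names are invented placeholders.
curriculum_matrix = {
    "written communication": {"HIS 101": "I", "HIS 250": "R", "HIS 490": "M"},
    "source analysis":       {"HIS 101": "I", "HIS 250": "R"},
}

def courses_for_outcome(matrix, outcome):
    """Return the subjects to target for improvement if assessment
    data for this outcome show a need for action."""
    return sorted(matrix.get(outcome, {}))
```

Looking up an outcome then yields exactly the list of subjects a curriculum committee would review.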

  15. Assessment plan implementation matrix • For EACH important program learning outcome, create an assessment plan template that shows: • Type of assessment method • Benchmark criterion for adequate student performance • Where in the curriculum it will be implemented • Who is responsible for implementing the method and how often • Who is responsible for reviewing the method • See the attached templates file for program major use! • Sample table for assessment of the program learning outcome of writing:

  16. Assessment plan matrix for data reporting • Once data are gathered for a given program learning outcome from a variety of assessment methods, they should be reviewed at least every other year by program officers. • A simple format for organizing assessment data in a table for EACH program learning outcome permits a curriculum committee to quickly review results. The table should show: • Assessment method and when it was implemented • Data gathered • Benchmark set for acceptable student/alumni performance for that assessment method • Determination of whether the gathered data met the benchmark and whether action is needed • If 2 out of 3 methods show the benchmark was not met, then action is required:
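The 2-of-3 decision rule at the end of the list above can be sketched as a small function (the function name and `threshold` parameter are assumptions made for this illustration):

```python
# Hypothetical sketch of the data-review rule: action is required when
# at least `threshold` of the assessment methods show their benchmark
# was not met (default threshold of 2, per the 2-of-3 rule above).

def action_required(benchmark_met, threshold=2):
    """benchmark_met is a list of booleans, one per assessment method:
    True if that method's benchmark was met. Returns True when the
    number of failing methods reaches the threshold."""
    failures = sum(1 for met in benchmark_met if not met)
    return failures >= threshold
```

For example, if the direct portfolio score met its benchmark but the senior and alumni surveys did not, the rule fires and the committee would plan an improvement action.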

  17. Using scoring rubrics as an assessment method Begin with the AAC&U VALUE rubrics for major shared learning outcomes: critical thinking, writing, quantitative reasoning, global citizenship, analytical reasoning, and oral presentation (there are 15 outcomes in total). See the AAC&U website. Also take a look at professional society sites for rubrics tailored to a given discipline. A rubric is designed to contain several criteria that describe a given learning outcome. Decide, as a faculty, which criteria apply to the department major. Also, one does not need to use all criteria in a rubric each year for scoring student work.

  18. How to use scoring rubrics to score student work • Faculty must be trained in rubric use. • Short in-house training: all read 2 papers together and use, for example, a writing rubric to score the work. The group then discusses the results, the assumptions made in choosing scores, and the ease/difficulty of using the rubric levels and wording. • Choosing the number of student works to score. Rule of thumb: 15 out of 25 students; in a group of 10 students, score all of them. • Scoring representative student work: at minimum, 2 faculty, trained in use of the scoring rubric, should score student work for EACH criterion chosen for that ability. Three faculty is better. • An easy-to-use online inter-rater reliability calculator can be used if no one is around to run the data through a statistics package such as SPSS (e.g., www.med-ed-online.org/rating/reliability.html). • Scores are tabulated, and an average score for EACH criterion is calculated. • Inter-rater reliability is also calculated. If IR < 0.4, inter-rater reliability is low. If it is low, there are 2 possible reasons: faculty training in use of the rubric still needs work, and/or the student assignment is an unreliable or invalid prompt for assessing this ability.
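One common inter-rater reliability statistic for two raters is Cohen's kappa; the sketch below is an illustration of that statistic, not necessarily the formula the linked online calculator or SPSS workflow uses:

```python
from collections import Counter

# Hedged sketch: Cohen's kappa for two raters assigning categorical
# rubric scores (e.g., rubric levels 1-4) to the same set of student works.

def cohens_kappa(rater_a, rater_b):
    """Return Cohen's kappa for two equal-length lists of scores."""
    if len(rater_a) != len(rater_b) or not rater_a:
        raise ValueError("need two equal-length, non-empty score lists")
    n = len(rater_a)
    # Observed agreement: fraction of works scored identically.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement by chance, from each rater's score frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[k] * freq_b[k] for k in freq_a) / n ** 2
    if expected == 1:  # both raters used a single identical category
        return 1.0
    return (observed - expected) / (1 - expected)
```

The 0.4 cutoff mentioned above could then be applied to the returned value to flag low inter-rater reliability.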

  19. Now, on to how assessment data is used in continuous improvement of each learning outcome….

  20. A closed loop student learning assessment/evaluation system for a program major • Using writing as an example, indirect and direct measures that assess writing would be gathered, the data reviewed, and action taken if the data warrant it. • Additional loops are designed for each key learning outcome, but don’t try to do too many at once!

  21. More examples. Global citizenship: Example of a program learning outcome assessment plan • A program may already have the needed curricular pieces. Here are some examples: • Service learning in courses may be a curriculum development on the rise; student experiences in these courses provide relevant material (e.g., Management/Economics students in local businesses, Engineering/Design students in a public housing project). • Several programs have connected internships and academic requirements, some with a service learning component (e.g., a Political Science internship with a World Bank project in the Dominican Republic).

  22. Global citizenship ability: Measuring success Measuring success in subject-level experiments in service learning: student work should be assessed against key learning outcomes related to global citizenship and professional readiness via student self-assessment surveys and faculty ratings of student work (see the attached sample student survey for a service learning subject). An indirect measure at the college level can be used by programs for measuring this outcome: enhance senior and alumni surveys to permit indirect measures of global citizenship (see the attached revised senior and alumni survey tools). A direct measure at the program level: enhance program-level direct measures of global citizenship in senior subjects through faculty scoring of student portfolio work for this outcome (see the attached example of scoring the AAC&U Citizenship rubric for this outcome; faculty should choose a few criteria from the citizenship rubric for scoring student work).

  23. Global citizenship: using a range of indirect and direct assessment measures to determine “success” • The work of implementing assessment methods is shared by the institution and faculty. • While faculty maintain their database of department major data, institutional research (IR) implements institution-wide senior, alumni, and employer surveys and distributes pertinent data reports on each outcome to department majors. • A sample plan for assessment of student abilities of global citizenship for program majors (responsibility for implementation in parentheses) is below.

  24. The key is a package of shared, available assessment tools • Tools for organizing an assessment plan. • E.g., assessment measure templates for each program learning outcome are included here. • Scoring rubrics for key learning outcomes that gen ed and program major faculty can adopt/adapt. • E.g., critical thinking rubric options are included here. The LEAP VALUE rubrics are the best source for other key outcomes. • Institutional surveys tied to key learning outcomes. • E.g., alumni, senior, employer, and internship supervisor surveys are included with this presentation. • Subject survey templates that are tied to key learning outcomes. • E.g., a service learning student survey and a teamwork peer evaluation survey are included with this presentation. • A UofR College-based shared website where the above tools are available to all is essential.
