
Black Holes & Gaseous Processes: Really Big Mistakes in Assessment :

Black Holes & Gaseous Processes: Really Big Mistakes in Assessment :. Lawrence University November 2008 Susan Hatfield Assessment Director Winona State University SHatfield@winona.edu. Really Big Assessment Mistakes. Assuming it Will Go Away.

nero
Télécharger la présentation

Black Holes & Gaseous Processes: Really Big Mistakes in Assessment :

An Image/Link below is provided (as is) to download presentation Download Policy: Content on the Website is provided to you AS IS for your information and personal use and may not be sold / licensed / shared on other websites without getting consent from its author. Content is provided to you AS IS for your information and personal use only. Download presentation by click this link. While downloading, if for some reason you are not able to download a presentation, the publisher may have deleted the file from their server. During download, if you can't get a presentation, the file might be deleted by the publisher.

E N D

Presentation Transcript


  1. Black Holes & Gaseous Processes: Really Big Mistakes in Assessment. Lawrence University, November 2008. Susan Hatfield, Assessment Director, Winona State University. SHatfield@winona.edu

  2. Really Big Assessment Mistakes

  3. Assuming it Will Go Away

  4. Creating a Culture of Assessment instead of a Culture of Learning

  5. Assessment vs. Learning
  • Instrument Driven vs. Outcome Driven
  • National Norms vs. Targets & Goals
  • Trend Lines vs. Relational
  • Data Collection vs. Analysis
  • "How do we compare?" vs. "What does it mean?"

  6. Still doing assessment like it is 1990

  7. Timeline, 1990–2010: from INDIRECT MEASURES to DIRECT MEASURES

  8. Timeline, 1990–2010: from PROCESS MEASURES to OUTCOME MEASURES

  9. Timeline, 1990–2010: from CLASSROOM ASSESSMENT to PROGRAM ASSESSMENT

  10. Timeline, 1990–2010: from INSTITUTIONAL RESPONSIBILITY to PROGRAM RESPONSIBILITY

  11. Timeline, 1990–2010: from INSTITUTIONAL EFFECTIVENESS to STUDENT LEARNING

  12. Waiting for everyone to get on board

  13. Attitudes toward Assessment: Level of Commitment

  14. Attitudes toward Assessment: Level of Commitment (chart of commitment levels: 70%, 15%, 15% across Hostile, Accepting, Enthusiastic)

  15. Forgetting to Make a Curriculum Map

  16. Program Level Student Learning Outcomes mapped across Course 1–Course 5 (curriculum map grid showing which courses address each outcome at which level; K = Knowledge/Comprehension, A = Application/Analysis, S = Synthesis/Evaluation)

  17. A second example curriculum map grid (Courses 1–5; K = Knowledge/Comprehension, A = Application/Analysis, S = Synthesis/Evaluation)

  18. A third example curriculum map grid (Courses 1–5; same K/A/S legend)

  19. A fourth example curriculum map grid (Courses 1–5; same K/A/S legend)

  20. Trying to do too much

  21. Program Level Student Learning Outcomes across Course 1–Course 5 (curriculum map grid, repeated on slides 21–24; K = Knowledge/Comprehension, A = Application/Analysis, S = Synthesis/Evaluation)

  25. Not Aligning Campus Processes with Assessment

  26. Process Alignment
  • Course Proposal - learning outcomes?
  • Program Proposal - curriculum map?
  • Program Review - assessment data?
  • Master Syllabi - program outcomes?
  • Reporting Structure - assessment office?
  • Promotion, Renewal, Tenure - rewarded?

  27. Choosing Assessment Methods before Defining Outcomes

  28. A PORTFOLIO, with criteria checked off: organization √, presentation √, # of submissions √, multimedia √, navigation √, captions √

  29. Outcomes: Theories & Theorists, Research Methods, Research Writing, Data Analysis. Assignment: TEST.

  30. Outcomes: Theories & Theorists, Research Methods, Research Writing, Data Analysis. Assignment: PORTFOLIO, with criteria checked against the outcomes: appropriateness √, understanding √, application √, accuracy √, detail √, language √.

  31. Too Many Program Level Learning Outcomes

  32. Learning Outcomes
  • NOT a compilation of your course-level student learning outcomes
  • NOT intended to represent everything that your students learn in the program

  33. Exertion without Intention

  34. Intention without Exertion

  35. Intention and Exertion

  36. Poorly Written Student Learning Outcomes

  37. Student Learning Outcomes • Students should be able to comprehend, interpret, analyze and critically evaluate material in a variety of written and visual formats.

  38. Student Learning Outcomes • Students will demonstrate creative and evaluative thinking in the analysis of theoretical and practical issues in the areas of politics, the economy, and the environment.

  39. Student Learning Outcomes • FORMAT: Students should be able to <<action verb>> <<something>>

  40. Inappropriate Program Level Learning Outcomes

  41. Action verbs by level (annotation: Lower division course outcomes)
  KNOWLEDGE: Cite, Count, Define, Draw, Identify, List, Name, Point, Quote, Read, Recite, Record, Repeat, Select, State, Tabulate, Tell, Trace, Underline
  COMPREHENSION: Associate, Classify, Compare, Compute, Contrast, Differentiate, Discuss, Distinguish, Estimate, Explain, Express, Extrapolate, Interpolate, Locate, Predict, Report, Restate, Review, Tell, Translate
  APPLICATION: Apply, Calculate, Classify, Demonstrate, Determine, Dramatize, Employ, Examine, Illustrate, Interpret, Locate, Operate, Order, Practice, Report, Restructure, Schedule, Sketch, Solve, Translate, Use, Write
  ANALYSIS: Analyze, Appraise, Calculate, Categorize, Classify, Compare, Debate, Diagram, Differentiate, Distinguish, Examine, Experiment, Inspect, Inventory, Question, Separate, Summarize, Test
  SYNTHESIS: Arrange, Assemble, Collect, Compose, Construct, Create, Design, Formulate, Integrate, Manage, Organize, Plan, Prepare, Prescribe, Produce, Propose, Specify, Synthesize, Write
  EVALUATION: Appraise, Assess, Choose, Compare, Criticize, Determine, Estimate, Evaluate, Grade, Judge, Measure, Rank, Rate, Recommend, Revise, Score, Select, Standardize, Test, Validate

  42. The same action verb lists by level as on the previous slide (annotation: Upper division Course / Program outcomes)

  43. Student Learning Outcomes • RULE OF THUMB: If you have more than one action verb, keep the one that represents the highest order of thinking.

  44. Relying on Indirect Assessment Measures of Learning

  45. Direct Measures of Learning
  • Capstone experience
  • Standardized tests
  • Performance on national licensure, certification, or professional exams
  • Locally developed tests
  • Essay questions blind-scored by faculty
  • Juried review of senior projects
  • Externally reviewed exhibitions and performances
  • Evaluation of internships based upon program learning outcomes

  46. Indirect Measures of Learning
  • Alumni, employer, and student surveys (including satisfaction surveys)
  • Exit interviews of graduates and focus groups
  • Graduate follow-up studies
  • Retention and transfer studies
  • Length of time to degree
  • ACT scores
  • Graduation and transfer rates
  • Job placement rates

  47. Timeline, 1990–2010, revisited: from INDIRECT MEASURES to DIRECT MEASURES

  48. Making Assessment the Responsibility of One Person

  49. Allowing Each Faculty Member to “Own” the Program Level Outcomes

  50. The "Speaking" outcome split across teacher1–teacher5, each grading different criteria: eye contact, gestures, volume, sources, transitions, style, rate, poise, examples, verbal variety, appearance, evidence, conclusion, organization, attention getter
