
The Missing Link: Development of Programmatic Outcomes


Presentation Transcript


  1. The Missing Link: Development of Programmatic Outcomes Christopher Meseke, Ph.D. Park University

  2. What is Assessment? • “As a whole, assessment is a framework for focusing faculty attention on student learning and for provoking meaningful discussions of program objectives, curricular organization, pedagogy, and student development” (Allen, 2004). • Assessment is the quality control of the educational process

  3. Assessment is -- first and foremost -- about student learning.

  4. Assessment Levels • Student • Classroom • Course • Program • College • Division • Institution

  5. Common Reactions to Assessment Initiatives • Ignoring it • Bribing someone else to do it • Complaining about it • Losing sleep over it • Sitting down and writing it

  6. Levels of assessment: quality control and assurance • Institutional/Programmatic • National/State Licensure Exams • Certain academic programs (Nursing, Engineering, Physical Therapy, Social Work, etc.) • Accreditation Council for Graduate Medical Education (ACGME) • Higher Learning Commission of the North Central Association of Colleges and Schools

  7. Assessment is important to the accreditation process • The Higher Learning Commission of the North Central Association of Colleges and Schools (HLC-NCA) requires it. • Assessment of student learning provides evidence at multiple levels: course, program, and institutional. • Assessment of student learning includes multiple direct and indirect measures of student learning. • Assessment results inform improvements in curriculum, pedagogy, instructional resources, and student services.

  8. Interpretation of the HLC-NCA Statement on Assessment of Student Learning, 2003. • It is the faculty members who must take ownership of the assessment process. Their buy-in, ownership, and implementation are directly related to both the mission of the institution and the attitudes, knowledge, and skills required for students to successfully complete the program requirements.

  9. Goals of Program Assessment Planning • Measure student learning, not teaching, curricular content, processes, or resources • Measure things that are important to us • Involve faculty in development of the assessment program, processes, and instruments • Use multiple measures to produce valid and reliable data (triangulation) • Make it manageable and affordable

  10. Goals of Program Assessment Planning • Map assessment data back to the curriculum for improvement • Produce annual reports of assessment outcomes that show the data, the interpretation of the data, and improvement plans where indicated • Use the same assessment procedures for all campuses and sister programs

  11. Modified Hatfield Assessment Model [diagram]: institutional goals (meta-competencies) branch into competencies; each competency is evidenced by learning outcomes, which serve as objective indicators of the degree to which the competency is achieved and are in turn tied to specific learning events.
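
The hierarchy in the model maps naturally onto nested data. A minimal Python sketch of that structure follows; all competency, outcome, and course names are hypothetical placeholders, not part of the Hatfield model itself.

```python
# A sketch of the Hatfield-style hierarchy: an institutional goal
# (meta-competency) contains competencies, each evidenced by measurable
# learning outcomes that are tied to concrete learning events.
# All names below are invented for illustration.
hatfield_model = {
    "meta_competency": "Scientific literacy",        # institutional goal
    "competencies": [
        {
            "name": "Apply the scientific method",
            "outcomes": [                             # objective indicators
                {
                    "statement": "Design a controlled experiment",
                    "learning_events": ["BI101 lab 3", "BI305 project"],
                },
            ],
        },
    ],
}
```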

  12. Programmatic Competencies • A measurable, complex behavioral statement may be written for each Key Component (meta-competency) and competency. • Statements reflect what we think we can actually measure in an educational environment considering time and resources.

  13. Program Assessment Learning Competencies • Develop a set of Learning Goals (Meta-competencies) which represent the attributes of the graduate • General Education (Competencies) • Discipline (Competencies) • Course (Outcomes)

  14. Bloom's Taxonomy verbs by cognitive level. The lower-order levels (Knowledge, Comprehension, Application) suit lower-division course outcomes:
  • Knowledge: Cite, Count, Define, Draw, Identify, List, Name, Point, Quote, Read, Recite, Record, Repeat, Select, State, Tabulate, Tell, Trace, Underline
  • Comprehension: Associate, Classify, Compare, Compute, Contrast, Differentiate, Discuss, Distinguish, Estimate, Explain, Express, Extrapolate, Interpolate, Locate, Predict, Report, Restate, Review, Tell, Translate
  • Application: Apply, Calculate, Classify, Demonstrate, Determine, Dramatize, Employ, Examine, Illustrate, Interpret, Locate, Operate, Order, Practice, Report, Restructure, Schedule, Sketch, Solve, Translate, Use, Write
  • Analysis: Analyze, Appraise, Calculate, Categorize, Classify, Compare, Debate, Diagram, Differentiate, Distinguish, Examine, Experiment, Identify, Inspect, Inventory, Question, Separate, Summarize, Test
  • Synthesis: Arrange, Assemble, Collect, Compose, Construct, Create, Design, Formulate, Integrate, Manage, Organize, Plan, Prepare, Prescribe, Produce, Propose, Specify, Synthesize, Write
  • Evaluation: Appraise, Assess, Choose, Compare, Criticize, Determine, Estimate, Evaluate, Grade, Judge, Measure, Rank, Rate, Recommend, Revise, Score, Select, Standardize, Test, Validate

  15. The same verb table, with the higher-order levels (Analysis, Synthesis, Evaluation) emphasized; these verbs suit upper-division course and program outcomes.
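
One practical use of the verb table is a lookup that flags which Bloom level an outcome statement's leading verb belongs to. A minimal Python sketch, using an abbreviated version of the table above; the remaining verbs would be added the same way, and note that some verbs appear at several levels, so this simple lookup returns the first match.

```python
# Abbreviated Bloom's-verb table from the slide; extend each set with
# the remaining verbs as needed. Some verbs appear at several levels
# (e.g., "compare"); the first match wins here.
BLOOMS_VERBS = {
    "knowledge":     {"cite", "define", "list", "name", "state"},
    "comprehension": {"associate", "discuss", "explain", "restate"},
    "application":   {"apply", "demonstrate", "illustrate", "solve"},
    "analysis":      {"analyze", "categorize", "differentiate", "examine"},
    "synthesis":     {"arrange", "compose", "design", "formulate"},
    "evaluation":    {"appraise", "evaluate", "judge", "recommend"},
}

HIGHER_ORDER = {"analysis", "synthesis", "evaluation"}

def bloom_level(outcome: str) -> str | None:
    """Return the Bloom level of the outcome's leading verb, if known."""
    verb = outcome.split()[0].lower()
    for level, verbs in BLOOMS_VERBS.items():
        if verb in verbs:
            return level
    return None

# Upper-division course/program outcomes should favor higher-order verbs.
level = bloom_level("Evaluate competing experimental designs")
print(level, level in HIGHER_ORDER)   # evaluation True
```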

  16. An Example of Departmental/Programmatic Competencies • Students will be able to: • Demonstrate biological knowledge appropriate for the course level. • Demonstrate a working knowledge of the scientific method. • Demonstrate the ability to communicate scientific concepts and findings in both oral and written formats. • Apply interdisciplinary knowledge to the biological sciences. • Demonstrate an awareness of ethical issues in the life sciences.

  17. Mapping Competencies to Courses

  18. Mapping Outcomes to Courses [table]: a grid with the program competencies as rows and Courses 1-5 as columns; an "x" marks each course in which a competency is addressed.

  19. Mapping Outcomes to Courses [table]: a second grid of the same form, with a different pattern of "x" marks.
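
A grid like this is straightforward to represent in code, which also makes coverage gaps easy to spot. A minimal Python sketch, with hypothetical competency labels and course numbers:

```python
# Curriculum map: rows are program competencies, columns are courses;
# an "x" in the slides corresponds to set membership here.
# Competency labels and course numbers are hypothetical.
curriculum_map = {
    "Scientific method":         {"BI101", "BI210", "BI490"},
    "Written communication":     {"BI210", "BI305"},
    "Interdisciplinary breadth": {"BI305"},
    "Ethical awareness":         set(),   # not yet mapped to any course
}

courses = ["BI101", "BI210", "BI305", "BI490"]

# Print the grid, then flag competencies no course addresses.
print("Competency".ljust(26) + " ".join(c.ljust(6) for c in courses))
for comp, taught_in in curriculum_map.items():
    row = " ".join(("x" if c in taught_in else "-").ljust(6) for c in courses)
    print(comp.ljust(26) + row)

gaps = [c for c, taught_in in curriculum_map.items() if not taught_in]
print("Unmapped competencies:", gaps)   # ['Ethical awareness']
```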

  20. Triangulation of Assessment Strategies: Direct Measures (Pick 2) • Licensure Scores • GRE/GMAT/MCAT/other • Departmental Exit Exams • Portfolio • Core Assessment • Capstone Experience • Others

  21. Triangulation of Assessment Strategies: Indirect Measures (Pick 1) • Advisory Boards • Senior Survey • Employer/Professional School Survey • Focus Groups/Interviews • Other
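
The pick-2-direct, pick-1-indirect rule from these two slides can be enforced as a simple check. A sketch using measure names drawn from the lists above; the plan structure itself is invented for illustration.

```python
# Direct and indirect measures from the two slides above.
DIRECT = {"licensure scores", "departmental exit exam", "portfolio",
          "core assessment", "capstone experience"}
INDIRECT = {"advisory board", "senior survey",
            "employer/professional school survey", "focus group/interview"}

def triangulated(measures: set[str]) -> bool:
    """True when a plan has at least 2 direct and 1 indirect measure."""
    return len(measures & DIRECT) >= 2 and len(measures & INDIRECT) >= 1

plan = {"portfolio", "capstone experience", "senior survey"}
print(triangulated(plan))   # True
```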

  22. Three-Year Assessment Plan • All departments/programs should be assessed on three-year cycles • Not all department/program competencies can or should be assessed at once • Focus on 2-3 competencies at most • Establish meaningful criteria • At the end of the three-year cycle, examine the data for meaningful trends • Make changes only after the three-year cycle, and only if needed
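
One way to picture the rotation is to chunk the full competency list into cycles of two or three. A minimal sketch, with hypothetical competency names:

```python
# Rotate through all program competencies a few at a time.
# Competency names are hypothetical.
competencies = ["Scientific method", "Communication",
                "Interdisciplinary breadth", "Ethics", "Content knowledge"]

CYCLE_SIZE = 2   # focus on 2-3 competencies at most per cycle
cycles = [competencies[i:i + CYCLE_SIZE]
          for i in range(0, len(competencies), CYCLE_SIZE)]

for n, comps in enumerate(cycles, start=1):
    print(f"Cycle {n} (years {3 * n - 2}-{3 * n}): {', '.join(comps)}")
# Cycle 1 (years 1-3): Scientific method, Communication
# Cycle 2 (years 4-6): Interdisciplinary breadth, Ethics
# Cycle 3 (years 7-9): Content knowledge
```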

  23. An example of a three-year assessment cycle

  24. Summary • Choose an assessment model • Identify (develop) programmatic competencies • Write program-level competencies based upon the institutional meta-competencies • Establish key performance outcomes (KPOs) based upon the competencies • Identify 3 measures for each competency (2 direct, 1 indirect) and set success criteria (these may differ across university units) • Develop data-gathering and reporting mechanisms and templates • Develop close-the-loop mechanisms
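
Taken together, the summary steps amount to one record per competency: its KPOs, three measures, a success criterion, and a place to log close-the-loop actions. A sketch of such a template follows; every field value is hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class CompetencyPlan:
    """One entry of a program assessment plan, per the summary steps."""
    competency: str
    kpos: list[str]                 # key performance outcomes
    direct_measures: list[str]      # pick 2
    indirect_measure: str           # pick 1
    success_criterion: str          # may differ across university units
    improvement_actions: list[str] = field(default_factory=list)  # close the loop

plan = CompetencyPlan(
    competency="Communicate scientific findings",
    kpos=["Write a lab report meeting rubric level 3 or higher"],
    direct_measures=["portfolio", "capstone experience"],
    indirect_measure="senior survey",
    success_criterion="80% of seniors reach rubric level 3 or higher",
)
```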

  25. Big Mistakes in Assessment • Assuming that it will go away • Trying to do too much, too soon • Expecting to get it right the first time • Not considering implementation issues when creating plans • Borrowing plans and methods without acculturation

  26. Big Mistakes in Assessment • Demanding statistical research standards • Doing it for accreditation instead of improvement • Confusing assessment with student learning • Making assessment the responsibility of one individual • Assuming collecting data is doing assessment

  27. Thank you • A special thanks to Dr. Susan Hatfield for her gracious input. • Questions?
