
Considerations for Assessing Program Objectives




Presentation Transcript


  1. Considerations for Assessing Program Objectives
  October 2011
  Malcolm LeMay, Director of Operations, College of Business

  2. Assessment of Learning – Lessons Learned
  • Considerations and approaches
  • Focus on accreditation requirements and expectations
  • Minimize burden on faculty while ensuring participation
  • Close the loop from assessment to improvement
  • Ensure value-added activities and focus on tracking/reporting relevant information
  • Use technology to reduce confusion, ensure consistency and improve the process

  3. Process for Assessment of Student Learning
  • Define goals and objectives
  • Align curricula with goals
  • Identify instruments and measures to assess learning
  • Collect, analyze and disseminate assessment information
  • Use assessment information for continuous improvement
  In other words…
  • What will our students learn in our program? What are our expectations?
  • How and when will they learn it?
  • How will we know whether they have learned it?
  • What will we do if they have not learned it?
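The five steps above amount to a simple data model: objectives carry an expectation, assessment fills in an observed result, and the final step flags any shortfall. A minimal sketch, assuming hypothetical objective codes and percentage targets (none of these names come from the actual COB system):

```python
# Illustrative sketch only: the assessment loop as data.
# Objective codes, statements and thresholds are hypothetical.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Objective:
    """One program objective with an expected and an observed achievement rate."""
    code: str                          # e.g. "PO 1.1" (hypothetical)
    statement: str
    target_pct: float                  # step 1: what do we expect?
    observed_pct: Optional[float] = None  # step 4: filled in after assessment

    def gap(self) -> Optional[float]:
        """How far observed achievement falls short of the expectation."""
        if self.observed_pct is None:
            return None
        return max(0.0, self.target_pct - self.observed_pct)


def needs_action(objectives: List[Objective]) -> List[str]:
    """Step 5: flag objectives where students have not met expectations."""
    return [o.code for o in objectives if (o.gap() or 0.0) > 0.0]


po11 = Objective("PO 1.1", "Recognize entrepreneurial opportunities", 80.0, 72.0)
po12 = Objective("PO 1.2", "Evaluate venture potential", 80.0, 85.0)
print(needs_action([po11, po12]))  # only PO 1.1 falls short of its target
```

The point of the sketch is the last function: "what will we do if they have not learned it?" only has an answer if the data structure records both the expectation and the measured result.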

  4. Concept of Accreditation Maturity Model
  Based on the Capability Maturity Model:
  • Initial – ad hoc
  • Repeatable practices
  • Practices defined
  • Managed and measured
  • Optimizing

  5. Accreditation Maturity Model
  • Developed interpretation of AMM levels for all AACSB standards
  • Training during faculty retreat 18 months prior to accreditation visit
  • Conducted self-assessment on AMM levels
  • Results:
    • Increased faculty awareness and buy-in
    • Process gaps identified
    • Improved understanding of expectations

  6. Applying AMM for Assessment
  Standard 16 – Learning Goals
  • Adapting expectations to the school’s mission and cultural circumstances, the school specifies learning goals and demonstrates achievement of learning goals.
  • Learning goals address:
    • General knowledge and skills (e.g., communications, problem-solving, ethical reasoning)
    • Management or discipline-specific knowledge and skills (entrepreneurial perspective, professional behavior)
  • Basis for judgment:
    • Demonstrate that students meet the learning goals
    • If learning goals are not being met, demonstrate efforts to eliminate the discrepancy

  7. Applying AMM for Assessment
  • Level 1 (initial) – General objectives exist and are measured through traditional course evaluation
  • Level 2 (repeatable) – Program objectives exist; assessment of learning relies on indirect measures
  • Level 3 (defined) – Program objectives are integrated with curriculum development; assessment relies on direct measures
  • Level 4 (managed) – Assessment process generates quantitative and qualitative data; improvements are planned and implemented
  • Level 5 (optimizing) – Assessment is an ongoing process aimed at continuous improvement of programs, curricula and course delivery; processes are periodically reviewed and improved
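Slide 5 mentioned a faculty self-assessment on AMM levels that surfaced process gaps. A minimal sketch of how such a self-assessment could be tallied, assuming hypothetical standard names and ratings (the real assessment covered all AACSB standards, not the three invented here):

```python
# Hypothetical sketch of tallying a faculty self-assessment on AMM levels.
# Standard names and ratings below are made up for illustration.

AMM_LEVELS = {1: "initial", 2: "repeatable", 3: "defined",
              4: "managed", 5: "optimizing"}

# One rating (1-5) per standard, as collected during the retreat exercise.
self_assessment = {
    "Standard 16 - Learning Goals": 3,
    "Standard 15 - Management of Curricula": 2,
    "Standard 17 - Undergraduate Skills": 4,
}


def process_gaps(ratings, target=4):
    """Return the standards rated below the target maturity level."""
    return sorted(s for s, lvl in ratings.items() if lvl < target)


for std in process_gaps(self_assessment):
    lvl = self_assessment[std]
    print(f"{std}: level {lvl} ({AMM_LEVELS[lvl]})")
```

Setting `target=4` mirrors the "target level 4 for all programs" goal from the closing slide; anything below it is a candidate gap to work on before the accreditation visit.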

  8. Faculty Involvement in Assurance of Learning Process
  [Diagram: OSU Vision/Mission → COB Vision/Mission → Program Goals #1, #2 → Program Objectives (1.1, 1.2, 2.1, 2.2) → Courses]
  Assessment approaches:
  • Direct assessment of course outcomes mapped to program objectives
  • Direct assessment of program objectives through course-embedded measures (e.g., exams, projects, papers)
  • Direct assessment through stand-alone performance (MBA oral exams)
  • Indirect assessment of program objectives (student surveys)
  Results (per objective): Exceeded – xx%, Met – xx%, Below – xx%
  Responses/Actions: Changes to curriculum, pedagogy, coverage, prerequisites
  Impact on program offerings: New courses, changes to sequencing, adjustments to program objectives and course outcomes
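The results box on this slide reports each objective as Exceeded / Met / Below percentages. A sketch of how those bands could be computed from course-embedded scores, assuming illustrative cutoffs (the actual rubric thresholds are not stated in the slides):

```python
# Sketch (assumed rubric): classify each student's score on a
# course-embedded measure into Exceeded / Met / Below bands,
# then report the percentages shown in the results box.
# The 70/90 cutoffs are illustrative, not the school's actual rubric.

def band(score, met=70, exceeded=90):
    """Map one score to a performance band."""
    if score >= exceeded:
        return "Exceeded"
    if score >= met:
        return "Met"
    return "Below"


def result_percentages(scores):
    """Aggregate a list of scores into Exceeded/Met/Below percentages."""
    counts = {"Exceeded": 0, "Met": 0, "Below": 0}
    for s in scores:
        counts[band(s)] += 1
    n = len(scores)
    return {k: round(100 * v / n, 1) for k, v in counts.items()}


print(result_percentages([95, 88, 72, 65, 91, 78]))
# {'Exceeded': 33.3, 'Met': 50.0, 'Below': 16.7}
```

Keeping the banding in one place (rather than recomputed per spreadsheet) is one concrete way the "use technology" slide's consistency goal can be met.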

  9. Mapping Core Courses to Program Objectives

  10. Mapping Course Outcomes to Objectives
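The two mapping slides above are essentially a coverage matrix: each course outcome maps to one or more program objectives, and any objective with no mapped outcome is a curriculum gap. A minimal sketch with hypothetical course and objective codes (only BA-260.6 and PO 1.1 appear in the deck; the rest are invented):

```python
# Hypothetical mapping of course outcomes to program objectives.
# Only "BA-260.6" -> "PO 1.1" comes from the slides; other codes are made up.

course_outcome_map = {
    "BA-260.6": ["PO 1.1"],           # venture screening -> opportunity recognition
    "BA-310.2": ["PO 1.2", "PO 2.1"],
    "BA-390.4": ["PO 2.2"],
}


def coverage(mapping, objectives):
    """For each program objective, list the course outcomes that assess it."""
    cov = {po: [] for po in objectives}
    for outcome, pos in mapping.items():
        for po in pos:
            cov.setdefault(po, []).append(outcome)
    return cov


def unassessed(mapping, objectives):
    """Objectives with no mapped course outcome -- a curriculum gap to close."""
    cov = coverage(mapping, objectives)
    return sorted(po for po in objectives if not cov[po])


all_pos = ["PO 1.1", "PO 1.2", "PO 2.1", "PO 2.2", "PO 2.3"]
print(unassessed(course_outcome_map, all_pos))  # ['PO 2.3']
```

Running the gap check whenever the curriculum map changes is a cheap way to keep the mapping exercise from drifting out of date between accreditation cycles.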

  11. Example (Business Administration)
  • Program Goal #1: Students will identify, assess and shape entrepreneurial opportunities in a variety of contexts.
  • Program Objective 1.1: Recognize entrepreneurial opportunities for new business ventures and evaluate their potential for business success.
  • Course Outcome (BA-260.6): Apply venture opportunity screening techniques to an actual start-up idea, and subsequently develop and prepare a feasibility plan.
  • Course Outcome Assessment: Measured by evaluating team projects requiring a presentation analyzing the feasibility of a new business concept.

  12. Measurement Methods
  Direct and course-embedded:
  • Individual/team presentations
  • Oral exams
  • Mock interview reviews
  • Group collaboration assignments
  • Simulation results
  • Essay exam questions
  • Course entrance exams
  • MBA business plan competition
  Indirect:
  • Exit surveys
  • Recruiter feedback

  13. Use of Technology
  • Simplify collection, analysis and reporting
  • Ensure consistency
  • Improve overall awareness of learning and student success
  • Ease burden on faculty
  • Reduce reliance on Excel spreadsheets

  14. To the Cloud – COB course information in Force.com

  15. Moving Forward with Program Assessment
  • Target level 4 for all programs
  • Ongoing:
    • Involve faculty in program mapping and analysis
    • Use technology to reduce support requirements, encourage consistency and ease demands on faculty
    • Improve reporting and follow-up to “close the loop”
  • Keep in mind the “big picture” – continuous improvement
  • Aspire to be “accreditation-ready”

  16. Backup slides

  17. Bloom’s Taxonomy of Learning Domains
  • Knowledge: Recall data or information.
  • Comprehension: Understand the meaning, translation, interpolation, and interpretation of instructions and problems.
  • Application: Use a concept in a new situation or unprompted use of an abstraction.
  • Analysis: Separate material or concepts into component parts so that the organizational structure may be understood.
  • Synthesis: Build a structure or pattern from diverse elements; put parts together to form a whole, with emphasis on creating a new meaning or structure.
  • Evaluation: Make judgments about the value of ideas or materials.
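Course outcomes are often tagged to Bloom levels by their leading verb ("Apply…", "Evaluate…"). A crude sketch of that tagging as a keyword lookup, purely for illustration; real outcome review is a faculty judgment call, and the verb list below is a small invented sample:

```python
# Illustrative only: tag a course-outcome statement with a Bloom level
# based on its leading verb. The verb table is a small, made-up sample.

BLOOM_VERBS = {
    "recall": "Knowledge", "define": "Knowledge", "list": "Knowledge",
    "explain": "Comprehension", "interpret": "Comprehension",
    "apply": "Application", "use": "Application",
    "analyze": "Analysis", "compare": "Analysis",
    "design": "Synthesis", "develop": "Synthesis",
    "evaluate": "Evaluation", "judge": "Evaluation",
}


def bloom_level(outcome: str) -> str:
    """Return the Bloom level suggested by the outcome's first verb."""
    verb = outcome.split()[0].lower().rstrip(",.")
    return BLOOM_VERBS.get(verb, "Unclassified")


print(bloom_level("Apply venture opportunity screening techniques"))  # Application
print(bloom_level("Evaluate their potential for business success"))   # Evaluation
```

Note how the BA-260.6 outcome from slide 11 ("Apply venture opportunity screening techniques…") lands at the Application level, which is consistent with it being assessed through a hands-on team project rather than a recall exam.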
