
Setting Program and Learning Outcomes


Presentation Transcript


  1. Setting Program and Learning Outcomes Carteret Community College Learning Outcomes Workshop, June 18-20, 2007

  2. We Are Not Doing This for SACS • SACS is our accreditation agency • They value outcome assessment – but why? • Good institutions are concerned with the following things: • Student learning in the classroom • Student experiences at the institution • Being good stewards of our resources • Running effective and efficient institutions

  3. At the end of the day, we want to be able to say… • Our students are learning to the best of their ability • We are doing an effective job of teaching • Students are better off once they leave us than they were when they came to us • The experiences they had in student services were beneficial • They are achieving exactly what we expected them to achieve • We have not placed barriers in front of them

  4. What Else Would We Like To Say? • Are we serving our community well? • Are we meeting our mission? • What else?

  5. Let’s Review

  6. Why Outcomes? • Outcomes are program-specific • They measure the effect of classroom activities and services provided. • Outcomes represent a new way of thinking • Outcomes have become widely accepted by our various publics • They are here to stay • We used to measure ourselves by our activities

  7. Outcomes are… • Driven by the mission. • Related to overall program goals. • Specific to the teachings/activities of your program/course. • Determined by faculty and front-line staff. • Measured carefully and specifically.

  8. Different Types of Outcomes • Learning Outcomes (can be at course, program or institutional level) • Program Outcomes • Administrative Outcomes

  9. Definitions and Examples • Learning Outcomes: • The changes in knowledge, skills, attitudes, awareness, condition, position, etc. that occur as a result of the learning that takes place in the classroom. These are direct benefits to students. • Examples: general learning skills (e.g., improved writing and speaking abilities), ability to apply learning to the work environment (e.g., demonstrating skills in co-op), program-specific skills developed or enhanced (e.g., taking blood pressure).

  10. Definitions and Examples • Program Outcomes: • The benefits that result from the completion of an entire program or series of courses. Are there benefits for students who earn the AAS in welding versus those who take only a few courses? If so, what are they? • Typical examples: licensure pass rates, employment rates, acceptance into 4-year schools, lifelong learning, contributions to society and the profession, etc.

  11. Definitions and Examples • Administrative Outcomes • Units/programs want to improve services or approach an old problem in a new way. • They want to become more efficient and effective. • Typical examples include: • All faculty will attend one professional meeting annually so they can stay up to date in their field.

  12. What is an Outcome Objective? • A short-term, measurable, specific activity with a time limit or timeline for completion, tied to a specific outcome • Objectives measure outcomes and are used to show progress toward goals • They specify who will do what, under what condition, by what standard, and within what time period (e.g., by the end of the spring semester, 80% of ENG 111 students will write an analytical essay that meets the department’s standard)

  13. How to Set Outcome Objectives • There’s no magic number • e.g. 80% or 90% • What is reasonable? • What can you afford? • What realistically can your staff accomplish? • What percent shows you’re not committed and what percent shows you’re naïve?

  14. Why is This Hard? • Because it is education • Because the best results may not happen for years • Because we are so busy doing what we are doing… we forget why we are doing it

  15. Challenges • Identifying and defining outcomes is the easy part. • The devil is in the details. • How do we track it, where does it all go, and how do we score it, compile it, and turn it into a comprehensive report? • How do we “demonstrate improvement in institutional quality”?

  16. Things to Remember • Outcome measurement must be initiated from the unit/department level (promotes ownership of the process). • Measure only what you are teaching or facilitating. • Measure what is “important” to you or your program. • Be selective (2-3 outcomes only for a course, a select list for programs and institutional outcomes). • Put as much time into “thinking through” the tracking process as you do into the definition of outcomes. • Spend the time up front in planning and the process will flow smoothly. • It will prove to be energy well spent.

  17. Five Dimensions of Good Assessment Source: Linda Suskie, Middle States Commission on Higher Education

  18. Good Outcome Assessment is Used • The results are used to inform important decisions on important goals. • What is the department most concerned with? • Are there goals for which the department/unit needs data (proof)? • Don’t create data and a document that sits on the shelf – it should be used again and again.

  19. Used, cont. • Good assessment is planned and purposeful. • Who has been/will be involved in the decision about what to assess? • When will assessment occur (best time, best course)? • What will you do with the results, or what results do you want to see? • What should be measured first? If you add more variables each year, what should come next?

  20. Used, cont. • Good assessment should focus on clear and important goals • Stay away from the vague • It is clear that English should measure writing skills • It is clear that Speech should measure oral communication • What is it clear that you should measure?

  21. Used, cont. • Active participation of stakeholders • Get input from the right people • State agencies, accrediting agencies, other colleges with programs like yours • Your faculty, staff and students • Advisory committees and employers of graduates • Anyone else?

  22. Used, cont. • Assess the teaching-learning process as well as outcomes • What is the process and why is it important? • Methods, classroom strategies, online and hybrid formats • Be concerned with more than the content – through which processes do students learn best?

  23. Used, cont. • Results communicated widely and transparently • What do you do if students are not learning? • What if your results are not good? Is that bad? • Who needs to know the results? • What does it mean to be transparent?

  24. Used, cont. • Results used fairly, ethically and responsibly • Faculty should not be penalized • Outcome data makes people territorial • What does it mean to use outcome results ethically and responsibly?

  25. Good Assessment is Cost Effective • Cost effective – efficient and economical (especially with time) • Don’t kill your faculty • What are they hired to do? • Do as little assessment as possible while still obtaining meaningful, good data • What can you afford (both time and money)?

  26. Cost Effective, cont. • Focus on clear and important goals • Pick 3-6 goals at the maximum for which to create outcome assessment • All the courses in your department/program contain hundreds of objectives. • Which ones are most important?

  27. Cost Effective, cont. • Start with what you have • What do you have? • Assignments and tests? • Can you work with those? (examples) • Group projects • Clinical hours • Co-op and internship experiences

  28. Cost Effective, cont. • Simple • Don’t create an elaborate assessment process unless you have enough staff and resources to carry it out • If you create surveys or assessment tools, keep them simple • If you conduct focus groups, get as much out of them as possible with as few questions as possible

  29. Cost Effective, cont. • Realistic expectations • What should your benchmarks be? • Will it all “go down” like you want it to? • What will happen if faculty have trouble with the process? • Should you start with training?

  30. Good Assessment is Accurate and Truthful • Flows from clear and accurate goals • Gives us an accurate picture of what is really happening

  31. Accurate and Truthful, cont. • Represents a balanced sample of key goals, including thinking skills • Measures what is important • Measures what is considered “critical” for students in the field • Measures more than what students can memorize – their processing and thinking skills

  32. Accurate and Truthful, cont. • Use a variety of approaches, including direct evidence of student learning • Tests and classroom assessments • Surveys of graduates and employers • Focus groups • Portfolio assessments • Direct observations and skill-based assessments

  33. Accurate and Truthful, cont. • Recognizes diverse approaches to teaching and learning and is developed thoughtfully • Faculty who lecture, use small groups, or use interactive approaches – will it work for all? • Take the time to make sure the assessment will give you the results you want.

  34. Accurate and Truthful, cont. • Perpetual works in progress • No one does this well the first time • Health programs have been doing it the longest and can help others • It takes most programs a couple of tries before they create good assessment data • Do you have that kind of time? • Our attitude should be “we are always growing and improving.”

  35. Good Assessment is Valued • Assessment results/efforts are recognized/honored • Who should value the results? • If they are valued, what should our actions be? • What about senior administration (part of a process)? • If we uncover needs, what should happen?

  36. Valued, cont. • Innovation, risk-taking and efforts to improve teaching and learning are recognized and honored • If faculty must do assessment and are willing to try new things to improve results, we should honor and value their efforts. • Assessment will have a direct effect on a center for teaching and learning.

  37. Valued, cont. • Supported with appropriate resources: time, guidance, support and feedback • What do we need to do this well? • What should happen to the results?

  38. Clear and Important Goals • Have clear and important standards for acceptable and exemplary student performance • Do we test for minimal standards? • What is considered good performance? • Are there clear standards? • Should we compare ourselves with other schools?

  39. An Example – English 111 • Major Competencies • Develop competency in writing brief essays of three types: descriptive, narrative and analytical • Develop competency in writing Standard English, including practicing grammar and usage • Explore multiculturalism through analytical reading.

  40. ENG 111 and Good Assessment • Good Outcome Assessment is Used • Good Outcome Assessment is Cost Effective • Good Outcome Assessment is Accurate and Truthful • Good Outcome Assessment is Valued • Good Outcome Assessment Comes from Clear and Important Goals

  41. But Let’s Make Sure We Cover Our Bases with SACS: What Do We Need to Do?

  42. SACS Requirements • Core Requirement 2.5 • The institution engages in ongoing, integrated, and institution-wide research-based planning and evaluation processes that 1) incorporate a systematic review of institutional mission, goals, and outcomes; 2) result in continuing improvement in institutional quality; and 3) demonstrate that the institution is effectively accomplishing its mission (Institutional Effectiveness).

  43. Comprehensive Standard 3.3.1 • The institution identifies expected outcomes, assesses whether it achieves these outcomes, and provides evidence of improvement based on analysis of the results in each of the following areas: • 3.3.1.1 educational programs, to include student learning outcomes • 3.3.1.2 administrative support services • 3.3.1.3 educational support services • 3.3.1.4 research within its educational mission, if appropriate • 3.3.1.5 community/public service within its educational mission, if appropriate

  44. Comprehensive Standard 3.5.1 • The institution identifies college-level general education competencies and demonstrates that graduates have attained them.

  45. What We Need to Have in Place • Institution-wide, research-based planning and evaluation processes • A systematic review of institutional mission, goals and outcomes • Processes that result in continuing improvement in institutional quality • The ability to demonstrate that the institution is effectively accomplishing its mission

  46. What We Need to Have in Place, cont. • Identifies outcomes in educational programs (program and individual level), administrative support areas and educational support services. • Assesses whether we achieve these outcomes, and provides evidence of improvement based on analysis of the results in each area. • Establishes general education competencies and demonstrates that graduates have attained them.

  47. Your Program Review • Each program will: • Establish learning and program outcomes • Assess those outcomes • Identify benchmarks or objectives for those outcomes • Identify strengths and weaknesses • Make recommendations for changes • Use the results to make improvements (strategies for change)

  48. You Have • Institutional learning outcomes • Will need outcomes for the rest of the college • Have a program review model (evaluation process) • Need to make sure ESS and Admin Services are on board • After this workshop, you will have program and learning outcomes down to the program level. • Make sure Gen Ed assessments are included

  49. Let’s Begin
