
Presentation Transcript


  1. Assessment Instruments For Use With Student Learning Outcomes
  Dr. Daniel H. Whiteley and Andy Zehner, April 2012

  2. Where We Are in the Cycle

  3. Types of Assessment Measures And Associated Assessment Instruments

  4. Summative & Formative

  5. Summative & Formative

  6. Qualitative vs. Quantitative

  7. Qualitative vs. Quantitative

  8. Direct & Indirect Note: Direct assessment methods are preferred over indirect.

  9. Direct & Indirect

  10. Applied Experiences Assessment
  Learning environments that provide in-the-field, hands-on learning and training experiences can provide valuable assessment information. Applied experiences integrate what the student has learned in their academic studies with “real life” situations and further develop the student as a professional, citizen, and life-long learner.
  Examples of Applied Experiences include:
  • Practicum
  • Service Learning
  • Internship
  • Experiential Learning
  • Field Experience
  • Student-teaching
  • Co-op
  Applied Experiences Assessment Instruments:
  • Journal
  • Videotape of student skills
  • Progress Reports
  • Portfolio
  • Midterm Evaluation
  • Cumulative Report
  • Reflective Papers
  • Performance evaluation by mentor
  • Student evaluation of internship experience
  • Final project/creative project/presentation

  11. Newer Wave of Assessment Instruments
  • Rubrics
  • Learning contracts
  • Observations with documentation
  • Reflective journals
  • Reflective conversations/writings
  • Case studies
  • Student interviews
  • Videotaping

  12. Rubrics
  • Several sample rubrics have been put into the Sample Rubric folder on the SharePoint site. These are examples only, and they do not match the format of the Core Curriculum rubric used by our academic counterparts.
  • The CAS Domains and Dimensions rubric has been put into your Assessment Tools folder. This rubric was modified to match the format of the Core Curriculum and has been used by other institutions as a way to map learning outcomes.
  • Additionally, the CampusLabs presentation “Rubrics 101: A Tool to Assess Learning” has been put into your Assessment Tools folder. It provides valuable information on the use of rubrics as a tool to measure learning directly and objectively.

  13. Rubrics: The Preferred Format

  14. Guidelines for Selecting an Assessment Method and Instrument
  • Select a method that is appropriate for your goals and objectives (the method that will provide the most useful and relevant information).
  • Not all methods work for all areas or are appropriate to all outcomes.
  • Use the information you already have available.
  • Choose an assessment method that allows you to assess the strengths and weaknesses of the program. Effective methods of assessment provide both positive and negative feedback; finding out what is working well is only one goal of your assessment work.
  • Remember, the data you collect must have meaning and value to those who will be asked to make changes based on the findings.

  15. Guidelines for Selecting an Assessment Method and Instrument
  • Use multiple methods to assess each learning outcome. Many outcomes will be difficult to assess using only one measure. The advantages of selecting more than one method include:
    • Multiple measures can assess different components of a complex task.
    • There is no need to try to design a complicated all-purpose method.
    • Greater accuracy and authority are achieved when several methods of assessment produce similar findings.
    • They provide an opportunity to pursue further inquiry when methods contradict each other.

  16. Assessment methods
  • Alumni surveys
  • Content analysis
  • Course-embedded assessment
  • Culminating assignments
  • Curriculum analysis
  • Delphi technique
  • Employer surveys
  • ePortfolio
  • Focus groups
  • Institutional data
  • Matrices
  • Observation
  • Performance assessment
  • Portfolio evaluations
  • Pre/post evaluation
  • Quasi-experiments
  • Reflective essays
  • Rubrics
  • Standardized test instrument
  • Student self-efficacy
  • Surveys
  • Syllabus analysis
  • Transcript analysis

  17. Course grades
  • You give course grades
  • The grades measure the specific outcome
  • The courses are consistent over time
  Suggested for ROTCs

  18. Observation
  • Your observers are experts
  • The setting is controlled
  • The outcome is Ethical Reasoning
  Suggested for CAPS & ODoS Counseling

  19. Rubrics
  • Diverse people will do the assessments
  • You want to enforce consistency
  • You need to document the criteria

  20. Reflective Essays
  • You want to measure cumulative effects
  • Your students will write the essays
  • The outcome can be expressed in an individual’s own words
  • The outcome is Written Communication
  Suggested for OSRR, DRC & HORIZONS

  21. Peer Evaluations
  • The outcome is best measured in the eyes of others
  • The outcome is Leadership and Teamwork
  • You have the opportunity to gather these evaluations (ideally at the close of a seminar or event)
  Suggested for leadership training programs

  22. Pre/Post Testing
  • You conduct group sessions
  • The outcome can be measured as an answer to a question
  • Some students won’t know the answer at the start
  • You can use Clickers
  Suggested for CCO workshops & intramural sports training

  23. Point of Contact Surveys
  • The location is unusual
  • The learning is best measured in the moment
  • You don’t care about getting a valid sample
  • These surveys can be facilitated by:
    • Clickers
    • Campus Labs Baseline w/ iPhone or iPod

  24. Focus Group
  • You are fishing for details
  • The outcome is Creative Thinking
  • The outcome is Oral Communication

  25. Delphi Techniques & Quasi-experiments
  • You have lots of time
  • You are determined to get lots of detail
  • The outcome is Integrative Learning
  • You trust the method more than people
  • Experiments may create ethical concerns

  26. ePortfolios
  • The portfolios exist
  • The portfolios have value
  • Criteria exist for assessing the portfolios
  • Staff capacity exists to review students’ efforts
  Suggested for future assessments

  27. Instrument/Tool Mapping Process

  28. Instrument/Tool Mapping Process
  • Now, review your original Student Learning Outcome mappings. Although you may have indicated several outcomes students take away after participating in your program, you now want to decide which of them you can assess for 2012 (this academic year).
  • Your original program worksheet will be where you retain all of the learning outcomes for your program and participants. You will want to map only those learning outcomes you can assess in 2012. Over time, the worksheet will record the year in which you assess each of the outcomes you have mapped.
  • For each program you originally mapped a Student Learning Outcome for, your SharePoint folder will contain an Assessment Tool spreadsheet that provides a way to indicate which tool you want to use to assess each of the outcomes.
  • Columns A, B, C, and D will be pre-populated with information on each of the Student Learning Outcomes you mapped back to the CAS dimensions. (Refer to the mapping spreadsheet handout #1.)

  29. Instrument/Tool Mapping Process
  • If, after mapping your SLOs, you determine that you can’t assess one of the outcomes, or that you won’t be able to assess it in 2012, remove it from your mapping tool spreadsheet.
  • This mapping tool spreadsheet will become your authoritative source for information on the mapping of your outcomes back to the CAS dimensions, the Purdue Core Competencies, and the Strategic Plan.

  30. Timeline
  • Review your current student learning outcome statements (the ones you created in Phase 2). Make sure the Student Learning Outcomes section indicates only those outcomes you will assess in 2012.
  • Your review will need to be complete by May 25. At that point, and for a period of one week, all information in your folders will be frozen so that work can begin on moving your information into the Tool Mapping worksheet.
  • Your information will be moved over to the Tool Mapping worksheet by June 4, and you will have the entire month of June to complete the information requested in the remaining columns of the Tool Mapping worksheet. All mappings will need to be completed by June 29.
  • At any point along the way, if you have any questions or need assistance, please call or email either Dan or Andy:
    • Dan: 4-7416 dan@purdue.edu
    • Andy: 4-6743 alzehner@purdue.edu

  31. Closing the Loop
