
Evaluation and Case Study Review


Presentation Transcript


  1. Evaluation and Case Study Review Dr. Lam TECM 5180

  2. Summative Evaluation vs. Formative Evaluation • Summative evaluation- Assessment OF learning • Formative evaluation- Assessment FOR learning

  3. Summative or Formative? • You have been asked to determine how much money your training program has saved over a six-month period. • You asked trainees to complete a 50-question paper-and-pencil exam covering the learning objectives of your course. • You have been asked to observe trainees doing their jobs and write a report that describes their level of knowledge transfer. • You asked trainees for feedback about the content and delivery of the course and the facilitator. • After six months, you have emailed managers and asked them about each trainee's performance.

  4. Types of Evaluation • Lots of models of evaluation • Kirkpatrick's four levels of evaluation • Stufflebeam's four-step evaluation process • Rossi's five-domain evaluation model • Brinkerhoff's success case method • We'll talk about Kirkpatrick (1994) because: • It's widely accepted • It's easy to grasp

  5. Reactions • What?: Perceptions of the trainees • How?: Questionnaires and feedback forms • Why?: Gives designers insight into training satisfaction, which can be positive or negative • Trainee feedback is relatively quick and easy to obtain, and it is typically inexpensive to analyze

  6. Evaluation instrument examples • See Piskurich pages 274-275

  7. Questionnaires • Open-ended items- allow users to express opinions in their own words • Advantages: elicit unique, open, and honest feedback • Disadvantages: difficult to analyze; trainees often prefer not to fill them out (biased results) • Close-ended items- allow users to express opinions on a predetermined quantitative scale • Advantages: easy to analyze (see the sketch below); fast for trainees to complete • Disadvantages: inhibit unique feedback; don't always provide a full picture
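
A minimal sketch of why close-ended items are easy to analyze. The question labels and 5-point Likert ratings below are invented for illustration; they are not from Piskurich or the course materials:

```python
from statistics import mean
from collections import Counter

# Hypothetical close-ended responses on a 5-point Likert scale
# (1 = strongly disagree, 5 = strongly agree).
responses = {
    "The course content was relevant to my job": [4, 5, 3, 4, 5, 4],
    "The facilitator explained concepts clearly": [5, 5, 4, 3, 4, 5],
}

for question, ratings in responses.items():
    counts = Counter(ratings)
    print(question)
    print(f"  mean = {mean(ratings):.2f}")
    print(f"  distribution = {dict(sorted(counts.items()))}")
```

Open-ended responses, by contrast, would require reading and coding each answer by hand, which is why the slides recommend limiting them to questions that genuinely need free-form feedback.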

  8. Creating close-ended questions • Use a scale that allows for degrees of comparison • Not good: Did you find the course beneficial? Yes or No • Better: On a scale from 1 to 5, how beneficial did you find the course? • Always use the same scale (e.g., a 5-point or 7-point Likert scale) • Construct questions that are grammatically consistent • Develop questions for specific purposes (i.e., don't ask questions if you don't know what you'll do with the result)

  9. Creating open-ended questions • Limit your use of these • Use them to supplement close-ended responses • Reserve these for unique responses • Bad use of open-ended: What did you like about the presentation slides? • Improved: On a scale from 1 to 5, how useful were the slides in supplementing the facilitator's content? • Bad use of close-ended: Rate the following on a scale of 1 to 5, with 1 being strongly disagree and 5 being strongly agree: I would make changes to the delivery of this course. • Improved: What changes would you make to the delivery of the course?

  10. Learning • What?: Measure of increase in knowledge before and after training • How?: Formal assessment; interview or observation • Why?: To ensure your trainees have learned what you set out for them to learn (a pre/post comparison sketch follows below) • Already created if you've designed and developed your course properly (see last week's presentation slides for an assessment overview)
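
A minimal sketch of measuring the increase in knowledge by comparing pre- and post-training assessment scores. The trainee names and scores are hypothetical:

```python
# Hypothetical pre- and post-training assessment scores (percent correct).
pre_scores = {"trainee_a": 55, "trainee_b": 70, "trainee_c": 62}
post_scores = {"trainee_a": 85, "trainee_b": 88, "trainee_c": 74}

# Per-trainee gain: the difference between post- and pre-training scores.
for trainee, pre in pre_scores.items():
    post = post_scores[trainee]
    print(f"{trainee}: pre={pre}, post={post}, gain={post - pre:+d} points")

# Average gain across all trainees, as a rough summary of learning.
avg_gain = sum(post_scores[t] - pre_scores[t] for t in pre_scores) / len(pre_scores)
print(f"average gain: {avg_gain:.1f} points")
```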

  11. Behavior • What?: The extent of applied learning back on the job • How?: Observation and interviews over time; retesting • Why?: To measure the long-term efficacy of your training program • Measuring behavior is difficult and requires the cooperation of managers and others who oversee a trainee's day-to-day work

  12. Piskurich’s “Transfer to the job” Evaluation • Did the training address the requirements of the job? • Were the trainees performing the job requirements competently before the training? • Are the trainees now performing the job requirements competently? • What are the trainees still not doing correctly? • Were there any unintended consequences of the training?

  13. Examples • See Piskurich pages 278-279

  14. Results • What?: The effect of training on the trainee's business or environment • How?: Measured with already-implemented systems; ROI; cost-effectiveness analysis • Why?: To measure the impact training has on the organization (at a macro level) • Difficult to isolate training as a variable

  15. ROI • Return-on-investment • Use ROI to: demonstrate effectiveness; promote importance; suggest refinements; project future costs; measure success • Drawbacks to ROI: can’t compute intangible benefits; can’t measure all variables and data; can be misleading

  16. How to calculate ROI • ROI = Net Benefits / Costs, where net benefits are total benefits minus costs • Benefits can include: increased productivity; greater customer satisfaction; higher-quality work product • Costs can include: design and development costs; ongoing costs; evaluation costs • The hard part is quantifying benefits and costs • Although ROI is a quantifiable metric, there is an interpretive element in calculating it • Therefore, the logic and rationale behind the metric are as important as the metric itself (a worked example follows below)
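
A minimal worked example of the calculation above. All dollar figures are hypothetical:

```python
# Hypothetical figures for a six-month training program.
benefits = 120_000.0   # e.g., estimated value of increased productivity
costs = 80_000.0       # design, development, delivery, and evaluation costs

net_benefits = benefits - costs          # total benefits minus costs
roi = net_benefits / costs               # ROI = Net Benefits / Costs
print(f"ROI = {roi:.2f} ({roi:.0%})")    # prints: ROI = 0.50 (50%)
```

Here an ROI of 50% means each dollar spent on the program returned $1.50 in total benefits, or $0.50 in net benefits. The numbers fall out mechanically; the interpretive work is in deciding what counts as a benefit and how to price it.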

  17. How to determine what evaluations to conduct • Why do I want to evaluate? • What am I going to evaluate? • Who should I involve as part of the evaluation? • How am I going to do the evaluation? • When should I do the evaluation?

  18. Implementing Revisions • As-needed revisions- most common type of revision; reactionary • Planned revisions- less common type of revision; proactive
