
Training Evaluation






Presentation Transcript


  1. Training Evaluation

  2. Training evaluation provides the data needed to demonstrate that training does provide benefits to the company.

  3. What are the differences among: • Training effectiveness • Training outcomes • Training evaluation • Evaluation design

  4. Types of Evaluation • Formative • Summative

  5. Why Evaluate Training Programs? • Objectives • Satisfaction • Benefits • Comparison

  6. Objectives = Foundation • Terminal behavior • Conditions under which the terminal behavior is expected • The standard below which performance is unacceptable • Together, these form the criteria by which the trainee is judged

  7. The Evaluation Process • Conduct a Needs Analysis • Develop Measurable Learning Outcomes and Analyze Transfer of Training • Develop Outcome Measures • Choose an Evaluation Strategy • Plan and Execute the Evaluation

  8. Training Outcomes: Kirkpatrick’s Four-Level Framework of Evaluation Criteria • Level 1 – Reactions: trainee satisfaction (aka affective) • Level 2 – Learning: acquisition of knowledge, skills, attitudes, behavior (aka cognitive) • Level 3 – Behavior: improvement of behavior on the job (aka skills) • Level 4 – Results: business results achieved by trainees

  9. How do you know if your outcomes are good? Good training outcomes need to be: • Relevant • Reliable • Discriminative • Practical

  10. Good Outcomes: Relevance • Criterion relevance – the extent to which training outcomes are related to the learned capabilities emphasized in the training program • Criterion contamination – the extent to which training outcomes measure inappropriate capabilities or are affected by extraneous conditions • Criterion deficiency – the failure to measure training outcomes that were emphasized in the training objectives

  11. Criterion deficiency, relevance, and contamination (diagram): outcomes identified by the needs assessment and included in the training objectives are compared with the outcomes measured in the evaluation. The overlap is relevance; outcomes measured but unrelated to the objectives are contamination; objective-related outcomes left unmeasured are deficiency.

  12. Good Outcomes (continued) • Reliability – the degree to which outcomes can be measured consistently over time • Discrimination – the degree to which trainees’ performance on the outcome reflects true differences in performance • Practicality – the ease with which the outcome measures can be collected

  13. Training Evaluation Practices (chart: percentage of courses using each outcome type)

  14. Evaluation Procedures

  15. Utility

  16. [(Ns)*(T)*(r)*(SDy)*(Zs)]-[(N)*(C)] • Ns = number of applicants selected • T = tenure of selected group in years • r = correlation between predictor and job performance (VALIDITY) • SDy = standard deviation of job performance • Zs = average standard predictor score of selected group • N = number of applicants • C = cost per applicant

  17. [(Nc)*(T)*(r)*(SDy)*(Zs)]-[(N)*(C)] • Nc = number of trainees who complete the program • T = duration of training benefit • r = correlation between training criterion and job performance (VALIDITY) • SDy = standard deviation of job performance • Zs = average standard criterion score of trainees • N = total number of trainees enrolled • C = cost per trainee

  18. Training Costs • Direct • Indirect • Development • Overhead • Compensation for Trainees

  19. For On-the-Job Training: [(Ns)*(T)*(r)*(SDy)*(Zs)]-[(N)*(C)] • Ns = 50 = number of trainees who complete the program • T = 1 = duration of training benefit (years) • r = .50 = correlation between training criterion and job performance (VALIDITY) • SDy = $4,800 = standard deviation of job performance (assume 40% of base pay: $12,000 × .40) • Zs = .80 = average standard criterion score of trainees • N = 100 = total number of trainees enrolled • C = $150 = cost per trainee → (50 × 1 × .50 × 4,800 × .80) − (100 × 150) = 96,000 − 15,000 = $81,000
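The training utility formula and the on-the-job training example above can be sketched as a small Python helper. The function name and parameter names are my own, not from the slides; the formula and the numbers are the ones shown in slides 17 and 19.

```python
# Sketch of the training utility formula from the slides:
#   Utility = (Ns * T * r * SDy * Zs) - (N * C)
# Names below are illustrative, not from the original deck.

def training_utility(n_complete, t_years, validity, sd_y, z_score,
                     n_enrolled, cost_per_trainee):
    """Dollar-value utility of a training program."""
    benefit = n_complete * t_years * validity * sd_y * z_score
    cost = n_enrolled * cost_per_trainee
    return benefit - cost

# The on-the-job training example from slide 19:
utility = training_utility(n_complete=50, t_years=1, validity=0.50,
                           sd_y=4_800, z_score=0.80,
                           n_enrolled=100, cost_per_trainee=150)
print(utility)  # 81000.0
```

Swapping Ns for Nc and the training criterion for the selection predictor gives the selection-utility version of the same calculation from slide 16.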

  20. Experimental Designs

  21. Experimental Designs: Choices • Pretest/posttest • Control Groups

  22. Experimental Designs • 1: 1 group, posttest only • 2: 1 group, pretest/posttest • 3: Pretest/posttest control group • 4: Solomon four-group • 5: Time-series • 6: Nonequivalent control group
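As a minimal illustration of how design 3 (pretest/posttest control group) is typically analyzed, the sketch below compares mean gain scores between a trained group and a control group. The data are entirely hypothetical, invented for illustration; they are not from the slides.

```python
# Hypothetical pretest/posttest scores (design 3: pretest/posttest control group).
trained_pre  = [60, 55, 70, 65, 58]
trained_post = [75, 72, 85, 80, 70]
control_pre  = [62, 57, 68, 64, 59]
control_post = [64, 58, 70, 65, 61]

def mean_gain(pre, post):
    """Average improvement from pretest to posttest."""
    return sum(b - a for a, b in zip(pre, post)) / len(pre)

# The training effect estimate is the trained group's mean gain
# minus the control group's mean gain.
training_effect = mean_gain(trained_pre, trained_post) - mean_gain(control_pre, control_post)
print(round(training_effect, 1))  # 13.2
```

Subtracting the control group's gain is what distinguishes design 3 from design 2 (one group, pretest/posttest): it controls for history, maturation, and testing effects that would otherwise inflate the apparent benefit of training.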

  23. Experimental Designs: Validity • Internal • External

  24. Experimental Designs: Threats to Internal Validity • History • Maturation • Testing • Instrumentation • Regression toward the mean • Differential selection • Experimental mortality • Interactions • Diffusion/imitation of treatments • Compensatory equalization of treatments • Rivalry/desirability of treatments • Demoralization

  25. Experimental Designs: Threats to External Validity • Reactive effect of pretesting • Interaction of selection &amp; treatment • Reactive effects of experimental settings • Multiple-treatment interference

  26. Issues in Training Validity • Training validity • Transfer validity • Intra-organizational validity • Inter-organizational validity
