
Evaluating HRD Programs

Presentation Transcript


  1. Evaluating HRD Programs (Chapter 7, HRD 3e). Contributed by Wells Doty, Ed.D., Clemson University.

  2. Effectiveness • The degree to which a training program (or other HRD program) achieves its intended purpose. • Measures are relative to some starting point. • Measures how well the desired goal is achieved.

  3. HRD Evaluation Textbook definition: “The systematic collection of descriptive and judgmental information necessary to make effective training decisions related to the selection, adoption, value, and modification of various instructional activities.”

  4. In Other Words… Are we training: • the right people • the right “stuff” • the right way • with the right materials • at the right time?

  5. Evaluation Needs • Both descriptive and judgmental information are needed. • Objective and subjective data. • Information gathered according to a plan and in a desired format. • Gathered to provide decision-making information.

  6. Purposes of Evaluation • Determine whether the program is meeting the intended objectives. • Identify strengths and weaknesses. • Determine cost-benefit ratio. • Identify who benefited most or least. • Determine future participants. • Provide information for improving HRD programs.

  7. Purposes of Evaluation – 2 • Reinforce major points to be made. • Gather marketing information. • Determine if the training program is appropriate. • Establish a management database.

  8. Evaluation Bottom Line • Is HRD a revenue contributor or a revenue user? • Is HRD credible to line and upper-level managers? • Are benefits of HRD readily evident to all?

  9. How Often are HRD Evaluations Conducted? • Not often enough!!! • Frequently, only end-of-course participant reactions are collected. • Transfer to the workplace is evaluated less frequently.

  10. Why HRD Evaluations are Rare • Reluctance to have HRD programs evaluated. • Evaluation requires expertise and resources. • Factors other than HRD can cause performance improvements, e.g.: • Economy • Equipment • Policies, etc.

  11. Need for HRD Evaluation • Shows the value of HRD. • Provides metrics for HRD efficiency. • Demonstrates value-added approach for HRD. • Demonstrates accountability for HRD activities. • Everyone else has it… why not HRD?

  12. Make or Buy Evaluation • “I bought it, therefore it is good.” • “Since it’s good, I don’t need to post-test.” • Who says it’s: • Appropriate? • Effective? • Timely? • Transferable to the workplace?

  13. Evolution of Evaluation Efforts • Anecdotal approach: Talk to other users. • Try before buy: Borrow and use samples. • Analytical approach: Match research data to training needs. • Holistic approach: Look at overall HRD process, as well as individual training.

  14. Models and Frameworks of Evaluation • Table 7-1 lists nine frameworks for evaluation. • The most popular is that of D. Kirkpatrick: • Reaction • Learning • Job Behavior • Results

  15. Kirkpatrick’s Four Levels • Reaction • Focuses on trainees’ reactions. • Learning • Did they learn what they were supposed to? • Job Behavior • Was it used on the job? • Results • Did it improve the organization’s effectiveness?

  16. Issues Concerning Kirkpatrick’s Framework • Most organizations don’t evaluate at all four levels. • Focuses only on post-training. • Doesn’t treat inter-stage improvements. • WHAT ARE YOUR THOUGHTS?

  17. Other Frameworks/Models – 1 • CIPP: Context, Input, Process, Product • CIRO: Context, Input, Reaction, Outcome • Brinkerhoff: • Goal setting • Program design • Program implementation • Immediate outcomes • Usage outcomes • Impacts and worth

  18. Other Frameworks/Models – 2 • Kraiger, Ford, & Salas: • Cognitive outcomes • Skill-based outcomes • Affective outcomes • Phillips: • Reaction • Learning • Applied learning on the job • Business results • ROI

  19. A Suggested Framework – 1 • Reaction • Did trainees like the training? • Did the training seem useful? • Learning • How much did they learn? • Behavior • What behavior change occurred?

  20. A Suggested Framework – 2 • Results • What were the tangible outcomes? • What was the return on investment (ROI)? • What was the contribution to the organization?

  21. Data Collection for HRD Evaluation Possible methods: • Interviews • Questionnaires • Direct observation • Written tests • Simulation/Performance tests • Archival performance information

  22. Interviews • Advantages: flexible; opportunity for clarification; depth possible; personal contact. • Limitations: high reactive effects; high cost; face-to-face threat potential; labor intensive; trained observers needed.

  23. Questionnaires • Advantages: low cost to administer; honesty increased; anonymity possible; respondent sets the pace; variety of options. • Limitations: possible inaccurate data; response conditions not controlled; respondents set varying paces; uncontrolled return rate.

  24. Direct Observation • Advantages: non-threatening; excellent way to measure behavior change. • Limitations: possibly disruptive; reactive effects are possible; may be unreliable; trained observers needed.

  25. Written Tests • Advantages: low purchase cost; readily scored; quickly processed; easily administered; wide sampling possible. • Limitations: may be threatening; possibly no relation to job performance; measures only cognitive learning; relies on norms; concern for racial/ethnic bias.

  26. Simulation/Performance Tests • Advantages: reliable; objective; close relation to job performance; includes cognitive, psychomotor, and affective domains. • Limitations: time consuming; simulations often difficult to create; high costs to develop and use.

  27. Archival Performance Data • Advantages: reliable; objective; job-based; easy to review; minimal reactive effects. • Limitations: criteria for keeping/discarding records; information system discrepancies; indirect; not always usable; records prepared for other purposes.

  28. Choosing Data Collection Methods • Reliability • Consistency of results, and freedom from collection method bias and error. • Validity • Does the device measure what we want to measure? • Practicality • Does it make sense in terms of the resources used to get the data?
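To make the reliability criterion concrete, here is a minimal sketch (plain Python, all scores hypothetical) that estimates test-retest reliability as the correlation between two administrations of the same written test:

```python
# Hypothetical test-retest reliability check: the same 8 trainees take the
# same written test twice; a high correlation suggests consistent results.
from statistics import correlation  # available in Python 3.10+

first_try  = [72, 85, 90, 64, 78, 88, 70, 95]
second_try = [70, 88, 91, 60, 80, 85, 73, 94]

r = correlation(first_try, second_try)  # Pearson's r
print(f"Test-retest reliability estimate: r = {r:.2f}")
```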

  29. Type of Data Used/Needed • Individual performance • System-wide performance • Economic

  30. Individual Performance Data • Individual knowledge • Individual behaviors • Examples: • Test scores • Performance quantity, quality, and timeliness • Attendance records • Attitudes

  31. System-Wide Performance Data • Productivity • Scrap/rework rates • Customer satisfaction levels • On-time performance levels • Quality rates and improvement rates

  32. Economic Data • Profits • Product liability claims • Avoidance of penalties • Market share • Competitive position • Return on Investment (ROI) • Financial utility calculations

  33. Use of Self-Report Data • Most common method • Pre-training and post-training data • Problems: • Mono-method bias • Desire to be consistent between tests • Socially desirable responses • Response shift bias: • Trainees adjust expectations to training

  34. Research Design Specifies in advance: • the expected results of the study. • the methods of data collection to be used. • how the data will be analyzed.

  35. Research Design Issues • Pretest and Posttest • Shows what the training has accomplished. • Helps eliminate pretest knowledge bias. • Control Group • Compares the performance of the trained group against that of a similar untrained group.

  36. Recommended Research Design • Pretest and posttest with control group. • Whenever possible: • Randomly assign individuals to the training group and the control group to minimize bias. • Use a “time-series” approach to data collection to verify that performance improvement is due to training.
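A minimal sketch of this recommended design, with simulated (hypothetical) scores and an assumed training effect, showing random assignment and the trained-versus-control comparison of pre/post gains:

```python
import random
from statistics import mean

# Hypothetical roster of 20 employees with pretest scores already collected.
employees = [{"id": i, "pre": random.gauss(70, 8)} for i in range(20)]

# Random assignment minimizes selection bias between the two groups.
random.shuffle(employees)
trained, control = employees[:10], employees[10:]

# Simulated posttest scores: we assume training adds ~6 points on average.
for e in trained:
    e["post"] = e["pre"] + random.gauss(6, 3)   # trained group
for e in control:
    e["post"] = e["pre"] + random.gauss(0, 3)   # untrained control group

def avg_gain(group):
    return mean(e["post"] - e["pre"] for e in group)

# The training effect is the difference in average gain between groups.
effect = avg_gain(trained) - avg_gain(control)
print(f"Estimated training effect: {effect:.1f} points")
```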

  37. Ethical Issues Concerning Evaluation Research • Confidentiality • Informed consent • Withholding training from control groups • Use of deception • Pressure to produce positive results

  38. Assessing the Impact of HRD • Money is the language of business. • You MUST talk dollars, not HRD jargon. • No one (except maybe you) cares about “the effectiveness of training interventions as measured by an analysis of formal pretest, posttest control group data.”

  39. HRD Program Assessment • HRD programs and training are investments. • Line managers often see HR and HRD as costs, i.e., revenue users, not revenue producers. • You must prove your worth to the organization – • Or you’ll have to find another organization….

  40. Two Basic Methods for Assessing Financial Impact • Evaluation of training costs • Utility analysis

  41. Evaluation of Training Costs • Cost-benefit analysis • Compares cost of training to benefits gained such as attitudes, reduction in accidents, reduction in employee sick-days, etc. • Cost-effectiveness analysis • Focuses on increases in quality, reduction in scrap/rework, productivity, etc.

  42. Return on Investment • Return on investment = Results/Costs

  43. Types of Training Costs • Direct costs • Indirect costs • Development costs • Overhead costs • Compensation for participants

  44. Direct Costs • Instructor • Base pay • Fringe benefits • Travel and per diem • Materials • Classroom and audiovisual equipment • Travel • Food and refreshments

  45. Indirect Costs • Training management • Clerical/Administrative • Postal/shipping, telephone, computers, etc. • Pre- and post-learning materials • Other overhead costs

  46. Development Costs • Fee to purchase program • Costs to tailor program to organization • Instructor training costs

  47. Overhead Costs • General organization support • Top management participation • Utilities, facilities • General and administrative costs, such as HRM

  48. Compensation for Participants • Participants’ salary and benefits for time away from job • Travel, lodging and per-diem costs

  49. Measuring Benefits • Change in quality per unit measured in dollars • Reduction in scrap/rework measured in dollar cost of labor and materials • Reduction in preventable accidents measured in dollars • ROI = Benefits/Training costs
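A worked ROI sketch in Python, with every dollar figure hypothetical, totaling the cost categories from slides 43–48 and benefit measures like those above:

```python
# Hypothetical cost and benefit figures for one training program, in dollars.
costs = {
    "direct": 18_000,        # instructor pay, materials, classroom, travel
    "indirect": 6_000,       # admin support, shipping, pre/post materials
    "development": 10_000,   # program purchase and tailoring
    "overhead": 4_000,       # facilities, general organization support
    "compensation": 12_000,  # participants' salaries while off the job
}
benefits = {
    "quality_improvement": 35_000,
    "scrap_rework_reduction": 20_000,
    "accident_reduction": 15_000,
}

total_costs = sum(costs.values())        # $50,000
total_benefits = sum(benefits.values())  # $70,000
roi = total_benefits / total_costs       # ROI = Benefits / Training costs
print(f"Costs ${total_costs:,}; benefits ${total_benefits:,}; ROI = {roi:.2f}")
```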

  50. Utility Analysis • Uses a statistical approach to support claims of training effectiveness: • N = Number of trainees • T = Length of time benefits are expected to last • dt = True performance difference resulting from training (in standard deviation units) • SDy = Dollar value of untrained job performance (in standard deviation units) • C = Cost of training • U = (N)(T)(dt)(SDy) – C
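Plugging numbers into the utility formula gives a sketch like the one below; every value is assumed purely for illustration:

```python
# Hypothetical utility analysis: U = (N)(T)(dt)(SDy) - C
N   = 20        # number of trainees (assumed)
T   = 2.0       # years the benefit is expected to last (assumed)
d_t = 0.5       # true performance difference, in SD units (assumed)
SDy = 10_000    # dollar value of one SD of job performance (assumed)
C   = 50_000    # total cost of training (assumed)

U = N * T * d_t * SDy - C
print(f"Estimated utility gain: ${U:,.0f}")  # $150,000 with these inputs
```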
