
Training 2003 Presentation: Wednesday, February 26


Presentation Transcript


  1. Training 2003 Presentation: Wednesday, February 26 Conducting Effective Assessment, Measurement & Evaluation of Training Roger Anderson Senior Consultant Tel (Brussels): +32 2 424 3423

  2. Introduction to the Session

  3. Assessment, Measurement and Evaluation Cycle (a cycle around the organizational opportunity/problem) • Assessment: determining the correct thing to do in order to address the opportunity/problem. • Measurement: quantifying the outcome of what is done vis-à-vis the opportunity/problem, using performance measures of efficiency, effectiveness, and quality. • Evaluation: making a decision, on the basis of the measurement, regarding the effect of the intervention on the opportunity/problem.

  4. Performance Consulting System Model • Assessment: identify the client, sponsor, and stakeholders; question basic assumptions; define the needed intervention. • Plan Evaluation: determine metrics for measuring impact; select the best intervention. • Measurement: implement the intervention; begin collecting data. • Evaluation: analyze the data; evaluate the impact.

  5. Unit 1: Assessment Tools

  6. Objectives By the end of this unit, you will • Identify stakeholders, clients, and sponsors of the projected intervention • Accurately identify the problem or opportunity by questioning basic assumptions • Define needs in terms of business, performance, and/or training as determined through the problem identification process • Identify operational indicators that are linked to the defined needs and establish measures for these indicators • Plan an evaluation strategy so that resources are properly assigned

  7. Strategic Context of HR Interventions (diagram: elements within the cultural context) • Strategic vision/purpose • Individual and team competencies • Work flow/process • Learning • Support structures/systems

  8. Identifying the Client, Sponsor, and Stakeholders • Stakeholders—those whose work will be directly impacted by the changes produced by the intervention • Client—the person who will “own” the program; the person most directly affected by the success of the intervention • Sponsor—the highest-level advocate of the intervention; the person who provides the power to implement the changes required

  9. Identifying the Expressed Level of Need Training Need Performance Need Business Need Work Environment Need

  10. Identifying the Expressed Level of Need • Training Need: A specific knowledge, skill, or process that a participant must learn to perform successfully. • Performance Need: Behaviors that must be performed by participants in order to achieve business needs. • Business Need: Strategic goals of an organization, business unit, or department that are expressed in operational terms. • Work Environment Need: A process, system, or condition that must be changed in order to help participants perform successfully.

  11. Metrics for Measuring Impact • Business: What will attainment of business goals look like? • Performance: What behaviors are being performed effectively? • Training: What knowledge, skill, or process has an employee successfully learned? • Work Environment: What has been influenced as a result of environmental redesign?

  12. Types of Evaluation • Formative Evaluation: needs analysis, stakeholder analysis, objective identification, program design, pilot testing • Summative Evaluation

  13. Unit 2: Linking HR, Organizational Strategy, and Individual Performance

  14. Objectives By the end of this unit, you will • Differentiate between strategic and program evaluation • Identify the components of a balanced scorecard • Identify organizational measures of high performance work systems

  15. The Impact of High Performance Work Systems on Organizational Performance • It is increasingly important today to build and maintain a skilled and motivated workforce in order to achieve strategic goals. • Organizations face changing demands based on shifts in the marketplace. • Organizational structures are changing to include broader spans of control. • With the shift away from a command-and-control culture comes the need for employees to have increasing amounts of knowledge and to be able to apply it in a way that adds value to the organization. • Employees are a competitive advantage, not a cost.

  16. The Balanced Scorecard Links Performance Measures Four perspectives, each with its own goals and measures, surround the strategy: • Financial Perspective: How is our financial health? • Customer Perspective: How do customers see us? • Internal Business Perspective: What must we excel at? • Innovation and Learning Perspective: Can we continue to improve and create value?

  17. Example Balanced Scorecard: Hi-Tech Manufacturer (objectives → measures)
Financial: Survive → cash flow. Succeed → quarterly sales growth and operating income by division. Prosper → increased market share and return-on-equity.
End User: New products → percent of sales from new products; percent of sales from proprietary products. Responsive supply → on-time delivery (defined by customer). Preferred supplier → share of key accounts’ purchases; ranking by key accounts. Customer partnership → number of cooperative engineering efforts.
Internal Business: Technology capability → manufacturing geometry vs. competition. Manufacturing excellence → cycle time; unit cost; yield. Design productivity → silicon efficiency; engineering efficiency. New product introduction → actual introduction schedule vs. plan.
Innovation and Learning: Technology leadership → time to develop next generation. Manufacturing learning → process time to maturity. Time to market → new product introduction vs. competition.

  18. Telling a “Story” with a Balanced Scorecard • Financial Perspective: improve returns; broaden revenue mix; improve operating efficiency • Customer Perspective: increase customer confidence in our financial advice; increase customer satisfaction through superior execution • Internal Perspective: understand customer segments; develop new products; cross-sell the product line; shift to appropriate channel; minimize problems; provide rapid response • Learning Perspective: increase employee productivity; develop strategic skills; align personal goals; access to strategic information

  19. Unit 3: Measurement: Metrics and Operational Indicators

  20. Objectives By the end of this unit, you will • Understand issues related to measurement of employee output • Understand the fundamentals of survey design and Level Three (on-the-job application) evaluation

  21. The Ultimate Measure The total value added to the organization by an employee; the sum of all relevant performance.

  22. The Linkage Competency Paradigm Competencies span a continuum from inferred to observed: motives, traits, values, and beliefs (inferred) underlie knowledge and skill, which show up as behavior and, finally, as performance/results outcomes (observed).

  23. Begin with the End in Mind • What does your client value? • What sorts of decisions need to be made? • How will you summarize the results? • Is qualitative or quantitative data needed?

  24. Types of Data • Qualitative: data collected via open-ended questions; subjective • Quantitative: data collected via close-ended questions; more objective

  25. Qualitative Data Advantages • Enables participants to express their evaluation of the intervention in their own terms. • Provides greater depth of critique data. Disadvantages • Data is more open to interpretation. • Can be difficult to collect and to compare like to like. • Individual responses cannot be statistically compared one to one.

  26. Quantitative Data Advantages • Data can be statistically compared across modules and courses. • Data is more efficiently collected and analyzed. Disadvantages • Expertise is needed to ensure results are not biased. • Participants are forced to respond in a predetermined way.

  27. Summarizing Results - Counts/Frequency [Bar chart: number of participants (y-axis) at each post-course rating (x-axis), from “No job-relevant information” to “Highly applicable information.”]

  28. Summarizing Results - Average (Mean) [Chart: mean score on a 1-5 scale at the pre-training and post-training measures, showing the pre-to-post gain.]
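To make slides 27 and 28 concrete, here is a minimal Python sketch of both summaries, a frequency count per rating point and a pre/post mean; all ratings are invented, and the scale labels follow slide 27.

```python
from collections import Counter
from statistics import mean

# Hypothetical Level One ratings on a 1-5 scale, where 1 = "No job-relevant
# information" and 5 = "Highly applicable information".
post_ratings = [4, 5, 3, 4, 4, 5, 2, 4, 5, 4]

# Counts/frequency (slide 27): how many participants chose each scale point.
frequency = Counter(post_ratings)
for rating in range(1, 6):
    print(f"Rating {rating}: {frequency.get(rating, 0)} participant(s)")

# Average/mean (slide 28): one summary number per administration.
pre_ratings = [2, 3, 2, 3, 2, 3, 1, 2, 3, 2]   # invented pre-course ratings
print(f"Pre-training mean:  {mean(pre_ratings):.2f}")
print(f"Post-training mean: {mean(post_ratings):.2f}")
```

Counts preserve the shape of the response distribution; the mean compresses each administration to a single comparable number.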

  29. What to Measure? • Whole job vs. part of job • One measure vs. several measures • Individual employees vs. group of employees • Reactions, learning, application, impact Criteria: • Important to your client • Linked to project objectives and scope • Subject to relatively short-term change

  30. Kirkpatrick Model Level One: Reaction Level Two: Learning Level Three: Application Level Four: Impact

  31. Level One: Reaction Purpose To gain feedback for course development and/or improvement by measuring participants’ reactions to the intervention. Format • Questionnaire/Survey • Immediate Feedback Data Collected Regarding • Course content/concepts • Instructor style • Applicability of material to current job • Course materials • Facilities • Work environment (transfer facilitators/inhibitors)

  32. Level One: Reaction Step One: Determine specific information you wish to gather. Step Two: Determine the specific questions to get the data. Step Three: Determine the scale. Step Four: Determine the scale descriptors.
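As an illustration of these four steps, here is a hedged sketch of one Level One survey item as a small data structure; the question wording, scale, and descriptors are invented for the example, not taken from the course.

```python
# One survey item, built step by step (all values are illustrative).
survey_item = {
    "information_sought": "Applicability of material to current job",        # Step One
    "question": "How applicable is the course content to your current role?",  # Step Two
    "scale": [1, 2, 3, 4, 5],                                                # Step Three
    "descriptors": {                      # Step Four: anchor the end and middle points
        1: "Not at all applicable",
        3: "Somewhat applicable",
        5: "Highly applicable",
    },
}

def render(item):
    """Print the item as it would appear on the questionnaire."""
    print(item["question"])
    for point in item["scale"]:
        print(f"  {point}  {item['descriptors'].get(point, '')}")

render(survey_item)
```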

  33. Level Two: Learning Purpose To determine participants’ level of mastery of course objectives. Format • Performance Simulations • Written Tests Data Collected Regarding • Participants’ knowledge of course materials • Participants’ ability to demonstrate the skills • Participants’ fluency (speed) at using the skills • Accuracy or quality of participants’ output • Procedural automaticity

  34. Level Two: Learning Step One: Design learning measures. Step Two: Consider test use. Step Three: Develop means of increasing validity or reliability. (Optional)
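Step Three leaves the method open; one common choice for written tests is an internal-consistency statistic such as Cronbach's alpha (named here as one option, not prescribed by the slide). A sketch with invented item scores:

```python
from statistics import pvariance

# Rows = participants, columns = test items (1 = correct, 0 = incorrect).
# The data is invented for illustration.
scores = [
    [1, 1, 1, 0, 1],
    [1, 0, 1, 1, 1],
    [0, 0, 1, 0, 0],
    [1, 1, 1, 1, 1],
    [0, 1, 0, 0, 1],
]

def cronbach_alpha(rows):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = len(rows[0])                                    # number of items
    item_vars = sum(pvariance(col) for col in zip(*rows))
    total_var = pvariance([sum(row) for row in rows])
    return (k / (k - 1)) * (1 - item_vars / total_var)

print(f"Cronbach's alpha: {cronbach_alpha(scores):.2f}")  # ~0.65 for this data
```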

  35. Level Three: Application Purpose To determine the extent to which skills/behaviors developed during the intervention are being used on the job. Format • Questionnaire/Survey • Behavioral Checklist • Frequency Checklist • 360-Degree Survey • Managerial Appraisal Data Collected Regarding • Participants’ on-the-job application • Factors encouraging application • Factors hindering application • Follow-up support required to assist in continued development

  36. Level Three: Application Step One: Determine the basis for evaluating on-the-job application. Step Two: Decide how, and when, to collect the information. Step Three: Design the evaluation instrument. Step Four: Pilot the evaluation instrument. Step Five: Develop means of increasing objectivity of data. (Optional) 1. Use multiple observers 2. Train observers 3. Use 360-degree methodology
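For Step Five's "use multiple observers", a quick objectivity check is pairwise percent agreement between raters; the observers and checklist data below are hypothetical.

```python
from itertools import combinations

# Behavioral-checklist ratings from three trained observers for the same six
# observed behaviors (1 = behavior seen on the job, 0 = not seen). Invented data.
observations = {
    "observer_a": [1, 0, 1, 1, 0, 1],
    "observer_b": [1, 0, 1, 0, 0, 1],
    "observer_c": [1, 1, 1, 1, 0, 1],
}

def percent_agreement(a, b):
    """Share of items on which two observers gave the same rating."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

# Low pairwise agreement flags rating drift that would undermine Level Three data.
for (name_a, a), (name_b, b) in combinations(observations.items(), 2):
    print(f"{name_a} vs {name_b}: {percent_agreement(a, b):.0%} agreement")
```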

  37. Level Four: Impact Purpose To determine the impact on the organization of participants’ application of training on the job. Format • Review of Records • Observation • Efficiency Ratios Data Collected Regarding • Change in individual/organizational output • Change in economic performance

  38. Level Four: Impact Step One: Develop a system for measuring impact. 1. What are the business needs/objectives of the course? 2. What operational indicators reflect these needs/objectives? 3. What specific knowledge and skills being developed in the intervention are linked to the indicators being tracked? 4. What indicators currently exist; do you want to develop others? 5. What time intervals will you use to determine that the impacts are taking place? Step Two: Develop a means for collecting data. 1. Measure current performance for each operational indicator. 2. Involve line personnel in the collection of Level Four data.
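A minimal sketch of the two steps, assuming a single hypothetical operational indicator (orders processed per day) sampled at fixed intervals against the pre-training baseline from Step Two:

```python
# Pre-training baseline for the indicator (Step Two, item 1) and values
# sampled at the intervals chosen in Step One, question 5. Numbers are invented.
baseline = 42.0
checkpoints = {4: 44.5, 8: 47.0, 12: 48.2}   # weeks after training -> value

for weeks, value in sorted(checkpoints.items()):
    change = (value - baseline) / baseline
    print(f"Week {weeks:>2}: {value} ({change:+.1%} vs. baseline)")
```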

  39. Unit 4: Evaluation Strategy

  40. Objectives By the end of this unit, you will • Understand how to develop evaluation strategies that will allow you to discern the effectiveness of the intervention by: • Separating the impact of your intervention from other organizational forces • Identifying the factors that can destroy your evaluation efforts • Ensuring that your sample matches the situation that you are making decisions about

  41. Separating Effects of Training from Other Influences Proving Causes versus Providing Evidence

  42. Separating Effects of Training from Other Influences • Know the business you are consulting for. • Reach agreement with your client as to what the most appropriate measures are; get your clients to “own” the measures. • Obtain pre-measures. • Use control groups. • Use multiple sources of information. • Identify the level of performance required for the business need/opportunity. • Plan the timing of data collection.
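Combining two of these tactics, pre-measures and a control group, yields a difference-in-differences estimate, which is the logic behind the four control-group examples that follow; the numbers below are purely illustrative.

```python
# Mean performance before and after the intervention (invented numbers).
trained_pre, trained_post = 60.0, 75.0   # group that received training
control_pre, control_post = 61.0, 66.0   # comparable group that did not

trained_change = trained_post - trained_pre   # training + other influences
control_change = control_post - control_pre   # other influences only

# Subtracting the control group's gain removes organization-wide effects
# (the "picnic" in Example 1), leaving evidence of the training effect.
print(f"Effect attributable to training: {trained_change - control_change:+.1f} points")
```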

  43. Control Groups Example 1: Other Organizational Influences [Chart: performance of a trained group (T) and a control group (C) at the pre-training and post-training measures; a picnic between the measures lifts both groups, so only the extra gain in T is evidence of the training effect.]

  44. Control Groups Example 2: Remedial Training [Chart: the trained group (T) starts below the control group (C) at the pre-measure and closes the gap by the post-training measure.]

  45. Control Groups Example 3: Alternative Interventions [Chart: pre/post performance of a classroom-training group versus a self-training group.]

  46. Control Groups Example 4: Predetermined Level of Need [Chart: the trained group's pre/post performance plotted against the defined needed performance level.]

  47. Designing Control Groups • Control groups should be as similar as possible in terms of relevant attributes and performance. • Matched Control Groups • Random Control Groups • Convenience Control Groups
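As a sketch of the first two designs, assuming current performance score is the relevant attribute to balance on (names and scores are invented):

```python
import random

# Roster of (employee, current performance score); data is invented.
roster = [("ana", 82), ("ben", 79), ("cho", 71), ("dev", 70), ("eli", 64), ("fay", 66)]

def matched_groups(people):
    """Matched design: rank on the relevant attribute, split each adjacent pair."""
    ranked = sorted(people, key=lambda p: p[1], reverse=True)
    return ranked[0::2], ranked[1::2]          # (trained, control)

def random_groups(people, seed=0):
    """Random design: every employee equally likely to land in either group."""
    shuffled = people[:]
    random.Random(seed).shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]    # (trained, control)

print("Matched:", matched_groups(roster))
print("Random: ", random_groups(roster))
```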

  48. Multiple Sources of Information in Evaluation Reaction Knowledge Application Impact

  49. Maintenance of Training’s Effect (Transfer) [Chart: performance from pre-training through five post-training measurement points (Time 1 to Time 5), comparing manager training with technical training to show how well each effect is maintained over time.]
