
Developing a Hiring System


Presentation Transcript


1. Developing a Hiring System
OK, Enough Assessing: Who Do We Hire??!!

  2. Who Do You Hire??

3. Information Overload!!
• Leads to:
  • Reverting to gut instincts
  • Mental gymnastics

4. Combining Information to Make Good Decisions
• "Mechanical" methods are superior to "judgment" approaches:
  • Multiple Regression
  • Multiple Cutoff
  • Multiple Hurdle
  • Profile Matching
  • High-Impact Hiring approach

5. Multiple Regression Approach
• Predicted job performance = a + b1·x1 + b2·x2 + b3·x3
  • x = predictor scores; b = optimal (regression) weights; a = intercept
• Issues:
  • Compensatory: assumes high scores on one predictor compensate for low scores on another
  • Assumes a linear relationship between predictor scores and job performance (i.e., "more is better")
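A minimal sketch of this compensatory scoring in Python; the predictor names, weights, and intercept are illustrative assumptions, not values from the slides:

```python
# Compensatory multiple-regression score: a + b1*x1 + b2*x2 + b3*x3.
# Predictor names, weights, and intercept are illustrative assumptions.

def predicted_performance(scores, weights, intercept):
    # A high score on one predictor can offset a low score on another.
    return intercept + sum(b * scores[x] for x, b in weights.items())

weights = {"cognitive_test": 0.40, "interview": 0.35, "work_sample": 0.25}
applicant = {"cognitive_test": 90, "interview": 55, "work_sample": 80}

print(predicted_performance(applicant, weights, intercept=10.0))  # 85.25
```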

6. Multiple Cutoff Approach
• Sets a minimum score on each predictor
• Issues:
  • Assumes a non-linear relationship between predictors and job performance (above the cutoff, higher scores don't predict better performance)
  • Assumes predictors are non-compensatory
  • How do you set the cutoff scores?
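A sketch of the non-compensatory check; the cutoff values are assumed for illustration:

```python
# Multiple cutoff: the applicant must meet the minimum on EVERY
# predictor; a strength elsewhere cannot compensate. Cutoffs assumed.

cutoffs = {"cognitive_test": 60, "interview": 50, "work_sample": 55}

def passes_all_cutoffs(scores):
    return all(scores[k] >= minimum for k, minimum in cutoffs.items())

print(passes_all_cutoffs({"cognitive_test": 90, "interview": 45, "work_sample": 80}))  # False
```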

7. How Do You Set Cut Scores?
• Expert judgment
• Average scores of current employees:
  • Good employees for profile matching
  • Minimally satisfactory employees for cutoff models
• Empirical: linear regression
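For the "average scores of current employees" option, a small sketch; the employee scores are made up:

```python
# Anchor a cutoff at the mean test score of current employees rated
# minimally satisfactory; for profile matching, average good performers
# instead. All scores here are illustrative assumptions.
import statistics

minimally_satisfactory = [58, 62, 61, 57, 60]
cutoff = statistics.mean(minimally_satisfactory)
print(cutoff)  # 59.6
```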

8. Multiple Cutoff Approach (continued)
• One more issue: if an applicant fails the first cutoff, why continue assessing? This leads to the multiple hurdle model.

9. Multiple Hurdle Model
• Assessments run in sequence (e.g., Background → Interview → Test 1 → Test 2); passing a hurdle advances the applicant to the next, failing any hurdle means Reject, and those who pass every hurdle reach the finalist decision.
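A sketch of the sequential screen; the hurdle order, stage names, and pass rules are assumptions for illustration:

```python
# Multiple hurdle: assessments run in order and stop at the first
# failure, so later (often costlier) tools are only used on applicants
# who survived earlier ones. Stage names and thresholds are assumed.

hurdles = [
    ("background", lambda a: a["background_ok"]),
    ("interview",  lambda a: a["interview"] >= 50),
    ("test_1",     lambda a: a["test_1"] >= 60),
    ("test_2",     lambda a: a["test_2"] >= 60),
]

def screen(applicant):
    for name, passed in hurdles:
        if not passed(applicant):
            return f"reject at {name}"
    return "finalist"

print(screen({"background_ok": True, "interview": 70, "test_1": 55, "test_2": 90}))
# reject at test_1  (test_2 is never scored)
```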

10. Profile Matching Approach
• Emphasizes the "ideal" level of each KSA
  • e.g., too little attention to detail may produce sloppy work; too much may represent compulsiveness
• Issues:
  • Non-compensatory
  • Small errors in the profile can add up to a big mistake in the overall score
  • Little evidence that it works better
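A sketch of the difference score; taking absolute differences (so scoring above the ideal hurts the match as much as scoring below it) and the ideal profile itself are assumptions:

```python
# Profile matching: smaller total deviation from the "ideal" profile
# is better. Ideal levels below are illustrative assumptions.

ideal = {"attention_to_detail": 7, "sociability": 5, "drive": 8}

def profile_distance(applicant):
    # Sum of |ideal - applicant| across all attributes; absolute values
    # make over- and under-shooting the ideal equally costly.
    return sum(abs(level - applicant[k]) for k, level in ideal.items())

print(profile_distance({"attention_to_detail": 9, "sociability": 5, "drive": 6}))  # 4
```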

11. How Do You Compare Finalists?
• Multiple regression approach: a predicted-performance (Y) score based on the formula
• Cutoff/hurdle approach: eliminate those with scores below the cutoffs, then use a regression (or other formula) approach
• Profile matching: the smallest difference score is best, Σ(Ideal − Applicant) across all attributes
• In any case, each finalist ends up with an overall score
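Whichever model produced the overall scores, the final comparison is just a sort; note that the direction flips for profile matching (all names and numbers below are made up):

```python
# Higher is better for regression/cutoff composites; lower is better
# for profile-matching difference scores. Values are illustrative.

predicted = {"Ana": 4.2, "Ben": 3.8, "Caro": 4.5}
print(sorted(predicted, key=predicted.get, reverse=True))  # ['Caro', 'Ana', 'Ben']

differences = {"Ana": 6, "Ben": 3, "Caro": 9}
print(min(differences, key=differences.get))  # 'Ben' is the best profile match
```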

12. Making Finalist Decisions
• Top-down strategy: maximizes efficiency, but is also likely to create adverse impact if cognitive ability (CA) tests are used
• Banding strategy:
  • Creates "bands" of scores that are statistically equivalent (based on test reliability)
  • Then hire from within bands either randomly or based on other factors (including diversity)

13. Applicant Total Scores: 94, 93, 89, 88, 87, 87, 86, 81, 81, 80, 79, 79, 78, 72, 70, 69, 67
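The slides don't name a banding method, but one common choice is standard-error-of-difference (SED) banding; applying it to the scores above with an assumed reliability of .80:

```python
# SED banding: scores within 1.96 * SED of the top score are treated
# as statistically equivalent. The reliability of .80 is an assumption.
import math
import statistics

scores = [94, 93, 89, 88, 87, 87, 86, 81, 81, 80, 79, 79, 78, 72, 70, 69, 67]
reliability = 0.80

sd = statistics.stdev(scores)
sed = sd * math.sqrt(2 * (1 - reliability))  # standard error of the difference
band_width = 1.96 * sed

top = max(scores)
band = [s for s in scores if s >= top - band_width]
print(f"band width = {band_width:.1f}; first band: {band}")
```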

14. Limitations of the Traditional Approach
• It's a "big business" model:
  • Large samples that allow the use of statistical analysis
  • Resources to use experts for cutoff scores, etc.
  • Assumes you're hiring lots of people from even larger applicant pools

15. A More Practical Approach
• Rate each attribute on each tool:
  • Desirable
  • Acceptable
  • Unacceptable
• Develop a composite rating for each attribute:
  • Combining scores from multiple assessors
  • Combining scores across different tools
  • A "judgmental synthesis" of the data
• Use the composite ratings to make final decisions (one possible combination rule is sketched after this list)
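The slides call the combination a "judgmental synthesis," so no formula is the actual method; as a purely illustrative stand-in, this sketch takes the majority rating across assessors/tools and breaks ties toward the lower rating:

```python
# Composite attribute rating from multiple assessors/tools. The
# majority-with-conservative-tie-break rule is an assumed stand-in
# for the slides' "judgmental synthesis."
from collections import Counter

ORDER = ["Unacceptable", "Acceptable", "Desirable"]  # low to high

def composite(ratings):
    counts = Counter(ratings)
    top = max(counts.values())
    tied = [r for r, n in counts.items() if n == top]
    return min(tied, key=ORDER.index)  # tie goes to the lower rating

print(composite(["Desirable", "Acceptable", "Desirable"]))  # Desirable
print(composite(["Desirable", "Acceptable"]))               # Acceptable
```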

16. Categorical Decision Approach
• Eliminate applicants with unacceptable qualifications
• Then hire the candidates with as many Desirable ratings as possible
• Finally, hire as needed from applicants with Acceptable ratings
• Optional: "weight" attributes by importance
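A sketch of this categorical rule with made-up applicants (three attribute ratings each):

```python
# Categorical decision: drop anyone rated Unacceptable on any
# attribute, then rank survivors by their count of Desirable ratings.
# Applicant ratings are illustrative assumptions.

applicants = {
    "Ana":  ["Desirable", "Acceptable", "Desirable"],
    "Ben":  ["Acceptable", "Unacceptable", "Desirable"],
    "Caro": ["Desirable", "Desirable", "Desirable"],
}

eligible = {name: r for name, r in applicants.items() if "Unacceptable" not in r}
ranked = sorted(eligible, key=lambda n: eligible[n].count("Desirable"), reverse=True)
print(ranked)  # ['Caro', 'Ana']: Ben is eliminated, Caro has the most Desirables
```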

  17. Sample Decision Table

  18. Using the Decision Table 1: More Positions than Applicants

  19. Using the Decision Table 2: More Applicants than Positions

20. Numerical Decision Approach
• Eliminate applicants with unacceptable qualifications
• Convert ratings to a common scale: obtained score / maximum possible score
• Weight by the importance of the attribute and the measure to develop a composite score (sketched below)
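A sketch of the common-scale conversion and weighting; the attributes, maxima, and weights are assumptions:

```python
# Numerical decision: each rating becomes obtained / maximum possible
# (a 0-1 scale), then is weighted by attribute importance. Values are
# illustrative assumptions.

def composite_score(scores, max_scores, weights):
    total = sum(weights.values())
    return sum(w * scores[k] / max_scores[k] for k, w in weights.items()) / total

scores     = {"problem_solving": 4, "teamwork": 3, "writing": 5}
max_scores = {"problem_solving": 5, "teamwork": 5, "writing": 5}
weights    = {"problem_solving": 3, "teamwork": 2, "writing": 1}

print(round(composite_score(scores, max_scores, weights), 2))  # 0.77
```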

  21. Numerical Decision Approach

  22. Numerical Decision Approach

23. Summary: Decision-Making
• Focus on critical requirements
• Focus on performance-attribute ratings, not overall evaluations of the applicant or the tool
• Eliminate candidates with an Unacceptable composite rating on any critical attribute
• Then choose those who are most qualified: make offers first to candidates with the highest numbers of Desirable ratings
