
Psychological Measurement in Industry



Presentation Transcript


  1. Psychological Measurement in Industry

  2. What Do Industrial-Organizational Psychologists Do? Industrial-organizational psychologists study organizations and seek ways to improve the functioning and human benefits of business.

  3. Psychology at Work • I/O psychologists improve organizational functioning by: • Recruiting people who best fit your organization • Selecting the best people • Retaining the best people • Developing fair, legal, and efficient hiring procedures • Improving the skills of the people • Creating a diverse, qualified workforce

  4. Selection Process • Measure applicants’ qualifications • Select the best applicant to hire • For each selection method: • Describe the selection method • Rate the validity of the selection method: • Poor: validity coefficient = r ≈ .00 • Moderate: validity coefficient = r ≈ .25 • Good: validity coefficient = r ≈ .50 • Great: validity coefficient = r ≈ .75 • Evaluation of process
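
A minimal sketch (hypothetical data, not from the slides) of where a validity coefficient like those rated above comes from: it is the Pearson correlation between applicants' scores on the selection method and a later criterion measure of job performance. Note that `statistics.correlation` requires Python 3.10+.

```python
# Hypothetical data: selection-test scores and later job-performance ratings
# for the same eight hires.
from statistics import correlation  # Python 3.10+

test_scores = [72, 85, 60, 90, 78, 66, 88, 74]              # predictor (selection method)
job_performance = [3.1, 4.2, 2.8, 4.5, 3.6, 3.0, 4.0, 3.4]  # criterion (performance ratings)

r = correlation(test_scores, job_performance)  # Pearson r = validity coefficient
print(f"validity coefficient r = {r:.2f}")     # compare against the poor/moderate/good/great benchmarks above
```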

  5. Workplace Testing Settings • The government & the military • 90% tested for federal jobs • 80% tested for state, county, and local government jobs • Largest amount of testing is in the military • Over 1 million per year take the ASVAB • Military: emphasis on placement, not selection

  6. Workplace Testing Settings (cont.) • Testing for professional licensing • Over 2,000 occupations require licensing • In some states 25% of the workforce is licensed • Require written & oral examinations • Tests prepared by licensing boards (local & national) • National exams are better than local ones • “State of near chaos” • For the same job – different exams, different requirements • Little overlap between states

  7. Workplace Testing Settings (cont.) • Private organizations • Frequency of testing varies • Growing interest in testing with the ease of web-based testing • Applicants are administered tests online • Information goes directly to the hiring manager; no feedback to the test-taker • E.g., Target, Blockbuster

  8. Approach for Examining Selection Methods • Describe the selection device and the information that can be collected about applicants • Describe how to develop and use the device appropriately • Describe the frequency of use, reliability, validity, costs, adverse impact, and face validity (applicant reactions) of each device • Depth and breadth of selection plan • Costs, who is involved, time frame, which jobs • Internal vs. external selection

  9. Selection Methods • Ability Tests • Job Knowledge Tests • Performance Tests and Work Samples • Personality Tests • Integrity Tests • Structured Interviews • Assessment Center

  10. Application Blanks & Résumés • Routinely used • Focus on basic factual information: • Education, training, work history, skills, accomplishments, etc. • Used to screen out applicants who don’t meet minimum qualifications in terms of education, experience, etc.

  11. Application Blanks • Validity: poor (typically r < .20) • Why? • Problems: • Lack of agreement on what to look for • Possible discrimination • Need to cross-validate • Scoring keys are not stable over time and need to be updated

  12. Biodata Questionnaires • Questionnaire on applicant’s life experiences • Example questions: • Did you ever build a model airplane that flew? • When you were a child, did you collect stamps? • Do you ever repair mechanical things in your home? • Answers are scored according to a scoring key • Validity: moderate (r ≈ .30)
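
To illustrate the "scored according to a scoring key" step, here is a minimal sketch with hypothetical items and point values; in practice biodata keys are derived empirically by relating responses to a job-performance criterion.

```python
# Hypothetical biodata scoring key: (item, response) -> points.
scoring_key = {
    ("built_model_airplane", "yes"): 2,
    ("built_model_airplane", "no"): 0,
    ("collected_stamps", "yes"): 1,
    ("collected_stamps", "no"): 0,
    ("repairs_mechanical_things", "yes"): 2,
    ("repairs_mechanical_things", "no"): 0,
}

def score_biodata(answers: dict[str, str]) -> int:
    """Sum the keyed points for an applicant's answers; unkeyed answers score 0."""
    return sum(scoring_key.get((item, resp), 0) for item, resp in answers.items())

applicant = {"built_model_airplane": "yes",
             "collected_stamps": "no",
             "repairs_mechanical_things": "yes"}
print(score_biodata(applicant))  # 4
```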

  13. Experience & Accomplishments Questionnaires • Questionnaire focuses on applicant’s job-related experiences & accomplishments • Example questions: • For an Information Systems Analyst position: • Describe the types of IT systems problems you have encountered. • Describe your experience in testing hardware, software, or systems. • Validity: moderate (typically content validity)

  14. Employment Interviews • Universal selection procedure • Strong effect on selection decisions • Preferred by managers • Psychometric problems: • No consistency of questions • Questions unrelated to jobs • No objective scoring system • No interviewer training • Overall, the more standardized the interview, the better

  15. Interviews Can… • Assess certain characteristics • Assess organisational & team fit • Satisfy a social exchange function • Have high face validity

  16. Employment Interview • Validity as typically done: poor (r < .20) • Types of employment interviews: • Unstructured: few (if any) pre-planned questions; commonly used; poor validity • Semi-structured: some pre-planned questions, but with flexibility to pursue lots of follow-ups; moderate validity • Structured: all questions are pre-planned; every applicant gets the same interview; some follow-up probes; answers are evaluated using numeric rating scales; good validity

  17. Structured Interviews • Standardized method of asking the same job-related questions of all applicants • Carefully planned and constructed based on job analysis • Responses are numerically evaluated • Detailed notes are taken

  18. Examples of Structured Interviews • Situational: • Hypothetical scenarios • A sign of how the applicant would behave • How would the interviewee behave in a critical situation? • Behavioral: • Past incidents • A sample of work behavior – a better predictor • How did the interviewee behave in a specific job situation in the past? • Multiple raters • Composite ratings used to make decisions
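
A minimal sketch (hypothetical raters and 1–5 ratings) of the "multiple raters, composite ratings" step: each applicant's structured-interview ratings are averaged across questions and raters into a single composite used for the hiring decision.

```python
# Hypothetical structured-interview ratings: applicant -> {rater: ratings per question}.
from statistics import mean

ratings = {
    "Applicant A": {"Rater 1": [4, 5, 3], "Rater 2": [4, 4, 4]},
    "Applicant B": {"Rater 1": [3, 2, 4], "Rater 2": [3, 3, 3]},
}

# Composite = mean of all ratings across raters and questions.
composites = {
    applicant: mean(r for per_rater in by_rater.values() for r in per_rater)
    for applicant, by_rater in ratings.items()
}
print(composites)  # {'Applicant A': 4.0, 'Applicant B': 3.0}
```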

  19. Problems with Interviews • Unstructured & unplanned • Untrained & biased interviewers • Same-sex bias • Structured means standardised, or artificial & inflexible

  20. Why Interviews Are Often Not Valid Assessments • Poor wording of questions • No systematic scoring system used by interviewers—very subjective • Applicants have been trained to give the appropriate responses to such open-ended questions • The interviewer has no way to verify this information in the short time of the interview

  21. Attempts to Improve the Interview • Training Interviewers • Development of Appropriate Techniques: • Situational Interview • Behavior Description Interview

  22. Central Issues in Interview Training Programs • Creating an open-communication atmosphere • Delivering questions consistently • Maintaining control of the interview • Developing good speech behavior • Learning listening skills • Taking appropriate notes • Keeping the conversation flowing and avoiding leading or intimidating the interviewee • Interpreting, ignoring, or controlling the nonverbal cues of the interview

  23. Evaluation of the Interview • Unstructured interviews are used frequently; structured ones less frequently • The more structured the interview, the more reliable and valid it is • Structured interviews are highly correlated with cognitive ability tests • Mixed adverse impact • Structured interviews are costly to develop and use • Might be appropriate for measuring person/organization fit

  24. Ability Tests • Measure what a person has learned up to that point in time (achievement) • Measure one’s innate potential capacity (aptitude) • Up to 50% of companies use some ability testing

  25. Ability Tests • Mental (Cognitive) Ability Tests • Mechanical Ability Tests • Clerical Ability Tests • Physical Ability Tests

  26. Cognitive Ability Tests • Main purpose: to determine one’s level of “g” or specific aptitudes, depending on the setting • Measure aptitudes relevant to the job • Short, group administration • Excellent predictor of job and training performance

  27. Typical Cognitive Abilities • Memory Span • Numerical Fluency • Verbal Comprehension • Spatial Orientation • Visualization • Figural Identification • Mechanical Ability • Conceptual Classification • Semantic Relations • General Reasoning • Intuitive Reasoning • Logical Evaluation • Ordering

  28. Example of an Ability Test • Wonderlic Personnel Test (measures “g”) • 50 items, 12 minutes • Multiple choice • Items cover verbal, math, pictorial, and analytical material • Highly reliable (alternate-forms reliability > .90) • Correlated with job performance measures • Correlated with the WAIS
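
A minimal sketch, with made-up scores, of how an alternate-forms reliability like the Wonderlic's (> .90) is estimated: the same people take two parallel forms of the test, and the two sets of scores are correlated. `statistics.correlation` requires Python 3.10+.

```python
# Hypothetical scores of eight examinees on two parallel forms of a test.
from statistics import correlation  # Python 3.10+

form_a = [31, 24, 28, 35, 22, 30, 27, 33]
form_b = [30, 25, 27, 36, 21, 31, 26, 32]

# Alternate-forms reliability = correlation between the two forms; > .90 indicates
# the forms are effectively interchangeable.
print(f"alternate-forms reliability = {correlation(form_a, form_b):.2f}")
```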

  29. Examples of Other Frequently Used Mental Ability Tests • Otis-Lennon Mental Ability Test • General Aptitude Test Battery (GATB) used by the US Employment Service • Employee Aptitude Survey (EAS)

  30. Advantages of Cognitive Ability Tests • Efficient • Useful across all jobs • Excellent levels of reliability and validity (.40 – .50) • Highest of any selection test • Estimated validity: • .58 for professional/managerial jobs • .56 for technical jobs • .40 for semi-skilled jobs • .23 for unskilled jobs • More complex job = higher validity

  31. Disadvantages of Cognitive Ability Tests • Lead to more adverse impact • May lack face validity • Questions aren’t necessarily related to job • May predict short-term performance better than long-term • can do vs. will do

  32. Frequently Used General Mechanical Ability Tests • Bennett Mechanical Comprehension Tests • MacQuarrie Test for Mechanical Ability • What they generally measure: • Spatial visualization • Perceptual speed and accuracy • Mechanical information

  33. Tests of Mechanical Comprehension • Better than “g” for blue-collar jobs • Good face validity • Criterion validity with mechanical job performance • E.g., Bennett Mechanical Comprehension Test • 68 items • 30 minutes • Principles of physics & mechanics • Operation of common machines, tools, & vehicles • High internal consistency • Good criterion validity with job proficiency & training

  34. Clerical Ability Tests • Predominantly measure perceptual speed and accuracy in processing verbal and numerical data • Examples: • Minnesota Clerical Test • Office Skills Test

  35. Clerical Tests • 2/3 of companies use written tests to hire & promote • 60–80% of those tests are clerical • Specific vs. general tests • E.g., Minnesota Clerical Test • 2 subtests: number comparison & name comparison • Long lists of pairs of numbers/names (decide whether each pair is identical) • Strict time limit • Reliable & valid measure of perceptual speed & accuracy • Good face validity
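
A minimal sketch of how a speeded comparison subtest like the Minnesota Clerical Test is scored (hypothetical item pairs and responses): each "same"/"different" judgment is checked against the key, and the score is the number correct within the time limit.

```python
# Hypothetical number/name-comparison items and one applicant's responses.
pairs = [("66273894", "66273984"),
         ("Acme Supply Co.", "Acme Supply Co."),
         ("John T. Peterson", "John T. Petersen")]
responses = ["different", "same", "same"]  # applicant's judgments, in order

# Key: a pair is "same" only if the two entries match exactly.
key = ["same" if a == b else "different" for a, b in pairs]
score = sum(resp == correct for resp, correct in zip(responses, key))
print(score)  # 2 of 3 correct
```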

  36. Physical Ability Tests • Most measure muscular strength, cardiovascular endurance, and movement quality • Areas of concern: • Female applicants • Disabled applicants • Reduction of work-related injuries

  37. Ability Tests and Discrimination • Differential Validity • Are employment tests less valid for minority group members than non-minorities? • Research has found that differential validity does not exist
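
A minimal sketch (hypothetical data) of the differential-validity question itself: compute the test–criterion correlation separately for each applicant group and compare the coefficients; the research summarized above finds they do not differ meaningfully. `statistics.correlation` requires Python 3.10+.

```python
# Hypothetical test scores and performance ratings for two applicant subgroups.
from statistics import correlation  # Python 3.10+

group_1 = {"test": [70, 80, 65, 90, 75], "perf": [3.2, 3.9, 3.0, 4.4, 3.5]}
group_2 = {"test": [68, 82, 60, 88, 77], "perf": [3.1, 4.0, 2.9, 4.3, 3.6]}

# Differential validity would show up as clearly different coefficients.
for name, g in (("group 1", group_1), ("group 2", group_2)):
    print(name, round(correlation(g["test"], g["perf"]), 2))
```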

  38. Comparison of Mental Ability Tests and Other Selection Instruments • Mental ability tests have high validity and low costs compared to other methods • Biodata, structured interviews, trainability tests, work samples, and assessment centers have equal validity, less adverse impact, and more fairness to the applicant, but cost more

  39. Work Sample Tests • How well does the applicant perform job-relevant tasks? • 2 characteristics: • Put the applicant in a situation similar to a work situation – measure performance on tasks similar to real job tasks • Are tests of maximal rather than typical performance • Range from simple to complex

  40. Work Sample Tests • Examples: • For telephone sales job, have applicants make simulated cold calls • For a construction job, have applicants locate errors in blueprints

  41. Work Sample Tests • Advantages: • Highest validity levels (r in the .50s) • High face validity • Easy to demonstrate job-relatedness • Disadvantages: • Not appropriate for all jobs • Time-consuming to set up and administer • More predictive of short-term performance • Cannot be used if the applicant is not expected to know the job before being hired

  42. Measuring Personality • Early research showed no validity • Recent research: 3 of the Big 5 are predictive • Criterion validity: .15 – .25 (less than “g”) • Susceptible to faking, though faking does not appear to reduce predictive validity • Useful when dependability, integrity, and responsibility are determinants of job success

  43. Myers-Briggs Type Indicator (MBTI) • Dimensions of personality: • Introversion ↔ Extroversion: source of energy • Intuition ↔ Sensation: innovation vs. practical • Thinking ↔ Feeling: impersonal principles vs. personal relationships • Judging ↔ Perceiving: closure vs. open options • Validity: poor for selection; might be okay, if carefully used, to help a team work better together

  44. The Big 5 Personality Dimensions • Validity: typically moderate for selection (r ≈ .25 with measures of overall job performance) • But, validity of personality inventories is hard to generalize • Some dimensions of personality may correlate more strongly with particular aspects of a particular job • Extraversion → success in sales • High conscientiousness & high openness to experience → success in job training • Low agreeableness, low conscientiousness, & low adjustment → more likely to engage in counterproductive work behaviors (e.g., abuse sick leave, break rules, drug abuse, workplace violence)

  45. Advantages of Personality Inventories • Intuitively appealing to managers (e.g., MBTI) • No adverse impact • Do not show differential selection rates across groups • Efficient • Moderate reliability and validity • Validity = .20 – .30

  46. Disadvantages of Personality Inventories • Response sets • Lie or socially desirable responding • All traits not equally valid for all jobs

  47. Integrity Testing • Why do it? • Employee theft estimated at between $15 and $50 billion in the 1990s • Employee theft rates by industry: 5 to 58% • 2% to 5% of each sales dollar is charged to customers to offset theft losses

  48. Integrity Testing • Purpose: • Theft is expensive • Also want to avoid laziness, violence, gossip • Honesty may not be a stable trait • Honesty testing is controversial • May depend on the situation (perceived unfairness) • Viewed as coercive and inaccurate • Honesty is a strong value in our society

  49. Honesty & Integrity Tests • The Employee Polygraph Protection Act (1988) prohibits (with some exceptions) the use of polygraph tests on applicants or employees • Replaced by paper-and-pencil tests: • Overt integrity tests: measure attitudes about dishonest behavior • Sample question: “Everyone will steal if given the chance.” • Examples: • Pearson Reid London House: Personnel Selection Inventory (PSI)
