
Dealing with Large Applicant Pools


Presentation Transcript


  1. Dr. Frank Olmos and Joshua Kahn Los Angeles County Office of Education

    Dealing with Large Applicant Pools

    PTC-SC Luncheon Presentation Monterey Park, CA, May 29, 2013
  2. Let’s do the math…
     Internet Recruiting (broad outreach)
     + Online Applications (low-effort applying, reduced self-filtering)
     × High unemployment
     = Large Applicant Pools
     − Reduced public sector budgets
     = Workload problem for many merit systems
  3. Strategies to reduce the pool
     These will reduce your pool, but…
     - At best, will not improve the average level of ability
     - May skew the pool toward the lower end
     - None are "merit" based
     - Negative impact on your image as a public employer, affecting other recruitments
     - Increased protests and appeals. How will you defend this?
     Limit exposure
     - Limit posting sites (no Monster)
     - Paper bulletins only
     - Shorten the posting period
     - Do not accept "interest cards"
     Limit the application window
     - Open for one day or one hour
     - Posting cut-off at x number of applicants (first come, first served for testing slots; self-schedule)
     Increase the burden to apply
     - Paper application, hand pickup and delivery
     - Charge per application ($5 is legal)
     - Increase requirements for a complete application (proof of diploma, degree, transcripts)
     Random selection
     - Lottery
  4. The opportunity
     Large applicant pools increase the potential utility of a valid selection procedure.
     - Utility: the expected organizational gain from your selection procedure
       - Gain #1: success rate of those hired
       - Gain #2: monetary value of higher performance
     - Validity: the ability of your selection procedure to identify top talent
     - Selection ratio: job openings (n) divided by the number of job applicants (N). Lower ratio = higher selectivity.
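The relationships above can be sketched in a few lines of code. This is a minimal illustration of the selection ratio and the standard Brogden-Cronbach-Gleser per-hire utility gain (validity × dollar SD of performance × mean standardized score of those selected); all figures are hypothetical, not from the presentation.

```python
# Illustrative sketch of selection ratio and per-hire utility gain.
# Numbers are hypothetical examples, not LACOE data.

def selection_ratio(openings, applicants):
    """n / N -- lower ratio means higher selectivity."""
    return openings / applicants

def utility_gain_per_hire(validity, sd_y, z_selected):
    """Brogden-Cronbach-Gleser expected dollar gain per hire:
    validity   -- correlation between exam score and job performance
    sd_y       -- dollar standard deviation of job performance
    z_selected -- mean standardized exam score of those selected
    """
    return validity * sd_y * z_selected

# A large pool lets you be very selective...
ratio = selection_ratio(5, 500)
# ...and selectivity raises z_selected, which raises the expected gain.
gain = utility_gain_per_hire(0.40, 10_000, 1.5)
print(ratio, gain)
```

With a validity of .40, a $10,000 performance SD, and selected candidates averaging 1.5 SDs above the pool mean, each hire is worth roughly $6,000 more per year than a random selection would be.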
  5. Large versus small pools
     The likelihood of recruiting exceptional candidates increases with large pools.
     [Figure: exam score distributions for a large versus a small pool; x-axis: Exam Score, y-axis: Number]
  6. Large versus small pools
     Large pools enable identifying pass points.
  7. Put test validity to work
     Large pools enable high pass points.
     [Figure: scatterplot of exam score versus job performance for current employees; selected mean = 36, overall mean = 26]
  8. Challenge
     How to be efficient without sacrificing quality?
     - Analyze where large pools take the most effort
     - Use the most efficient job-related screening tools first
  9. Job-Related Strategies
     - Self-selection
     - Efficient first-stage screening
     - Broad-based testing
  10. Self-selection
     - Focused recruitment
     - Clear job descriptions and requirements
     - Realistic job previews
  11. Self-selection: Focused Recruitment
     - Occupation-related websites (Dice.com, OhSoEZ.com)
     - Social networking websites
     - Professional bulletin boards and discussion groups (LinkedIn Groups)
     - Regular community-oriented networking
     - Old-fashioned paper bulletins and local media distribution
     - Career preparatory schools and training organizations
     - PTA and booster organizations
     - Internal email, intranet, and word of mouth by employees
  12. Self-selection: Clear Requirements and Expectations
     - Clear duty statements and minimum qualifications
       - "Any combination of education and experience…"??
     - Realistic job preview
       - Clear statement of the rewards and challenges of the job in the bulletin and other job description information
       - Web-based preview/orientation
     - Example: Paraeducator, Special Education
       - Three-hour online workshop that must be completed as part of the application process
       - Uses Adobe Connect
  13. Efficient first-stage testing
     - Automated objective written tests
     - Un-proctored Internet testing
     - Auto-scored supplemental questionnaires
  14. Efficient first-stage testing: Objectively Scored Testing
     - Scantron (mark-sense answer sheets) or NCS automated scoring and score upload
     - The workhorse of selection procedures, and still among the most efficient
     - Test BEFORE screening for minimum qualifications
       - Review minimums only for those who pass the test
  15. Efficient first-stage testing: Un-proctored Internet Testing (UIT)
     - Basic skills testing (reading, writing, math, reasoning, etc.)
     - Offered by all major test publishing and consulting firms
     - Register or apply online; computer-administered and timed
     - May be computer-adaptive testing
     - Candidate cheating and impersonation remain issues
       - Those who pass are invited in for verification testing and subsequent exam parts (performance, orals, etc.)
     - Completion of an online workshop after testing
       - Use with an online preview/orientation and a multiple-choice exam
     - Online empirically keyed personality testing, situational judgment, biodata, and occupational preference/suitability
       - Less prone to faking and cheating
  16. Efficient first-stage testing: Supplemental Questionnaires
     - Part of the application process
       - Could be just a few questions focused on MQs
       - Could be an automated tally of years of experience and level of education
     - Potentially useful, but a blunt instrument
       - Prone to candidate inflation due to "interpretation"
     - Based on a job analysis to identify areas to assess
       - Type and level of duties and related tasks
       - Skills and facets of the skills
     - Each area is broken down into individual questionnaire sections and items
     - Should require some form of verification
  17. Example Items: Skills (MS Office)
     To what extent have you performed the following EXCEL/spreadsheet tasks?
     - Set up rows and columns of data
     - Cut and paste data from one spreadsheet into another
     - Write formulas in cells that reference more than one worksheet
     - Create pivot tables
     - Create graphs
     - Use built-in statistical functions
     - Write macros to automate routine functions
     To what extent have you performed the following WORD/word processing tasks?
     - Format documents with section breaks
     - Format documents in "column" format
     - Create and format tables
     - Insert pictures and graphs
     - Create locked forms with "form fields"
     - Use the "track changes" feature in document editing
     - Use mail merge to insert Excel source data into templates
  18. Supplemental questionnaires: Points per Response (example)
     0 = I have no background in this
     0 = I know what this is, but I have not done it
     1 = I have assisted others (or received training) in this, but have not done it independently
     2 = I have done this independently, but not frequently
     4 = I have done this as a regular part of my job
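Auto-scoring this response scale is straightforward to automate. Below is a minimal sketch of tallying an applicant's questionnaire using the 0/0/1/2/4 point values above; the response labels and the sample applicant are shortened, hypothetical stand-ins for the full response wording.

```python
# Minimal sketch of auto-scoring a supplemental questionnaire with the
# 0/0/1/2/4 scale. Labels are abbreviated stand-ins for the full wording.

POINTS = {
    "no background": 0,
    "know of it, not done": 0,
    "assisted or trained": 1,
    "done independently, not frequently": 2,
    "regular part of my job": 4,
}

def score_questionnaire(responses):
    """Sum the point value of each item's response label."""
    return sum(POINTS[r] for r in responses)

applicant = [
    "regular part of my job",
    "done independently, not frequently",
    "assisted or trained",
    "no background",
]
print(score_questionnaire(applicant))  # 4 + 2 + 1 + 0 = 7
```

In practice the tally would run as part of the online application, so the score is available before any human review.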
  19. Supplemental questionnaires: Supporting Examples
     - Describe an accomplishment that illustrates your skill level in conducting job classification work. (Please limit your response to 200 words.)
     - Describe an example of a task you performed that illustrates your skill level with Excel/spreadsheets. (Please limit your response to 100 words.)
     - Or, for skills, actual testing to verify (similar to UIT)
  20. Supplemental questionnaires: Two-Stage Pass Points
     Stage 1
     - Calculate the MAR: the sum of the minimally acceptable responses for each item.
     - Set the pass point based on self-rated scores. (If based on applicant-flow management, pass 10-20% more than you actually need.)
     Stage 2
     - For those who exceed the pass point, have SMEs compare the supporting examples to the self-ratings.
     - SMEs apply a weight of 0, 0.5, or 1.0 to each self-rating area, where:
       1.0 = example is consistent with the self-ratings
       0.5 = questionable support for the self-ratings
       0.0 = lacking support for the self-ratings
     - Apply the corrected scores to the pass point to determine who has actually passed the hurdle.
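The two-stage procedure above can be sketched as follows. This is a simplified, hypothetical illustration: it applies one SME weight to a candidate's total self-rated score, whereas the slide describes weighting each self-rating area separately; candidate names, scores, and the pass point are invented.

```python
# Sketch of the two-stage pass-point procedure (simplified: one SME
# weight per candidate instead of one per self-rating area).

def stage_one(self_scores, pass_point):
    """Keep candidates whose self-rated total meets the pass point."""
    return {cand: total for cand, total in self_scores.items()
            if total >= pass_point}

def stage_two(passed, sme_weights, pass_point):
    """Correct each passing score by its SME weight (1.0 / 0.5 / 0.0),
    then re-check the corrected score against the same pass point."""
    return [cand for cand, total in passed.items()
            if total * sme_weights[cand] >= pass_point]

self_scores = {"A": 40, "B": 32, "C": 25}
sme_weights = {"A": 1.0, "B": 0.5}   # C never reaches SME review

passed = stage_one(self_scores, pass_point=30)       # A and B pass
final = stage_two(passed, sme_weights, pass_point=30)
print(final)  # ['A'] -- B's corrected score (16) falls below 30
```

Note how Stage 2 only costs SME time for candidates who already cleared the self-rated hurdle, which is the efficiency point of the two-stage design.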
  21. Broad-based testing
     - Core test battery for as many classifications as possible
     - Score banking and certification to different exam plans
     - Different weights and cut-off scores for different classifications, based on relevancy and the level needed
     - College Board model (SAT, GRE, etc.)
  22. Broad-based testing: What problem are we solving?
     - Many related classifications
       - Differ on specific job content and levels of responsibility
       - Many obstacles to class consolidation: logistical, political, fiscal
     - Each class has a unique examination
       - Variable content and quality, not consistently updated
       - Many of the same candidates
       - 2-3 month recruitment and examination cycle time
     - No overall competency structure across the classes
       - Lack of core competencies and competency progression
       - Pass-point paradoxes (lower pass points for higher-level jobs based on selection ratios)
  23. Broad-based testing: Purpose and Objectives
     - Improve testing reliability and validity
     - Improve coherence of clerical examination plans
     - Reduce redundant testing
     - Increase recruitment efficiency
     - Reduce the time to place candidates on eligibility lists
  24. Broad-based testing: Recruitment Method 1: Recruit for the broad-based test
     - Annual calendar of assessment (e.g., quarterly)
     - Establishes a pre-tested pool of potential candidates for the relevant classes
     - For most job classifications, limited or no public posting for job-specific recruitments
     - Invitations to apply for specific jobs are sent to the pre-tested pool based on the test cut-off score
  25. Broad-based testing: Recruitment Method 2: Recruit for a classification or class job family
     - Build a pre-tested pool from each recruitment process
     - Candidates apply for each recruitment process
     - Allow non-tested candidates to participate in each process and its exams
     - Pre-tested candidates are informed of their status based on test cut-off scores
  26. Broad-based testing, Clerical & Secretarial Classifications: Competencies Assessed and Tests
     - Sequencing and ordering speed and accuracy (timed)
     - Checking and comparing speed and accuracy (timed)
     - Computational speed and accuracy (timed)
     - Following instructions (multiple choice)
     - English usage and grammar (multiple choice)
     - Data entry speed and accuracy (timed, OPAC)
     - Microsoft Word skills (performance, OPAC)
  27. Broad-based testing: Applicable BBC Classifications
     Clerk; Intermediate Clerk; Senior Clerk; School Clerk; Senior School Clerk; Data Control Clerk; Senior Data Control Clerk; Typist Clerk; Intermediate Typist Clerk; Senior Typist Clerk; Temporary Office Worker; Department Assistant, Dance; Department Assistant, Music; Department Assistant, Theater; Department Assistant, Visual Arts; Receptionist; Reader; Information Resources Specialist; Media Dispatching Clerk; Secretary; Division Secretary; Legal Secretary; School Administrative Secretary; Senior Division Secretary; Executive Legal Secretary; Executive Assistant
  28. Broad-based testing: Broad-Based Administrative (BBA)
     Administrative Analyst; Administrative Assistant; Administrative Aide; Assistant Administrative Analyst; Resource & Development Analyst; Project Coordinator; Compensation Analyst; Research Analyst; Legislative Analyst; Budget Analyst; Head Start Program Results Specialist; Human Resources Aide; Human Resources Analyst; HS Program Development Specialist; Assistant Human Resources Analyst; Labor Relations Specialist
  29. Broad-based testing: Broad-Based Services (BBS)
     Custodian; Senior Custodian; Maintenance Worker; Senior Maintenance Worker; Delivery Driver; Reprographics Worker; Utility Worker
  30. The Take-Back Message
  31. Job-Related Strategies
     - Self-selection (pre-application stage)
       - Focused recruitment
       - Clear job descriptions and requirements
       - Realistic job previews
     - Efficient first-stage screening (pre-invite stage)
       - Automated objective written tests
       - Un-proctored Internet testing
       - Auto-scored supplemental questionnaires
     - Broad-based testing (efficient test administration)
       - One test for multiple classifications
       - Score-bank candidate scores
       - Improve reliability, validity, and efficiency
  32. What are you doing?