
IAEWS Benchmark Study September 2011



Presentation Transcript


  1. IAEWS Benchmark Study, September 2011

  2. Background and Purpose
  • Effort began in January 2011
  • Goal: create an industry-led benchmark study of key metrics for job board operations and effectiveness
  • Challenges in defining the scope and depth of questions
  • Significant work in developing common definitions with the worldwide community
  • Use first-year results as a baseline to improve and “dig deeper” in future years

  3. Methodology
  • Input on scope and questions solicited from many boards worldwide
  • Jobg8.com sponsorship allowed no-cost participation by worldwide boards
  • Contracted a professional research company (Critical Insights) to improve online data collection and ensure data security
  • Relatively short time frame to create, distribute, collect, correlate, and publish results

  4. Participation and Response
  • Solicited all IAEWS members and other worldwide job boards in June and July
  • 154 job boards registered to participate in the first-year study
  • 124 boards submitted some response
  • 101 boards completed the online survey and were invited to participate in a discussion of the results
  • Survey closed on August 8; results published and distributed September 1
  • High-level results presented to all IAEWS members today

  5. Who Participated? (Reporting region)

  6. Types of Boards

  7. Job Board: Years in Business

  8. Performance Metrics
  • Data collected on 16 different KPIs
  • Cross-tabulations done on 5 different segments
  • This morning’s participants requested that the two-hour discussion focus on:
    • Sources of traffic: relative importance
    • Comparison of job board features and functionality
    • Application production rates for different postings
    • Expense ratios for marketing, sales, and technology
    • Sources of job postings

  9. Some Interesting Metrics
  • Average time on site: median = 4.0 minutes
    • Wide variations by region and type of board
  • Types of traffic: “window shoppers” vs. those who take action
    • Median = 70% “window shoppers”; median = 30% “take some action”
    • Wide variations based on region, type, tenure, and size
  • Bounce rate: median = 41%
    • Wide variations based on region, tenure, and type

  10. Some Interesting Metrics
  • Significant variations based on region, type, tenure, and size

  11. Some Interesting Metrics
  • Performance: generating applications or candidates to an ATS
  • ATS postings
    • Median = 5.0
    • Large variations based on region, size, tenure, and type (range of 1–22)
  • “Email” postings
    • Median = 3.3
    • Large variations based on region, size, tenure, and type (range of 1–28)
    • 40% of respondents checked “Unknown”

  12. Summary of This Morning’s Discussion Session
  • Great opportunity for real discussion of common issues facing our industry
  • Candid discussion and sharing of some best practices
  • Further individual analysis by each participant to compare results against their niche and set goals for where they want to be by next year
  • Confirmation that this study should be repeated next year
  • Great suggestions on how to improve the instrument, process, and participation

  13. 2011 Study: Lessons Learned
  • Solicit members to join a steering committee by late fall
    • Scope, questions, promotion, response requirements
  • Use the 2011 study and comments from the member discussion to better craft questions and probe deeper on certain issues
  • Complete the survey by early summer
  • Use the steering committee to improve the survey
  • Focus on some additional areas of job board performance
  • Allow a longer discussion period at the Fall IAEWS meeting

  14. Final Thoughts
  • A great first-year effort by the Association and the members who participated
  • If you did not participate, you missed the opportunity for some good data
  • Continuous improvement through continuous measurement and analysis
  • Watch the IAEWS newsletter for an opportunity to participate in the steering committee or as a survey respondent next year
