When Surveys Collide

Presentation Transcript


  1. When Surveys Collide May 8, 2007

  2. Artist & Cast Introductions • Gene Baker, Keane • Ken Cardinal, Pearl Meyer & Partners • Patrice Daprino, IBM • Tina Mulligan, Nortel • Andrea Sears, Sprint Nextel

  3. The Problem Many firms use multiple survey sources, and the numbers from these sources may or may not match. Compensation professionals may be challenged by data that is inconsistent with the data they are already using. One thing is certain: the answers are always as much art as science…

  4. Today • We’ll explore typical problem areas • Handling multiple survey sources • Dealing with internet data • Differences between types of data • Sales • Executive • Professional • Administrative • Geographic issues • Industry issues • Job scope issues • International data issues

  5. Collisions • We’re talking about collisions of competing information. • In our discussion, we’ll also introduce situations compensation professionals routinely face • Difficult employees & managers • Employees with access to the internet • Employee distrust of “management” • Base salary vs. total cash compensation • Internal equity issues

  6. Our First Example • Compares data from within the same survey…

  7. Actual Survey Data – U.S.-wide vs. Silicon Valley

  8. Our Next Examples Are A Little More Fun… • A “standard” collision • An internet collision • An international collision • An executive compensation collision • A sales compensation collision

  9. Survey Collision – Product Development

  10. What Causes Collisions? Survey participants • Differences in the participant base • Number • Type • Size • Is there an industry focus? • Are your competitors in the survey?

  11. Job Descriptions • Survey A: Creates, designs and develops the company’s new products and/or services. Translates concepts and technologies into product design. Innovation and creative problem solving are required. May act in a lead role for a product development team. • Survey B: Plans and develops new products. This senior role leads a team of product developers through concept creation, technical development, testing, and rollout of the product to market. Experience required is typically 8 years post-college. • Survey C: Manages the lifecycle of a product through development, rollout, evaluation/maintenance, and retirement. May be assigned a single product or multiple products. Typically reports to a Product Development/Management Director.

  12. What Causes Collisions? Job descriptions • Level of detail • Tasks included • Complexity • Level of experience • Reporting relationships • 70 - 80% content match

  13. What Causes Collisions? Survey cuts • Where is your labor pool? • Which scope is most important for this job? • Revenue • Industry • Headcount • Geography • Pay mix / quota (for sales jobs)

  14. What Causes Collisions? Leveling • How many levels? • What are the level cutters? • Knowledge • Complexity • Experience • Education • Reporting relationships

  15. What Causes Collisions? Survey Reliability • Year-over-year data stability • Survey age • Size of incumbent database Input Insights • Which companies matched that job? • Average / median revenue of matching companies
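
One quick way to gauge year-over-year stability is to line up each benchmark's current median against the prior year's figure and flag unusually large swings for review. A minimal sketch in Python; the job names, medians, and 10% flag threshold below are hypothetical, not figures from this presentation.

    # Flag benchmark jobs whose reported median moved sharply year over year,
    # a rough proxy for survey stability. All figures are hypothetical.
    medians_prior = {"Product Developer III": 92_000, "Sales Representative II": 68_500}
    medians_current = {"Product Developer III": 95_500, "Sales Representative II": 81_000}

    FLAG_THRESHOLD = 0.10  # review anything that moved more than 10%

    for job, prior in medians_prior.items():
        current = medians_current.get(job)
        if current is None:
            continue  # job not reported this year
        change = (current - prior) / prior
        status = "REVIEW" if abs(change) > FLAG_THRESHOLD else "ok"
        print(f"{job}: {change:+.1%} ({status})")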

  16. What Causes Collisions? Survey Factors • Participants / relevance • Age / history • Reliability Job Factors • Job descriptions • Data cuts • Leveling • Companies matching Remember: Comparing data from multiple survey sources is an ART, not a SCIENCE!
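
When two or three surveys report different numbers for the same benchmark, one common approach (and very much part of the "art") is a weighted blend, with weights reflecting how well each survey's participants, cuts, and job match fit your situation. A minimal sketch; the survey names, medians, and weights below are hypothetical.

    # Blend medians from multiple survey sources for one benchmark job.
    # Weights are a judgment call: participant relevance, match quality, data age.
    surveys = [
        {"name": "Survey A", "median": 98_000, "weight": 0.5},
        {"name": "Survey B", "median": 104_000, "weight": 0.3},
        {"name": "Survey C", "median": 89_000, "weight": 0.2},
    ]

    total_weight = sum(s["weight"] for s in surveys)
    blended = sum(s["median"] * s["weight"] for s in surveys) / total_weight
    print(f"Blended market rate: ${blended:,.0f}")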

  17. Let’s Look At Some Survey Data – Internet • Hot Off the Press… Excellent Engineer Announces…

  18. Script 2 – Takeaways Survey Data Reporting • Where data originated • How data is collected • Self-reported • Provided by compensation professionals • Involvement of third-party provider • Confidentiality measures

  19. Script 2 – Takeaways Determining A Reliable Survey Source • Robust benchmark job descriptions • Detailed leveling guide • Job responsibilities • Knowledge • Problem complexity • Impact on business • Education and experience • Job matching, preferably face-to-face • Proven methodology

  20. Script 2 – Takeaways Determining A Reliable Survey Source • Proven methodology • Quality controls in data collection and analysis • Data inspection • Process for handling outliers and data anomalies • Following safe harbor guidelines • Dealing with data dominance • Ensuring no one firm drives the results • Survey stability over time • Historical record • Trending • Confidence in the data
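
A simple data-dominance check asks whether any one participant supplies an outsized share of the incumbents behind a reported statistic. A minimal sketch; the firm names and counts are hypothetical, and the at-least-5-participants / no-more-than-25% figures reflect commonly cited U.S. antitrust safety-zone guidance rather than anything stated in this presentation.

    # Check that no single participating firm dominates the incumbent data
    # behind a reported statistic. Counts and thresholds are illustrative;
    # the 5-participant / 25% figures echo commonly cited safety-zone guidance.
    incumbents_by_firm = {"Firm 1": 120, "Firm 2": 45, "Firm 3": 30, "Firm 4": 22, "Firm 5": 18}

    total = sum(incumbents_by_firm.values())
    print(f"Participants: {len(incumbents_by_firm)} (minimum of 5: {len(incumbents_by_firm) >= 5})")

    for firm, count in incumbents_by_firm.items():
        share = count / total
        if share > 0.25:
            print(f"{firm} supplies {share:.0%} of incumbents and may drive the results")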

  21. Script 3 – Data

  22. Script 3 – Takeaways • Understand elements included in pay figures and how elements are reported. • Allowances • Allowances (housing, transportation, meals, etc.) in addition to base • Fixed amount or percent of base pay • Total Annual Pay • Monthly pay increments • Commonly referred to as guaranteed or 13th / 14th month bonus
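
Because allowances and guaranteed extra months can be reported in different ways, it often helps to rebuild total annual pay explicitly before comparing sources. A minimal sketch; the monthly base, 13-month schedule, and allowance amounts below are hypothetical.

    # Rebuild total annual pay from monthly base pay, a guaranteed 13th month,
    # and allowances reported as a percent of base or as a fixed amount.
    monthly_base = 4_000
    guaranteed_months = 13              # base paid 13 times per year
    housing_allowance_pct = 0.15        # 15% of annual base
    transport_allowance_fixed = 1_800   # fixed annual amount

    annual_base = monthly_base * guaranteed_months
    allowances = annual_base * housing_allowance_pct + transport_allowance_fixed
    total_annual_pay = annual_base + allowances

    print(f"Annual base (incl. 13th month): {annual_base:,.0f}")
    print(f"Allowances:                     {allowances:,.0f}")
    print(f"Total annual pay:               {total_annual_pay:,.0f}")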

  23. Script 3 – Takeaways • Understand the “make-up” of the overall incumbents included in the survey results. • Participating Companies • Survey results can vary significantly based on which companies are included. • Companies with small populations may not offer full benefit packages / stock options. • Geography • Like the U.S., significant differences exist within countries.

  24. Let’s Look At Some Survey Data - Executive 266K

  25. Take Away • Examine the nature of the position you’re reviewing and determine how to compare it. • Compensation is about more than looking at a page and finding a number. • The art and science is often as simple as making sure you’re looking at the right page… • And don’t get too bruised if sometimes, somebody goes over your head.

  26. Sales Compensation Benchmark – Key Questions • Did we select the right data to use in our compensation planning? • Where does the market analysis “collide” with our plan design? • Things to look for and compare to your firm • Caps • Market data payout curves • Market difficulty in reaching quota • How compensation credit is assigned toward commissions

  27. Sales Compensation Benchmark – Market Analysis

  28. Sales Compensation Benchmark – “Collision Page” • What observations can we make? • Competitive compensation and aligned to market at quota • Only 20% of sales employees exceeded quota • Degree of difficulty in reaching quota • Is the risk of missing quota sufficiently dealt with in the plan or the earnings? - We do not have a threshold {Positive} - We do not have a cap {Positive} - Our earnings lag market below quota {Negative} - We credit commissions on revenue {Negative} - Our quotas are higher than market {?}
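
One way to make these observations concrete is to model total cash at several attainment levels under both our plan and a market-style plan, so the "lag below quota, advantage above quota" pattern shows up in the numbers. A minimal sketch; the pay mixes, cap, and attainment points below are invented for illustration and are not the figures behind this presentation.

    # Compare total cash at several quota-attainment levels: "our" plan pays
    # commissions from the first dollar with no cap, while the hypothetical
    # market plan carries a richer base and caps incentive at 150% of target.
    def our_plan(attainment):
        return 50_000 + 50_000 * attainment           # no threshold, no cap

    def market_plan(attainment):
        incentive = min(30_000 * attainment, 45_000)  # capped at 1.5x target
        return 70_000 + incentive

    for attainment in (0.6, 0.8, 1.0, 1.2, 1.6):
        ours, market = our_plan(attainment), market_plan(attainment)
        print(f"{attainment:>4.0%}  ours: {ours:>9,.0f}  market: {market:>9,.0f}"
              f"  gap: {ours - market:>+8,.0f}")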

  29. Sales Compensation Benchmark – “Questions” • What new questions emerge? • Do we actually lag the market? Is this a relevant question? Is Biff telling me the truth? • How does our pay relationship compare to market? • Is there a plan design problem compared to market? • Do we have the right risk versus return when compared to market?

  30. Sales Compensation Benchmark – “Questions” • Recommendations • Consider targeting total cash compensation closer to the 75th percentile • Our risk of missing quota “collides” with the market risk; while we may not lag at target, we lag in actual earnings. • Consider paying on bookings • Market analysis shows this is common practice • Paying on bookings improves cash flow where quotas are difficult stretches • Look at top performers • Retention risk during difficult stretches of the year • Stay within our CEO’s pay philosophy of asking for stretch performance
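
Testing a recommendation like "target total cash closer to the 75th percentile" starts with computing the market percentiles from the survey cut and comparing them with incumbents' actual earnings. A minimal sketch using Python's statistics module; all figures below are hypothetical.

    import statistics

    # Hypothetical market total cash compensation figures for the benchmark job.
    market_tcc = [118_000, 124_000, 131_000, 137_000, 142_000, 150_000, 158_000, 171_000]
    # Hypothetical actual earnings for our incumbents in the same role.
    our_actual_tcc = [112_000, 119_000, 126_000, 133_000]

    q1, market_p50, market_p75 = statistics.quantiles(market_tcc, n=4)  # quartile cut points
    our_median = statistics.median(our_actual_tcc)

    print(f"Market 50th percentile: {market_p50:,.0f}")
    print(f"Market 75th percentile: {market_p75:,.0f}")
    print(f"Our median actual TCC:  {our_median:,.0f} "
          f"({our_median / market_p75:.0%} of the market 75th percentile)")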

  31. Sales Compensation Benchmarking Sales Compensation Benchmarking is an art, not a science. It’s often not about the numbers, but about asking the right questions and forming the right solutions.

  32. Summary • Lots of organizations use multiple surveys. • All organizations have multiple sources of information. • Synthesizing survey and other information requires an understanding of • Where the information came from • How the information was developed • Reliability of the information
