
Real World Consulting Organizational Performance Measurement



Presentation Transcript


  1. Mark Tregar, MS. Real World Consulting: Organizational Performance Measurement

  2. Agenda Introduction and Bio Career Path What is Consulting Really? Example of Organizational Performance Management Project Questions?

  3. Mark Tregar - Bio

  4. Levels Across Firms: An example of my career path • My career trajectory has spanned 3 organizations in the past 8 years. • Possible UB graduates' entry point: between consultant and senior consultant!

  5. Why Companies Hire Consultants • Staff augmentation • External change force aka “political cover” • Best practices across industries and functions • Analytical horsepower • Fresh perspective • Training and skillset growth of staff Management consulting is the practice of helping organizations to improve their performance, primarily through the analysis of existing organizational problems and development of plans for improvement. Organizations may draw upon the services of management consultants for a number of reasons, including gaining external (and presumably objective) advice and access to the consultants' specialized expertise. http://en.wikipedia.org/wiki/Management_consulting

  6. A Review of Performance Measures Work at USAID November 21, 2013 USAID HR LOB CHRRPS Technical Integration Support Project

  7. Table of Contents • Background 8 • Purpose and Approach 9 • Conceptual Design 9 • Detailed Design 10 • Data Gathering 13 • Data Reporting and Analysis 14 • Next Steps and Lessons Learned 16

  8. The purpose of the Performance Measures effort was to aid the measurement of the overall health of HR operations. • The team identified a series of specific Key Performance Indicators (KPIs) and then selected the appropriate KPIs to provide an accurate measure of current HR operations for USAID. • The identified KPIs were reported and measured in a one-time scorecard using Balanced Scorecard-based categories. • Our approach to selecting KPIs was adapted from the 2009 OPM Shared Service Center (SSC) Human Resource Line of Business (HR LOB) Benchmarking Process, which was piloted with 8 agencies and expanded to 12 in 2010. Figure 1. Our 4-Step Approach, Adapted from OPM's SSC HR LOB Benchmarking Steps: (1) Conceptual Design - deliverable: Scorecard Template, delivered 12/12/12; (2) Detailed Design - deliverable: ID of 10 OHR Process and HR LOB System KPIs, delivered 5/24/13; (3) Data Gathering - Jul-Sep; (4) Data Reporting and Analysis - deliverable: Initial Performance Report, delivered 8/9/13.

  9. In Conceptual Design, IBM developed a conceptual Scorecard Template model and gathered feedback on what to include. We envisioned a future-state report/scorecard with two levels and four Balanced Scorecard categories: • Level 1 - Top OHR USAID Management Report/Scorecard reports high-level results (i.e., Customer Service) with an overall status for a given time period. • Level 2 - HR Management and Staff Report/Scorecard reports at a lower level of results (i.e., results for a specific metric) for a given time period.

  10. In Detailed Design, the team refined our original list of 135 KPIs to a smaller set that was most relevant to USAID. • We cut our list to 49 preliminary KPIs* based on initial performance measure research and a series of meetings with key stakeholders. • Our original list of 135 was an agency list with additional potential KPIs from IBM's Benchmarking wizard. • In addition, we added KPIs based on our first focus group, which included HR staff members, and meetings across HR. • To gather feedback we conducted workshops with the following groups: HR leaders, HR staff (Customer Service Group), 3 HR teams, and one contractor team. End result: final KPIs selected and then measured. 135 KPIs → 54 KPIs → 49* KPIs → 45 KPIs. *The number of KPIs ranged between 26 and 55 throughout the stakeholder engagement process.

  11. In Detailed Design, we used a series of questions to help leadership stakeholders cut KPIs. • Discussion items: • Which KPIs are most valuable for selection and use, and why? • Which KPIs should be removed from our potential list? • Example KPI: HR Servicing Ratio - This metric measures the number of employees receiving HR services from Agency HR employees. The metric provides insight into the size and support level of HR. • UB discussion: What are some others that could be important to an agency? Figure 1. Items to consider in reviewing KPIs.

  12. Following Detailed Design discussions, the IBM team rated each KPI on Relevance and Data Availability. • Based on this understanding of USAID, we rated each KPI on Relevance and Data Availability. • The sum of these two ratings forms the Overall KPI Usage Score. • Each KPI was also cataloged into one of 12 categories (e.g., Time to Hire, Processing) to help understand KPI type. • The final KPI determination for use was based on the Overall KPI Usage Score and representation across the 12 categories. • This allowed for targeting KPIs with available data, while avoiding oversampling in any category. • The feedback we received from the HR Director and targeted HR groups helped us understand the key priorities for OHR. • Relevance Rating (Likert scale of 1-5): 1: Not Relevant to OHR Priorities; 2: Little Relevance to OHR Priorities; 3: Some Relevance to OHR Priorities; 4: Moderate Relevance to OHR Priorities; 5: Very Relevant to OHR Priorities. • Data Availability Rating (Likert scale of 1-5): 1: Data is not available in any format; 2: Data is available sporadically and is maintained less than monthly; 3: Data is available but in a static spreadsheet format; 4: Data is fully available in a database and available to the team indirectly; 5: Data is fully available in a database and available to the team directly. Relevance Rating + Data Availability Rating = Overall KPI Usage Score
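The scoring and selection scheme on this slide can be sketched in a few lines of code. This is an illustrative sketch only, not from the original project: the KPI names, categories, and ratings below are invented for the example.

```python
# Illustrative sketch of the KPI selection logic described above.
# KPI names, categories, and all ratings are invented for the example.
from collections import defaultdict

# (name, category, relevance 1-5, data availability 1-5)
kpis = [
    ("HR Servicing Ratio",               "Staffing",     5, 4),
    ("Average Time to Hire",             "Time to Hire", 4, 5),
    ("Personnel Action Processing Time", "Processing",   3, 3),
    ("Training Hours per Employee",      "Training",     4, 2),
]

def usage_score(relevance, availability):
    """Overall KPI Usage Score = Relevance Rating + Data Availability Rating."""
    return relevance + availability

# Group KPIs by category and keep the top scorer in each, so no
# single category is oversampled in the final scorecard.
by_category = defaultdict(list)
for name, category, rel, avail in kpis:
    by_category[category].append((usage_score(rel, avail), name))

selected = {cat: max(scored) for cat, scored in by_category.items()}
for category, (score, name) in sorted(selected.items()):
    print(f"{category}: {name} (score {score})")
```

In the actual effort the final cut also reflected stakeholder feedback and HR Director priorities, so a pure score ranking like this is only the starting point.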

  13. In Data Gathering, the team met with groups such as HR Training and Education (HR T&E) to locate and collect data. • In USAID, each KPI had a specific HR owner who had access to raw data and could provide data to the team. • The team interviewed the data owners and tracked the status of data collection, the data source (identification of the specific database), and who could help us collect the data. • The team also tracked the specific parameters of each data element (e.g., the date range and specific queries used in WA, PeopleSoft, or other databases) in order to validate the metrics at a later date. • Sample KPI discussion questions: • In our current list of KPIs (right), which KPIs do you currently track? • If you track this information, where is it currently stored (e.g., Excel, a database)? • What metrics do you measure monthly? Annually? • Which KPIs are most important to your team? Least important? • In your ideal scorecard, which KPIs would be reported?

  14. In Data Reporting and Analysis, the team successfully collected and analyzed all data. • Each KPI was reported across 3 time periods. • We compared each result to IBM's benchmark overall median and a specific public-sector median. • The data was shared with the HR Director in the form of the Initial Performance Report. The HR Director reported our findings in multiple sessions with her Agency Leadership Council (ALC) and the USAID Administrator, Dr. Rajiv Shah. • Similar data analysis can be found in annual HR LOB Reports: • http://www.opm.gov/services-for-agencies/hr-line-of-business/benchmarking/payroll-benchmarking/2011report.pdf • http://www.opm.gov/services-for-agencies/hr-line-of-business/benchmarking/hr-benchmarking/2010report.pdf
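The median comparison described on this slide can be illustrated in code. The peer-agency values and agency result below are invented for the example; the slide does not publish the underlying numbers.

```python
# Illustrative only: comparing one KPI result to a benchmark median,
# in the spirit of the Initial Performance Report. All values invented.
import statistics

peer_values = [55, 62, 70, 48, 81, 66, 59]   # hypothetical peer-agency results
overall_median = statistics.median(peer_values)

agency_result = 64                            # hypothetical agency result
delta = agency_result - overall_median
print(f"Benchmark median: {overall_median}, "
      f"agency result: {agency_result}, delta: {delta:+}")
```

A real scorecard would repeat this per KPI and per period, against both the overall and the public-sector median.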

  15. Sample of results in 2010 OPM HR LOB Report • Agency HR Servicing Ratio: This metric measures the number of employees receiving HR services from Agency HR employees. The metric provides insight into the size and support level of HR. • Source: MAESC 2010 HR LOB Benchmarking Report - http://www.opm.gov/services-for-agencies/hr-line-of-business/benchmarking/hr-benchmarking/2010report.pdf

  16. HR leaders are moving forward with a new BI Tool deployment. Next steps: • HR leaders are moving forward with the BI Tool deployment based on our ideas. • We hope to work with USAID later this year. Lessons learned. Do: • Get leadership's input and approval before discussing the effort with other leaders and managers. • Use multiple ways of gathering information. Even if you are promised data, it may not be available as needed. • Create ratings based on your understanding of KPIs, and work to validate this understanding throughout the process. • Encourage transparency with the initiative's goals, expected outcomes, and level of accountability. Don't: • Hold up your progress for one person. • Look for one person to have all the answers. • Commit to more KPIs or data collection than necessary for your primary objective.

  17. Thanks! Mark Tregar, mrtregar@us.ibm.com

  18. Mark Tregar, Managing Consultant: first 9 months in review. Overall, I was very successful in my first 9 months, including solid client contributions as well as B&P and MOS project success. At USAID, I received "O" ratings and high client satisfaction; at WellPoint, I stabilized a challenging client environment, which supported our win; at CPAC, I completed all deliverables on time and on budget. USAID Team Lead / DPM: • Led USAID's Performance Measures development task while providing competency management expertise in the launch of the USAID LMS. • Received positive client feedback for USAID performance, such as: "I've been impressed with the quality of his work products and his ability to dig right in…and figure it out." • Provided project management expertise at USAID, including financial and labor planning and creating Project Launch documentation. • Led the WellPoint communications team in a complex and competitive environment. • Led the CPAC team (3rd client - VA CPAC; resources led/mentored: 1), delivering all deliverables on time and on budget, and developed executable plans for future work; provided mentorship and leadership to all lower-level staff, as well as clear and consistent feedback on all products and deliverables. • Analyzed and presented performance management statistics to NASA HR leaders and received high client praise. Business development: contributed to IBM with work on over 10 proposals, including VA HR LOB, OPM TMA, DHS S&T SETA IDIQ, Dept. of Energy, VA VISN 5, and State of New York. Led the VA HR LOB cost proposal input development by interviewing proposal leaders and facilitated large proposal editing sessions; authored a section of the OPM TMA proposal and co-authored the VA Social Media RFI response; provided research and analysis for the Dept. of Energy solution and the DHS S&T SETA IDIQ response. Other work: developed an improved Public Sector Competency Development Methodology (Lead, Sep 2012-Jan 2013); Social Media for Human Capital white paper (Lead & Author, Nov 2012, in review); Project 517/Career Pro - Competency Team Lead (Apr-Aug 2012); Department of Energy (DoE) Vforce - Contributor (Apr 2012).

  19. Overview of Competency-Based Management. Competencies embody an individual's total experience, including both training and education. A competency is an observable, measurable pattern of skills, knowledge, abilities, behaviors, and other characteristics that an individual needs to perform work roles or occupational functions successfully.

  20. Relationship between KSAs and Competencies. A competency is a "bundle" of the KSAs associated with a given area of work. It is the description of observable behaviors that makes it different from a single K, S, or A. Competency: Internal Audit and Control - Review and validate accounting and administrative controls to safeguard the integrity of programs and achieve compliance in accordance with government auditing standards. Knowledge: Understand accounting and administrative controls. Skills: Operate internal audit software programs to validate compliance. Abilities: Able to apply audit standards when dealing with financial information. A competency is more than just the sum of knowledge, skills, and abilities.
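The bundle idea above can be made concrete with a small data structure. This is a minimal sketch, not part of the original deck; the class and field names are invented for illustration, using the Internal Audit and Control example from the slide.

```python
# Minimal sketch: a competency as a bundle of KSAs plus an observable
# behavioral description. Class and field names are invented.
from dataclasses import dataclass, field

@dataclass
class Competency:
    name: str
    description: str                      # the observable, measurable behavior
    knowledge: list = field(default_factory=list)
    skills: list = field(default_factory=list)
    abilities: list = field(default_factory=list)

    def ksa_count(self) -> int:
        return len(self.knowledge) + len(self.skills) + len(self.abilities)

audit = Competency(
    name="Internal Audit and Control",
    description=("Review and validate accounting and administrative controls "
                 "to safeguard the integrity of programs"),
    knowledge=["Accounting and administrative controls"],
    skills=["Operating internal audit software to validate compliance"],
    abilities=["Applying audit standards to financial information"],
)
print(f"{audit.name} bundles {audit.ksa_count()} KSAs")
```

The design point the slide makes is captured by the `description` field: what gets observed and rated is the behavioral description, not any single K, S, or A entry.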

  21. Competency Development Process: Plan → Initiate → Develop and Refine → Validate and Apply.
1. Plan - Plan, envision, and communicate the competency effort: • Define objectives: build the business case for the competency model and link the model to the strategic plan. • Collect background materials: request existing competency models, position descriptions, etc., and review previous related IBM competency models. • Research and review: review the strategic plan, organizational background, and related occupational models. • Develop and execute a communications plan to build sponsor support: deliver a kick-off brief, then collect feedback and create additional briefings.
2. Initiate - Select and initiate the specific competency modeling approach: • Identify the data collection methodology to meet project objectives, including the proper mix of interviews, focus groups, and surveys. • Identify and select SMEs: work with the sponsor to identify Resource Panel SMEs and Incumbent SMEs, and identify required SME population details (i.e., numbers of staff, occupations, levels). • Initiate the effort by inviting attendees to the Resource Panel, supported by an executive memo or message of support for the effort.
3. Develop and Refine - Develop and refine competency models: • Develop/refine the competency model using Resource Panels: hold 2-3 facilitated sessions with SMEs. Output: straw man competency model. • Refine the competency model using Incumbent SMEs via focus group or survey [initial validation], using SME input (i.e., ratings, behavioral examples) and factor analysis (100+ SMEs required). Output: interim competency model. • Develop the final competency model: analyze data, develop the final model, and present it to the sponsor. Output: final competency model.
4. Validate and Apply - Validate competency models and apply them in various applications: • Conduct a final validation survey with Incumbent SMEs in preparation for deeper applications: develop and refine competency survey behavioral anchors and execute the competency validation survey. Note: this can double as a competency assessment (including proficiency items). Output: validated competency model. • Apply the model in workforce applications, including workforce and succession planning, selection*, and training needs assessment.

  22. An Example of Data Collection for Competency Modeling • Behavioral Event Interviews (BEIs) are a type of structured interview used for collecting competency data from SMEs. • BEIs can be used to collect additional qualitative information for use in competency model development. • BEIs often use a Situation-Task-Action-Reasoning/Results format in response to a prompt (i.e., "Tell me about a time when…").

  23. Class Discussion • Why do IBM consultants exist? What is their job, and what is the end result for consultants? • What types of competencies are useful? • What competencies are lacking in the workforce today? • Think of what this looks like on the job: think of a recent situation where you helped a client, or where you saw this happen (i.e., at work, in a movie). What types of behaviors helped? How would you classify those behaviors?

  24. As you move up the career ladder there is an expectation of deeper involvement, independence, and relationship building. Areas: Project Planning, Communication, Work Product Development, Customer Management. What is this chart for? Many beginning practitioners do not have a solid idea of what is expected of them at their particular level. This chart makes the expectations clearer.

  25. Applying Competency Models in Recruiting/Hiring. Competency: Financial Management. Interview question: "With regard to financial management, can you tell me about a time when you dealt with a large budget in a complex project that was delivered on time and on budget? Please be specific as to the size of the budget, the team, and any constraints." Behavioral interviewing: once a validated model is created with a full set of behavioral anchors (i.e., a model with 1, 3, 5 benchmarks), you can create behaviorally-based interview questions. Behaviors by proficiency level:
1-Not Capable: • Understand asset management principles to budgeting and programming projects • Know about available tools (unit price books, other contract estimate tools) in order to conduct complex end-to-end cost estimates
2-Minimally Capable: • Understand asset management principles for budgeting and programming projects and able to determine rules for use • Apply available tools in basic situations (unit price books, other contract estimate tools) in order to conduct complex end-to-end cost estimates
3-Capable: • Apply asset management principles to budgeting and programming projects • Interpret and apply available tools (unit price books, other contract estimate tools) in order to conduct complex end-to-end cost estimates
4-Fully Capable: • Apply asset management principles to budgeting and programming projects • Select the best available tools (unit price books, other contract estimate tools) in order to conduct complex end-to-end cost estimates
5-Exceeds Capability: • Evaluate asset management principles and execution in budgeting and programming projects • Evaluate the use of available tools (unit price books, other contract estimate tools) in order to conduct complex end-to-end cost estimates

  26. All Top Consulting Firms Use Competencies in Hiring • Companies are looking for examples of your past work. Remember, history repeats itself! • Chances are you will continue what you did in the past - good and bad. • Think about what they are hiring for (look at the job posting), then think about: • What competencies do they need in their hires? • What BEHAVIORS can I talk about? A pro tip: use the STAR response method. It will keep you from going beyond what people want.
