
Value Measuring Methodology


Presentation Transcript


  1. Value Measuring Methodology Council for Excellence in Government: Benefits Assessment Workshop May 2003

  2. Why isn't "traditional" business case analysis providing the information OMB is looking for?
  • Primarily focused on financial benefits (e.g., ROI) that impact the government only…
  • Non-financial benefits are not directly factored into analysis…
  • No structure to force the development of quantifiable measures…
  • Assumption that what is good for government is good for citizens…
  • Analysis is viewed as a means to get funding, not a tool for on-going management & evaluation…

  3. How can traditional analysis be supplemented to better address the challenges of the e-Government environment?
  • In July 2001, the Social Security Administration (SSA), in cooperation with the General Services Administration (GSA), took on the task of developing an effective methodology to assess the value of electronic services that would be:
  1. Compliant with current federal regulations & OMB guidance
  2. Applicable across the federal government
  3. "Do-Able"
  • A team of Booz Allen analysts and thought-leaders affiliated with Harvard University's Kennedy School of Government was contracted to support this effort.

  4. The approach used to develop VMM was built upon the foundation of a public/private partnership. [Diagram: an iterative cycle of critical inputs & research, research & analysis, development, presentation, and discussion.]

  5. The output of this effort was the Value Measuring Methodology (VMM)
  • First articulated in Building a Methodology for Measuring the Value of e-Services (1/02)
  • Refined & tested through application to two cross-agency e-Government initiatives (e-Travel & e-Authentication)
  • Release of the VMM How-To-Guide and VMM Highlights documents by the Best Practices Committee of the CIO Council (10/02)
  • VMM Roll-Out, held by the Council for Excellence in Government in cooperation with the CIO Council's Best Practices Committee, OMB, and GSA (4/03)

  6. VMM Overview

  7. It is important to understand what VMM IS and ISN'T… VMM is compliant with GPRA, CCA, and OMB A-11, and consistent with the philosophy of the PMA.
  • VMM IS…
  • A scalable and flexible approach for quantifying and analyzing value, risk, and cost and evaluating the relationships among them
  • A help in creating a roadmap for on-going management and evaluation
  • Support for the development of critical management plans
  • VMM IS NOT…
  • One size fits all
  • A way to avoid analysis
  • Only useful for e-Government initiatives

  8. The Essential Factors…
  • Value: What benefits will it provide to direct users, society, and the government?
  • Cost: How much will it cost?
  • Risk: What could make costs go up or performance slip from projected levels?

  9. A Decision Framework
  • Define user needs & priorities
  • Quantifiable measures of performance (metrics, targets)
  • Foundation for analysis & on-going performance measurement
  • Early consideration of risk: risk inventory, risk tolerance boundary

  10. Communicating Value to Customers and Stakeholders. What will make an Appropriations Committee staff member or OMB care about an investment in digital Land Mobile Radio (LMR) equipment for public safety agencies across government?
  "The technically superior digital technology offers more bandwidth than analog technology because the signal is…"
  OR
  "Using digital LMR will prevent the catastrophic communications malfunctions and inefficiencies that cost lives in the aftermath of 9/11 in NYC. Digital LMR will accomplish this by…"

  11. VMM Effective in Building "WINNING" OMB Exhibit 300s
  Part I – Capital Asset Plan and Business Case (All Assets): Summary of Spending • Project Description and Justification • Performance Goals and Measures • Program Management • Alternatives Analysis • Risk Inventory and Assessment • Acquisition Strategy • Project and Funding Plan
  Part II – Additional Business Case Criteria for Information Technology: Enterprise Architecture • Security and Privacy • GPEA
  Each OMB 300 element is either fully satisfied by VMM outputs or supported by VMM output and process.
  PMA Imperatives: Captures All Value Factors/Benefits • Analytic Rigor • Clarity • Completeness • Focus On Results

  12. Value

  13. Identifying and Defining Value
  • Layer 1 – Value Factors: Direct User (Customer), Social (Non-Direct User), Government Financial, Government Operational/Foundational, and Strategic/Political
  • Layer 2 – A project value definition (measures) for each value factor

  14. Structured Approach to Identifying and Defining Value Measures. The way measures are articulated can directly impact the way they are perceived and understood. The definition must consist of four parts, as sketched below.
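
The four parts are not named on this slide, but the framework diagrams in Steps 2-4 list each value measure together with its metric, target, and scale. Assuming those three, plus the measure's description, are the four parts, here is a minimal sketch (the example values are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class ValueMeasure:
    """One value measure, defined by four parts: description, metric, target, scale."""
    description: str  # what is being measured
    metric: str       # the unit of measurement
    target: float     # the performance level the initiative aims for
    scale: tuple      # (worst, best) bounds used to normalize raw results to 0-100

    def normalized_score(self, raw: float) -> float:
        """Map a raw result onto the 0-100 scale."""
        worst, best = self.scale
        raw = min(max(raw, min(worst, best)), max(worst, best))  # clamp to the scale
        return 100 * (raw - worst) / (best - worst)

# Hypothetical direct-user measure
wait_time = ValueMeasure(
    description="Average time to check benefit status",
    metric="minutes per inquiry",
    target=2.0,
    scale=(30.0, 0.0),  # 30 min = worst (score 0), 0 min = best (score 100)
)
print(wait_time.normalized_score(6.0))  # 80.0
```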

  15. Building a Direct User Measure: What Do Users Want?

  16. Prioritizing Value Factors - The Analytic Hierarchy Process. Analytic Hierarchy Process (AHP) tools are designed to help groups enhance the quality of their decisions. These tools:
  • Bring structure to the decision-making process;
  • Elicit ideas, feelings, and the judgments of stakeholders;
  • Represent those judgments as meaningful numbers;
  • Synthesize the results; and
  • Analyze the sensitivity of those judgments to changes.
  Through the use of pair-wise comparisons, the relative importance of each of the criteria is calculated, and attention is focused on areas of disagreement (see the sketch below).
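
A minimal sketch of the pairwise-comparison arithmetic behind AHP, using the common geometric-mean approximation of the principal eigenvector (the comparison matrix below is hypothetical):

```python
import numpy as np

# Hypothetical pairwise comparisons among three value factors on Saaty's 1-9 scale:
# entry [i][j] = how much more important factor i is than factor j.
# Factors: Direct User, Government Financial, Strategic/Political.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# The geometric mean of each row, normalized, approximates the principal
# eigenvector of A, i.e., the relative weights of the factors.
gm = A.prod(axis=1) ** (1.0 / A.shape[1])
weights = gm / gm.sum()
print(weights.round(3))  # e.g., [0.648, 0.23, 0.122]
```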

  17. Summary of VMM Weighting & Scoring for Title XVI "Check Your Benefits." Applying VMM to Title XVI "Check Your Benefits," we determined scores for each of the value factors and their respective value measures.
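
The slide's actual scores are not reproduced here, but the aggregation VMM implies is a two-level weighted sum: measure scores roll up into factor scores, and factor scores roll up into an overall value score. A sketch with hypothetical weights and scores:

```python
# Hypothetical factor weights (summing to 1) and, per factor, hypothetical
# measures as (measure weight, normalized 0-100 score).
factors = {
    "Direct User":            (0.35, {"response time": (0.6, 85), "availability": (0.4, 70)}),
    "Government Financial":   (0.25, {"cost avoidance": (1.0, 60)}),
    "Government Operational": (0.40, {"cycle time": (0.5, 90), "error rate": (0.5, 75)}),
}

overall = 0.0
for name, (factor_weight, measures) in factors.items():
    factor_score = sum(w * score for w, score in measures.values())
    overall += factor_weight * factor_score
    print(f"{name}: {factor_score:.1f}")
print(f"Overall value score: {overall:.1f}")  # 75.7 with these inputs
```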

  18. Risk

  19. Identifying and Defining Risk. Risk that is not identified cannot be mitigated. Risks that are not mitigated can cause a project to fail, either in the pursuit of funding or, more dramatically, while the project is being implemented.
  IDENTIFYING RISKS:
  • Consider "standard" IT project risks
  • Identify project-specific risks via input from technical & policy staff and representatives of partner agencies, collected through working sessions and survey efforts
  EXAMPLE OMB RISK CATEGORIES: Project Resources/Financial • Technical/Technology • Business/Operational • Organizational & Change Management • Data/Information • Security • Strategic • Privacy

  20. Defining Risk Tolerance
  • Organizational tolerance for cost risk (increased cost)
  • Organizational tolerance for value risk (slippage in performance)
  What is the decision process behind the following?
  • Buying a $1 lottery ticket for the chance to win $1 million. Odds are 1 in 1,000.
  • Buying a $100 lottery ticket for the chance to win $1 million. Odds are 1 in 1,000.
  • Buying a $100 lottery ticket for the chance to win $10 million. Odds are 1 in 1,000.
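
The exercise works because the three tickets differ in expected value and in the amount put at risk; quick arithmetic (a sketch, not part of VMM itself) makes the contrast explicit:

```python
def expected_net(ticket_cost, prize, odds):
    """Expected net gain of one ticket: probability-weighted prize minus cost."""
    return prize * odds - ticket_cost

print(expected_net(1,   1_000_000, 1/1000))   #  $999  -> clearly worth $1
print(expected_net(100, 1_000_000, 1/1000))   #  $900  -> still positive, but $100 at stake
print(expected_net(100, 10_000_000, 1/1000))  # $9900  -> large edge, same $100 stake
```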

  21. Value and Cost Risk Tolerance Boundaries communicate the upper limit of the range of risk an organization will accept in both areas. As the estimated most likely value score increases, risk tolerance is likely to increase. As the estimated most likely cost increases, risk tolerance is likely to decrease.
  [Charts: value risk (0-35%) plotted against value score (0-100), and cost risk (0-35%) plotted against cost ($0-$50M); in each, a risk tolerance boundary curve separates the acceptable area below it from the unacceptable area above it.]
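
Once a boundary is defined as a set of points, checking an alternative against it is simple interpolation. A sketch, with a hypothetical value-risk boundary:

```python
import numpy as np

def within_boundary(x, risk, boundary_x, boundary_risk):
    """True if (x, risk) falls on or below the risk tolerance boundary."""
    max_acceptable_risk = np.interp(x, boundary_x, boundary_risk)
    return risk <= max_acceptable_risk

# Hypothetical value-risk boundary: tolerance rises with the expected value score.
value_scores   = [0, 25, 50, 75, 100]
value_risk_cap = [0.05, 0.10, 0.15, 0.22, 0.30]

print(within_boundary(70, 0.14, value_scores, value_risk_cap))  # True  (acceptable)
print(within_boundary(40, 0.20, value_scores, value_risk_cap))  # False (unacceptable)
```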

  22. Cost

  23. Identifying & Defining Costs. Consider value and risk, and ensure a complete, comprehensive cost estimate. Alleviate the risk of missing costs or double-counting by developing a Cost Element Structure. Investments made on incomplete or inaccurate estimates are likely to run out of funding and, therefore, to require justification for additional funding or a reduction of initiative scope.
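
A Cost Element Structure is a hierarchical breakdown in which every anticipated cost appears exactly once. A minimal sketch of such a structure and its roll-up; the element names and dollar figures are hypothetical, and only the 1.0/2.0/3.0 numbering convention comes from the VMM framework slides:

```python
# Hypothetical Cost Element Structure: each leaf cost appears exactly once,
# which guards against both missing costs and double-counting.
ces = {
    "1.0 System Planning & Development": {
        "1.1 Hardware": 1_200_000,
        "1.2 Software": 800_000,
        "1.3 Program Management": 450_000,
    },
    "2.0 System Acquisition & Implementation": {
        "2.1 Training": 300_000,
        "2.2 Data Migration": 150_000,
    },
    "3.0 System Maintenance & Operations": {
        "3.1 Help Desk": 250_000,
    },
}

for element, children in ces.items():
    print(f"{element}: ${sum(children.values()):,}")
print(f"Total estimate: ${sum(sum(c.values()) for c in ces.values()):,}")
```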

  24. Estimating and Comparing Value, Cost, & Risk

  25. Identifying and Defining Viable Alternatives. Identify viable alternatives that have the potential to deliver an optimum mix of both value and cost efficiency. Alternatives must address people, process & technology!
  • PEOPLE: training, outreach, management, staffing, communications, recruitment, socialization, user support, 508 requirements, language requirements, EA/FEA
  • TECHNOLOGY: hardware, software, interface, data requirements, EA/FEA
  • PROCESS: BPR, acquisition, outsourcing/in-sourcing, concept of operations, risk, security, program management, funding, collaboration, communications, evaluation, legislative requirements, policy requirements, EA/FEA

  26. The Base Case. Projects the results of maintaining current systems and processes while attempting to keep pace with changes over time.
  [Chart: the base case diverging from the status quo over time, shaped by rising demand, workforce attrition, and customer satisfaction.]

  27. Collecting Data. Avoid analysis paralysis: match information to the phase of development.
  • Data sources and detail depend upon the initiative's stage of development
  • Use the best information available rather than looking for information that doesn't exist
  • Update this information as "better" information becomes available
  ALWAYS DOCUMENT DATA SOURCES & ASSUMPTIONS

  28. Using Ranges. USE RANGES TO INCREASE CONFIDENCE IN COST ESTIMATES!
  EXAMPLE – Projected Range of Training Costs
  Inputs                             Low    Med    High
  # of Employees to be Trained/year  100    150    200
  Annual Cost per Employee Trained   $1000  $1200  $1500
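
The projected range follows directly from the inputs: multiply the low, medium, and high cases through, as in this sketch:

```python
# Low / medium / high inputs from the example above.
employees_trained = (100, 150, 200)     # employees trained per year
cost_per_employee = (1000, 1200, 1500)  # dollars per employee trained

projected = tuple(n * c for n, c in zip(employees_trained, cost_per_employee))
print(projected)  # (100000, 180000, 300000) -> projected range of training costs
```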

  29. Uncertainty and Sensitivity Analysis. Conduct uncertainty and sensitivity analyses on both cost & value estimates (see the sketch below).
  Uncertainty Analysis:
  • Based on considerations of requirement, cost-estimating, and technical uncertainty
  • Increases confidence in the estimate; doesn't increase the precision of the estimate
  • Tool: Monte Carlo simulation
  • Output: "most likely" or "expected" cost & value
  Sensitivity Analysis:
  • Based on the output of the Monte Carlo simulation
  • Sensitive variables have a significant impact on the overall estimate
  • Output: identification of which variables have a significant impact on the overall estimate; can be used to determine which variables merit additional research
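
A minimal sketch of the Monte Carlo idea, assuming triangular distributions over the low/most-likely/high training-cost inputs from the previous slide (VMM prescribes the technique, not this particular code):

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Triangular draws over the low / most-likely / high inputs.
employees = rng.triangular(100, 150, 200, N)    # employees trained per year
unit_cost = rng.triangular(1000, 1200, 1500, N) # dollars per employee
total = employees * unit_cost

print(f"Expected cost: ${total.mean():,.0f}")
print(f"80% interval: ${np.percentile(total, 10):,.0f} - ${np.percentile(total, 90):,.0f}")

# Crude sensitivity check: which input co-varies more strongly with the total?
for name, x in [("employees", employees), ("unit cost", unit_cost)]:
    print(name, round(np.corrcoef(x, total)[0, 1], 2))
```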

  30. Analyzing Cost Risk and Value Risk
  • The impact of a single risk factor may differ in magnitude at each point where it interacts with cost and value
  • The probability of a specific risk occurring remains constant throughout the analysis of a specific alternative, regardless of where it impacts the value or cost of that alternative
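
A sketch of that rule with hypothetical numbers: a single risk keeps one probability of occurrence but carries different impact magnitudes where it touches cost and where it touches value. The probability-times-impact adjustment below is one common way to express this, not VMM's authoritative formula:

```python
# Hypothetical risk: "key integration partner slips schedule."
p = 0.20  # one probability, held constant across the whole alternative

# The same risk has different impact magnitudes at each point it touches:
cost_impact  = 2_000_000  # added cost if it occurs
value_impact = 12.0       # value-score points lost if it occurs

expected_cost  = 18_000_000  # hypothetical expected cost
expected_value = 76.0        # hypothetical expected value score

risk_adjusted_cost  = expected_cost + p * cost_impact    # $18.4M
risk_adjusted_value = expected_value - p * value_impact  # 73.6
print(risk_adjusted_cost, risk_adjusted_value)
```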

  31. Pulling Together the Information. You should be able to answer the following questions…
  • What is the estimated cost of each alternative?
  • What is the financial return on investment associated with the alternatives?
  • What is the value score associated with the alternatives?
  • What are the cost and value risks associated with each alternative? What effect do they have? (value and cost risk scores)
  • How do the value, risk, and cost of the alternatives compare?
  • Do the cost risk and value risk associated with the alternatives fall within the range represented by the relevant risk tolerance boundaries?

  32. Comparing Value to Cost: Investment Cost to Value (Expected & Risk-Adjusted)
  [Chart: value score (0-100) plotted against cost ($0-$40M), showing expected and risk-adjusted points for Alternatives 1, 2, and 3.]
  Based on this information, which alternative would you choose?

  33. Comparing Value to Value Risk, and Cost to Cost Risk
  [Charts: value risk plotted against value score, and cost risk plotted against cost, each with its risk tolerance boundary separating acceptable from unacceptable areas; Alternatives 1-3 are plotted in both.]
  The risks associated with all of the value scores fall within the acceptable area; Alt. 2 bears the lowest value risk. The only alternative that falls squarely within the Cost Risk Boundary is Alt. 2.

  34. The VMM Guide

  35. The VMM How-To Guide provides best-practice analysis techniques, real examples, and required resources

  36. VMM Step 1: Develop a Decision Framework
  • Task 1: Identify & define the value structure (define user needs & priorities; quantifiable measures of performance with metrics and targets; foundation for analysis & on-going performance measurement)
  • Task 2: Identify & define the risk structure (early consideration of risk; risk inventory; risk tolerance boundary)
  • Task 3: Identify & define the cost structure
  • Task 4: Begin documentation

  37. VMM Step 2: Alternatives Analysis (estimate value, cost, & risk)
  • Task 1: Identify & define alternatives (viable alternatives; a base case answering "what will happen if nothing changes?"; match levels of information to the phases of development)
  • Task 2: Estimate value & cost (low/expected/high ranges for each value measure — metric, target, scale, priority — and each cost element, refined through uncertainty and sensitivity analysis)
  • Task 3: Conduct risk analysis (risk inventory; risk tolerance boundary)
  • Task 4: On-going documentation

  38. VMM Step 3: Pull Together the Information
  • Task 1: Aggregate the cost estimate (expected cost)
  • Task 2: Calculate the return-on-investment (expected ROI from government cost savings/avoidance)
  • Task 3: Calculate the value score (expected value score)
  • Task 4: Calculate the risk scores (value and cost risk scores; risk-adjusted expected value, cost, and ROI)
  • Task 5: Compare value, risk, & cost (see the sketch below)
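
A sketch of the Step 3 arithmetic with hypothetical figures; the How-To Guide defines the authoritative formulas, and the multiplicative risk adjustment here is an illustrative simplification:

```python
# Hypothetical inputs for one alternative.
expected_cost    = 20_000_000  # Task 1: aggregated from the cost element structure
expected_savings = 27_000_000  # government cost savings / avoidance
expected_value   = 76.0        # Task 3: 0-100 value score
cost_risk_score  = 0.10        # Task 4: expected slippage fractions from risk analysis
value_risk_score = 0.07

# Task 2: return on investment.
expected_roi = (expected_savings - expected_cost) / expected_cost  # 0.35

# Task 4: risk-adjusted figures.
risk_adjusted_cost  = expected_cost * (1 + cost_risk_score)    # $22.0M
risk_adjusted_value = expected_value * (1 - value_risk_score)  # 70.68
risk_adjusted_roi   = (expected_savings - risk_adjusted_cost) / risk_adjusted_cost

print(f"Expected ROI: {expected_roi:.0%}, risk-adjusted ROI: {risk_adjusted_roi:.0%}")
```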

  39. VMM Step 4: Communicate and Document
  • Task 1: Communicate value to customers and stakeholders
  • Task 2: Prepare budget justification documents
  • Task 3: Satisfy ad hoc reporting requirements
  • Task 4: Use lessons learned to improve processes
  Outputs support reporting, consensus building, investment planning, and management planning.

  40. Q & A

  41. VMM establishes an even scale for quantifying and analyzing value, risk, and cost
  • Measures tangible and intangible benefits
  • Accounts for risk in cost and value calculations
  • Increases reliability of ROI through simulation
  • Tested and proven in multiple E-Gov projects
  • Flexible and adaptable
  • Results- and outcome-driven
  • Allows examination of the relationships among value, cost, and risk
  • Feasible for portfolio management

  42. References
  Building a Methodology for Measuring the Value of e-Services:
  • http://www.estrategy.gov/documents/measuring_finalreport.pdf
  VMM How-To-Guide and VMM Highlights (http://www.cio.gov/ best practices page):
  • http://www.cio.gov/documents/ValueMeasuring_Methodology_HowToGuide_Oct_2002.pdf
  • http://www.cio.gov/documents/ValueMeasuring_Highlights_Oct_2002.pdf
