
ASW METOC Metrics: Metrics Overview, Goals, and Tasks



Presentation Transcript


  1. ASW METOC Metrics: Metrics Overview, Goals, and Tasks
  Tom Murphree, Naval Postgraduate School (NPS), murphree@nps.edu
  Bruce Ford, Clear Science, Inc. (CSI), bruce@clearscienceinc.com
  Paul Vodola, Matt McNamara, and Luke Piepkorn, Systems Planning and Analysis (SPA), pvodola@spa.com
  CAPT(s) Mike Angove, OPNAV N84, michael.angove@navy.mil
  Brief for ASW METOC Metrics Symposium Two, 02-04 May 2007

  2. ASW METOC Metrics: Key Concepts
  • Definition
    • Measures of the performance and operational impacts of the CNMOC products provided to ASW decision makers.
  • Uses
    • Improve product generation and delivery processes, product quality, assessments of uncertainty and confidence in products, product usage, and outcomes of ASW operations.
    • Allow CNMOC to more effectively participate in ASW fleet synthetic training, reconstruction and analysis, campaign analysis, and other modeling, simulation, and assessment programs.
    • Evaluate new products, including new performance layer and decision layer products.

  3. ASW METOC Metrics: Key Concepts
  • Metrics System
    • Capable of collecting and analyzing data on actual products, verifying observations, decisions made by users of the products, and outcomes of user operations.
    • Minimal manpower impacts through extensive use of automation.
    • Metrics delivered in near real time and in formats that allow CNMOC managers to effectively use the metrics in their decision making.
    • Includes an operations analysis modeling capability to simulate the operational impacts of products. Model metrics complement those derived from data. Modeling simulates events that are difficult to represent with actual data (e.g., rare or difficult to observe events), and can allow experimentation with different scenarios (e.g., different levels of product accuracy, different CONOPS for METOC support).

  4. METOC Metrics and Battlespace on Demand Tiers
  [Diagram: Tier 1 – Environment Layer, Tier 2 – Performance Layer, Tier 3 – Decision Layer, supported by fleet data, satellites, and initial and boundary conditions]
  METOC metrics address all three tiers, in particular the performance and operational impacts of each of the three layers.

  5. Key Questions
  • In what ways does the METOC community impact the ASW fight?
  • How can we measure those impacts?
  • How can we use those measures to improve our impacts?

  6. Other Questions Metrics Can Help Answer
  • Where are the gaps in METOC support?
  • Are we good enough?
  • Is a product really worth generating?
  • Is there a more efficient way to produce our products?
  • What difference do we make to our customers?
  • How could we improve our impacts on customers?
  • How much confidence should we and our customers have in our products?

  7. High Priority Metrics for ASW Directorate
  • Operational impact metrics: main metric
  • Product performance metrics: stepping stone to main metric

  8. Process for Developing METOC Metrics
  [Diagram: METOC forecasts (or other products) are compared with METOC observations to produce METOC performance metrics; operational plans are compared with operational outcomes to produce operational performance metrics; together these yield metrics of METOC impacts on operational performance]
  Apply this process to both real world data and output from military mission models.
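
  As an illustration of this process (not part of the original brief), the sketch below pairs forecasts with verifying observations and plans with outcomes; the record field names and the 10 m tolerance are assumptions for illustration, not an established CNMOC schema:

  # Minimal sketch of the slide 8 metrics process (illustrative only).
  # Field names (predicted_sld_m, observed_sld_m, planned_detections,
  # actual_detections) are assumptions, not an official schema.

  def metoc_performance(records):
      """METOC performance metric: mean absolute SLD forecast error (m)."""
      errors = [abs(r["predicted_sld_m"] - r["observed_sld_m"]) for r in records]
      return sum(errors) / len(errors)

  def operational_performance(records):
      """Operational performance metric: fraction of planned detections achieved."""
      planned = sum(r["planned_detections"] for r in records)
      actual = sum(r["actual_detections"] for r in records)
      return actual / planned if planned else float("nan")

  def impact_metric(records, tolerance_m=10.0):
      """Impact metric: operational performance when the forecast was within
      an assumed 10 m tolerance vs. when it was not."""
      good = [r for r in records if abs(r["predicted_sld_m"] - r["observed_sld_m"]) <= tolerance_m]
      bad = [r for r in records if abs(r["predicted_sld_m"] - r["observed_sld_m"]) > tolerance_m]
      return operational_performance(good), operational_performance(bad)

  The same functions could be applied to real world records or to records generated by a mission model, as the slide notes.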

  9. Operational Modeling – Overall Intent
  • Operational model of ASW scenarios: a "laboratory" for experiments
  • Similar to the strike / NSW application of WIAT
  • Investigate the effect of METOC products and services across a wide range of ASW scenarios
    • Effect of increased accuracy
    • Effect of enhanced timeliness
  • Develop METOC support benchmarks to evaluate real world performance and establish goals
  [Diagram: a METOC product × scenario matrix; multiple combinations of METOC products and services support ASW]

  10. Operational Modeling – Development
  • Early stages in model development
    • Identify the scope of the process to be modeled
    • Identify the METOC data flow / mission planning for the desired scope
    • Identify the end user of the model and the desired outputs
  • Later stages in model development
    • Develop a simulation of the METOC data flow and mission planning
    • Incorporate real-world metrics as available to improve model fidelity and accuracy
    • Develop output to describe the impact of improved METOC accuracy and/or timeliness
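
  One way the later-stage simulation could be sketched (a notional stand-in, not the actual model under development) is a Monte Carlo loop that varies METOC accuracy and timeliness across many trials and records a simple mission-outcome measure; the outcome formula and parameter values below are invented for illustration:

  import random

  # Notional Monte Carlo experiment: vary METOC accuracy and timeliness and
  # observe their effect on a simple stand-in for mission success.
  # The outcome model below is a placeholder, not a validated ASW model.

  def run_mission(sld_error_m, delivery_lag_hr):
      """Toy outcome: whether the search plan exploits the layer correctly."""
      p = 0.9 - 0.02 * sld_error_m - 0.05 * delivery_lag_hr
      return random.random() < max(0.05, min(0.95, p))

  def experiment(accuracy_m, timeliness_hr, n_trials=10_000):
      wins = 0
      for _ in range(n_trials):
          sld_error = abs(random.gauss(0.0, accuracy_m))   # forecast error draw
          lag = random.uniform(0.0, timeliness_hr)         # product delivery lag
          wins += run_mission(sld_error, lag)
      return wins / n_trials

  for accuracy in (5.0, 10.0, 20.0):      # std. dev. of SLD error (m)
      for timeliness in (3.0, 12.0):      # max delivery lag (hr)
          print(accuracy, timeliness, experiment(accuracy, timeliness))

  Sweeping the accuracy and timeliness settings in this way is what produces the payoff surfaces described on the notional output slides that follow.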

  11. METOC Support Chain for ASW
  [Diagram of the support chain: METOC data, NAVO / ASW RBC support, NOAT / NOAD supporting operations, theater planning operations, the ASW mission planning process, ASW operations, ASW results, and operational impacts]
  As more components of the support chain are assessed/modeled, the scope of the project becomes greater.

  12. METOC Support Chain for ASW – Small Scale Model
  [Same support chain diagram as slide 11, with the smallest scope highlighted]
  Smallest scope that could be modeled and still produce useful results.

  13. METOC Support Chain for ASW – Medium Scale Model
  [Same support chain diagram, with a medium scope highlighted]
  Model starting from the mission planning process through operational impacts (e.g., HVU losses; expected threats killed).
  Similar scope and level of effort (LOE) as WIAT for STW.

  14. METOC Support Chain for ASW – Large Scale Model
  [Same support chain diagram, with the largest scope highlighted]
  Extension of the mission planning process backwards, to capture the RBC / NOAT influence on operational impacts.

  15. Operational Modeling – Notional Modeling Output
  • For each combination of METOC support product and scenario, display the payoffs from increasing accuracy and/or timeliness to:
    • Determine the level of METOC support that meets ASW requirements
    • Enable decision makers to balance the cost vs. benefit of product improvement
  [Notional output: a METOC product × scenario matrix of resulting METOC metrics, with each cell rated as "UNSAT: no impact above historical," "no ASW payoff from additional accuracy / timeliness," or "ASW performance is improved"]

  16. Operational Modeling – Notional Metrics Assessment (Example: SLD Prediction Accuracy)
  [Notional scatter plot of predicted vs. actual sonic layer depth (SLD), with a perfect-prediction line. Actual data come from the metrics collection processes instituted for VS07; predicted data come from the METOC products distributed by the RBC.]

  17. Operational Modeling – Notional Metrics Assessment (Example: SLD Prediction Accuracy, continued)
  [Same predicted vs. actual SLD plot, with performance thresholds generated by modeling and simulation: acceptable tolerance lines (no significant impact to operations), a band with some impact to operations (a risk management decision), and a region of unacceptable performance]
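
  A minimal sketch (an illustration, not part of the brief) of how predicted/actual SLD pairs could be scored against such tolerance bands; the 10 m and 25 m thresholds are placeholder values standing in for the thresholds that modeling and simulation would actually generate:

  # Classify each predicted/actual SLD pair against notional tolerance bands.
  # Thresholds (10 m acceptable, 25 m risk-management) are illustrative only;
  # slide 17 says the real values come from modeling and simulation.

  ACCEPTABLE_M = 10.0
  RISK_MANAGEMENT_M = 25.0

  def classify(predicted_m, actual_m):
      error = abs(predicted_m - actual_m)
      if error <= ACCEPTABLE_M:
          return "acceptable (no significant impact to operations)"
      if error <= RISK_MANAGEMENT_M:
          return "some impact (risk management decision)"
      return "unacceptable performance"

  pairs = [(55.0, 60.0), (40.0, 70.0), (80.0, 78.0)]   # (predicted, actual) in meters
  for predicted, actual in pairs:
      print(predicted, actual, classify(predicted, actual))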

  18. Capstone Metrics: Errors in Environmental Depiction … Lead to Errors in Tactical Decisions
  [Chart comparing runs with "glider" data included vs. GDEM data only: a 10-20X higher likelihood of missing the surface layer without in-situ sensing]
  Question: What level of environmental sensing is needed to "sufficiently" enable decisions?
  Slide provided by CDR Mike Angove, N84

  19. Capstone Metrics: ROI for Environmental Knowledge Study – Notional Output Curve
  [Notional curve of decision certainty* vs. oceanographic sensing resolution, with benchmarks: under-invested (50%, 65%), properly invested (80%), super-invested / reference case (100%), and a notional target capability range]
  Deliverable: Analysis will fit the POM-08 LBSF&I / PR-09 altimeter investment to this curve.
  * e.g., CZ or surface layer presence
  Slide provided by CDR Mike Angove, N84

  20. Metrics Steps
  1. Determine what we want to know and be able to do once we have a fully functioning metrics system.
  2. Determine what metrics we need in order to know and do these things.
  3. Determine what calculations need to be done in order to come up with the desired metrics.
  4. Determine what data need to be collected in order to do the desired calculations (i.e., data analyses).
  5. Determine the process to use to collect and analyze the needed data.
  6. Implement the data collection and analysis process.
     a. If data can be collected, go to step 7.
     b. If data can't be collected, repeat steps 1-5 until you can.
  7. Use the metrics obtained from steps 1-6.
  8. Assess the results of steps 1-7.
  9. Make adjustments to steps 1-8.
  10. Repeat steps 1-9 until satisfied with the process and the outcomes from the process.

  21. Metrics Steps and Symposium Two Tasks
  [Repeats the ten metrics steps from slide 20]
  Steps completed in approximate form in Symposium One and in committee reports.

  22. Metrics Steps and Symposium Two Tasks
  [Repeats the ten metrics steps from slide 20]
  Task 1 for Symposium Two: Review and revise results for these steps.

  23. Metrics Steps and Symposium Two Tasks
  [Repeats the ten metrics steps from slide 20]
  Task 2 for Symposium Two: Outline a plan for steps 1-6 for VS07.

  24. Metrics Steps and Symposium Two Tasks
  [Repeats the ten metrics steps from slide 20]
  Task 3 for Symposium Two: Outline a plan for steps 1-10 for the next several years.

  25. Conceptual Helpers
  • Fence and Gates Analogy
  • Hierarchy of Metrics
  • Bricks and House Analogy

  26. Fence and Gates Analogy: Overall Concept
  [Diagram: immediate goals inside the fence, with gates leading to future capabilities at priorities 1, 2, and 3]

  27. Fence and Gates Analogy: An Example
  [Diagram: immediate goals (VS07 capability and methods experiment, RBC data collection system, NOAT data collection system) with gates to future capabilities: real-time metrics display (pri 1), MPRA data collection system (pri 2), exercise level data collection (pri 3)]

  28. Hierarchy of Metrics
  • Metrics are most useful when they provide information to multiple levels of the organization
    • Individual forecaster
    • SGOT / OA Chief / Officer
    • METOC activity CO/XO
    • Directorate
    • CNMOC
  • Fact-based metrics are best developed from data from the lowest levels of the organization
    • Critical to collect data on the smallest "unit" of support (e.g., forecast, recommendation)
    • Higher level metrics (directorate, CNMOC) rely on lower level data collection/metrics
    • Operational modeling is enhanced by quality real world information (e.g., significant numbers of mission data records)
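
  As a minimal sketch of this bottom-up idea (illustrative only), individual forecast-level records, the smallest "unit" of support, can be rolled up to NOAT-level and exercise-level metrics; the field names and grouping keys below are assumptions:

  from collections import defaultdict

  # Bottom-up aggregation sketch: roll individual forecast errors up to
  # NOAT-level and exercise-level metrics. Field names are illustrative.

  records = [
      {"noat": "NOAT-A", "exercise": "VS07", "sld_error_m": 8.0},
      {"noat": "NOAT-A", "exercise": "VS07", "sld_error_m": 15.0},
      {"noat": "NOAT-B", "exercise": "VS07", "sld_error_m": 4.0},
  ]

  def mean_error(recs):
      return sum(r["sld_error_m"] for r in recs) / len(recs)

  by_noat = defaultdict(list)
  for r in records:
      by_noat[r["noat"]].append(r)

  noat_metrics = {noat: mean_error(recs) for noat, recs in by_noat.items()}   # lower level
  exercise_metric = mean_error(records)                                       # higher level
  print(noat_metrics, exercise_metric)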

  29. Hierarchy of Metrics
  [Diagram: the symposium focus within a space of metrics spanning lower level (a NOAT's SLD accuracy) to higher level (Navy-wide SLD accuracy), smaller spatial and/or temporal scales (a point forecast location) to larger scales (an exercise forecast location), and performance metrics (temperature and salinity accuracy) to impact metrics (number of positively identified submarines)]

  30. Hierarchy of Metrics
  [Same diagram, with metric levels ranging from individual forecast metrics, NOAT metrics, and exercise metrics up through NOAC metrics, directorate metrics, and CNMOC/Fleet metrics]
  Bottom-up approach to developing higher level, larger scale metrics. Bottom metrics support development of fact-based top metrics.

  31. Hierarchy of Metrics
  [Same diagram as slide 30]
  Lower-level metrics can be input into operational models, which can provide higher-level metrics (e.g., a mission model).

  32. Hierarchy of Metrics
  [Same diagram as slide 30]
  Top-down approach: higher level, larger scale metrics can also provide useful feedback for improving lower level, smaller scale metrics.

  33. Bricks and House Analogy
  [Diagram: a house labeled "Impact on METOC Customers (higher-level metrics)" built from bricks]
  • Each "brick":
    • Represents a different warfare support area or a subset of an area (e.g., MPRA, NOAT, RBC)
    • It takes many records to make good high-level metrics
    • Each record must be well constructed to make quality high-level metrics
  • Support Unit Record:
    • Forecast data
    • Verification data
    • Customer plans
    • Customer outcomes
    • Recommendations
    • Other data
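
  One way to picture a well-constructed brick is as a structured record. The sketch below is a hypothetical layout of a Support Unit Record based on the fields listed above, not an existing CNMOC data format:

  from dataclasses import dataclass, field
  from typing import Optional

  # Hypothetical Support Unit Record layout based on the fields listed on
  # slide 33. Names and types are assumptions, not an official schema.

  @dataclass
  class SupportUnitRecord:
      support_area: str                        # e.g., "MPRA", "NOAT", "RBC"
      forecast_data: dict                      # product issued to the customer
      verification_data: dict                  # verifying observations/analyses
      customer_plans: Optional[str] = None     # what the customer intended to do
      customer_outcomes: Optional[str] = None  # what actually happened
      recommendations: Optional[str] = None    # METOC recommendation given
      other_data: dict = field(default_factory=dict)

  record = SupportUnitRecord(
      support_area="NOAT",
      forecast_data={"sld_m": 55.0, "valid_time": "2007-05-02T12:00Z"},
      verification_data={"sld_m": 62.0},
      recommendations="Search below the layer",
  )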

  34. Where Does the Effort Belong: Early Stages
  [Chart: level of effort allocated between data collection (capacity, capability, techniques, data entry, archival systems) and data analysis (capacity, capability, algorithms, display, archival, modeling) in the early stages]

  35. Where Does the Effort Belong: Later Stages
  [Same chart: level of effort allocated between data collection and data analysis in the later stages]
