
Background: Top-level ASW Metrics Overview

Presentation Transcript


  1. Background: Top-level ASW Metrics Overview
  • Must comply with global ASW CONOPS
  • Published by Naval Mine and Anti-Submarine Warfare Command (NMAWC) at http://fltaswcom.navy.smil.mil/globalasw.htm
  • Study the 10 points. Consider where the METOC community has an impact.
  • Primary questions
    • How does the METOC community have an impact on the ASW fight?
    • How will we measure that impact?

  2. Background: 3-D Metrics Space / Spectrum
  [Diagram: a three-axis metrics space spanning Lower Level to Higher Level, Smaller to Larger Temporal/Spatial Scale, and Performance through Proxy Impacts to Impacts.]

  3. Background: 4-D Metrics Continuum (With Examples)
  [Diagram: the metrics continuum spans four dimensions, each illustrated with an example.]
  • Level: Lower Level (MOATs SLD accuracy) to Higher Level (Navy-wide SLD accuracy)
  • Spatial scale: Smaller Spatial Scale (point forecast location) to Larger Spatial Scale (whole-exercise box)
  • Metric type: Performance (temperature and salinity accuracy) to Proxy Impacts (SLD accuracy) to Impacts (number of positively identified submarines)
  • Time: Short timeframes (e.g., single forecast) to Long timeframes (e.g., days/months/years/exercise)
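Seen this way, any metric the community collects can be located on the continuum by tagging it along the four dimensions. The short Python sketch below is purely illustrative (the function and field names are assumptions, not an existing schema); it records such a tag using the slide's SLD accuracy example.

```python
# A minimal sketch, assuming each collected metric is tagged with its
# position along the four continuum dimensions. Values echo the slide's examples.
def tag_metric(name, metric_type, level, spatial_scale, timeframe):
    """Return a plain-dict tag locating one metric on the 4-D continuum."""
    return {
        "name": name,
        "type": metric_type,             # performance / proxy impact / impact
        "level": level,                  # lower (e.g., MOATs) ... higher (e.g., Navy-wide)
        "spatial_scale": spatial_scale,  # point forecast location ... whole-exercise box
        "timeframe": timeframe,          # single forecast ... days/months/years/exercise
    }

# SLD accuracy scored Navy-wide, over the whole exercise box, for a whole exercise
sld_tag = tag_metric(
    name="SLD accuracy",
    metric_type="proxy impact",
    level="Navy-wide",
    spatial_scale="whole-exercise box",
    timeframe="exercise",
)
```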

  4. Background: Types of ASW Metrics
  • Performance metrics – likely consist of measurable/verifiable physical quantities (e.g., how well did we forecast 20 m temperature vs. observed 20 m temperature?)
  • Proxy metrics – a verifiable quantity (possibly derived) that the end user defines as closely tied to user performance (e.g., SLD accuracy)
  • Impact metrics (a.k.a. campaign metrics) – a verifiable quantity of the end user's overall warfighting effectiveness (e.g., number of submarines destroyed, or percentage of missions accomplished)
  • Other types of metrics
    • Quality metrics – measure of the overall quality of a product or service (e.g., TOA grade); could be a combination of performance metrics
    • Capacity/readiness metrics – e.g., number of SMEs available
    • Efficiency metrics – e.g., number of SMEs required to publish a TOA
  (A minimal computational sketch of the performance and proxy examples follows this slide.)
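To make the first two categories concrete: a performance metric such as 20 m temperature accuracy reduces to comparing forecast against observed values, and a proxy metric such as SLD accuracy to a depth error. The Python sketch below is a minimal, hypothetical scoring routine, assuming paired forecast/observation samples are already in hand; the function names are illustrative and do not come from any existing METOC system.

```python
import math

def temperature_accuracy(forecast_c, observed_c):
    """Performance metric sketch: bias and RMSE of forecast vs. observed
    20 m temperature (degrees C), given paired samples."""
    errors = [f - o for f, o in zip(forecast_c, observed_c)]
    bias = sum(errors) / len(errors)
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    return bias, rmse

def sld_depth_error(forecast_sld_m, observed_sld_m):
    """Proxy metric sketch: error in forecast sonic layer depth (meters)."""
    return forecast_sld_m - observed_sld_m

# Hypothetical verification pairs (forecast, observed) at 20 m
fcst = [14.2, 13.8, 15.1]
obs = [13.9, 14.1, 15.4]
print(temperature_accuracy(fcst, obs))  # (bias, RMSE) in degrees C
print(sld_depth_error(60.0, 75.0))      # negative = forecast layer too shallow
```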

  5. Action: Teams List
  Four separate exploratory teams were formed with differing focus areas. Each team will receive tasking in its focus area. The overall team will guide the focus teams and direct the project.
  • Overall
    • LEAD: Dr. Tom Murphree
    • NMAWC – CDR Ash Evans
    • N84 – CDR Mike Angove
    • NRL – Pat Hogan, Josie Fabre, Greg Jacobs
    • ONR – CDR Marble
    • SPA – Paul Vodola, Matt McNamara, Luke Piepkorn
    • PDC – Merril Stevens
    • PMW 180 – Marcus Speckhahn
    • ASW DOO/DDOO
    • CNMOC – Steve Lingsch
    • Mr. Ed Gough
  • Focus Teams
    • MPRA
      • LEAD: Clear Science – Bruce Ford
      • MPRA NOAD OICs – LCDR (Sel) Danny Garcia, CONUS OIC TBD
      • CPRG – CDR Sopko
      • NRL – Pat Hogan
      • APL-UW – Mr. Bob Miyamoto
      • FNMOC – LTJG Dave Watson
      • PDD South – Doug Lipscombe
      • SPA
    • RBC
      • NRL – Jay Shriver, Jim Dykes, Josie Fabre
      • NAVO – TBD
      • FNMOC – LTJG Dave Watson
      • LT Heather Hornick
      • Clear Science – Bruce Ford
      • SPA
    • NOAT
      • LCDR Joel Feldmeier
      • LT Tim Campo
      • Clear Science – Bruce Ford
      • NRL – Jim Dykes, Josie Fabre
      • SPA

  6. Action: Focus Team Tasks
  • Determine who the end user of METOC products produced by this focus area is (e.g., warfighter or other METOC personnel).
  • Document the METOC support process (diagram if necessary) in terms of the end user's mission timeline (planning/execution/debrief). Include:
    • Unclassified examples of all products issued
    • How METOC support is incorporated
    • Whether TTPs are used for METOC support, and if so, how
  • Determine what metrics (see previous definitions) should be collected (1st guess). Include:
    • Performance metrics
    • Proxy metrics
      • Must be closely correlated with customer performance
      • End user must agree that proxy metric values directly impact their ability to complete their mission successfully
      • Examples: accuracy of SLD, sea surface height, MDR, buoy depth setting recco
    • Impact metrics
  • Determine what data are required to calculate each metric and recommend methods for their collection. Determine data (a sketch of one way to record these attributes follows this slide):
    • Source (METOC products, mission debriefs)
    • Availability (hourly, 12-hourly, weekly)
    • Verifiability (verified with observations, reconstructions)
    • Collection points/methods (within the end-user mission timeline)
    • Recommended metrics (performance, proxy, or impact) collection techniques
  • Map recommended metrics to the 4-D Metrics Continuum
  • Estimate the additional man-hours required to collect metrics in this focus area
  • Identify existing metrics within the end user's realm that are already being calculated.
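One lightweight way for a focus team to capture the source/availability/verifiability/collection-point attributes for each candidate metric is as a small structured record. The Python sketch below is a hypothetical format only; the class, field names, and example values are assumptions illustrating the tasking above, not an existing data collection schema.

```python
from dataclasses import dataclass

@dataclass
class MetricDataRequirement:
    """Hypothetical record of what is needed to collect one candidate metric."""
    metric: str            # e.g., "SLD accuracy"
    metric_type: str       # "performance", "proxy", or "impact"
    source: str            # e.g., "METOC products", "mission debriefs"
    availability: str      # e.g., "hourly", "12-hourly", "weekly"
    verifiability: str     # e.g., "verified with observations", "reconstructions"
    collection_point: str  # where in the end-user mission timeline it is collected
    est_man_hours: float   # rough estimate of additional collection effort

# Illustrative entry for a proxy metric
sld_requirement = MetricDataRequirement(
    metric="SLD accuracy",
    metric_type="proxy",
    source="METOC products vs. observations",
    availability="12-hourly",
    verifiability="verified with observations",
    collection_point="post-mission debrief",
    est_man_hours=2.0,
)
```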

  7. Action: POA&M Draft
  • Focus Team tasking released – 17 Jan
  • Focus Team tasking completion – 21 Feb – findings due to LT Parker COB
  • ASW Metrics Meeting and Funding Proposal – 7 to 9 March
    • Three levels of funding requests: bare bones; adequate to complete project; completely funded
  • Strategic Plan – ongoing
  • RTP – ??
  • List of potential resources – thesis students, ONR?
  • Draft plan for Valiant Shield metrics support – TBD
  • NMAWC R&A Visit – 22–26 Jan
    • Provide input for METOC data collection for DCM
  • SWDG R&A Visit – TBD
  • DEVRON 12 R&A Visit – TBD
  • CPRG Visit – TBD
  • ARL-UT Visit – TBD
  • IUSS Visit – TBD
  • ONI Visit – TBD
  • Form a list of already-completed SPA, CNA, and Johns Hopkins projects that we could learn from – TBD
  • Investigate ASW Cross Functional Team findings for potential metrics information – TBD

  8. Slides that could be deleted
  • The information in the following two slides either:
    • Didn't seem appropriate to the overall document, OR
    • Was incorporated into another slide

  9. Baseline Current Models
  • Baseline current operations and simulations
  • Can we model an actual operation or exercise accurately?
  • Validation data set to baseline the models
    • NMAWC R&A data
    • SWDG R&A data
  • Differences between campaign-modeled and actual outcomes (see the sketch after this slide)
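For the last bullet, comparing campaign-model output against reconstructed exercise results could be as simple as differencing the two outcome sets once they are expressed in the same terms. The sketch below is a hypothetical illustration; the outcome names ("detections", "classifications") are placeholders, not fields from NMAWC or SWDG R&A data.

```python
def campaign_outcome_gap(modeled, actual):
    """Sketch: per-outcome difference between campaign-model results and
    reconstructed exercise results, for outcomes present in both."""
    return {key: modeled[key] - actual[key] for key in actual if key in modeled}

# Hypothetical modeled vs. reconstructed exercise outcomes
modeled = {"detections": 12, "classifications": 8}
actual = {"detections": 9, "classifications": 7}
print(campaign_outcome_gap(modeled, actual))  # {'detections': 3, 'classifications': 1}
```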

  10. ASW Process Diagrams
  • NOAT Support – Strike Force Support
  • RBC Support Metrics
    • Fronts, eddies, SLD, below-layer gradient accuracy?
  • MPRA Level
    • Include P-3 expert… Dr. Miyamoto, CDR Sopko?
    • Pre-brief vs. post-brief
      • Need details of pre-brief data
      • What products are we producing?
      • Can we work with data that's already collected?
    • Proxy metrics – buoy depth setting, MDR, buoy drop pattern, were recommendations utilized?
