ASW METOC Metrics: Today’s Agenda


Presentation Transcript


  1. ASW METOC Metrics: Today’s Agenda

  2. ASW METOC Metrics Project Overview Tom Murphree, David Meyer Naval Postgraduate School (NPS) murphree@nps.edu Bruce Ford, Manuel Avila Clear Science, Inc. (CSI) bruce@clearscienceinc.com Paul Vodola, Matt McNamara, Luke Piepkorn, and Ed Weitzner Systems Planning and Analysis (SPA) pvodola@spa.com Bob Miyamoto Applied Physics Laboratory – University of Washington (APL-UW) rtm@apl.washington.edu Presented to ASW METOC Metrics Symposium III CNMOC, Stennis Space Center, MS 31 January 2008 Ford, B., and T. Murphree, ASW Metrics '08, Jan 08, bruce@clearscienceinc.com, murphree@nps.edu

  3. ASW METOC Metrics Outline of This Brief Part 1: Project Overview Part 2: Detailed Project Description Part 3: Project Personnel and References Part 4: Back-Up Slides ASW METOC Metrics Overview, Jan 08, murphree@nps.edu

  4. ASW METOC Metrics Part 1: Project Overview ASW METOC Metrics Overview, Jan 08, murphree@nps.edu

  5. ASW METOC Metrics: Key Concepts • The Purpose of Metrics • Numerous Navy leaders* have emphasized the importance to Navy transformation of metrics, that is, objective, quantitative measures of an organization’s performance and impacts. • We propose to develop and transition to operational use a system for generating such metrics for METOC support for ASW. The output from this system will greatly enhance our ability to determine how well we support ASW operations, and how we can do better. • Two of the main goals of the METOC community are: • Increase our understanding of, and ability to predict, the environment • Use our knowledge of the environment to increase warfighter effectiveness • The purpose of this project is to develop tools that help us: • Monitor our progress toward both of these goals • Identify specific methods for improving our progress toward these goals * e.g., Assistant SECNAV, ADM Mullen, RADM Tomaszeski, RDML McGee, CAPT Titley ASW METOC Metrics Overview, Jan 08, murphree@nps.edu

  6. ASW METOC Metrics: Key Concepts The Motivation for ASW METOC Metrics “Our leaders don’t want to hear how important it is to describe the environment, or to provide accurate information. Rather, they want to hear how our ‘enabling capabilities’ translate into speed, access, or persistence – how our skills result in advantages in force posture (having the right assets in the right place at the right time for optimal effect), fewer ships sunk, more enemy killed, fewer Blue Force casualties, less time spent in harm’s way, more accurate placement of munitions, etc.” RADM Tomaszeski Oceanographer of the Navy Naval Oceanography Program Status Report 2005 ASW METOC Metrics Overview, Jan 08, murphree@nps.edu

  7. ASW METOC Metrics: Key Concepts The Motivation for ASW METOC Metrics “ASW is a complex issue.  The Navy consistently scores poorly in execution.  We know the environment has huge implications.  The products and services my team delivers must be on mark.  The only way to know this is with measures and analysis.” CAPT Jim Berdeguez NOOC ASW/MIW 09 August 2007 ASW METOC Metrics Overview, Jan 08, murphree@nps.edu

  8. ASW METOC Metrics: Key Concepts • Definition of ASW METOC Metrics • Objective, quantitative measures of the performance and operational impacts of CNMOC products* provided to ASW decision makers. • Operational Value of ASW METOC Metrics • Identify strengths and weaknesses in METOC support* for ASW. • Evaluate new products, including new performance layer and decision layer products. • Improve ASW METOC product generation and delivery processes, product quality, assessments of uncertainty and confidence in products, product usage, and outcomes of ASW operations. • Calculate return on investment (ROI) from METOC R&D, product development, education and research, and other METOC community expenditures. • Make management decisions based on objective data (conduct fact-based management). • Allow CNMOC to more effectively participate in ASW fleet synthetic training, reconstruction and analysis (R&A), campaign analysis, and other modeling, simulation, and assessment programs. • * We define the terms products and support broadly to include all types of METOC products, services, and other types of support for ASW operations (e.g., analyses, forecasts, recommendations, contributions to ASW R&A and M&S, etc.) ASW METOC Metrics Overview, Jan 08, murphree@nps.edu

  9. ASW METOC Metrics: Key Concepts [Diagram: the three-tier Battlespace on Demand (BonD) framework – Tier 1, Environment Layer; Tier 2, Performance Layer; Tier 3, Decision Layer – built on fleet data, satellite observations, and initial and boundary conditions.] ASW METOC metrics address all three tiers, in particular, product performance and operational impacts at each of the three layers. ASW METOC Metrics Overview, Jan 08, murphree@nps.edu

  10. ASW METOC Metrics: Goals and Major Questions • Project Goals • Design, develop, test, and implement objective, quantitative tools to assess our ability to provide effective and efficient METOC support in all three BonD tiers. • Identify gaps in METOC support. • Identify and prioritize improvements needed to fill gaps. • Major Questions* We Will Address • In what ways does the METOC community impact the ASW fight? • How can we measure those impacts? • How can we use those measures to improve our impacts? • * These are the primary questions that the ASW Directorate has determined it needs to be able to answer on a routine and continual basis. ASW METOC Metrics Overview, Jan 08, murphree@nps.edu

  11. ASW METOC Metrics: Additional Questions • Additional Questions We Will Address • How good is METOC support for ASW operations? • In what ways, if any, is METOC support good enough? • What are the gaps in METOC support? • Which ASW METOC products are really worth generating? • Is there a more efficient way to produce these products? • What is the uncertainty in our products? • How much confidence should we and our customers have in our products? • What difference does METOC support make in the planning and execution of ASW operations? • How could we improve the impacts of METOC support on ASW operations? • How do we monitor the performance and impacts of METOC support on warfighter effectiveness in an efficient, routine, and continual manner that facilitates fact-based management of personnel, funds, and other resources? ASW METOC Metrics Overview, Jan 08, murphree@nps.edu

  12. ASW METOC Metrics: Metrics Types • Major types of metrics we will generate: • METOC Performance Metrics: Quantitative measures of METOC products & services (e.g., accuracy of, and uncertainty in, METOC forecasts of sonic layer depth (SLD)). • Customer Performance Metrics: Quantitative measures of customer success (e.g., number of screen penetrations by threat submarines). • Operational Impacts Metrics: Quantitative measures of the impacts of METOC products on customer planning and performance (e.g., correlation of SLD forecast accuracy to screen penetrations). ASW METOC Metrics Overview, Jan 08, murphree@nps.edu
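To make these three metric types concrete, the minimal sketch below computes one example of each from a small set of paired exercise records: a METOC performance metric (SLD forecast bias and RMS error), a customer performance metric (screen penetrations per event), and an operational impacts metric (correlation of SLD forecast error with penetrations). All numbers and array names are illustrative assumptions, not VS07 data.

    # Illustrative sketch of the three metric types; the numbers are made up.
    import numpy as np

    # Paired records for several exercise events: forecast and observed sonic
    # layer depth (ft) and the number of screen penetrations by threat submarines.
    sld_forecast = np.array([150., 200., 120., 180., 220., 160.])
    sld_observed = np.array([210., 240., 150., 230., 280., 200.])
    penetrations = np.array([2, 1, 0, 2, 3, 1])

    # 1. METOC performance metric: accuracy of SLD forecasts.
    sld_error = sld_forecast - sld_observed
    print("SLD mean error (bias), ft:", sld_error.mean())
    print("SLD RMS error, ft:", np.sqrt((sld_error ** 2).mean()))

    # 2. Customer performance metric: screen penetrations per event.
    print("Mean screen penetrations per event:", penetrations.mean())

    # 3. Operational impacts metric: correlation of SLD forecast error
    #    magnitude with screen penetrations.
    r = np.corrcoef(np.abs(sld_error), penetrations)[0, 1]
    print("Correlation of |SLD error| with penetrations:", round(r, 2))

In the operational system, these inputs would come from the data collection nodes described later in this brief rather than from hard-coded arrays.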

  13. ASW METOC Metrics: Steps for Developing a Metrics System • 1. Determine what we want to know and be able to do once we have a fully functioning metrics system. • 2. Determine what metrics we need in order to know and do these things. • 3. Determine what calculations need to be done in order to come up with the desired metrics. • 4. Determine what data needs to be collected in order to do the desired calculations (i.e., data analyses). • 5. Determine the process to use to collect and analyze the needed data. • 6. Implement the data collection and analysis process. a. If data can be collected, go to step 7. b. If data can't be collected, repeat steps 1-5 until it can be. • 7. Use metrics obtained from steps 1-6 to improve processes, products, and operational impacts. • 8. Assess the results of steps 1-7. • 9. Make adjustments to steps 1-8. • 10. Repeat steps 1-9 until satisfied with the process and the outcomes from the process. The steps above describe the process for the real world data component of the proposed ASW METOC metrics program. The steps are the same for the operational analysis and modeling component of the proposed program, except for steps 4-6, in which data collection is replaced by modeling and generation of model output (model proxy data). ASW METOC Metrics Overview, Jan 08, murphree@nps.edu

  14. ASW METOC Metrics: Project Components • Component One: Data Collection, Analysis, and Reporting • Key deliverables: online data collection-analysis-reporting systems and resulting metrics • Primary operational users of deliverables: ASW Directorate, NAVO, ASW RBC, NOATS, NOADs, CNMOC PDC • Component Two: Operational Analysis and Modeling • Key deliverables: model-based mission modules and resulting metrics • Primary operational users of deliverables: ASW Directorate, NAVO, ASW RBC, OPNAV/N84 • Component Three: Collaborations, Evaluations, & Recommendations • Key deliverables: collaborative databases; coordinated analyses of processes, products, and impacts; evaluations of METOC support; recommendations for improving support • Primary operational users of deliverables: ASW Directorate, NAVO, ASW RBC, OPNAV/N84, CNMOC PDC, NMAWC R&A ASW METOC Metrics Overview, Jan 08, murphree@nps.edu

  15. ASW METOC Metrics: Main Deliverable • Our main deliverable will be an ASW METOC Metrics System: • System will be an integrated toolset composed of SIPRNET-based software, website, database, ASW operations analysis tools, data collection and analysis tools, and metrics calculator and display tools, with accompanying assessment reports, documentation, and training materials. • System will be capable of collecting and analyzing data on actual products, verifying observations, decisions made by users of the products, and outcomes of user operations. • System will make extensive use of automation to minimize impacts on manpower. • System will use Navy-accepted operations analysis and modeling tools to simulate operational impacts of products. Model metrics will complement those derived from data. Modeling will simulate events difficult to represent with actual data (e.g., rare or difficult to observe events) and will allow experimentation (e.g., varying levels of product accuracy, varying CONOPS for METOC support). • System will deliver metrics in near real time and in formats that allow METOC managers to efficiently use metrics in decision making. ASW METOC Metrics Overview, Jan 08, murphree@nps.edu

  16. ASW METOC Metrics: Uses and Benefits of Main Deliverable • Who in the broader ASW community will use and/or benefit from the ASW METOC Metrics System? • ASW Customers • NMAWC R&A • ASW Education and Training Organizations • ASW Modeling and Simulation Organizations • ASW Assessment and Resource Allocation Organizations • How will they benefit from using the system? • Increased awareness of ASW METOC products • Increased awareness of the actual and potential impacts of METOC phenomena on ASW operations • Increased awareness of the actual and potential impacts of METOC products on ASW operations • Increased awareness of the uncertainty in ASW METOC products • Improved assessment and management of METOC risks • Improved use of METOC products in planning and conducting ASW operations • Increased use of METOC information in ASW R&A • Improved use of METOC information in ASW education and training • Improved use of METOC information in planning, developing, and assessing new ASW technologies • Improved use of METOC information in allocating ASW resources ASW METOC Metrics Overview, Jan 08, murphree@nps.edu

  17. ASW METOC Metrics: Return on Investment (ROI) Returns on Investment in This Project: Some Key Returns 1. Objective, quantitative, automated, near real time assessments of existing and emerging METOC support for ASW, including assessments of: a. METOC product generation and delivery processes b. Product quality c. Uncertainty and confidence in products d. Product usage e. Impacts of METOC support on outcomes of ASW operations 2. Datasets, models, data analyses, and assessments that increase CNMOC’s ability to: a. Evaluate and mitigate environmental risks to ASW b. Effectively participate in ASW fleet synthetic training, reconstruction and analysis, campaign analysis, and other modeling, simulation, and assessment programs c. Conduct fact-based decision making in planning and in management of CNMOC personnel and other resources ASW METOC Metrics Overview, Jan 08, murphree@nps.edu

  18. Component One: Data Collection, Analysis, and Reporting Overview of ASW Metrics Data Collection System [Diagram: metrics data flow into a central metrics server, via data entry and quality control, from NOAT, MPRA, RBC, and R&A metrics nodes; inputs include forecasts/analyses, in situ and freeform data, contact information, customer measures of success, watch officer logs, exercise intentions, planning impacts, NOAT exercise surveys, flag objective outcomes, and exercise data records; METOC and non-METOC data sources are distinguished.] Design and implement a scalable, SIPRnet-based metrics data collection, computation, and display system customized for RBC, deployed NOATs, MPRA, and overall exercise level support of ASW operations. ASW METOC Metrics Overview, Jan 08, murphree@nps.edu
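As a notional illustration of the kind of record a metrics node might pass through data entry and quality control to the metrics server, the sketch below defines a simple forecast/verification record and a minimal QC gate. The field names and QC thresholds are assumptions for illustration only, not the actual system schema.

    # Notional metrics-node record plus a minimal quality-control check before
    # upload to the metrics server. Field names and QC rules are illustrative.
    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional

    @dataclass
    class ForecastRecord:
        node: str                  # e.g., "NOAT", "MPRA", "RBC", "R&A"
        parameter: str             # e.g., "SLD", "BLG", "SST"
        valid_time: datetime
        lat: float
        lon: float
        forecast_value: float
        observed_value: Optional[float] = None  # filled in once a verifying ob exists
        units: str = ""

    def passes_qc(rec: ForecastRecord) -> bool:
        """Minimal data-entry quality control before forwarding to the server."""
        if not (-90.0 <= rec.lat <= 90.0 and -180.0 <= rec.lon <= 180.0):
            return False
        if rec.parameter == "SLD" and not (0.0 <= rec.forecast_value <= 2000.0):
            return False  # sonic layer depth (ft) outside a plausible range
        return True

    rec = ForecastRecord(node="NOAT", parameter="SLD",
                         valid_time=datetime(2007, 8, 10, 0, 0),
                         lat=14.5, lon=145.2, forecast_value=180.0, units="ft")
    print("record accepted for upload:", passes_qc(rec))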

  19. Acknowledgements ASW RBC team VS07 METOC metrics data collectors: • CDR(s) Tony Cox • LCDR(s) Tara Lambert • LT Eric MacDonald • LT Scott Parker • Bob Miyamoto Murphree and Ford, VS07 Findings, 31 Jan 08, murphree@nps.edu, bruce@clearscienceinc.com

  20. Goals • Conduct real-world test of prototype data collection process • Collect and analyze exercise level data • Refine verification and impact metrics • Refine plans for overall ASW METOC metrics project Murphree and Ford, VS07 Findings, 31 Jan 08, murphree@nps.edu, bruce@clearscienceinc.com

  21. Methods • Collect data: • Forecasts/analyses for all three BonD tiers • Verifying observations • Product usage • Recommendations • Customer performance/impressions • METOC impacts on planning, execution, and outcomes • Collect and analyze additional data on ocean and acoustic models, performance surfaces, and customer performance (R&A) • Develop revised data collection and analysis process for overall ASW METOC metrics project Murphree and Ford, VS07 Findings, 31 Jan 08, murphree@nps.edu, bruce@clearscienceinc.com

  22. METOC Data and Performance [Table not reproduced. Footnotes: * Daily commodore’s brief; nc – not computed due to low number of verifying observations. Number of sensor performance predictions provided to at-sea staffs and CTF74 = 160.] Source: VS07 NOAT data collection forms Murphree and Ford, VS07 Findings, 31 Jan 08, murphree@nps.edu, bruce@clearscienceinc.com
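The "nc" convention above can be enforced automatically: report a verification statistic only when enough verifying observations exist. A minimal sketch follows, assuming a threshold of five observations (the actual cutoff used for VS07 is not stated).

    # Report a verification statistic only when the sample is large enough;
    # otherwise return "nc". The threshold of 5 is an assumed value.
    MIN_OBS = 5

    def mean_abs_error(pairs):
        """pairs: list of (forecast, observed) values for one briefed parameter."""
        if len(pairs) < MIN_OBS:
            return "nc"  # not computed due to low number of verifying observations
        return sum(abs(f - o) for f, o in pairs) / len(pairs)

    print(mean_abs_error([(150, 210), (200, 240), (120, 150)]))   # -> nc
    print(mean_abs_error([(150, 210), (200, 240), (120, 150),
                          (180, 230), (220, 280), (160, 200)]))   # -> about 46.7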

  23. METOC Performance SLD: Actual vs. Predicted [Plot: MPRA NOAT predictions (Predicted) and AXBT measurements (Actual) of sonic layer depth (SLD) from Valiant Shield 2007 (VS07; preliminary data analysis). Note the tendency to under-predict SLD by about 50 ft.] Source: VS07 MPRA data collection forms Murphree and Ford, VS07 Findings, 31 Jan 08, murphree@nps.edu, bruce@clearscienceinc.com
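A bias statement like the one above can be checked directly from the paired prediction/AXBT records. A minimal sketch, using invented pairs rather than the actual VS07 data, estimates the mean SLD prediction bias and a rough two-standard-error range around it:

    # Estimate SLD prediction bias (predicted minus observed) from paired data.
    # The numbers are invented stand-ins, not the actual VS07 pairs.
    import numpy as np

    predicted = np.array([140., 180., 160., 200., 150., 170., 190., 160.])
    observed  = np.array([190., 240., 200., 250., 210., 220., 230., 220.])

    bias = predicted - observed                  # negative => under-prediction
    mean_bias = bias.mean()
    stderr = bias.std(ddof=1) / np.sqrt(bias.size)

    print(f"mean bias: {mean_bias:.1f} ft")      # about -51 ft for these numbers
    print(f"rough 95% range: {mean_bias - 2*stderr:.1f} to {mean_bias + 2*stderr:.1f} ft")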

  24. METOC Performance BLG: Actual vs. Predicted [Plot: MPRA NOAT predictions (Predicted) and AXBT measurements (Actual) of below layer gradient (BLG) from Valiant Shield 2007 (VS07; preliminary data analysis). Note the tendency to over-predict BLG.] Source: VS07 MPRA data collection forms Murphree and Ford, VS07 Findings, 31 Jan 08, murphree@nps.edu, bruce@clearscienceinc.com

  25. Customer Performance Source: VS07 NOAT data collection forms Murphree and Ford, VS07 Findings, 31 Jan 08, murphree@nps.edu, bruce@clearscienceinc.com

  26. NOAT Recommendations Source: VS07 NOAT data collection forms Murphree and Ford, VS07 Findings, 31 Jan 08, murphree@nps.edu, bruce@clearscienceinc.com

  27. NOAT Recommendations *Commonly reported by CTF74 due to CTF74 not being in a position to mandate recommendations Source: VS07 NOAT data collection forms Murphree and Ford, VS07 Findings, 31 Jan 08, murphree@nps.edu, bruce@clearscienceinc.com

  28. Post-Exercise NOAT Survey • Extensive post-VS07 survey sent to NOAT metrics collectors and forecasters on 12Aug07 • 8 responses received • Divided into sections entitled: • Collecting Data on METOC Performance • Collecting Data on Customer Performance • Collecting Data on Tactical Recommendations • Sources of Information for NOAT Briefings • Overall Impressions Murphree and Ford, VS07 Findings, 31 Jan 08, murphree@nps.edu, bruce@clearscienceinc.com

  29. Post-Exercise NOAT Survey Results • Most commonly verifiable items in briefs: SLD, SST • Most accurate forecasts: SLD, SST, active ranges • Least accurate forecasts: COF, passive ranges • Most valued forecast: sensor performance (100% agreement) SQS-53(c) • Customer satisfaction with sensor performance predictions: Very satisfied • Best source of sensor performance verification: contact reports, contact logs • On scale of 1 to 10, what was the accuracy of VS07 sensor performance predictions: average 7.3 Murphree and Ford, VS07 Findings, 31 Jan 08, murphree@nps.edu, bruce@clearscienceinc.com

  30. Post-Exercise NOAT Survey Results • Customer measures of success • Kills – contact reports/logs • Contacts – contact reports/logs • Tactical recommendations • Most passed on in briefs • Most passed on by team lead or forecaster • How should tactical recommendations/outcomes be recorded by NOATs? • Directly from NOAT (SITREP) • Common reasons for not implementing tactical recommendations: • Flight ops • Not believed • SOE • Best way for NOAT to determine tactical recommendation outcomes: Interaction with the ASW watch Murphree and Ford, VS07 Findings, 31 Jan 08, murphree@nps.edu, bruce@clearscienceinc.com

  31. Post-Exercise NOAT Survey Results TOFA Impact • Estimate of percentage of commodore briefs in which TOFA information was included: Average 82% • Least liked about TOFA – WX interaction with ocean not addressed, extra slides, posted time-late • Impact of TOFA on scale of 1 to 10: Average: 7.6 Murphree and Ford, VS07 Findings, 31 Jan 08, murphree@nps.edu, bruce@clearscienceinc.com

  32. Post-Exercise NOAT Survey Results Performance Surface Impact • Estimate of % of commodore briefs in which performance surface information was included: Average 82% • Least liked about performance surface – Consistent views, focus on highest interest areas • Impact of performance surface on scale of 1 to 10: Average: 6.75 Murphree and Ford, VS07 Findings, 31 Jan 08, murphree@nps.edu, bruce@clearscienceinc.com

  33. Post-Exercise NOAT Survey Results Decision Layer Impact • Estimate of % of commodore briefs in which decision layer information was included: Average 18% • Least liked about products – not fully understood by NOAT personnel, low resolution • Impact of decision layer products on scale of 1 to 10: Average: 1 Murphree and Ford, VS07 Findings, 31 Jan 08, murphree@nps.edu, bruce@clearscienceinc.com

  34. Post-Exercise NOAT Survey Results • Shining moments: • “[NOAT] fully integrated into planning process” • “every slide [in brief] had force impacts” • “convincing DESRON to…adapt screen to the environment” • “we helped with the success of the exercise” • Biggest busts: • “very poor start” • “slow at catching dropping SLD” • “too much faith in the model” • “[overly] optimistic screen formation” Murphree and Ford, VS07 Findings, 31 Jan 08, murphree@nps.edu, bruce@clearscienceinc.com

  35. Post-Exercise NOAT Survey Results RBC Responsiveness • Responsiveness of RBC on scale of 1 to 10: Average: 7.8 • Comments on RBC effectiveness • More customized products for NOATs • 24/7 SME availability • Fully answer questions Murphree and Ford, VS07 Findings, 31 Jan 08, murphree@nps.edu, bruce@clearscienceinc.com

  36. Post-Exercise NOAT Survey Results NOAT Effectiveness • Impact of NOAT on scale of 1 to 10: Average: 9.6 • Comments on improving NOAT impact • Improved understanding of products • Improved training • Better integration with ASW personnel Murphree and Ford, VS07 Findings, 31 Jan 08, murphree@nps.edu, bruce@clearscienceinc.com

  37. VS07 Data Collection Lessons Learned Lesson Learned: Data collection was labor intensive • Highlights the need for unobtrusive collection methods • Automated, or • At-your-fingertips access • Potential human data sources should understand the purpose and goals of metrics data collection Application: • Automate or embed data collection methods and actions into already-existing processes/routines/behaviors where possible • Strive for rapid institutionalization of metrics data collection behaviors • As we change behaviors, training and “marketing” become important transition factors • Decisive action by leaders in mandating change • Address “what’s in it for me?” at all levels Murphree and Ford, VS07 Findings, 31 Jan 08, murphree@nps.edu, bruce@clearscienceinc.com

  38. VS07 Data Collection Lessons Learned Lesson Learned: Data collection was labor intensive Axiom: Deploying data collectors (VS07) is an inefficient and non-cost-effective method for collecting data Application: • Maximize effort toward: • Creating the tools to facilitate collection by support providers • Changing support provider behavior to facilitate data collection • Use time and travel to increase our knowledge of the METOC support process and not for conducting data collection Murphree and Ford, VS07 Findings, 31 Jan 08, murphree@nps.edu, bruce@clearscienceinc.com

  39. VS07 Data Collection Lessons Learned Lesson Learned: Collectors’ skill in gathering data increased as exercise progressed • Partially due to bringing in “outsiders” to collect metrics data • Partially due to unfamiliarity with carrier/ASW ops Application: • Make manual data collection the responsibility of those “in” the support process • With transition phases, distribute high quality training • Multimedia DVD training • Live site visits Murphree and Ford, VS07 Findings, 31 Jan 08, murphree@nps.edu, bruce@clearscienceinc.com

  40. VS07 Data Collection Lessons Learned Lesson Learned: It is difficult to verify some METOC predictions • E.g., sensor performance predictions • Partially due to unfamiliarity with ASW ops/data sources Application: • For key metrics, use the most reliable/available data • E.g., BT records • For each node, employ consistent data sources • Data “drawn from the same well” • Use corporate knowledge of existing data sources • Set policy as to what a verifying ob is (e.g., BT w/i 10 NM of forecast area) • If necessary, forge alliances that will provide potential METOC metrics data • E.g., computer use records • E.g., DESRON data records/message traffic Murphree and Ford, VS07 Findings, 31 Jan 08, murphree@nps.edu, bruce@clearscienceinc.com
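A verifying-ob policy such as "BT within 10 NM of the forecast area" can be encoded as a simple great-circle distance check. The sketch below is a generic illustration: the 10 NM threshold comes from the example above, while the haversine matcher and the omission of a time window are assumed implementation details, not the project's actual rule set.

    # Illustrative check of a verifying-ob policy: accept a BT observation as a
    # verifying ob only if it falls within 10 NM of the forecast point. Distance
    # is computed with the haversine formula; time matching is omitted here.
    import math

    EARTH_RADIUS_NM = 3440.065   # mean Earth radius in nautical miles

    def haversine_nm(lat1, lon1, lat2, lon2):
        """Great-circle distance in nautical miles between two lat/lon points (degrees)."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
        return 2 * EARTH_RADIUS_NM * math.asin(math.sqrt(a))

    def is_verifying_ob(fcst_lat, fcst_lon, ob_lat, ob_lon, max_range_nm=10.0):
        return haversine_nm(fcst_lat, fcst_lon, ob_lat, ob_lon) <= max_range_nm

    # A BT dropped roughly 6 NM from the forecast point counts as a verifying ob.
    print(is_verifying_ob(14.50, 145.20, 14.58, 145.25))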

  41. VS07 Data Collection Lessons Learned Lesson Learned: It is difficult to collect some customer performance data • E.g., Submarine kills – differing definitions • Reporting differences among different levels • Customer may not collect complete performance data Application: • For key metrics, use the most reliable/available data • Collect performance data directly from our customer (DESRON), or smallest metric-able unit (bricks and house analogy) • Assemble data from multiple customers to formulate exercise-wide data • Employ consistent data sources • Data “drawn from the same well” • If necessary, forge alliances that will provide potential customer performance data • E.g., DESRON data records/staff resources • Exploit commonalities among staffs to ensure data consistency Murphree and Ford, VS07 Findings, 31 Jan 08, murphree@nps.edu, bruce@clearscienceinc.com

  42. VS07 Data Collection Lessons Learned Lesson Learned: List of key data to be collected and metrics to be calculated should be refined Application: • First, define the most important metrics desired within the context of readily attainable data • Better to get a few metrics accurately than to get a lot of metrics wrong, or seldom • Revisit list frequently • Resist the temptation to add onto metrics collection: • While applications are in development • Prior to institutionalization taking place Murphree and Ford, VS07 Findings, 31 Jan 08, murphree@nps.edu, bruce@clearscienceinc.com

  43. Apparent Errors in RELO NCOM and COAMPS Forecasts for VS07 Region and Period. Apparent Error (AE) defined as: Tau06 fcst minus Tau30 fcst for the same valid time from two sequential 00Z runs. Example: 07Aug00Z Tau06 minus 06Aug00Z Tau30. Focus on Tau06 minus Tau30, since data is assimilated at Tau03 and Tau06 but not afterwards. Forecasted variables considered: T at 2.5 m (T2.5), T at 7.5 m (T7.5), Surface T Flux (STF), Surface Wind Stress Vector (WSV), Surface Wind Stress Magnitude (WSM). Period: 06-12 Aug 2007 (00Z runs). Model forecast data provided by Frank Bub (NAVO)
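The AE calculation amounts to differencing two forecast fields valid at the same time from 00Z runs initialized 24 hours apart. A minimal sketch follows, assuming the fields can be loaded as 2-D arrays by model, variable, run time, and forecast hour; load_field() is a hypothetical loader that returns synthetic data here so the example runs.

    # Apparent Error: Tau06 forecast minus the Tau30 forecast valid at the same
    # time from the previous day's 00Z run. load_field() is a hypothetical
    # loader; here it returns a synthetic field so the sketch is runnable.
    import numpy as np
    from datetime import datetime, timedelta

    def load_field(model, variable, run_time, tau):
        rng = np.random.default_rng(abs(hash((model, variable, run_time, tau))) % 2**32)
        return rng.normal(size=(50, 60))  # stand-in for a gridded forecast field

    def apparent_error(model, variable, run_time, tau_new=6, tau_old=30):
        """AE = tau_new forecast from run_time minus tau_old forecast from the
        run 24 h earlier; both are valid at run_time + tau_new hours."""
        prev_run = run_time - timedelta(hours=tau_old - tau_new)
        return (load_field(model, variable, run_time, tau_new)
                - load_field(model, variable, prev_run, tau_old))

    ae = apparent_error("COAMPS", "wind_stress_magnitude", datetime(2007, 8, 10, 0))
    print("regional mean apparent error:", float(ae.mean()))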

  44. [Map: Apparent Error in COAMPS Wind Stress, Tau06 10Aug07 minus Tau30 09Aug07. Legend: analyzed values > forecasted / analyzed values < forecasted.]

  45. [Map: Apparent Error in COAMPS Wind Stress Magnitude, Tau06 10Aug07 minus Tau30 09Aug07. Legend: analyzed values > forecasted / analyzed values < forecasted.]

  46. [Map: Apparent Error in COAMPS Surface T Flux, Tau06 10Aug07 minus Tau30 09Aug07. Legend: analyzed values > forecasted / analyzed values < forecasted.]

  47. [Map: Apparent Error in Relo NCOM T 2.5 m, Tau06 10Aug07 minus Tau30 09Aug07. Legend: analyzed values > forecasted / analyzed values < forecasted.]

  48. [Map: Apparent Error in Relo NCOM T 7.5 m, Tau06 10Aug07 minus Tau30 09Aug07.]

  49. [Map: Apparent Error in COAMPS Wind Stress, Tau06 12Aug07 minus Tau30 11Aug07. Legend: analyzed values > forecasted / analyzed values < forecasted.]

  50. [Map: Apparent Error in COAMPS Wind Stress Magnitude, Tau06 12Aug07 minus Tau30 11Aug07. Legend: analyzed values > forecasted / analyzed values < forecasted.]
