
Applying Modeling and Simulation Verification, Validation and Accreditation (VV&A) Techniques to Test and Laboratory Facilities
Dr. James Elele, Jeremy Smith, NAWCAD BSMVV Branch, James.Elele@navy.mil; David Hall, Charles Pedriani, SURVICE Engineering Company, Dave.Hall@survice.com


Presentation Transcript


  1. Applying Modeling and Simulation Verification, Validation and Accreditation (VV&A) Techniques to Test and Laboratory Facilities
Dr. James Elele, Jeremy Smith, NAWCAD BSMVV Branch, James.Elele@navy.mil
David Hall, Charles Pedriani, SURVICE Engineering Company, Dave.Hall@survice.com
ASME V&V Conference, 3 May 2012

  2. Introduction
• Tasking to support accreditation of Test and Evaluation (T&E) facilities in support of the IFF program
• Develop an accreditation case for T&E facilities for operational testing
• Applied the risk-based M&S VV&A approach to facilities
• Approach applied successfully to M&S for over 20 years
• Test case for future T&E facility accreditation efforts
• Successful application can support efforts to institutionalize the process for T&E as well as M&S
IFF = Identification Friend or Foe; T&E = Test and Evaluation; M&S = Modeling and Simulation

  3. M&S VV&A Definitions
• Verification: The process of determining that a model implementation and its associated data accurately represent the developer's conceptual description and specifications.
• Does the model do what the originator intended, and is it relatively error free? (“Did you build the model right?”)
• Validation: The process of determining the degree to which a model and its associated data are an accurate representation of the real world from the perspective of the intended uses of the model.
• How well do model results match real-world data, in the context of your needs? (“Did you build the right model?”)
• Accreditation: The official certification [determination] that a model, simulation, or federation of models and simulations and its associated data are acceptable for use for a specific purpose.
• Does the accreditation authority have adequate evidence to be confident that a model is fit for purpose? (“Did your customer accept it?”)
• Definitions from DODI 5000.61, dated 13 May 2003

  4. Underlying Principles
• The ultimate goal of VV&A efforts is to form a foundation for making good decisions
• The nature and extent of information required to support an accreditation decision is at the discretion of the accreditation authority and is generally based on an assessment of risk:
• Role of M&S results in the decision-making process
• Importance of the decision that the M&S is supporting
• Severity of the consequences of making incorrect decisions because the M&S were wrong
• Probability that analysis results based upon the M&S will be challenged
“Better is the Enemy of Good Enough!”
[Slide graphic contrasts a “Best of Show” model with one that is “Good Enough”]

  5. Steps to an Accreditation Decision
[Slide flow diagram:] Analyze Intended Use → Intended Use Statement → Develop M&S Reqts and Accreditation Info Reqts → Accreditation Plan → Develop/Execute Accreditation Plan (V&V) → Develop Accreditation Case → Review Accreditation Case → Make Accreditation Decision → Fit for Intended Use

  6. How Much Credibility Is “Enough”? It Depends on Risk
[Slide graphic: an M&S user crossing a makeshift bridge labeled “M&S”] A makeshift bridge is good enough if you need to cross a meandering shallow stream, BUT...

  7. Greater Risks... Indicate the Need for Evidence of Greater Credibility
[Slide graphic: the M&S bridge now spans from Problem to Credible Solution, resting on Supporting Evidence]

  8. V&V: The Central Pillars of Simulation Credibility
• Software (S/W) Accuracy (Verification): the simulation meets design requirements, operates as designed, and is free of errors in software
• Data Accuracy: simulation input data, validation data, and data manipulations are appropriate and accurate
• Output Accuracy (Validation): simulation outputs match the real world “well enough” to be of use in a particular problem
But, V&V are just the middle of the bridge!

  9. The “Other Pillars” of Simulation Credibility
• Capability: the simulation possesses all required functionality and fidelity for the problem being solved; anchors the M&S to the problem (“Does the M&S do what I need it to do?”)
• Usability: the simulation has adequate user support to facilitate correct operation and interpretation of its outputs; ties the M&S to a useful solution (“Can I be sure I’m not mis-using the M&S?”)
• Accuracy (of software, data, and outputs) remains the central pillar
[Slide graphic: a bridge from Problem to Credible Solution supported by the Capability, Accuracy, and Usability pillars, which rest on M&S Requirements and User Capabilities]

  10. The Essence of Accreditation
• M&S Requirements (defined by the user, formally or implied): Capability, Accuracy, Usability
• M&S Information (provided by the model developer or model proponent): Data Quality, M&S Documentation, Design Documentation, Configuration Mgt, V&V Results, etc.
• Identify M&S deficiencies
• Identify work-arounds, usage constraints, required improvements, and risks
• The accreditation decision is made within the problem context
To prove the M&S is suitable for the need requires an objective comparison of M&S information with M&S requirements within the context of the problem
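
The comparison at the heart of accreditation can be sketched as a simple set difference between what the user requires and what the developer can substantiate. The category names below follow the slide; which evidence actually exists in a given program is an invented illustration:

```python
# Sketch of the slide's comparison: M&S requirements (defined by the user)
# checked against M&S information (provided by the developer/proponent).
# Category names follow the slide; the example records are invented.

requirements = {"capability", "accuracy", "usability", "data quality"}
substantiated = {"capability", "usability"}  # e.g. docs and usage history exist

# Anything required but not substantiated is a deficiency needing a
# work-around, usage constraint, required improvement, or accepted risk.
deficiencies = requirements - substantiated
print(sorted(deficiencies))  # ['accuracy', 'data quality']
```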

  11. How Much V&V Is Enough? It Depends on Risk
• Risk means something “bad” might happen because you believed an incorrect simulation result
• Decisions based on M&S results are at risk
• VV&A reduces that risk
• RISK = PROBABILITY x IMPACT
[Slide graphic: a risk matrix over probability and impact axes, with cells graded LOW, MODERATE, and HIGH]
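
The RISK = PROBABILITY x IMPACT relation can be made concrete with a small sketch. The 1-5 rating scales and the LOW/MODERATE/HIGH cutoffs below are illustrative assumptions, not values from the briefing:

```python
# Risk as the product of probability and impact ratings (slide 11).
# The 1-5 scales and the bin cutoffs are assumed for illustration.

def risk_score(probability: int, impact: int) -> int:
    """Multiply 1-5 probability and impact ratings into a 1-25 score."""
    if not (1 <= probability <= 5 and 1 <= impact <= 5):
        raise ValueError("ratings must be between 1 and 5")
    return probability * impact

def risk_level(score: int) -> str:
    """Bin a 1-25 score into LOW / MODERATE / HIGH (assumed cutoffs)."""
    if score <= 5:
        return "LOW"
    if score <= 14:
        return "MODERATE"
    return "HIGH"

print(risk_level(risk_score(2, 2)))  # LOW
print(risk_level(risk_score(3, 4)))  # MODERATE
print(risk_level(risk_score(5, 5)))  # HIGH
```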

  12. Quantifying Risk Level
• Risk level values are:
• Subjective
• Consistent with MIL-STD-882
• Tailorable to each application
HIGHER RISK MEANS MORE CREDIBILITY EVIDENCE IS NEEDED TO ACHIEVE ACCREDITATION
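
One way to read “higher risk means more credibility evidence” is as a cumulative mapping from risk level to required evidence. The evidence tiers below are illustrative assumptions; as the slide notes, actual values are subjective and tailorable to each application:

```python
# Cumulative mapping from risk level to credibility evidence (slide 12).
# Tier contents are assumed examples, not taken from MIL-STD-882 or the
# briefing; a real program would tailor them to the application.

EVIDENCE_BY_RISK = {
    "LOW": ["developer documentation", "SME face validation"],
    "MODERATE": ["formal verification results", "benchmarking against accepted M&S"],
    "HIGH": ["independent V&V", "results validation against real-world data"],
}

def required_evidence(risk: str) -> list:
    """Higher risk levels require everything the lower tiers require, plus more."""
    tiers = ["LOW", "MODERATE", "HIGH"]
    evidence = []
    for tier in tiers[: tiers.index(risk) + 1]:
        evidence.extend(EVIDENCE_BY_RISK[tier])
    return evidence

print(len(required_evidence("LOW")), len(required_evidence("HIGH")))  # 2 6
```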

  13. Risk Reduction Strategies
• Risk = Likelihood x Impact
• Improved M&S credibility reduces likelihood
• Reduced reliance on M&S results reduces impact
[Slide graphic: a 5x5 likelihood-impact grid with Low, Moderate, and High risk regions; arrows show the two strategies moving risk toward the low corner]
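
The two strategies on the slide can be sketched as moves on an assumed 1-5 likelihood/impact grid; the step size of one rating per action is an illustrative assumption:

```python
# The two risk-reduction strategies of slide 13, on assumed 1-5 scales.
# Improved M&S credibility lowers the likelihood of believing a wrong
# result; reduced reliance on M&S results lowers the impact if you do.

def improve_credibility(likelihood: int, impact: int):
    """More V&V evidence -> lower likelihood of an undetected M&S error."""
    return max(1, likelihood - 1), impact

def reduce_reliance(likelihood: int, impact: int):
    """Corroborating data sources -> a wrong M&S result matters less."""
    return likelihood, max(1, impact - 1)

risk = (5, 4)                      # start in the high-risk corner
risk = improve_credibility(*risk)  # (4, 4)
risk = reduce_reliance(*risk)      # (4, 3)
print(risk, risk[0] * risk[1])     # (4, 3) 12
```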

  14. Application to Test and Laboratory Facilities
• This was a trial application of the M&S VV&A approach to test facilities
• Identification Friend-or-Foe (IFF) system testing to evaluate new IFF system performance
• Accreditation of test facilities required by Commander, Operational Test and Evaluation Force (COMOPTEVFOR)
• Facilities used for system assessment include:
• All-up ship radar and related system representations
• Simple stimulators
• Engineering Test Equipment (ETE) facility specific to IFF system testing

  15. ETE Facility
• Generates waveform signals to stimulate a production IFF transponder in the laboratory
• Used to evaluate the system requirement for resistance to signals from transmitters other than the desired transmitter
• Critical technical parameter: “susceptibility of the IFF system to false interrogations”
• Metric: probability of resistance to false signals (Pr)
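
The Pr metric lends itself to a straightforward trial-based estimate. The sketch below assumes Pr is estimated as the fraction of injected false interrogations the transponder resists; the trial counts are invented for illustration:

```python
# Estimating Pr, the probability of resistance to false interrogations,
# from laboratory trials in which the ETE facility injects false
# interrogation waveforms and counts transponder replies. The counts
# here are invented, not from the briefing.

def estimate_pr(false_interrogations: int, replies_to_false: int) -> float:
    """Point estimate: fraction of false interrogations the system resisted."""
    if false_interrogations <= 0:
        raise ValueError("need at least one injected false interrogation")
    return (false_interrogations - replies_to_false) / false_interrogations

print(estimate_pr(1000, 25))  # 0.975
```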

  16. Initial ETE Risk Assessment

  17. Initial ETE Assessment and Recommendations*
CAPABILITY:
• The intended use is clearly stated: to evaluate the probability of responding to a false signal (Pr)
• No formal design documents exist: recommend the laboratory design be adequately documented
ACCURACY:
• Input data are provided by actual hardware
• Recommend documenting a complete set of test cases and results, and any previous verification activities
• Recommend independent subject matter expert (SME) review of laboratory results
USABILITY:
• Recommend the facility develop and implement an overall configuration management plan
• The test approach appears to have been successfully used over a span of many years to support a variety of identification programs for DOD and the FAA: recommend the facility provide documented results of previous uses
* Incorporated into the ETE Facility Accreditation Plan

  18. Observations
• Application to a test facility is similar to M&S
• Biggest issue for both: getting good documentation
• Lack of configuration management plans
• Poor documentation of prior V&V results (they did V&V, they just didn’t write it down)
• The Accreditation Support Package (ASP) document format works equally well for both
• Differences:
• Developing an intended use statement is more natural for test facilities than for M&S
• Use of T&E facilities seems to have more focused objectives initially than use of M&S

  19. Accreditation Support Package
• EXECUTIVE SUMMARY
• ASP OVERVIEW
• CAPABILITY
• M&S Description • Functional Capabilities • Development History • Assumptions and Limitations • Implications for M&S Use
• ACCURACY
• Software Accuracy: Software Verification Results (Design Verification, Implementation Verification); Software Development and Management Environment (Software Development Environment, Configuration Management); Software Quality Assessment; Implications for M&S Use
• Data Accuracy: Input Data; Data Transformations; Implications for M&S Use
• Output Accuracy: Sensitivity Analysis; Benchmarking; Face Validation; Results Validation; Implications for M&S Use
• USABILITY
• Documentation Assessment • User Support • Usage History • Implications for M&S Use

  20. Thoughts on Broader Application
• The risk-based approach seems as applicable to test facilities as to M&S
• Risk assessment can also help prioritize which facilities justify spending more VV&A resources
• Suggest standardizing and institutionalizing a risk-based VV&A process for both M&S and T&E
• No consistent application across DOD for either
• Risk-based VV&A promotes cost-effective VV&A for both M&S and T&E facilities
