

M&S Based System Development and Testing in a Joint Net-Centric Environment: Simulation-based Testing of Emerging Defense Information Systems. Bernard P. Zeigler


Presentation Transcript


  1. M&S Based System Development and Testing in a Joint Net-Centric Environment: Simulation-based Testing of Emerging Defense Information Systems. Bernard P. Zeigler, Professor, Electrical and Computer Engineering, University of Arizona; Co-Director, Arizona Center for Integrative Modeling and Simulation (ACIMS); and Joint Interoperability Test Command (JITC), Fort Huachuca, Arizona. www.acims.arizona.edu

  2. Overview • Modeling and simulation methodology is attaining core-technology status for standards conformance testing of information technology-based defense systems • We discuss the development of automated test case generators for testing new defense systems for conformance to military tactical data link standards • The DEVS discrete event simulation formalism has proved capable of capturing the information-processing complexities underlying the MIL-STD-6016C standard for message exchange and collaboration among diverse radar sensors • DEVS is being used in distributed simulation to evaluate the performance of an emerging approach to the formation of a single integrated air picture (SIAP)

  3. Arizona Center for Integrative Modeling and Simulation (ACIMS) • Mandated by the Arizona Board of Regents in 2001 • Mission – Advance Modeling and Simulation through • Research • Education • Technology Transfer • Spans University of Arizona and Arizona State University • Maintains links with graduates through research and technology collaborations

  4. Unique ACIMS/NGIT Relationship • A long-term contractual relationship between Northrop Grumman IT (NGIT) and ACIMS • Employs faculty, graduate/undergraduate students, and others • Performs M&S tasks for/at Fort Huachuca • Challenges: • Rationalize different ways of doing business • Deliver quality services on time and on budget • Benefits: • NGIT gains access to well-trained, creative talent • Students gain experience with real-world technical requirements and work environments • ACIMS technology is transferred

  5. Genesis • The Joint Interoperability Test Command (JITC) has the mission of standards compliance and interoperability certification • Traditional test methods require modernization to address • increasing complexity, • rapid change and development, and • agility of modern C4ISR systems • Simulation-based acquisition and net-centricity • pose challenges to JITC and the Department of Defense, • which must redefine the scope, thoroughness, and process of conformance testing

  6. Response – ATC-Gen • JITC has taken the initiative to integrate modeling and simulation into the automation of the testing process • Funded the development of the Automated Test Case Generator (ATC-Gen), led by ACIMS • Two years of R&D proved the feasibility and established the general direction • The requirements have evolved to a practical implementation level, with help from conventional testing personnel • ATC-Gen was deployed at the JITC in 2005 for testing SIAP systems and evaluated for its advantages over conventional methods. The ATC-Gen development team (NGIT and ACIMS) was selected as the winner in the Cross-Function category of the 2004/2005 M&S Awards presented by the National Training Systems Association (NTSA).

  7. ATC-Gen Goals and Approach. (Diagram: Test Driver, DEVS Simulator, SUT, HLA network.) • Goals: • To increase the productivity and effectiveness of standards conformance testing (SCT) at the Joint Interoperability Test Command (JITC) • To apply systems theory, modeling and simulation concepts, and current software technology to (semi-)automate portions of conformance testing • Objective: automate testing • Capture the specification as if-then rules in XML • Analyze the rules to extract I/O behavior • Synthesize DEVS test models • The Test Driver executes the models to induce testable behavior in the System Under Test (SUT) • Interact with the SUT over middleware

  8. Link-16: The Nut to Crack • The Joint Single Link Implementation Requirements Specification (JSLIRS) is an evolving standard (MIL-STD-6016C) for tactical data link information exchange and networked command/control of radar systems • It presents significant challenges to automated conformance testing: • The specification document states requirements in natural language • open to ambiguous interpretations • The document is voluminous • many interdependent chapters and appendixes • labor intensive and prone to error • potentially incomplete and inconsistent • Problem: how to ensure that a certification test procedure • is traceable back to the specification • completely covers the requirements • can be consistently replicated across the numerous contexts • military services, international partners, and commercial companies

  9. MIL-STD-6016C Excerpt. Original: 4.11.13.12 Execution of the Correlation. The following rules apply to the disposition of the Dropped TN and the retention of data from the Dropped TN upon origination or receipt of a J7.0 Track Management message, ACT = 0 (Drop Track Report), for the Dropped TN. The correlation shall be deemed to have failed if no J7.0 Track Management message, ACT = 0 (Drop Track Report), is received for the dropped TN after a period of 60 seconds from the transmission of the correlation request, and all associated processing for the correlation shall be cancelled. a. Disposition of the Dropped Track Number: (2) If own unit has R2 for the Dropped TN, a J7.0 Track Management message, ACT = 0 (Drop Track Report), shall be transmitted for the Dropped TN. If the Dropped TN is reported by another IU after transmission of the J7.0 Track Management message, own unit shall retain the dropped TN as a remote track and shall not reattempt to correlate the Retained TN and the Dropped TN for a period of 60 seconds after transmission of the J7.0 Track Management message.

XML Translation:

<rule trans="4.11.13" stimulus="4.11.13.12" reference="4.11.13.12.a.2" ruleName="R2 Unit transmits J7.0">
  <condition txt="Check for R2 own unit"
             expression="AutoCor==True and (CRair.TNcor.CORtest==3 and J32.TNref.CORtest==3) and CRair.R2held==1 AND J72.MsgTx==True">
  </condition>
  <action txt="Prepare J7.0 Drop Air Track message"
          expression="J70.TNsrc=TNown; J70.TNref=TNdrop; J70.INDex=0; J70.INDcu=0; J70.ACTVair=0; J70.SZ=0; J70.PLAT=0; J70.ASchg=0; J70.ACTtrk=0; J70.ENV=0; MsgTx(J70)">
  </action>
  <output txt="Transmit J7.0" outType="Message" outVal="J70"></output>
</rule>
<QA>
  <revisit name="DHF" date="10/16/04" status="Open">need to add timer for a period of 60 seconds in which correlation is not reattempted</revisit>
</QA>
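To make the captured rule structure concrete, here is a minimal sketch of reading one such <rule> element into a plain struct. It assumes the tinyxml2 library and a hypothetical file name; the actual ATC-Gen parser is not shown in the presentation, and error handling is largely omitted.

// Minimal sketch: read one <rule> element into a plain struct.
// Assumes the tinyxml2 library and a hypothetical file name ("rules_4_11_13.xml");
// the real ATC-Gen rule capture tooling may differ.
#include <cstdio>
#include <string>
#include "tinyxml2.h"

struct CapturedRule {
    std::string name, condition, action, output;
};

int main() {
    tinyxml2::XMLDocument doc;
    if (doc.LoadFile("rules_4_11_13.xml") != tinyxml2::XML_SUCCESS) return 1;

    const tinyxml2::XMLElement* rule = doc.FirstChildElement("rule");
    if (!rule) return 1;

    CapturedRule r;
    r.name      = rule->Attribute("ruleName");
    r.condition = rule->FirstChildElement("condition")->Attribute("expression");
    r.action    = rule->FirstChildElement("action")->Attribute("expression");
    r.output    = rule->FirstChildElement("output")->Attribute("outVal");

    std::printf("%s fires when: %s\n", r.name.c_str(), r.condition.c_str());
    return 0;
}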

  10. Transaction Level: Example. P.1.2 = Drop Track Transmit. (Diagram: a transaction proceeds through (1) Preparation, (2) Rule Processing, and (3) Modify C2 Record for TN and Transmit Message. Constraint (exception) rules cover validity checking, track display, time-outs, operator decisions, and periodic messages. Other consequent processing: jumps (stimuli) to other transactions of the specification; Stop; Do Nothing; Alerts; or a jump to another transaction. A DEVS timeline relates input to and output from the system at times t1 through t4.) Captions: the discrete event nature of the Link-16 specification; system theory provides levels of structure/behavior.

  11. ATC-Gen Top-Level Architecture. (Diagram: Rule Capture Component, Rule Formalization Component, Rule Set, Property Analyzer, and Test Generation Component, connected to the SUT, a Sensor, and Other Federates; human roles include the Analyst, Subject Matter Expert, Logic Engineer, and Test Engineer.)

  12. ATC Gen Overview • Standard-to-XML Translation • The Analyst interprets the requirements text to extract state variables and rules, where rules are written in the form: If P is true now (Condition), Then do action A later (Consequence), Unless Q occurs in the interim (Exception) • Dependency Analysis & Test Generation • The DependencyAnalyzer (DA) determines the relationship between rules by identifying shared state variables • The Test Model Generator converts Analyst-defined test sequences to executable simulation models • Test Driver • The Test Driver connects to and interacts with the SUT via HLA or Simple J interfaces to perform conformance testing • Validated against legacy test tools
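The Condition / Consequence / Exception rule form can be held as data. The following is an illustrative C++ sketch, with state variable names and the 60-second window taken from the 4.11.13.12 drop-track excerpt; the real ATC-Gen representation may differ.

// Illustrative data form of the Condition / Consequence / Exception pattern.
// Variable names follow the 4.11.13.12 excerpt; this is a sketch, not the
// actual ATC-Gen rule representation.
#include <functional>
#include <iostream>
#include <string>

struct LinkState {                      // a few of the state variables named in the excerpt
    bool autoCor = true;                // AutoCor
    bool r2Held = true;                 // own unit holds R2 for the Dropped TN
    int  corTest = 3;                   // CRair.TNcor.CORtest
    bool dropTnReportedByOtherIU = false;
};

struct Rule {
    std::string id;
    std::function<bool(const LinkState&)> condition;   // "If P is true now"
    std::function<void(LinkState&)> consequence;       // "Then do action A later"
    std::function<bool(const LinkState&)> exception;   // "Unless Q occurs in the interim"
    double windowSeconds;                               // how long Q may intervene
};

int main() {
    Rule dropTrack{
        "4.11.13.12.a.2",
        [](const LinkState& s) { return s.autoCor && s.corTest == 3 && s.r2Held; },
        [](LinkState&) { std::cout << "transmit J7.0 Drop Track Report\n"; },
        [](const LinkState& s) { return s.dropTnReportedByOtherIU; },
        60.0
    };

    LinkState s;
    if (dropTrack.condition(s) && !dropTrack.exception(s))
        dropTrack.consequence(s);   // in the real driver, the action is scheduled, not immediate
    return 0;
}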

  13. ATC Gen Tool Process

  14. Rule Interpretation Example

  15. ATC Gen Tool Process

  16. Dependency Analysis: Shared State Variables. Automated visual display of rule relations through shared input variables and output variables. • Output: • CAT Alert • Message Transmission • Operator Display
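A minimal sketch of the dependency idea: rule B depends on rule A when a state variable written by A is read by B. The rule and variable names below are illustrative, not actual DependencyAnalyzer output.

// Minimal sketch of dependency analysis over shared state variables.
// Rule and variable names are illustrative.
#include <iostream>
#include <set>
#include <string>
#include <vector>

struct RuleIO {
    std::string name;
    std::set<std::string> reads;   // input state variables of the rule
    std::set<std::string> writes;  // output state variables of the rule
};

int main() {
    std::vector<RuleIO> rules = {
        {"InitiateCorrelation", {"J32.TNref"},                           {"CRair.TNcor.CORtest"}},
        {"ExecuteCorrelation",  {"CRair.TNcor.CORtest", "CRair.R2held"}, {"J70.MsgTx"}},
    };

    // For every ordered pair (A, B), report A -> B when A writes a variable that B reads.
    for (const auto& a : rules)
        for (const auto& b : rules) {
            if (a.name == b.name) continue;
            for (const auto& v : a.writes)
                if (b.reads.count(v))
                    std::cout << a.name << " -> " << b.name << " via " << v << "\n";
        }
    return 0;
}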

  17. Test Sequences: Manually Derive Paths

  18. Example: Correlation Test Rules

  19. Correlation Test Sequence Examples “Correlation was Successful”

  20. “Correlation was Successful”

  21. 1st Prohibition Failed “Correlation Failed”

  22. 2nd Prohibition Failed “Correlation Failed”

  23. 3rd Prohibition Failed “Correlation Failed”

  24. Test Sequences: Create Paths Through the ATC Gen GUI

  25. Determining the initial state and message field values required to drive the SUT through a sequence. Analyst: • Determine the data needed to execute a test sequence • Set state variables and field values accordingly

  26. Test Case: Data Input Through the ATC Gen GUI

  27. Test Case: Generated XML. (Screenshot: initial state values and the I/O of the test case, including reception of J3.2 messages and transmission of J7.2 and J7.0 messages.)

  28. ATC Gen Tool Process

  29. Test Driver for Controlled Testing. (Diagram: a Coupled Test Model containing Component Test Models 1, 2, and 3, each exchanging messages Jx1,data1 through Jx4,data4 with the SUT over middleware.)

  30. Test Model Generation for Controlled Testing. Mirroring (flipping) the transactions of a SUT model (system model behavior selected as a test case) allows automated creation of a test model. SUT model: receiveAndProcess(Jx1,data1), receiveAndProcess(Jx2,data2), receiveAndProcess(Jx3,data3), transmit(Jx4,data4). Test Model: holdSend(Jx1,data1,t1), holdSend(Jx2,data2,t2), holdSend(Jx3,data3,t3), waitReceive(Jx4,data4). (Timeline: Jx1,data1 through Jx4,data4 exchanged at times t1 through t4.)
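A small sketch of the mirroring step, under the convention stated on the slide: each message the SUT is expected to receive becomes a holdSend in the test model, and each message it is expected to transmit becomes a waitReceive. The real generator emits DEVS models rather than this flat list; types and names here are illustrative.

// Sketch of "mirroring" SUT transactions into test-model steps.
#include <iostream>
#include <string>
#include <vector>

enum class Dir { IntoSUT, OutOfSUT };

struct Transaction { Dir dir; std::string msg; double time; };   // SUT-side view
struct TestStep    { std::string primitive; std::string msg; double time; };

std::vector<TestStep> mirror(const std::vector<Transaction>& sut) {
    std::vector<TestStep> test;
    for (const auto& t : sut) {
        if (t.dir == Dir::IntoSUT)
            test.push_back({"holdSend", t.msg, t.time});      // the tester injects it
        else
            test.push_back({"waitReceive", t.msg, t.time});   // the tester expects it back
    }
    return test;
}

int main() {
    std::vector<Transaction> sut = {
        {Dir::IntoSUT,  "Jx1,data1", 1.0},
        {Dir::IntoSUT,  "Jx2,data2", 2.0},
        {Dir::IntoSUT,  "Jx3,data3", 3.0},
        {Dir::OutOfSUT, "Jx4,data4", 4.0},
    };
    for (const auto& s : mirror(sut))
        std::cout << s.primitive << "(" << s.msg << ", t=" << s.time << ")\n";
    return 0;
}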

  31. Traceability View Through the ATC Gen GUI. (Screenshot: a selected rule in XML and its associated test sequences in XML.)

  32. Test Model Validation & Generation. (Diagram: Test Case → Test Model Generator → Generated Test Case Mirror (XML) → Test Model.)

Test Model:

#include "hierSequence.h"
#include "PPLI.h"
#include "RemoteTNdrop.h"

const port_t hierSeqDigraph::start = 0;
const port_t hierSeqDigraph::inJmsg = 1;
const port_t hierSeqDigraph::pass = 2;
const port_t hierSeqDigraph::outJmsg = 3;

hierSeqDigraph::hierSeqDigraph() : staticDigraph()
{
    // Component test model for PPLI messages
    PPLI *pp = new PPLI();
    add(pp);
    couple(this, this->start, pp, pp->start);
    couple(pp, pp->outJmsg, this, this->outJmsg);

    // Component test model for the remote TN drop sequence
    RemoteTNdrop *p1 = new RemoteTNdrop();
    add(p1);
    couple(this, this->start, p1, p1->start);
    couple(this, this->inJmsg, p1, p1->inJmsg);
    couple(p1, p1->outJmsg, this, this->outJmsg);
}

  33. Test Model Execution. (Diagram: the generated Test Model in C++, shown on the previous slide, is executed by the Test Driver, which interacts with the System Under Test.)

  34. SIAP/IABM: Successor to Link-16 • SIAP (Single Integrated Air Picture) objective • Improve the adequacy and fidelity of information to form a shared understanding of the tactical situation • Aegis ships would be able to target their missiles using information obtained by an Air Force Airborne Warning and Control System (AWACS) plane • All users in the battlespace will be able to trust that data to make decisions faster • The Integrated Architecture Behavior Model (IABM) requires that all sensors utilize a standard reference frame for conveying information about the location of targets • The Navy is the lead – the IABM will be integrated into its and other services’ major weapons systems • Developed by the Joint SIAP System Engineering Organization (JSSEO), Arlington, Va., a sub-office of the Assistant Secretary of the Army for Acquisition, Logistics and Technology • JITC is the test agency mandated to do test and evaluation – the initial test was Config05 at the end of last year

  35. ATC Gen Test Cases: MIL-STD-6016C • December objective: • Complete 17 Link-16 test cases for IABM Config05 • Produced 28 test cases and reported results ahead of schedule • Currently: 29 passed, 4 failed, which prompted corrections in new time-box releases by JSSEO (8 untested or in progress)

  36. Test Manager for Opportunistic Testing • Replace Test Models by Test Detectors • Deploy Test Detectors in parallel, fed by the Observer • A Test Detector activates a test when its conditions are met • Test results are sent to a Collector for further processing. (Diagram: the Observer taps the SUO and feeds messages Jx1,data1 through Jx4,data4 to Test Detectors 1, 2, and 3 under the Test Manager; results go to the Results Collector; Other Federates are also connected.)

  37. Test Detector Generation for Opportunistic Testing. The Test Detector watches for the arrival of the given subsequence of messages to the SUO and then watches for the corresponding system output. • Define a new primitive, processDetect, that replaces holdSend • The Test Detector • tries to match the initial subsequence of messages received by the SUO • when the initial subsequence is successfully matched, it enables waitReceive (or waitNotReceive) to complete the test. SUO: receiveAndProcess(Jx1,data1), receiveAndProcess(Jx2,data2), receiveAndProcess(Jx3,data3), transmit(Jx4,data4). Test Detector: processDetect(Jx1,data1,t1), processDetect(Jx2,data2,t2), processDetect(Jx3,data3,t3), waitReceive(Jx4,data4). (Timeline: t1 through t4.)
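A minimal sketch of such a passive sequence matcher, assuming a simple string representation of messages; the actual Test Detector is a DEVS model with timing (the t1..t3 arguments are omitted here).

// Sketch of a passive sequence matcher: processDetect steps try to match
// observed SUO inputs in order; once all match, the detector waits for the
// expected output (waitReceive). Names follow the slide; details are assumed.
#include <cstddef>
#include <iostream>
#include <string>
#include <vector>

class TestDetector {
    std::vector<std::string> inputsToMatch;  // processDetect targets, in order
    std::string expectedOutput;              // waitReceive target
    std::size_t matched = 0;
public:
    TestDetector(std::vector<std::string> in, std::string out)
        : inputsToMatch(std::move(in)), expectedOutput(std::move(out)) {}

    // Feed every observed SUO input here.
    void observeInput(const std::string& msg) {
        if (matched < inputsToMatch.size() && msg == inputsToMatch[matched])
            ++matched;
    }
    // Feed every observed SUO output here; returns true when the test passes.
    bool observeOutput(const std::string& msg) {
        return matched == inputsToMatch.size() && msg == expectedOutput;
    }
};

int main() {
    TestDetector det({"J2.2", "J3.2"}, "J7.0");
    det.observeInput("J2.2");
    det.observeInput("J3.2");
    std::cout << (det.observeOutput("J7.0") ? "pass" : "not yet") << "\n";
    return 0;
}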

  38. Observer & System Under Observation (SUO). (Diagram: an Observer for the system, e.g. a DEVS model, is coupled to the inports and outports of the SUO.) The Observer taps into the inputs and outputs of the SUO, gathering input/output data and forwarding it for testing.
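A small sketch of the tap idea: the Observer receives copies of SUO inputs and outputs and forwards them to a consumer such as the Test Manager. The callback form below is illustrative; the real Observer is a DEVS component coupled to the SUO's ports.

// Sketch of an observer that taps SUO traffic and forwards it for testing.
#include <functional>
#include <iostream>
#include <string>

class Observer {
    std::function<void(const std::string&, const std::string&)> forward;
public:
    explicit Observer(std::function<void(const std::string&, const std::string&)> f)
        : forward(std::move(f)) {}
    void tapInput(const std::string& msg)  { forward("in",  msg); }   // copy of an SUO input
    void tapOutput(const std::string& msg) { forward("out", msg); }   // copy of an SUO output
};

int main() {
    Observer obs([](const std::string& dir, const std::string& msg) {
        std::cout << dir << ": " << msg << "\n";   // here just logged; normally fed to detectors
    });
    obs.tapInput("requestImmediateCAS");
    obs.tapOutput("CASResourcesSpec");
    return 0;
}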

  39. Example: Joint Close Air Support (JCAS) Scenario. Natural Language Specification:
JTAC works with ODA!
JTAC is supported by a Predator!
JTAC requests ImmediateCAS to AWACS!
AWACS passes requestImmediateCAS to CAOC!
CAOC assigns USMCAircraft to JTAC!
CAOC sends readyOrder to USMCAircraft!
USMCAircraft sends sitBriefRequest to AWACS!
AWACS sends sitBrief to USMCAircraft!
USMCAircraft sends requestForTAC to JTAC!
JTAC sends TACCommand to USMCAircraft!
USMCAircraft sends deconflictRequest to UAV!
USMCAircraft gets targetLocation from UAV!

  40. Observer of AWACS with JCAS. The Observer is connected to the SUO and monitors its I/O traffic. (Screenshot: data gathered by the Observer.) addObserver(USMCAircraft, JCASNUM1);

  41. Test Detector Prototype: Sequence Matcher. processDetect(J2.2,data1,t1), processDetect(J3.2,data2,t2), waitReceive(J7.0,data3,t3). Sequential triggering, the same as in test models.

  42. Example of the Effect of State: AWACS Rules
R1: if phase = "passive" & receive = "ImmediateCASIn" then output = "CASResourcesSpec" & state = "doSurveillance"
R2: if state = "doSurveillance" & receive = "sitBriefRequestIn" then output = "sitBriefOut" & phase = "passive"
matchsequence 1 (initial state = passive): processDetect(ImmediateCASIn,"",1); waitReceive(CASResourcesSpec,"")
matchsequence 2 (initial state = doSurveillance): processDetect(sitBriefRequestIn,"",1); waitReceive(sitBriefOut,"")
The tester needs to know the state to enable the second sequence. (Diagram: inputs i1, i2 and outputs o1, o2, with the state moving between passive and doSurveillance.)

  43. Solution: make activation of matchsequence2 conditional on matchsequence1; matchsequence2 can only start when matchsequence1 has been performed successfully.
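A sketch of that chaining, in the same passive-matching style sketched earlier: the second match sequence is fed messages only after the first has passed, which stands in for knowing that the AWACS has moved from the passive state to doSurveillance. Message names follow the slide.

// Sketch of conditional activation: matchsequence2 is enabled by matchsequence1.
#include <iostream>
#include <string>
#include <vector>

struct MatchSequence {
    std::string trigger;    // processDetect target
    std::string expected;   // waitReceive target
    bool triggered = false, done = false;

    void observe(const std::string& msg) {
        if (!triggered) { if (msg == trigger) triggered = true; return; }
        if (!done && msg == expected) done = true;
    }
};

int main() {
    MatchSequence seq1{"ImmediateCASIn", "CASResourcesSpec"};
    MatchSequence seq2{"sitBriefRequestIn", "sitBriefOut"};

    std::vector<std::string> traffic =
        {"ImmediateCASIn", "CASResourcesSpec", "sitBriefRequestIn", "sitBriefOut"};
    for (const std::string& msg : traffic) {
        seq1.observe(msg);
        if (seq1.done) seq2.observe(msg);   // matchsequence2 activates only after seq1 passes
    }
    std::cout << "seq1 " << (seq1.done ? "pass" : "fail")
              << ", seq2 " << (seq2.done ? "pass" : "fail") << "\n";
    return 0;
}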

  44. Observation Test of AWACS. (Screenshot: the Observer of AWACS, the AWACS model, and the Test Manager.)

  45. Problem with a Fixed Set of Test Detectors • After a test detector has been started up, a message may arrive that requires it to be re-initialized • The parallel search and processing required by the fixed presence of multiple test detectors under the test manager may limit the processing and/or the number of monitor points • It does not allow changing from one test focus to another in real time, e.g. going from format testing to correlation testing once the former has been satisfied • Solution: • on-demand inclusion of test detector instances • remove a detector when it is known to be “finished” • employ DEVS variable structure capabilities • requires intelligence to decide inclusion and removal

  46. Dynamic Test Suite: Features • Test Detectors are inserted into the Test Suite by Test Control • Test Control uses a table to select Detectors based on the incoming message • Test Control passes on the just-received message and starts up the Test Detector • Each Detector stage starts up the next stage and removes itself from the Test Suite as soon as the result of its test is known • If the outcome is a pass (the test is successful), then the next stage is started up
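A sketch of the table-driven start-up and removal described above; the DEVS variable-structure calls (addModel, addCoupling2, removeAncestorBrotherOf) are replaced here by a plain container, and message and detector names are taken from the JCAS example rather than the actual tool.

// Sketch of dynamic inclusion/removal: Test Control maps an incoming message
// type to a detector factory, starts the detector, and drops it once its
// verdict is known. A fail path (e.g. timeout) is omitted from this sketch.
#include <algorithm>
#include <functional>
#include <iostream>
#include <map>
#include <memory>
#include <string>
#include <vector>

struct Detector {
    std::string expects;            // message that completes this stage
    bool finished = false, passed = false;
    void observe(const std::string& msg) {
        if (msg == expects) { finished = true; passed = true; }
    }
};

int main() {
    // Table: incoming message type -> factory for the detector to start up.
    std::map<std::string, std::function<std::unique_ptr<Detector>()>> table;
    table["requestImmediateCAS"] = [] {
        auto d = std::make_unique<Detector>();
        d->expects = "CASResourcesSpec";
        return d;
    };

    std::vector<std::unique_ptr<Detector>> activeSuite;   // initially empty test suite

    std::vector<std::string> traffic = {"requestImmediateCAS", "CASResourcesSpec"};
    for (const std::string& msg : traffic) {
        auto it = table.find(msg);
        if (it != table.end())
            activeSuite.push_back(it->second());           // start up the matching detector
        for (auto& d : activeSuite) d->observe(msg);
        for (const auto& d : activeSuite)
            if (d->finished) std::cout << d->expects << (d->passed ? ": pass\n" : ": fail\n");
        // Remove detectors whose result is known.
        activeSuite.erase(
            std::remove_if(activeSuite.begin(), activeSuite.end(),
                           [](const std::unique_ptr<Detector>& d) { return d->finished; }),
            activeSuite.end());
    }
    return 0;
}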

  47. Dynamic Inclusion/Removal of Test Detectors. (Diagram: the Test Manager contains Test Control and the Active Test Suite.) When a message arrives, Test Control adds the induced test detectors into the test set:
addModel("test detector");
addCoupling2("Test Manager", "Jmessage", "test detector", "Jmessage");
A test detector subcomponent removes its enclosing test detector when the test case result is known (either pass or fail):
removeAncestorBrotherOf("TestControl");

  48. AWACS Opportunistic Testing in JCAS. (Screenshot: the CAS model with AWACS observation; Test Control with an initially empty Test Suite.)

  49. AWACS Opportunistic Testing in JCAS (cont’d). Test Control observes a CAS request message to AWACS; Test Control adds the appropriate Test Detector and connects it to the interface.

  50. AWACS Opportunistic Testing in JCAS (cont’d). The first-stage detector verifies receipt of the request message and prepares to start up the second stage; Test Control passes on the start signal and the request message.
