Model Validation in Model-Based Development
Kurt Woodham, L-3 Communications
Ajitha Rajan, Mats Heimdahl, University of Minnesota
OSMA SAS '08, September 8-12
Problem: Model Validation
• Model-Based Development (MBD) is here to stay
• Use of MBD is accelerating
  • An estimated 50% of NASA development projects use some form of MBD
• Many advantages: model checking, code generation, desktop testing, closed-loop simulation
  • Enhances early detection of requirements, design, or implementation defects
  • "Executable specifications" enable evaluation of behavior that might otherwise be relegated to inspections and testing
• How do we know the models are "right"?
  • Manually develop black-box tests
• When have we validated enough?
  • Measure test coverage on an implementation/model
Problem: Current Practice
• Measure black-box test coverage over the model
• This is an indirect measure:
  • Defects of omission in the model are not exposed (an incomplete model or a weak black-box test set)
  • An executable artifact is necessary
  • Adequacy can only be determined late in the development process
Goals of Project
• Define metrics for an objective, implementation-independent measure of the adequacy of a black-box test suite
• Develop tools to measure validation adequacy based on the defined metrics
• Provide the capability to auto-generate black-box test suites
Testing – What does it mean?
[Diagram: in general, testing asks whether the implementation implements the specification; Model-Based Testing (MBT) asks whether the source code implements the model; Assertion-Based Testing (ABT) asks whether the model implements the assertions.]
Our contribution is in providing novel ABT capabilities.
What are Assertions?
[Diagram: properties/formal assertions defined over a system with inputs in1 … ink and outputs out1 … outm; assertions can also be defined over components, interfaces, …]
Contributions – ABT
[Diagram: the model and the assertions feed three activities, (1) measure adequacy, (2) auto-generate black-box tests, and (3) validate / assess model and assertion completeness.]
We provide the following contributions in the assertion-based testing domain (numbered in the figure):
• An objective, implementation-independent measure of the adequacy of a black-box test suite
• Auto-generation of black-box validation tests directly from assertions
• An objective assessment of the completeness of the model as well as of the assertions
The Idea
Write assertions in a formal notation (temporal logic or synchronous observers), for example:
G(FD_On → Cues_On)
G((¬Onside_FD_On ∧ ¬Is_AP_Engaged) → X(Is_AP_Engaged → Onside_FD_On))
…then define structural coverage metrics to directly and objectively describe coverage of the assertions.
LTL Temporal Operators
[Diagram: an execution trace S0 S1 S2 S3 … Si, with A holding along the prefix and B at the end, used to illustrate the temporal operators.]
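The operator diagram itself is not reproduced; as a reference, the standard readings of the operators over a trace S0 S1 S2 … are given below (these are the textbook definitions, not necessarily the slide's exact wording):

    G A    A holds in every state of the trace
    F A    A holds in some state of the trace
    X A    A holds in the next state (S1 when evaluated at S0)
    A U B  B holds in some state, and A holds in every state before it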
Formalizing Assertions
"If the onside FD cues are off, the onside FD cues shall be displayed when the AP is engaged"
G((¬Onside_FD_On ∧ ¬Is_AP_Engaged) → X(Is_AP_Engaged → Onside_FD_On))
Possible coverage metrics:
• Assertion coverage: a single test case demonstrates that the assertion is satisfied
  • Prone to "dumb" tests, e.g., an execution in which the AP is never engaged
• A more rigorous metric is necessary
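To make the "dumb test" point concrete, here is a minimal sketch (Python; the variables fd and ap are hypothetical stand-ins for Onside_FD_On and Is_AP_Engaged, and the property is checked over a finite trace): a trace in which the AP is never engaged satisfies the assertion vacuously.

    def holds(trace):
        # G((not fd and not ap) -> X(ap -> fd)), checked over a finite trace of (fd, ap) pairs
        for i in range(len(trace) - 1):
            fd, ap = trace[i]
            next_fd, next_ap = trace[i + 1]
            if (not fd) and (not ap):            # antecedent true in state i
                if next_ap and not next_fd:      # consequent violated in state i+1
                    return False
        return True

    # "Dumb" test: the AP is never engaged, so the assertion holds without
    # ever exercising the behavior it was written to describe.
    dumb_trace = [(False, False), (False, False), (False, False)]
    print(holds(dumb_trace))   # True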
Task 1
• Define a collection of assertion coverage criteria
• Formalize the assertion coverage obligations
Antecedent Coverage
• Many of the assertions in the FGS are of the form "globally, if A occurs then B will occur": G(A → B)
• Two ways of satisfying A → B:
  • A is false
  • A is true and B is true
• Antecedent coverage: test cases must exercise the antecedent (see the sketch below)
• What if the antecedent is a disjunction, e.g., A ∨ C ∨ D → B? A single test can exercise the antecedent without ever exercising C or D individually.
[Diagram: a path S0 S1 … Sn with "not A" in the early states and "A, B" in the final state.]
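A minimal sketch of what the antecedent-coverage obligation asks of a single test, under the assumption that "exercise the antecedent" means the antecedent must become true in at least one state while the implication is never violated (state predicates are passed as Python callables; all names are illustrative only):

    def exercises_antecedent(trace, antecedent, consequent):
        # For G(antecedent -> consequent): the trace must never violate the
        # implication and must reach at least one state where the antecedent holds.
        exercised = False
        for state in trace:
            if antecedent(state):
                if not consequent(state):
                    return False        # implication violated
                exercised = True
        return exercised

    # With the antecedent A or C or D, a test in which only A becomes true already
    # discharges the obligation, even though C and D are never exercised.
    trace = [{"A": True, "C": False, "D": False, "B": True},
             {"A": False, "C": False, "D": False, "B": False}]
    print(exercises_antecedent(trace,
                               lambda s: s["A"] or s["C"] or s["D"],
                               lambda s: s["B"]))   # True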
Modified Condition/Decision Coverage (MC/DC)
To satisfy MC/DC:
• Every point of entry and exit in the model must be invoked at least once,
• Every basic condition in a decision in the model must take on all possible outcomes at least once, and
• Each basic condition must be shown to independently affect the decision's outcome.
[Diagram: truth table highlighting the basic conditions and the pairs of rows showing the independent effect of A and of B.]
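As a brute-force illustration of the third clause (a sketch, not the project's tooling), the following enumerates, for each basic condition, the pairs of input vectors that differ only in that condition and flip the decision's outcome:

    from itertools import product

    def independence_pairs(decision, names):
        # MC/DC independence pairs: two vectors differ only in condition c
        # and produce different outcomes, showing c's independent effect.
        pairs = {c: [] for c in names}
        vectors = list(product([False, True], repeat=len(names)))
        for v1 in vectors:
            for v2 in vectors:
                diff = [c for c, a, b in zip(names, v1, v2) if a != b]
                if len(diff) == 1 and decision(*v1) != decision(*v2):
                    pairs[diff[0]].append((v1, v2))
        return pairs

    # For the decision "A and B": A's effect is shown with B held true, and vice versa.
    print(independence_pairs(lambda A, B: A and B, ["A", "B"]))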
Unique First Cause (UFC) Coverage
Requirement: "The system shall eventually generate an Ack (A) or a Time Out (B)" – LTL property F(A ∨ B)
[Diagram: a path in which neither A nor B holds until A becomes true first satisfies the UFC obligation for A but not for B; to show the independent effect of B, a path in which B becomes true first is needed.]
Formal UFC obligations:
for A: ¬(A ∨ B) U (A ∧ ¬B)
for B: ¬(A ∨ B) U (B ∧ ¬A)
UFC Coverage
• G(A)+ = {A U (a ∧ G(A)) | a ∈ A+}
  G(A)− = {A U a | a ∈ A−}
• F(A)+ = {¬A U a | a ∈ A+}
  F(A)− = {¬A U (a ∧ G(¬A)) | a ∈ A−}
• (A U B)+ = {(A ∧ ¬B) U ((a ∧ ¬B) ∧ (A U B)) | a ∈ A+} ∪ {(A ∧ ¬B) U b | b ∈ B+}
  (A U B)− = {(A ∧ ¬B) U (a ∧ ¬B) | a ∈ A−} ∪ {(A ∧ ¬B) U (b ∧ ¬(A U B)) | b ∈ B−}
• X(A)+ = {X(a) | a ∈ A+}
  X(A)− = {X(a) | a ∈ A−}
Michael Whalen, Ajitha Rajan, Mats Heimdahl, and Steven Miller. Coverage Metrics for Requirements-Based Testing. In Proceedings of ISSTA 2006.
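A minimal sketch of how these rewrite rules can be mechanized, with formulas kept as plain strings. Only the temporal rules from the slide are encoded; the base case for an atomic condition a (a+ = {a}, a− = {¬a}) is an assumption made to keep the example small and is not stated on the slide.

    # A_pos/A_neg are the positive/negative obligation sets of the subformula A.
    def ufc_G(A, A_pos, A_neg):
        pos = [f"({A}) U (({a}) & G({A}))" for a in A_pos]
        neg = [f"({A}) U ({a})" for a in A_neg]
        return pos, neg

    def ufc_F(A, A_pos, A_neg):
        pos = [f"(!({A})) U ({a})" for a in A_pos]
        neg = [f"(!({A})) U (({a}) & G(!({A})))" for a in A_neg]
        return pos, neg

    def ufc_U(A, B, A_pos, A_neg, B_pos, B_neg):
        stem = f"(({A}) & !({B}))"
        pos = [f"{stem} U ((({a}) & !({B})) & (({A}) U ({B})))" for a in A_pos] + \
              [f"{stem} U ({b})" for b in B_pos]
        neg = [f"{stem} U (({a}) & !({B}))" for a in A_neg] + \
              [f"{stem} U (({b}) & !(({A}) U ({B})))" for b in B_neg]
        return pos, neg

    def ufc_X(A_pos, A_neg):
        return [f"X({a})" for a in A_pos], [f"X({a})" for a in A_neg]

    # Example: obligations for G(Cues_On) with the assumed atomic base case.
    print(ufc_G("Cues_On", ["Cues_On"], ["!Cues_On"]))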
Task 2 – Validation Adequacy Measurement Tool
We currently support the following coverage metrics:
• Assertion Coverage
• Assertion Antecedent Coverage
• Assertion UFC Coverage
Task 3
Automatically generate requirements-based tests from:
• the formal assertions, and
• an abstract model (the Assertion Model) created from the assertions and the environmental constraints (specified as invariants),
…to provide the defined assertion coverage.
Automatically Generating Requirements-Based Tests
[Diagram: the test-generation tool chain; the assertions and environmental constraints (specified as invariants) are the inputs, and the front end is common with the Adequacy Measurement Tool.]
What Are Model Checkers?
• A breakthrough technology of the 1990s, widely used in hardware verification (Intel, Motorola, IBM, …)
• Several different types of model checkers: explicit-state, symbolic, bounded, infinite bounded, …
• Exhaustive search of the global state space
  • Considers all combinations of inputs and states
  • Equivalent to exhaustive testing of the model
• Produces a counterexample if a property is not true
• Easy to use
  • "Push-button" formal methods
  • Very little human effort unless you are at the tool's limits
• Limitation: state-space explosion
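The slides do not name a particular model checker or input language; as an illustration only, here is a sketch assuming an SMV-style tool. Each coverage obligation is negated into a "trap" property; the checker then refutes the trap, and the counterexample it produces is a concrete trace that satisfies the original obligation and can be replayed as a test.

    def trap_properties(obligations):
        # Negate each obligation; a counterexample to the negation is a trace
        # that satisfies the obligation and can be replayed as a test case.
        return [f"LTLSPEC !({ob})" for ob in obligations]

    # Obligation taken from the G(Cues_On) example in the previous sketch.
    for spec in trap_properties(["(Cues_On) U ((Cues_On) & G(Cues_On))"]):
        print(spec)
    # Each LTLSPEC line is appended to the SMV translation of the model
    # before the model checker is run.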
Preliminary Evaluation
We were interested in determining:
• Feasibility of generating assertion-based tests from a set of assertions
  • Generated assertion-based tests to provide UFC coverage over the assertions
• Effectiveness of these test sets in validating the system model
  • Measured the MC/DC achieved by the test sets over the system model
We used three realistically sized examples:
• the Flight Guidance System (FGS), and
• two models related to the Display Window Manager system (DWM1 and DWM2)
Results
Ajitha Rajan, Michael Whalen, and Mats Heimdahl. Model Validation using Automatically Generated Requirements-Based Tests. In Proceedings of the 10th IEEE High Assurance Systems Engineering Symposium, Nov 2007.
Results and Analysis
• UFC test suites achieved high MC/DC over the DWM models, which have a well-defined set of assertions
• The test suite generated for UFC achieved very low MC/DC over the FGS model
Example: "When the FGS is in independent mode, it shall be active."
G(m_Independent_Mode_Condition.result → X(Is_This_Side_Active = 1))
The structure of the RSML-e macro Independent_Mode_Condition is not captured in the property:
Independent_Mode_Condition = ((Is_LAPPR_Active & Is_VAPPR_Active & IS_Offside_LAPPR_Active & Is_Offside_VAPPR_Active) | (Is_VGA_Active & Is_Offside_VGA_Active))
UFC over the property only has to exercise the single condition m_Independent_Mode_Condition.result, whereas MC/DC over the model must show the independent effect of each of the six basic conditions hidden inside the macro.
Benefits of ABT
• Saves time and effort in generating validation test suites from assertions
• An effective method for generating model-validation tests when the assertions are well defined
• Helps identify missing assertions and over-constrained models
Bonus Task – Adequacy of Conformance Testing
[Diagram: both the assertions and the model provide an adequacy measure for the conformance tests, which are run against the code; are the assertions useful here?]
• Direct assessment of how well the tests exercise the assertions
• Will expose defects of omission
• Assertion coverage could necessitate longer test cases than model coverage does
Assertion Coverage as an Adequacy Measure for Conformance Testing
Hypothesis 1 (H1): Conformance tests providing assertion UFC coverage are more effective than conformance tests providing MC/DC over the model.
Hypothesis 2 (H2): Conformance tests providing assertion UFC coverage in addition to MC/DC over the model are more effective than conformance tests providing only MC/DC over the model.
Experiment
We used four industrial systems:
• two models from the display window manager, and
• two models representing the mode logic of a flight guidance system.
We assessed the effectiveness of the test suites in terms of their fault-finding ability.
Ajitha Rajan, Michael Whalen, Matt Staats, and Mats Heimdahl. Requirements Coverage as an Adequacy Measure for Conformance Testing. To appear in Proceedings of the 10th International Conference on Formal Engineering Methods, Oct 2008.
Results – Hypothesis 1
Hypothesis 1 was rejected at 5% statistical significance on all but the Latctl system.
Analysis – Hypothesis 1
• Model coverage is better than assertion coverage for measuring the adequacy of conformance test suites
• Assertion UFC coverage is heavily dependent on the nature and completeness of the assertions
• The rigor and robustness of the assertion coverage metric used is important
  • The UFC metric gets cheated when assertions are structured to hide the complexity of conditions
Results – Hypothesis 2
Hypothesis 2 was accepted at 5% statistical significance on all but the DWM2 system.
Analysis – Hypothesis 2
Does UFC really help in revealing additional faults?
Summary – Bonus Task
• UFC > MC/DC: FALSE (3 of the 4 case examples, at 5% statistical significance)
• UFC + MC/DC > MC/DC: TRUE (3 of the 4 case examples, at 5% statistical significance)
• Combine rigorous metrics for assertion coverage and model coverage to measure the adequacy of conformance test suites
• The UFC metric is sensitive to the structure of the assertions
  • We need assertion coverage metrics that are robust to the structure of the assertions
Technology Readiness Level
• "Requirements-Based Test Generation Tool": TRL = 6 (system/subsystem model or prototype demonstration in a relevant environment)
• "Validation Adequacy Measurement Tool": TRL = 6 (system/subsystem model or prototype demonstration in a relevant environment)
Relevance to NASA
• MBD is here: an estimated one-half of all NASA missions in development or on the books will use model-based subsystem development
• Extensive use in the avionics industry
• How do we know the models are right? (the model validation problem)
• We provide the capability to:
  • Objectively measure the "quality" of assertion-based black-box validation tests
  • Objectively assess the completeness of a model (does the model address all assertions?)
  • Objectively assess the adequacy of a set of assertions (are there enough assertions to adequately describe the model?)
  • Automatically generate truly assertion-based tests
Achievements to Date
• Formal assertion notation identified
  • Most work with LTL; extended to work with Live Sequence Charts (LSCs)
• Objective validation metrics defined
  • Requirements, Antecedent, Unique First Cause, and Unique Cause
• Test-case generation tool developed
  • Tool generates tests from LTL and can target all of the defined metrics
  • Prototype tool working on LSCs developed
• Test-adequacy measurement tool developed for the defined validation metrics
• Evaluation of the metrics and tools
• 12 papers and one PhD dissertation (Ajitha Rajan)
Next Steps
• Investigate alternative requirements notations to LTL
• Complete the empirical evaluation of effectiveness in model validation
  • Flight Guidance System (FGS) evaluation complete
  • Display Manager (DM) evaluation in progress
• Coordinate evaluation on a NASA IV&V project
• Coordinate technology transfer
Discussion