
Component-Based Abstraction and Refinement


Presentation Transcript


  1. Component-Based Abstraction and Refinement Juncao Li (1), Xiuli Sun (2), Fei Xie (1), and Xiaoyu Song (2); (1) Dept. of Computer Science, (2) Dept. of ECE, Portland State University

  2. Agenda • Problems and Challenges • Component-Based Abstraction and Refinement • Case Studies and Evaluations • Conclusions and Future Work System Verification Laboratory, Portland State University

  3. Goal • Correctness of development and reuse in Component-Based Development (CBD) • In CBD: • A system is built from components • Components do not share state • Components communicate only through their interfaces System Verification Laboratory, Portland State University

  4. Problems of CBD • Components may share the same interface yet exhibit different behaviors • Informal (literal) specifications are not precise • Hard to determine whether the sub-components together implement the system functionalities • Example: explosion of the Ariane 5 rocket on June 4, 1996 (cost: $500 million) System Verification Laboratory, Portland State University

  5. Solution: Model Checking • Checking whether a given model conforms to given formal specifications • Model, e.g., hardware or software design • Formal specification, e.g., temporal logic formula System Verification Laboratory, Portland State University

  6. State Space Explosion • Model checking tries all possibilities • State space explosion • The possible states and execution paths of a real-world system can be far too numerous to explore exhaustively • Compositional reasoning for CBD • Decompose the system into modules • Check module properties locally • Derive system properties from the module properties • Potential to relieve the problem System Verification Laboratory, Portland State University

  7. Research Challenges • How to reuse verification efforts • Verified properties should not be checked again • How to build the abstraction for verification • Important to reduce the complexity • How to determine the causes for a compositional reasoning failure • Real error or abstraction inaccuracy System Verification Laboratory, Portland State University

  8. Our Contributions • Verification reuse • Verified properties as component abstractions • Automatic component-based abstraction algorithm • Mechanized assistant for abstraction refinement • Application • Co-verification of embedded systems: consider both HW and SW in verification System Verification Laboratory, Portland State University

  9. Agenda • Problems and Challenges • Component-Based Abstraction and Refinement • Case Studies and Evaluations • Conclusions and Future Work System Verification Laboratory, Portland State University

  10. Unified Component Model • Unifying hardware and software component models • Component = (Design, Interface, Properties) • HW, SW, and bridge components • Different design and interface specifications • Same property specification • Verified properties are associated with components (Diagram: software components connected via bridge components to hardware components) System Verification Laboratory, Portland State University

  11. Component Property • A property of a component C is a pair (p, A(p)) • p is a temporal assertion • A(p) is a set of assumptions (assumed properties) on the environment of C, i.e., the components interacting with C • p is verified assuming A(p) holds (Diagram: A(p) constrains the environment of C; p holds on C) System Verification Laboratory, Portland State University
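A minimal sketch in Python (names such as Property and Component are illustrative, not the tool's actual data model) of how the unified component model and its property pairs (p, A(p)) could be represented:

from dataclasses import dataclass

@dataclass(frozen=True)
class Property:
    assertion: str                          # the temporal assertion p
    assumptions: frozenset = frozenset()    # A(p): assumed properties on the environment

@dataclass
class Component:
    design: object        # e.g., an xUML model or a Verilog module
    interface: set        # message/signal types crossing the component boundary
    properties: tuple = ()  # verified (p, A(p)) pairs associated with the component

# Example: one software sensor property together with its assumption
output_ack = Property(
    assertion="After (Output) Never (Output) UnlessAfter (OP_Ack)",
    assumptions=frozenset({"After (Output) Eventually (OP_Ack)"}),
)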

  12. Example: Software Sensor Component (Diagram legend: xUML object instance, input message type, output message type, component boundary) System Verification Laboratory, Portland State University

  13. Software Sensor Properties
/* Properties on overall component functionality */
IfRepeatedly (C_Intr) Repeatedly (Output);
/* Properties on interactions with software components */
After (Output) Never (Output) UnlessAfter (OP_Ack);
/* Properties on interactions with hardware and responses to scheduling */
After (C_Intr) Eventually (C_Ret);
Never (C_Ret) UnlessAfter (C_Intr);
After (C_Ret) Never (C_Ret) UnlessAfter (C_Intr);
After (A_Intr) Eventually (A_Ret);
Never (A_Ret) UnlessAfter (A_Intr);
After (A_Ret) Never (A_Ret) UnlessAfter (A_Intr);
After (ADC.Pending) Never (ADC.Pending) UnlessAfter (A_Ret);
After (S_Schd) Eventually (S_Ret);
Never (S_Ret) UnlessAfter (S_Schd);
After (S_Ret) Never (S_Ret) UnlessAfter (S_Schd);
After (STQ.Empty = False) Never (STQ.Empty = False) UnlessAfter (S_Ret);
System Verification Laboratory, Portland State University

  14. Software Sensor Assumptions
/* Assumptions on interactions with software components */
After (Output) Eventually (OP_Ack);
Never (OP_Ack) UnlessAfter (Output);
After (OP_Ack) Never (OP_Ack) UnlessAfter (Output);
/* Assumptions on interactions with hardware and on scheduling */
After (C_Intr) Never (C_Intr+A_Intr+S_Schd) UnlessAfter (C_Ret);
After (ADC.Pending) Eventually (A_Intr);
Never (A_Intr) UnlessAfter (ADC.Pending);
After (A_Intr) Never (C_Intr+A_Intr+S_Schd) UnlessAfter (A_Ret);
After (A_Ret) Never (A_Intr) UnlessAfter (ADC.Pending);
After (STQ.Empty = FALSE) Eventually (S_Schd);
Never (S_Schd) UnlessAfter (STQ.Empty = FALSE);
After (S_Schd) Never (C_Intr+A_Intr+S_Schd) UnlessAfter (S_Ret);
After (S_Ret) Never (S_Schd) UnlessAfter (STQ.Empty = FALSE);
System Verification Laboratory, Portland State University

  15. Unify hardware and software semantics via translation (Diagram: Executable UML (xUML), with asynchronous interleaving message-passing semantics, is translated to S/R by an xUML-to-S/R translation; Verilog, with synchronous clock-driven semantics, is translated to S/R by a Verilog-to-S/R translation; both semantics map to the ω-automaton semantics of S/R, and semantic conformance is established for each translation) System Verification Laboratory, Portland State University

  16. Properties as Component Abstractions • Abstraction for checking p on the composition of C1 and C2: • a non-deterministic ω-automaton ω1 simulating the interface of C1, constrained by the properties of C1 that are related to p and whose assumptions hold • a non-deterministic ω-automaton ω2 simulating the interface of C2, constrained by the properties of C2 that are related to p and whose assumptions hold • a non-deterministic ω-automaton env simulating the interface of the composition's environment, constrained by the composition's environment assumptions related to p • Note: circular reasoning must be ruled out by appropriate compositional reasoning rules System Verification Laboratory, Portland State University
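A minimal sketch, with hypothetical helper names (related, assumptions_hold), of how such an abstraction could be assembled: each interface is modeled non-deterministically and constrained only by the verified properties related to p whose assumptions hold in the composition.

def build_abstraction(p, c1, c2, env_assumptions, related, assumptions_hold):
    # Constraints for the omega-automaton simulating the interface of C1
    constraints_c1 = [q for q in c1.properties
                      if related(q, p) and assumptions_hold(q, c2, env_assumptions)]
    # Constraints for the omega-automaton simulating the interface of C2
    constraints_c2 = [q for q in c2.properties
                      if related(q, p) and assumptions_hold(q, c1, env_assumptions)]
    # Constraints for the omega-automaton simulating the composition's environment
    constraints_env = [a for a in env_assumptions if related(a, p)]
    # The abstraction is the composition of the three non-deterministic interface
    # automata, each constrained by its constraint set; circular reasoning must
    # still be ruled out by the compositional reasoning rule in use.
    return constraints_c1, constraints_c2, constraints_env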

  17. Key Challenges in Abstraction (1) • What component properties are related? • Assertion-based verification (ABV) tends to introduce many properties • Construct a property dependency graph • Add dependency arcs of (q, A(q)) based on A(q) • Dependency analysis based on shared variables • Optimizations based on property templates • Differentiating safety and liveness properties • Utilizing template semantics to remove false arcs System Verification Laboratory, Portland State University
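A minimal sketch, under assumed representations, of the variable-based dependency analysis: an arc is added from (q, A(q)) to any property whose assertion mentions a variable or message appearing in one of q's assumptions; the template-based optimizations above would then prune false arcs.

from collections import defaultdict

TEMPLATE_KEYWORDS = {'After', 'Never', 'Eventually', 'UnlessAfter',
                     'Repeatedly', 'IfRepeatedly', 'Repeated', 'and'}

def variables_of(formula):
    # Hypothetical helper: the identifiers (variables/messages) mentioned in a formula
    tokens = set(formula.replace('(', ' ').replace(')', ' ').split())
    return tokens - TEMPLATE_KEYWORDS

def build_dependency_graph(properties):
    graph = defaultdict(set)
    for q in properties:
        for assumption in q.assumptions:
            needed = variables_of(assumption)
            for r in properties:
                # q depends on r if r's assertion may discharge part of this assumption
                if r is not q and needed & variables_of(r.assertion):
                    graph[q].add(r)
    return graph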

  18. Dependency Graph Example System Verification Laboratory, Portland State University

  19. Key Challenges in Abstraction (2) • What component properties can be included? • Properties have assumptions • Circular dependencies among properties • Enable component properties optimistically • Follow the dependency graph • Check whether their assumptions are satisfied • Assume that dependency cycles do not cause problems • Detect cycles of liveness properties • No cycle contains both safety and liveness properties • Cycles of safety properties are not a problem System Verification Laboratory, Portland State University
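A minimal sketch, under assumed data structures, of the liveness-cycle check used when enabling properties optimistically: the dependency graph is restricted to liveness properties and a depth-first search reports cycles in that subgraph (cycles of safety properties are harmless, and mixed cycles do not occur).

def detect_liveness_cycles(graph, is_liveness):
    # Restrict the dependency graph to liveness properties; a back edge found
    # during DFS then witnesses a liveness-only dependency cycle.
    sub = {n: [s for s in succs if is_liveness(s)]
           for n, succs in graph.items() if is_liveness(n)}
    cycles, visited, path, on_path = [], set(), [], set()

    def dfs(node):
        visited.add(node)
        path.append(node)
        on_path.add(node)
        for succ in sub.get(node, ()):
            if succ in on_path:
                # The nodes from succ to the current node form a liveness-only cycle
                cycles.append(list(path[path.index(succ):]))
            elif succ not in visited:
                dfs(succ)
        path.pop()
        on_path.discard(node)

    for node in list(sub):
        if node not in visited:
            dfs(node)
    return cycles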

  20. Automatic Abstraction Algorithm System Verification Laboratory, Portland State University

  21. Automatic Abstraction Algorithm (Cont.) System Verification Laboratory, Portland State University

  22. Mechanized Refinement Assistant • Unsatisfied assumptions of component properties • Identification • Breadth-first search on the dependency graph • All nodes marked “directly unsatisfied” and reachable from (true, {p}) only via “indirectly unsatisfied” nodes • Automatic remedies • Verify unsatisfied assumptions of identified properties • Manual remedies • Modify existing component properties • Introduce new component properties System Verification Laboratory, Portland State University
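A minimal sketch, with assumed node labels, of the identification step: a breadth-first search from the root node (true, {p}) that walks only through nodes marked "indirectly unsatisfied" and collects the "directly unsatisfied" nodes it reaches, i.e., the properties whose assumptions should be addressed first.

from collections import deque

def find_refinement_candidates(graph, root, status):
    # status(node) is one of 'satisfied', 'directly unsatisfied', 'indirectly unsatisfied'
    candidates, visited, queue = set(), {root}, deque([root])
    while queue:
        node = queue.popleft()
        for succ in graph.get(node, ()):
            if succ in visited:
                continue
            visited.add(succ)
            if status(succ) == 'directly unsatisfied':
                candidates.add(succ)      # report it, but do not search beyond it
            elif status(succ) == 'indirectly unsatisfied':
                queue.append(succ)        # keep searching through it
    return candidates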

  23. Mechanized Refinement Assistant (Cont.) • Liveness property dependency cycles • Identification • Done in abstraction algorithm • Automatic remedies • Exclude properties on the cycles • Apply CR rules with automatic checks [Amla, et al. 01] • Manual remedies • Conduct temporal inductions [McMillan 99] • Modify component properties involved System Verification Laboratory, Portland State University

  24. Agenda • Problems and Challenges • Component-Based Abstraction and Refinement • Case Studies and Evaluations • Conclusions and Future Work System Verification Laboratory, Portland State University

  25. Bottom-Up Verification of Basic Components Verification of primitive HW/SW components • Direct application of model checking • Verification of the properties shown earlier • Properties are verified under their assumptions System Verification Laboratory, Portland State University

  26. Top-Down Verification of Basic Sensor System • Properties of bridge components • Derived from the properties of the HW/SW components they connect • Verified in 3.76 seconds using 6.03 MB and in 0.66 seconds using 4.07 MB, respectively • System property (repeated transmission): Repeated (H-NET.flag); Repeated (H-NET.flag = False) • Verified on an abstraction constructed from component properties • Using 0.1 seconds and 3.40 MB (Diagram: S-SEN and S-NET software components connected via bridges to the H-SEN, H-CLK, and H-NET hardware components) System Verification Laboratory, Portland State University

  27. Top-Down Verification of Basic Sensor (Cont.) • Abstraction construction and verification • No verified component properties are included • The property does not hold on the abstraction • Abstraction refinement • Introducing and verifying new component properties • Facilitating detection of design errors • No-consecutive-1's property: Never ((S-NET.RFM.Rev=1) and (S-NET.RFM.Buf=1) and (S-NET.RFM.Status=Transmitting)) (Diagram: S-SEN and S-NET software components connected via bridges to the H-SEN, H-CLK, and H-NET hardware components) System Verification Laboratory, Portland State University

  28. Top-Down Verification of Multi-Sensor System • Properties of the new bridge component • Derived from the properties of the HW/SW components it connects • Verified in 10.24 seconds using 6.05 MB • System property (repeated transmission): Repeated (H-NET.flag); Repeated (H-NET.flag = False) • Verified on an abstraction constructed from component properties • Using 0.1 seconds and 3.40 MB (Diagram: S-SEN and S-NET software components connected via bridges to the H-CLK, H-SEN 1, H-SEN 2, and H-NET hardware components) System Verification Laboratory, Portland State University

  29. Top-Down Verification of Encryption-Enabled Sensor • Properties of S-ENC and H-ENC • Verified in 0.24 seconds using 3.57 MB and in 0.22 seconds using 3.39 MB, respectively • Properties of the new bridge component • Verified in 10.24 seconds using 6.05 MB • System property (repeated transmission): Repeated (H-NET.flag); Repeated (H-NET.flag = False) • Verified on an abstraction constructed from component properties • Using 0.1 seconds and 3.40 MB (Diagram: S-SEN, S-ENC, and S-NET software components connected via bridges to the H-CLK, H-SEN, H-ENC, and H-NET hardware components) System Verification Laboratory, Portland State University

  30. Integrated Verification of New Reusable Components • A new reusable composite component, the encryption-enabled network, is constructed bottom-up
Properties:
IfRepeatedly (Raw) Repeatedly (HNET.flag);
IfRepeatedly (Raw) Repeatedly (HNET.flag=False);
After (Raw) Eventually (Raw_Ack);
Never (Raw_Ack) UnlessAfter (Raw);
After (Raw_Ack) Never (Raw_Ack) UnlessAfter (Raw);
Assumptions:
After (Raw) Never (Raw+E_Intr+N_Schd+R_Intr) UnlessAfter (Raw_Ack);
After (E_Intr) Never (Raw+E_Intr+N_Schd+R_Intr) UnlessAfter (E_Ret);
After (N_Schd) Never (Raw+E_Intr+N_Schd+R_Intr) UnlessAfter (N_Ret);
After (R_Intr) Never (Raw+E_Intr+N_Schd+R_Intr) UnlessAfter (R_Ret);
(Diagram: S-ENC and S-NET software components connected via bridges to the H-ENC and H-NET hardware components)
System Verification Laboratory, Portland State University

  31. Scalability Evaluation on Small-Size Systems • Verification of the repeated transmission property on three systems • Conducted on a Sun workstation with a 1 GHz CPU and 2 GB of memory • CBCV: time (or memory) usage = the sum (or maximum) of the time (or memory) usages for verifying the new components and the abstractions System Verification Laboratory, Portland State University
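A minimal sketch of the CBCV cost measure stated above: total time is the sum, and peak memory the maximum, over the runs that verify the new components and the abstractions. The example figures are the ones reported for the basic sensor system on slide 26.

def cbcv_cost(runs):
    # runs: iterable of (time_seconds, memory_mb) pairs, one per verification run
    times, memories = zip(*runs)
    return sum(times), max(memories)

# Two bridge components plus the system abstraction of the basic sensor system
time_total, mem_peak = cbcv_cost([(3.76, 6.03), (0.66, 4.07), (0.1, 3.40)])
# time_total is 4.52 seconds, mem_peak is 6.03 MB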

  32. Agenda • Problems and Challenges • Component-Based Abstraction and Refinement • Case Studies and Evaluations • Conclusions and Future Work System Verification Laboratory, Portland State University

  33. Conclusions and Future Work • An important step towards component-based HW/SW co-verification of embedded systems • Preliminary results are promising • Achieved major verification reuse • Led to order-of-magnitude verification reductions • Future work • Heuristics for automating property formulation • Further evaluation and cost quantification System Verification Laboratory, Portland State University

  34. Further Information • Website: • http://www.cs.pdx.edu/~xie/co-ver/co-ver-home.htm • Email: • juncao@cs.pdx.edu Questions? System Verification Laboratory, Portland State University
