Project Overview
Wolfgang Herzner, Smart Systems Division
Objectives
• To significantly enhance testing and verification of dependable embedded systems
  • by means of automated generation of efficient test cases
  • relying on the development of new approaches as well as innovative integration of state-of-the-art techniques
  • Goal: reduce testing effort by at least 20%
• To address both testing of
  • non-functional issues like reliability, e.g. by system stress or overload tests
  • and functional safety tests
• To apply these technologies in large industrial systems,
  • simultaneously enabling application domain experts (with little knowledge of or experience in using formal methods) to use them with minimal learning effort
Approach
• Define common modelling languages and semantics for domain-specific requirements and (partial) models of the demonstrators
• Define a test theory that specifies
  • the conformance relation between the model and the implementation
  • the notion of success and failure of a test case
• Define fault models
  • and extend the modelling languages to allow the integration of the representation of faults into the (application) models
• Define new coverage criteria, and use existing TCG techniques to generate efficient test cases that achieve this coverage (see the sketch after this list)
• Use model-based fault injection (MBFI) to extend models for automatically calculating minimal cut sets
• Validate the defined fault models (and thus the generated test cases) with physical fault injection
• Use (bounded) model checking techniques to generate stress test scenarios
• Provide a framework for semantics-aware transformations from system models to inputs of specific tools, e.g. to enable interaction of generated models with existing simulation environments, allowing evaluation of model coverage
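As an illustration of the coverage-driven test-case generation idea above, the following minimal sketch derives one test per transition of a toy state-machine model. The model, its input names, and the transition-coverage criterion are chosen for the example only; they are not taken from the project's modelling languages or tools.

# Minimal sketch (not project code): transition-coverage test generation
# over a toy state-machine model, illustrating how test cases can be
# derived from a behavioural model against a coverage criterion.
from collections import deque

# Hypothetical model of a door controller: state -> {input: next_state}
MODEL = {
    "closed":  {"open_cmd": "opening"},
    "opening": {"opened": "open", "obstacle": "closing"},
    "open":    {"close_cmd": "closing"},
    "closing": {"closed": "closed", "obstacle": "opening"},
}

def tests_for_transition_coverage(model, initial="closed"):
    """Return one input sequence per transition, each reaching and firing it."""
    # Breadth-first search gives the shortest input sequence to every state.
    prefix = {initial: []}
    queue = deque([initial])
    while queue:
        state = queue.popleft()
        for inp, nxt in model[state].items():
            if nxt not in prefix:
                prefix[nxt] = prefix[state] + [inp]
                queue.append(nxt)
    # A test case = shortest path to the transition's source state + its input.
    return [prefix[src] + [inp]
            for src, trans in model.items()
            for inp in trans]

for tc in tests_for_transition_coverage(MODEL):
    print(tc)

Breadth-first search keeps each generated test as short as possible, which is one simple way of keeping the suite "efficient" in the sense of the objectives.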
Key Figures
• Duration: 36 months (Jan. 2008 – Dec. 2010)
• Costs: 4.4 M€ total, 3.1 M€ funding
• Efforts: 400.5 PM RTD, 48.5 PM demonstration, 12.0 PM management
Consortium
[Consortium map; partner roles: research organisation, university, tool developer, end user (industrial demonstrator)]
WP 1 – Requirements and State of the Art
• Tasks
  T1.1 Definition of Scope (select requirements relevant for modelling and TCG)
  T1.2 Preparation of Requirements and Support to Partners
  T1.3 Requirements Analysis and Features Identification
  T1.4 State of the Art Survey (know-how of all partners)
  T1.5 Demonstrators Background (including outline of first specification of the final demonstrators)
• Deliverables
  D1.1a (M 6) Requirements and Needed Model Elements for Tests and the Industrial Demonstrators (report)
    • including information about existing demonstrator (component)s, as well as a high-level spec. of the demonstrators (WP6/7)
  D1.2 (M 6) Survey on Test Case Generation Techniques (report)
  D1.1b (M 18) Updated version of D1.1a (report)
WP 2 - Framework
• Tasks
  T2.1 Tool Interoperability: model transformations (a minimal transformation sketch follows this slide)
    • from core modelling languages to analysis and TCG tools
    • between tools selected or newly developed in WP3 and WP4
    • to achieve
      • semantics preservation
      • requirements traceability
      • seamless integration into existing development processes
  T2.2 Interfaces to Existing Development and Testing Environments, e.g.
    • TRSS's ELEKTRA HMI (to visualise and operate the created models)
    • PROL's ELPULT or FFA's AVS (to apply generated test cases)
  T2.3 Framework Implementation (of model transformations)
    • (select environment, e.g. VIATRA, Eclipse?)
• Deliverables
  D2.1 (M 12) Framework Specification (report)
  D2.2a (M 18) First Framework Implementation (software, user guide, report)
  D2.2b (M 24) Pre-final Framework Implementation (software, user guide, report)
  D2.2c (M 36) Updated Framework Implementation (software, user guide, report)
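A rough sketch of the kind of model-to-tool transformation T2.1 describes. The tuple-based source model and the line-oriented target format are both invented for illustration; the real framework would translate the core modelling languages of WP3 into inputs for the analysis and TCG tools while preserving semantics and requirement traceability.

# Illustrative sketch only: a tiny model-to-tool transformation.
# Both the source format (tuples) and the target "test tool" text format
# are assumptions made for this example.

def model_to_tool_input(transitions):
    """Serialise (src, input, dst, requirement_id) tuples into a flat,
    line-oriented format, keeping requirement IDs for traceability."""
    lines = ["# generated from system model"]
    for src, inp, dst, req in transitions:
        lines.append(f"TRANSITION {src} --{inp}--> {dst}  ; traces to {req}")
    return "\n".join(lines)

model = [
    ("closed",  "open_cmd", "opening", "REQ-12"),
    ("opening", "opened",   "open",    "REQ-12"),
    ("opening", "obstacle", "closing", "REQ-27"),
]
print(model_to_tool_input(model))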
WP 3 - Modelling and Testing Theory
• Tasks
  T3.1 Modelling Languages
    • considering MDA (PIM → PSM)
  T3.2 Formal Verification of Application Specific Models (considering ontologies)
  T3.3 Fault Models, considering
    • systematic insertion of faults into the model (see the sketch after this slide)
    • fault meta-modelling
    • abstraction
  T3.4 Testing Theory
  T3.5 Test Coverage Criteria
• Deliverables
  D3.1a/b (M 12/24) Fault Models (report)
  D3.2a/b (M 18/30) Modelling Languages (report, software)
  D3.3a/b (M 18/30) Ontology-Based Model Verification (report, software)
  D3.4a/b (M 18/30) Testing Theories and Coverage Criteria (report)
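The following sketch illustrates the fault-modelling ideas of T3.3 under assumed, simplified structures: a small fault "meta-model" (a dataclass naming a fault kind and its target element) and a routine that systematically inserts each fault into a state-machine model. None of the names correspond to actual MOGENTES artefacts.

# Sketch of the fault-modelling idea in T3.3 (all names are illustrative):
# faults are described by a small "meta-model" (fault kind + target element)
# and systematically woven into an application model as mutated copies.
from dataclasses import dataclass

@dataclass(frozen=True)
class Fault:
    kind: str        # e.g. "stuck_at" or "drop_transition"
    target: str      # model element (state or input) the fault applies to

def inject(model, fault):
    """Return a mutated copy of a state-machine model with one fault inserted."""
    mutant = {s: dict(t) for s, t in model.items()}
    if fault.kind == "stuck_at":
        # every transition out of the target state leads back to it
        mutant[fault.target] = {i: fault.target for i in mutant[fault.target]}
    elif fault.kind == "drop_transition":
        # the target input is ignored in every state
        for trans in mutant.values():
            trans.pop(fault.target, None)
    return mutant

def all_mutants(model):
    """Systematically enumerate one mutant per (fault kind, model element)."""
    faults = [Fault("stuck_at", s) for s in model]
    faults += [Fault("drop_transition", i)
               for trans in model.values() for i in trans]
    return {f: inject(model, f) for f in set(faults)}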
WP 4 - Algorithms, Tools, and Fault Injection
• Tasks
  T4.1 Fault-based Methods for Test Case Generation
    • investigate whether fault effects produced with formal verification methods (e.g. model checking) can be used when generating test suites
  T4.2 Minimal Cut Sets Based Fault Injection (MCSBFI)
  T4.3 Combined Approach using Fault Injection and Formal Methods
    • using T4.1 and T4.2 results, e.g. mutation-based testing (see the sketch after this slide)
  T4.4 Model-based Fault Injection Mechanisms
    • develop FI methods on model elements (signals and operators)
  T4.5 Development of Tools
• Deliverables
  D4.1 (M 12) Fault-based Test-Case Generation Methods (report)
  D4.2 (M 18) Minimal Cut Sets Based Fault Injection (report, software)
  D4.3 (M 18) Mutation Testing Based Test Case Generation (report, software)
  D4.4 (M 24) Integrated Testing Methods and Techniques (report)
  D4.5 (M 30) Test Case Generators and Fault Injection Tools (report, software)
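To make the mutation-based approach of T4.1/T4.3 concrete, here is a hedged sketch: a candidate input sequence becomes a test case if it "kills" a fault-injected mutant, i.e. drives it into a different state trace than the original model. The bounded enumeration loosely mirrors how bounded techniques can search for distinguishing sequences; the model format and function names are illustrative, not MOGENTES tool interfaces.

# Hedged sketch of mutation-based test-case generation: keep a test case
# if it distinguishes a mutated model from the original one.
from itertools import product

def run(model, initial, inputs):
    """Execute an input sequence on a state machine; return visited states."""
    state, trace = initial, [initial]
    for inp in inputs:
        state = model[state].get(inp, state)   # unknown inputs are ignored
        trace.append(state)
    return trace

def killing_test(original, mutant, initial, alphabet, max_len=4):
    """Search bounded input sequences for one that distinguishes the mutant."""
    for length in range(1, max_len + 1):
        for seq in product(alphabet, repeat=length):
            if run(original, initial, seq) != run(mutant, initial, seq):
                return list(seq)   # this sequence is a fault-detecting test
    return None                    # mutant not killed within the bound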
WP 5 - Test-Case Generation and Assessment
• Tasks
  T5.1 Simulating Hardware Related Faults at Model Level
    • investigate whether results derived with physical fault injection could already be obtained at the modelling phase (see the sketch after this slide)
  T5.2 Modelling and TCG of Rail Signalling Control Systems (TRSS)
  T5.3 Modelling and Applying TCG Techniques for State Decoders of Railway Interlocking Systems (PROL)
  T5.4 Applying TCG Tools for ISOBUS Testing in Off-Highway Machines (RELAB)
  T5.5 Applying TCG for In-the-Loop Testing of In-Car Systems (FFA)
• Deliverables
  D5.1 (M 30) Simulating Hardware Related Faults at Model Level (report, test vectors)
  D5.2a/b (M 24/33) TCG for Functional and Stress Tests of Rail Signalling Interlocking Systems (report, test cases)
  D5.3a/b, D5.4a/b, D5.5a/b … analogous for T5.3, T5.4, T5.5
  D5.6 (M 36) Assessment report
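For T5.1, one way to emulate a hardware-related fault at model level is to replace an input signal with a stuck-at value and compare the model's behaviour with and without the fault; the toy brake-controller model and threshold below are invented purely for illustration.

# Illustrative sketch: emulating a hardware-level "stuck-at" fault on an
# input signal directly in a behavioural model, so its effect can later be
# compared with results from physical fault injection.

def brake_controller(speed_sensor):
    """Toy model: request braking whenever the measured speed exceeds 80."""
    return [("BRAKE" if v > 80 else "COAST") for v in speed_sensor]

def stuck_at(signal, value):
    """Model-level fault: the signal is frozen at a constant value."""
    return [value] * len(signal)

nominal = [70, 90, 100, 60]
print(brake_controller(nominal))               # ['COAST', 'BRAKE', 'BRAKE', 'COAST']
print(brake_controller(stuck_at(nominal, 0)))  # the fault masks all braking requests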
WP 6 – Railway Demonstrators (TRSS, PROL)
• Tasks
  T6.1 Setup of Railway Interlocking Demonstrator
  T6.2 Execution and Evaluation of Generated (Stress) Tests
  T6.3 Setup of Railway Objects State Decoder and Simulator Demonstrator
  T6.4 Execution and Evaluation of Test Cases Generated for State Decoder and Simulator
• Deliverables
  D6.1 (M 30) Setup of Railway Interlocking Demonstrator (HW/SW)
  D6.2 (M 30) Setup of Railway State Decoder Demonstrator (HW/SW)
  D6.3 (M 36) Test Results and Final Methods Evaluation of Railway Interlocking Demonstrator (report)
  D6.4 (M 36) Test Results and Final Methods Evaluation of Railway State Decoder Demonstrator (report)
WP 7 – Automotive Demonstrators (FFA, PROL)
• Tasks
  T7.1 Setup of the Automotive Demonstrator
  T7.2 Evaluation of Test Cases Generated for the Automotive Demonstrator
  T7.3 Setup of the ISOBUS Off-Highway Demonstrator
  T7.4 Evaluation of the TCG (for the ISOBUS Off-Highway Demonstrator)
• Deliverables
  D7.1 (M 30) Setup of Automotive Demonstrator (HW/SW)
  D7.2 (M 30) Setup of Off-Highway Demonstrator (HW/SW)
  D7.3 (M 36) Test Results and Final Methods Evaluation of Automotive Demonstrator (report)
  D7.4 (M 36) Test Results and Final Methods Evaluation of Off-Highway Demonstrator (report)
WP 8 - Dissemination and Exploitation
• Tasks
  T8.1 Dissemination
    • website
    • publications, workshop participation, …
  T8.2 Exploitation
    • mainly by industrial partners
  T8.3 Standardisation
    • input about result usability to functional safety standards
  T8.4 IPR Management
• Deliverables
  D8.1 (M 3) Project website installed and operative
  D8.2a/b/c (M 12/24/36) Dissemination report
  D8.3 (M 24) Public Workshop presenting MOGENTES Intermediate Results and panel discussion for stakeholder feedback
Schedule
[Gantt chart showing the deliverables of WP 1–WP 8 plotted over the 36-month project timeline]