Presentation Transcript


  1. Dependability benchmark applied to documents. Ana Maria Ambrosio. Partners: Eliane Martins, Fátima Mattiello-Francisco, Emília Villani, Marco Vieira, Henrique Madeira, Paulo Véras, Rodrigo Pastl, Marcelo Essado.

  2. Why propose a Dependability Benchmark for documents? What kind of documents? Space system requirement specifications.

  3. Motivation. Space software development processes still require reviews performed by people, so the evaluation of the review results depends on the skills of each reviewer. The proposed benchmark will provide a quantitative evaluation of the completeness of the document. It is focused on the particular needs of an On-Board Data Handling (OBDH) Software Requirement Specification document.

  4. Space software life-cycle processes, reviews, and documents. The life cycle comprises the system engineering processes related to software, the SW requirements & architecture engineering process, the SW design & implementation engineering process, and the SW delivery and acceptance process. Required reviews: SRR, PDR, DDR, CDR, QR, AR. Required documents: RB, TS, ICD, IRD, DJF, DDF. Ref. ECSS-E-ST-40C, Space Engineering - Software, ESA, March 2009.

  5. The benchmark context. Within the same life-cycle processes and reviews - SRR (Software Requirements Review), PDR (Preliminary Design Review), DDR (Detailed Design Review), CDR (Critical Design Review), QR (Qualification Review), AR (Acceptance Review) - the target of the benchmark is the On-Board Data Handling Requirement Specification document (RB).

  6. Dependability Benchmark for OBDH requirement documents (OBDH_DBench). Its elements: a checklist built from the PUS standard, from CoFI, and from a field study of common errors found in reviews; procedures; and measurements based on the number of "yes" answers and on the criticality of each question. Assessment and validation of the benchmark address repeatability, portability, representativeness, scalability, and simplicity. A sketch of a possible checklist representation follows.
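
A minimal sketch in Python of how one checklist entry of OBDH_DBench could be represented; the class names and the 1-to-3 criticality scale are assumptions not given in the slides, and the only measurement shown is the number of "yes" answers mentioned on slide 6.

```python
# Illustrative sketch only; names and the criticality scale are assumptions.
from dataclasses import dataclass
from enum import Enum


class Source(Enum):
    PUS = "PUS standard"
    COFI = "CoFI model"
    FIELD_STUDY = "field study"


@dataclass(frozen=True)
class ChecklistQuestion:
    text: str
    source: Source
    criticality: int  # assumed scale: 1 (low) to 3 (high)


def count_yes(answers: dict[ChecklistQuestion, bool]) -> int:
    """Basic measurement from slide 6: the number of 'yes' answers."""
    return sum(1 for answered_yes in answers.values() if answered_yes)
```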

  7. The checklist scope. PUS: Packet Utilization Standard (ECSS-E-70-41) - defines a set of services common to On-Board Data Handling software for satellites (ECSS: European Cooperation for Space Standardization). CoFI: Conformance and Fault Injection testing methodology - guides a tester in creating FSMs that represent the behavior of a system according to four different focuses: normal behavior, specified exceptions, sneak paths (inopportune events), and fault tolerance (hardware faults). Field study: common errors found in reviews of space on-board software; the field study was performed through the analysis of reports generated during reviews of different space software requirement documents. These three sources feed the checklist.

  8. The PUS standard focuses on ground and satellite systems operations through Telecommand (TC) and Telemetry (TM) packets. Ref. ECSS-E-70-41A, Space Engineering - Ground systems and operations - Telemetry and telecommand packet utilization, ESA, January 2003.

  9. The PUS STANDARD and its services

  10. Examples of questions obtained from the PUS standard. For each mandatory statement of the PUS standard, one or more questions were defined. Examples of questions for the TC verification service of the PUS: • Does the requirement specification define the telecommand verification service type as 1? • Does the requirement specification state that this service shall check whether the Packet Length field value is at least 16 octets and at most 65535 octets? • Does the requirement specification state that the code 0 of the failure acceptance report means "illegal APID (PAC error)"? One way of organizing this mapping is sketched below.
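
A hedged sketch of that statement-to-question mapping in Python; the clause labels are hypothetical placeholders, and only the question texts come from the slide.

```python
# Hypothetical clause labels; only the question texts are taken from the slide.
QUESTIONS_BY_PUS_STATEMENT: dict[str, list[str]] = {
    "TC verification / service type": [
        "Does the requirement specification define the telecommand "
        "verification service type as 1?",
    ],
    "TC verification / Packet Length check": [
        "Does the requirement specification state that this service shall "
        "check whether the Packet Length field value is at least 16 octets "
        "and at most 65535 octets?",
    ],
    "TC verification / failure acceptance report codes": [
        "Does the requirement specification state that the code 0 of the "
        "failure acceptance report means 'illegal APID (PAC error)'?",
    ],
}
```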

  11. CoFI: Conformance and Fault Injection testing methodology - used to create partial FSMs representing the behavior of a system according to four different focuses: normal behavior, specified exceptions, sneak paths (inopportune events), and fault tolerance (hardware faults). Together with the PUS standard and the field study of common errors found in reviews of space on-board software, it feeds the checklist. Ref. Ambrosio, A. M.; Martins, E.; Santiago, V.; Mattiello-Francisco, M. F.; Vijaykumar, N. L.; de Carvalho, S. V. "A methodology for designing fault injection experiments as an addition to communication systems conformance testing", Proceedings of the First Workshop on Dependable Software - Tools and Methods, DSN, July 2005, Yokohama, Japan.

  12. How can CoFI, a conformance and fault injection testing methodology, contribute to the checklist?

  13. CoFI - Conformance and Fault Injection - overview. From the textual specifications of the services, use cases and scenarios are identified (initialization, diagnosis, housekeeping, memory dump, hardware faults), and test cases plus fault cases are generated for the testing system.

  14. The models. For each service specification, one scenario model is built per focus: normal behavior (S1), specified exceptions (S2), sneak path (S3), ..., fault tolerance for processor, memory and communication faults (Sn). The inputs/outputs of the models are event names, commands for the fault injector (driven by the testing system), and timers and fault parameters. A possible representation of such a model is sketched below.
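
A minimal Python sketch, with assumed names, of the per-scenario FSM models just described: each model belongs to one of the four focuses, and its transitions carry an input event and the expected output.

```python
# Illustrative data model; class and field names are assumptions.
from dataclasses import dataclass, field
from enum import Enum


class ScenarioKind(Enum):
    NORMAL = "normal behavior"
    SPECIFIED_EXCEPTION = "specified exceptions"
    SNEAK_PATH = "sneak path (inopportune events)"
    FAULT_TOLERANCE = "fault tolerance (processor, memory and communication faults)"


@dataclass
class Transition:
    src: str     # source state
    event: str   # input: an event name or a command for the fault injector
    output: str  # expected output observed by the testing system
    dst: str     # destination state


@dataclass
class ScenarioModel:
    service: str                  # e.g. "TC verification"
    kind: ScenarioKind
    initial_state: str
    transitions: list[Transition] = field(default_factory=list)
```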

  15. Test case generation. Each model is an input to the ConDado tool; from each model (scenario) a set of test cases is generated, one sequence file (File.seq) per model. The CoFI test suite consists of the union of the test cases generated from each model. A naive stand-in for this step is sketched below.
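
ConDado itself is not reproduced here; the naive Python stand-in below only illustrates the shape of the step under stated assumptions: walk each scenario model, emit one input/expected-output sequence per loop-free path from the initial state, and take the union of the sequences of all models.

```python
# Illustrative stand-in for ConDado, not its actual algorithm.
# A transition is (source state, input event, expected output, destination state).
Transition = tuple[str, str, str, str]
TestCase = tuple[tuple[str, str], ...]  # sequence of (input, expected output) pairs


def sequences(state: str, transitions: list[Transition],
              visited: frozenset[str] = frozenset(),
              prefix: TestCase = ()) -> list[TestCase]:
    """Enumerate paths that never revisit a state; each maximal path is one test case."""
    paths: list[TestCase] = []
    for src, event, output, dst in transitions:
        if src == state and dst not in visited:
            step = prefix + ((event, output),)
            longer = sequences(dst, transitions, visited | {state}, step)
            paths.extend(longer if longer else [step])
    return paths


def cofi_suite(models: list[list[Transition]], initial: str = "S0") -> set[TestCase]:
    """The final suite is the union of the test cases generated from each model."""
    suite: set[TestCase] = set()
    for transitions in models:
        suite.update(sequences(initial, transitions))
    return suite
```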

  16. Example of a test case: Service = TesteData, Scenario = TF_memproces.

  17. Examples of questions obtained from the models created according to CoFI. Each possible transition of an FSM represents an expected input/output relationship and originates a question. Examples of questions obtained from the CoFI models representing the TC verification service of the PUS standard: • Is there any requirement which states unequivocally when the verification made by the telecommand verification service results in the sending of a failure report of checksum-acceptance? • Is there any requirement which defines the action of the service in the case of receiving a confirmation of "telecommand execution conclusion" from the application process instead of a confirmation of "execution starting"? • Does the requirement specification define the action of the service if some answer from the application process is not received? A question template following this idea is sketched below.
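
An illustrative Python template for deriving one question per transition; the wording, state names, and event names below are assumptions, loosely inspired by the TC verification examples above.

```python
def question_for_transition(service: str, state: str, event: str, output: str) -> str:
    """Turn one FSM transition into a checklist question (illustrative wording)."""
    return (f"Does the requirement specification of the {service} service define "
            f"that, when the service is in state '{state}' and the event "
            f"'{event}' occurs, it shall produce '{output}'?")


# Hypothetical state and event names:
print(question_for_transition(
    "telecommand verification",
    "waiting for execution-start confirmation",
    "execution-conclusion confirmation received",
    "a failure report",
))
```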

  18. Field study. A field study was performed through the analysis of reports generated during reviews of different space software requirement documents, looking for common errors found in reviews of space on-board software. Together with the PUS standard and CoFI, it feeds the checklist.

  19. Field study. Reports produced from formal reviews of space embedded software documents were analyzed; the errors found were classified, and new questions will be proposed. Ref. P. C. Véras, E. Villani, A. M. Ambrosio, N. Silva, M. Vieira, and H. Madeira, "Errors on Space Software Requirements: A Field Study and Application Scenarios", 21st IEEE International Symposium on Software Reliability Engineering (ISSRE 2010), 1-4 November 2010, San Jose, CA, USA, pp. 61-70, doi: 10.1109/ISSRE.2010.30.

  20. Error classification and experimental evaluation results

  21. Steps to improve the benchmark: • to improve the clarity of the questions; • to associate a criticality with each question, according to the severity of the aspect treated by that question; • to define how to measure the result of the benchmark, considering that each question has a weight; • to determine a minimum threshold value to decide whether the requirement specification is good enough to pass to the next project phase. A sketch of such a weighted measure follows.
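
A hedged Python sketch of the intended measurement, assuming criticality-based weights and a threshold value (0.9 below) that are purely illustrative; the slides define neither.

```python
def weighted_score(answers: list[tuple[bool, int]]) -> float:
    """answers: one (answered_yes, criticality_weight) pair per checklist question."""
    total = sum(weight for _, weight in answers)
    if total == 0:
        return 0.0
    return sum(weight for yes, weight in answers if yes) / total


def passes_to_next_phase(answers: list[tuple[bool, int]], threshold: float = 0.9) -> bool:
    """True if the weighted score reaches the minimum threshold (0.9 is illustrative)."""
    return weighted_score(answers) >= threshold
```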

  22. Other scenarios: • The proposed benchmark can be used as feedback to the development team. • The results of applying the benchmark provide a measure of completeness, robustness, and compliance with the standards followed (ECSS standards). • The negative answers given to the checklist can be used to improve these aspects in the requirement specification.

  23. Invitation: Fifth LADC - Latin-American Symposium on Dependable Computing, São José dos Campos, Brazil, 2011. http://www.inpe.br/ladc2011

  24. INPE's main activities: Space Science, Meteorology, Earth Observation, Space Engineering & Technology (e.g. the SCD2 and CBERS satellites). http://www.inpe.br

  25. INPE's branches: São Luís (MA), Eusébio (CE), Natal (RN), Cuiabá (MT), Brasília (DF), Cachoeira Paulista (SP), São José dos Campos (SP), São Martinho da Serra (RS), Santa Maria (RS), Atibaia (SP), CRAAM (SP).
