  1. VTT-STUK assessment method for safety evaluation of safety-critical computer-based systems - application in the BE-SECBS project

  2. CONTENTS 1. Method description 2. Qualitative assessment 3. Quantitative evaluation 4. Conclusions

  3. 1. METHOD DESCRIPTION

  4. 1. METHOD DESCRIPTION Quantitative analysis 1. Development of the map of evidence
  • identification of the pieces of evidence to be included in the model (actually a result of the qualitative safety evaluation)
  • identification of the relationships between the pieces of evidence by engineering judgement
  • identification of the relationship between the evidence and the failure probability of the system
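As a rough illustration of what such a map might look like in code (the actual map of evidence is not reproduced here; all node names below are hypothetical placeholders), the relationships can be captured as directed edges between evidence variables:

```python
# A minimal sketch of a "map of evidence" as a directed graph.
# All node names are invented, not the evidence identified in BE-SECBS.
evidence_map = [
    # (source influences target)
    ("req_spec_quality", "design_quality"),
    ("design_quality", "code_quality"),
    ("code_quality", "system_failure_prob"),
    ("test_coverage", "system_failure_prob"),
    ("operational_experience", "system_failure_prob"),
]

for src, dst in evidence_map:
    print(f"{src} -> {dst}")
```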

  5. 1. METHOD DESCRIPTION Quantitative analysis 2. Definition of the structure of the Bayes network model
  • definition of the variables (or nodes) of the model, i.e. the variables measuring the degree of quality of the evidence analysed in the tasks of the qualitative analysis
  • definition of the rating scale for each variable (usually a discrete or ordinal scale), by expert judgement
  • definition of the probabilistic relationships and dependencies between the variables (e.g. the relationship between coverage of testing and failure probability), by expert judgement
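A sketch of such a structure definition, here using the pgmpy library with invented variable names and rating scales (the real model's variables came from the qualitative life-cycle analysis):

```python
from pgmpy.models import BayesianNetwork

# Hypothetical structure: each edge is a dependency elicited by
# expert judgement, e.g. test coverage influencing failure probability.
model = BayesianNetwork([
    ("req_spec_quality", "code_quality"),
    ("code_quality", "system_failure_prob"),
    ("test_coverage", "system_failure_prob"),
])

# Discrete/ordinal rating scales for each variable, as a panel might choose.
rating_scales = {
    "req_spec_quality": ["poor", "adequate", "good"],
    "code_quality": ["poor", "adequate", "good"],
    "test_coverage": ["low", "medium", "high"],
    "system_failure_prob": ["low", "high"],
}
```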

  6. 1. METHOD DESCRIPTION Quantitative analysis 3. Quantification of the model
  • quantification of the variable ratings by expert judgement
  • quantification of the needed probability distributions by expert judgement (i.e. quantification of the weight of the different pieces of evidence)
  • propagation of uncertainties through the Bayes network model
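The sketch below quantifies a two-node fragment of such a model, with invented numbers standing in for the expert-elicited distributions, and then propagates them with exact inference. It is illustrative only, not the project's actual quantification:

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

model = BayesianNetwork([("test_coverage", "failure_prob")])

# Prior rating of test coverage (invented expert-panel weights).
cpd_cov = TabularCPD(
    "test_coverage", 3, [[0.2], [0.5], [0.3]],
    state_names={"test_coverage": ["low", "medium", "high"]},
)

# P(failure_prob | test_coverage); columns follow low, medium, high.
cpd_fail = TabularCPD(
    "failure_prob", 2,
    [[0.70, 0.90, 0.99],   # P(failure_prob = "low"  | coverage)
     [0.30, 0.10, 0.01]],  # P(failure_prob = "high" | coverage)
    evidence=["test_coverage"], evidence_card=[3],
    state_names={"failure_prob": ["low", "high"],
                 "test_coverage": ["low", "medium", "high"]},
)

model.add_cpds(cpd_cov, cpd_fail)
assert model.check_model()

# Propagation of uncertainty: marginal posterior on the failure node.
print(VariableElimination(model).query(["failure_prob"]))
```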

  7. 1. METHOD DESCRIPTION Quantitative analysis 4. Interpretation of the results
  • sensitivity analyses
  • importance analyses and analysis of the need for further data or information
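One simple way to sketch the sensitivity analysis, continuing the hypothetical model from the previous sketch: fix one evidence variable to each of its states in turn and observe how the failure-probability node responds:

```python
from pgmpy.inference import VariableElimination

# Assumes the `model` object from the previous sketch.
infer = VariableElimination(model)
for state in ["low", "medium", "high"]:
    posterior = infer.query(["failure_prob"],
                            evidence={"test_coverage": state})
    print(f"test_coverage = {state}:")
    print(posterior)
```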

  8. 2 QUALITATIVE ASSESSMENT
  • the evidence from each life-cycle phase was analysed qualitatively on the basis of the material provided by the vendor
  • additional information was requested during two meetings with the vendor
  • no formal/computer-based tools were used
  • the material was compared with the requirements of the STUK regulatory guide YVL 5.5
  • the aim of the qualitative analysis was to create "a map of evidence"
  • evidence from the platform development process was analysed separately

  9. 2 QUALITATIVE ASSESSMENT Analysis of the platform software
  • Three types of information were introduced to the assessment teams:
  • overall documentation of the platform
  • reference list of the operational experience
  • references to the type tests

  10. 2 QUALITATIVE ASSESSMENT Analysis of the platform software
  • Analysis of operating experience
  • detailed knowledge of the operational experience of the software was presented to the assessment team; additional information was obtained during assessment meetings
  • operational experience of the platform is stored in a database together with development proposals and requests; the vendor was asked to demonstrate the use of the database, but detailed analyses were not possible

  11. 2 QUALITATIVE ASSESSMENT Analysis of the platform software
  • Analysis of type-testing documentation
  • the quality of the type-testing documentation was evaluated (partially with the vendor personnel)
  • Analysis of development tools

  12. 2 QUALITATIVE ASSESSMENT The life-cycle phases analysed
  • requirement specification
  • concept design (also called system specification in the documentation)
  • detailed design
  • code generation
  • SIVAT simulation testing
  • code compiling and linking
  • testing

  13. 2 QUALITATIVE ASSESSMENT Analysis of the requirement specification
  • critical analysis of the documentation
  • analysis of the different blocks used in the block-diagram presentation
  • analysis with respect to IEEE 830
  • relevant events, response times, input and output signals, relation to other safety functions, signal identification scheme
  • analysis of redundancy issues
  • specification of system states
  • ambiguity of notation

  14. 2 QUALITATIVE ASSESSMENT Analysis of the application software development process
  • definition of the application development life cycle
  • the assessment was made against software engineering references
  • the end product is software, and the development process is analogous to the software engineering process
  • software engineering standards offer rigorous references for assessment purposes (IEC 60880)
  • existence of V&V checklists etc.
  • existence of quality targets
  • existence of quality guidelines

  15. 2 QUALITATIVE ASSESSMENT Analysis of concept design
  • dependence of the concept design on the platform: description of the consequences of the assumptions about the platform properties
  • documentation of the selection of design solutions
  • documentation of the test plan
  Analysis of detailed design
  • documentation of design solutions
  • documentation of the verification procedures

  16. 2 QUALITATIVE ASSESSMENT Analysis of code implementation and generation
  • analysis of resource metrics
  • analysis of simulation testing
  • analysis of the possibilities for engineers to influence the end result
  C-code simulation testing
  • analysis of documentation practices and test strategy
  • functional coverage of testing
  Code compiling and linking
  • tools and their operational experience

  17. 2 QUALITATIVE ASSESSMENT Analysis of testing
  • although actual test results do not exist for the benchmark, information was gathered about the 'typical' test execution for assessment purposes
  • test strategy, test acceptance criteria, test coverage analysis

  18. 3 QUANTITATIVE EVALUATION
  • the "Bayes network model" is based on the life cycle of the system
  • the model was created in an expert judgement process
  • the quality characteristics were quantified in expert panels
  • the quantification is based on the observations from the qualitative analysis
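One plausible way to turn panel judgements into the model's numbers (assumed for illustration; the source does not document the aggregation rule used) is to average the individual experts' probability estimates with equal weights:

```python
import numpy as np

# Invented example: three experts each estimate
# P(failure_prob = low/high | test_coverage = "high").
expert_estimates = np.array([
    [0.98, 0.02],  # expert 1
    [0.99, 0.01],  # expert 2
    [0.97, 0.03],  # expert 3
])

# Equal-weight aggregation into one CPT column.
panel_column = expert_estimates.mean(axis=0)
print(panel_column)  # [0.98 0.02]
```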

  19. The Bayes network (figure)

  20. 3 QUANTITATIVE EVALUATION
  • a "good standard" for rating the steps of the life cycle is needed
  • the reliability estimates were not determined
  • not enough information
  • many uncertainties => difficult to interpret the evidence => difficult to make probability estimates

  21. 4. CONCLUSIONS
  • the analysis of the (system and) application development was mainly qualitative
  • no software-based tools were used
  • a map of evidence was created (as a Bayes network)
  • this makes it possible to see the relationships between the different pieces of evidence
  • the quality of the requirement specification is important in the case of automated code generation
  • quantitative analysis was experimented with
