
Critical Systems Engineering



Presentation Transcript


  1. Critical Systems Engineering • Processes and techniques for developing critical systems

  2. What is a system? • A collection of elements which are assembled to fulfil some defined purpose. Elements may be hardware or software components, organisational policies and procedures, and operational processes. • Systems have emergent properties, i.e. properties which only come to light when the parts are put together; they have structure and mechanisms for communication and control.

  3. Socio-technical computer-based systems • Systems in which some of the elements are software-controlled computers and which are used by people for some purpose. They typically include: • Computer hardware • Software • Policies and procedures • Operational processes

  4. Emergent properties • Properties of the system AS A WHOLE rather than of the individual parts. • Determined not solely by the properties of the system parts but also by the system’s structure. • Examples • The reliability of a computer depends on the reliability of the processor, memory, keyboard, monitor, disk, etc. • A mobile phone has the emergent property of being a communication device.

  5. Critical systems • A critical system is any system whose ‘failure’ could threaten human life, the system’s environment or the existence of the organisation which operates the system. • ‘Failure’ in this context does NOT mean failure to conform to a specification but means any potentially threatening system behaviour.

  6. Critical system classes • Safety-critical systems • A system whose failure may result in the loss of human life, injury or major environmental damage • Mission-critical systems • A system whose failure may result in the consequent failure of a goal-directed activity • Business-critical systems • A system whose failure may result in the failure of the business that is using that system

  7. The concept of dependability • For critical systems, it is usually the case that the most important system property is the dependability of the system • The dependability of a system reflects the user’s degree of trust in that system. It reflects the extent of the user’s confidence that it will operate as users expect and that it will not ‘fail’ in normal use • Usefulness and trustworthiness are not the same thing. A system does not have to be trusted to be useful

  8. Dimensions of dependability • (Diagram showing the principal dependability dimensions: availability, reliability, safety and security)

  9. Availability and reliability • Reliability • The probability of failure-free system operation over a specified time in a given environment for a given purpose • Availability • The probability that a system, at a point in time, will be operational and able to deliver the requested services • Availability and reliability are related but distinct • Availability takes into account the time that the system is out of service • Unreliable systems can have a high availability if there is a short restart time
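
  The last point is easiest to see with a worked example. The sketch below uses the standard steady-state formula availability = MTBF / (MTBF + MTTR); the failure and restart figures are illustrative assumptions, not taken from the lecture.

```python
# Illustrative availability calculation (figures are hypothetical, not from the slides).
# Steady-state availability = MTBF / (MTBF + MTTR), where
#   MTBF = mean time between failures, MTTR = mean time to repair/restart.

def availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Fraction of time the system is able to deliver the requested services."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# An unreliable system: fails on average every 10 hours,
# but restarts automatically in about 5 seconds (~0.0014 hours).
print(f"{availability(10, 5 / 3600):.6f}")   # ~0.999861 -> high availability

# A more reliable system with a slow, manual recovery taking 2 hours.
print(f"{availability(1000, 2):.6f}")        # ~0.998004 -> lower availability despite far fewer failures
```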

  10. Safety • Safety is a property of a system that reflects the system’s ability to operate, normally or abnormally, without danger of causing human injury or death and without damage to the system’s environment • It is increasingly important to consider software safety as more and more devices incorporate software-based control systems

  11. Safety and reliability • Safety and reliability are related but distinct • In general, reliability and availability are necessary but not sufficient conditions for system safety • Reliability is concerned with conformance to a given specification and delivery of service • Safety is concerned with ensuring that the system cannot cause damage, irrespective of whether or not it conforms to its specification

  12. Security • The security of a system is a system property that reflects the system’s ability to protect itself from accidental or deliberate external attack • Security is becoming increasingly important as systems are networked so that external access to the system through the Internet is possible • Security is an essential pre-requisite for availability, reliability and safety

  13. Damage from insecurity • Denial of service • The system is forced into a state where normal services are unavailable or where service provision is significantly degraded • Corruption of programs or data • The programs or data in the system may be modified in an unauthorised way • Disclosure of confidential information • Information that is managed by the system may be exposed to people who are not authorised to read or use that information

  14. Security and dependability • Security and availability • Systems that are insecure may be liable to denial of service attacks that compromise the availability of the system. • Security and reliability • Corruption of programs and data may mean that a system becomes unreliable and, possibly, unsafe. • Security and safety • Safety validation relies on demonstrating that a particular system is safe. Insecurities can result in changes to the system so we can no longer be confident in its safety

  15. Development for dependability • Use of formal methods for system specification • Use of formal verification to demonstrate that a program is consistent with its specification • Separate teams for implementation and testing • Incorporation of redundant code and self-checking in programs • Redundant hardware units • Measurement of test coverage
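
  To make the “redundant code and self-checking” item concrete, here is a minimal sketch: a critical value is computed by two independently written routines and cross-checked before it is used. The function names, the braking-distance scenario and the 5% tolerance are assumptions for illustration, not techniques prescribed by the lecture.

```python
# Minimal sketch of redundant computation with self-checking
# (function names, scenario and tolerance are illustrative assumptions).

def braking_distance_v1(speed_ms: float, deceleration: float) -> float:
    """Primary version: closed-form kinematics, d = v^2 / (2a)."""
    return speed_ms ** 2 / (2 * deceleration)

def braking_distance_v2(speed_ms: float, deceleration: float) -> float:
    """Diverse secondary version: numerically integrate the motion."""
    dt, distance, v = 0.001, 0.0, speed_ms
    while v > 0:
        distance += v * dt
        v -= deceleration * dt
    return distance

def checked_braking_distance(speed_ms: float, deceleration: float,
                             tolerance: float = 0.05) -> float:
    """Return a result only if both versions agree within the tolerance;
    otherwise raise, so the caller can move the system to a safe state."""
    d1 = braking_distance_v1(speed_ms, deceleration)
    d2 = braking_distance_v2(speed_ms, deceleration)
    if abs(d1 - d2) > tolerance * max(d1, d2):
        raise RuntimeError("Self-check failed: redundant results disagree")
    return max(d1, d2)  # use the more conservative (longer) estimate

print(checked_braking_distance(27.0, 7.5))  # ~48.6 m for ~100 km/h
```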

  16. Costs of increasing dependability • (Figure: graph of cost against dependability, with dependability increasing through the levels Low, Medium, High, Very high and Ultra-high)

  17. Key points • Computer-based systems are socio-technical systems which include hardware, software, operational processes and procedures and people. • An increasing number of socio-technical systems are critical systems • Systems have emergent properties i.e. properties which are only apparent when all sub-systems are integrated. • Critical system attributes are dependability attributes - reliability, availability, safety and security

  18. Key points • The dependability of a system reflects the user’s trust in that system • The availability of a system is the probability that it will be available to deliver services when requested • The reliability of a system is the probability that system services will be delivered as specified • Reliability and availability are generally seen as necessary but not sufficient conditions for safety and security • Insecure systems cannot be guaranteed to be available, reliable or safe
