Presentation Transcript


  1. CmpE 550 – Advanced Software Engineering “Scenario-Based Assessment of Nonfunctional Requirements” Selim Özyılmaz & Elif Sürer

  2. Introduction • Purpose: validation of a requirements specification using scenarios • “Scenarios are applied to the analysis of non-functional requirements using dependency tables to assess the relationship between goals (functional and non-functional requirements) and the agents and tasks that achieve them in the i* language”

  3. Introduction (Cont’d) • Many NFRs are influenced by human properties, so they inherit the diverse nature of human characteristics (e.g., system reliability is influenced by human characteristics such as ability) • This work is a revised version of a previous one that prompted designers with questions about potential problems in a scenario event sequence (a psychology-based taxonomy of failure)

  4. Introduction (Cont’d) • The problem was that too many scenario variations were generated. To prevent this, the taxonomy of human and system failures is transformed into a model that predicts errors in the system design. • Bayesian Belief Nets (BNs) are used to predict reliability.

  5. Related Work • Model-checking techniques have been used extensively to verify and validate requirements. • A communication problem occurs between user-stakeholders and the model developers. Software Cost Reduction (SCR) [tabular representation]

  6. Related Work • A combination of visualizations, examples, and simulations is necessary to explain complex requirements to end users. • Scenario-based representations and animated simulations help users see the implications of system behavior and, thereby, improve requirements validation.

  7. Related Work • Animation simulation tools: by Dubois et al. in the ALBERT II language • The KAOS language and supporting GRAIL tool enable formal reasoning about dependencies between the goal model, system behavior, and constraints. • The animator-validator tool TROLL uses a formal object-oriented language for modeling information systems.

  8. Related Work • What is a sufficient set of scenarios to enable validation to be completed with confidence? • While we believe there is no quick answer to this vexing problem, one approach is to automate the process as far as possible so more scenarios can be tested.

  9. Related Work • Intent specifications provide a hierarchical model to facilitate reasoning about system goals and requirements in safety-critical systems. • Goals are decomposed in a means-ends hierarchy, widely practiced in requirements engineering. • Automated support for reasoning about conflicting system states and behavior is provided by the SpecTRM-RL tool.

  10. Related Work • Assessment of nonfunctional system requirements, such as system reliability, has to use probabilistic reasoning since the range of potential system behaviors is either unknown, in the early requirements phase, or too large to specify. • Bayesian Nets (BNs) have been developed to assess software quality from properties of the code and the software engineering process.

  11. Related Work • BNs have been widely applied as a probabilistic reasoning technique in software engineering and other domains; however, previous work used single nets to evaluate a set of discrete states pertaining to a software product or development process. • More automated tools for scenario analysis of NFR conformance in requirements specifications with multiple BN tests have been developed.

  12. Modeling Uncertainty • There are four candidate techniques for modeling uncertainty: • Bayesian Probability • Dempster-Shafer • Fuzzy Sets • Possibility Theory • Bayesian probability offers an easier combination of multiple influences on probability than Dempster-Shafer and sounder reasoning than fuzzy sets. • Bayesian probability provides a decision theory of how to act on the world in an optimal fashion under circumstances of uncertainty.

  13. Bayesian Belief Nets • Directed acyclic graphs of causal influences, where the nodes represent the variables and the arcs represent the relationships between variables. • Variables can have any number of states in a BN, so the choice of measurement is left to the analyst. • Network Probability Table (NPT)
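As a small illustration of the slide above, here is a minimal, self-contained sketch (not the paper's HUGIN models) of a two-node BN: a parent node for agent ability influences a child node for task error through a probability table, and the child's marginal is obtained by summing over the parent's states. All numbers are made up for the example.

```python
# Minimal two-node Bayesian Belief Net sketch (illustrative only; the SRA tool
# uses HUGIN models with many more nodes and states).

# Prior distribution of the parent node "ability".
p_ability = {"low": 0.3, "high": 0.7}

# Probability table for the child node "task_error", conditioned on "ability".
p_error_given_ability = {
    "low":  {"error": 0.4, "ok": 0.6},
    "high": {"error": 0.1, "ok": 0.9},
}

def marginal_error() -> float:
    """Propagate the prior through the table: P(error) = sum_a P(error | a) * P(a)."""
    return sum(p_error_given_ability[a]["error"] * p_ability[a] for a in p_ability)

if __name__ == "__main__":
    print(f"P(task error) = {marginal_error():.2f}")  # 0.4*0.3 + 0.1*0.7 = 0.19
```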

  14. BN (Cont’d) • Input evidence is propagated through the network, updating the values of the other nodes (computationally complex, but effective algorithms exist)

  15. BN (Cont’d) • Three possible ways to compute the probability of each node in the network: • Input a posterior probability into a BN as a prior observation; each run should be assumed to be independent • Combine output probabilities from a sequential run using a summarizer; the probabilities of a particular run have to be set initially • The output probability of each event is compared with a threshold value; if it is surpassed, success, else failure • This pinpoints the steps in the scenario which are weak in reliability • Sensitivity analysis can be carried out with multiple BN runs by varying environmental variables
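A minimal sketch of the first option above, under the slide's independence assumption: the posterior from one task step is fed back in as the prior observation for the next step. The function and the per-step values are illustrative stand-ins, not the tool's actual BN engine.

```python
# Illustrative chaining of BN runs over a scenario; run_bn() stands in for a
# real BN engine such as HUGIN, and the penalties are made-up values.

def run_bn(prior_reliability: float, step: str) -> float:
    """Hypothetical BN run: returns the posterior reliability for one task step."""
    step_penalty = {"load": 0.95, "arm": 0.90, "check": 0.98}
    return prior_reliability * step_penalty.get(step, 0.90)

def scenario_reliability(steps, prior: float = 1.0) -> float:
    """Enter each step's posterior as the next step's prior observation."""
    p = prior
    for step in steps:
        p = run_bn(p, step)
    return p

print(f"{scenario_reliability(['load', 'arm', 'check']):.3f}")
```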

  16. BN Model of System Reliability • Influencing factors are divided into two: • Slips: attention-based lapses and omissions in skilled behavior • Mistakes: failures in plans and knowledge of processing • System environmental variables have an indirect effect on an individual’s ability, whereas organizational factors (management culture) have a direct effect. • High cognitive complexity → more prone to mistakes • High physical complexity → more prone to slip errors

  17. System Reliability • Remarks • Input variables are all discrete states • The BN is run with a range of scenarios that stress-test the system design against operational variables. • Scenarios can either be taken from domain-specific operational procedures or obtained by interviewing users • Two different outputs: slips and mistakes

  18. Operational Performance Time • Uses the same variables as the reliability case; the difference is that there is a single output node instead of two. • The output is used to increase the best-case task completion time to reflect the less-than-ideal properties of human and machine agents. • ET: estimated time, BT: best-case task completion time • To reflect the case of reverting to manual operation when an automated technology fails, the worst completion time of highly automated tasks is set greater than that of manual tasks.
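One way to read the second bullet above (an assumption for illustration, since the slide does not give the exact formula): treat the BN output as a degradation factor d applied to the best-case time, ET = BT × (1 + d), so a fully favorable run (d = 0) leaves the best-case completion time unchanged, while unfavorable human or technology properties inflate ET toward the worst-case time.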

  19. SRA System Architecture • Analysis starts with selecting the i* model to be evaluated and creating the test scenarios. • Scenarios are narratives taken from real-life experience describing the operation of similar systems, from which event sequences are extracted.

  20. SRA Tool’s Components • The Session Controller implements the user command interface for selecting designs and scenarios and executes the algorithm that assesses a set of scenarios with the BNs. It calls the system reliability or operational performance BN assessors to execute the BN runs with all possible environmental combinations.

  21. SRA Tool’s Components • The i* model editor allows interactive construction of i* models with typical CASE-tool functions. • The Interactive Scenario Constructor produces test scenarios from the system model based on user directions. Scenarios are stored in a database as an array of tuples.

  22. SRA Tool’s Components • The Model Controller controls the BN models. It selects the appropriate BN model for each task step, then populates the input nodes, runs the model, and receives the belief distributions of the output nodes. • The Model Controller also manages the back-propagation of the BN model to identify required technology and agent characteristics.

  23. SRA Tool’s Components • The BN assessor modules run the net by calling the HUGIN algorithm for each task step and for each set of environmental-variable combinations. • The output from each run is compared with the desired NFR threshold, and the survivor runs are passed to the results visualizer.
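A rough sketch of the assessor loop described above: one run per task step per combination of environmental settings, keeping only runs that meet the NFR threshold. The run_bn() function is a stand-in for the HUGIN call, and the variable names, settings, and threshold are illustrative assumptions rather than the tool's actual interface.

```python
from itertools import product

# Illustrative environmental variables and NFR threshold (assumed values).
ENV_SETTINGS = {
    "time_pressure": ["low", "high"],
    "workload": ["low", "high"],
}
NFR_THRESHOLD = 0.85  # required probability of success

def run_bn(step: str, env: dict) -> float:
    """Stand-in for a HUGIN run returning P(success) for one task step."""
    base = {"position_palette": 0.95, "load_weapon": 0.90}.get(step, 0.90)
    penalty = 0.05 * sum(1 for v in env.values() if v == "high")
    return base - penalty

def assess(scenario_steps):
    """One BN run per task step per environmental combination; keep survivors."""
    survivors = []
    for step in scenario_steps:
        for values in product(*ENV_SETTINGS.values()):
            env = dict(zip(ENV_SETTINGS, values))
            p = run_bn(step, env)
            if p >= NFR_THRESHOLD:  # compare with the desired NFR threshold
                survivors.append((step, env, p))
    return survivors

print(assess(["position_palette", "load_weapon"]))
```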

  24. SRA Tool’s Components • The Visualizer provides a visual summary of all qualified BN runs for a set of scenarios for one or more system designs. • This enables different designs to be compared and problem areas in the requirements to be identified, i.e., task/technical-component combinations which show low potential NFR assessments. The Visualizer displays results at three levels: System, Scenario, and Phase views.

  25. NFR Analysis Method • Remarks • Scenarios are composed of a number of phases, and each phase is composed of a number of task steps, each modeled as <Agent, Task, Technology> • Phases are used to structure task sequences that fulfill a higher-order goal. • The best design is generally the one with the most surviving BN runs. • The best design also needs to be resilient to environmental conditions.
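The <Agent, Task, Technology> structure on this slide can be pictured with a small data-model sketch; the class and field names are illustrative, not the tool's actual schema, which stores these tuples in its own database.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TaskStep:
    agent: str        # who performs the step (e.g., a weapons assembler)
    task: str         # what is done in the step
    technology: str   # which technical component is used

@dataclass
class Phase:
    goal: str               # higher-order goal the phase fulfills
    steps: List[TaskStep]

@dataclass
class Scenario:
    name: str
    phases: List[Phase]

# Example scenario with one phase and one <Agent, Task, Technology> step.
scenario = Scenario(
    name="Weapon loading",
    phases=[Phase(goal="Prepare weapon",
                  steps=[TaskStep("WA", "position palette", "Autoload palette")])],
)
```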

  26. NFR Analysis Method • Impact of environmental variables: • (1) Survivor runs with ERx set to the best case • (2) Total survivor runs for all settings • If an overall design or a particular task fails to meet the threshold, back-propagation analysis is used to discover the setting necessary to achieve the NFR value. • All input nodes are unconstrained (calculation for all) • One/few inputs are unconstrained (calculation for the unconstrained ones)
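One plausible reading of the two counts above (the slide does not state the exact metric): express the impact of an environmental variable ERx as the share of survivor runs obtained when that variable is at its best-case setting, relative to all survivor runs. A minimal sketch, assuming the survivor-run records produced by the assessor loop:

```python
# Illustrative impact measure: survivor runs with ER_x at its best-case setting,
# as a fraction of survivor runs over all settings.
def impact(survivors, var: str, best_value: str) -> float:
    best_case = [s for s in survivors if s["env"][var] == best_value]
    return len(best_case) / len(survivors) if survivors else 0.0

survivors = [
    {"env": {"workload": "low", "time_pressure": "low"}},
    {"env": {"workload": "low", "time_pressure": "high"}},
    {"env": {"workload": "high", "time_pressure": "low"}},
]
print(impact(survivors, "workload", "low"))  # 2 of 3 survivor runs -> ~0.67
```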

  27. Case Study • The application of the SRA tool in validating the operational performance and system reliability of a complex socio-technical system. • The requirements question is to assess the impact of new automated technology on the task of loading weapons onto aircraft on an aircraft carrier.

  28. Case Study • Tasks in Design 1 are manual or semi-automated, while, in Design 2, they are semi- or fully automated. • The second design saves manpower since it can be operated by one WA and is potentially more rapid to operate, but it is more expensive.

  29. Case Study • When the operational performance times are compared, Design 2 is quicker for nearly all tasks, which is not surprising since it has more automated tasks.

  30. Case Study • The critical environmental variables for both designs show that incentives, motivation, duty time, concurrency, and time constraints were all marked as vulnerable for Design 1. • Design 2, in contrast, fares better, with only motivation, concurrency, and maintenance marked as vulnerable.

  31. Case Study • After identifying the most appropriate design, the problematic tasks, and the critical environmental variables, the analyst investigates the improvements required for the Autoload palette component, which was the weakest link in Design 2.

  32. Validating the BN Models • Data mining techniques are used to test the assumptions in the BN models. • All possible permutations were simulated to create a database of reliability and performance-time predictions • Data mining techniques: • Relevance analysis: ranks input parameters of the model based on their relevance to the output • Association rules: describe how often two or more facts co-occur in a dataset; employed to check the causal associations in the model • Classification: partitions large quantities of data into sets with common characteristics and properties
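As a rough illustration of the association-rule check listed above (pandas is used here purely for the sketch; the study used dedicated data-mining tools), the failure rate can be compared across the settings of one input variable, mirroring a rule of the form IF duty_time = high THEN survived = fail:

```python
import pandas as pd

# Toy database of simulated BN runs (made-up values).
runs = pd.DataFrame({
    "duty_time": ["high", "high", "low", "low", "high", "low"],
    "survived":  ["fail", "fail", "pass", "pass", "fail", "pass"],
})

# Conditional failure rate per duty_time setting; a much higher rate for
# duty_time = high would support the association rule.
failure_rate = (runs.assign(failed=runs["survived"].eq("fail"))
                    .groupby("duty_time")["failed"].mean())
print(failure_rate)
```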

  33. Validating the BN Models (Cont’d) • Results • Sea state had only a minor influence on system error → relevance analysis • IF (Duty-time = high) → survived = fail • IF (Workload = high) → survived = fail → association rules • These rules indicate that the causal influences of these variables are higher than assumed; the NPT settings were altered to correct this. • Crew motivation and agent ability problems → classification

  34. Discussion & Conclusions • Automated testing of requirements specifications and designs for conformance to nonfunctional requirements, using a set of scenarios and variations in the system environment, has been developed.

  35. Discussion & Conclusions • The SRA could be applied to any class of component-based problems where the selection of components needs to be optimized against nonfunctional-requirement criteria. • The SRA tool is a development of a previous BN requirements analyzer and has partially addressed the difficult problem of scenario-based testing.

  36. Discussion & Conclusions • The SRA tool is aimed at requirements investigation in complex socio-technical systems and, hence, it complements model-checking tools, which are more appropriate to later stages in development when specifications of agent behavior are available.
