9. SYSTEM INTEGRATION and Verification & Validation

Presentation Transcript

  1. 9. SYSTEM INTEGRATION and Verification & Validation

  2. Software Engineering Roadmap: Chapter 9 Focus. Construct the system in stages: plan the integration of parts to yield the whole; test subassemblies; assemble in "builds"; test the whole system in a variety of ways. (Roadmap activities: identify corporate practices, plan project, analyze requirements, design, implement, test units, integrate & test system, maintain.) Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

  3. Chapter Learning Goals • Be able to plan the integration of modules • Understand types of testing required • Be able to plan and execute testing beyond the unit level Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

  4. Verification vs. Validation • Validation: "Are we building the right product?" • The software should do what the user really requires (the requirements should express this) • Verification: "Are we building the product right?" • The software should conform to the architecture and design as driven by the requirements
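The distinction can be illustrated with a toy example. The discount requirement and the `order_total` function below are hypothetical, invented for this sketch, not taken from the slides:

```python
# Hypothetical requirement: "a 10% discount applies to orders over $100".

def order_total(subtotal):
    """Apply a 10% discount to orders over $100 (hypothetical requirement)."""
    return subtotal * 0.9 if subtotal > 100 else subtotal

# Validation: are we building the RIGHT product?
# Check against what the user actually requires.
assert order_total(200) == 180    # the user expects $20 off a $200 order

# Verification: are we building the product RIGHT?
# Check against the stated design contract, including boundaries.
assert order_total(100) == 100    # "over $100": exactly $100 gets no discount
assert order_total(50) == 50      # small orders are unchanged
```

The same function passes both kinds of check here; in practice a product can pass verification (it matches its design) while failing validation (the design misread what the user needed).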

  5. Two Methods of V & V • Software inspections: concerned with analysis of the static system representation to discover problems (static verification) • May be supplemented by tool-based document and code analysis • Software testing: concerned with exercising and observing product behaviour (dynamic verification) • The system is executed with test data and its operational behaviour is observed
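A minimal sketch of the two methods in Python, using the standard library's `ast` module for a static check and ordinary execution for a dynamic one. The `divide` snippet is a made-up example, not from the slides:

```python
import ast

SOURCE = '''
def divide(a, b):
    return a / b
'''

# Static verification: analyse the code without running it,
# e.g. an automated inspection for undocumented functions.
tree = ast.parse(SOURCE)
functions = [n.name for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)]
undocumented = [n.name for n in ast.walk(tree)
                if isinstance(n, ast.FunctionDef) and ast.get_docstring(n) is None]

# Dynamic verification: execute the system with test data
# and observe its operational behaviour.
namespace = {}
exec(SOURCE, namespace)
result = namespace["divide"](10, 4)

print(functions)     # ['divide']
print(undocumented)  # ['divide']  (inspection finding: missing docstring)
print(result)        # 2.5
```

Note what each method can and cannot see: the static pass flags the missing docstring without ever risking a division by zero, while only the dynamic pass reveals actual runtime behaviour.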

  6. V & V Goals • Verification and validation should establish confidence that the software is ready for use • This does NOT mean completely free of defects • Rather, it must be good enough for its intended use, and the type of use will determine the degree of confidence that is needed

  7. V & V Planning • Careful planning is required to get the most out of testing and inspection processes • Planning should start early in the development process • The plan should identify the balance between static verification and dynamic testing

  8. 1. Introduction to system integration. Before we can talk more about V & V, we have to discuss an approach to integration.

  9. Unified Process for Integration & Test (Jacobson et al., USDP): across the Inception, Elaboration, Construction and Transition phases (preliminary iterations, Iter. #1 ... #n, #n+1 ... #m, #m+1 ... #k), each iteration spans Requirements, Analysis, Design, Implementation and Test; unit tests, integration tests, ... system tests accumulate iteration by iteration.

  10. Development Overview (after Myers): information is progressively lost as customer needs pass through requirements, architecture, detailed design, interface specs, function code, module (e.g., package) code, and system code. Why?

  11. Testing Overview: Artifact Flow (numbers show the order of testing). A diagram relating documentation to tests: requirements drive (11) acceptance tests and (10) installation tests; architecture drives (9) usability tests, (8) system tests and (7) regression tests; interface specs drive (6) integration tests and (5) interface tests; module (e.g., package) code gets (2), (4) module tests and function code gets (1), (3) function tests; the result is the iteration or complete system code.

  12. Testing for Validation and Verification (after Myers): (11) acceptance tests*, (10) installation tests*‡, (9) usability tests*‡, (8) system tests*‡, (7) regression tests* and (1), (4) function tests are tested against the requirements ("validation"). Includes: * use cases, ‡ performance testing.

  13. Testing for Validation and Verification (after Myers) • Requirements ("validation", note 1): (11) acceptance tests*, (10) installation tests*‡, (9) usability tests*‡, (8) system tests*‡, (7) regression tests* • Architecture ("verification", note 2): (6) integration tests* • Interface specs ("verification", note 2): (5) interface tests • Detailed design ("verification", note 2): (1), (4) function tests and (2), (3) module tests • Note 1: tested against requirements • Note 2: tested against the documents indicated • Includes: * use cases, ‡ performance testing

  14. 2. The integration process

  15. The Build Process: single-level iteration vs. double-level iteration (diagram). Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

  16. Integration in Spiral Development: each iteration cycles through requirements analysis, design, implementation and test; after the first iteration, 1. get additional requirements, 2. design for the additional requirements, 3. code the additions, 4. integrate the new code, 5. test. Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

  17. Relating Builds and Iterations in the Unified Process (Jacobson et al., USDP): within iteration i there is a first build and a last build; phases run Inception, Elaboration, Construction and Transition over iterations (preliminary, #1 ... #n, #n+1 ... #i ... #m, #m+1 ... #k) across Requirements, Analysis, Design, Implementation and Test.

  18. Build Sequences: Ideal vs. Typical Build a Bridge Build a Video Game

  19. One way to plan integration & builds: 1. Understand the architecture decomposition (try to make the architecture simple to integrate). 2. Identify the parts of the architecture that each iteration will implement: build framework classes first, or in parallel; if possible, integrate "continually"; build enough UI to anchor testing; document requirements for each iteration; try to build bottom-up, so the parts are available when required; try to plan iterations so as to retire risks, biggest risks first; specify iterations and builds so that each use case is handled completely by one. 3. Decompose each iteration into builds if necessary. 4. Plan the testing, review and inspection process. 5. Refine the schedule to reflect the results. Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

  20. Roadmap for Integration and System Test: 1. Decide the extent of all tests. 2. For each iteration: 2.1 For each build: 2.1.1 perform regression testing from the prior build; 2.1.2 retest functions if required; 2.1.3 retest modules if required; 2.1.4 test interfaces if required; 2.1.5 perform build integration tests (section 3.1); development of the iteration is then complete. 2.2 Perform iteration system and usability tests (sections 3.4, 3.5); the system is then implemented. 3. Perform installation tests (section 3.8); the system is then installed. 4. Perform acceptance tests (section 3.7); the job is then complete.
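Step 2.1.1's regression pass (rerun the prior build's tests against the new build; any new failure is a regression) can be sketched as follows. The builds, the suite, and the planted defect are all hypothetical:

```python
# A "build" here is just a dict of named functions; a test takes a
# build and returns True/False. All names are invented for this sketch.

def run_regression(tests, build):
    """Run every prior-build test against the new build; return regressions."""
    return [name for name, test in tests.items() if not test(build)]

# Hypothetical builds: build 2 accidentally changes the discount rate.
build1 = {"add": lambda a, b: a + b, "discount": lambda p: p * 0.9}
build2 = {"add": lambda a, b: a + b, "discount": lambda p: p * 0.8}

suite = {
    "add_small":   lambda b: b["add"](2, 3) == 5,
    "discount_10": lambda b: b["discount"](100) == 90,
}

assert run_regression(suite, build1) == []               # prior build: all pass
assert run_regression(suite, build2) == ["discount_10"]  # regression found
```

The point of the step is exactly this asymmetry: the suite did not change between builds, so any test that flips from pass to fail localizes a capability the new build broke.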

  21. RPG Video Game Architecture Packages, built around the domain classes. Framework layer: GameEnvironment, GameArtifacts, RolePlayingGame and Characters («framework packages»). Application layer: EncounterGame, EncounterCharacters and EncounterEnvironment («application packages», containing classes such as EncounterGame, Engagement, EngagementDisplay, EncounterCharacter, PlayerCharacter, ForeignCharacter, PlayerQualityWindow, Area, EncounterAreaConnection and ConnectionHyperlink), each of which «uses» the framework packages. Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

  22. Factors Determining the Sequence of Integration • Usage of modules by other modules (technical factor): build and integrate modules used before the modules that use them • Defining and using framework classes • Exercising integration early • Exercising key risky parts of the application as early as possible (risk reduction) • Showing parts or prototypes to customers (requirements) Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.
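The first factor, building "used" modules before the modules that use them, amounts to a topological sort of the module dependency graph. A sketch using Python's standard `graphlib`; the dependency graph is an assumption, loosely modeled on the Encounter packages in these slides:

```python
from graphlib import TopologicalSorter

# Hypothetical "uses" graph: each module maps to the modules it uses.
# Integrating in topological order builds used modules first.
uses = {
    "EncounterGame":        {"RolePlayingGame", "EncounterCharacters"},
    "EncounterCharacters":  {"Characters"},
    "EncounterEnvironment": {"GameEnvironment"},
    "RolePlayingGame":      set(),
    "Characters":           set(),
    "GameEnvironment":      set(),
}

order = list(TopologicalSorter(uses).static_order())

# Framework packages come out before the application packages that use them.
assert order.index("Characters") < order.index("EncounterCharacters")
assert order.index("EncounterCharacters") < order.index("EncounterGame")
assert set(order) == set(uses)
```

`TopologicalSorter` raises `CycleError` on circular dependencies, which is itself a useful integration-planning check: a cycle means two modules must be integrated and tested together.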

  23. Integration Schedule (overview): milestones; iterations (the inception iteration, then the elaboration iterations: iteration 1 "view characters in areas" and iteration 2 "elementary interaction"); builds; modules.

  24. Integration Schedule (months 1-5, by week): the milestones are the prototype requirements and the complete prototype; the iterations are the inception iteration, then elaboration iteration 1 "view characters in areas" and iteration 2 "elementary interaction"; builds 1, 2 and 3 proceed in sequence, each decomposed into modules. Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

  25. Integration Schedule (continued): the modules are the RolePlayingGame, Characters and GameEnvironment framework packages and the EncounterCharacters, EncounterEnvironment and EncounterGame application packages; each build ends with an "integrate & test" step. Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

  26. 3. The testing process

  27. Encounter Continual Integration week 3 RolePlayingGame EncounterGame Characters GameCharacter Layout EncounterCharacters EncounterLayout Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

  28. Encounter Continual Integration week 7 RolePlayingGame EncounterGame Characters GameCharacter Layout EncounterCharacters Map EncounterLayout EncounterEnvironment «facade» Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

  29. Encounter Continual Integration week 11 RolePlayingGame RPGame EncounterGame Characters GameCharacter Layout EncounterCharacters Map EncounterLayout EncounterEnvironment «facade» EncounterCast «facade» Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

  30. RolePlayingGame Encounter Continual Integration week 15 RPGame EncounterGame Characters EncounterGame «facade» GameCharacter Layout EncounterCharacters Map Location EncounterCharacter EncounterLayout EncounterEnvironment «facade» EncounterCast «facade» Area Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

  31. One way to plan and execute integration tests: 1. Decide how and where to store, reuse and code the integration tests; show this in the project schedule. 2. Execute as many unit tests (again) as time allows, this time in the context of the build; no drivers or stubs are required this time; prioritize by those most likely to uncover defects. 3. Exercise regression tests, to ensure existing capability has not been compromised. 4. Ensure build requirements are properly specified. 5. Exercise the use cases that the build should implement; test against the SRS. 6. Execute the system tests supported by this build. Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.
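Step 2's instruction to prioritize by "those most likely to uncover defects" can be sketched as a simple sort over each test's defect history. The test names and counts below are invented for illustration:

```python
# Order unit tests for re-execution within the build: tests that have
# historically found the most defects run first, so limited time is
# spent where it is most likely to pay off. All figures are hypothetical.

def prioritise(tests, defect_history):
    """Order tests so those most likely to uncover defects run first."""
    return sorted(tests, key=lambda t: defect_history.get(t, 0), reverse=True)

tests = ["parser_unit", "engine_unit", "ui_unit"]
defect_history = {"parser_unit": 2, "engine_unit": 7, "ui_unit": 0}

ordered = prioritise(tests, defect_history)
assert ordered == ["engine_unit", "parser_unit", "ui_unit"]
```

Real prioritisation schemes also weigh recent code churn and coverage, but a defect-count sort captures the idea the slide states.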

  32. Relationship between Use Cases, Iterations and Builds (diagram): each build within an iteration (e.g., builds 5.2, 5.3 and 5.4 of iteration 5) implements selected use cases; a use case can build on the details of another (e.g., use case 14 builds on the details of use case 7) via «extends» or «includes». Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

  33. Final Code Build and Integration Schedule: a Banking Example. From week 23 to week 31 (release), code builds of the baseline move from biweekly to weekly, twice weekly, daily and overnight: the frequency of regression testing increases toward the end of the project.

  34. Final Code Build and Integration Schedule: a Banking Example (continued). Between week 23 and week 31, the bank query, bank deposit and bank withdrawal modules are integrated into the baseline in turn. Code is frozen while it is being tested. Why? Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

  35. Typical Day-by-Day Code Integration Process: development runs from 7 am to 6 pm; development is then frozen and the regression tests run overnight; the next morning the team confirms the new baseline or reverts to the previous baseline. Builds move from biweekly to daily over weeks 23 to 31, ending in release (each overnight slot is a regression test). Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.
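The overnight confirm-or-revert decision can be sketched directly: the frozen candidate becomes the new baseline only if the whole regression suite passes. The builds and suite below are hypothetical:

```python
# Nightly baseline decision: a build is a dict, a regression suite is a
# list of predicates over a build. All names and data are invented.

def next_baseline(previous, candidate, regression_suite):
    """Confirm the candidate build as the new baseline, or revert."""
    if all(test(candidate) for test in regression_suite):
        return candidate
    return previous

suite = [lambda build: build["tests_pass"]]

good = {"name": "build-42", "tests_pass": True}
bad = {"name": "build-43", "tests_pass": False}

assert next_baseline(good, bad, suite) == good   # revert: regression failed
assert next_baseline(bad, good, suite) == good   # confirm the new baseline
```

This is also why code is frozen during the overnight run: if development continued, a failing suite could not tell you which baseline to revert to.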

  36. Artifacts and Roles for Integration Testing (Jacobson et al., USDP): a matrix showing which role (component engineer, integration tester, system tester, test engineer) is responsible for each artifact (use-case model, test plan, test procedure, test component, test case, test evaluation, defect management).

  37. Success in Interface Testing • Understand interface requirements • Perform early “smoke” tests to weed out unanticipated interface issues • Test thoroughly as per “black box” testing (see Unit Testing) so as to fully exercise the interface in terms of: • Variety of values for each parameter • Parameter combinations • Volume and timing
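The "parameter combinations" bullet can be sketched with `itertools.product`, exercising an interface over the cross-product of representative values for each parameter. The interface under test (`clamp`) is a hypothetical stand-in:

```python
import itertools

# Black-box interface test: try every combination of representative
# values per parameter and check the interface contract each time.
# The clamp function and the chosen values are assumptions.

def clamp(value, low, high):
    """Interface under test: confine value to the range [low, high]."""
    return max(low, min(high, value))

values = [-1, 0, 5, 100]
lows = [0, 10]
highs = [10, 50]

results = {}
for v, lo, hi in itertools.product(values, lows, highs):
    out = clamp(v, lo, hi)
    assert lo <= out <= hi           # contract: result stays in range
    results[(v, lo, hi)] = out

assert len(results) == len(values) * len(lows) * len(highs)  # 16 combinations
```

Cross-products grow fast, so real test plans usually thin them with pairwise (all-pairs) selection; the exhaustive loop above shows the underlying idea.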

  38. Types of System Tests (1 of 2)* • Volume: subject the product to large amounts of input • Usability: measure user reaction (e.g., score 1-10) • Performance: measure speed under various circumstances • Configuration: configure to various hardware / software, e.g., measure set-up time • Compatibility with other designated applications, e.g., measure adaptation time • Reliability / Availability: measure up-time over an extended period *see Kit [Ki]
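A performance test of the kind listed above reduces to timing an operation and comparing against a stated target. This is a minimal sketch; the workload and the 0.5-second budget are assumptions chosen for illustration:

```python
import time

# Hypothetical performance system test: measure the speed of an
# operation and fail if it misses an agreed budget.

def workload(n):
    """Hypothetical operation under test: sum of squares up to n."""
    return sum(i * i for i in range(n))

start = time.perf_counter()
result = workload(100_000)
elapsed = time.perf_counter() - start

assert result == sum(i * i for i in range(100_000))   # correctness first
assert elapsed < 0.5, f"performance target missed: {elapsed:.3f}s"
```

Production performance tests repeat the measurement many times and report percentiles rather than a single run, since one timing is noisy; the pass/fail-against-a-budget structure is the same.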

  39. Types of System Tests (2 of 2)* • Security: subject to compromise attempts, e.g., measure average time to break in • Resource usage: measure usage of RAM, disk space, etc. • Installability: install under various circumstances; measure time to install • Recoverability: force activities that take the application down; measure time to recover • Serviceability: service the application under various situations; measure time to service • Load / Stress: subject to extreme data & event traffic *see Kit [Ki]

  40. Organization of Integration and System Test Documentation**: the Software Test Documentation (STD‡), referenced by the SCMP, consists of Integration T.D.‡ (divided into Build 1, Build 2 and Build 3 T.D.‡), System T.D.‡, Acceptance T.D.‡ and Installation T.D.‡. ‡Test Documentation, each divided into: Introduction, Test plans, Test designs, Test cases, Test procedures, Test log. **With R. Bostwick. Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

  41. ANSI/IEEE 829-1983 Software Test Documentation (reaff. 1991) • 1. Introduction • 2. Test plan • items under test, scope, approach, resources, schedule, personnel • 3. Test design • items to be tested, the approach, the plan in detail • 4. Test cases • sets of inputs and events • 5. Test procedures • steps for setting up and executing the test cases • 6. Test item transmittal report • items under test, physical location of results, person responsible for transmitting • 7. Test log • chronological record, physical location of test, tester name • 8. Test incident report • documentation of any event occurring during testing which requires further investigation • 9. Test summary report • summarizes the above NOT REQUIRED IN PROJECTS

  42. 5. The Transition iterations

  43. Goals of the Transition Iterations • Find defects through customer use • Test user documentation and help • Determine realistically whether the application meets customer requirements • Retire deployment risks • Satisfy miscellaneous marketing goals (Transition-phase iterations #m+1 ... #k, spanning Requirements, Analysis, Design, Implementation and Test)

  44. Alpha and Beta Releases • Alpha: in-house and highly trusted users; several repetitions by real users; previews customer reaction; benefits third-party developers; forestalls competition • Beta: selected customers; many repetitions by real users; gets customer reaction Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

  45. Roadmap for the Transition Iterations: 1. Plan alpha and beta testing: define the population; plan defect collection; identify stopping criteria (number of allowed defects per test or per hour). 2. Conduct alpha testing: prepare; distribute & install; carry out (users / customers); gather defect reports; observe stopping criteria; correct defects. 3. Conduct beta testing. Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

  46. Stopping Criteria: Graphical Representation. Over weeks 1 to 20, track the error detection rate per 1000 hours (target: at most 7 per 1000 hours sustained for 4 weeks), the percentage of deposit transaction types tested (target: 98%) and the percentage of withdrawal transactions tested (target: 91%); end tests when the targets are met. Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.
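The error-detection-rate criterion shown in the chart (at most 7 defects per 1000 test hours, sustained for 4 weeks) can be computed directly. The weekly defect counts and test hours below are invented:

```python
# Stopping criterion: stop when the error detection rate has stayed at
# or below the target for a window of consecutive weeks. All weekly
# figures here are hypothetical.

TARGET_RATE = 7.0   # defects per 1000 test hours
WINDOW = 4          # consecutive weeks required at or below the target

def may_stop(weekly_defects, weekly_hours):
    """True if the last WINDOW weeks all meet the target rate."""
    rates = [d / h * 1000 for d, h in zip(weekly_defects, weekly_hours)]
    recent = rates[-WINDOW:]
    return len(recent) == WINDOW and all(r <= TARGET_RATE for r in recent)

defects = [40, 25, 12, 6, 5, 4, 3]
hours =   [900, 950, 1000, 1000, 1000, 980, 1010]

assert may_stop(defects, hours) is True          # last 4 weeks are all <= 7
assert may_stop(defects[:5], hours[:5]) is False  # window still holds a week above target
```

The rate is normalized per 1000 hours rather than by raw defect count so that weeks with less testing do not look artificially clean.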

  47. 7. Tools for integration and system testing

  48. Capabilities of Automated System Test Tools 1. Record mouse and keyboard actions to enable repeated playback 2. Run test scripts repeatedly 3. Enable recording of test results 4. Record execution timing 5. Record runtime errors 6. Create and manage regression tests 7. Generate test reports 8. Generate test data 9. Record memory usage 10. Manage test cases 11. Analyze coverage

  49. Types of Capture/Playback Tests* • Native / software intrusive: test software is intermingled with the software under test; could compromise the software under test; least expensive • Native / hardware intrusive: test hardware is intermingled with the software under test; could compromise the software under test • Non-intrusive: uses separate test hardware; does not compromise the software under test; most expensive *adapted from Kit [Ki]

  50. Memory Usage Test Tools • Memory leaks: detect growing amounts of unusable memory inadvertently caused by the implementation • Memory usage behavior: confirm expectations; identify bottlenecks • Data bounds behavior: e.g., confirm the integrity of arrays; detect attainment of limiting values • Variable initialization: indicate uninitialized variables • Overwriting of active memory
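In Python, a memory-leak check of the first kind can be sketched with the standard `tracemalloc` module: snapshot traced allocations around a suspect operation and fail if retained memory grows. The leaky cache is a deliberately planted hypothetical defect:

```python
import tracemalloc

# Hypothetical defect: a module-level cache that grows without bound.
_cache = []

def leaky_operation():
    """Appends ~100 KB per call and never releases it."""
    _cache.append(bytearray(100_000))

# Memory-leak test: measure traced allocation growth across repeated calls.
tracemalloc.start()
before, _ = tracemalloc.get_traced_memory()   # (current, peak)
for _ in range(10):
    leaky_operation()
after, _ = tracemalloc.get_traced_memory()
tracemalloc.stop()

growth = after - before
assert growth > 900_000, "expected to detect roughly 1 MB of retained memory"
```

Dedicated tools (e.g., Valgrind for C/C++) additionally report the allocation sites responsible; `tracemalloc.take_snapshot()` offers a comparable per-line breakdown in Python.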