
Verification & Validation


Presentation Transcript


  1. Verification & Validation IS301 – Software Engineering Lecture #30 – 2004-11-10 M. E. Kabay, PhD, CISSP Assoc. Prof. Information Assurance, Division of Business & Management, Norwich University mailto:mkabay@norwich.edu V: 802.479.7937

  2. Topics • Verification and validation planning • Software inspections • Automated static analysis • Cleanroom software development

  3. Verification and Validation • Assuring that software system meets user's needs

  4. Objectives • To introduce software verification and validation and to discuss distinction between them • To describe program inspection process and its role in V & V • To explain static analysis as verification technique • To describe Cleanroom software development process

  5. Verification vs Validation • Verification: “Are we building the product right?” • software should conform to its specification • Validation: “Are we building the right product?” • software should do what user really requires

  6. V & V Process • Whole life-cycle process • V & V at each stage in software process • Two principal objectives • Discover defects in system • Assess whether system is usable in operational situation

  7. Static and Dynamic Verification • Software inspections • Analysis of static system representation • Static verification • May be supplemented by tool-based document and code analysis • Software testing • Exercising and observing product behavior (dynamic verification) • System executed with test data and its operational behavior observed

  8. Static and Dynamic V&V

  9. Program Testing • Can reveal • Presence of errors • But NOT their absence • Successful test • Discovers one or more errors • Only validation technique for non-functional requirements • Should be used in conjunction with static verification to provide full V&V coverage
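
A minimal sketch in C (not from the slides; the function largest() and its off-by-one bug are invented for illustration) of what a "successful" defect test looks like: the test data is chosen so that the output contradicts the expected result, demonstrating the presence of an error.

#include <stdio.h>

/* Intended: return the largest element of a[0..n-1].
   Bug: the loop stops one element early, so the last value is never examined. */
int largest(const int a[], int n)
{
    int max = a[0];
    for (int i = 1; i < n - 1; i++)   /* should be i < n */
        if (a[i] > max)
            max = a[i];
    return max;
}

int main(void)
{
    int data[] = {3, 7, 2, 9};        /* the largest value is in the last slot */
    int expected = 9;
    int actual = largest(data, 4);
    /* A successful defect test: actual differs from expected, so the test
       has revealed the presence (not the absence) of an error. */
    printf("%s: expected %d, got %d\n",
           actual == expected ? "PASS" : "FAIL", expected, actual);
    return 0;
}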

  10. Types of Testing • Defect testing • Tests designed to discover system defects • Successful defect test reveals presence of defects in system • Statistical testing • Tests designed to reflect frequency of user inputs • Used for reliability estimation
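
A hedged sketch (not part of the lecture; the 80% query / 15% update / 5% delete mix is an invented operational profile) of how statistical testing selects inputs in proportion to expected user behaviour and counts failures as raw data for a reliability estimate:

#include <stdio.h>
#include <stdlib.h>

/* Hypothetical system under test: returns 0 on success, nonzero on failure. */
static int system_under_test(int op) { (void)op; return 0; }

int main(void)
{
    const int trials = 10000;
    int failures = 0;
    srand(42);                               /* fixed seed for repeatability */
    for (int i = 0; i < trials; i++) {
        double r = (double)rand() / RAND_MAX;
        int op = (r < 0.80) ? 0              /* query:  80% of assumed usage */
               : (r < 0.95) ? 1              /* update: 15% of assumed usage */
               : 2;                          /* delete:  5% of assumed usage */
        if (system_under_test(op) != 0)
            failures++;
    }
    /* The failure rate observed under the operational profile is the
       raw input to the reliability estimate. */
    printf("observed probability of failure on demand: %f\n",
           (double)failures / trials);
    return 0;
}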

  11. V & V Goals • Establish confidence that software is fit for purpose • Does NOT mean completely free of defects • Good enough for its intended use • Type of use will determine degree of confidence needed

  12. V & V Confidence • Depends on system’s purpose, user expectations and marketing environment • Software function • Level of confidence depends on how critical software is to organization • User expectations • Users may have low expectations of certain kinds of software • Marketing environment • Getting product to market early may be more important than finding defects in program Sound familiar?

  13. Testing and Debugging • Defect testing and debugging are distinct processes • Verification and validation concerned with establishing existence of defects in program • Debugging concerned with locating and repairing these errors • Debugging involves formulating hypotheses about program behavior then testing these hypotheses to find system error(s)

  14. Debugging Process

  15. V & V Planning • Careful planning required to get most out of testing and inspection processes • Start early in development process • Identify balance between • Static verification and • Testing • Test planning is about • Defining standards for testing process rather than • Describing product tests

  16. V-Model of Development

  17. Structure of Software Test Plan • Testing process • Requirements traceability • Tested items • Testing schedule • Test recording procedures • Hardware and software requirements • Constraints

  18. Software Inspections • People examine representation of system to find anomalies and defects • May be applied to any representation of system • Requirements, design, test data . . . • Do not require execution of system • Used before implementation • Very effective technique for discovering errors

  19. Inspection Success • In testing, one defect may mask another, so several executions are required • Inspection is a static process, so defects do not interact and many different defects may be discovered in a single inspection • Inspections also reuse domain and programming knowledge, so reviewers are likely to have seen the types of error that commonly arise

  20. Inspections and Testing • Inspections and testing are complementary and not opposing verification techniques • Both should be used during V & V process • Inspections can check • Conformance with specification • But not conformance with customer’s real requirements • Inspections cannot check non-functional characteristics • Performance, usability, etc.

  21. Program Inspections • Formalized approach to document reviews • Intended explicitly for defect DETECTION (not correction) • Defects may be • Logical errors, • Anomalies in code that might indicate erroneous condition • (E.g. uninitialized variable) • Or non-compliance with standards

  22. Inspection Pre-Conditions • Precise specification must be available • Team members must be familiar with organization standards • Syntactically correct code must be available • Error checklist should be prepared • Management must accept that inspection will increase costs early in software process • Management must not use inspections for staff evaluations • I.e., finding errors does not necessarily mean programmer is BAD

  23. Inspection Process

  24. Inspection Procedure • System overview presented to inspection team • Code and associated documents distributed to inspection team in advance • Inspection takes place and discovered errors noted • Modifications made to repair discovered errors • Re-inspection may or may not be required

  25. Inspection Teams • Made up of at least 4 members • Author of code being inspected • Inspector who finds errors, omissions and inconsistencies • Reader who reads code to team • Moderator who chairs meeting and notes discovered errors • Other roles: • Scribe • Chief moderator

  26. Inspection Checklists • Checklist of common errors should be used to drive inspection • Error checklist is programming language dependent • ‘Weaker’ type checking needs larger checklist • Examples: • Initialization • Constant naming • Loop termination • Array bounds, etc.

  27. Inspection Checks (1) Data Faults • Are all program variables initialized before their values are used? • Have all constants been named? • Should the lower bound of arrays be 0, 1 or something else? • Should the upper bound of arrays be equal to the size of the array or (size – 1)? • If character strings are used, is a delimiter explicitly assigned?
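
An invented C fragment (not from the slides) in which each commented line is one of the data faults this checklist is meant to catch:

#include <stdio.h>
#include <string.h>

#define TABLE_SIZE 10                       /* named constant, not a bare 10 */

void data_fault_examples(void)
{
    int total;                              /* FAULT: used before it is initialized */
    int table[TABLE_SIZE];
    char name[4];

    for (int i = 0; i <= TABLE_SIZE; i++)   /* FAULT: upper bound should be < TABLE_SIZE */
        table[i] = 0;

    total += table[0];                      /* reads the uninitialized 'total' */

    memcpy(name, "Anna", 4);                /* FAULT: no room for the '\0' delimiter */
    printf("%d %s\n", total, name);
}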

  28. Inspection Checks (2) Control Faults • For each conditional statement, is the condition correct? • Is each loop certain to terminate? • Are compound statements correctly bracketed? • In case statements, are all possible cases accounted for?
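
A companion fragment (again invented for illustration) showing the control faults an inspector would query:

#include <stdio.h>

void control_fault_examples(int code)
{
    int i = 0;
    while (i < 10)                          /* FAULT: 'i' is never changed,        */
        printf("%d\n", i);                  /*        so the loop cannot terminate */

    switch (code) {                         /* FAULT: no default case, so not      */
    case 1: printf("start\n"); break;       /*        every possible value of      */
    case 2: printf("stop\n");  break;       /*        'code' is accounted for      */
    }

    if (code > 0)                           /* inspectors also ask whether the     */
        if (code < 5)                       /* nesting is bracketed so an 'else'   */
            printf("small\n");              /* would bind to the intended 'if'     */
}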

  29. Inspection Checks (3) • Input/Output Faults • Are all input variables used? • Are all output variables assigned a value before they are output? • Interface Faults • Do all function and procedure calls have the correct number of parameters? • Do formal and actual parameter types match? • Are the parameters in the right order? • If components access shared memory (globals), do they have the same model of the shared memory structure?

  30. Inspection Checks (4) • Storage Management Faults • If a linked structure is modified, have all links been correctly reassigned? • If dynamic storage is used, has space been allocated correctly? • Is space explicitly deallocated after it is no longer required? • Exception Management Faults • Have all possible error conditions been taken into account?
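
An invented C example of the storage-management points above: reassigning links around a removed node, checking dynamic allocation, and deallocating explicitly.

#include <stdlib.h>

struct node { int value; struct node *next; };

struct node *make_node(int value)
{
    struct node *n = malloc(sizeof *n);
    if (n == NULL)                    /* FAULT if omitted: allocation unchecked */
        return NULL;
    n->value = value;
    n->next = NULL;
    return n;
}

void remove_second(struct node *head)
{
    struct node *victim = head->next;
    head->next = victim->next;        /* link correctly reassigned around node  */
    free(victim);                     /* space explicitly deallocated; omitting
                                         this free() is a storage-management fault */
}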

  31. Inspection Rate Estimating cost of inspection for 500 lines of code: • 500 statements/hour during overview = 1 hr per person x 4 people = 4 person-hours • 125 source statements/hour during individual preparation = 4 hours x 4 people = 16 person-hours • 90-125 statements/hour can be inspected in meeting with 4 people in team = ~5 hours x 4 people = ~20 person-hours • Inspecting 500 lines thus takes ~ 4 + 16 + 20 = ~40 person-hours • Estimate programmer salary $80K/2K hr ~$40/hr • Multiply by 2 for extended costs = $80/hr • Therefore cost of 40 person-hours of effort = ~$3,200 • Inspection is therefore an expensive process

  32. Automated Static Analysis • Static analyzers = software for source text processing • Parse program text • Try to discover potentially erroneous conditions • Report to V & V team • Effective aid to inspections • Supplement to but not replacement for inspections

  33. Static Analysis Checks

  34. Stages of Static Analysis (1) • Control flow analysis • Checks for loops with multiple exit or entry points, finds unreachable code, etc. • Data use analysis • Detects uninitialized variables, variables written twice without intervening assignment, variables which are declared but never used, etc. • Interface analysis • Checks consistency of routine and procedure declarations and their use
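
An invented fragment annotated with the anomaly each of these stages would report (interface analysis is illustrated by the LINT example on slide 36):

int example(int x)
{
    int unused;          /* data-use analysis: declared but never used */
    int y;

    y = 1;               /* data-use analysis: written twice with no   */
    y = 2;               /*   intervening read (first value is dead)   */

    if (x > 0)
        return y;

    return -y;

    x = 0;               /* control-flow analysis: unreachable code    */
    return x;
}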

  35. Stages of Static Analysis (2) • Information flow analysis • Identifies dependencies of output variables. Does not detect anomalies itself but highlights information for code inspection or review • Path analysis • Identifies paths through program and sets out statements executed in that path. Again, potentially useful in review process • Both these stages generate vast amounts of information. Must be used with care.

  36. LINT Static Analysis
138% more lint_ex.c
#include <stdio.h>
printarray (Anarray)
int Anarray;
{
  printf("%d", Anarray);
}
main ()
{
  int Anarray[5];
  int i;
  char c;
  printarray (Anarray, i, c);
  printarray (Anarray);
}
139% cc lint_ex.c
140% lint lint_ex.c
lint_ex.c(10): warning: c may be used before set
lint_ex.c(10): warning: i may be used before set
printarray: variable # of args. lint_ex.c(4) :: lint_ex.c(10)
printarray, arg. 1 used inconsistently lint_ex.c(4) :: lint_ex.c(10)
printarray, arg. 1 used inconsistently lint_ex.c(4) :: lint_ex.c(11)
printf returns value which is always ignored

  37. Use of Static Analysis • Particularly valuable for languages such as C • Weak typing • Hence many errors undetected by compiler • Less cost-effective for languages like Java • Strong type checking • Can therefore detect many errors during compilation
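
A contrived C fragment (not from the slides) of the kind a permissive C compiler may accept but a lint-style analyzer flags, and which a strongly typed language would reject outright:

#include <stdio.h>

int main(void)
{
    char flag = 'y';
    long big = 100000L;

    int n = big;                 /* silent long-to-int conversion that may lose data */
    if (flag = 'n')              /* assignment where a comparison was intended       */
        printf("%d\n", "oops");  /* format expects an int but gets a pointer         */
    return n;
}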

  38. Cleanroom Software Development • Name derived from 'Cleanroom' process in semiconductor fabrication • Philosophy is defect avoidance rather than defect removal • Software development process based on: • Incremental development • Formal specification • Static verification using correctness arguments • Statistical testing to determine program reliability

  39. Cleanroom Process

  40. Cleanroom Process Characteristics • Formal specification using state transition model • Incremental development • Structured programming • Limited control and abstraction constructs are used • Static verification using rigorous inspections • Statistical testing of system (covered in Ch. 21, but not in this course)

  41. Incremental Development

  42. Formal Specification and Inspections • State-based model = system specification • Inspection process checks program against this model • Programming approach defined so that correspondence between model and system is clear • Mathematical arguments (not proofs) are used to increase confidence in inspection process
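
A minimal sketch (the states and events are invented for illustration) of the kind of state-transition model a Cleanroom specification defines; the inspection then argues that the code implements exactly the specified transitions and no others:

/* Specified transitions:
     IDLE    --START-->  RUNNING
     RUNNING --FINISH--> DONE
     DONE    --RESET-->  IDLE
   Any other (state, event) pair leaves the state unchanged. */
typedef enum { IDLE, RUNNING, DONE } State;
typedef enum { START, FINISH, RESET } Event;

State transition(State s, Event e)
{
    switch (s) {
    case IDLE:    return (e == START)  ? RUNNING : s;
    case RUNNING: return (e == FINISH) ? DONE    : s;
    case DONE:    return (e == RESET)  ? IDLE    : s;
    }
    return s;    /* unreachable when every state is handled above */
}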

  43. Cleanroom Process Teams • Specification team • Develops and maintains system specification • Development team • Develops and verifies software. • Software NOT executed or even compiled during this process • Certification team • Develops set of statistical tests to exercise software after development • Reliability growth models used to determine when reliability acceptable

  44. Cleanroom Process Evaluation • Results at IBM impressive • Few discovered faults in delivered systems • Independent assessment: • Process no more expensive than other approaches • Fewer errors than in ‘traditional’ development process • Not clear how this approach can be transferred to environment with • Less skilled or • Less highly motivated engineers

  45. Key Points (1) • Verification and validation are not the same thing. Verification shows conformance with specification; validation shows that program meets customer’s needs • Test plans should be drawn up to guide testing process • Static verification techniques involve examination and analysis of program for error detection

  46. Key points (2) • Program inspections are very effective in discovering errors • Program code in inspections is checked by small team to locate software faults • Static analysis tools can discover program anomalies which may be indication of faults in code • Cleanroom development process depends on incremental development, static verification and statistical testing

  47. Homework • Required by Wed 17 Nov 2004: • For 32 points • 22.1 – 22.5 (@4) • 22.9 & 22.10 (@6) (think hard) • Optional by Mon 29 Nov 2004 • For up to 25 extra points, write detailed answers for any or all of • 22.6, 22.7 (@10) • 22.8 (@5)

  48. DISCUSSION
