SENG 550: Software Verification and Validation

Presentation Transcript


  1. SENG 550: Software Verification and Validation. V&V Processes and Techniques. Prof. Bojan Cukic, Lane Department of Computer Science and Electrical Engineering, West Virginia University.

  2. Overview • Software Inspections (today). • Software Metrics (02/21/2002). • Software Reliability Engineering (02/28/2002).

  3. Agenda • The big picture. • Inspection Process. • Applying Inspection Process. • Utilizing Orthogonal Defect Classification. • Inspection Checklists.

  4. The Big Picture: V&V Principles, Foundations • V&V MUST be conducted throughout the entire life-cycle. • The outcome of V&V should not be considered a binary variable. • If using models, build them to satisfy certain objectives. • The model's credibility is measured with respect to these objectives. • V&V requires 'some' independence to prevent developer's bias.

  5. V&V Principles (2) • V&V is difficult in general. Creativity and insight are required. • Credibility can be claimed ONLY for the prescribed conditions, which have been validated/verified. • Complete model testing is not possible. • Testing demonstrates the presence of defects, not their absence! • V&V must be planned and documented.

  6. V&V Principles (3) • Errors/defects should be detected as early as possible. • A V&V environment must provide for repeatability. • Success in V&V of submodels (modules) does not imply overall system credibility. • A well-founded problem is ESSENTIAL to the acceptability and accreditation of the V&V results.

  7. V&V Techniques • Informal techniques. • Tools rely heavily on human reasoning and subjectivity. • This does not imply the lack of structure or guidelines. • Static techniques. • Concerned with accuracy assessment based on static design (mental execution). • Dynamic techniques. • Require instrumentation, execution and analysis. • Formal techniques. • Based on mathematical correctness proofs.

  8. V&V Techniques • Informal: Inspections, documentation checks, reviews, walkthroughs. • Static: Cause-effect graphs, control analysis, data analysis, fault/failure analysis, interface analysis, semantic analysis, traceability analysis. • Dynamic: Acceptance tests, assertion checking, comparison tests, compliance tests, execution profiling, fault injection, interface tests, partition tests, regression tests, sensitivity analysis, special input tests, statistical techniques, visualization and animation. • Formal: Induction, inductive assertions, inference, logical deduction, logical abduction, correctness proofs, convergence proofs, stability proofs.
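
Assertion checking, one of the dynamic techniques listed above, instruments code with executable claims that are verified during execution. The following minimal Python sketch is illustrative only; the binary_search function and its particular invariants are hypothetical, not taken from the slides:

    def binary_search(items, target):
        """Return the index of target in a sorted list, or -1 if absent."""
        # Precondition assertion: checked on every call at run time; a
        # violation pinpoints a defect in the caller, not deep in the search.
        assert all(items[i] <= items[i + 1] for i in range(len(items) - 1)), \
            "precondition violated: input must be sorted"
        lo, hi = 0, len(items) - 1
        while lo <= hi:
            # Loop invariant: the search window stays within the list bounds.
            assert 0 <= lo <= hi < len(items)
            mid = (lo + hi) // 2
            if items[mid] == target:
                return mid
            elif items[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1

When such an assertion fails during testing, it localizes the defect far more precisely than an eventual wrong answer would.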

  9. Agenda • The big picture. • Inspection Process. • Applying Inspection Process. • Utilizing Orthogonal Defect Classification. • Inspection Checklists.

  10. Software Inspection Process • The goal of software inspections is to remove as many defects as possible prior to product release. • Defect removal efficiency: DRE = (defects found before release) / (defects found before release + defects found after release). • Inspections contribute to high defect removal efficiency.
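
As a worked example, defect removal efficiency is the fraction of all known defects that were removed before release. The numbers in this minimal Python sketch are hypothetical, for illustration only:

    def defect_removal_efficiency(found_before_release, found_after_release):
        """DRE = defects removed before release / total defects ever found."""
        total = found_before_release + found_after_release
        return found_before_release / total if total else 1.0

    # Hypothetical project: 450 defects found by inspections and testing,
    # 50 more reported by customers after release.
    print(defect_removal_efficiency(450, 50))  # 0.9, i.e., 90% DRE

Inspections raise DRE by increasing the count of defects found before release, without waiting for executable code or test runs.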

  11. Why effective? • Most software problems can be traced to requirements. • Requirements are usually written as English prose. • Personnel training problems. • Requirements elicitation, analysis, negotiation, specification, validation, management, etc. • Imprecise, ambiguous, nondeterministic. • Software, by its formal nature, is precise, unambiguous, deterministic (?).

  12. What is an Inspection? • Rigorous, in-depth technical review. • Identifies problems close to the point of their origin. • Developed in the 1970s at IBM [Fagan]. • Objectives: • Upon finding problems, ensure that agreement is reached on the course of action.

  13. Inspection objectives (2) • Verify rework against predefined criteria. • Provide data on product quality and process effectiveness • Build technical knowledge of team members. • Increase the effectiveness of software testing. • Raise the standards of excellence for software engineers.

  14. Common questions • Are inspections a formal or an informal V&V technique? • They are formal with respect to the defined roles and responsibilities of participants. • They are informal with respect to the level of mathematical rigor.

  15. Common questions (2) • Inspection meeting participants: • Moderator: Coordinates the process, leads discussions. • Producer: Submits the artifacts being inspected. • Reader: Presents the artifacts at the meeting. • Inspector(s): Inspects the artifact. • Recorder: Documents and records problems. • Manager: Supports the organization of meetings.

  16. Responsibilities: Moderator • A key player. • A trained, adequately prepared individual. • Technical and managerial skills required. • Selects the inspection team. • Ensures team members devote sufficient time to preparation. • If the team is unprepared, postpones the meeting. • Leads team discussions, mediates disputes. • Recognizes issues, keeps the team focused. • Ensures proper documentation of problems.

  17. Responsibilities: Producer • Inspections are conducted to HELP him/her. • Ensures product readiness. • Resolves problems identified by the inspection team. • Must remain objective (not defensive). • At the meeting, clarifies issues that are unclear to the inspectors. • No need to justify design/implementation style, unless it affects compliance with the requirements.

  18. Responsibilities: Reader • Selects and describes the portions of the product that are the focus of the inspection. • Diverts attention from the producer to the product. • Thoroughly familiar with the product. • Identifies logical chunks of the product, allowing the meeting to stay focused on one problem at a time.

  19. Responsibilities: Inspectors • Selected based on knowledge and familiarity with the product. • Represent a cross-section of available skills (software, marketing, manufacturing…). • Look for discrepancies among the product, documentation, and standards. • Inspectors are producers too (in different meetings). • Focus on problem identification, not solutions. • Objectivity: criticize the product, not the producer.

  20. Responsibilities: Recorder • An optional role, needed if recording is a very time-consuming task. • Captures and records descriptions of the problems noticed. • The recorder is an inspector too. • Supports the moderator by providing/recording additional information.

  21. Responsibilities: Manager • Helps decide what to inspect. • Must accommodate scheduling issues. • Must allocate resources. • Supports inspection training. • May participate in the selection of moderators. • Discusses the results with the moderator. • Not present at actual meetings.

  22. Inspections vs. Walk-Through

  23. Inspection Process Attributes • Defined roles and responsibilities. • Documentation supporting the process. • Collection of product and process metrics. • Support the analysis of global trends in the quality of the product under consideration. • Inspection against documents preceding the current artifact. • Availability of supporting infrastructure. • Training, avoiding “why did you do it this way?” questions. • Planning, support of managers and supervisors.

  24. What to inspect? • Producer and manager make the choice. • Critical product functions. • Complex modules. • Modules that have been “problematic” in the past. • Experience of the producer. • Safety, criticality, reliability, maintainability, availability, security (integrity and confidentiality)…

  25. Mechanics of inspections • The team must reach consensus on issues recorded as errors and defects. • An error is a problem (lack of compliance with the requirements) identified at its point of origin. • A defect is found beyond the point of origin (e.g., a design problem identified in code). • The producer doesn't get to vote.

  26. Mechanics of inspections (2) • Inspection meetings limited to 2 hours. • Posting results of individual inspection meetings is controversial. • Consider posting aggregate results. • Supports quality improvement without personalizing the guilt. • Inspection is complete when all the problem reports are closed.

  27. Agenda • The big picture. • Inspection Process. • Applying Inspection Process. • Utilizing Orthogonal Defect Classification. • Inspection Checklists.

  28. Inspection Process Attributes • Inspections MUST BE an integral part of software development. • Process must be defined and documented. • Flexible, allowing changes. • Participants agree to follow the process. • The process includes metrics collection. • Metrics are utilized for process modifications. • Actively managed. • Attributes are indicators of process maturity.

  29. Corporate resistance • Management issues: • Support for inspections, commitment of resources, schedule concerns. • Software development process: • Does it exist? Could it accommodate the inclusion of inspections? Training? • Software developers: • A fear of performance reviews. • Metrics collection: • Readiness, acceptance, focus on software quality.

  30. Requirements Inspections • Objectives • Is every requirement traceable to the preceding document? Is every requirement clear, concise, internally consistent, unambiguous, and testable/demonstrable? • Prerequisites • The preceding document exists and is accepted, the SRS has been internally reviewed, and a checklist is available.
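
The traceability objective above lends itself to a simple automated pre-check before the meeting. The following Python sketch is a hypothetical illustration; the document IDs and data layout are invented, not part of any standard:

    # Hypothetical traceability check: every SRS requirement must cite at
    # least one item from the preceding (system-level) document.
    system_spec_ids = {"SYS-1", "SYS-2", "SYS-3"}
    srs_trace = {
        "SRS-10": {"SYS-1"},
        "SRS-11": {"SYS-2", "SYS-3"},
        "SRS-12": set(),  # cites nothing: flagged by the check below
    }

    untraceable = [req for req, parents in srs_trace.items()
                   if not parents & system_spec_ids]
    print("Untraceable requirements:", untraceable)  # ['SRS-12']

A check like this does not replace the inspection; it simply ensures the meeting spends its limited time on judgment calls rather than bookkeeping.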

  31. Requirements Inspections (2) • Planning phase. • Diversity of inspectors' backgrounds; include clients (if possible). • Identification of the SRS and other documentation. • Preparation phase. • Self-study, inspector-producer discussions. • Inspection meeting. • Checking preparedness, discussing discrepancies. • Follow-up phase.

  32. Design Inspection • Objectives • SRS-SDD compliance, traceability, design conformance to standards. • Prerequisites • SRS has been inspected and completed, SDD internally reviewed, availability of checklists, availability of design documentation (CASE tools).

  33. Design Inspection (2) • Planning phase. • Inspectors' backgrounds include software engineers, QA, and hardware engineers. • Identification of the SRS, SDD, and other documentation. • Overview meeting phase. • Producer's presentation; inspectors ask questions. • Preparation phase. • Inspection meeting. • Checking preparedness, discussing discrepancies, going through checklists. • Follow-up phase.

  34. Code Inspections • Objectives and Prerequisites • SDD inspected and accepted. • Compiled code. • 50-100 lines of C code per hour of preparation. • 100-200 lines per hour of inspection.
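
The preparation and inspection rates above imply a simple effort estimate per participant. This Python sketch applies them to a hypothetical 2,000-line C module (the module size is an assumption for illustration):

    def inspection_effort_hours(loc, prep_rate=(50, 100), insp_rate=(100, 200)):
        """Return (min, max) person-hours using the slide's rates:
        50-100 LOC/hour of preparation, 100-200 LOC/hour of inspection."""
        prep_min, prep_max = loc / prep_rate[1], loc / prep_rate[0]
        insp_min, insp_max = loc / insp_rate[1], loc / insp_rate[0]
        return (prep_min + insp_min, prep_max + insp_max)

    # A hypothetical 2,000-line module: 30-60 hours per participant,
    # before any follow-up work.
    print(inspection_effort_hours(2000))  # (30.0, 60.0)

Estimates like this make the resource commitment visible to managers up front.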

  35. Test Script Inspections • Objectives: • Accurate validation of requirements in SRS, taking advantage of known design decisions. • Prerequisites: • Internally reviewed and executed tests/scripts, checklists, acceptable test results.

  36. Standardized forms • The inspection problem report form. • The inspection process summary form. • Checklists. • See examples in Appendices C and D [S. Rakitin].

  37. Agenda • The big picture. • Inspection Process. • Applying Inspection Process. • Utilizing Orthogonal Defect Classification. • Inspection Checklists.

  38. Principles of ODC • A technique that bridges the gap between quantitative and qualitative methods. • Extracts semantic information from defects via classification. • The quantitative progression of defect counts through a project lifecycle is shown next.

  39. Defect-Type Attribute • Function: Affects significant capability, user features, APIs. • Requires formal design change. • Assignment: Initializations, etc. • Interface: Errors in inter-component interactions. • Checking: Program logic that fails to properly validate data, loop conditions…

  40. Defect-Type Attribute (2) • Timing/Serialization: Shared and real-time resources. • Build/Package/Merge: Library systems, version control. • Documentation: Errors in publications and maintenance notes. • Algorithm: Efficiency or correctness problems. • Requires reimplementing an algorithm or a data structure.
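
To make the Checking type concrete, here is a hypothetical Python example of program logic that fails to properly validate data, together with the corrected version; the function and its failure mode are invented for illustration:

    def average(values):
        # Checking defect: the input is never validated, so an empty
        # sequence raises ZeroDivisionError instead of a controlled error.
        return sum(values) / len(values)

    def average_checked(values):
        # Fix: validate the data before use, per the Checking category.
        if not values:
            raise ValueError("average requires a non-empty sequence")
        return sum(values) / len(values)

Classifying this fix as Checking (rather than, say, Algorithm) is exactly the semantic signal ODC extracts from each defect.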

  41. Defect Trigger Attribute • The activity that facilitates defect discovery. • System test triggers. • Recovery and exception handling, workload and stress, HW and SW configurations, etc. • Function test triggers. • Test coverage, test sequencing, test interaction (coverage is covered later in this class).

  42. Review and Inspection Triggers • Backward release compatibility. • Lateral compatibility. • Documentation within the same release. • Design conformance. • Concurrency. • Operational semantics. • Understanding the logic flow. • Document consistency and completeness. • Rare situations. • Extensive experience of an inspector.
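
Slides 39-42 together suggest recording two attributes per defect: its type and the trigger that exposed it. The Python sketch below is one hypothetical way to capture such records for later trend analysis; the field and enum names are assumptions, not a standard ODC schema:

    from dataclasses import dataclass
    from enum import Enum

    class DefectType(Enum):
        FUNCTION = "function"
        ASSIGNMENT = "assignment"
        INTERFACE = "interface"
        CHECKING = "checking"
        TIMING = "timing/serialization"
        BUILD = "build/package/merge"
        DOCUMENTATION = "documentation"
        ALGORITHM = "algorithm"

    class Trigger(Enum):
        BACKWARD_COMPATIBILITY = "backward release compatibility"
        LATERAL_COMPATIBILITY = "lateral compatibility"
        DESIGN_CONFORMANCE = "design conformance"
        CONCURRENCY = "concurrency"
        OPERATIONAL_SEMANTICS = "operational semantics"
        LOGIC_FLOW = "understanding the logic flow"
        DOCUMENT_CONSISTENCY = "document consistency and completeness"
        RARE_SITUATION = "rare situation"

    @dataclass
    class DefectRecord:
        """One ODC-classified defect found during review or inspection."""
        summary: str
        defect_type: DefectType
        trigger: Trigger

    # Hypothetical record: a missing data check found while tracing logic flow.
    record = DefectRecord("no empty-input check in average()",
                          DefectType.CHECKING, Trigger.LOGIC_FLOW)

Aggregating such records over a project yields the quantitative defect-count progressions that ODC analyzes.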

  43. Agenda • The big picture. • Inspection Process. • Applying Inspection Process. • Utilizing Orthogonal Defect Classification. • Inspection Checklists: An experiment.
