
How to be Confident that a Tool Works and How to Document it Scientifically


Presentation Transcript


  1. How to be Confident that a Tool Works and How to Document it Scientifically Dr. James R. Lyle National Institute of Standards and Technology

  2. Plan of Talk • The Problem • Overview of V&V • Overview of Conformance Testing • An Example: CFTT

  3. The Problem • A forensics lab might use several software tools • How does the lab establish that the tools work correctly?

  4. Some Possible Approaches • Ad Hoc Testing • Formal Verification & Validation • Conformance Testing

  5. Ad Hoc Testing is not Enough • Just running a few test cases is not enough • Test case selection must be justified • Test environment and procedures must be documented • Test results must be documented

  6. Goals of V&V V&V is a process that provides an objective assessment of products and processes throughout the software life cycle: • Correct • Complete • Consistent • Testable

  7. Definitions of V & V (informal) • Verification: Does the software solve the problem correctly? • Validation: Does the software solve the correct problem?

  8. Definitions of Verification • The process of evaluating a system to determine whether the products of a given development phase satisfy the conditions imposed at the start of the phase. IEEE 610 • Confirmation by examination and provision of objective evidence that the specified requirements have been fulfilled. ISO 8402:1994/IEEE 1012

  9. Definitions of Validation • The process of evaluating a system during or at the end of the development process to determine whether it satisfies specified requirements. IEEE 610 • Confirmation by examination and provision of objective evidence that the particular requirements for a specific intended use are fulfilled. ISO 8402:1994/IEEE 1012

  10. V&V Processes in a Project • Management • Acquisition • Supply • Development • Operation • Maintenance

  11. Development V&V Activities • Concept – delineation of a solution • Requirements – what should be done • Design – how to do it • Implementation – create the code • Test – try it out (systematically) • Installation & checkout – deploy

  12. Wait a sec – what do we want? • We want to assess a COTS tool for a computer forensics lab • We don’t need: concept, design or implementation • We just want to do testing • But to test we need requirements • Should look at V&V of requirements and tests

  13. Requirements V&V Tasks • Traceability Analysis • Software Requirements Evaluation • Interface Analysis • Criticality Analysis • System V&V Test Plan • Acceptance V&V Test Plan • Configuration Management Assessment • Hazard Analysis • Risk Analysis

  14. Requirements Evaluation • Correctness • Consistency • Completeness • Accuracy • Readability • Testability

  15. Requirements Correctness • Requirements within constraints and assumptions of the system • Requirements comply with standards, regulations, policies, physical laws • Validate data usage and format

  16. Requirements Consistency • Terms and concepts are consistent • Function interactions and assumptions are consistent • Internal consistency

  17. Requirements Completeness • All functions specified • Interfaces: hardware, software & user • Performance criteria • Configuration data

  18. Requirements Accuracy • Precision (e.g., truncation & rounding) • Modeled physical phenomena conform to system accuracy and physical laws

  19. Requirements Readability • Legible, understandable and unambiguous to the target audience • All acronyms, mnemonics, abbreviations, terms and symbols are defined

  20. Requirements Testability • There must be objective acceptance criteria for validating the requirements

  21. Testing V&V Tasks • Traceability Analysis • Test Procedure Generation • Test Case Execution & Report • Hazard Analysis • Risk Analysis

  22. Test Case Traceability • Each test case must be derived from one or more requirements • Constructing a requirements to test cases matrix is helpful
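
A minimal sketch of such a matrix kept as data, so the coverage check itself can be automated; the requirement and test case identifiers are hypothetical:

# Requirements-to-test-cases traceability check (illustrative IDs only).
requirements = {"RM-01", "RM-02", "RM-03"}

# Each test case records the requirement(s) it is derived from.
trace = {
    "TC-01": {"RM-01"},
    "TC-02": {"RM-01", "RM-02"},
}

covered = set().union(*trace.values())
print("Requirements with no test case:", sorted(requirements - covered))
print("Test cases tracing to unknown requirements:", sorted(covered - requirements))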

  23. Test Case Procedure Generation • Each test case must have a documented procedure for test execution

  24. Test Case Execution • Run each test case • Document results • Anomaly identification and documentation procedure
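
A minimal sketch of documented test execution, assuming a hypothetical test case whose procedure returns a value to compare against the documented expected result:

import datetime

def run_case(case_id, procedure, expected):
    # Execute the documented procedure and record the outcome.
    actual = procedure()
    record = {
        "case": case_id,
        "when": datetime.datetime.now().isoformat(),
        "expected": expected,
        "actual": actual,
        "status": "pass" if actual == expected else "anomaly",
    }
    if record["status"] == "anomaly":
        record["note"] = "handle per the documented anomaly identification procedure"
    return record

print(run_case("TC-01", lambda: 512, expected=512))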

  25. Hazard Analysis • Hazard: a source of potential harm or damage (to people, property or the environment) • Hazard Analysis: identification and characterization of hazards

  26. Risk Analysis • Risk: combination of the frequency (or probability) and the consequence of a given hazard. • Risk Analysis: systematic use of available information to identify hazards and to estimate the risk to individuals, populations, property or the environment.

  27. V & V of COTS Summary • Create set of tool Requirements • Review • Create Test Cases & Test Environment • Review • Execute Test Cases • Write Report

  28. What Is Conformance Testing? • Used to check an implementation against a standard or specification • ISO Guide 2 defines Conformance as the fulfillment by a product, process or service of specified requirements. • Requirements are specified in a standard or specification.

  29. Why Conformance Testing?

  30. Components for Testing • Specification (less formal) or standard (ISO, IEEE, ANSI) • Conformance Test Suite • What about validation and certification? • Need: Testing Lab, Certification Authority and dispute resolution

  31. (Requirements) Specification • Tool User Manual? No – too specific • Spec should cover core functionality required for correct operation of similar tools. • Should apply to most similar tools • OK if some features omitted – cover omitted features in another spec • Use V&V techniques to review spec

  32. Conformance Test Suite • Test cases & test case documentation • Derived from the specification (using V&V techniques: traceable, complete, consistent, etc.) • Description of test purpose • Pass/fail criteria • Trace back to requirements in spec

  33. Testing Methodology • Source code is not available (a good thing) • Apply black box testing theory to create tests • Test both legal and illegal inputs • Tools may provide optional features – some test cases may be executed only if a tool provides an optional feature
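
A minimal sketch of how optional-feature cases can be gated on the tool under test; the feature flag, case identifiers, and input names are all hypothetical:

# Hypothetical capabilities of the tool under test.
tool_features = {"supports_ntfs": False}

cases = [
    {"id": "TC-01", "input": "fat32_image",     "legal": True,  "requires": None},
    {"id": "TC-02", "input": "truncated_image", "legal": False, "requires": None},
    {"id": "TC-03", "input": "ntfs_image",      "legal": True,  "requires": "supports_ntfs"},
]

for case in cases:
    feature = case["requires"]
    if feature and not tool_features.get(feature, False):
        print(case["id"], "skipped: optional feature not provided by the tool")
        continue
    # A legal input should be processed correctly; an illegal input should be rejected cleanly.
    print(case["id"], "run against", case["input"])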

  34. Who Runs the Tests? • Testing Lab, Vendor, Tool User • Lab may be accredited or just recognized as qualified to run the tests • Lab produces a test report

  35. The Test Report • Detail pass/fail for each test case • Complete description of Tool Under Test • Name of test lab • Date of test • Name and Version of test suite • Unambiguous statement of results
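
A minimal sketch of a report record carrying these fields, with placeholder values; the field names are illustrative, not a prescribed report format:

from dataclasses import dataclass, field

@dataclass
class TestReport:
    tool_name: str
    tool_version: str
    test_lab: str
    test_date: str
    suite_name: str
    suite_version: str
    results: dict = field(default_factory=dict)   # test case id -> "pass" / "fail"

    def statement(self):
        failed = sorted(c for c, r in self.results.items() if r != "pass")
        return "all test cases passed" if not failed else f"failed cases: {failed}"

report = TestReport("ExampleTool", "1.0", "Example Test Lab", "2004-06-01",
                    "Example Test Suite", "1.0", {"TC-01": "pass", "TC-02": "fail"})
print(report.statement())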

  36. Certification Authority • Reviews (validates) test results • Explicit criteria for issuing a certificate • Issues certificate (or brand) for validated product • Another definition of Validation: process necessary to perform conformance testing in accordance with a prescribed procedure and official test suite. • Certification: acknowledgement that a validation was completed and the criteria established for issuing certificates were met

  37. NIST CFTT Products • Forensics tool function specifications • Forensics tool function test methodologies • Test support software • Forensics tool function test reports

  38. Specification Development • CFTT sponsors identify tool function • Focus group (law enforcement & NIST) to identify requirements • NIST drafts specification for external review • NIST develops test methodology, test harness

  39. Tool Testing Process • Sponsors identify tool to test • Write test plan (identify test cases to run) • Run Tests • Write Test Report

  40. Examples From CFTT • Writing Requirements • Support Software Documentation • Test Cases

  41. Outline of a Specification • Introduction • Scope • Technical Background • Requirements: what should the tool do • Test Assertions: If/then statements that are testable • Test Methodology • Test Cases: combinations of test assertions • Traceability Matrices

  42. SW Write Blocker Requirements • Informal requirement: No change allowed to a drive that contains evidence • Also: Must be able to read the entire drive • Formal: (1) The tool shall block any commands to a protected disk in the write, configuration, or miscellaneous categories. • (2) The tool shall not block any commands to a protected disk in the read, control or information categories
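
A minimal sketch of requirements (1) and (2) as a single decision rule over command categories; the categories are from the slide, the function and its use are illustrative:

BLOCK_CATEGORIES = {"write", "configuration", "miscellaneous"}   # requirement (1)
ALLOW_CATEGORIES = {"read", "control", "information"}            # requirement (2)

def must_block(category, disk_protected):
    # An unprotected disk is never blocked; see requirement (3) on the completeness slide.
    if not disk_protected:
        return False
    return category in BLOCK_CATEGORIES

print(must_block("write", disk_protected=True))   # True  -> command is blocked
print(must_block("read",  disk_protected=True))   # False -> command is allowed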

  43. Apply V&V to Spec for … • Correctness • Consistency • Completeness • Accuracy • Readability • Testability

  44. Requirements Completeness • Completeness: e.g., look at combinations of parameters: read/write command vs. protected or unprotected disk • The combinations expose a missing requirement: (3) The tool shall not block any commands to an unprotected disk.
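
A minimal sketch of the completeness check itself: enumerate every combination of command category and protection state and confirm each one is covered by some requirement:

from itertools import product

categories = ["read", "control", "information", "write", "configuration", "miscellaneous"]

def covering_requirement(category, protected):
    if not protected:
        return "(3) do not block (unprotected disk)"
    if category in {"write", "configuration", "miscellaneous"}:
        return "(1) block"
    return "(2) do not block"

for category, protected in product(categories, [True, False]):
    print(f"{category:14} protected={protected!s:5} -> {covering_requirement(category, protected)}")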

  45. Requirements Testability • SWB test methodology: replace the BIOS INT 13h routine with one that counts the number of times each I/O function is called, but does not execute any commands. • Any blocked commands have a count of 0
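
The real methodology swaps in a replacement BIOS interrupt handler; the following is only a language-neutral sketch of the counting idea, not the CFTT harness:

from collections import Counter

class CountingInterceptor:
    # Stand-in for the replaced INT 13h routine: counts calls, executes nothing.
    def __init__(self):
        self.calls = Counter()

    def handle(self, function_code):
        self.calls[function_code] += 1   # record the request; never touch the disk

interceptor = CountingInterceptor()
for fn in (0x02, 0x02, 0x02):            # hypothetical run: only reads get through the blocker
    interceptor.handle(fn)

# Verdict: every function code the blocker should block must have a count of 0.
assert interceptor.calls[0x03] == 0, "a write command reached the disk"
print(dict(interceptor.calls))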

  46. A Testable Requirement? • Documentation shall be correct insofar as the mandatory and any implemented optional requirements are concerned.

  47. Evolution of an Imaging Req • If a source disk is imaged to a destination disk of the same geometry then the disks compare equal. • If a duplicate copy is created directly from a source disk of the same geometry, then the disks must compare equal. • The tool shall create a bit-stream duplicate of the original. • If there are no errors accessing the source media, then the tool shall create a bit-stream duplicate of the original.

  48. Imaging Test Methodology • Initialize source to a known state • Compute SHA-1 of the source • Initialize destination to a known state • Run the tool • Compare source to destination • Rehash the source
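
A minimal sketch of the hashing and comparison steps, using ordinary files in place of disks; the file names are placeholders and hashlib's SHA-1 stands in for the disk-hashing tool:

import hashlib

def sha1_of(path):
    # Hash in chunks, the way a drive is hashed sector by sector.
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

src_hash_before = sha1_of("source.img")          # hash source before imaging
# ... run the imaging tool: source.img -> destination.img ...
assert sha1_of("destination.img") == src_hash_before, "destination differs from source"
assert sha1_of("source.img") == src_hash_before, "source was modified during imaging"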

  49. Test Case Traceability • Every test case must be traceable and unambiguous. • This case is neither: TEST CASE: DI-167 Create an image from a BIOS-IDE source disk to a BIOS-IDE destination disk and the source contains a FAT32 partition where the source disk is smaller than the destination and where the source disk contains a deleted partition EXPECTED RESULTS: src compares qualified equal to dst deleted-partition is recovered

  50. Test Logging • Log everything, automatically if practical • Hardware, Software, Versions • Time/date • Operator
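
A minimal sketch of capturing environment details automatically at the start of each test run; the fields shown are examples, not the CFTT log format:

import datetime, getpass, json, platform

log_entry = {
    "timestamp": datetime.datetime.now().isoformat(),  # time/date
    "operator": getpass.getuser(),                     # operator
    "host": platform.node(),                           # hardware identifier
    "os": platform.platform(),                         # operating system and version
    "harness_version": platform.python_version(),      # example of recording software versions
}
print(json.dumps(log_entry, indent=2))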
