
What Do You Mean It Doesn’t Do What We Thought?

Presentation Transcript


  1. What Do You Mean It Doesn’t Do What We Thought? Validating a Design

  2. Agenda – Design Validation
  • Concepts and implications
  • Specification mitigations
  • Design mitigations
  • Test set mitigations
  • Summary

  3. Concepts
  • Validation – Confirmation, through the provision of objective evidence, that the requirements for a specified intended use or application have been fulfilled

  4. Implications
  • The issues relating to validation encompass those of verification
  • Additional concerns with validation
    • Caused by the need to match the application to the product
    • The application has been translated to a specification through the requirements process
    • The requirements process is by nature imperfect
    • Sometimes the specification does not satisfy the needs of the application
  • Result – a verified product might be invalid
    • May require significant rework to the product
    • May require accepting reduced functionality (waiver)
  • A goal of the development process is to minimize validation failures
    • Begins in review of the requirements process (hopefully the primary point)
    • Mitigate by design activities
    • Reduce by robust test set design

  5. The Implication Illustrated
  • Alice electronics (detector component and C&DH component)
  • Joint requirement: process > 10 k sustained events per second
  • Individual requirements defined for detector and C&DH processing
  • Both met their individual processing requirements
  • When combined, only 6–7 k sustained events per second
  • Verification of individual units led to an invalid system
  • What went wrong?
    • The overall requirement was not broken down correctly
    • The C&DH and DE test sets were not high fidelity – they verified functionality, not performance
  • We got lucky that a waiver was acceptable
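The numbers in the sketch below are illustrative only (the slides do not give the actual Alice timing). They simply show how a detector and a C&DH that each satisfy "> 10 k events per second" in isolation can combine to roughly 6–7 k when their per-event activities are only partly overlapped: exactly the performance shortfall that a functionality-only test set will not expose.

```python
# Illustrative numbers only (not the actual Alice timing).  Each unit alone
# meets "> 10 k events/s", but when the detector transfer and the C&DH
# processing are only partly overlapped, part of the per-event time adds up
# and the combined sustained rate falls short of the joint requirement.

det_time_us  = 95.0   # assumed detector time per event  -> ~10.5 k events/s alone
cdh_time_us  = 90.0   # assumed C&DH time per event      -> ~11.1 k events/s alone
overlap_frac = 0.45   # assumed fraction of the two activities that overlaps

combined_time_us = det_time_us + cdh_time_us - overlap_frac * min(det_time_us, cdh_time_us)

print(f"detector alone : {1e6 / det_time_us:8.0f} events/s")
print(f"C&DH alone     : {1e6 / cdh_time_us:8.0f} events/s")
print(f"combined       : {1e6 / combined_time_us:8.0f} events/s")  # ~6.9 k, below 10 k
```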

  6. Specification Mitigation
  • Only list requirements, not desirements
  • State unambiguous performance requirements
  • Build in adequate margin
    • Not open-ended enhancement, but
    • Enough to accommodate performance shortfalls
  • Ruthlessly remove TBDs
  • Insist on definite test methods for mitigation
  • Remember – unless application needs can be unambiguously specified, they won’t be met!
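As a concrete illustration of "state unambiguous performance requirements", "build in adequate margin", and "insist on definite test methods", the sketch below turns the 10 k events per second need into a spec value, a design target with margin, and a pass/fail check. The 30 % margin policy and the function name are assumptions for illustration, not from the presentation.

```python
# Sketch of an unambiguous, testable performance requirement with margin.
# The 10 k events/s figure is the joint requirement from the Alice example;
# the 30 % margin policy is an assumption, not from the slides.

REQUIRED_RATE = 10_000                    # events/s, sustained (the spec value)
DESIGN_MARGIN = 0.30                      # assumed margin against performance shortfalls

design_target = REQUIRED_RATE * (1.0 + DESIGN_MARGIN)

def requirement_met(measured_rate: float) -> bool:
    """Definite test method: measured sustained rate compared against the spec value."""
    return measured_rate >= REQUIRED_RATE

print(f"spec value    : {REQUIRED_RATE} events/s sustained")
print(f"design target : {design_target:.0f} events/s (includes {DESIGN_MARGIN:.0%} margin)")
print(requirement_met(6_500))             # False: the combined Alice rate would have failed here
```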

  7. Design Mitigation
  • Implement the specification exactly
  • Isolate the various sub-sections
    • Minimizes “corner cases” and negative interactions
    • Allows correction with minimal impact when things don’t work right
  • Verify complex functions early, thoroughly, and completely
    • Allows an early look at potential problems
    • Analysis / simulation / what-ifs should be as realistic as possible
  • Insist on end-user review of the implementation
    • Allows the user community to comment
    • Minimizes misunderstandings upon delivery
  • Develop test plans that have high fidelity to the end application
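A minimal sketch of "verify complex functions early, thoroughly, and completely": a hypothetical coarse/fine time-tag decoder (the field layout is invented purely for illustration) is exercised in isolation at its corner cases before it disappears inside the integrated design.

```python
# A hypothetical "complex function" (a 32-bit coarse/fine time-tag decoder with
# an invented field layout), verified in isolation at its corner cases before
# it is integrated into the rest of the C&DH chain.

def decode_time_tag(raw: int) -> float:
    """Convert a packed coarse/fine time tag into seconds (illustrative layout)."""
    coarse = (raw >> 16) & 0xFFFF      # upper 16 bits: whole seconds
    fine = raw & 0xFFFF                # lower 16 bits: 1/65536-second ticks
    return coarse + fine / 65536.0

def test_decode_time_tag_boundaries():
    # Early, isolated checks at corner cases that integrated testing rarely hits.
    assert decode_time_tag(0x0000_0000) == 0.0
    assert decode_time_tag(0x0001_0000) == 1.0
    assert abs(decode_time_tag(0xFFFF_FFFF) - (65535 + 65535 / 65536.0)) < 1e-9

if __name__ == "__main__":
    test_decode_time_tag_boundaries()
    print("time-tag decoder verified in isolation")
```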

  8. Test Set Mitigation
  • Ensure interfaces are maximally flight-like
    • Precludes misunderstandings of characteristics
    • Provides early indication of problems
    • Don’t emulate only one characteristic of an interface
  • Make the test set reasonably sophisticated
    • Sufficient complexity to reproduce operational timing
    • Adequate functionality for stress testing
    • Run all interfaces at maximum speed with margin
  • Don’t let the same group build the tested unit (design) and the unit tester (test bench)
    • Identical assumptions might go into both ends of an interface
    • Faithful reproduction depends on familiarity (if possible, the test bench should be provided by the end user)
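One way to read "run all interfaces at maximum speed with margin" is sketched below: a hypothetical test-set driver pushes events at the required rate plus an assumed 20 % margin for a sustained interval and checks that the unit under test kept up. The names send_event and events_accepted stand in for whatever the real test-bench interface provides.

```python
# Hypothetical stress driver: push events at the required rate plus margin for a
# sustained interval, then check that the unit under test kept up.  send_event()
# and events_accepted() are placeholders for the real test-bench interface.
import time

REQUIRED_RATE = 10_000      # events/s, the joint requirement
TEST_MARGIN = 0.20          # assumed stress margin above the requirement
DURATION_S = 10.0           # sustained interval, not a burst

def stress_interface(send_event, events_accepted) -> bool:
    period = 1.0 / (REQUIRED_RATE * (1.0 + TEST_MARGIN))
    sent = 0
    start = time.monotonic()
    while time.monotonic() - start < DURATION_S:
        send_event()                                  # drive the flight-like interface
        sent += 1
        time.sleep(max(0.0, start + sent * period - time.monotonic()))
    rate = events_accepted() / DURATION_S
    print(f"sent {sent} events, unit sustained {rate:.0f} events/s")
    return rate >= REQUIRED_RATE
```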

  9. Test Set Mitigation (cont.)
  • Make the control interface as application-like as possible
    • Forces correct command structures / types
    • Allows all test scripts to be reproduced at higher levels
  • If at all possible, incorporate early interface tests with real engineering hardware
  • Keep the test (or simulation) environment unless the flight system changes
    • Don’t change test equipment hardware configurations
    • Apples-to-apples comparisons during tests are vital
    • Ensure that flight changes are reflected in the test set design
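The sketch below illustrates an "application-like" control interface: test scripts build commands with the same structure the operational ground system would use, so the same scripts can be replayed at higher levels of integration. The APID, opcodes, and field layout are hypothetical.

```python
# Hypothetical command builder: the test scripts pack commands exactly the way
# the operational ground system would, so the same scripts survive to higher
# levels of integration.  APIDs, opcodes, and field layout are invented here.
import struct

def build_command(apid: int, opcode: int, payload: bytes = b"") -> bytes:
    """Pack a command using the application's structure, not a bench-only shortcut."""
    header = struct.pack(">HHB", apid, opcode, len(payload))
    return header + payload

# A test script then exercises the real command structures / types:
enable_detector = build_command(apid=0x1A0, opcode=0x03)
set_threshold = build_command(apid=0x1A0, opcode=0x10, payload=struct.pack(">H", 512))
```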

  10. Test Set Mitigation (cont.)
  • Use the same controls for test set development as for flight unit development
    • Configuration management
    • Software development
    • Peer reviews
  • Build in diagnostics so that anomalies can be traced to the test equipment or to the unit under test
  • Ensure that test results mean something
    • Clear pass / fail criteria
    • Allowable flight parameter variations included
    • Reasonable displays (with significant information clearly shown)
  • Ensure that the test set accommodates calibration
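A minimal sketch of making test results "mean something": explicit pass/fail criteria with allowable parameter variations, printed so the margin against each limit is visible next to the measurement. The parameter names and limits are placeholders, not values from the presentation.

```python
# Hypothetical pass/fail evaluation: each parameter has a nominal value and an
# allowable variation, and the report shows the limit next to the measurement.
# Names and limits are placeholders, not flight values.

LIMITS = {
    # parameter        (nominal, allowable variation)
    "bus_voltage_V":    (28.0,  2.0),
    "lvps_current_mA":  (450.0, 50.0),
    "det_temp_C":       (-20.0, 5.0),
}

def evaluate(measurements: dict) -> bool:
    all_pass = True
    for name, (nominal, tol) in LIMITS.items():
        value = measurements[name]
        ok = abs(value - nominal) <= tol
        all_pass &= ok
        print(f"{name:16s} {value:8.2f}   limit {nominal}±{tol}   {'PASS' if ok else 'FAIL'}")
    return all_pass

evaluate({"bus_voltage_V": 27.6, "lvps_current_mA": 438.0, "det_temp_C": -23.0})
```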

  11. Summary
  • Successful verification does not always guarantee successful validation
  • Techniques can be incorporated that improve the likelihood that validation will succeed
    • Careful specification development
    • Thorough and cautious design techniques
    • Extensive test set fidelity to flight requirements
  • Effective techniques for validation are extra effort
    • More time consuming
    • More expensive
    • But definitely worth it
