
PATFrame Workshop #2 Tool Concepts



  1. PATFrame Workshop #2 Tool Concepts
  Dan Ligett, Softstar Systems
  Ligett@SoftstarSystems.com
  (603) 672-0987
  www.SoftstarSystems.com

  2. Decision Support System
  • Risk tradeoff tool (a la Madachy/Valerdi for SE)
  • Testing process tool
  • Management process tool
  • Leading Indicators tool
  • Cost/Schedule/Effort/Risk/Quality tradeoff tool
  • Game, Training tool, Simulation
  • Risk Identification
  • Risk Mitigation
  • Tell testers when they're done
  • Real Options
  • Give advice on split to Live/Virtual/Constructive
  It's a tool to help you win arguments.

  3. Requirements Elicitation

  4. (~) Mix & Match -- Technology vs. Question

  5. PATFrame Dimensions Can imagine thousands of different tools. For example: Real Options can be applied to the SUT or at the Test Infrastructure level.
  1) The question to answer
  2) The technology to apply
  3) SUT or Infrastructure
  4) OT or DT
  5) Now vs. 15/25 years from now
  6) Army/Navy/AF
  7) Program of Record vs. Rapid Acquisition
  8) Space/Air/Sea/Land/Undersea
  9) Degree of autonomy
  10) Degree of netcentricity
  11) Manned vs. unmanned
  12) Complexity of SUT
  13) Tester vs. Evaluator
  I don't claim that these dimensions are perfectly orthogonal.

  6. Need Feedback!
  • Which of these might help you?
  • Which are too hard?
  • Which make no sense for you?
  • What have we missed?
  The DSS will be driven by MIT / UTA / USC research AND the feedback we get on these tool concepts.

  7. Tool Concept #1 • Question: When am I done testing? • Technology: Defect estimation model; trade quality for delivery schedule • Inputs: Defects discovered • Outputs: Defects remaining, cost to quit, cost to continue
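
  To make the concept concrete, here is a minimal sketch of such a tool, assuming a Goel-Okumoto exponential reliability-growth model and invented weekly defect counts; the slide specifies only the inputs and outputs, not this particular model or fitting method.

```python
# Minimal "when am I done testing?" sketch, assuming a Goel-Okumoto
# exponential reliability-growth model; PATFrame's actual defect model
# may differ, and the weekly counts below are illustrative.
import math

def goel_okumoto(t, a, b):
    """Expected cumulative defects found by time t: a * (1 - exp(-b*t))."""
    return a * (1.0 - math.exp(-b * t))

def fit_by_grid(times, counts):
    """Crude grid-search least-squares fit; a real tool would use MLE."""
    best = None
    for a in range(max(counts), 5 * max(counts)):
        for b in [i / 1000.0 for i in range(1, 500)]:
            err = sum((goel_okumoto(t, a, b) - c) ** 2
                      for t, c in zip(times, counts))
            if best is None or err < best[0]:
                best = (err, a, b)
    return best[1], best[2]

# Weekly cumulative defect counts (invented data, not from the slides).
weeks = [1, 2, 3, 4, 5, 6]
found = [12, 21, 27, 31, 34, 36]

a, b = fit_by_grid(weeks, found)
print(f"Estimated total defects: {a}, remaining: {a - found[-1]}")
```

  The cost-to-quit vs. cost-to-continue output would combine the remaining-defect estimate with per-defect field-failure and test-week costs; that step is omitted here.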

  8. Tool 1, Data Input

  9. Tool 1, Trends

  10. Tool 1, Analysis

  11. Tool Concept #2 • Question(s): • Should I invest $$$ in infrastructure? • (What should I test?) • Technology: • Real Options • Inputs: • Cost of each investment, decisions & risks • Outputs: • Value of the investment, value of flexibility
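
  As one illustration of the Real Options technology named above, here is a hedged sketch using a standard two-step binomial lattice; the project values, probabilities, and costs are invented, and discounting is ignored for simplicity.

```python
# Real-options sketch for a test-infrastructure investment using a
# binomial lattice; the slide names the technique, not this particular
# model, and all numbers below are assumptions.

def binomial_option_value(v0, up, down, p, cost, steps):
    """Value of the option to invest `cost` after `steps` periods,
    given a lattice of project values starting at v0 (no discounting)."""
    # Terminal project values after k up-moves and (steps - k) down-moves.
    values = [v0 * (up ** k) * (down ** (steps - k)) for k in range(steps + 1)]
    # Option payoff at expiry: invest only if value exceeds cost.
    payoffs = [max(v - cost, 0.0) for v in values]
    # Roll back through the lattice with up-move probability p.
    for _ in range(steps):
        payoffs = [p * payoffs[k + 1] + (1 - p) * payoffs[k]
                   for k in range(len(payoffs) - 1)]
    return payoffs[0]

# Illustrative inputs (assumptions, not PATFrame data), in $M.
option = binomial_option_value(v0=10.0, up=1.4, down=0.7, p=0.5,
                               cost=9.0, steps=2)
npv_now = 10.0 - 9.0   # invest immediately
print(f"Invest now: {npv_now:.2f}, wait (option value): {option:.2f}")
print(f"Value of flexibility: {option - npv_now:.2f}")
```

  The gap between the option value and the invest-now NPV is the "value of flexibility" output the slide lists.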

  12. Tool 2, Data Input

  13. Tool 2, Data Input

  14. Tool 2, Analysis

  15. Tool Concept #3 • Question: • Will XXX improve my testing process? • Technology: • System Dynamics simulation • Inputs: • Description of process • Outputs: • Prediction of progress/quality/testing rate
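
  A minimal stock-and-flow sketch of the kind of System Dynamics simulation the slide names; the stocks, flows, and constants below are illustrative assumptions, not the PATFrame model.

```python
# Toy stock-and-flow model of a testing process with a rework loop,
# integrated by simple Euler steps; all rates are invented.
untested  = 500.0   # stock: test cases not yet run
defective = 0.0     # stock: defects awaiting rework
completed = 0.0     # stock: test cases passed

test_rate    = 25.0   # cases run per week
defect_ratio = 0.2    # fraction of runs that find a defect
rework_rate  = 0.5    # fraction of open defects fixed per week
dt           = 1.0    # time step: one week

for week in range(1, 41):
    run   = min(test_rate, untested / dt)   # flow: tests executed
    fail  = run * defect_ratio              # flow: defects found
    fixed = defective * rework_rate         # flow: defects reworked
    untested  += (fixed - run) * dt         # fixed items re-enter test
    defective += (fail - fixed) * dt
    completed += (run - fail) * dt
    if week % 10 == 0:
        print(f"week {week:2d}: untested={untested:6.1f} "
              f"open defects={defective:5.1f} passed={completed:6.1f}")
```

  Asking "will XXX improve my testing process?" then amounts to changing a rate or adding a loop and re-running the simulation.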

  16. Tool Concept #4 • Question: • Is my project healthy? • Technology: • Leading Indicators • Inputs: • Description of progress (available early) • Outputs: • Comparison to completed projects • Odds of success
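
  One plausible reading of the Leading Indicators concept, sketched below: compare an in-flight program's early metrics to a database of completed programs and report rough odds of success. The metrics, the history, and the nearest-neighbour vote are all assumptions for illustration.

```python
# Leading-indicators sketch: distance to historical completed programs,
# mapped to rough odds of success. The metric triples below (requirements
# volatility, fraction of test cases ready at an early milestone,
# staffing ratio) and outcomes are invented; a real tool would draw on a
# curated project database.
import math

history = [
    ((0.10, 0.80, 1.0), True),
    ((0.35, 0.40, 0.6), False),
    ((0.15, 0.70, 0.9), True),
    ((0.30, 0.50, 0.7), False),
    ((0.20, 0.75, 1.1), True),
]

def odds_of_success(metrics, k=3):
    """k-nearest-neighbour vote over historical programs."""
    dists = sorted((math.dist(metrics, m), ok) for m, ok in history)
    return sum(ok for _, ok in dists[:k]) / k

current = (0.18, 0.65, 0.95)   # this program's early indicators
print(f"Estimated odds of success: {odds_of_success(current):.0%}")
```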

  17. Tool 4, Trends

  18. Tool 4, Analysis

  19. Tool Concept #5 • Question: • When will I finish testing? • Can I trade X for Y? • Technology: • Parametric estimation model • Inputs: • Size drivers, cost drivers • Outputs: • Cost, Effort, Duration, odds of success
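
  A sketch of a parametric estimation model in the COCOMO II style (a natural fit given the Costar references on slide 45); the constants are the published COCOMO II.2000 nominal values, and the size and driver settings are illustrative.

```python
# COCOMO II-style parametric estimate: effort from size, scale factors,
# and cost drivers; duration from effort. A, B, C, D are the published
# COCOMO II.2000 nominal calibration; the inputs below are invented.
A, B, C, D = 2.94, 0.91, 3.67, 0.28

def estimate(ksloc, scale_factors, effort_multipliers):
    """Return (person-months, calendar months) for a given size."""
    e = B + 0.01 * sum(scale_factors)          # scale exponent
    pm = A * (ksloc ** e)
    for em in effort_multipliers:              # cost drivers
        pm *= em
    f = D + 0.2 * 0.01 * sum(scale_factors)    # schedule exponent
    return pm, C * (pm ** f)

# Illustrative inputs: 50 KSLOC, nominal-ish scale factors and drivers.
pm, months = estimate(50.0,
                      scale_factors=[3.72, 3.04, 4.24, 3.29, 4.68],
                      effort_multipliers=[1.10, 0.95, 1.00])
print(f"Effort: {pm:.0f} person-months, duration: {months:.1f} months")
```

  The effort and duration PDF/CDF outputs on the following slides would come from running such a model under Monte Carlo sampling of the drivers, which also yields the "odds of success" output.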

  20. Tool 5, Size Input

  21. Tool 5, Cost Driver Input

  22. Tool 5, Output per Phase

  23. Tool 5, Output per Activity

  24. Tool 5, Effort PDF

  25. Tool 5, Effort CDF

  26. Tool 5, Duration PDF

  27. Tool 5, Duration CDF

  28. Tool Concept #6 • Question: What can I do to mitigate risks? • Technology: Risk trade model (a la Madachy & Valerdi) • Inputs: Descriptions of system under test • Outputs: Risks prioritized; mitigation strategies
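
  The Madachy & Valerdi model cited on the next slides is not reproduced here; instead, a generic risk-exposure sketch (probability × impact, sorted, with a mitigation lookup) shows the shape such a tool might take. All risks and numbers are placeholders.

```python
# Generic risk-prioritization sketch in the spirit of a risk-trade model;
# the risks, probabilities, impacts, and mitigations are illustrative.
risks = [
    # (name, probability, impact 1-10, candidate mitigation)
    ("Immature autonomy test metrics", 0.6, 8, "Pilot metrics on a surrogate SUT"),
    ("Range schedule conflicts",       0.4, 6, "Reserve backup range windows"),
    ("C4ISR interface volatility",     0.5, 7, "Freeze interface spec early"),
    ("Instrumentation data overload",  0.3, 5, "Add onboard data filtering"),
]

# Risk exposure = probability x impact; report highest first.
for name, p, impact, fix in sorted(risks, key=lambda r: r[1] * r[2],
                                   reverse=True):
    print(f"{p * impact:4.1f}  {name:32s} -> {fix}")
```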

  29. Tool 6, Data Input Madachy & Valerdi, 24th COCOMO Forum, 11/4/2009

  30. Tool 6, Prioritized Risks Madachy & Valerdi, 24th COCOMO Forum, 11/4/2009

  31. Tool 6, Mitigation Guidance Madachy & Valerdi, 24th COCOMO Forum, 11/4/2009

  32. Tool Concept #N • Question: • If the test instrumentation software is reconfigured during the adaptation, will X run-time requirement be met? • Can the test system automatically adapt in appropriate ways to changing data rates from external C4ISR systems? • Are the probes in the test system sufficient to detect and evaluate autonomous behavior X in the system-under-test? • Technology: • Architectural modeling • Inputs: • Models of SUT & infrastructure • Outputs: • Simulation of testing architecture
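
  Architectural modeling tools are far richer than anything that fits here, but a toy capacity check conveys the flavor of the first two questions: given assumed external data rates and an assumed instrumentation pipeline service rate, test a run-time latency requirement with the M/M/1 mean-latency formula. Every number is an assumption.

```python
# Toy capacity check standing in for architectural modeling: does the
# reconfigured instrumentation pipeline meet a latency requirement as
# external C4ISR data rates change? Uses the M/M/1 mean time-in-system.
def mean_latency(arrival_rate, service_rate):
    """M/M/1 expected time in system, seconds; None if queue is unstable."""
    if arrival_rate >= service_rate:
        return None
    return 1.0 / (service_rate - arrival_rate)

requirement_s = 0.050   # assumed 50 ms run-time requirement
service_rate  = 400.0   # assumed messages/sec after reconfiguration

for rate in (100.0, 250.0, 390.0):   # candidate external data rates
    lat = mean_latency(rate, service_rate)
    if lat is None:
        print(f"{rate:5.0f} msg/s -> queue unstable (requirement missed)")
    elif lat <= requirement_s:
        print(f"{rate:5.0f} msg/s -> {lat * 1000:.1f} ms (meets requirement)")
    else:
        print(f"{rate:5.0f} msg/s -> {lat * 1000:.1f} ms (misses requirement)")
```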

  33. Remember, each of these generalizes… The "Question" is just an example; could look at micro or macro questions.
  • Trade-off tool: Schedule/Cost/Quality/Risk/Training/etc.
  • System Dynamics tool: Schedule/Defect Rates/Staffing Levels
  • Prediction tool…

  34. Backup Slides

  35. Other Candidate Questions…
  • My access to resource X ends next quarter. What work depends on that access?
  • What testing can I perform in parallel?
  • I can't analyze the test data in time to plan the next day's tests. What are my alternatives? Can I reschedule work? Can I take short cuts?
  • Can I use heuristics to guide me? Which ones?
  • Are Rev 12 of System X and Rev 314 of System Y compatible?
  • System X has changed. What do I need to retest?
  • Can I test Rev 12 of System X and Rev 314 of System Y simultaneously? How many of the tests are valid in this environment?
  • System X and System Y can't currently operate within 2 miles of each other because of radio interference. What useful testing can I perform under that constraint? What remedies have worked in the past? Historically, how much has this delayed a comparable program?

  36. Other Candidate Questions II…
  • After testing System X, we must wait 24 hours for the nerve gas to dissipate. What useful testing can we perform during those 24 hours?
  • Was Test #1234 performed with the prototype of System X, or the production version?
  • Are the results of Run #123 substantially the same as Run #122?
  • Test #1234 is failing. Trace that back, and tell me which requirements of which systems are not being met.
  • I have more than the usual number of contractors/systems. How will that impact risk/schedule? What can I do to mitigate risk/reduce schedule?
  • What are the odds that the SoS will work in the field?
  • Has anyone ever faced this situation before? What's worked before? What's failed before? Why?

  37. Other Candidate Questions III…
  • Is the test plan/case good enough?
  • Is my test plan/case in compliance with the standards?
  • Do I have the right people/tools/methodology?
  • Where should I put my resources?
  • Am I in compliance with XXX std?
  • How many test cases should I have?
  • What part of the program should I focus on next?
  • Have I reached the point of diminishing returns in my testing/planning?
  • How can I lower the risk?

  38. Other Candidate Questions IV…
  • I need to prioritize my testing. What are some safe & easy tests I can start with, to get my team warmed up?
  • I need to prioritize my testing. What tests should I perform to ensure that the SoS can meet its fundamental objectives?
  • I need to prioritize my testing. I want to do the riskiest parts first -- where should I start?
  • System X is not ready to be tested. What useful work can I do now without it?
  • Things aren't going well. Can your system dynamics model help me understand the problem? What if we tweak the process by doing X?
  • Your system dynamics model indicates that we can meet the schedule if we hire 10,000 people to analyze the test results. Can we try to model other solutions?
  • Everything has been going as planned. But now they need systems X, Y, and Z in Iraq right now, as is. What capabilities have been tested?

  39. Other Candidate Questions V…
  • Remember that SoS we tested last year? Well, now we need to certify it with the F-99 rather than the old F-35. Do we need to do anything more than edit the Word docs?
  • What test should I do next?
  • What part of my plan deviates the most from the normative framework?
  • My program is in progress. How does it compare to the norm?
  • Is this program/project/plan more risky than the norm? What are the odds of success?
  • Do we have the right people?
  • What is the most similar program performed in the last N years? How long did a similar project take?

  40. Tool 3, Causal Loop Model http://en.wikipedia.org/wiki/Systems_dynamics

  41. Tool 3, Stock & Flow Model http://en.wikipedia.org/wiki/Systems_dynamics

  42. Tool 3, Simulation Results http://en.wikipedia.org/wiki/Systems_dynamics

  43. Tool 6, Cost CDF

  44. Tool 6, Cost PDF

  45. Requirements (Early!)
  • V1 Costar manual vs. V7 Costar manual
  • Eliciting, not soliciting
  • Platform?
  • Some of these are poor ideas?
  • Some of these are outside PATFrame scope?
  • Some of these may be hard to build?
  But we need to start a discussion… are any of these ideas close?
