
Feasibility Evidence Description (FED) Supporting Information Document (SID)

This document demonstrates the feasibility and consistency of Valuation artifacts and provides a viable business case for the system being developed. It also identifies risks and uncertainties in the project.


Presentation Transcript


  1. Feasibility Evidence Description (FED) Supporting Information Document (SID) Dr. Barry Boehm, CS577 Fall 2009

  2. Objectives of the FED • Demonstrate feasibility and consistency of other Valuation (or Foundations, or IOC) artifacts: OCD, SSRD, SSAD, and LCP • Demonstrate a viable business case for the system being developed • Obtain stakeholder commitment to proceed • Identify risks, i.e.: • Uncertainties in your demonstration of feasibility and consistency • Uncertainties in the business case (c) USC-CSSE

  3. Pass/Fail Condition • The feasibility rationale covers the key pass/fail question: • “If I implement the specified architecture using the specified process, will the resulting system support the operational concept, realize the prototyping results, satisfy the requirements, and finish within budget and within schedule?” (c) USC-CSSE

  4. The Need For Feasibility Evidence 1. A commercial customer specified a natural language interface for an otherwise simple query system. The project was cancelled after the natural language interface caused a factor-of-5 overrun in project budget and schedule. 2. A commercial customer specified a project to fully digitize a set of corporate records via scanning and optical character recognition. The resulting cost escalated by a factor of 10 after it was discovered that the records included many hard-to-capture tables, charts, and graphs. 3. A government customer specified a 1-second response time for an extremely large transaction processing system. Meeting this requirement involved a custom architecture and a $100M project. The customer authorized a prototyping activity, which determined that 90% of the transactions needed only 4-second response time. With this relaxed performance requirement, a commercial-technology-based solution was achieved for $30M. (c) USC-CSSE

  5. Differences between Architected Agile and NDI/NCS (c) USC-CSSE

  6. Introduction 1.1 Purpose of the FED document 1.2 Status of the FED document • Identify and document the differences between the contents of the FED and the Win-Win negotiations • Identify major FED-related issues (c) USC-CSSE

  7. 2. Business Case Analysis • Return On Investment (ROI) • ROI = (Benefits - Costs) / Costs • Non-monetary benefits • Better student performance • More timely health services • Stronger research reputation • Increased use of parks and disadvantaged-community services (c) USC-CSSE

  8. Business Case Analysis (cont.) • Costs include: • Development costs: hardware, software, facilities, people (= client-side stakeholders' time: in hours for CS577; in salaries in "real-world" settings), etc. • Transition costs: training, data preparation/conversion, site preparation (facilities, equipment, supplies), operational readiness testing, etc. • Operational costs: salaries for operators, COTS licenses, facilities, supplies, etc. • Maintenance costs: salaries for DB administrators, maintenance staff, etc. (c) USC-CSSE

  9. Business Case Analysis (cont.) • Benefits include: • Hours of time saved by client-side employees for CS577; salary saved in "real-world" settings • Revenues (payments for services), if any • Enhanced quality, response time, reputation, etc.: harder to quantify, since time or financial returns are possible and even likely in the long term, but hard to estimate • Etc. (c) USC-CSSE
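
To make the slide-7 formula concrete, here is a minimal sketch of the ROI calculation in Python. The cost and benefit figures and category names below are illustrative assumptions, not numbers from the course example; as the slides suggest, both sides are measured in client-side stakeholder hours for CS577.

    # Hypothetical ROI sketch; all figures are invented, measured in
    # client-side stakeholder hours as suggested for CS577 projects.
    costs = {
        "development": 120,   # hardware, software, people time
        "transition": 35,     # training, data conversion, site prep
        "operational": 25,    # operators, COTS licenses, supplies
        "maintenance": 20,    # DB administrators, maintenance staff
    }
    benefits = {
        "hours_saved": 290,   # time saved over the old way of working
        "revenues": 0,        # payments for services, if any
    }

    total_costs = sum(costs.values())        # 200
    total_benefits = sum(benefits.values())  # 290

    # ROI uses NET benefit over cost: (Benefits - Costs) / Costs
    roi = (total_benefits - total_costs) / total_costs
    print(f"ROI = ({total_benefits} - {total_costs}) / {total_costs} = {roi:.2f}")  # 0.45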

  10. ROI Analysis Example (Part I), from the "Data Mining the Library Catalogue" LCA (c) USC-CSSE

  11. Example of Costs, Benefits & ROI • From the FC Package of the "Data Mining the Library Catalogue" CS577 project • See next slide for numbers and calculation • Costs are copied from the previous slide • Benefits are computed as the number of hours saved over the old way of performing the same activity. (These involve the "SURG filtered report" and the "Unicorn" report, which would take too long to describe in detail.) (c) USC-CSSE

  12. ROI Example (Part II), from the "Data Mining the Library Catalogue" LCA. Using the previous numbers as the investment costs, and calculating hours saved for one person as the time it takes to review an original-sized report compared to a SURG-filtered report 1/3 the original Unicorn size (see Section 2.1.5.1), the Return On Investment for this project is shown in the table and chart below: (c) USC-CSSE

  13. Business Case: Problems with ROI Example • ROI was calculated as benefit/cost (B/C): 330/229 = 1.44 • It should be net benefit over cost, (B - C)/C: (330 - 229)/229 = 0.44 • The semester is used inappropriately as the accounting period • OK for development • Use annual costs and benefits for post-development • Positive net benefits mean no added investment costs • Unless costs are external payments • In the example, 28 hours of operational sustainment effort are paid for out of the 78 hours of operational effort saved • In Spring '99, 19 hours saved means a reduction in investment • Time invested = 35 hours, net benefit = 0 (c) USC-CSSE
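
The correction above is easy to verify with the slide's own totals (330 hours of benefit, 229 hours of cost); a two-line check in Python:

    benefits, costs = 330, 229  # hours, from the slide's example

    print(f"B/C       = {benefits / costs:.2f}")            # 1.44 -- overstates the return
    print(f"(B - C)/C = {(benefits - costs) / costs:.2f}")  # 0.44 -- the actual ROI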

  14. Adjusted ROI Example (c) USC-CSSE

  15. Break-even Graph • From the FC Package of the "Data Mining the Library Catalogue" CS577 project • See next slide (c) USC-CSSE

  16. ROI Example (Part III) (c) USC-CSSE
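
The break-even point in such a graph is the period in which cumulative benefits first catch up with cumulative costs. A minimal sketch of that computation; the per-semester figures below are purely illustrative, not taken from the project:

    # Hypothetical per-semester costs and benefits, in hours.
    costs    = [120, 35, 25, 25, 25]   # heavy up-front development, then sustainment
    benefits = [0,   60, 60, 60, 60]   # savings begin once the system is in use

    cum_cost = cum_benefit = 0
    for period, (c, b) in enumerate(zip(costs, benefits), start=1):
        cum_cost += c
        cum_benefit += b
        if cum_benefit >= cum_cost:
            print(f"break-even reached in period {period}")  # period 5 here
            break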

  17. Note that • With respect to development costs, the FED must: • "prove" that the schedule is sufficient to deliver the required capabilities • be consistent with (included in) the budgets in the LCP • With respect to transition costs, the FED must: • be consistent with (included in) the budgets in the LCP (the next slide gives an example of a transition cost calculation) • The following slide gives an example of how to estimate maintenance cost (c) USC-CSSE

  18. Transition Costs Estimate Example, from the "Hispanic Digital Archive" LCA (c) USC-CSSE

  19. Maintenance Estimate Example, from the "Data Mining the Library Catalogue" LCA (c) USC-CSSE
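
The actual figures on this slide come from the project's LCA and are not reproduced here. As a rough illustration of the shape such an estimate takes, the sketch below totals annual maintenance effort as hours per task times frequency per year; the task names and numbers are invented:

    # Hypothetical annual maintenance estimate: hours per task x times per year.
    maintenance_tasks = [
        # (task,                      hours_each, times_per_year)
        ("refresh catalogue extract",        2.0,             12),
        ("apply COTS patches",               3.0,              4),
        ("verify database backups",          0.5,             52),
    ]

    annual_hours = sum(hours * times for _, hours, times in maintenance_tasks)
    print(f"estimated maintenance effort: {annual_hours:.0f} hours/year")  # 62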

  20. Market Trend and Product Line Analysis • Analyze the market trend, product popularity, the product's market position, and the predicted longevity of the company • Analyze related products developed or launched by the same company or organization (c) USC-CSSE

  21. Architecture Feasibility • Level of Service Feasibility • Explicitly state how the design will satisfy the SSRD Level of Service (L.O.S.) requirements • Can be done through: • analysis • detailed references to prototypes • models • simulations • Complete coverage of the L.O.S. requirements is essential • Capability Feasibility • Explicitly state how the design will satisfy the SSRD Capability Requirements • Evolutionary Feasibility • Explicitly state how the design will satisfy the SSRD Evolutionary Requirements (c) USC-CSSE

  22. Level of Service Feasibility • For levels of service of modules that you develop • For levels of service that depend on NDI/NCS performance (c) USC-CSSE

  23. Capability Feasibility (c) USC-CSSE

  24. Evolution Feasibility • For evolutionary requirements based on modules that you develop • For evolutionary win conditions based on NDI/NCS (c) USC-CSSE

  25. Process Feasibility • Provide rationale for: • the choice of process model • the choice of increments, blocks, or builds sequence in incremental development • Provide evidence that: • priorities, process, and resources match the budgeted cost and schedule • the architecture is such that low-priority features can feasibly be dropped to meet budget or schedule constraints (c) USC-CSSE

  26. Risk Assessment • Risk: any combination of capabilities or objectives whose feasibility is difficult to assure • Risk assessment consists of risk identification, risk analysis, and risk prioritization: • Organize the major sources of risk into a Top-10 (or Top-N) risk items list • For critical risks, indicate: • Description • Risk Exposure: the magnitude and probability of loss • Risk Reduction Leverage: the cost-effectiveness of a mitigation in reducing risk exposure (as sketched below) • Actions to mitigate the risk • Contingency plan • Identify low-priority requirements that can be left out in case of schedule slippage (c) USC-CSSE
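
The two quantities on this slide have standard definitions: Risk Exposure RE = P(loss) x size(loss), and Risk Reduction Leverage RRL = (RE before - RE after) / cost of the mitigation. A minimal sketch; the risks and all numbers below are invented for illustration:

    # Hypothetical risks: P(loss), size of loss (hours), mitigation cost,
    # and the probability/loss remaining after mitigation.
    risks = [
        # (name,                p_loss, loss, mitigation_cost, p_after, loss_after)
        ("COTS license change",    0.3,  100,              10,     0.1,        100),
        ("key developer leaves",   0.2,  200,              20,     0.2,         60),
    ]

    # Rank by Risk Exposure to build the Top-N risk items list.
    for name, p, loss, cost, p2, loss2 in sorted(risks, key=lambda r: r[1] * r[2], reverse=True):
        re_before = p * loss                 # RE = P(loss) x size(loss)
        re_after = p2 * loss2
        rrl = (re_before - re_after) / cost  # RRL: exposure reduction per unit cost
        print(f"{name}: RE={re_before:.0f}, RRL={rrl:.1f}")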

  27. Software Risk Management Techniques (c) USC-CSSE

  28. Analysis Results • Identify architectural alternatives and their impacts • Identify infeasible architectures or rejected alternatives • Describe feasible architectural alternatives which were rejected due to constraints on the way that the problem must be solved, e.g., a mandated technology. (c) USC-CSSE

  29. Sample of Analysis Results from Team 16-1 Spring ’99 (c) USC-CSSE

  30. NDI/NCS Feasibility Analysis • Assessment Approach • Identify the approach used in assessing NDI/NCS component candidates. Discuss the instruments and facilities that you are using. (c) USC-CSSE

  31. Assessment Results • NDI/NCS Candidate Components (Combinations) • Identify all candidate COTS, GOTS, ROTS, OSS, libraries, and NCS components that you are using or plan to use • Identify the purpose of each component. If you have to use multiple NDI/NCS components to satisfy the objectives, constraints, and priorities, group them into combination sets. (c) USC-CSSE

  32. NDI/NCS Evaluation Criteria • Identify the evaluation criteria in terms of NDI/NCS attributes • Identify the evaluation criteria in terms of features (c) USC-CSSE

  33. Example of Attributes as Evaluation Criteria. Note: the full list of NDI/NCS attributes can be found in the ICM EPG. (c) USC-CSSE

  34. Example of Features as Evaluation Criteria (c) USC-CSSE

  35. Evaluation Results: Screening Matrix • Assess each component against the evaluation criteria, as sketched below (c) USC-CSSE
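
A screening matrix is typically a weighted score: each candidate component gets a rating per criterion, and the weighted sum ranks the candidates. A minimal sketch; the criteria, weights, ratings, and component names are all hypothetical:

    # Hypothetical NDI/NCS screening matrix; weights and 0-10 ratings are invented.
    weights = {"cost": 0.3, "support": 0.2, "maturity": 0.3, "interoperability": 0.2}

    candidates = {
        "Component A": {"cost": 8, "support": 6, "maturity": 9, "interoperability": 7},
        "Component B": {"cost": 6, "support": 9, "maturity": 7, "interoperability": 8},
    }

    # Weighted sum per candidate, highest score first.
    scores = {name: sum(weights[c] * r for c, r in ratings.items())
              for name, ratings in candidates.items()}
    for name, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{name}: {score:.1f}")  # Component A: 7.7, Component B: 7.3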

  36. Supporting Information Document (SID)

  37. Outline of SID • Introduction 1.1 Purpose of the SID Document 1.2 Status of the SID 1.3 References • Requirements Traceability Matrix • Glossary • Document Tailoring • Prototype History • Appendix (c) USC-CSSE

  38. Supporting Information Document • Contains information common to the OCD, SSRD, SSAD, LCP, and FED • For NDI/NCS teams: instead of tracing the SSRD, trace win conditions (c) USC-CSSE

  39. 2. Requirements Traceability • Summarize the various traceability concerns across ICM artifacts, i.e., the OCD, WikiWinWin Report, SSRD, SSAD, and Test Plan (see next slide) (c) USC-CSSE

  40. Requirements Traceability: Example • Please note that this traceability matrix is not a complete example, because it intentionally omits traces to test cases. Nevertheless, the example is adequate up to the Development Commitment Package. (c) USC-CSSE
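
The full matrix from the project is not reproduced here; a small hypothetical fragment illustrates the shape such a matrix takes (all entries are invented, and the test-case column is left untraced, as the slide notes):

    Win Condition   OCD Goal   SSRD Requirement   SSAD Component     Test Case
    WC-1            OC-1       R-1.1              Search module      (not yet traced)
    WC-2            OC-1       R-2.3              Report generator   (not yet traced)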

  41. 3. Glossary • A collection of specialized terms with their meanings • For example: • CSC: California Science Center • J2EE: a programming platform for developing and running distributed, multi-tier Java applications (c) USC-CSSE

  42. 4. Guideline Tailoring • Contains the changes made to the IICM-Sw guidelines • If any section of the IICM-Sw document is tailored, provide a rationale to support the changes • For example (c) USC-CSSE
