
Acquisition Reform


Presentation Transcript


  1. Acquisition Reform Gates 4-6

  2. Team Members
  • Brian Persons (NAVSEA)
  • Nidak Sumrean (PEO Ships)
  • Brian Scolpino (NAVAIR)
  • Geoffrey Tisone (NAVAIR)
  • Alex Gierber (PEO LS)
  • Don Burlingham (MCSC)
  • Bill Bray (PEO IWS)
  • John Metzger (PEO C4I)
  • Phil Charles (SPAWAR)
  Tasking: Develop metrics and templates for assessing program health.

  3. Annual CSB Two-Pass Process with Phase Development
  [Diagram: the annual two-pass, six-gate review process shown across the OSD/Joint, Navy/USMC, and PEO/SYSCOM/OPNAV/HQMC levels. Pass 1 comprises Gates 1-3 (ICD approval, alternative selection from the AoA, and CDD/CONOPS approval); Pass 2 comprises Gates 4-6 (SDS approval across Phase I and Phase II, RFP approval, and contract award followed by the IBR). Milestones A and B (JROC) sit at the OSD/Joint level. Each gate is annotated with its briefing forum, lead organization and chair/co-chairs (R3B, OPNAV N8, OPNAV/CFFC, CNO, ASN(RDA)); substitute the Marine Corps equivalent when appropriate.]

  4. Gate 4 Review: System Design Specification
  • Input Criteria:
    • JROC-validated CDD
    • Service-approved CONOPS
    • Independent cost estimates are understood and compared to PM estimates and the available budget
  • Goals / Exit Criteria:
    • Approved System Design Specification (SDS)
      • Translation of CDD requirements to be used for developing the system design: do we know what we are buying?
      • Ensure the system is designed for producibility, operability and maintainability
      • Define Navy design criteria in the areas that are applicable
    • Naval approval to proceed to DAB
      • Service approval of key milestone documents
    • Program Health Assessment (PoPS methodology)
      • Based on the requirements of the CDD and development of the SDS
      • Are the cost, schedule and technical risks identified, and are mitigation strategies approved and funded?
      • Understanding of the industrial implications of what we are procuring
      • Proposed materiel solution aligned with the Service and DoD vision
  (A notional checklist sketch of these criteria follows.)
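
The input and exit criteria above amount to a pass/fail checklist that a program office could track alongside the gate brief. Below is a minimal sketch of that idea in Python; the class names (GateCriterion, GateReview) and the evidence strings are hypothetical illustrations, not part of the Navy gate templates.

```python
from dataclasses import dataclass, field

@dataclass
class GateCriterion:
    """One input or exit criterion tracked for a gate review (hypothetical structure)."""
    description: str
    satisfied: bool = False
    evidence: str = ""          # e.g. a reference to the approving document

@dataclass
class GateReview:
    gate: int
    input_criteria: list = field(default_factory=list)
    exit_criteria: list = field(default_factory=list)

    def ready_to_enter(self) -> bool:
        return all(c.satisfied for c in self.input_criteria)

    def ready_to_exit(self) -> bool:
        return all(c.satisfied for c in self.exit_criteria)

# Gate 4 entrance/exit criteria taken from the slide above; status values are invented.
gate4 = GateReview(
    gate=4,
    input_criteria=[
        GateCriterion("JROC-validated CDD", satisfied=True, evidence="JROC memo"),
        GateCriterion("Service-approved CONOPS", satisfied=True),
        GateCriterion("Independent cost estimate compared to PM estimate and budget"),
    ],
    exit_criteria=[
        GateCriterion("Approved System Design Specification (SDS)"),
        GateCriterion("Naval approval to proceed to DAB"),
        GateCriterion("Program Health Assessment (PoPS) complete"),
    ],
)

print("Gate 4 entry ready:", gate4.ready_to_enter())  # False: cost-estimate comparison still open
print("Gate 4 exit ready:", gate4.ready_to_exit())
```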

  5. Gate 4 Review Outline
  • SDS Methodology and Metrics
    • System alignment with validated capabilities
    • Identify technical requirements
    • Establish triggers for re-review
    • Producibility
  • Program Health Methodology and Metrics
    • Program Requirements
    • Program Resources
    • Program Execution
    • Capability "Fit" in the Capability Vision
    • Program Advocacy
  (A notional rollup of these health factors is sketched below.)
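
The five health areas in the outline (requirements, resources, execution, fit in the capability vision, advocacy) are the factors that a PoPS-style assessment rolls up into a single program-health score. The weights and factor scores below are purely notional placeholders used to show the rollup arithmetic; they are not the official PoPS weightings or criteria.

```python
# Notional PoPS-style rollup: each factor scored 0-100, weights sum to 1.0.
# Both the weights and the scores are illustrative placeholders.
factors = {
    "Program Requirements":     (0.20, 85),
    "Program Resources":        (0.20, 70),
    "Program Execution":        (0.30, 60),
    "Fit in Capability Vision": (0.15, 90),
    "Program Advocacy":         (0.15, 75),
}

overall = sum(weight * score for weight, score in factors.values())
print(f"Overall program health: {overall:.1f} / 100")

# Gate briefs typically summarize health as green/yellow/red; the thresholds here are assumptions.
band = "green" if overall >= 80 else "yellow" if overall >= 60 else "red"
print("Assessment band:", band)
```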

  6. SDS Template
  • System alignment with validated capabilities
  • Identify technical requirements
  • Establish triggers for re-review
  • Producibility
  Work in progress by the SDS team.

  7. Gate 5 Review: RFP Approval
  • Input Criteria:
    • Approved System Design Specification (SDS)
    • Milestone B DAB approval
    • Key knowledge of the business process / business arrangements
  • Goals / Exit Criteria:
    • Approval for RFP release
    • Approval of the buy and build business strategy (Acquisition Strategy)
    • Program Health Assessment (PoPS methodology)
      • Same as Gate 4, plus:
      • Is the Government staffing aligned to support evaluation of proposals?

  8. Gate 5 Review Outline
  • RFP Release
    • Alignment
      • Program
      • Enterprise
    • Maturity
    • Currency, accuracy and completeness
  • Program Health Methodology and Metrics
    • Program Requirements
    • Program Resources
    • Program Execution
    • Capability "Fit" in the Capability Vision
    • Program Advocacy

  9. Gate 5: RFP Review

  10. The Three Dimensions of RFP Readiness
  • Alignment
    • Program
    • Enterprise
  • Maturity
  • Currency, accuracy and completeness

  11. Alignment
  • Program
    • The RFP and attachments are aligned with the program acquisition strategy, the CDD and the SDS, and should elicit the desired behavior from the successful offeror:
      • Method of contract formation, i.e., competitive vs. non-competitive
      • Contract type
      • Contract line item structure
      • Use of options
      • Period of performance / delivery schedule
      • Data rights
      • Contract financing
      • Deviations from standard clauses and provisions
      • Special contract requirements, e.g., warranties, re-openers
    • Traceability back to the SDS

  12. Alignment (cont’d)
  • Enterprise
    • Does the RFP implement enterprise-wide initiatives and strategies?
      • Open architecture
      • Industrial base strategies

  13. Maturity
  • Do the RFP and supporting documents clearly communicate to potential offerors the Government’s requirements and desired terms and conditions?
    • Technical requirements adequate to prepare a proposal?
    • No specification TBDs?
    • Clear and complete proposal instructions and evaluation factors?
    • Fully developed special contract requirements?
  (A trivial automated scan for the "no TBDs" check is sketched below.)
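
The "no specification TBDs" maturity check lends itself to a trivial automated scan of the draft RFP package. A sketch of that idea follows, assuming the specification text is available as plain-text files; the directory name and the placeholder tokens are examples only.

```python
import re
from pathlib import Path

# Flag unresolved placeholders ("TBD", "TBS", "TBR") in draft specification files.
# The directory name and the token list are illustrative assumptions.
PLACEHOLDER = re.compile(r"\b(TBD|TBS|TBR)\b")

def find_open_items(spec_dir: str) -> list[tuple[str, int, str]]:
    """Return (file, line number, line text) for every unresolved placeholder found."""
    hits = []
    for path in Path(spec_dir).glob("**/*.txt"):
        for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
            if PLACEHOLDER.search(line):
                hits.append((path.name, lineno, line.strip()))
    return hits

for name, lineno, text in find_open_items("draft_rfp_specs"):
    print(f"{name}:{lineno}: {text}")
```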

  14. Currency, Accuracy and Completeness
  • Specifications and SOW accurate, complete and traceable?
  • Do requirements track from the program WBS to the specs and SOW, and on to the contract line item structure?
  • Clauses up to date?
  • Required FAR/DFARS deviations obtained?
  • Technical data complete and up to date?
  (A minimal traceability cross-check is sketched below.)
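
Whether requirements track from the program WBS to the specs and SOW and on to the contract line item structure is, at bottom, a mapping check. A minimal sketch of that cross-check follows; the WBS, specification and CLIN identifiers are entirely hypothetical.

```python
# Hypothetical traceability data: WBS element -> spec paragraph -> CLIN.
wbs_to_spec = {
    "WBS 1.1": "Spec 3.2.1",
    "WBS 1.2": "Spec 3.2.4",
    "WBS 1.3": None,            # no spec coverage yet
}
spec_to_clin = {
    "Spec 3.2.1": "CLIN 0001",
    "Spec 3.2.4": None,         # not priced on any contract line item
}

# Report each WBS element's trace status so gaps are visible before RFP release.
for wbs, spec in wbs_to_spec.items():
    clin = spec_to_clin.get(spec) if spec else None
    if spec is None:
        print(f"{wbs}: not traced to a specification requirement")
    elif clin is None:
        print(f"{wbs}: traced to {spec} but not to a contract line item")
    else:
        print(f"{wbs}: {spec} -> {clin}")
```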

  15. Gate 6 Sufficiency Review
  • Input Criteria:
    • Source selection complete
    • Contract awarded
    • Integrated Baseline Review (IBR) complete
  • Goals / Exit Criteria:
    • Assessment of whether the validated capabilities (CDD) and the SDS are met
    • IBR complete and Performance Measurement Baseline (PMB) established
    • Program Health Assessment (PoPS methodology)
      • Same as Gate 5, plus:
      • Assess the results of the IBR
      • Based on performance, what is the impact to cost and schedule?
      • Is there a POM/PR requirement impact?

  16. Gate 6 Review Outline
  • Sufficiency Review
    • Performance against the PMB
    • Change volatility
    • Resource requirements (descopes)
    • Aligns with the OSD CSB process
  • Program Health Methodology and Metrics
    • Program Requirements
    • Program Resources
    • Program Execution
    • Capability "Fit" in the Capability Vision
    • Program Advocacy

  17. Gate 6: Sufficiency Review

  18. IBR Objectives
  • Assess the adequacy of the Performance Measurement Baseline (PMB) (scope, schedule, budget, resources and management processes), including identification of the associated risks
  • Achieve a mutual understanding of the PMB and its relationship to the underlying Earned Value Management System (EVMS)
  • Ensure all tasks are planned and can be measured objectively relative to technical progress, and that managers have appropriately implemented the required management processes
  • Attain agreement on a plan of action to evaluate the identified risks
  • Quantify the identified risks and incorporate them in an updated Estimate At Completion (EAC)
  A major goal and benefit of the IBR is agreement between Government technical staff members and their contractor counterparts with respect to technical scope content and performance specifications. (The standard EVM arithmetic behind the EAC update is illustrated below.)
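
The updated Estimate At Completion that the IBR feeds relies on the standard earned value relationships (CPI = EV/AC, SPI = EV/PV, and the common CPI-based EAC = AC + (BAC − EV)/CPI). Only those formulas are standard; the figures below are invented for illustration.

```python
# Notional earned value figures in $M; the EVM formulas are standard, the numbers are not.
BAC = 500.0   # Budget At Completion
PV  = 200.0   # Planned Value (BCWS)
EV  = 180.0   # Earned Value (BCWP)
AC  = 210.0   # Actual Cost (ACWP)

CV  = EV - AC            # cost variance: -30.0 (over cost)
SV  = EV - PV            # schedule variance: -20.0 (behind schedule)
CPI = EV / AC            # cost performance index: ~0.86
SPI = EV / PV            # schedule performance index: 0.90

# CPI-based EAC: remaining work is assumed to continue at the current cost efficiency.
EAC = AC + (BAC - EV) / CPI
VAC = BAC - EAC          # variance at completion (negative means a projected overrun)

print(f"CPI={CPI:.2f}  SPI={SPI:.2f}  EAC=${EAC:,.1f}M  VAC=${VAC:,.1f}M")
```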

  19. Results of IBR
  • Achieve a mutual understanding of the baseline plan and its relationship to the underlying Earned Value Management System (EVMS)
  • Attain agreement on a plan of action to evaluate the identified risks
  • Quantify the risks and incorporate them in an updated Estimate At Completion (EAC)
  • Assess the adequacy of the Performance Measurement Baseline (scope, schedule, budget, resources, and management processes)
  The results of the IBR provide the basis for the CSB discussion.

  20. Configuration Steering Board (CSB)
  • Requirement
    • Program Managers work on an annual basis to identify a set of descoping options that reduce program cost or moderate requirements
    • The CSB is chaired by the SAE, with membership from AT&L and the Joint Staff
