A Survey of Systems Engineering Effectiveness by: NDIA Systems Engineering Effectiveness Committee

Presentation Transcript


  1. A Survey of Systems Engineering Effectiveness by: NDIA Systems Engineering Effectiveness Committee INCOSE - Orlando Chapter Geoff Draper Harris Corporation gdraper@harris.com February 28, 2008

  2. Agenda • Introduction – NDIA Systems Engineering Division (SED) • Organization and Committees • NDIA Systems Engineering Effectiveness Committee: • A Survey of Systems Engineering Effectiveness • http://www.sei.cmu.edu/pub/documents/07.reports/07sr014.pdf NDIA SE Division web page: http://www.ndia.org/Template.cfm?Section=NDIA_Divisions_Page&Template=/TaggedPage/TaggedPageDisplay.cfm&TPLID=3&ContentID=677

  3. NDIA SE Division – Org Chart

  4. Key NDIA SE Division Initiatives • OSD (A&T) Initiatives • CMMI Co-Sponsor • Conferences • NDIA Systems Engineering Conference • CMMI Technology Conference • Net-Centric Operations Conference • Committees • Task Groups / Workshops • Awards • Ferguson Award for SE Excellence • Top 5 Government Programs

  5. Effective Systems Engineering: What’s the Payoff for Program Performance? NDIA Systems Engineering Effectiveness Committee CMMI Technology Conference November 15, 2007

  6. Does this sound familiar? These are the ASSERTIONS, but what are the FACTS?

  7. The Problem • It is difficult to justify the costs of SE in terms that program managers and corporate managers can relate to. • The costs of SE are evident • Time • Effort • The benefits are less obvious and less tangible • Cost avoidance (e.g., reduction of rework from interface mismatches) • Risk avoidance (e.g., early risk identification and mitigation) • Improved efficiency (e.g., clearer organizational boundaries and interfaces) • Better products (e.g., better understanding and satisfaction of stakeholder needs) How can we quantify the effectiveness and value of SE? How does SE benefit program performance?

  8. Systems Engineering Effectiveness Survey (2004-2007) • Hypothesis: The effective performance of SE best practices on a development program yields quantifiable improvements in program execution (e.g., improved cost performance, schedule performance, technical performance). • Objectives: • Characterize effective SE practices • Correlate SE practices with measures of program performance • Approach: • Distribute survey to NDIA companies • SEI analysis and correlation of responses • Survey Areas: Process definition • Project planning • Risk management • Requirements development • Requirements management • Trade studies • Interfaces • Product structure • Product integration • Test and verification • Project reviews • Validation • Configuration management • Metrics

  9. The Challenge - Previous Studies Summary (Mink, 2007)

  10. The Challenge - Supporting Evidence • Gruhl, Werner (1992), Lessons Learned: Cost/Schedule Assessment, Internal Presentation, NASA Comptroller's Office • Honour, Eric (2004), Understanding the Value of Systems Engineering, Proceedings of the 14th Annual INCOSE International Symposium

  11. Survey Development • Survey content is based on a recognized standard (CMMI) • CMMI-SE/SW/IPPD v1.1: 25 Process Areas, 179 Goals, 614 Practices, 476 Work Products • After Systems Engineering-related filter: 14 Process Areas, 31 Goals, 87 Practices, 199 Work Products considered significant to Systems Engineering • After size constraint filter: 13 Process Areas, 23 Goals, 45 Practices, 71 Work Products retained for the survey

  12. Survey Methodology (Conducted: 2004-2007)

  13. Analysis • Perf = f (PC, PE, SEC, AC) • where: Perf = Project Performance, PC = Project Challenge, PE = Project Environment, SEC = Systems Engineering Capability, AC = Acquirer Capability • SEC can be further decomposed as: • Project Planning • Project Monitoring and Control • Risk Management • Requirements Development and Management • Technical Solution • Trade Studies • Product Architecture • Product Integration • Verification • Validation • Configuration Management • IPT-Based Capability • SE capabilities and analyses are fully defined by mappings of associated survey question responses (a scoring sketch follows below)
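
To make the SEC composite concrete, here is a minimal Python sketch. It is not the SEI's actual scoring code: the question IDs, the 1-4 response scale, and the area-to-question mapping are all invented for illustration; only the idea of mapping survey responses to capability areas comes from the slide.

```python
# Minimal sketch, assuming hypothetical question IDs and a 1-4 ordinal
# response scale; the report defines its own question-to-capability mappings.
from statistics import mean

# Invented survey responses for one project: question ID -> score (1-4).
responses = {
    "PP01": 3, "PP02": 4,        # Project Planning
    "RSKM01": 2, "RSKM02": 3,    # Risk Management
    "REQ01": 4, "REQ02": 4,      # Requirements Development and Management
}

# Invented mapping of capability areas to their survey questions.
area_questions = {
    "Project Planning": ["PP01", "PP02"],
    "Risk Management": ["RSKM01", "RSKM02"],
    "Requirements": ["REQ01", "REQ02"],
}

# Score each area as the mean of its mapped responses, then average the
# areas into a composite Systems Engineering Capability (SEC) value.
area_scores = {a: mean(responses[q] for q in qs) for a, qs in area_questions.items()}
sec = mean(area_scores.values())
print(area_scores, f"SEC = {sec:.2f}")
```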

  14. Analysis - Validation of Survey Responses • Analyzed distributions, variability, and relationships among Project Challenge (PC), Overall SE Capability (SEC), Acquirer Capability (AC), and Project Performance (Perf) • To ensure statistical rigor and relevance

  15. Total SE Capability (SEC) vs. Project Performance (Perf) • Projects with better Systems Engineering Capabilities deliver better Project Performance (cost, schedule, functionality)

  16. Relating Project Performance to Project Challenge and SE Capability • Project challenge factors: • Life cycle phases • Project characteristics (e.g., size, effort, duration, volatility) • Technical complexity • Teaming relationships • Projects with better Systems Engineering Capabilities are better able to overcome challenging environments

  17. Results - 1. Product Architecture and Performance • Projects with better Product Architecture show a "Moderately Strong / Strong" Positive Relationship with Performance

  18. Results - 2. Trade Studies and Project Performance • Projects with better Trade Studies show a "Moderately Strong / Strong" Positive Relationship with Performance

  19. Results - 3. Technical Solution and Project Performance • Projects with better Technical Solution show a "Moderately Strong" Positive Relationship with Performance

  20. Results - 4. IPT-Related Capability and Performance • Projects with better IPTs show a "Moderately Strong" Positive Relationship with Performance

  21. Results - 5. Requirements and Performance • Projects with better Requirements Development and Management show a "Moderately Strong" Positive Relationship with Performance

  22. Results - Summary of Process Relationships • Chart legend: Strong Relationship • Moderately Strong to Strong Relationship • Moderately Strong Relationship • Weak Relationship

  23. Results - Summary of Relationships (Composite Measures) • Chart legend: Strong Relationship • Moderately Strong to Strong Relationship • Moderately Strong Relationship • Weak Relationship

  24. Results - Requirements + Technical Solution, controlled by Project Challenge • Project challenge factors: • Life cycle phases • Project characteristics (e.g., size, effort, duration, volatility) • Technical complexity • Teaming relationships • Projects with higher Requirements and Technical Solution capability are better able to achieve higher performance even in challenging programs

  25. Summary • SE Effectiveness • Provides credible measured evidence about the value of disciplined Systems Engineering • Affects success of systems-development projects • Specific Systems Engineering Best Practices • Strongest relationships are with activities on the "left side of the SE Vee" • The environment (Project Challenge) affects performance too: • Some projects are more challenging than others, and higher challenge affects performance negatively in spite of better SE • Yet good SE practices remain crucial for both high- and low-challenge projects

  26. Potential Next Steps • Provide recommendations for acting on the survey findings • Conduct additional follow-on surveys and analysis of collected data • IV&V • Broadened sample space • Trending • Improvements to survey instrument • Survey system acquirers

  27. DoD Systemic Root Cause Analysis - Why do projects fail? • Root causes from DoD analysis of program performance issues appear consistent with NDIA SE survey findings. • Reference: • Systemic Root Cause Analysis, • Dave Castellano, Deputy Director Assessments & Support, OUSD(A&T) • NDIA Systems Engineering Conference, 2007 • and NDIA SE Division Annual Planning Meeting

  28. Acknowledgements

  29. SE Effectiveness - Points of Contact Al Brown alan.r.brown2@boeing.com Geoff Draper gdraper@harris.com Joe Elm jelm@sei.cmu.edu Dennis Goldenson dg@sei.cmu.edu Al Mink Al_Mink@SRA.com Ken Ptack ken.ptack@ngc.com Mike Ucchino michael.ucchino@afit.edu

  30. Backup: NDIA SE Effectiveness Survey Analysis Slides

  31. Conclusions & Caveats - Consistent with "Top 10 Reasons Projects Fail*" • Lack of user involvement • Changing requirements • Inadequate Specifications • Unrealistic project estimates • Poor project management • Management change control • Inexperienced personnel • Expectations not properly set • Subcontractor failure • Poor architectural design • Above items can cause overall program cost and schedule to overrun • * Project Management Institute • Matching items noted in RED

  32. Conclusions & Caveats - Consistent with "Top 5 SE Issues*" (2006) • Key systems engineering practices known to be effective are not consistently applied across all phases of the program life cycle. • Insufficient systems engineering is applied early in the program life cycle, compromising the foundation for initial requirements and architecture development. • Requirements are not always well-managed, including the effective translation from capabilities statements into executable requirements to achieve successful acquisition programs. • The quantity and quality of systems engineering expertise is insufficient to meet the demands of the government and the defense industry. • Collaborative environments, including SE tools, are inadequate to effectively execute SE at the joint capability, system of systems, and system levels. • * OUSD AT&L Summit • Matching items noted in RED

  33. Summary - SE Relationships to Project Performance

  34. Summary - SE Relationships to Project Performance • Highest scoring SE capability areas in Higher Performing Projects*: Risk Management; Requirements Development and Management; IPTs • Lowest scoring SE capability areas in Lower Performing Projects*: Validation; Architecture; Requirements Development and Management • *Based on small partitioned sample size

  35. Terminology and Notation - Distribution Graph • Histogram of response frequencies • Median • Interquartile Range • Outliers • Sample size (responses to corresponding survey questions) • Data Range
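
The box-plot quantities annotated on these graphs can be reproduced with standard statistics. A minimal sketch, assuming made-up response data and the conventional 1.5*IQR outlier fence (the report may define outliers differently):

```python
# Illustrative only: the response data are invented, and the 1.5*IQR outlier
# fence is the conventional box-plot rule, not taken from the survey report.
import statistics

data = [1.5, 2.0, 2.0, 2.5, 2.5, 3.0, 3.0, 3.0, 3.5, 4.0]  # hypothetical scores

q1, median, q3 = statistics.quantiles(data, n=4)   # quartile cut points
iqr = q3 - q1                                      # interquartile range
low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr         # outlier fences
outliers = [x for x in data if x < low or x > high]

print(f"n={len(data)}  median={median}  IQR={iqr:.2f}  "
      f"range=({min(data)}, {max(data)})  outliers={outliers}")
```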

  36. Terminology and Notation - Mosaic Chart • Column width represents the proportion of projects with this level of capability • Relative performance distribution of the sample • Projects exhibiting a given level of relative capability (Lowest, Intermediate, Highest) • Sample size and distribution for associated survey responses (capability + performance) • Measures of association and statistical test: • Gamma: measures the strength of relationship between two ordinal variables • p: probability that an associative relationship would be observed by chance alone
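
The Gamma measure here is the standard Goodman-Kruskal gamma for ordinal association: gamma = (C - D) / (C + D), where C counts concordant pairs and D discordant pairs, and tied pairs are ignored. A minimal sketch on invented (capability, performance) ratings:

```python
# Sketch of the Goodman-Kruskal gamma computation; the paired ordinal
# ratings below are invented for illustration, not survey data.
from itertools import combinations

capability  = [1, 1, 2, 2, 3, 3, 3, 1, 2, 3]   # 1 = Lowest .. 3 = Highest
performance = [1, 2, 2, 3, 2, 3, 3, 1, 1, 3]   # 1 = Lower  .. 3 = Higher

concordant = discordant = 0
for (c1, p1), (c2, p2) in combinations(zip(capability, performance), 2):
    s = (c1 - c2) * (p1 - p2)
    if s > 0:
        concordant += 1    # pair ordered the same way on both variables
    elif s < 0:
        discordant += 1    # pair ordered in opposite ways
    # ties (s == 0) do not count toward gamma

gamma = (concordant - discordant) / (concordant + discordant)
print(f"gamma = {gamma:+.2f}")   # +1 = perfect positive association
```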

  37. SE Capability: Product Architecture (ARCH)

  38. SE Capability: Product Architecture (ARCH) Survey Questions

  39. SE Capability: Configuration Management (CM)

  40. SE Capability: Configuration Management (CM) Survey Questions

  41. SE Capability: IPT-Related Capability (IPT)

  42. SE Capability: IPT-Related Capability (IPT) Survey Questions

  43. SE Capability: Product Integration (PI)

  44. SE Capability: Product Integration (PI) Survey Question

  45. SE Capability: Project Monitoring and Control (PMC)

  46. SE Capability: Project Monitoring and Control (PMC) Survey Questions (Part 1)

  47. SE Capability: Project Monitoring and Control (PMC) Survey Questions (Part 2)

  48. SE Capability: Project Planning (PP)

  49. SE Capability: Project Planning (PP) Survey Questions (Part 1)

  50. SE Capability: Project Planning (PP) Survey Questions (Part 2)
