
Empirical Study of Software Quality and Reliability




Presentation Transcript


  1. Empirical Study of Software Quality and Reliability
  14 November 2007
  High Assurance Systems Engineering Conference, IEEE HASE 2007

  Jeff Tian, Ph.D., PE
  Southern Methodist University
  Dallas, TX USA 75275
  Tel: (214) 768-2861  FAX: (817) 768-3085
  tian@engr.smu.edu

  Michael F. Siok, PE
  Lockheed Martin Aeronautics Company
  P.O. Box 748, MZ 8604
  Fort Worth, TX 76101
  Tel: (817) 935-4514
  Mike.F.Siok@lmco.com

  2. Avionics Software Development
  • How much software process is enough?
  • How much software is enough?
  • Benchmarks?
  • In-house best practice?
  Cost too much • Takes too long • Metrics, metrics, metrics
  Good Software = f(productivity, reliability, quality)

  3. Aircraft Software at LM Aero . . .
  [Diagram: aircraft subsystems, including Controls, Displays, Radar, Mission Computer, Electro-Optical, Global Positioning, Weapons, External Stores, Inertial, Stick and Throttle, Ailerons, Gyros, Rudder(s), Flight Control, Stabilizers, Accelerometers, Engine(s)]
  • Characteristics
    • Large, complex systems
    • Decades lifespan
    • Frequent software updates
    • Mix of computation types: computational, displays, logic/state machine, signal processing, feedback control
    • Hard & soft real-time
    • Severe computing resource constraints
    • COTS requirements
    • Legacy reuse

  4. Aircraft Software at LM Aero . . .
  • Embedded code size ~2 MSLOC and climbing fast
  • Hundreds of object instances
  • Some COTS
  • Products: F-16, C-130, U-2, F-22, JSF, support equipment, others . . .

  5. Background (Cont’d)
  • Avionics software managers want to know . . .
    • How are projects performing, individually and collectively?
    • Is OO better than SA/SD?
    • Does programming language make a difference?
    • Productivity, reliability, quality . . . how are we doing?
  • Issues . . .
    • Measuring success
    • Benchmarking
    • How to improve
  • Use statistics to provide some answers

  6. Topics
  IEEE HASE 2007 paper: Empirical Study of Embedded Software Quality and Productivity, by Michael F. Siok and Jeff Tian
  • Background
  • Avionics Software Project Data
  • Data Analysis
  • Wrap Up and Next Steps

  7. Avionics Software Project Data
  • 39 software projects carefully chosen for study
    • Project application domain (a.k.a. project complexity)
    • Project size
    • Software development methodology
    • Programming language used
  • Metrics Framework
    • Size
    • Cost
    • Schedule
    • TP & Quality

  8. Avionics Software Project Data (Cont’d)
  • Metrics Framework provides a reference for capturing each metric and its scope
  • 27 metrics available from all projects, normalized where appropriate
  • Comparable
    • Individually
    • In aggregate
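The slides do not say which normalization scheme the study used; as an illustration only, a common choice for making metrics from projects of different sizes comparable is per-metric min-max scaling, sketched here with invented values:

```python
def min_max_normalize(values):
    """Scale one metric's values to [0, 1] so projects of very
    different magnitudes can be compared on a common footing."""
    lo, hi = min(values), max(values)
    if hi == lo:
        # All projects identical on this metric; no spread to scale.
        return [0.0] * len(values)
    return [(v - lo) / (hi - lo) for v in values]

# Invented metric values for three hypothetical projects.
print(min_max_normalize([10, 20, 30]))  # → [0.0, 0.5, 1.0]
```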

  9. Data Analysis -- Approach
  • Study metrics data using descriptive statistics
    • Individual metrics
    • Metrics in aggregate
  • Perform hypothesis tests to answer management questions
    • Domain separation
    • OO vs. SA/SD
    • Size vs. reliability
    • Size vs. productivity
    • Cost vs. reliability
    • Cost vs. productivity
    • Language vs. productivity
  • Summarize & report results

  10. Data Analysis (Cont’d) -- Descriptive Statistics
  • Used to discover interesting characteristics of each metric in the dataset
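A minimal sketch of this descriptive-statistics step. The 39-project LM Aero dataset is not public, so the productivity figures below are invented for illustration:

```python
import statistics

# Invented productivity figures (SLOC per staff-month) for seven
# hypothetical projects; the study's real data is proprietary.
productivity = [112.0, 98.5, 143.2, 87.9, 120.4, 55.1, 131.0]

summary = {
    "n": len(productivity),
    "mean": statistics.mean(productivity),
    "median": statistics.median(productivity),
    "stdev": statistics.stdev(productivity),
    "min": min(productivity),
    "max": max(productivity),
}
for name, value in summary.items():
    print(f"{name:>6}: {value}")
```

In practice the same summary would be produced for each of the 27 metrics, individually and in aggregate.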

  11. Data Analysis (Cont’d) -- Hypothesis Tests
  • Hypothesis testing
    • Since the data were not normally distributed, non-parametric tests were used to accept or reject hypotheses
    • H0 – compared data come from the same population
    • H1 – compared data come from different populations
  • Test used: Wilcoxon Rank Sum Test
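The slides name the Wilcoxon rank sum test but give no detail. The sketch below is a from-scratch implementation using the large-sample normal approximation, with average ranks for ties; a real analysis would typically use a statistics package rather than hand-rolled code:

```python
import math

def rank_sum_test(x, y):
    """Two-sided Wilcoxon rank sum test, normal approximation.

    Returns (W, p): W is the rank sum of the first sample, p the
    approximate two-sided p-value. Ties receive average ranks.
    """
    # Rank the pooled observations, averaging ranks within tie groups.
    combined = sorted((v, i) for i, v in enumerate(x + y))
    ranks = [0.0] * len(combined)
    i = 0
    while i < len(combined):
        j = i
        while j + 1 < len(combined) and combined[j + 1][0] == combined[i][0]:
            j += 1
        avg_rank = (i + j) / 2 + 1  # 1-based average rank of the tie group
        for k in range(i, j + 1):
            ranks[combined[k][1]] = avg_rank
        i = j + 1
    n1, n2 = len(x), len(y)
    w = sum(ranks[:n1])  # rank sum of the first sample
    # Mean and standard deviation of W under H0 (same population).
    mu = n1 * (n1 + n2 + 1) / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (w - mu) / sigma
    # Two-sided p-value from the standard normal CDF.
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return w, p

# Invented metric values for two groups of projects.
w, p = rank_sum_test([1.2, 0.9, 1.5, 1.1], [2.3, 2.1, 1.9, 2.6])
print(f"W = {w}, p = {p:.4f}")  # a small p suggests different populations
```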

  12. Data Analysis (Cont’d) -- Reporting
  • Hypothesis testing (continued)
    • Conducted on metrics; results collected and summarized
    • Majority-rules policy on accept/reject test results
      • Reject = difference at the .05 level of significance
      • Accept = no difference
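One plausible reading of the majority-rules policy, sketched as a small decision helper; the per-metric p-values below are invented, and the exact aggregation rule in the study may differ:

```python
def majority_verdict(p_values, alpha=0.05):
    """Majority-rules policy: each metric's test votes 'different'
    (reject H0 at level alpha) or 'same'; the majority wins."""
    rejects = sum(1 for p in p_values if p < alpha)
    return "different" if rejects > len(p_values) / 2 else "same"

# Invented per-metric p-values from comparing two groups of projects.
print(majority_verdict([0.01, 0.20, 0.03, 0.04, 0.60]))  # → different
```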

  13. Wrap Up and Next Steps
  • Avionics software organization needed a methodology to
    • Assess project performance
    • Assess project performance relative to other similar projects
    • Identify and act on opportunities for improvement
  • Software project data was difficult to acquire
    • Data actually very easy to get
    • Projects had to demonstrate selected process controls
    • Managed variability in metrics
    • Projects had to submit data to the company metrics repository and use it
    • Projects had to validate data in the company repository
  • Analysis method fairly simple and straightforward
    • Descriptive statistics to study metrics behaviors
    • Hypothesis testing
    • Summary reporting to capture analysis results for action

  14. Wrap Up and Next Steps (Cont’d)

  15. Wrap Up and Next Steps (Cont’d)
  • Statistical testing did not uncover a clear ‘best all-around project’
  • Want to identify well-rounded, best-in-class project(s)
    • A project demonstrating the best cost, schedule, performance, and quality
  • Use Data Envelopment Analysis (DEA) as a benchmarking method to identify best-in-class software projects
    • Non-parametric analysis method
    • Establishes a multivariate production-efficiency frontier
  • Statistical analysis coupled with DEA will provide a repeatable methodology to study and assess company software project data
    • To understand software project and organizational performance
    • To identify the best-performing software projects
    • To clearly identify practical software process & product improvement opportunities
  . . . to better the business practice of software development
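In general, constant-returns-to-scale (CCR) DEA solves one linear program per project to place it relative to the efficiency frontier. In the degenerate single-input, single-output case the LP reduces to comparing each project's output/input ratio to the best observed ratio, which makes for a compact illustration. This is a minimal sketch with invented numbers, not the study's multi-metric model:

```python
def dea_ccr_efficiency(inputs, outputs):
    """CCR (constant returns to scale) DEA efficiency scores for the
    single-input, single-output case: each project's output/input
    ratio divided by the best observed ratio. Efficient projects
    score 1.0 and define the frontier."""
    ratios = [y / x for x, y in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical projects: input = effort in staff-months,
# output = delivered KSLOC.
effort = [2.0, 4.0, 8.0]
size = [2.0, 2.0, 4.0]
print(dea_ccr_efficiency(effort, size))  # → [1.0, 0.5, 0.5]
```

With multiple inputs (cost, schedule) and outputs (size, quality), each score would instead come from a linear-programming solver, but the interpretation is the same: score 1.0 marks a best-in-class project.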

  16. Points of Contact

  Michael F. Siok, PE
  Lockheed Martin Aeronautics Company
  P.O. Box 748, MZ 8604
  Fort Worth, TX 76101
  Tel: (817) 935-4514
  Mike.F.Siok@lmco.com

  Jeff Tian, Ph.D., PE
  Southern Methodist University
  Dallas, TX USA 75275
  Tel: (214) 768-2861
  FAX: (817) 768-3085
  tian@engr.smu.edu
