1. Practical Software Measurement - Challenges and Strategies
Staff Engineer
Motorola Six Sigma Black Belt
2. Objectives Provide an overview of the challenges faced by software engineering teams in the Software Measurement arena
Outline some of the strategies that are employed to meet these challenges in GSG (Global Software Group)
3. Overview About Motorola
Organisation
Global Software Group
GSG Scotland
CMM
Concept of Six Sigma
Metrics
Define & Measure
Data Set
Analysis
Tools & Methodology
Improvements
Conclusions
Control
Future Plans
5. GSG Locations
6. GSG Scotland - Background Centre started in Dec 2000.
Why Scotland?
Large talent pool of qualified Engineers from Scottish Universities.
Synergy with (Motorola) Semiconductors and the ISLI (Institute for System Level Integration)
Proximity to, and ease of doing business with, main European markets
Scotland is now the lead Automotive centre for GSG.
Focussed on embedded software applications, primarily automotive.
System-on-Chip (SoC) design team focussed on Motorola design needs post-Freescale split
Achieved CMM L3 certification in May 2004; currently working towards Level 5 later this year
7. Six Sigma What is Six Sigma?
We all should be able to link it to one of these definitions.
The most common definition - it is a metric: 3.4 defects per million opportunities (see the sketch below).
Sigma is a measurement of goodness
Sound Methodology - DMAIC
Customer Focused CTQs/CCRs
Data Driven Decisions
Fixes the root cause NOT the symptom
It is Focused, Disciplined & Simple
Six Sigma describes Motorola's overall commitment to Quality
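To make the "3.4 defects per million opportunities" definition concrete, here is a minimal sketch that converts a defect count into DPMO and a sigma level, assuming the conventional 1.5-sigma shift. The function names and figures are illustrative, not Motorola data.

```python
from scipy.stats import norm

def dpmo(defects: int, units: int, opportunities_per_unit: int) -> float:
    """Defects per million opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

def sigma_level(dpmo_value: float) -> float:
    """Convert DPMO to a sigma level, using the conventional 1.5-sigma shift."""
    return norm.ppf(1 - dpmo_value / 1_000_000) + 1.5

# Illustrative figures only: 17 defects found in 500 units,
# each unit having 100 defect opportunities.
d = dpmo(defects=17, units=500, opportunities_per_unit=100)
print(f"DPMO = {d:.1f}, sigma level = {sigma_level(d):.2f}")
# A process at 3.4 DPMO corresponds to ~6 sigma under this convention.
```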
8. Software Six Sigma Normality is a major issue in software data (see the sketch below)
Some of our CMM L5 centres claim to be operating at >6 sigma
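Because sigma-level arithmetic assumes normally distributed data, a team would typically test that assumption first. A minimal sketch using SciPy's Shapiro-Wilk test; the sample data is made up:

```python
from scipy.stats import shapiro

# Illustrative sample: defect densities (faults/KLOC) from completed projects.
defect_density = [0.8, 1.1, 0.9, 2.7, 1.0, 0.7, 3.9, 1.2, 0.95, 1.05]

stat, p_value = shapiro(defect_density)
if p_value < 0.05:
    # Non-normal data: consider a transformation or non-parametric limits
    # before applying sigma-level arithmetic.
    print(f"Normality rejected (W={stat:.3f}, p={p_value:.3f})")
else:
    print(f"No evidence against normality (W={stat:.3f}, p={p_value:.3f})")
```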
9. Strategies Process Improvement
10. DMAIC - Define DEFINE what is important to the Organization?
But what is of paramount importance to GSG?
Parameters chosen for Measurement & Analysis (Scorecard):
CSS (Customer Satisfaction Survey)
COQ and COPQ (Cost of Quality and Cost of Poor Quality)
Productivity
Estimation Accuracy
Effort
Schedule
Size
The GQM (Goal-Question-Metric) paradigm was used (a small sketch follows below). The major issue was to identify the areas for analysis which would add value to the centre. The general consensus was to concentrate on the parameters of paramount importance, namely:
CSS - Customer Satisfaction Survey
COQ and COPQ cost
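To illustrate the GQM linkage, here is a minimal sketch of how the scorecard parameters might be organised as goal-question-metric triples. The structure and wording are invented for illustration, not GSG's actual GQM tree:

```python
from dataclasses import dataclass, field

@dataclass
class Goal:
    statement: str
    # question -> metrics that answer it
    questions: dict = field(default_factory=dict)

gqm_tree = [
    Goal("Improve customer satisfaction",
         {"How satisfied are customers pre- and post-project?": ["CSS"]}),
    Goal("Reduce the cost of poor quality",
         {"How much effort goes to failure and rework?": ["COQ", "COPQ"]}),
    Goal("Improve planning accuracy",
         {"How close are estimates to actuals?": ["ZEA", "EEA", "SEA"]}),
]

for goal in gqm_tree:
    print(goal.statement)
    for question, metrics in goal.questions.items():
        print(f"  {question} -> {', '.join(metrics)}")
```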
11. Measurement is for all full lifecycle projects
Data analysis done separately for Hybrid Projects and Managed Head Count projects
12. Customer Satisfaction Surveys Pre and Post Project Surveys
Criteria of satisfaction and importance
Scorecard goal: 8.86 average, and 75% of projects with all high-importance areas rated at 8 or above (checked in the sketch below)
Measured against a Baseline
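A sketch of how the two-part scorecard goal above might be checked automatically. The data shapes and figures are invented for illustration:

```python
def css_goal_met(project_scores, high_importance_scores):
    """Scorecard goal: average CSS score >= 8.86, and at least 75% of
    projects have every high-importance area rated 8 or above."""
    avg_ok = sum(project_scores) / len(project_scores) >= 8.86
    all_high = [all(s >= 8 for s in areas) for areas in high_importance_scores]
    pct_ok = sum(all_high) / len(all_high) >= 0.75
    return avg_ok and pct_ok

# Example: overall score per project, and per-project high-importance ratings.
scores = [9.1, 8.7, 9.0, 8.9]
high_areas = [[9, 8], [8, 8], [9, 9], [7, 9]]  # last project misses one area
print(css_goal_met(scores, high_areas))  # True: avg 8.925, 3/4 = 75% pass
```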
13. COQ and COPQ Cost Of Quality (COQ) is the sum of effort due to appraisal, prevention, internal failure and external failure, expressed as a percentage of total project effort.
Cost Of Poor Quality (COPQ) is the sum of effort due to internal failure and external failure, expressed as a percentage of total project effort.
COQ = COPQ + % Appraisal Effort + % Prevention Effort (a worked sketch follows the baseline figures below)
Appraisal Effort - Effort of FIRST TIME verifying, checking, or evaluating a product or service during the creation or delivery process to ensure conformance to quality standards and detect any failures inserted into the product or service before reaching the customer.
Internal Failure Effort - Effort resulting from nonconformance to quality standards and effort associated with overcoming the impact of failures found before the product or service reaches the customer (including required re-verification and re-checking of the product).
External Failure Effort - Effort resulting from nonconformance to quality standards and effort associated with overcoming the impact of failures found after the formal release of the product (including required re-verification and re-checking of the product).
Prevention Effort - Effort incurred to ensure that errors are not made at any stage during the production and delivery process of the product or service to a customer.
14. COQ and COPQ Current Baseline based on Scorecard
Cost of Quality: <25% (+/-10% for each project)
Cost of Poor Quality: <5%
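A minimal sketch of the COQ/COPQ arithmetic defined above, including the baseline check. The effort figures are invented:

```python
def coq_copq(appraisal, prevention, internal_failure, external_failure,
             total_effort):
    """Return (COQ %, COPQ %) from effort figures, per the definitions above."""
    copq = (internal_failure + external_failure) / total_effort * 100
    coq = copq + (appraisal + prevention) / total_effort * 100
    return coq, copq

# Invented figures: a 4000 staff-hour project.
coq, copq = coq_copq(appraisal=600, prevention=200,
                     internal_failure=120, external_failure=40,
                     total_effort=4000)
print(f"COQ = {coq:.1f}%, COPQ = {copq:.1f}%")
# Baseline from the scorecard: COQ < 25% (+/-10% per project), COPQ < 5%.
print("COQ within baseline:", coq < 25)
print("COPQ within baseline:", copq < 5)
```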
15. Productivity
Productivity is the ratio of Delta Code released to the customer (New + Deleted + Modified + Reused + Tool-generated) to the Total Project Effort.
Productivity: increase 10% → 0.62 KAELOC/SM (sketched below)
Productivity is domain specific - Embedded Automotive (lower than desktop applications)
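A sketch of the productivity calculation as defined above, in KAELOC (thousands of assembler-equivalent lines of code) per staff-month. The line counts and effort are invented:

```python
def productivity_kaeloc_per_sm(new, deleted, modified, reused, tool_generated,
                               total_effort_staff_months):
    """Delta code released to the customer, divided by total project effort.
    Returns KAELOC per staff-month."""
    delta_loc = new + deleted + modified + reused + tool_generated
    return (delta_loc / 1000) / total_effort_staff_months

# Invented figures for a small embedded project.
p = productivity_kaeloc_per_sm(new=12_000, deleted=1_500, modified=4_000,
                               reused=8_000, tool_generated=2_500,
                               total_effort_staff_months=45)
print(f"Productivity = {p:.2f} KAELOC/SM")  # 28 KAELOC / 45 SM ~= 0.62
```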
16. Estimation Accuracy Estimation Accuracy is the ratio of Actual value to the Estimated value of a parameter.
Size, Effort and Schedule estimates are not stand-alone metrics but should be analysed in context with each other.
Deviations of all three estimations going outside of the limits are indications that the collective action of the estimation, planning and control processes is not performing well.
Actual Effort
Estimated Effort (First / Last)
Estimation Accuracy = Actual / Estimated for the respective attribute (see the sketch below)
LCL: 85% of the estimate
UCL: 115% of the estimate
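A minimal sketch of the accuracy formula and control limits above, applied to size, effort and schedule. The project data is invented:

```python
def estimation_accuracy(actual, estimated):
    """Estimation accuracy as a percentage: actual / estimated * 100."""
    return actual / estimated * 100

def within_limits(accuracy_pct, lcl=85.0, ucl=115.0):
    """True if accuracy falls inside the 85%-115% control limits."""
    return lcl <= accuracy_pct <= ucl

# Invented project data: (attribute, actual, last estimate).
for name, actual, estimated in [("size (AELOC)", 29_000, 28_000),
                                ("effort (SM)", 52, 45),
                                ("schedule (weeks)", 30, 26)]:
    acc = estimation_accuracy(actual, estimated)
    flag = "" if within_limits(acc) else "  <-- outside 85-115%"
    print(f"{name}: {acc:.0f}%{flag}")
```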
17. Estimation Accuracy - Size The size estimation accuracy metric (ZEA) provides an insight into the project's ability to estimate the size of the project
ZEA is critical for Embedded Applications, where size is constrained by the target device
Size Estimation Accuracy ZEA: 100 +/- 15%
ROM Size
Critical Computer Resources Tracking captured in the planning documents
18. Estimation Accuracy - Effort The effort estimation accuracy metric (EEA) provides an insight into the project's ability to estimate the effort of the project
EEA critical for accurate cost estimation
Effort Estimation Accuracy EEA: 100 +/- 15%
Actual Effort (Blanket MOUs)
Estimated Effort (First / Last)
Estimation Accuracy = Actual / Estimated for the respective attribute
LCL: 85% of the estimate
UCL: 115% of the estimate
19. Estimation Accuracy - Schedule The schedule estimation accuracy metric (SEA) provides an insight into the project's ability to estimate the schedule of the project
SEA critical for On Time Delivery
Schedule Estimation Accuracy SEA: 100 +/- 15%
Size, Effort and Schedule estimates are not stand-alone metrics but should be analysed in context with each other.
Deviations of all three estimations going outside of the limits are indications that the collective action of the estimation, planning and control processes is not performing well. When these deviations occur, it is important to identify the cause(s), document them and, if necessary, identify and implement solution(s) to improve process performance (sketched below).
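Building on the accuracy helpers sketched earlier, here is an illustration of analysing the three metrics in context: flagging projects where size, effort and schedule accuracy are all outside the 85-115% limits, the collective-failure signal the text describes. The project data is invented:

```python
# ZEA/EEA/SEA accuracy percentages per project (invented figures).
projects = {
    "P1": {"ZEA": 103, "EEA": 96, "SEA": 110},
    "P2": {"ZEA": 130, "EEA": 122, "SEA": 119},
}

for name, metrics in projects.items():
    out_of_limits = [m for m, v in metrics.items() if not 85 <= v <= 115]
    if len(out_of_limits) == len(metrics):
        # All three deviate: estimation, planning and control are collectively
        # underperforming -> identify and document the root cause.
        print(f"{name}: all of {out_of_limits} outside limits -> investigate")
```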
20. DMAIC - Analyze E-IQMEn, PSD and SMSC:
Project Summary Database (PSD) is the output from E-IQMEn, which serves as the GSG organizational data repository.
Software Metrics Summary Charts (SMSC) are integrated with the PSD, with no additional data input required by users.
21. Solution Design Overall design of the metrics collection process in GSG
MS Excel and XML are the outputs for further analysis (a small export sketch follows)
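As a rough sketch of what the XML output stage might look like, here a project's summary metrics are serialised with Python's standard library. The element names and values are invented; the actual PSD schema is not shown in the slides:

```python
import xml.etree.ElementTree as ET

# Invented summary record for one project.
record = {"coq_pct": "24.0", "copq_pct": "4.0",
          "productivity_kaeloc_sm": "0.62", "eea_pct": "104"}

project = ET.Element("project", name="example-project")
for metric, value in record.items():
    ET.SubElement(project, "metric", name=metric).text = value

# Write a file for downstream analysis and echo the document.
ET.ElementTree(project).write("psd_export.xml", xml_declaration=True,
                              encoding="utf-8")
print(ET.tostring(project, encoding="unicode"))
```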
22. DMAIC - Improve Software Metrics Summary Charts-Organizational Health
23. DMAIC - Improve Software Metrics Summary Charts-Organizational Health Organizational Metrics presented to Senior Management, high-level view of quality, cycle time and productivity for any organization whose business it is to create and deliver software
Roll-up of all GSG centres globally
Also called the 9 up charts because of the 9 charts produced
The purpose of this document is to describe charts that are to be used to manage progress toward organizational Performance Excellence Scorecard goals and indicators
Commonality of data and metrics definitions. Software organizations had different definitions of the measures that were reported. For example, supported code size in one Performing Organization could include comments whilst another might not.
24. DMAIC - Future Plans (1) Concentrate on ensuring that the data in E-IQMEn is (a validation sketch follows this list):
Accurate: The data should provide a true reflection of the status of GSG projects, both completed and active.
Complete: The data should span all major metrics areas (effort, size and faults).
Up to date: Monthly updates to keep data current.
Inclusive: E-IQMEn should contain as much GSG data as possible, for all types of GSG work.
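A sketch of how the mechanically checkable criteria above (completeness and currency) might be validated automatically. The record fields and rules are invented; E-IQMEn's real schema is not shown in the slides:

```python
from datetime import date, timedelta

def quality_issues(record, today):
    """Return data-quality issues for one project record (illustrative)."""
    issues = []
    # Complete: all major metrics areas present.
    for area in ("effort", "size", "faults"):
        if record.get(area) is None:
            issues.append(f"missing {area} data")
    # Up to date: updated within roughly the last month.
    if today - record["last_updated"] > timedelta(days=31):
        issues.append("stale: not updated this month")
    return issues

rec = {"effort": 45, "size": 28_000, "faults": None,
       "last_updated": date(2005, 1, 10)}
print(quality_issues(rec, today=date(2005, 4, 1)))
# -> ['missing faults data', 'stale: not updated this month']
```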
Commonality of data and metrics definitions. Software organizations had different definitions of the measures that were reported. For example, supported code size in one Performing Organization could include comments whilst another might not.
A common culture in regard to data, metrics and reports. We expect these to be taken as a common set of metrics to be applied across Motorola for Software Development.
Support Performance Excellence Scorecard software engineering related goals.
25. DMAIC - Future Plans (2) Continuing work in this area:
Redefinition of project categories to simplify (and orthogonalise) the schema.
Building traceability through intermediate tools if required, to allow for better tracking of GSG effort usage.
Development of interfaces from E-IQMEn to other tools to facilitate once-only data entry.
26. DMAIC - Conclusions Software measurements are amenable to statistical analysis
27. Thank You Krishna Arul
Motorola