Software measurement Ronan Fitzpatrick
Quotes • “Measure what is measurable, and make measurable what is not so.” Galileo Galilei • “When you can measure what you are speaking about and express it in numbers you know something about it; but when you cannot measure it, when you cannot express it in numbers, your knowledge is of a meagre and unsatisfactory kind.” William Thomson (Lord Kelvin)
Quotes • “You can’t control what you can’t measure”. Tom DeMarco • “We must be bold in our attempts at measurement. Just because no one has measured some attribute of interest does not mean that it cannot be measured satisfactorily”. Fenton and Pfleeger
Vocabulary • Measure • Indirect measure • Metric • Composite metric • Models for measurement • Generic methods
Definition • Measurement • set of operations having the object of determining a value of a measure. ISO/IEC 15939:2002 • [A] software measurement is a quantified attribute of a characteristic of a software product or the software process. Wikipedia, 2010 • Compare the similarity with the definition of a software metric: • A software metric is a measure of some property of a piece of software or its specifications.
The purpose of software measurement • Prediction – • To predict complexity • To predict usage • Predictive analytics • Control – • Production hours, cost, security, quality • Assessment – • Usability • Web analytics • Return on Investment
Some generic examples • Halstead • McCall et al. • Boehm – CoCoMo • Albrecht – Function Point Analysis • Bevan – Usability metrics • Nielsen – Heuristics
Domain of units • Counting is core to measurement, that is counts of the: • Units of the metric • Units of time • Units of cost
Production • Complexity Metrics • Cohesion and Coupling • Instability (I): the ratio of efferent coupling (Ce) to total coupling, efferent plus afferent (Ce + Ca), such that I = Ce / (Ce + Ca). This metric is an indicator of the package's resilience to change. The range for this metric is 0 to 1, with I = 0 indicating a completely stable package and I = 1 indicating a completely unstable package. Robert Cecil Martin’s software package metrics
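A minimal sketch of this instability calculation in Python; the function name and the coupling counts used in the example are illustrative, not taken from Martin's text.

```python
def instability(efferent: int, afferent: int) -> float:
    """Robert C. Martin's instability metric I = Ce / (Ce + Ca).

    efferent (Ce): number of outgoing dependencies from the package.
    afferent (Ca): number of incoming dependencies on the package.
    Returns a value in [0, 1]: 0 is completely stable, 1 completely unstable.
    """
    total = efferent + afferent
    if total == 0:
        return 0.0  # an isolated package has no coupling; treat it as stable
    return efferent / total

# Illustrative counts for a hypothetical package
print(instability(efferent=6, afferent=2))  # 0.75 -> leans unstable
```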
Project Metrics • Historical records • Producing lines of code • Cost per hour
Software Metrics Wikipedia, March 2009 • Bugs per line of code • Source lines of code or program length • Number of lines of customer requirements or program size • Execution time • Code coverage • Cohesion • Comment density • Coupling • Cyclomatic complexity • Function point analysis • Instruction path length • Program load time • Number of classes and interfaces • Robert Cecil Martin’s software package metrics
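As a hedged illustration of two of the simpler metrics in this list, here is a sketch that counts source lines of code and comment density for a Python source string; the line-classification rule is a simplifying assumption, not a standard definition.

```python
def sloc_and_comment_density(source: str) -> tuple[int, float]:
    """Count non-blank source lines and the fraction that are comment lines.

    A deliberately simple rule is used: a line counts as a comment if its
    first non-whitespace character is '#'. Docstrings and inline comments
    are not handled, so treat the numbers as rough indicators only.
    """
    lines = [ln.strip() for ln in source.splitlines() if ln.strip()]
    if not lines:
        return 0, 0.0
    comments = sum(1 for ln in lines if ln.startswith("#"))
    return len(lines), comments / len(lines)

example = """# add two numbers
def add(a, b):
    return a + b  # simple
"""
print(sloc_and_comment_density(example))  # (3, 0.333...)
```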
Product Metrics • H-CI quality: Suitability, Installability, Functionality, Adaptability, Ease-of-use, Learnability, Interoperability, Reliability • Technical quality: Safety, Security, Correctness, Efficiency, Portability, Testability, Maintainability, Re-usability
Top 10 Call Centre Metrics and What they mean to you Canadian Marketing Association 2010 • Examples • Manage your workforce • Control costs effectively • Continuously enhance the client experience • Ensure the contact centre is a contributor to the overall profitability of an organization
Top 10 Call Centre Metrics and What they mean to you Canadian Marketing Association 2010 • Specific examples • 1. Abandon Rate • Abandon rate is the number of calls that hang up before connecting to an agent. This number does not include calls that receive a busy signal. • Calculation: Abandoned Calls / Total Incoming Calls. Compare this to web analytics, e.g., converted customers.
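A minimal sketch of the abandon-rate calculation described above; the call counts are illustrative.

```python
def abandon_rate(abandoned_calls: int, total_incoming_calls: int) -> float:
    """Abandon Rate = Abandoned Calls / Total Incoming Calls.

    Calls that received a busy signal are excluded from both counts,
    per the definition on this slide.
    """
    if total_incoming_calls == 0:
        return 0.0  # no incoming calls, nothing to abandon
    return abandoned_calls / total_incoming_calls

print(f"{abandon_rate(abandoned_calls=45, total_incoming_calls=900):.1%}")  # 5.0%
```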
Top 10 Call Centre Metrics and What they mean to you Canadian Marketing Association 2010 • Specific examples • 10. Call Quality • Call Quality is a standard scoring/rating system that contact centres use to determine how well an agent deals with the customers. There are no industry standards for monitoring quality, but there will usually be a list of criteria that an agent must cover during a call. This includes, but is not limited to, how the agent answers the call, how they navigate the caller to a resolution, and how they end the call. • Calculation: Number of Criteria Met / Number of Total Criteria
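A minimal sketch of the call-quality score as defined above; since the slide notes there is no industry standard, the criteria names used here are purely hypothetical.

```python
def call_quality(criteria_met: dict[str, bool]) -> float:
    """Call Quality = Number of Criteria Met / Number of Total Criteria."""
    if not criteria_met:
        return 0.0
    return sum(criteria_met.values()) / len(criteria_met)

# Hypothetical scoring sheet for one monitored call
score = call_quality({
    "greeted caller correctly": True,
    "navigated caller to a resolution": True,
    "confirmed resolution before closing": False,
    "ended call courteously": True,
})
print(f"{score:.0%}")  # 75%
```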
Procurement Metrics • ROI. • The total amount of revenue brought in by the software minus the total amount of costs to produce the software.
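A minimal sketch of the ROI figure as defined on this slide (revenue minus cost), with the more common ROI ratio shown alongside for comparison; the amounts are illustrative.

```python
def roi(revenue: float, cost: float) -> tuple[float, float]:
    """Return (net return, ROI ratio) for a software investment.

    Net return follows the slide's definition: revenue minus cost.
    The ratio (net return / cost) is the more common ROI expression.
    """
    net = revenue - cost
    ratio = net / cost if cost else 0.0
    return net, ratio

net, ratio = roi(revenue=250_000.0, cost=180_000.0)
print(net, f"{ratio:.1%}")  # 70000.0 38.9%
```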
Operations and Maintenance Metrics • SLA Metrics • Mean Time Metrics • Between failures; to recover; to repair
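A minimal sketch of two mean-time metrics, assuming we already have the uptime intervals between failures and the repair durations as lists of hours; the values are illustrative.

```python
from statistics import mean

def mtbf(hours_between_failures: list[float]) -> float:
    """Mean Time Between Failures: average uptime between consecutive failures."""
    return mean(hours_between_failures)

def mttr(repair_hours: list[float]) -> float:
    """Mean Time To Repair: average time taken to restore service after a failure."""
    return mean(repair_hours)

# Illustrative operational data for one service
print(mtbf([120.0, 200.0, 160.0]))  # 160.0 hours
print(mttr([1.5, 0.5, 1.0]))        # 1.0 hours
```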
Usability Metrics • Effectiveness • Productivity • Safety • Satisfaction
Other Measures • Total Ownership Cost • Cost of program development • Cost of program maintenance • Cost of Quality • Percent of total time spent in appraisal (walkthroughs, reviews, inspections) • Percent of total time spent in rework (compile and re-test) • Requirements Satisfaction • Number of acceptance test defects in User acceptance • Acceptance test defects per KLOC
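A hedged sketch of two of the measures listed above: percent of total time spent in an activity such as appraisal or rework, and acceptance test defects per KLOC. The figures are illustrative.

```python
def percent_of_total(hours: float, total_hours: float) -> float:
    """Share of total project time spent in a given activity (e.g. appraisal, rework)."""
    return hours / total_hours if total_hours else 0.0

def defects_per_kloc(defects: int, lines_of_code: int) -> float:
    """Acceptance test defects per thousand lines of code (KLOC)."""
    return defects / (lines_of_code / 1000) if lines_of_code else 0.0

print(f"{percent_of_total(hours=120, total_hours=1000):.0%}")  # 12% in appraisal
print(defects_per_kloc(defects=18, lines_of_code=45_000))       # 0.4 defects per KLOC
```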
Desirable characteristics of metrics Metrics should be • reliable (free from random error) • repeatable (same entity, same environment, same visitors and same evaluator) • reproducible (same entity, same environment, same visitors but different evaluator) • available (constraint conditions) • indicative of improvement (of the entity quality) • correct (objective, impartial and precise) and meaningful (about the entity behaviour or quality characteristics).
Measurement and International Standards (typical) • BS ISO/IEC 20968:2002 Software engineering. Mk II function point analysis. Counting practices manual • BS ISO/IEC 14143 (Parts 1 to 6):1998 Information technology. Software measurement. Functional size measurement • BS ISO/IEC 15939:2002 Software engineering. Software measurement process • PD ISO/IEC TR 9126-4:2004 Software engineering. Product quality. Part 4: Quality in Use metrics
Generic metric validation methodology • Fenton and Pfleeger: Conception; Design; Preparation; Execution; Analysis; Documentation and decision making • IEEE Std 1061: Identify the quality factors; identify the metrics sample; perform a statistical analysis; document the results; revalidate the metrics; evaluate the stability of the environment
Validating software metrics • Analysis using statistical methods • correlation, tracking changes capability, consistency, predictability, discriminative power and reliability.
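As an illustration of the correlation analysis mentioned above, a sketch that correlates a candidate metric (here, an assumed cyclomatic-complexity count per module) with an external quality attribute (defects found). The paired observations are invented for illustration; statistics.correlation requires Python 3.10+.

```python
from statistics import correlation  # Pearson correlation, Python 3.10+

# Hypothetical paired observations per module
complexity = [3, 7, 12, 5, 20, 9]
defects    = [0, 1,  3, 1,  6, 2]

r = correlation(complexity, defects)
print(f"Pearson r = {r:.2f}")  # a strong positive r supports the metric's validity
```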
Conclusion An alternative school of thought suggests that: • The emphasis on measurement is unkind to experts from other disciplines, for example, Darwin, Maslow, Van Gogh.