
Agile Metrics that Matter






Presentation Transcript


  1. Agile Metrics that Matter September 1, 2011 Ian Savage Brian Kronstad

  2. First… some questions for you • Size of your org: • <25 • 25-100 • 100-500 • >500 • Familiarity with metrics: • I confuse “metrics” with “matrix” • I know something about metrics • I have managed projects using explicit exit criteria • I have implemented a successful org-wide metrics program • Your expectations?

  3. Overview

  4. Some Agile Metrics* • Predictable, consistent delivery of business value • Customer involvement • Team cohesion, joy, happiness, trust • Status of customer service queues (SLA) • Burndown irregularities • Absolute velocity (total work done) • Relative velocity (total work done / plan) *from McAfee Program Managers – August 2011
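
The absolute and relative velocity metrics listed above reduce to simple arithmetic; a minimal sketch with invented sprint numbers:

```python
# Hypothetical sprint data: story points completed vs. planned per sprint.
completed = [18, 22, 20]
planned = [20, 20, 24]

absolute_velocity = sum(completed)                 # total work done
relative_velocity = sum(completed) / sum(planned)  # total work done / plan

print(absolute_velocity)             # 60
print(round(relative_velocity, 3))   # 0.938
```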

  5. Others’ Thoughts on Agile Metrics • “…primarily three metrics that provide the best measure of the state of an Agile project, namely progress, quality, and team morale.” http://www.projectsmart.co.uk/metrics-that-matter-in-agile-projects.html • “What do your stakeholders want?... All the information they need to make decisions and no more…” http://community.thoughtworks.com/files/1c2707ac7c/Agile_Metrics_that_Matter.pdf • “The number of graphs we generate from our application code is over 16,000. How do we collect so many metrics? We keep the process super simple.” http://www.mikebrittain.com/blog/2011/03/19/metrics-driven-engineering-at-etsy/

  6. Metrics Matter • People change behaviors to meet the numbers • So choose metrics wisely… • Easy to collect: • KLOCs • bug counts • calendar days • progress against plan • Quality-enabling • Strategic • customer loyalty • earned value • cost of quality • Operational • early detections • unit test coverage and worth • value delivered

  7. Weinberg Productivity Experiment

  8. So? How do we choose metrics? And what’s this GQM and FURPS?

  9. Goal-Question-Metric (GQM) • Dr. Victor Basili c. 1970 – University of Maryland • Goals • Business, product, or iteration results • Questions • Define the nature and scope of our inquiry • Metrics • Measurements that give the answers • Method* • Data collection, sorting, aggregating *my addition
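
The GQM(+Method) hierarchy above can be sketched as a small data model. This is an illustrative structure, not Basili's formal notation; all names and the example values are invented:

```python
from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str
    target: float
    method: str  # how the data is collected/derived (the "+Method" addition)

@dataclass
class Question:
    text: str
    metrics: list = field(default_factory=list)

@dataclass
class Goal:
    statement: str
    questions: list = field(default_factory=list)

# One goal, refined into a question, answered by a metric with a method.
goal = Goal(
    statement="Deliver business value predictably each sprint",
    questions=[
        Question(
            text="Is the team completing what it plans?",
            metrics=[Metric("relative velocity", target=0.9,
                            method="completed points / planned points")],
        )
    ],
)
print(goal.questions[0].metrics[0].name)  # relative velocity
```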

  10. Software Attributes (FURPS) • Grady and Caswell - 1987 - Hewlett-Packard • Model for classifying software quality attributes • Functionality - feature set, capabilities, generality, security • Usability - human factors, aesthetics, consistency, documentation • Reliability - failure frequency, recoverability, predictability, accuracy • Performance - speed, efficiency, resource use, throughput, response • Supportability - testability, extensibility, adaptability, maintainability, compatibility, configurability, serviceability, installability, localizability, portability Software Metrics: Establishing a Company-wide Program, Prentice-Hall

  11. Goals and Tradeoffs • Tim Lister: “If the organization does not measure, or in some other legitimate way determine, if a goal has been met, then the organization was never really serious about that goal.” PNSQC Proceedings - 1993 Keynote • Grady/Caswell: “Establishing priorities is important because of the tradeoffs involved between quality attributes. For example, adding a new function might improve functionality but decrease performance, usability, and/or reliability.”

  12. A Working Definition of “Quality” Quality is a function of salient attributes: Q(total) = f (attribute(1), attribute(2),… attribute(N))
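
One possible instantiation of Q(total) = f(attribute(1), …, attribute(N)) is a weighted average over the FURPS attribute scores; the weights and 0-10 ratings below are invented for illustration:

```python
# Stakeholder-assigned weights per FURPS attribute (must sum to 1.0).
weights = {"F": 0.3, "U": 0.2, "R": 0.2, "P": 0.2, "S": 0.1}
# Hypothetical 0-10 stakeholder ratings for each attribute.
scores = {"F": 8, "U": 6, "R": 9, "P": 7, "S": 5}

# Q(total) as a weighted sum of salient attributes.
q_total = sum(weights[k] * scores[k] for k in weights)
print(round(q_total, 1))  # 7.3
```

The priority ranking Grady and Caswell call for maps directly onto the weights: attributes the stakeholders rank higher get larger coefficients, and unmanaged attributes get weight zero.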

  13. Metrics Template

  14. Metrics Development (1/4) Notes: • Stakeholders meet and rank sort the attributes. • Some attributes are left unmanaged.

  15. Metrics Development (2/4) Notes: • Team discusses the attributes and records the questions we want answered. • These questions reflect values and drive decisions.

  16. Metrics Development (3/4) Notes: • Team decides the metrics that will answer the questions. • And set targets for each metric.

  17. Metrics Development (4/4) Notes: • Team determines methods for generating those metrics. • Check and adjust: • Method → Metric • Metric → Questions • Questions → Goals

  18. Metrics for Project XYZ

  19. Agile Scrum Characteristics • “Short” cycles that deliver value • Emphasis on fixed time and flexible scope • “if the scope don’t fit, you must omit” • Continuous backlog planning • Tests driving (or closely aligned with) coding • Refactoring is an expectation • Daily scrum meetings

  20. Implications for Reporting • Multiple levels to consider • Strategic – Product Backlog • Operational – Releases • Tactical – Sprints and scrums • Levels impact different project aspects • Decision-making, Planning, Measuring, Reporting • Stakeholders have different requirements • Project managers – e.g. Scope, Schedule, Budget, Quality • Sponsor – e.g. Features by release and date, TCO • Customers – e.g. Fit for use & purpose • May need to re-educate stakeholders • Agile terms, interpreting Agile reports, what’s significant to track (e.g. how to measure defect density given refactoring)

  21. Extending GQMM + Reporting • Goals (conceptual level) • What results do we want? • W.r.t.: Products, Processes, Resources • Questions (operational level) • How will we know we’ve met those goals? • Metrics (quantitative level) • What measurement gives the answer? • Method • How do we collect / derive the data? • Reporting • How do we convey the information in a way that provides value to various stakeholders?

  22. Quality attribute goals - Sprint

  23. Quality attribute goals - Release

  24. Quality attribute goals - Product

  25. Quality attribute goals - Portfolio

  26. Performance Reporting Example Sprint and Release Impacts on Performance Significance: The performance metric indicates response time changes due to sprints and releases. Analysis: Response times at sprint completions are progressing toward the NFR target of 2 seconds. Recent sprints have had less impact on response times. Response: Team will have an independent architecture and design review. Findings will be addressed in a refactor sprint following the October release.

  27. Common Agile Reports • Burn Down / Burn Up charts • Product, Release & Sprint • Amount of work completed • Cumulative Flow Diagrams • Earned Business Value • Velocity Table • Backlog Change Report

  28. Example Burn Down Chart http://www.atlassian.com/software/greenhopper/tour/burndown-charts.jsp
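
The data behind a sprint burn-down chart like the one linked above is just remaining points per day against an ideal straight-line burn; a sketch with invented numbers:

```python
sprint_days = 10
total_points = 40

# Points remaining at the end of each day (day 0 through day 10), invented.
actual = [40, 36, 34, 30, 27, 22, 18, 12, 6, 2, 0]

# Ideal burn: a straight line from total_points down to zero.
ideal = [total_points - total_points * d / sprint_days
         for d in range(sprint_days + 1)]

print(ideal[0], ideal[5], ideal[10])  # 40.0 20.0 0.0
```

Plotting `actual` against `ideal` gives the burn-down chart; the irregularities slide 4 mentions show up wherever the two lines diverge.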

  29. Cumulative Flow Chart Remaining In Process Completed http://www.atlassian.com/software/greenhopper/tour/burndown-charts.jsp

  30. Earned Business Value Business value delivered per sprint and release Significance: The metric indicates the relative business value delivered per sprint and release Analysis: While significant business value has been delivered, it has consistently fallen short of expectations Response: Team will perform root cause analysis to identify contributing factors for lower business value (e.g. scope cut) and whether additional business value is necessary to achieve product ROI
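
One common way to compute earned business value is to assign each backlog item a value and report the fraction accepted per sprint; the items and values below are invented:

```python
# Hypothetical backlog items with stakeholder-assigned business values.
planned_value = {"login": 30, "search": 50, "export": 20}
# Which items were accepted in which sprint.
accepted_by_sprint = {1: ["login"], 2: ["search"]}

total = sum(planned_value.values())  # 100
for sprint, items in accepted_by_sprint.items():
    earned = sum(planned_value[i] for i in items)
    print(sprint, earned / total)  # fraction of total value earned that sprint
```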

  31. Basic Velocity Table* *More developed velocity tables use a complexity factor which relates to the risk of a particular story or set of stories (aka epic)
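
The complexity factor the footnote describes can be applied as a risk-weighted multiplier on raw story points; the stories and factors below are invented:

```python
# Risk-adjusted points = raw points * complexity factor (per the slide note).
stories = [
    {"story": "A", "points": 5, "complexity": 1.0},
    {"story": "B", "points": 8, "complexity": 1.5},  # riskier epic
    {"story": "C", "points": 3, "complexity": 1.2},
]

adjusted = sum(s["points"] * s["complexity"] for s in stories)
print(round(adjusted, 1))  # 20.6
```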

  32. Backlog Change Report Backlog Features delivered per sprint and release Significance: Demonstrates the addition of backlog features to the product for each sprint and release Analysis: 95% of backlog features delivered as of sprint 10. Trending toward completion by sprint 12 Response: PM will discuss remaining features to determine whether their contribution to the end product is necessary.
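
The percentage in the analysis above is simply features delivered over backlog size, tracked per sprint; a sketch with invented counts:

```python
backlog_size = 40
# Cumulative features delivered as of each sprint (invented).
delivered_by_sprint = {8: 30, 9: 34, 10: 38}

for sprint, done in delivered_by_sprint.items():
    print(sprint, f"{done / backlog_size:.0%}")  # sprint 10 -> 95%
```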

  33. Some Lessons Learned • Understand how the information will be used • Does the report answer the Question? • Revalidate the Question • With experience the Question may change, new questions may arise • Understand how to deliver the information • Frequency, method (e.g. email), level of detail • Stakeholder preferences: tables/graphics, data/discussion • Include context for the information • Significance: Why is this information important • Analysis: How should this particular information be interpreted

  34. Some (more) Lessons Learned • Don’t overburden the development team with metrics collection • Less is more; metrics should be a by-product of development • Tracking and reporting story points versus hours • Be clear on how they’re used and be consistent • Set stakeholder expectations • Velocity will not likely stabilize until after a few sprints, so expect early estimates to be wrong • Flexible scope ≠ unmanaged scope

  35. Caution • Don’t mix project metrics and people metrics • Project metrics are not for reviewing people • They can be gamed and the data will skew • Will hurt team morale

  36. Additional Resources • The Goal Question Metric Approach, Basili, Caldiera, and Rombach • Software Metrics: Establishing a Company-wide Program, Grady and Caswell, 1987, Prentice-Hall • Practical Software Metrics for Project Management and Process Improvement, Robert Grady, 1992, Prentice-Hall • Earned Value and Agile Reporting, Anthony Cabri and Mike Griffiths, Quadrus Development Inc. • Establishing Metrics using Balanced Scorecard and Goal Question Metric Technique for Organizational Prosperity, Deborah Devadason, Qualcon 2004 • GQM+Strategies, Basili et al.

  37. Questions?
