
Software Sizing, Estimation, and Tracking

Presentation Transcript


  1. Software Sizing, Estimation, and Tracking Pongtip Aroonvatanaporn CSCI577 Spring 2012 February 10, 2012 (C) USC-CSSE

  2. Outline • Terms and Definitions • Software Sizing • Software Estimation • Software/Project Tracking (C) USC-CSSE

  3. Terms and Definitions • Software Sizing • Mechanism to estimate size and complexity • Software Estimation • Mechanism to estimate effort, time, duration • Software/Project Tracking • Mechanism to manage project progress (C) USC-CSSE

  4. Software Sizing • Agile Techniques • Story points • Planning Poker • Traditional Techniques • Expert Judgment • Function Points • Application Points • Uncertainty treatment • PERT Sizing • Wideband Delphi • COCOMO-U (C) USC-CSSE

  5. Story Points (C) USC-CSSE

  6. Story Points: What? • Estimation mechanism based on user stories • Features/capabilities are assigned story points • Often used by Scrum teams • Strong focus on agile process • A way to estimate difficulty • Without committing to a time duration • Measures size and complexity • Essentially, how hard it is (C) USC-CSSE

  7. Story Points: Why? • Better than hours • Humans are not good at estimating hours • Standish Group survey: 68% of projects failed to meet original estimates • Hours completed tell you nothing • No useful information for clients/customers • Story points can provide a roadmap of capabilities to be delivered • Less variation (C) USC-CSSE

  8. Story Points: How To? • Involves the entire team • Process • Look at the backlog of features • Pick the easiest • Give a score to that feature (e.g., 2) • Estimate other features relative to that point • Cohn Scale • Fibonacci-like: 0, 1, 2, 3, 5, 8, 13, 20, 40, 100 (C) USC-CSSE
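
A minimal sketch of this relative-estimation step, assuming a hypothetical baseline score and made-up relative multipliers from the team; it snaps each raw estimate to the nearest value on the Cohn scale:

```python
# Relative estimation against a baseline feature, snapped to the Cohn scale.
# Feature names and multipliers are hypothetical illustrations.
COHN_SCALE = [0, 1, 2, 3, 5, 8, 13, 20, 40, 100]

def snap_to_scale(raw: float) -> int:
    """Return the Cohn-scale value closest to a raw relative estimate."""
    return min(COHN_SCALE, key=lambda p: abs(p - raw))

baseline_points = 2  # score given to the easiest backlog item
relative_difficulty = {"login page": 1.0, "report export": 2.4, "payment flow": 6.1}

for feature, multiplier in relative_difficulty.items():
    print(feature, snap_to_scale(baseline_points * multiplier))
```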

  9. Story Points: How To? • Velocity • First sprint: velocity is a guess • After 2-3 sprints, use the average story points completed • Velocity used for planning future iterations (C) USC-CSSE
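
A small sketch of how velocity feeds iteration planning, with made-up sprint totals; after a few sprints, the average points completed per sprint forecast how many more sprints the remaining backlog needs:

```python
import math

# Story points completed in each finished sprint (illustrative numbers).
completed_per_sprint = [18, 23, 21]

velocity = sum(completed_per_sprint) / len(completed_per_sprint)

remaining_backlog_points = 120
sprints_needed = math.ceil(remaining_backlog_points / velocity)

print(f"velocity ~ {velocity:.1f} points/sprint")
print(f"~{sprints_needed} more sprints to finish the backlog")
```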

  10. Story Points: The Good • Estimate the backlog • Focus on product, not tasks • Items that are valuable to clients/customers • Track progress based on results delivered • Hours are bad • 1 hour for the most productive team ≠ 1 hour for the least productive team • In industry • Story point estimation cuts estimation time by 80% • More estimation and tracking than typical waterfall • 48 times faster than traditional waterfall estimation (C) USC-CSSE

  11. Story Points: The Bad • Publishing vs. development • Less effort for publishing • Complexity vs. time • Some stories are intellectually complex • Some stories are simply time consuming • Less complex but repetitive tasks get lower numbers • Not accurate about the actual effort required • Some developers prefer hours and days • Difficult to determine completion time without velocity (C) USC-CSSE

  12. Story Points: Example • Students can purchase monthly parking passes online • Parking passes can be paid via credit cards • Parking passes can be paid via PayPal • Professors can input student marks • Students can obtain their current seminar schedule • Students can order official transcripts • Students can only enroll in seminars for which they have pre-requisites • Transcripts will be available online via a standard browser http://www.agilemodeling.com/artifacts/userStory.htm (C) USC-CSSE

  13. Planning Poker (C) USC-CSSE

  14. Planning Poker: What? • A mechanism to • Introduce estimation • Invoke discussions • Like playing poker • Each person has cards • Reveal cards at the same time (C) USC-CSSE

  15. Planning Poker: Why? • Multiple expert opinions • Knowledgeable people are best suited for estimation tasks • Estimates require justification • Improves accuracy • Better compensates for missing information • Good for story point estimation • Averaging estimates gives better results (C) USC-CSSE

  16. Planning Poker: How? • Include all developers • Process • Each estimator is given a deck of cards • For each user story, the moderator reads the description • Discuss the story until all questions are answered • Each estimator selects a card representing his/her estimate • Everyone shows their cards at the same time (avoids bias) • High and low estimators explain their estimates • Discuss the estimates • Estimators re-select cards • If estimates converge, take the average • If estimates do not converge, repeat the process (C) USC-CSSE
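
A minimal simulation of the reveal-discuss-repeat loop above, assuming the estimates for each round arrive as a list; the convergence rule used here (everyone shows the same card) is an assumption, and teams pick their own:

```python
# One planning-poker item: rounds of simultaneous card reveals.
# Estimates per round are illustrative; in practice they come from people.
rounds = [
    [3, 8, 5, 13],   # wide spread -> high/low estimators explain, re-vote
    [5, 8, 5, 8],    # closer -> discuss again
    [8, 8, 8, 8],    # converged
]

def converged(cards: list[int]) -> bool:
    """Treat the round as converged when everyone shows the same card."""
    return len(set(cards)) == 1

for i, cards in enumerate(rounds, start=1):
    print(f"round {i}: {cards}")
    if converged(cards):
        print(f"agreed estimate: {cards[0]} points")
        break
else:
    print("no convergence; defer the item or average the last round")
```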

  17. Planning Poker: How? • Done at two different times • First • Before the project begins • Estimate a large number of items • Initial set of user stories • Second • At the end of each iteration • Estimate for the upcoming iteration (C) USC-CSSE

  18. Planning Poker: The Good • Fun and enjoyable • Convergence of estimates • More accurate • Justifications • Invokes group discussion • Improves understanding • Improves perspectives (C) USC-CSSE

  19. Planning Poker: The Bad • Easy to get into excessive amounts of discussion • Inaccurate estimates lead to bad results • Requires a high level of expertise • Opinions • Analogies • High and low estimators may be viewed as “attackers” (C) USC-CSSE

  20. Function Points (C) USC-CSSE

  21. Function Points: What? • Quantify the functionality • Measure development and maintenance • Independent of technology • Consistent across all projects • Unit of measure representing the function size • Application = number of functions delivered • Based on the user's perspective • What the user asked for, not how it is delivered • Low cost and repeatable • Good for estimating use cases (C) USC-CSSE

  22. Function Points: How? • Process • Determine function counts by type • Classify each function count by complexity level • Apply complexity weights • Compute Unadjusted Function Points: add all the weighted function counts to get one number (UFP) • (A worked sketch of these steps follows the Complexity Levels slide below) (C) USC-CSSE

  23. Function Points: How? • Data functions • Internal logical files • External interface files • Transactional functions • External inputs • External outputs • External inquiries (C) USC-CSSE

  24. Internal Logical Files • Data that is stored and maintained within your application • Data that your application is built to maintain • Examples • Tables in database • Flat files • Application control information • Configuration • Preferences • LDAP data stores (C) USC-CSSE

  25. External Interface Files • Data that your application uses/references • But not maintained by your application • Any data that your application needs • Examples • Same as Internal Logical Files • But not maintained by your system (C) USC-CSSE

  26. External Inputs • Unique user data or user control input that enters the application • Comes from external boundary • Examples • Data entry by users • Data or file feeds by external applications (C) USC-CSSE

  27. External Output • User data or control information that leaves the application • Crosses the external boundary • Presents information • Retrieval of data or control • Examples • Reports • Data displayed on screen (C) USC-CSSE

  28. External Inquiry • Unique input-output combination • Input causes/generates immediate output • No mathematical formulas or calculations • Creates no derived data • No Internal Logical Files are maintained during processing • Behavior of the system is not altered • Examples • Reports that do not involve derived data (direct queries) (C) USC-CSSE

  29. Complexity Levels (C) USC-CSSE
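
The complexity-weight table on this slide was an image and did not survive the transcript; the sketch below reproduces the standard IFPUG unadjusted weights and walks the counting steps from slide 22. The function counts themselves are made up for illustration:

```python
# Standard IFPUG complexity weights (low, average, high) per function type.
WEIGHTS = {
    "EI":  {"low": 3, "avg": 4,  "high": 6},   # external inputs
    "EO":  {"low": 4, "avg": 5,  "high": 7},   # external outputs
    "EQ":  {"low": 3, "avg": 4,  "high": 6},   # external inquiries
    "ILF": {"low": 7, "avg": 10, "high": 15},  # internal logical files
    "EIF": {"low": 5, "avg": 7,  "high": 10},  # external interface files
}

# Hypothetical classified counts: (function type, complexity level, count).
counts = [
    ("EI", "avg", 6), ("EO", "low", 4), ("EQ", "avg", 3),
    ("ILF", "high", 2), ("EIF", "low", 1),
]

ufp = sum(WEIGHTS[ftype][level] * n for ftype, level, n in counts)
print(f"Unadjusted Function Points: {ufp}")  # 87 for these counts
```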

  30. Function Point to SLOC • COCOMO II has built-in calibration for converting Unadjusted FP to SLOC • First specify the implementation language/technology • Apply the multiplier (SLOC/UFP) • More information can be found in COCOMO II book (C) USC-CSSE
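
Continuing the sketch above, the conversion itself is a single multiply; 53 SLOC/UFP for Java is the commonly cited COCOMO II value, but check the book's table for your actual language/technology:

```python
SLOC_PER_UFP_JAVA = 53  # commonly cited COCOMO II ratio for Java

ufp = 87  # from the counting sketch above
estimated_sloc = ufp * SLOC_PER_UFP_JAVA
print(f"~{estimated_sloc} SLOC")  # ~4611 SLOC
```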

  31. Function Points: The Good • Independent of programming language and technology • Help derive productivity and quality performance indicators • Benchmarking • Productivity rate • Cost/FP • Guard against increase in scope • Function creep (C) USC-CSSE

  32. Function Points: The Bad • Requires subjective evaluations • A lot of judgment involved • Many cost estimation models do not support function points directly • Need to be converted to SLOC first • Not as much research data available compared to LOC • Can only be performed after design specification (C) USC-CSSE

  33. Estimating with Uncertainty? (C) USC-CSSE

  34. Uncertainty Treatment • PERT Sizing • Uses a distribution • Specify pessimistic, optimistic, and most likely sizes • Biased? • Wideband Delphi • Experts discuss and estimate individually • Discussions focus on points where estimates vary widely • Reiterate as necessary • COCOMO-U • Extension to COCOMO II • Uses a Bayesian belief network to address uncertain parameters • Provides a range of possible values (C) USC-CSSE
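
A worked sketch of PERT sizing using the standard beta-distribution weighting, expected size E = (O + 4M + P) / 6 with standard deviation (P − O) / 6; the size numbers are illustrative:

```python
# PERT size estimate from optimistic, most-likely, pessimistic sizes (KSLOC).
optimistic, most_likely, pessimistic = 8.0, 12.0, 22.0

expected = (optimistic + 4 * most_likely + pessimistic) / 6
std_dev = (pessimistic - optimistic) / 6

print(f"expected size: {expected:.1f} KSLOC")  # 13.0
print(f"std deviation: {std_dev:.1f} KSLOC")   # 2.3
```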

  35. Workshop time! (C) USC-CSSE

  36. Scenario • Develop a software system for effort reporting • Sound familiar? • Software requirements • User authentication • User capabilities • Select week • Submit weekly effort • View/update weekly effort • View weekly total • Admin capabilities • View grade report by user (on-time submission) • Add/view/edit effort categories (C) USC-CSSE

  37. Outline • Terms and Definitions • Software Sizing • Software Estimation • Software/Project Tracking (C) USC-CSSE

  38. Project Tracking • Goal-Question-Metric • PERT Network Chart • Gantt Chart • Burn Up and Burn Down Charts (C) USC-CSSE

  39. Goal-Question-Metric: What? • By Victor Basili, University of Maryland and NASA • Software metric approach • Captures measurement on three levels • Conceptual level (goal) • Defined for an object • Operational level (question) • Defines models of the object of study • Quantitative level (metric) • Metrics associated with each question in a measurable way (C) USC-CSSE

  40. Goal-Question-Metric: Why? • Used within context of software quality improvement • Effective for the following purposes: • Understanding organization’s software practices • Guiding and monitoring software processes • Assessing new software engineering technologies • Evaluating improvement activities (C) USC-CSSE

  41. Goal-Question-Metric: How? • Six-step process • Develop a set of corporate, division, and project business goals • Generate questions defining those goals • Specify measures needed to be collected to answer the questions • Develop mechanisms for data collection • Collect, validate, and analyze data; provide feedback in real time • Analyze data in a post-mortem fashion; provide recommendations for future improvements (C) USC-CSSE
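
A minimal sketch of the goal → question → metric hierarchy produced by steps 1-3; all goal, question, and metric text here is a hypothetical illustration for a reliability goal:

```python
# One GQM tree: a conceptual goal, operational questions, quantitative metrics.
gqm = {
    "goal": "Improve reliability of the release from the project's viewpoint",
    "questions": [
        {
            "question": "Is the current defect-removal process effective?",
            "metrics": ["defects found per inspection hour",
                        "percent of defects caught before system test"],
        },
        {
            "question": "Is reliability improving release over release?",
            "metrics": ["post-release defects per KSLOC"],
        },
    ],
}

for q in gqm["questions"]:
    print(q["question"])
    for m in q["metrics"]:
        print("  metric:", m)
```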

  42. Goal-Question-Metric: The Good • Aligns with the organization's environment • Objectives and goals • Project context • Flexible (C) USC-CSSE

  43. Goal-Question-Metric: The Bad • Only useful when used correctly • Must specify the right goals, questions, and metrics to measure • Requires experience and high level of knowledge to use • No explicit support for integrating with higher-level business goals and strategies • Some things cannot be measured (C) USC-CSSE

  44. GQM+Strategies: What? • An extension of GQM • Built on top of GQM • Links software measurement goals to higher-level goals • Of the software organization • Of the entire business (C) USC-CSSE

  45. GQM+Strategies: Example • Wants: Increase customer satisfaction • Strategy: Improve product reliability • Both hardware and software • Software development contribution • Reduce defect slippage • Improve testing process • Team leaders decide on set of actions to take • Implement improvements • Measure results of improvements • A tie between test defect data and customer satisfaction (C) USC-CSSE

  46. GQM+Strategies: Example (C) USC-CSSE

  47. Other Project Management Methods (C) USC-CSSE

  48. PERT Network Chart • Identifies critical paths • Nodes are updated to show progress • Grows quickly • Becomes unusable when large • Especially in smaller agile environments • Eventually gets thrown away (C) USC-CSSE
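
A minimal sketch of the critical-path computation behind a PERT network chart, assuming a hypothetical task graph with per-task durations; it finds the longest-duration path through the dependency DAG:

```python
from functools import lru_cache

# Hypothetical tasks: (duration in days, prerequisite tasks).
tasks = {
    "design":  (5, []),
    "db":      (4, ["design"]),
    "ui":      (6, ["design"]),
    "api":     (7, ["db"]),
    "testing": (3, ["ui", "api"]),
}

@lru_cache(maxsize=None)
def finish_time(task: str) -> int:
    """Earliest finish: task duration plus the latest prerequisite finish."""
    duration, deps = tasks[task]
    return duration + max((finish_time(d) for d in deps), default=0)

project_length = max(finish_time(t) for t in tasks)
print(f"critical path length: {project_length} days")  # 19 days
```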

  49. “Burn” Charts • Burn-up and burn-down charts (shown as images in the original slides) • Effective in tracking progress • Good for story points • Not good at responding to major changes (C) USC-CSSE
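
A small sketch of the bookkeeping behind a burn-down chart, with made-up sprint numbers: remaining points drop as work completes. A burn-up chart would instead plot cumulative completed points against total scope, which makes scope changes visible:

```python
# Burn-down: points remaining at the end of each sprint (illustrative data).
total_scope = 100
completed_per_sprint = [15, 20, 10, 25]

remaining = total_scope
print(f"sprint 0: {remaining} points remaining")
for sprint, done in enumerate(completed_per_sprint, start=1):
    remaining -= done
    print(f"sprint {sprint}: {remaining} points remaining")
```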

  50. Workshop Time! (C) USC-CSSE
