
INFO 630 Evaluation of Information Systems
Prof. Glenn Booker


Presentation Transcript


  1. INFO 630 Evaluation of Information Systems, Prof. Glenn Booker
  Week 2 – Fundamental Measures

  2. Quality
  • A common objective when measuring the creation of something is to ensure it has quality
  • The definition of quality is often somewhat subjective, or based on limited personal experience
  • What does ‘quality’ mean to you?

  3. Quality
  • More formal definitions include:
  • Crosby – “conformance to requirements”
  • Juran – “fitness for use”
  • Both of these address a similar theme, i.e., the product can do what it’s supposed to do

  4. Quality
  • These definitions imply that the customer defines quality, not the developer
  • Another set of quality definitions is:
  • Small q – relates to the lack of defects and the reliability of a product
  • Big Q – includes small q, plus customer satisfaction and process quality

  5. Software Quality
  • Software quality typically includes many of these aspects:
  • Conformance to requirements
  • Lack of defects
  • Customer satisfaction
  • Later we’ll look at more specific aspects of customer satisfaction, using the CUPRIMDA measures

  6. What is the perfect vehicle for:
  • Going 200 mph at Le Mans?
  • Driving through deep mud and snow?
  • Carrying a Boy Scout troop to a ball game?
  • Carrying sheets of plywood to a construction site?
  • Feeling the wind in your hair?
  • Towing an enormous trailer?

  7. The Right Metrics
  • Asking “what should I measure?” is like asking “what vehicle should I buy?”
  • The answer varies wildly, depending on your needs and resources
  • Hence a major portion of this course describes typical metrics for a wide range of needs

  8. Measurements by Area
  • Recall from last week that, in order to create a product, we need people (resources) using tools according to some kind of process
  • We can measure each of those areas:
  • Product
  • Process
  • Resource
  • Tool

  9. Product Metrics
  • Size (Lines of Code, Function Points)
  • Size (memory or storage needs - particularly for embedded software)
  • Complexity (within or among modules)
  • Number of modules, classes, tables, queries, screens, inputs, outputs, interfaces, etc.

  10. Process Metrics
  • Is the process being followed?
  • When in doubt, define a plan of action - then measure progress against the plan
  • Are milestones being met?
  • Are tasks starting on time? Ending on time?
  • Are people participating in the processes?

  11. Process Metrics
  • Are the processes effective and productive?
  • How do you measure that?
  • Do the processes meet quality standards?
  • Which standards are relevant for your industry?

  12. Resource Metrics
  • Do effort, cost, and schedule match their plans?
  • Are needed people available? Is the project fully staffed?
  • Are the staff qualified to perform their duties?
  • Is training adequate to meet project needs?

  13. Tool Metrics
  • How much (time, labor effort, cost) did our tools cost? Does that include training?
  • How much are the tools being used?
  • Are the tools meeting our needs?
  • Are they providing unexpected benefits?
  • How has use of the tools affected our productivity? Rework? Defect rate?

  14. Testing Metrics
  • Effort expended on testing
  • Number of test cases developed, run, and passed
  • Test coverage (% of possible paths)
  • Test until desired quality is achieved (defect rate)
  • Number of defects remaining (by severity)
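As a quick illustration of how these counts roll up into simple ratios, here is a minimal sketch in Python; all of the test and defect numbers are hypothetical, and a real test-management tool would report these figures directly.

  # A minimal sketch of turning raw test and defect counts into the testing
  # metrics listed above; every number here is hypothetical.
  from collections import Counter

  tests_developed, tests_run, tests_passed = 120, 110, 97
  paths_covered, paths_possible = 840, 1000
  open_defects = ["critical", "major", "major", "minor", "minor", "minor"]

  print(f"Pass rate: {tests_passed / tests_run:.0%}")            # ~88%
  print(f"Test coverage: {paths_covered / paths_possible:.0%}")  # 84%
  print("Defects remaining by severity:", Counter(open_defects))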

  15. Where to Begin?
  • Given such a wide range of measurements, where do we begin?
  • Start with the core measurements that are most critical to getting a handle on your organization’s performance

  16. Initial Core Measurements
  • Size - how large is the software?
  • Effort and cost - how much staff time or dollars to build the software?
  • Schedule - over what calendar time period?
  • Problems and defects - how good is the process, based on the quality of the resulting product?

  17. Why Measure Software Size?
  • Project planning – size is a needed input for prediction models
    • Effort and cost are a function of size, etc.
  • Project management – monitor progress during development
    • Plot planned vs. actual lines of code over time
  • Process improvement – size is a normalizing factor
    • Programmer productivity, defect density
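To see size acting as a normalizing factor, here is a tiny worked sketch with hypothetical numbers: raw effort and defect counts only become comparable across projects once divided by product size.

  # Size as a normalizing factor (hypothetical numbers).
  size_kloc = 25.0          # delivered size in thousands of lines of code
  effort_staff_months = 60  # total development effort
  defects_found = 150       # defects found in the first year of use

  productivity = size_kloc * 1000 / effort_staff_months  # LOC per staff-month
  defect_density = defects_found / size_kloc             # defects per KLOC

  print(f"{productivity:.0f} LOC/staff-month, {defect_density:.1f} defects/KLOC")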

  18. Program Size
  • Lines of code or function points are used as measures of program size
  • Function points are also used in the project management course (INFO 638)
  • Lines of code
  • “Source statements” usually implies logical lines
  • Count the statement delimiters for a language, such as “;”
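A deliberately crude sketch of the delimiter-counting idea for a C-like language: non-blank, non-comment lines approximate physical LOC, and semicolons approximate logical statements. Real LOC counters handle string literals, block comments, and compound statements much more carefully.

  # Crude physical vs. logical LOC counter for C-like source. It treats ";" as
  # the logical-statement delimiter and ignores complications such as semicolons
  # inside string literals or comments; real counters are more careful.
  def count_loc(source: str):
      physical = logical = 0
      for line in source.splitlines():
          stripped = line.strip()
          if not stripped or stripped.startswith("//"):
              continue                      # skip blank lines and line comments
          physical += 1                     # physical LOC: non-blank, non-comment lines
          logical += stripped.count(";")    # logical LOC: statement delimiters
      return physical, logical

  sample = """
  // compute a sum
  int total = 0;
  for (int i = 0; i < n; i++) { total += values[i]; }
  return total;
  """
  print(count_loc(sample))  # (3, 5) -- 3 physical lines, 5 semicolons

Note how the one-line for loop contributes 1 physical line but 3 logical statements, which is exactly the physical vs. logical distinction on the next slide.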

  19. Program Size
  • Source LOC (SLOC) refers to physical or logical lines of code
  • Many different ways to decide what counts as a LOC and what doesn’t count
  • See the LOC handout, “Lines of Code Definition Options”

  20. Problems With LOC
  • Differences in counting standards across organizations
  • Physical lines
  • Logical lines
  • Variations within each standard

  21. Problems With LOC
  • No consistent method for normalizing across languages - this penalizes high-level languages
  • More than half the effort of development is devoted to non-coding work
  • Moving from a low-level language to a high-level language reduces the volume of code
  • Non-coding work acts as a fixed cost, driving up the cost per line of code

  22. Examples Illustrating LOC
  • Project with 10,000 SLOC (assembly language)
  • Front-end work = 5 months; coding = 10 months
  • Total project time = 15 months at $5,000 per staff month
  • Cost = 15 months x $5,000 = $75,000
  • $7.50 per LOC

  23. Examples Illustrating LOC
  • Project with 2,000 SLOC (Ada83)
  • Front-end work = 5 months; coding = 2 months
  • Total project time = 7 months at $5,000 per staff month
  • Cost = 7 months x $5,000 = $35,000
  • $17.50 per LOC
  • So the more modern language looks more expensive!
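The two examples reduce to a few lines of arithmetic, shown below; the key point is that the 5 months of front-end (non-coding) work is a fixed cost in both cases, so the smaller Ada83 program is cheaper overall yet looks worse per line.

  # Reproduces the two cost-per-LOC examples above: same $5,000/staff-month rate
  # and the same 5 months of front-end (non-coding) work in both cases.
  RATE = 5_000  # dollars per staff-month

  def cost_per_loc(sloc, front_end_months, coding_months):
      total_cost = (front_end_months + coding_months) * RATE
      return total_cost, total_cost / sloc

  print(cost_per_loc(10_000, 5, 10))  # assembly: ($75,000, $7.50 per LOC)
  print(cost_per_loc(2_000, 5, 2))    # Ada83:    ($35,000, $17.50 per LOC)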

  24. Halstead Metrics for Size
  • Tokens are either operators (+, -) or operands (variables)
  • Counts:
  • Total number of operator tokens used, N1
  • Total number of operand tokens used, N2
  • Number of unique operator tokens, n1
  • Number of unique operand tokens, n2

  25. Halstead Metrics for Size
  • Derive metrics such as Vocabulary, Length, Volume, Difficulty, etc.
  • Measure of program size (length): N = N1 + N2
  • Estimated length: N̂ = n1*log2(n1) + n2*log2(n2)
  • Relation to Zipf’s law in library science
  • Says the most frequently used words in a document drop off logarithmically
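These size formulas are easy to compute once the tokens of a program have been classified; the sketch below also derives vocabulary and volume. How tokens get classified into operators and operands is itself a counting-standard decision, so the token lists here are supplied by hand for a single toy statement.

  import math

  def halstead(operators, operands):
      """Basic Halstead size metrics from pre-classified token lists."""
      N1, N2 = len(operators), len(operands)             # total operator / operand tokens
      n1, n2 = len(set(operators)), len(set(operands))   # unique operator / operand tokens
      length = N1 + N2                                    # program length N
      est_length = n1 * math.log2(n1) + n2 * math.log2(n2)  # estimated length N-hat
      vocabulary = n1 + n2                                # vocabulary n
      volume = length * math.log2(vocabulary)             # volume V = N * log2(n)
      return {"N": length, "N_hat": round(est_length, 1),
              "n": vocabulary, "V": round(volume, 1)}

  # Tokens for the single statement:  z = x + y * 2;
  print(halstead(operators=["=", "+", "*", ";"], operands=["z", "x", "y", "2"]))
  # {'N': 8, 'N_hat': 16.0, 'n': 8, 'V': 24.0}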

  26. Function Points
  • Function points measure software size by quantifying the functionality provided to the user
  • Objectives of function point counting:
  • Measure the functionality that the user requests and receives
  • Measure software development and maintenance independently of implementation technology
  • Provide a consistent measure among various projects and organizations

  27. Why Function Points?
  • SLOC gives no indication of functionality
  • Some languages (such as COBOL) are more verbose than others (such as C or Pascal) or than 4GL database languages (e.g. MS Access, Powerbuilder, 4th Dimension)
  • Function points are independent of language and number of SLOC

  28. Why Function Points?
  • SLOC can actually be measured only when coding is complete
  • Using function points, we can evaluate software early in the life cycle

  29. Function Points Count
  • External Inputs: inputs from the keyboard, communication lines, tapes, touchscreen, etc.
  • External Outputs: reports and messages sent to the user or another application; reports may go to screen, printer, or other applications

  30. Function Points Count
  • External Queries: queries from users or applications which read a database but do not add, change, or delete records
  • Logical Internal Files: store information for an application that generates, uses, and maintains the data
  • External Interface Files: contain data or control information passed from, passed to, or shared by another application
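A common way to roll the five component counts above into a single size figure is the unadjusted function point (UFP) count, a weighted sum over the components. The average-complexity weights used in the sketch below are the commonly published Albrecht/IFPUG values rather than anything stated on the slides, and a full count would also adjust for component complexity and general system characteristics.

  # Unadjusted function point (UFP) count as a weighted sum of the five component
  # types above. The average-complexity weights are the commonly cited
  # Albrecht/IFPUG values and are an assumption, not taken from the lecture.
  AVERAGE_WEIGHTS = {
      "external_inputs": 4,
      "external_outputs": 5,
      "external_queries": 4,
      "logical_internal_files": 10,
      "external_interface_files": 7,
  }

  def unadjusted_function_points(counts):
      """counts maps each component type to how many of that type were identified."""
      return sum(AVERAGE_WEIGHTS[kind] * n for kind, n in counts.items())

  # Hypothetical system: 20 inputs, 15 outputs, 10 queries, 5 internal files, 3 interfaces
  ufp = unadjusted_function_points({
      "external_inputs": 20,
      "external_outputs": 15,
      "external_queries": 10,
      "logical_internal_files": 5,
      "external_interface_files": 3,
  })
  print(ufp)  # 20*4 + 15*5 + 10*4 + 5*10 + 3*7 = 266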

  31. Measuring Productivity Using Function Points
  • The productivity measure “SLOC/effort” is not a good measure when comparing across languages
  • Productivity using function points is:
  Productivity = Total function points / Effort in staff hours (or months)
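For example, with hypothetical numbers: a system counted at 266 function points and delivered with 20 staff-months of effort has a productivity of 266 / 20 = 13.3 function points per staff-month, a figure that can be compared across projects regardless of implementation language.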

  32. Cost-Effort Relationship
  • Some number of people are working on a project; their hourly rate, times the number of hours worked, is the labor cost for that project
  Labor Cost = Sum of (rate * hours)
  • Most models for effort, cost, and schedule are based on the size of the product in LOC

  33. Labor Rate
  • The hourly rate used for the cost of a project is the burdened labor rate, which includes the person’s salary, plus overhead expenses such as:
  • Facility costs, support personnel (admin assistants, upper management), generic computer resources, profit, utilities, etc.
  • The labor rate for an efficient organization is about 2 to 2.5 times the salary
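Putting the last two slides together, here is a minimal sketch of the labor-cost calculation: burden each salary (using the low end of the 2 to 2.5 range), convert it to an hourly rate, and sum rate times hours across the team. All of the salaries and hours below are hypothetical.

  # Labor Cost = Sum of (rate * hours), using burdened hourly rates derived from
  # annual salaries. The 2.0 burden multiplier and all figures are hypothetical.
  STAFF_HOURS_PER_YEAR = 2080  # 52 weeks * 40 hours, a common assumption

  def burdened_hourly_rate(annual_salary, burden_multiplier=2.0):
      return annual_salary * burden_multiplier / STAFF_HOURS_PER_YEAR

  team = [            # (annual salary, hours charged to the project)
      (90_000, 800),
      (70_000, 1200),
      (120_000, 400),
  ]

  labor_cost = sum(burdened_hourly_rate(salary) * hours for salary, hours in team)
  print(f"Project labor cost: ${labor_cost:,.0f}")  # about $196,000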

  34. Labor Rate
  • The worst example I saw was a labor rate 4.15 times my salary
  • That was as a civilian engineer in the Navy
  • Have you seen any actual labor rates you can share?

  35. Size-Duration Relationship
  • Projects tend to have a typical relationship between their effort and their overall duration
  • This produces a typical range of durations for a given size project
  • The effort and duration imply how many people will be needed on average for the project

  36. Software Effort
  • Measure effort for:
  • The total software project (the big picture)
  • Each life-cycle phase (useful for planning)
  • Each configuration item or subsystem (helps improve planning)
  • Fixing a software problem or enhancing software (maintenance)

  37. Software Development Models
  • Used to estimate the amount of effort, cost, and calendar time required to develop a software product
  • Basic COCOMO (Constructive Cost Model)
  • Intermediate COCOMO
  • COCOMO II (developed at USC)
  • A non-proprietary model, first introduced by Barry W. Boehm in 1981
  • The most popular cost estimation model in 1997

  38. Software Cost Models
  • COCOMO II is within 20% of actual project data just under 50% of the time, based on 83 calibration points (projects) – not very impressive!
  • SLIM
  • COCOTS
    • Also developed at USC
    • Used for estimating systems based on commercial off-the-shelf (COTS) software
    • Still experimental

  39. COCOMO Effort Equations
  • (Table in the original slides) Shows how the software size-effort relationship has evolved through the various versions of COCOMO
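For reference, the earliest of those relationships is easy to state: Basic COCOMO estimates effort as a power function of size, Effort = a * KLOC^b person-months, with Duration = c * Effort^d months. The sketch below uses the coefficients published with the 1981 model; COCOMO II replaces the fixed exponents with scale factors and cost drivers, so treat this purely as an illustration of the size-effort shape.

  # Basic COCOMO (Boehm, 1981): effort and schedule as power functions of size.
  COEFFICIENTS = {
      #  mode:          (a,   b,    c,   d)
      "organic":        (2.4, 1.05, 2.5, 0.38),
      "semidetached":   (3.0, 1.12, 2.5, 0.35),
      "embedded":       (3.6, 1.20, 2.5, 0.32),
  }

  def basic_cocomo(kloc, mode="organic"):
      a, b, c, d = COEFFICIENTS[mode]
      effort = a * kloc ** b          # person-months
      duration = c * effort ** d      # calendar months
      avg_staff = effort / duration   # average number of people
      return effort, duration, avg_staff

  effort, duration, staff = basic_cocomo(32, "organic")  # a hypothetical 32 KLOC project
  print(f"{effort:.0f} person-months over {duration:.0f} months (~{staff:.1f} people)")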

  40. COTS
  • COTS: commercial off-the-shelf software
  • Assessment – the activity of determining the feasibility of using COTS components for a larger system
  • Tailoring – the activity performed to prepare a COTS program for use
  • Glue code development – new code, external to the COTS product, that must be written to plug the component into the larger system

  41. COTS
  • Volatility – refers to the frequency with which new versions or updates of the COTS software are released by the vendors over the course of the system development

  42. COTS Problems
  • No control over a COTS product’s functionality or performance
  • Most COTS products are not designed to work with each other
  • No control over COTS product evolution
  • COTS vendor behavior varies widely

  43. COCOTS
  • COCOTS provides a solid framework for estimating software COTS integration cost
  • It needs further data, calibration, and iteration
  • The current spreadsheet model provided by Boehm could be used experimentally
  • COCOTS can be extended to cover other COTS-related costs
  • The model hasn’t been updated recently

  44. Software Structure
  • Some legacy terminology for software elements has survived
  • Software may have high-level CSCIs (computer software configuration items)
  • Each CSCI may be broken into CSCs (computer software components)
  • Each CSC may be broken into CSUs (computer software units), which have one or more modules of actual source code
  • This is the origin of the term “unit testing”

  45. Schedule Data
  • Schedule defines, for each task:
  • Start date (planned and actual)
  • End date (planned and actual)
  • Duration (planned and actual)
  • Resources needed (i.e. number of people)
  • Dependencies on other tasks
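One possible way to hold the per-task data listed above is a small record type; the field names below are hypothetical, not part of the lecture, and a real project would normally keep this in a scheduling tool rather than code.

  # A sketch of one way to record the per-task schedule data listed above.
  from dataclasses import dataclass, field
  from datetime import date
  from typing import List, Optional

  @dataclass
  class TaskSchedule:
      name: str
      planned_start: date
      planned_end: date
      actual_start: Optional[date] = None   # filled in as the project runs
      actual_end: Optional[date] = None
      people_needed: int = 1                # resources needed
      depends_on: List[str] = field(default_factory=list)  # predecessor tasks

      @property
      def planned_duration_days(self) -> int:
          return (self.planned_end - self.planned_start).days

  design = TaskSchedule("Preliminary design", date(2024, 3, 1), date(2024, 4, 15),
                        people_needed=3, depends_on=["Requirements"])
  print(design.planned_duration_days)  # 45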

  46. Schedule Data
  • Provide project milestones and major events
  • Reviews
    • PDR (preliminary design review)
    • CDR (critical design review)
  • Audits and inspections
  • Release of deliverables
  • Compare milestones’ planned and actual dates

  47. Schedule Data
  • Uses calendar dates
  • Track by configuration item or subsystem:
  • Number of CSUs completing unit test
  • Number of SLOC completing unit test
  • Number of CSUs integrated into the system
  • Number of SLOC integrated into the system

  48. Tracking Software Problems
  • Software problem data is:
  • A management information source for software process improvement
  • Used to analyze the effectiveness of the prevention, detection, and removal processes

  49. Tracking Software Problems
  • Problem data is a critical component in establishing software quality:
  • The number of problems in the product
  • Whether the product is ready for release to the next development step or to the customer
  • How the current version compares in quality to previous or competing versions

  50. Problem Terminology
  • Error: a human mistake resulting in incorrect software
  • Defect: an anomaly in the product
  • Failure: when a software product does not perform as required
  • Fault: an accidental condition which causes a system to perform incorrectly
