
Chapter 8: Management of SWE



  1. Chapter 8: Management of SWE Prof. Steven A. Demurjian Computer Science & Engineering Department The University of Connecticut 371 Fairfield Road, Box U-2155 Storrs, CT 06269-2155 steve@engr.uconn.edu http://www.engr.uconn.edu/~steve (860) 486 – 4818 (860) 486 – 3719 (office)

  2. Motivating SW Management • Why is Management Needed? • What are The Main Tasks Of Managers? • What is Special in the Case Of Software? • How Can Productivity be Measured? • Which Tools May be Used for Planning and Monitoring? • How Can Teams be Organized? • How Can Organizations' Capabilities be Defined and Measured?

  3. Overview of Chapter 8 • Motivate and Review the Entire SW Management Process – Introduce Ideas and Concepts • Detailed Examination of Project Management w.r.t. • Estimation • Risk Analysis • Implementation Strategies • Project Control • Work Breakdown • Project Scheduling • Organization of Personnel - Approaches to Teams • Software Acquisition • Software Re-engineering (CT Insurance Project) • Software Quality Assurance

  4. Motivation and Approach • Traditional Engineering Practice is to Define a Project Around the Product being Developed • Project Manager Oversees the Project and: • Documents Goals • Develops a Schedule and Budget • Acquires Resources • Oversees Project Execution • Monitors Progress • Staffing: Human Resource Acquisition • Recruiting (within the Company) and Hiring • Database Specialist May Work on 2 – 3 Projects • e.g., Splitting Time 20% / 50% / 30% Across Projects • Training, Rewarding, and Retaining • Directing Project Members

  5. Challenge of Project Management • Utilize Limited Resources to Achieve Independent and Sometimes Conflicting Goals • “Plan the Work and Work the Plan” • Management Decisions Involve: • Tradeoffs that Impact (Impacted by) Technical Aspects of Software Engineering • Software Engineering as: • Team Activity of Many Software Engineers • Management Coordinates Actions/Responsibilities "The creation and maintenance of an internal environment in an enterprise where individuals, working together in groups, can perform efficiently and effectively toward the attainment of group goals" (Koontz et al, 1980)

  6. Management Functions • Planning: Flow of Information, People and Products • What are Required Resources? • How/When to Acquire Them to Achieve Goals? • Organizing: Clear Lines of Authority/Responsibility • Staffing: Hiring Personnel for Positions • What Will Each person Do? • Recruitment, Compensation, Promotion, etc. • Directing: Overseeing the Process • Guiding Subordinates to Understand (Accept) Organizational (Project) Structure and Goals • Controlling: Measuring and Correcting Activities to Ensure Goals are Achieved • Measure Performance Against Plans • Keep People Interacting and Project on Track

  7. Project Planning • Project Manager Creates a Plan to Achieve Goals that Guide Software Engineers in their Tasks • Software Cost Estimation: Predictive Means to Estimate the Complexity of Software Prior to D & D • Predict Size of the Software • Use as Input to Estimate Person Years/Timeline • Software Cost Estimation: Multi-Phased Task • Prediction of Project Complexity • Delineation of Project Functions • Guesstimate of Hours/Function • Organization into Plan with Timeline (Deadlines) • Consideration of SWE Capabilities in Process • Impact of Available Software, Tools, etc.

  8. Software Estimation • Software Estimation Involves the “Guesstimate” of Resources, Cost, and Schedule for Project • Relies on Experience and Historical Data • Part of Feasibility Study/Requirements Spec • Needs to be Constantly Revisited and Reassessed • Accuracy of Estimation Influenced by • Project Complexity • Project Size and Interdependency Among Components • Degree of Project Structure (or Lack Thereof) • Embodies a Degree of “Risk” or Uncertainty • Focuses on: • People, Hardware, Software, Space, Time, etc.

  9. Software Cost Estimation

  10. Estimation: Software Scope • What is the Software Scope: • Function, Performance, Constraints, Interfaces, Reliability, Databases, etc. • Estimating Functional and Performance Requirements • A Statement of Software Scope Bounded by: • Quantitative Data Stated Explicitly • # Simultaneous Users, Max # Users, Response Time,… • Constraints Noted • Product Cost that Limits Memory Size • Additional Factors • Algorithms to Utilize, Offsite System Interactions, … • If System Specification Available – these are there… • Recall Specification Discussion in Chapter 5

  11. Estimation: Resources • CASE Tools: ER, UML, DFD, PNs, etc. • Business System Planning Tools • Recall Business Process Modeling in UML, Visio • Project Management Tools • Gantt and PERT, MS Project • Support Tools: MS Office, Visio, Corel Draw, etc. • Analysis and Design Tools • CASE Tools + Simulators, Queueing Models, etc. • Programming Tools: IDEs + PLs (e.g. Eclipse, VS) • Integration/Testing Tools – GitHub, SCCS, RCS, etc. • Application Platforms • DBMS, OSs, Web Servers, Application Servers, ... • Hardware Platforms: Servers, PCs, Disks, etc. • Prototype vs. Test vs. Production

  12. Estimation: Other Approaches • Prototyping and Simulation Tools • Visio and Rapid Prototyping of GUIs with PLs • Maintenance Tools • Code Restructuring and Analysis • Refactoring (What is this?) • Framework Tools • Database Management, Configuration and Version Management, Suite of D & D Tools (UML + PL) • Reusing Software • Acquire Existing Software that Meets Spec • CT Insurance – Tiff Converter for Redacting • Modify Existing or Purchased Software • Estimate Cost of Modification vs. From Scratch • Includes Cost of Purchased Product

  13. Estimation: Productivity Metrics • Typically Quantified in Different Ways • Classic Approach is Lines of Code Produced/Day • Estimation of LoC per Task (Function, Class, etc.) • Cumulative Assessment of LoC for Project • Often Divided by Major Task (GUI, DB, Server, Client) • Mapping from LoC to Hours/Task • Usage of Project Planning Technique (MS Project) • Classic Software Coding Methods Estimate Amount of Functionality (LoC) Produced Per Unit Time • Two Approaches: • Function Point • Code-Based • Can “Miss by a Mile” - PB Project had 15-20 Classes as Estimate – 200 Classes in Final Prototype
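Slide 13's mapping from LoC per task to hours can be sketched in a few lines of Python; the per-task LoC figures and the 50 LoC/day productivity rate below are hypothetical placeholders, not values from the course.

```python
# Hypothetical sketch of slide 13's LoC-per-task -> hours mapping.
# The task sizes and the 50 LoC/day productivity rate are made-up
# placeholders; real estimates come from historical project data.

LOC_PER_DAY = 50          # assumed team productivity
HOURS_PER_DAY = 8

task_loc = {"GUI": 4_000, "DB": 2_500, "Server": 6_000, "Client": 3_500}

def hours_per_task(loc_estimates, loc_per_day=LOC_PER_DAY):
    """Convert per-task LoC estimates into rough hour estimates."""
    return {task: loc / loc_per_day * HOURS_PER_DAY for task, loc in loc_estimates.items()}

if __name__ == "__main__":
    hours = hours_per_task(task_loc)
    print(hours)                   # e.g. GUI: 640.0 hours
    print(sum(hours.values()))     # cumulative project estimate: 2560.0 hours
```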

  14. Function Points • Goal: Arrive at a Single Number that Characterizes the System and Correlates with SWE Productivity • Obtained by: • Defining and Measuring the Amount of Value (or Functionality) Produced Per Unit Time • Determine Complexity of Applications as its Function Point • Weighted Sum of 5 Characteristic Factors • What Problem do you see?

  15. Function Points • What does Each Represent? • Inputs/Outputs – Provided by User • Inquiries – Number of Interactive Queries Made by Users that Require Specific Action by System • Files – Groups of Related Information • Interfaces – Interactions with External Systems • Use these Raw Values with Weights to Obtain a Total of 20 + 40 + 40 + 70 + 70 = 240:

  Item                    Count   Weight   Contribution
  Number of inputs          4     * 5          20
  Number of outputs         5     * 8          40
  Number of inquiries       4     * 10         40
  Number of files          10     * 7          70
  Number of interfaces      7     * 10         70
  Total                                       240

  16. What’s the Next Step… • Total: 240 is Considered in Context of Target Programming Language • For each PL – LoC per Function Point • Sample Values Include: 320 (assembly), 128 (C), 91 (Pascal), 71 (Ada83), and 53 (C++, Java) • If 240 Function Points + Java → 240 * 53 = 12,720 LoC • What’s the Problem Here? • The Number Doesn’t Always Work for All Cases • What if Inputs are More “Complex” than 4? Or Inquiries have a Higher or Lower Weight? • What are Some of the Other Issues Not Considered?
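To make slides 15–16 concrete, here is a minimal Python sketch of the calculation; the weights and the LoC-per-function-point figures are the illustrative values quoted on the slides, and a real organization would calibrate both from its own historical data.

```python
# Minimal sketch of the function-point estimate from slides 15-16.
# Weights and LoC-per-FP figures are the illustrative values on the
# slides; real projects calibrate these from historical data.

FP_WEIGHTS = {"inputs": 5, "outputs": 8, "inquiries": 10, "files": 7, "interfaces": 10}
LOC_PER_FP = {"assembly": 320, "C": 128, "Pascal": 91, "Ada83": 71, "C++": 53, "Java": 53}

def function_points(counts: dict) -> int:
    """Weighted sum of the five characteristic factors."""
    return sum(FP_WEIGHTS[item] * n for item, n in counts.items())

def estimated_loc(counts: dict, language: str) -> int:
    """Convert the function-point total into a rough LoC estimate."""
    return function_points(counts) * LOC_PER_FP[language]

if __name__ == "__main__":
    counts = {"inputs": 4, "outputs": 5, "inquiries": 4, "files": 10, "interfaces": 7}
    print(function_points(counts))          # 240
    print(estimated_loc(counts, "Java"))    # 12720
```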

  17. Measuring Code • Size of Code Produced per Unit of Time as Productivity Measure • What is Measured? • DSI – delivered source instructions • NCSS – noncommented source statements • KLOC – thousands of lines of code • What are the Potential Issues? • What about Comments, Documentation? • Impact of Code Reuse, Code Efficiency? • How Does One Measure “Automatically Generated Code” – Getters, Setters, Method Signatures…
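As a rough illustration of what an NCSS count involves, the sketch below strips C/Java-style comments and blank lines; it assumes one statement per line and ignores corner cases such as comment markers inside string literals, so treat it as an approximation rather than a real measurement tool.

```python
# Rough sketch of an NCSS (noncommented source statements) counter for
# C/Java-style code. Assumes one statement per line, '//' line comments,
# and '/* ... */' block comments; comment markers inside strings are not
# handled, so the result is only an approximation.

def ncss(lines):
    count, in_block = 0, False
    for line in lines:
        code = line.strip()
        if in_block:
            if "*/" not in code:
                continue
            in_block = False
            code = code.split("*/", 1)[1].strip()
        if code.startswith("/*"):
            if "*/" not in code:
                in_block = True
            continue
        code = code.split("//", 1)[0].strip()   # drop trailing line comments
        if code:                                # skip blank/comment-only lines
            count += 1
    return count

if __name__ == "__main__":
    sample = ["/* header */", "int x = 0;  // init", "", "// comment", "x++;"]
    print(ncss(sample))   # 2
```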

  18. What are Other Factors that Impact? • Professionals' Capabilities • Product Complexity • Schedule Constraints • Previous Experience with a Language • Complex Software Requirements (Reliability, Timing, and/or Performance) • Larger Variation in Productivity of SWE • Personalities and Interactions Play a Role • Makeup of Team can have Positive or Severely Negative Impact! • Estimation Utilized for: • Estimating Team Size/Effort for Project • Assessing (Re-assessing) Project Progress

  19. Code Estimation • LoC is a Good Metric for Total Life Cycle Costs • Most Cost Estimation Methods Utilize Size of Project to Derive Total Effort Required: PM = c · KLOC^k • Legend • PM: person-months • KLOC: thousands of lines of code • c, k depend on the model • k > 1 (non-linear growth) • For Example, with c = 3.0 and k = 1.12 (the Basic COCOMO Semidetached Values), a 50 KLOC Project Yields PM = 3.0 · 50^1.12 ≈ 240 Person-Months

  20. Factors Affecting Estimation • Product: Reliability Requirements or Inherent Complexity • Computer: Execution Time and/or Storage Constraints • Personnel • New vs. Experienced? • Impact of Approach (e.g., Multi-Tier Web) or Programming Language • Project: Usage of Sophisticated Software Tools • Cost Estimation Procedure • Estimate Size and Use Formula to Obtain Initial Estimate • Revise Estimate Using Above Factors • Keep Reapplying Metric as Software is Developed to Re-Estimate Cost More Accurately

  21. Estimation: Metrics • COCOMO: COnstructive COst MOdel proposed by B. Boehm • Evolved from COCOMO to COCOMO II • Size Estimate based on Thousands of Delivered Source Instructions, KDSI • Categorizes Software as: • Organic – e.g., Standard Payroll Application • Semidetached – e.g., TPS, DBMS, OS • Embedded – e.g., Flight Control Software • User Interface Driven – e.g., Web/Mobile App • Each has an Associated Formula for Nominal Development Effort Based on Estimated Code Size • Each has Differing Order of Complexity that Impacts Estimation

  22. COCOMO Modes – Feature Comparison (Organic / Semidetached / Embedded) • Organizational understanding of product objectives: Thorough / Considerable / General • Experience in working with related software systems: Extensive / Considerable / Moderate • Need for software conformance with pre-established requirements: Basic / Considerable / Full • Need for software conformance with external interface specifications: Basic / Considerable / Full • Concurrent development of associated new hardware and operational procedures: Some / Moderate / Extensive • Need for innovative data processing architectures, algorithms: Minimal / Some / Considerable • Premium on early completion: Low / Medium / High • Product size range: <50 KDSI / <300 KDSI / All sizes

  23. Nominal Effort/Schedule Equations • COCOMO • Geared Towards Traditional Development Life Cycle Models • Focused on Custom Software Built from Precisely Stated Specifications • Relies on Lines of Code • Consider the Equations Below: • Produce KDSI (Amount of LoC) • Derive Person Months
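The equation figure from this slide did not survive the transcript, so the sketch below stands in with the widely published basic COCOMO nominal effort and schedule formulas (PM = a · KDSI^b, TDEV = 2.5 · PM^c) and their standard mode constants; they may differ slightly from the exact numbers on the original slide.

```python
# Basic COCOMO nominal effort/schedule, using Boehm's published constants.
# PM = a * KDSI**b  (person-months), TDEV = 2.5 * PM**c  (months).
# The (a, b, c) values below are the standard basic-COCOMO figures and
# may differ from the exact numbers shown on the original slide.

COCOMO_MODES = {
    "organic":      (2.4, 1.05, 0.38),
    "semidetached": (3.0, 1.12, 0.35),
    "embedded":     (3.6, 1.20, 0.32),
}

def nominal_effort_schedule(kdsi: float, mode: str):
    a, b, c = COCOMO_MODES[mode]
    pm = a * kdsi ** b        # nominal effort in person-months
    tdev = 2.5 * pm ** c      # nominal development time in months
    return pm, tdev

if __name__ == "__main__":
    pm, tdev = nominal_effort_schedule(50, "semidetached")
    print(f"{pm:.0f} person-months over {tdev:.1f} months")  # ~240 PM, ~17 months
```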

  24. COCOMO Scaling Factors

  25. COCOMO II • COCOMO is Based on “Old” Model of Software w.r.t. its Makeup and Content • COCOMO II Tries to Transcend its Focus on LoC and the Three Application Types to Today’s Applications • COCOMO II is Collection of 3 Models • Application Composition Model • Suitable for Software Built Around Graphical User Interface (GUI) and Modern GUI-builder Tools • Uses Object Points as a Size Metric • Extension of Function Points • Count of the Screens, Reports, and Modules, Weighted by a Three-level Factor (Simple, Medium, Difficult) • For CT Insurance Dept – Based Future Estimates for Divisions on Experiences with Developed Code
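A minimal sketch of an object-point count for the Application Composition model follows; the 1/2/3 (screens), 2/5/8 (reports), and 10 (3GL modules) weights are the values commonly quoted for COCOMO II and should be treated as illustrative rather than as the course's official figures.

```python
# Sketch of an object-point count for the COCOMO II Application
# Composition model. The weights are commonly quoted values
# (screens 1/2/3, reports 2/5/8, 3GL modules 10); treat them as
# illustrative rather than authoritative.

OBJECT_POINT_WEIGHTS = {
    "screen": {"simple": 1, "medium": 2, "difficult": 3},
    "report": {"simple": 2, "medium": 5, "difficult": 8},
    "3gl_module": {"difficult": 10},   # 3GL components count as difficult
}

def object_points(items):
    """items: iterable of (kind, complexity) tuples, e.g. ('screen', 'medium')."""
    return sum(OBJECT_POINT_WEIGHTS[kind][complexity] for kind, complexity in items)

if __name__ == "__main__":
    inventory = (
        [("screen", "simple")] * 5
        + [("screen", "difficult")] * 2
        + [("report", "medium")] * 3
        + [("3gl_module", "difficult")]
    )
    print(object_points(inventory))   # 5*1 + 2*3 + 3*5 + 10 = 36
```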

  26. COCOMO II • Early Design Model • Used Once Requirements are Known and Alternative Software Architectures have been Explored • Cost Prediction Based on Function Points and Coarse-grained Cost Drivers • Leverage Personnel Capability and Experience • Post-Architecture Model • Redo Estimates Based on Actual Coding Process • Cost Prediction Based on • Size (source instructions or function points, with modifiers to account for reuse) • 7 Multiplicative Cost Drivers • 5 Factors that Determine the Non-linear Growth of Person-month Costs in terms of Size

  27. Project Management: Risk Analysis • Risk Analysis Deals with the Ability to • Understand Potential Problem Areas • Monitor Project Closely • Act When Problem Found • Four Dimensions: • Identification • Projection • Assessment • Management and Monitoring • Risk Deals with Three Factors: • Future: What Risks Might Endanger Software Project? • Change: How will Change Affect Project? • Choice: How will Choices of Methods, Tools, and People Affect Project?

  28. Risk Analysis: Identification • Project Risks • Budget, Schedule, Personnel, Resource, Customer, Requirements Problems • Technical Risks • Design, Implementation, Interfacing, Verification, Maintenance Problems • Business Risks • Building a Product No One Wants • Building a Product that Doesn’t Fit Company’s Product Strategy • Building a Product Sales Staff Doesn’t Know how to Sell • Losing Management Support – Change in Focus • Losing Budget or Personnel

  29. Risk Analysis: Projection • Attempt to Determine: • Likelihood that Risk is Real • Consequences of Problems with Risk • Four Activities in Risk Projection • Establish a Scale that Reflects Likelihood of Risk • Quantitative, Probabilistic, or Statistical • Delineate Consequences of Risk • Estimate Impact of Risk on Project • Note Overall Accuracy of Risk Projection • Risks Weighted by Perceived Impact • Nature: Likely Problems if Risk Occurs • Scope: What will be Affected if Risk Occurs • Timing: When and How Long the Impact will Be

  30. Risk Analysis: Assessment • Examine Accuracy of Risk Projection Estimates • Establish a Risk Referent Level • e.g., the Level of Cost Overrun that will Terminate the Project • Risk Referent Level per Aspect of Project • Risk Triplet: [r, l, x] • Risk, Likelihood, Impact of Risk • Four Steps for Risk Assessment • Define Risk Referent Levels for Project Aspects • Develop Relationship between r, l, and x for All Referent Levels • Predict Set of Reference Points for Termination • Attempt to Predict the Combination of Risks that will Affect Referent Levels
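One way to operationalize the [r, l, x] triplet is sketched below; the exposure = likelihood × impact calculation and the referent-level check are a common simplification, not necessarily the exact procedure intended on the slide, and the example risks and dollar figures are hypothetical.

```python
# Minimal sketch of the [r, l, x] risk triplet from slide 30 plus a
# simple exposure calculation (likelihood * impact). The referent-level
# check and the example figures are hypothetical illustrations.

from dataclasses import dataclass

@dataclass
class Risk:
    description: str      # r: the risk itself
    likelihood: float     # l: probability in [0, 1]
    impact: float         # x: consequence, e.g. cost overrun in dollars

    def exposure(self) -> float:
        return self.likelihood * self.impact

def exceeds_referent(risks, referent_level: float) -> bool:
    """True if combined expected exposure crosses the project's referent level."""
    return sum(r.exposure() for r in risks) > referent_level

if __name__ == "__main__":
    risks = [
        Risk("Key DB specialist leaves", 0.3, 120_000),
        Risk("Requirements churn in GUI", 0.5, 60_000),
    ]
    print(exceeds_referent(risks, referent_level=50_000))   # True: 36,000 + 30,000 = 66,000
```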

  31. Risk Analysis: Management/Monitoring • Risk Aversion Refers to the Actions Taken to Avoid Risk • Triplet [r, l, x] used as Risk Management Basis • High Risk → Proactive Management/Aversion • Risk Management Incurs Additional Cost • Balance Management Costs with Additional Benefits • 80/20 Rule: • 80% of Project Failures Attributed to 20% of Identified Risks • Risk Management and Monitoring Plan has Three Objectives: • Assess if Predicted Risk Does Occur • Ensure Risk Aversion Steps are Properly Applied • Collect Information for Future Risk Analysis

  32. Typical SWE Risks (Boehm 1989) – Risk / Risk Management Technique • 1. Personnel shortfalls – Staffing with top talent; job matching; team building; key-personnel agreements; cross-training; pre-scheduling key people • 2. Unrealistic schedules and budgets – Detailed multisource cost & schedule estimation; design to cost; incremental development; software reuse; requirements scrubbing • 3. Developing the wrong software functions – Organization analysis; mission analysis; ops-concept formulation; user surveys; prototyping; early users’ manuals • 4. Developing the wrong user interface – Prototyping; scenarios; task analysis; user characterization (functionality, style, workload)

  33. Typical SWE Risks (Boehm 1989), continued • 5. Gold plating – Requirements scrubbing; prototyping; cost–benefit analysis; design to cost • 6. Continuing stream of requirements changes – High change threshold; information hiding; incremental development (defer changes to later increments) • 7. Shortfalls in externally furnished components – Benchmarking; inspections; reference checking; compatibility analysis • 8. Shortfalls in externally performed tasks – Reference checking; pre-award audits; award-fee contracts; competitive design or prototyping; team building • 9. Real-time performance shortfalls – Simulation; benchmarking; modeling; prototyping; instrumentation; tuning • 10. Straining computer-science capabilities – Technical analysis; cost–benefit analysis; prototyping; reference checking

  34. Project Mgmt.: Implementation Strategies • Implementation Strategy Identifies How the Final System will be Developed • Four Approaches • Use a Previous Strategy for Past Projects • Use a Combination of Previous Strategies • Use a New Strategy • Use a Combination of New and Previous Strategies • Several Factors Impact on Strategy Selection • Expertise of Team Members • Time and/or Cost Constraints • Application Domain/User Expertise • Many Accepted Strategies in Use …

  35. Build it Twice Full Prototype • System is Implemented Twice • First Version is a Potential “Throw Away” Fully Functional Prototype – Provides Insight for V2! • Second Version Starts After V1 Completed/Evaluated • Recommended for an Inexperienced Team – Why? • Version 1 Allows Implementers to Gain Experience and Comprehend System Scope and Breadth • Domain Users Provide Valuable Input on V1 • Input Dictates Changes in V2 • Result: Improved V2 Implementation • Disadvantage: Increased Time and Cost • What about Today? In What Situations Might this be Relevant? What Process Model is Most Apropos? • Is there a Useful Variant of BITFP?

  36. Level-by-Level Top Down • System Decomposed into Smaller Modules • At Each Level, Modules are Developed/Integrated • Next Decomposition Starts when Current Level has been Completed • Integrates Modules as Developed – Reduces Big-Bang Integration Phase • Requires Stubs? Drivers? Which One? • Disadvantages • Modules May be Decomposed Differently • Multiple Solutions for Same Module – Some of Which may be Less “Optimal” than Others • Locks Team into Decomposition Decision • Works Best for Experienced Teams… Why?

  37. Incremental Development • Refinement of Build it Twice Full Prototype • Develop System in Functional Increments • Increment is System with Defined Functionality • Successive Increments Increase Functionality • Large Systems are More Easily Developed, Tested, Deployed • Gets Increment into User’s Hands Quickly • Allows for Feedback to Project Team • Useful for Inexperienced & Experienced Implementers • Where do we See Such an Approach Today? • What Type of Applications Can Use this? • What Types of Tools are Available to Promote this? • Is it only Relevant for Large Scale?

  38. Advancemanship • Refinement of Level-by-Level Top Down • Two Components: • Develop Anticipating Documentation • Prior to System Development • Utilize Tools (Visio) for Screen Mock Ups • Write Detailed Usage Documentation (see web page) • Develop Software Scaffolding • Develop Some Supporting Software Prior to Developing the Entire Application • Focus on “Key” Features • Utilize Stubs and Drivers to Demonstrate to Users • Advantages? Disadvantages?

  39. Project Control: Work Breakdown Structure • Goal of Project Control: • Monitor Progress of Activities Against Plan • Early Detection of Deviation from Plans • Project Control Techniques Break Down Project Goals into Intermediate Goals • Repeat with Intermediate Goals • Plan each Intermediate Goal w.r.t. Resource Requirements, Assignment, Schedule • Work Breakdown Structure (WBS): an Activity Tree of Goals • Root is the Major (Project) Goal • Children are Subgoals to Achieve Parent Goal

  40. Consider Compiler Project • WBS Tree: Compiler project → Design, Code, Write manual, Integrate and test; Code → Scanner, Parser, Code generator • Breakdown of Project into Parts Allows us to Attempt to Identify Resources for All Leaf Nodes • Leaves are Unit of Work Assignment • WBS May be used as Input to Overall Scheduling Process • Any Decomposition of Problem Assists in Estimation
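A hypothetical sketch of the WBS as a small tree data structure, using the compiler-project node names from this slide; the leaves() helper returns the assignable units of work.

```python
# Hypothetical sketch of a work-breakdown structure as a simple tree;
# node names follow the compiler-project example on this slide.
# Leaves are the units of work assignment.

from dataclasses import dataclass, field

@dataclass
class WBSNode:
    name: str
    children: list = field(default_factory=list)

    def leaves(self):
        """Leaf activities, i.e., the assignable units of work."""
        if not self.children:
            return [self.name]
        return [leaf for child in self.children for leaf in child.leaves()]

compiler = WBSNode("Compiler project", [
    WBSNode("Design"),
    WBSNode("Code", [WBSNode("Scanner"), WBSNode("Parser"), WBSNode("Code generator")]),
    WBSNode("Write manual"),
    WBSNode("Integrate and test"),
])

print(compiler.leaves())
# ['Design', 'Scanner', 'Parser', 'Code generator', 'Write manual', 'Integrate and test']
```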

  41. Project Management: Scheduling • Gantt Charts Can be Utilized for Scheduling, Budgeting, and Resource Planning • Bar Chart Represents an Activity • Horizontal Axis – Time (Days, Months, etc.) • Vertical Axis – Different Tasks/Goals (Subgoals)
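For illustration only, a text-mode Gantt rendering is sketched below; the task names, start weeks, and durations are hypothetical.

```python
# Tiny sketch of a text-mode Gantt chart: horizontal axis is time in
# weeks, one bar per task. Task names, starts, and durations are
# hypothetical values, not taken from the course plan.

tasks = [            # (name, start_week, duration_weeks)
    ("Design",              0, 4),
    ("Code scanner/parser", 4, 6),
    ("Write manual",        4, 8),
    ("Integrate and test", 10, 3),
]

def gantt(tasks, width=14):
    for name, start, dur in tasks:
        bar = " " * start + "#" * dur
        print(f"{name:<22}|{bar:<{width}}|")

gantt(tasks)
# Design                |####          |
# Code scanner/parser   |    ######    |
# ...
```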

  42. Gantt Charts for People Planning • [Gantt chart: 1/1 – 10/1 timeline showing training and vacation bars for Darius, Marta, Leo, Ryan, Silvia, and Laura] • Track the Software Team and When they are Available • Manage the Utilization of Software Engineers • What is the Major Problem? • Doesn’t Show Interdependencies Among Tasks • Relationships of Goals, Subgoals, etc.

  43. CT Insurance Project Plan • Employed MS Project for Detailed Estimations of Effort Applied to a Schedule • Estimations Occurred After Significant Amount of Development Accomplished • Experience to Base Estimates • More Precise Guesstimates • Web Page Contains • MS Project Plan (24 pages) • Excel Spreadsheet on Hours/Timeline • Where are we Today? • Not Used Since Created (December 2003) • Interesting to do a Post-Mortem on Planning …

  44. CT Insurance Overall Plan • Nearly 3-Year Period (9/2002 to 5/2005) • 24 Pages, 8.5 by 11 Inches Each • 4 Rows and 6 Pages per Row • Entire Plan is 5 ft Wide by 3 ft High • Below is ½ of the First Main Portion of the Plan • http://www.engr.uconn.edu/~steve/Cse2102/cidprojplan.pdf

  45. Focused Plan for Consumer Affairs

  46. 2nd Portion of Plan

  47. Scheduling: PERT Charts • PERT: Program Evaluation and Review Technique • Network of Boxes (or Circles) and Arrows • Boxes Represent Activities • Arrows Represent Dependencies Among Activities • The Activity at the Head of an Arrow cannot Start until the Activity at the Tail of the Arrow is Finished • Boxes may be Annotated with Start and End Dates • Boxes may be Designated as Milestones • To Construct a PERT Chart: • List all Activities Required for Project Completion with Estimated Time Lengths • Determine Interdependencies Between Activities

  48. Recall WBS of Compiler Project • WBS Tree: Compiler project → Design, Code, Write manual, Integrate and test; Code → Scanner, Parser, Code generator • For PERT – Focus on • Defining Which Activities can be Performed at Which Times • Understanding Dependencies Among Activities • Note: Implementation Strategy May Influence PERT

  49. PERT Chart for Compiler Project
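Since the PERT chart image is not reproduced here, the sketch below illustrates the critical-path (longest-path) calculation over the compiler-project activities; the durations and dependency edges are hypothetical values chosen only to demonstrate the computation.

```python
# Sketch of a PERT-style critical-path (longest-path) computation for
# the compiler-project activities from slides 40/48. The durations (in
# weeks) and dependency edges are hypothetical, invented only to
# illustrate the calculation; the original chart's numbers are not shown.

from functools import lru_cache

durations = {
    "Design": 4, "Scanner": 2, "Parser": 3, "Code generator": 3,
    "Write manual": 5, "Integrate and test": 2,
}
depends_on = {
    "Scanner": ["Design"], "Parser": ["Design"], "Code generator": ["Design"],
    "Write manual": ["Design"],
    "Integrate and test": ["Scanner", "Parser", "Code generator", "Write manual"],
}

@lru_cache(maxsize=None)
def earliest_finish(task: str) -> int:
    """Longest cumulative duration from project start through `task`."""
    preds = depends_on.get(task, [])
    return durations[task] + max((earliest_finish(p) for p in preds), default=0)

if __name__ == "__main__":
    finish = max(earliest_finish(t) for t in durations)
    print(f"Minimum project duration: {finish} weeks")
    # 11 weeks; critical path: Design -> Write manual -> Integrate and test
```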

  50. Scheduling: PERT Chart • Advantages • Forces Manager to Plan • Highlights Interrelationships Among Tasks • Identifies Critical Path in the Project (See Bold) • Exposes Possible Parallelism in Tasks • Assists in Allocating Resources • Allows Scheduling and Simulation of Schedules • Enables Manager to Monitor/Control Project • Disadvantages • Manager Controls Granularity of Tasks • If Manager Imprecise, PERT is as Well • Inaccuracies can Make PERT Ineffectual • Charts for Large Projects may be Huge • Definitely Need Automated Support • Some Gantt Tools can Generate PERT
