AGILE SOFTWARE DEVELOPMENT: EVIDENCE FROM THE FIELD
Alan MacCormack (email@example.com)
Harvard Business School
Agile Development Conference, June 2003
Introduction
• Research Background: Product Development Flexibility
  • Multi-year, multi-industry research project
  • Common theme: highly uncertain and dynamic environments (technology + market)
  • Problem: during a development cycle, significant changes occur both in the needs a product must address and in the technologies it employs to meet those needs
• Multiple Studies in the Software Industry
  • 30 Internet software projects in the 96-98 timeframe (extreme uncertainty)
  • 30 HP projects in the 98-00 timeframe – broadens the context of the study
  • 100+ projects in the 2002 timeframe – analysis of data ongoing
• This Presentation
  • Detail the major themes that emerge in the software studies
  • Explode some myths about more flexible practices
  • Communicate why it is HARD to do… PROPERLY!
Problems with Traditional Processes
E.g., a Stage-Gate Model of Development
[Figure: sequential stages – concept design → product and feature specifications → coding → testing – with the design frozen early; response time spans the whole cycle from START to END]
Problem: the market/technology advances at a faster pace than you can respond
Achieving Flexibility
• Stable environments: a stage-gate process
• Uncertain environments: a more flexible process
Note: "Speed" is a subtle concept in a more flexible process
A More Flexible "Model" of Development
• Characteristics of such a process
  • Overlapping stages (Krishnan, Eppinger & Whitney, 1997)
  • Iterative process ("Synch and Stabilize", Cusumano, 1995; "Experiential", Eisenhardt and Tabrizi, 1995; "Sense, Test and Integrate", Iansiti and MacCormack, 1997)
  • Learning and adaptation (Tushman and O'Reilly, 1997) versus planning and execution
An Early Case Study: Netscape's Navigator 3.0 Project
The point: >50% of the new code was developed after the first beta release!
The Aim: Integrated Technology and Market Streams
Note: A flexible process results in "better" performing products as perceived by the customer (i.e., there may still be trade-offs with other performance dimensions)
Research in Internet Software
• Internet software industry
  • Industry "created" in 1993, with the development of a GUI to the Internet
  • Tremendous uncertainty during the period of study (96-98)
  • Thousands of new firms (e.g., Netscape, Yahoo!), hundreds of new applications (e.g., Internet telephony), a variety of alternative technologies (Java, ActiveX)
• Survey data on 31 projects from 17 different firms
  • Projects include Microsoft Explorer 3.0 and 4.0, Netscape Navigator 3.0 and 4.0, My Yahoo!, Intuit QuickenMortgage, 411 Rocketmail, Planetall, Altavista Intranet
  • Average size of project: 375,000 lines of code (largest project = 1.5mn lines)
• Data on product performance
  • Product quality: a panel of experts rated features, technical performance, and reliability
  • Productivity: resources consumed by the project, adjusted for size and complexity
A Note on Results of the Beta Analysis
• Functionality at first beta is a better predictor than functionality at first system integration
  • Naturally, these two measures are highly correlated
  • Suggests that projects which integrate early, but do not release this version of the product to the market, fail to capture all the benefits of a more flexible approach
  • Interpretation: the dominant source of uncertainty in this emerging industry relates to the market, not to the underlying technologies (software code)
• Field work suggests the number of discrete betas is not as important as the nature of the interactions with users
  • Wide variation in beta distribution policy – wide versus narrow
  • Some firms are much more careful about which users are selected for early versions, and about how they work with these users once beta testing begins
  • Having an excessive number of betas creates problems in version control
Deciding on Beta Strategy
[2x2 figure. Vertical axis: RISK OF EXTERNAL TESTING, LO to HI (competitive imitation, impact of a buggy product, support requirements, etc.). Horizontal axis: NEED FOR EXTERNAL TESTING, LO to HI (availability of internal test resources, whether employees = users, sophistication of the application, etc.). Examples plotted: My Yahoo! and NetDynamics toward high risk; Microsoft and Netscape toward low risk]
Success Factors - Rapid Feedback Note: How you “configure” experimentation capacity is vitally important
The Value of Experience
• In dynamic environments, the value of experience is questioned (at least in academia!)
  • Knowledge atrophies quickly
  • Inertia leads to applying old or inappropriate skills
• What type of experience is valuable?
  • Not experience measured by "years of tenure"
  • Experience measured by the number of project "generations"
• So how does it work?
  • Setting direction for an experimentation "strategy"
  • Learning at the "system" level
Success Factors – Architectural Design
• Program Manager for Microsoft Internet Explorer 3.0:
  "…the most important aspect of the project was that we developed the product architecture in a way that separate component teams could feed into the project. The idea was to build a good core infrastructure, and have the rest of the team add components on top of it. In fact, at the first integration, all we had was the core infrastructure. Most other features were missing. We didn't have any support for Java at this point because the license for Java had only been signed a few days before…"
• Chief of Development for Altavista:
  "…our architectural design efforts are structured to give priority not to performance, but to independence. We create interfaces to buffer the impact of uncertainty – when one module changes, the others are therefore insulated. If we were trying to optimize the size and efficiency [of a design] we would not do this, but optimizing a design typically makes it more complex and subsequently very difficult to change…"
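The Altavista quote describes prioritizing independence over performance by putting interfaces between modules. A minimal sketch of that idea (all names here are hypothetical, not from the studies): the consumer depends only on an interface, so the module behind it can be rewritten without rippling changes through the rest of the system.

```python
from abc import ABC, abstractmethod

# Hypothetical example: a stable interface "buffers uncertainty" because
# callers never see the implementation behind it.
class SearchIndex(ABC):
    @abstractmethod
    def lookup(self, term: str) -> list[str]: ...

class InMemoryIndex(SearchIndex):
    # One possible implementation; it could be swapped for a disk-backed
    # or distributed index without touching any consumer code.
    def __init__(self) -> None:
        self._docs: dict[str, list[str]] = {}

    def add(self, term: str, doc: str) -> None:
        self._docs.setdefault(term, []).append(doc)

    def lookup(self, term: str) -> list[str]:
        return self._docs.get(term, [])

class QueryService:
    # Depends only on the SearchIndex interface, never on a concrete class,
    # so changes inside the index module are insulated from this one.
    def __init__(self, index: SearchIndex) -> None:
        self._index = index

    def search(self, term: str) -> list[str]:
        return sorted(self._index.lookup(term))
```

The design choice mirrors the quote: the interface is deliberately not optimized for any one implementation, which trades some efficiency for the ability to change modules independently.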
A Modular and Scalable System: Linux
[Figure: Evolution of the Linux kernel. Source: Forbes Magazine, August 10th, 1998]
Flexible Processes in Action: Microsoft's Milestone Build Process
[Figure: vision statement → overall product specification → detailed feature specification, then coding and testing proceed milestone by milestone across feature sets 1, 2, and 3, followed by stabilization]
Source: Microsoft Office 2000, HBS Case, MacCormack 1999
Extending this Process
The Evolutionary Delivery Model: First Stage = Process Design
[Figure: vision → architecture design and overall specification → a sequence of micro-projects, each comprising feature design, coding, integration/test, and user feedback → stabilization]
Questions: How many micro-projects? On what parameters should this decision depend? (Current research, with Mike Cusumano, Chris Kemerer)
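The control flow of the evolutionary delivery model can be sketched as a simple loop. This is purely illustrative – every name below is invented, and collect_feedback is a placeholder for folding real user/beta feedback into the backlog; it is not an implementation from the research.

```python
# Toy sketch of the evolutionary-delivery loop (hypothetical names).
def micro_project(feature: str) -> str:
    # Each micro-project runs feature design -> coding -> integration/test.
    designed = f"design({feature})"
    coded = f"code({designed})"
    return f"integrate+test({coded})"

def collect_feedback(build: str, backlog: list[str]) -> list[str]:
    # Placeholder: feedback on each interim release may reprioritize or
    # add backlog items before the next micro-project begins.
    return backlog

def evolutionary_delivery(features: list[str]) -> list[str]:
    backlog = list(features)
    shipped: list[str] = []
    while backlog:
        build = micro_project(backlog.pop(0))
        shipped.append(build)
        backlog = collect_feedback(build, backlog)
    return shipped
```

The open questions on the slide map directly onto this sketch: how many iterations of the loop to run, and what should determine that number.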
Myth 1: It’s a new “Best” practice…
[Figure compares process choices for projects facing Low Market Uncertainty (MAD < 0.49) versus High Market Uncertainty (MAD > 0.49)]
Note: Very divergent responses when projects are faced with high uncertainty
Matching Process and Context
Process choices should be matched to context: market uncertainty, technical uncertainty, platform complexity, platform uncertainty, and other relevant parameters
Myth 2: It’s all about making late changes…
[Figure: Market uncertainty and late changes to a module]
The relationship is significant at the 5% level (p < 0.05)
Myth 3: You trade off quality and productivity…
STUDY OF 30 HP PROJECTS
INITIAL RESULTS
• A more complete functional specification is correlated with higher productivity
• A more complete design specification is correlated with a lower defect rate
• Lends support to a “waterfall-style” model of development
FINAL RESULTS
• The trade-offs disappear when you account for other development practices that provide an alternative mechanism for supplying the information a spec provides (e.g., a prototype in place of a functional spec)
Myth 4: We’re way ahead of the world…
Source: Survey of 104 completed projects (see Cusumano et al., forthcoming, IEEE Software)
The BIG Managerial Challenge
• A flexible process requires a change in mindset
  • Much received wisdom about “good” practice runs counter to a flexible process – e.g., that a large number of design changes late in a process is bad. DEFINITIVELY NOT TRUE
  • The key is a process that allows you to proactively generate new information and respond to it by evolving the design in a controlled fashion – otherwise, it’s chaos…
• An example from the field:
  • One firm volunteered data on a “good” and a “bad” project
  • The “bad” project had higher quality and consumed fewer resources
  • The “bad” project involved a process of continual change, responding to the market, competition, etc. – the final result didn’t look anything like the original specification
  • The “good” project ran in a structured fashion, with a highly optimized design that was difficult to change – the final result closely resembled the specification (the design had been “frozen in time”)
The point: Which looks like the better process to a senior manager?
Want to Learn More?
• CONCEPTS: “Developing Products on Internet Time,” with Marco Iansiti, Harvard Business Review, Sep-Oct 1997
• EVIDENCE: “Product Development Practices that Work: How Internet Companies Develop Software,” Sloan Management Review, Jan 2001
• CAVEATS: “Exploring Trade-Offs between Productivity and Quality in the Selection of Software Development Practices,” forthcoming, IEEE Software
• HOW TO: Harvard Business School case studies
  • Microsoft Office 2000
  • Living on Internet Time
• As well as all the usual books and articles on agile, adaptive, XP, etc.