
Intro to Extreme Programming (XP)


Presentation Transcript


  1. Intro to Extreme Programming (XP) John Arrizza Mar 4/2004

  2. Overview & Agenda • John Arrizza: 18 years of experience in C/C++, C#, Perl, etc., building high-reliability, multi-process, multi-threaded applications. Currently a Team Lead for Nokia UI Software using “mostly-XP™” • Agenda: What is XP? Why XP? XP in Use, Feedback, Typical XP Iteration, Recommendations • Questions welcome throughout!

  3. What is XP? XP is a set of good software engineering practices

  4. Comparison (cont’d)

  5. Why XP? • XP has high exposure • It has been featured in IEEE magazines and other mainstream S/W press • There are ~50 XP user groups around the world; the USA has the highest number, and England, Germany, and China have many as well • It is being taught in universities (new grads will have XP exposure)

  6. Why XP? • Some companies using or investigating XP: • National: Verizon, Chrysler, Cisco, JPL, StorageTek, Intel, Fannie Mae, IBM, NASA • SD Local: Systran, SAIC, Alaris, HP, Acelyris, others?

  7. Why XP? • XP fits corporate goals • #1 Market immediacy • I want it done • I want it done now • I want it done right (approx. zero bugs)

  8. Why XP? (I want it done) • XP delivers on these goals because of its high feedback: • A feature is done when the Acceptance Test passes: • Not subjective; it either passes the test or not! • Because of continuous integration, the on-site customer can actually see the new feature running immediately

  9. Why XP? (I want it done now) • Features are done in the order the customer asks for them. The timing is a business decision, not a technical decision: • If the customer wants it done now, it is done now. • If the customer is willing to wait, it is done later.

  10. Why XP? (I want it done right) • The bug rate for XP projects is very low, typically near zero: • Because of UTs and ATs, domino or spurious errors are detected very early, usually by the developer during development • System integration errors, if any, are caught daily • If an error does slip through, a test is immediately written to detect it. That error will never show up again.

  11. XP in Use • Stories • Similar to Use Cases and User Requirements. • They capture a user request for new functionality. They are written from the user & business point of view! • They use informal language but are specific enough that the developers can design and code from them. • They are estimable. There is enough information to decide how long it will take to complete the Story. If there is any ambiguity, the customer is on-site to ask for more information. • They are testable (so the developers can objectively determine whether they are done) • They can be finished within one iteration (so every iteration accumulates completed business value)
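
To make the Story criteria above concrete, here is a minimal sketch of a Story as a small data record; the class, fields, and example Story are illustrative assumptions, not content from the presentation (Python is used only for brevity):

    from dataclasses import dataclass, field

    @dataclass
    class Story:
        """A user Story as characterised above: written from the business point of
        view, estimable, testable, and small enough to fit in one iteration."""
        title: str
        description: str                     # informal, but specific enough to code from
        points: int                          # developer estimate in Story Points
        acceptance_tests: list = field(default_factory=list)  # ATs that prove it is done

    # Hypothetical example Story
    story = Story(
        title="Show caller photo on incoming call",
        description="When a call arrives from a contact with a photo, show that photo on screen.",
        points=3,
        acceptance_tests=["test_incoming_call_shows_contact_photo"],
    )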

  12. XP in Use • Iterations • An iteration is a fixed amount of development time. Usually one or two weeks. • Planning Game • Done at the start of an iteration, e.g. Monday • Customer determines what stories to do for that iteration • Developers break those stories down into Engineering tasks

  13. XP in Use • Release Planning • Customers write the Stories • Customer orders the Stories by business value to satisfy longer-term goals (i.e. a release) • Developers give feedback: • Estimates for each Story (in Points) and Velocity (how many Story Points the development team can do in one iteration) • Is the Story testable? • Will the Story fit in an iteration?
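
As a worked sketch of the Velocity feedback described above: Stories stay in business-value order and are packed into iterations so that no iteration exceeds the team's Velocity. The function, story names, and numbers below are illustrative assumptions, not from the deck.

    def plan_release(stories, velocity):
        """Slot stories (already ordered by business value) into iterations,
        never exceeding the team's Velocity in any single iteration."""
        iterations, current, used = [], [], 0
        for title, points in stories:
            if points > velocity:
                raise ValueError(f"{title!r} does not fit in one iteration; split it")
            if used + points > velocity:      # current iteration is full
                iterations.append(current)
                current, used = [], 0
            current.append(title)
            used += points
        if current:
            iterations.append(current)
        return iterations

    # Customer's value-ordered Stories with developer estimates (Story Points)
    stories = [("Caller photo", 3), ("Mute button", 2), ("Call log search", 5), ("Ringtone picker", 3)]
    print(plan_release(stories, velocity=8))
    # [['Caller photo', 'Mute button'], ['Call log search', 'Ringtone picker']]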

  14. XP in Use • Engineering Tasks • These are the units that the developers work in • A set of Engineering Tasks is used to satisfy one or more Stories • Each Story is composed of one or more Engineering Tasks

  15. XP in Use • Unit Tests (UTs & Test Driven Development) • Unit test is written first for a particular Engineering Task. The Unit Test fails… • Code is written... (usually very little code is necessary to satisfy the Engineering Task) • …Until the Unit Test passes. • Merciless Refactoring • Clean up the code to accommodate the new code just written. Very quick, very safe (100% unit testing available) • Clean up the design if necessary.
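
A minimal sketch of the test-first loop on this slide, using Python's unittest; the Account example is invented purely for illustration and is not from the presentation.

    import unittest

    # Step 1: the unit test is written first, and it fails (Account does not exist yet).
    class TestAccount(unittest.TestCase):
        def test_deposit_increases_balance(self):
            acct = Account()
            acct.deposit(50)
            self.assertEqual(acct.balance, 50)

    # Step 2: write just enough code to satisfy the Engineering Task...
    class Account:
        def __init__(self):
            self.balance = 0

        def deposit(self, amount):
            self.balance += amount

    # Step 3: ...run the test again; once it passes, refactor mercilessly,
    # re-running the tests after every small cleanup.
    if __name__ == "__main__":
        unittest.main()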

  16. XP in Use • Acceptance Tests (ATs) • When all Engineering Tasks for a Story are complete, run the ATs. • Did the Story’s AT run successfully? If NO, fix it! If YES: • check the code in • integrate with the rest of the system • run the entire AT suite
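
Acceptance Tests exercise a whole Story in the customer's terms rather than a single unit. The sketch below shows the shape of one; the PhoneUI stand-in and the Story it tests are assumptions for illustration only.

    import unittest

    class PhoneUI:
        """Stand-in for the system under test; a real AT would drive the actual application."""
        def __init__(self):
            self.displayed_photo_for = None

        def receive_call(self, from_number):
            self.displayed_photo_for = from_number   # the real UI would look up and show the photo

    class TestCallerPhotoStory(unittest.TestCase):
        """Acceptance Test for the hypothetical 'Show caller photo' Story:
        it passes only when the Story's behaviour works end to end."""
        def test_incoming_call_shows_contact_photo(self):
            ui = PhoneUI()
            ui.receive_call(from_number="555-0100")
            self.assertEqual(ui.displayed_photo_for, "555-0100")

    if __name__ == "__main__":
        # Run the AT suite; all Stories completed to date must stay green.
        unittest.main()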

  17. XP in Use • Continuous Integration • Separate Integration machine • The Customer can see the running system with all the Stories completed to date at any time, i.e. fully tested, fully integrated, zero known bugs, etc.
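
The slides do not show how the separate integration machine is driven; one common minimal arrangement (the script below and the commands it runs are assumptions, not from the presentation) is a job that updates the workspace, builds, and runs the unit and Acceptance Test suites, stopping loudly at the first broken step.

    import subprocess, sys

    def run(cmd):
        """Run one build step on the integration machine; report success or failure."""
        print(">>", cmd)
        return subprocess.call(cmd, shell=True) == 0

    def integrate():
        steps = [
            "svn update",                                     # pull the latest checked-in code (any VCS works)
            "make all",                                       # full clean build
            "python -m unittest discover unit_tests",         # 100% of unit tests
            "python -m unittest discover acceptance_tests",   # full Acceptance Test suite
        ]
        for step in steps:
            if not run(step):
                print("INTEGRATION FAILED at:", step)
                return 1
        print("Integration OK: fully tested, fully integrated, zero known failures")
        return 0

    if __name__ == "__main__":
        sys.exit(integrate())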

  18. XP in Use • There are other practices: • Simple Design • Pair Programming • Collective Ownership • Coding Standard • Metaphor (out of use?) • 40-hour week

  19. Feedback • XP ties these practices together with a simple philosophy: high feedback • Without feedback, we are blind mice. With feedback, “you can run but you can’t hide!” • Try to find ways to increase feedback whenever and wherever we can: • Daily meetings • Bring the customer on-site • Etc.

  20. Why High Feedback? • People are fallible; feedback gives us confirmation of our beliefs and assumptions in real time • Does this feature actually work? • Is it what the user really wanted? • Does this function actually work the way we think it does? • I believe that after changing function X, Jake’s function Y still works. Is that true? • I believe I’ve coded everything I’m supposed to – I’m done. Is that true? • Do all unmodified features continue to work exactly as they did before I made my change? • Do all modified or new features work as I expect them to?

  21. Why High Feedback? • To allow us to predict the future with more accuracy: • The shorter the time frame, the more accurate the prediction: the far future is a bad extrapolation of the past and the present; the near future is a pretty good extrapolation of the near past and the present • The more feedback we have on the accuracy of our predictions, the better our predictions can become

  22. Typical XP Iteration • During a prior Release Planning session: • The Customers have already broken the application up into User Stories • The Developers have already roughly estimated all the Stories • We’ve done a few iterations already. So Developers know their Velocity

  23. Typical XP Iteration Day 1: The Planning Game • Customer: • Determine the highest business value Stories to do for this iteration • Developers: • Do a quick design to break down the Stories identified by the Customer into Engineering Tasks • Estimate the Engineering Tasks; re-estimate the Stories if necessary. • Provide the Customer with the current Velocity. If the total Story Points exceeds the Velocity, the Customer chooses Stories to remove or to replace with lower-cost Stories. (For more, attend Carlton’s excellent Planning Game presentation)

  24. Typical XP Iteration • Day 1: • Developers volunteer for as many Engineering Tasks as they can do in an Iteration • Day 1 – 5: • 10-15 minute morning meeting to discuss status and any issues • Developers write Unit Tests, write code, and refactor mercilessly to satisfy the Engineering Tasks. • When all Engineering Tasks for a Story are done, the Story’s AT is run. • If successful, integrate and run the entire AT suite. Fix any integration errors immediately

  25. Typical XP Iteration • Day 5 (assuming a 1-week Iteration): • Developers tally up the completed Story Points. This is the Velocity used for the next iteration. • At any time: • A Developer can ask the on-site Customer for details, clarification, etc. • A Customer can see the working system • A Customer can write up a new Story for submission at the next Planning Game

  26. Typical XP Iteration • Periodically: • Customer redoes the Release Plan to ensure: • deadlines will be met • Stories are ordered by highest business value • Every 6-8 iterations: re-estimate all remaining Stories: • Usually enough common code has been done to lower many estimates. • Developers have much more knowledge about the system, so new estimates will be more accurate.

  27. Typical XP Iteration • At any point, we have: • 100% unit-tested code with 0 bugs and 100% mercilessly refactored code, i.e. clean code, clean design • All Acceptance Tests for completed Stories pass (there will be failing Acceptance Tests for Stories incomplete or not yet begun) • A running system that contains as much business value as possible to date and very little incomplete code • The ability to stop the project with very little impact • The ability to pause the project and temporarily divert resources with very little impact

  28. Recommendations • Read: the White Book, “Extreme Programming Explained”; the Green Book, “Planning Extreme Programming”; many others are available • Hire an experienced XP Coach for the first project / first few iterations

  29. Recommendations • If you plan to use XP for a legacy project, introduce only one or two practices at a time. I recommend these as your first two: • 1-week Iterations. Fallout: • Better build and development environment • Better planning and delivery (which can lead into Stories, Release Planning, the Planning Game) • Quicker turnaround times for errors and change requests • Pressure on QA to test recent changes (which can lead into Acceptance Tests, Unit Tests)

  30. Recommendations • Automated Acceptance Tests. Fallout: • Bugs get found! • Pressure to fix bad architecture and bad code, and to make code easier to maintain (which can lead into Unit Tests, Merciless Refactoring, Simple Design) • A shorter development cycle becomes a possibility • Potential to safely rewrite/refactor for testability and maintainability

  31. The End
