
Presentation Transcript


  1. Danny, Be the Ball No Respect

  2. Dart2 Quality Framework Dan Blezek Jim Miller Bill Lorensen Software Quality: Past, Present, Future

  3. A brief history of quality • Very First VTK Dashboard • Update information • Builds – Irix, Solaris, WinNT • Regression tests • WinNT “Catastrophic failure”

  4. Last VTK Dashboard • 8 Platforms • 650 Nightly Tests • 70% coverage (Still) • Nightly Purify

  5. A Bad Day

  6. The Continuous Build

  7. Continuous Update Continuous Build My first VTK checkin

  8. The Big Bat of Quality

  9. Bill “Yogi” Lorensen

  10. Lessons learned • Test on different platforms • Test nightly • Make it easy to add a test • Track changes daily • Keep historical information

  11. Two roads diverged in a wood, and I-- I took the one less traveled by, And that has made all the difference. Robert Frost Closed Source

  12. Dart (v1) • Tests • Reports • Dashboards • Captures state of the system • Distills data into information • Convert build log → errors/warnings • Summarize test execution • Roll up coverage statistics

  13. Someone broke the build!

  14. Someone broke the build!

  15. Dart’s Power • Distributed testing • If I don’t have a platform, you do • Distill data from many tools • Distributed Extreme Programming • Know the state of the system • Instant feedback on changes

  16. Quality Statistics Original VTK dashboard • 8 platforms / 650 tests • 13.6 G over 4 years Current VTK dashboard • 29 Nightly platforms / 500 tests • 1-2 G / week Insight dashboard • 60 Nightly builds / 970 tests • 1-2 G / week

  17. Dart2 Design Goals • One Server, multiple Projects • Simple, flexible setup and management • Configurable presentation • Persist data on dashboard over time • Aggregate Dashboards • Authenticated submission, if desired • Resource management tools

  18. Implementation • Java • Many, many available packages • Cross-platform • Everything in one package • No extra OS packages required • Distribute as Jar and/or platform exe • Should be easily extensible • Even to non-Java programmers

  19. Packages

  20. Components/Concepts • Client, Submission • Test Hierarchy • Results • RDBMS • XML-RPC Server • Task Manager • Scheduler • Tracks • HTTP / Template Engine

  21. Client, Submission • Client: a unique platform • Need to define criteria • Currently Site / BuildName • Submission • One TimeStamped set of Test data • Particular to a sub-Project • slicer.itk • slicer.vtk

  22. Test Hierarchy • A Test is a logical group of Results • Has a Pass/Fail/NotRun status • May contain other Tests • Has a hierarchical naming convention • itk.common.PrintSelf • SubTest information rolled up
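
A rough sketch of the roll-up idea above, in Java (the class name, fields, and roll-up rule here are illustrative assumptions, not Dart2's actual API):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: each Test has a dotted hierarchical name
// (e.g. "itk.common.PrintSelf"), a status, and sub-Tests whose
// statuses roll up to the parent.
public class TestNode {
    enum Status { PASSED, FAILED, NOTRUN }

    final String name;
    Status ownStatus = Status.NOTRUN;
    final List<TestNode> children = new ArrayList<>();

    TestNode(String name) { this.name = name; }

    // Children extend the parent's dotted name, e.g. itk -> itk.common
    TestNode addChild(String leaf) {
        TestNode child = new TestNode(name + "." + leaf);
        children.add(child);
        return child;
    }

    // Assumed roll-up rule: any failing sub-Test fails the parent;
    // otherwise any passing sub-Test marks it passed; else NotRun.
    Status rollup() {
        Status result = ownStatus;
        for (TestNode c : children) {
            Status s = c.rollup();
            if (s == Status.FAILED) return Status.FAILED;
            if (s == Status.PASSED) result = Status.PASSED;
        }
        return result;
    }
}
```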

  23. Results • Data produced by a Test • Examples: • Log of standard out • Image • ExecutionTime • Typed • text/string, text/url, text/xml, text/text • numeric/integer, numeric/double • image/png, image/jpeg
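
The typed Results above reduce to a MIME-like type tag plus a value; a minimal sketch in Java (the class and field names are assumptions for illustration):

```java
// Hypothetical sketch of a typed Result: a MIME-like type string plus a
// value, mirroring the text/*, numeric/*, image/* families listed above.
public class Result {
    final String name;   // e.g. "ExecutionTime"
    final String type;   // e.g. "numeric/double", "text/string"
    final String value;  // small values inline; images would live in files

    Result(String name, String type, String value) {
        this.name = name;
        this.type = type;
        this.value = value;
    }

    boolean isNumeric() {
        return type.startsWith("numeric/");
    }

    // Interpret a numeric/* Result's value as a double
    double asDouble() {
        if (!isNumeric()) {
            throw new IllegalStateException("not numeric: " + type);
        }
        return Double.parseDouble(value);
    }
}
```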

  24. RDBMS • Core of Dart2 • Bundled with Derby embedded RDBMS • Any JDBC compliant DB works • Stores “small” data • Images, large blocks of text in files • Jaxor • Object – Relational Bridge package • No fancy SQL required • Creates objects from rows in DB

  25. XML-RPC Server • Accepts Submissions • Administrative functions • HTTP transport • Easy submission through firewalls • Digester used to process XML • Executes code when tags found

  26. Task Manager • Tasks are units of work for the server • Project and Server Tasks • Scheduled, Event driven • When a Submission arrives, a Task is queued • QueueManager executes Tasks • Plug-ins allow Project specific Tasks • Simply implement the Task Interface
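
The plug-in mechanism above can be pictured as a one-method Task interface plus a FIFO queue; this is a hedged sketch, not Dart2's actual interface:

```java
import java.util.ArrayDeque;
import java.util.Queue;

// Hypothetical sketch: project code implements Task; when a Submission
// arrives the server enqueues a Task, and the QueueManager drains the
// queue in FIFO order (a real server would run Tasks on worker threads).
interface Task {
    void execute();
}

class QueueManager {
    private final Queue<Task> queue = new ArrayDeque<>();
    private int executed = 0;

    void enqueue(Task t) {
        queue.add(t);
    }

    void runAll() {
        Task t;
        while ((t = queue.poll()) != null) {
            t.execute();
            executed++;
        }
    }

    int executedCount() {
        return executed;
    }
}
```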

  27. Scheduler • Quartz Enterprise Scheduler • Executes Tasks • Uses enhanced “cron” syntax • Uses • Regular DB maintenance • Purge unnecessary data • Archive aging data
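
Quartz's “enhanced cron” adds a leading seconds field and a `?` wildcard to classic cron. The expressions below are standard Quartz syntax; the maintenance jobs attached to them are assumptions for illustration:

```
0 0 0 * * ?     fire every day at midnight (e.g. purge unnecessary data)
0 0/30 * * * ?  fire every 30 minutes (e.g. archive aging Continuous data)
```

(fields: second, minute, hour, day-of-month, month, day-of-week)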

  28. Tracks • Groups of Submissions • Dashboard consists of intersecting Tracks • Temporal Tracks • Time based, i.e. 12am start, 24hr duration • “Most Recent” Track • Last 5 Continuous builds • Project specific
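
A Temporal Track's membership test reduces to a half-open time window; a minimal sketch (the class name and rule are assumptions, not Dart2 code):

```java
import java.time.Duration;
import java.time.Instant;

// Hypothetical sketch: a Submission belongs to a Temporal Track if its
// timestamp falls in [start, start + duration), e.g. 12am start, 24hr span.
public class TemporalTrack {
    final Instant start;
    final Duration length;

    TemporalTrack(Instant start, Duration length) {
        this.start = start;
        this.length = length;
    }

    boolean contains(Instant submissionTime) {
        return !submissionTime.isBefore(start)
            && submissionTime.isBefore(start.plus(length));
    }
}
```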

  29. HTTP / Template Engine • Jetty is the HTTP/Servlet server • FreeMarker • Data prepared in a Servlet • Template processed • Returned to client via HTTP • Flexible • Easy to add new pages • No XSLT!
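
The Servlet-prepares-data, template-renders-it split might look like this FreeMarker fragment (the model names `project` and `submissions` are illustrative assumptions, not Dart2's actual model):

```
<#-- Hypothetical fragment: the Servlet puts "project" and "submissions"
     into the model, then hands this template to FreeMarker. -->
<h1>${project} Dashboard</h1>
<table>
<#list submissions as s>
  <tr><td>${s.site}</td><td>${s.buildName}</td><td>${s.status}</td></tr>
</#list>
</table>
```

Adding a new page is then just a new template plus a small Servlet that fills the model, which is what makes the approach flexible.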

  30. Dart2 Current Status • Alpha version ready • Test server • http://www.na-mic.org:8081/Insight/Dashboard/ • Populated with Build & Test from public.kitware.com • Subversion Code Repository • svn co http://svn.na-mic.org:8000/svn/Dart • Web SVN: http://www.na-mic.org:8000/websvn/

  31. Acknowledgements • Andy Cedilnik • Bill Hoffman • Will Schroeder • Ken Martin • Amitha Perera • Fred Wheeler

  32. Dart2 Quality Framework Dan Blezek Jim Miller Bill Lorensen Software Quality: Past, Present, Future

  33. Desirable Qualities • Frequent testing • Identify defects as soon as they are introduced • Hard to find cause if not done frequently • Minimally invasive to daily activities • Automated testing • Automated report generation/summaries • Must be concise yet informative • Track results over time

  34. NAMIC Software Process Dan Blezek Jim Miller Bill Lorensen

  35. Motivation • Many algorithms, many platforms • VTK, ITK, Slicer, LONI • Linux, Windows, Mac OSX, Solaris, SGI • Many users • Many datasets • Many sources of problems!

  36. Motivation Negative example • MIT codes ITK algorithm for LONI pipeline • UCLA developer changes LONI • GE changes ITK • Time for release, everything’s broken

  37. Motivation Ensuring high quality software • System’s state must be known • If UCLA knew about MIT code, they would have been more careful w/changes • All the code works, all the time • As often as is feasible, compile and test the code

  38. Extreme Programming

  39. NAMIC Process • Light weight • Based on Extreme Programming • High intensity cycle • Design • Test • Implement • Supported with web-enabled tools • Automated testing integrated with the software development

  40. Software Process • Design Process • Coding Standards • Testing • Bug Tracker • Communication • Mailing lists, Discussion forum, Wiki • Tcons • Documentation • Releases

  41. Design Process • Take the time to design a good API • Plan for future use • Plan for future extension • Two routes • Code something, check it in • Others will tear it down & make it better • Put together a strawman • Solicit ideas, implement

  42. Coding Standards • Follow the package’s rules • ITK has certain coding standards • Style guidelines • Naming conventions • Macros

  43. Testing • If it isn’t tested, it’s broken. • Tests • Ensure your code works • Document expected results • Leave others free to change the code

  44. Bug Tracker • Bugs assigned / taken by developers • Tracks progress to releases • Captures feature requests • Communication mechanism

  45. Documentation • Doxygen • Automatic API documentation • Algorithm references • Implementation details • Books / Manuals • Insight Book

  46. Communication • Email lists • Discussion forum • Wiki • Tcon

  47. Extreme Programming • Compression of standard analyze, design, implement, test cycle into a continuous process
