
Load Testing Templates: Artifacts of NYU’s Load Testing Implementation


Presentation Transcript


  1. Load Testing Templates: Artifacts of NYU’s Load Testing Implementation. Max Whitney & Erik Froese, NYU/ITS/eServices

  2. Repeatable Load Testing: Wow, it really is that time-consuming. Max Whitney, NYU/ITS/eServices, max@nyu.edu

  3. Context

  4. Current Environment • Blackboard Learning System • Fall 2006 • 5,219 sections • 127,899 enrollments • 3,964 instructors & 38,624 students • Spring 2007 • 4,952 sections • 146,889 enrollments • 4,329 instructors & 38,559 students

  5. Under-served Communities • School of Medicine • Math and Sciences • School of Continuing and Professional Studies: The Virtual College, 100% online program • School of Law: LL.M. in Tax, hybrid online and in person program

  6. Long Time Listener, First Time Caller • NYU joined as a Sakai Educational Partner in Summer 2004 • Local pilots • Stern School of Business • Computer Science Department of Courant Institute of Mathematical Sciences • Enterprise Pilot: September 2006

  7. Phased Pilot • Lacking the roll-out support of commercial software, NYU is rolling out in phases, building the service with each phase: • Documentation • User Support • Pedagogy Support • Critical System Support: • Security • Integration • High Availability • Performance Assurance • Disaster Recovery

  8. Snowballing to Production

  9. January 2007 Phase 1 • Dashboard (My Workspace) • Web Content • Announcements • Resources • Syllabus • Assignments • Roster • Default Discussion Board • Course Site Statistics • Gradebook • Help • Email Archive • 23 Course Sites • 20 Faculty • 891 Students • Backup & Disaster Recovery Implemented • Penetration Testing Passed & Security Report Filed • Smaller Summer Sets - 2 Phases

  10. Critical System Support for Phase 3 - September 2007 • Integration with Student Information System • Performance Assurance

  11. Repeatable Load Testing • Select Software • Select Hardware • Identify Network Requirements • Identify Ancillary System Requirements • Commit to Long-term Retention Method for Tests, Results & Configurations • Describe Test Cases • Identify User and Site Characteristics • Identify Executive Stakeholders • Identify Load Characteristics • Implement Test Cases • Create Test Users • Create Test Sites • Create Test Site Content • Test • Store Configuration and Results • Reset • Analyze Results • Report to Executive Stakeholders

  12. Load Testing Software • Key questions • Is there a budget? • Are there reasons to go with a proprietary or an open source tool? • Does it matter if you can share your test case code? • Where should it be in the adoption cycle? • What internal expertise exists?

  13. NYU’s Choice • The Grinder 3 • No budget • Charged with using open source where mature • Others can use the resulting test cases • Neither bleeding edge nor nearing obsolescence • Jython/Python scripting in a Java framework
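
  As a rough illustration of what “Jython/Python scripting in a Java framework” looks like in practice, a minimal Grinder 3 script might resemble the sketch below; the URL, host name, and test description are placeholders, not one of NYU’s actual test cases.

      # Minimal Grinder 3 test script sketch (Jython); URL and test number are placeholders.
      from net.grinder.script import Test
      from net.grinder.script.Grinder import grinder
      from net.grinder.plugin.http import HTTPRequest

      # Each Test has a number and a description; The Grinder reports timings per test.
      loginPage = Test(1, "Request Sakai portal page").wrap(HTTPRequest())

      class TestRunner:
          # The Grinder creates one TestRunner per worker thread and calls it once per run.
          def __call__(self):
              result = loginPage.GET("http://sakai-test.example.edu/portal")
              grinder.logger.output("Portal returned HTTP %d" % result.getStatusCode())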

  14. Artifacts • Installation and Configuration Cheat Sheet • In progress • Shared

  15. Load Testing Hardware • Key questions • Buy or borrow? • Where should machines live? • How many? • Bleeds into Network Requirements

  16. NYU’s Choices • Console • Borrow • Least secure network consistent with data stored • Long term (near-permanent) cname • Agents • Borrow • On same subnet as application, on NYU-NET, off NYU-NET, across VPN, across NYU-DIAL, on NYU-ROAM • Number desired will be derived from test case characteristics • Number used will be negotiated with ‘volunteer’ organizations • Network Requirements • Remember to handshake with the NOC early

  17. Artifacts • Derivation of agent machine counts • In progress • Shared • Machinery • Not shared
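
  The derivation document is still in progress; in the meantime, the arithmetic behind an agent-count estimate might look like the sketch below, with every figure illustrative rather than NYU’s actual numbers.

      # Illustrative back-of-the-envelope agent-count estimate; not NYU's real figures.
      target_concurrent_users = 1000   # peak simultaneous virtual users the test must drive
      processes_per_agent = 2          # worker processes one borrowed machine can sustain
      threads_per_process = 50         # threads per process before the agent itself bottlenecks

      users_per_agent = processes_per_agent * threads_per_process      # 100
      agents_needed = -(-target_concurrent_users // users_per_agent)   # ceiling division -> 10

      print("Agent machines needed: %d" % agents_needed)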

  18. Ancillary Systems • Key questions • What are you really testing? • Are there any non-Sakai systems in the mix?

  19. NYU’s Answers • LDAP • Not testing the speed of our LDAP response • Don’t slow other production systems • Don’t expose real user data • Set up an independent LDAP system with well-specified test accounts and test passwords • Automated data processes • Future issue • Disable non-core processes for duration of load testing • Schedule specific load tests to learn the impact of batch processes on user response times
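
  A sketch of what well-specified test accounts could look like when generated as LDIF for the independent test LDAP server; the DN suffix, object class, naming convention (loadtest0001, …), and account count are assumptions for illustration only.

      # Generate LDIF for test accounts on the stand-alone load-test LDAP server.
      # All names, attributes, and the shared test password are illustrative.
      BASE_DN = "ou=people,dc=loadtest,dc=example,dc=edu"
      TEST_PASSWORD = "loadtest-password"   # known shared test password; never a real credential

      def ldif_entry(n):
          uid = "loadtest%04d" % n
          return "\n".join([
              "dn: uid=%s,%s" % (uid, BASE_DN),
              "objectClass: inetOrgPerson",
              "uid: %s" % uid,
              "cn: Load Test User %d" % n,
              "sn: User%d" % n,
              "userPassword: %s" % TEST_PASSWORD,
              "",
          ])

      with open("test-accounts.ldif", "w") as f:
          for n in range(1, 1001):   # 1,000 test accounts
              f.write(ldif_entry(n) + "\n")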

  20. Artifacts • None • Share test account specifications • Missing something shareable?

  21. Long Term Retention • Key questions • Retain results? • Retain test scripts? • Retain network and machine specs? • Retain configuration data? • Who is the data steward? • Where to keep it all?

  22. NYU’s Choices • Retain Network and Machine Specs • Retain Test Scripts • Retain Results • Retain a Record of Versions and Configuration Settings • Without a formal QA department, Dev is the Steward • Subversion repository

  23. Artifacts • Subversion repository recommendations • Will share own best practices for tagging/branching svn repository • Missing something shareable?
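
  Pending NYU’s own best-practices write-up, one plausible layout and tagging convention (repository URL and tag names are placeholders) is sketched below: keep scripts, specs, and configuration on trunk, and cut a tag per executed run so results can always be traced back to exactly what produced them.

      loadtest/
        trunk/      test scripts, grinder configuration, machine & network specs, docs
        results/    raw results, one directory per run
        tags/       one read-only tag per executed run, e.g. run-2007-09-phase3-01

      svn copy https://svn.example.edu/loadtest/trunk \
               https://svn.example.edu/loadtest/tags/run-2007-09-phase3-01 \
               -m "Scripts and configuration exactly as used for the run"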

  24. Itemize Test Cases • Derived from existing Learning Management Systems in production • Derived from actual Sakai usage in production • Theorized for new tools

  25. Blackboard Usage • Statistics information is exported to an offline database schema (bb_bb60_stats) • Query for counts • Time blocks: pre-semester, first week, midterms, reading week, finals, one-time events • Analyze events • Translate to Sakai equivalents • Derive relative counts of each activity for each time block • “Administrators have open access to the statistics database to use for analysis and creating reports.” (http://library.blackboard.com/docs/r7/70/en_US/admin/bbas_r7_0_admin/advanced_system_reporting.htm and http://library.blackboard.com/docs/cp/learning_system/release6/administrator/advanced_system_reporting.htm)
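
  A sketch of the kind of per-time-block count query behind that analysis, wrapped in Python; the table and column names (activity_accumulator, event_type, timestamp) follow common Blackboard reporting examples but should be verified against your own bb_bb60_stats schema, and the Oracle driver and connection details are assumptions.

      # Count Blackboard events per time block from the offline stats schema.
      # Table/column names and the cx_Oracle driver are assumptions to verify locally.
      import cx_Oracle

      COUNT_EVENTS_SQL = """
          SELECT event_type, COUNT(*) AS hits
            FROM bb_bb60_stats.activity_accumulator
           WHERE timestamp BETWEEN :block_start AND :block_end
           GROUP BY event_type
           ORDER BY hits DESC
      """

      def count_events(conn, block_start, block_end):
          """Return [(event_type, hits), ...] for one time block (e.g. midterms week)."""
          cursor = conn.cursor()
          cursor.execute(COUNT_EVENTS_SQL, block_start=block_start, block_end=block_end)
          return cursor.fetchall()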

  26. Ah, Tedium

  27. Artifacts • Queries • Transliteration from leaf case to Sakai test • Multiplication factor from leaf case to event • In progress • Shared

  28. Sakai Usage • In pilot, little valid data • Biggest usage results turned out to reflect our pen testing

  29. Theoretical Usage • New tool means no historical data • Key is to identify and document all assumptions • Update assumptions as data accumulates

  30. Artifacts • Sakai site statistics queries • In progress • Shared • Assumptions worksheet • Not quite on the radar yet

  31. Elements of a Test Case • Name • Number - for correspondence to Grinder test • Human readable description • User type executing test case • Data assumed to exist before the test case starts • Success criteria - only successful tests are counted in timing statistics • Categorization - login, logout, content read, content create, discussion read, discussion write
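
  One way to capture those elements for each test case is a simple structured record; the field names and sample values below are illustrative, not an NYU artifact.

      # Illustrative test-case record covering the elements listed on the slide above.
      test_case = {
          "name":          "Student reads a discussion thread",
          "number":        12,                    # corresponds to the Grinder Test number
          "description":   "Student opens a course site, opens the discussion tool, "
                           "and reads one existing thread.",
          "user_type":     "student",
          "preconditions": ["course site exists",
                            "forum contains at least one thread",
                            "user is enrolled in the site"],
          "success":       "HTTP 200 and the thread subject appears in the response body",
          "category":      "discussion read",     # login, logout, content read/create, ...
      }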

  32. Artifacts • Test Case Documentation • Not started • Shared

  33. Implement Test Cases • The Grinder proxy captures clicks of a test case • Programmer effort • Clean up proxy output • Parameterize user data • Parameterize course site data • Check for success criteria • Not to be sneezed at • Test the Test Cases
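
  A sketch of what the cleaned-up proxy recording tends to become once user and site data are parameterized and a success check is added, so that only successful tests count toward timing statistics; the URL, data file name, and success criteria are placeholders.

      # Parameterized Grinder 3 test sketch with a success check; placeholders throughout.
      from net.grinder.script import Test
      from net.grinder.script.Grinder import grinder
      from net.grinder.plugin.http import HTTPRequest

      readSyllabus = Test(30, "Read syllabus page").wrap(HTTPRequest())

      # One "user,site" pair per line in a generated data file; each thread takes one.
      lines = open("users.csv").read().splitlines()

      class TestRunner:
          def __init__(self):
              self.user, self.site = lines[grinder.threadNumber % len(lines)].split(",")
              # A full script would first log in as self.user before reading content.

          def __call__(self):
              grinder.statistics.delayReports = 1   # allow marking the test after it runs
              result = readSyllabus.GET(
                  "http://sakai-test.example.edu/portal/site/%s/page/syllabus" % self.site)
              # Mark failed runs so they are excluded from timing statistics.
              if result.getStatusCode() != 200 or "Syllabus" not in result.getText():
                  grinder.statistics.forLastTest.success = 0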

  34. Artifacts • The Grinder Test Case Scripts • Not started • Shared

  35. Create Test Users and Sites • Now know user types required and relative counts of each type. • Now know data requirements for each test case, and the course sites and starting contents required. • Script the generation of users and sites, and of the data files used by The Grinder agents • LDAP account allocation • Permissioning within course sites • Content creation within course sites: seed content files, lorem ipsum
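
  A sketch of scripting that generation: a users-to-sites data file for the Grinder agents plus lorem ipsum seed files for the Resources tool; the counts, naming conventions, and file sizes are assumptions.

      # Generate the agent data file and seed content files; all names and counts illustrative.
      import random

      USERS = ["loadtest%04d" % n for n in range(1, 1001)]      # matches the test LDAP accounts
      SITES = ["LOADTEST-SITE-%03d" % n for n in range(1, 51)]  # 50 generated course sites

      # One "user,site" line per test user; each Grinder worker thread reads one line.
      with open("users.csv", "w") as f:
          for user in USERS:
              f.write("%s,%s\n" % (user, random.choice(SITES)))

      # Lorem ipsum seed files for the Resources tool, roughly the size of real uploads.
      LOREM = "Lorem ipsum dolor sit amet, consectetur adipiscing elit. " * 200
      for n in range(1, 21):
          with open("seed-content-%02d.txt" % n, "w") as f:
              f.write(LOREM)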

  36. Artifacts • User creation scripts • Course site creation scripts • Public domain content files • Not started • Shared

  37. Test • Set up grinder configuration: counts, ramp-up speed, duration • Install The Grinder on agent machines to speak to a common console • Check in to subversion all documentation, test cases and configuration data • Take a full backup of the target system • Run the test.
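
  A sketch of the grinder.properties an agent might be given for such a run; every value below (counts, ramp-up, duration, console host) is illustrative rather than NYU’s configuration.

      # grinder.properties sketch -- all values illustrative
      # Jython test script the worker processes will run
      grinder.script = sakai_load_test.py
      # Worker processes per agent and threads per process (2 x 50 = 100 simulated users/agent)
      grinder.processes = 2
      grinder.threads = 50
      # 0 runs = keep running until the duration (milliseconds) elapses; here, one hour
      grinder.runs = 0
      grinder.duration = 3600000
      # Ramp up: start one worker process every 60 seconds
      grinder.processIncrement = 1
      grinder.processIncrementInterval = 60000
      # Report to the shared console
      grinder.consoleHost = loadtest-console.example.edu
      grinder.consolePort = 6372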

  38. Test • A bit anticlimactic, really.

  39. Artifacts • Configuration files • Target system characteristics • Raw results • Not started • Shared

  40. Reset the System • Restore from the full backup

  41. Results • Commit the raw results to subversion alongside the tests and configurations used to generate the results

  42. Report • Report to stakeholders

  43. Repeat Until Done

  44. Test • Set up grinder configuration: counts, ramp-up speed, duration • Install The Grinder on agent machines to speak to a common console • Check in to subversion all documentation, test cases and configuration data • Take a full backup of the target system • Run the test.

  45. Reset the System • Restore from the full backup

  46. Results • Commit the raw results to subversion alongside the tests and configurations used to generate the results

  47. Report • Report to stakeholders

  48. Repeat Until Done

  49. Test • Set up grinder configuration: counts, ramp-up speed, duration • Install The Grinder on agent machines to speak to a common console • Check in to subversion all documentation, test cases and configuration data • Take a full backup of the target system • Run the test.

  50. Reset the System • Restore from the full backup
