
Testing @ WP5: International F2F meeting for testing and quality assurance, Sept. 20th 2010, Vienna



  1. Susanne Kamptner, Test coordinator. Testing @ WP5: International F2F meeting for testing and quality assurance, Sept. 20th 2010, Vienna

  2. Agenda: 20th of September, 10h to ca. 18h
  • #1. 10h - 10h15: Welcome / goals of the meeting / goals of interoperability test (Skulason / Kamptner)
  • #2. 10h15 - 10h45: Presentation of general testing approach (Kamptner)
  • #3. 10h45 - 11h15: Status on national implementation and testing activities (all participants)
  ----- coffee break -----
  • #4. 11h45 - 12h45: Presentation of experiences from first test session (Birgisson / Bausa / Helger)
  ----- lunch break -----
  • #5. 14h15 - 15h30: Presentation and discussion of test dimensions and test scenarios (Helger / Kamptner)
  • #6. 15h30 - 16h: Presentation of Test case management Tool (Helger)
  ----- coffee break -----
  • #7. 16h30 - 17h30: Planning of next test phase (all participants)
  • #8. 17h30 - 18h: Wrap up & feedback, to-dos and next steps (Kamptner / Skulason)

  3. Interoperability Testing #1/1
  In PEPPOL WP5, interoperability testing means...
  • a measure of (software) quality assurance to make sure the defined specifications of the WP5 deliverables work together when exchanging PEPPOL documents
  • the PEPPOL project-internal step of evaluating the pilot implementations within the scope of WP5
  It does not mean...
  • that we will do the (internal) testing of individual / national implementations and interfaces
  • that we are going to run the full document transfer process for all CEN profiles with all profile details
  It aims to...
  • find potential bugs and problematic issues within the profile implementations before "real" test pilot partners might detect them
  • be a crucial measure of QA within the general PEPPOL quality framework

  4. Aim of Testing #1/2
  In general terms:
  • the earlier you detect issues (and mitigate them), and
  • the more you know about the risk levels, i.e. how an issue will affect the solution,
  the less money and resources you have to spend correcting mistakes after the final release.

  5. General approach / terminology #2/1
  The general testing approach is based on the CEN/ISSS Test Guidelines. Test types used:
  • Unit Test: done by technical staff to ensure that the code works as defined; generally executed for a single component (e.g. national validation)
  • Integration Test phase 1: done to ensure that the overall technical and functional design of one solution / solution package works as defined; generally executed for more than one component / function (e.g. national validation and individual backend processing)
  • Integration Test phase 2: done to ensure that the solution / solution package works together with external interfaces and data exchange (e.g. exchange of documents with a second individual solution)
  • System Test: done to ensure that the whole business process works together with external interfaces and data exchange (e.g. invoicing workflow with the transport infrastructure)
  • Acceptance Test: done to ensure that users (e.g. suppliers) will accept the solution for their business purposes
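  As a reference point for the unit-test level, a minimal sketch of what a unit test for a single validation component could look like is shown below (JUnit 5). The InvoiceValidator class is only a placeholder for a participant's own component, not part of the WP5 deliverables.

```java
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.*;

class NationalValidationUnitTest {

    // Placeholder for a participant's own validation component; in a real
    // unit test this would be the actual class under test, not a stub.
    static class InvoiceValidator {
        boolean isValid(String invoiceXml) {
            return invoiceXml != null && invoiceXml.contains("<Invoice");
        }
    }

    @Test
    void acceptsWellFormedSampleInvoice() {
        assertTrue(new InvoiceValidator().isValid("<Invoice>...</Invoice>"));
    }

    @Test
    void rejectsEmptyInput() {
        assertFalse(new InvoiceValidator().isValid(""));
    }
}
```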

  6. The testing process (1) #2/2
  Phases: Test planning → Test preparation → Test case design → Test execution → Defect management → Reporting
  Test planning:
  • Requirements analysis: determine which aspects of a design are testable and with what parameters those tests work
  • Test concept creation
  • Test environment and infrastructure definition (e.g. hardware, software)
  • Definition of the functions or components to be tested, based on the requirements analysis
  • Definition of training for the testers: each tester must at least be familiar with the test process, the tools and the defect workflow
  • Definition of acceptance criteria
  • Organise the test team and establish a plan: who, when, how, allocation of resources and budget, ...

  7. The testing process (2) #2/3
  Phases: Test planning → Test preparation → Test case design → Test execution → Defect management → Reporting
  Test preparation:
  • Kick-off workshop for the test team
  • Organise and set up the test environment and infrastructure
  • Definition of the test data set and test scripts to use in testing the software
  Test case design:
  • Write, review and release the test cases
  Test execution and documentation:
  • Testers execute the defined test cases based on the plan and report any errors found to the development team
  • Defect retesting: once a defect has been solved by the development team, it is retested by the testing team
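  For illustration only, the sketch below shows the kind of information a written test case might capture before it is released into the test case management tool; the field names are assumptions, not the schema of the actual WP5 tool.

```java
import java.util.List;

// Illustrative test case structure; not the WP5 tool's data model.
public record TestCase(
        String id,             // e.g. "WP5-VAL-001"
        String testType,       // Unit, Integration phase 1/2, System, Acceptance
        String precondition,   // e.g. "component is unit tested"
        List<String> steps,    // ordered manual or scripted steps
        String expectedResult, // a clear pass/fail criterion
        String tester) {       // responsible participant

    public static TestCase example() {
        return new TestCase(
                "WP5-VAL-001",
                "Unit",
                "National validation component deployed locally",
                List.of("Load sample invoice", "Run all validation layers"),
                "Validation report states 'valid'",
                "unassigned");
    }
}
```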

  8. The testing process (3) #2/4
  Phases: Test planning → Test preparation → Test case design → Test execution → Defect management → Reporting
  Defect management:
  • Record the defects and monitor bug fixing
  • Defect retesting: once a defect has been solved by the development team, it is retested by the testing team
  • Regression testing: it is common to keep a small test program built from a subset of tests and to run it for each integration of new, modified or fixed software, in order to ensure that the latest delivery has not broken anything and that the software product as a whole is still working correctly
  Reporting on the outcome per phase:
  • Once the test meets the exit criteria, activities such as capturing the key outputs, results, logs and documents related to the project are to be reported
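  One possible way to keep such a small regression subset executable on every delivery is to tag the relevant tests, as sketched below with JUnit 5; the assertion is a placeholder, not WP5 code.

```java
import org.junit.jupiter.api.Tag;
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;

class RegressionSubsetTest {

    // Tagged test representing a previously fixed defect that must stay fixed.
    @Test
    @Tag("regression")
    void previouslyFixedDefectStaysFixed() {
        assertEquals(4, 2 + 2); // placeholder assertion
    }
}
```

  With Maven Surefire and the JUnit Platform, for example, `mvn test -Dgroups=regression` typically runs only the tagged subset on each new delivery.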

  9. Preconditions for the interop test #2/5
  Components delivered to the interoperability test must be test-ready, which means:
  • software artefacts MUST be unit tested on an individual (e.g. national) level
  • software SHOULD be integration tested with other national components (if necessary and if an end-to-end process has been chosen for specific test areas)

  10. Tools and support #2/6
  Several tools will be provided to establish a shared testing process within WP5:
  • Overall WP5 test plan: will be an outcome of this meeting
  • Test tool (web interface) containing:
    • test cases (test case management)
    • test execution (documentation of results on the basis of the test cases)
    • shared documents (e.g. test guidelines, lessons learned, ...)
    • link with defects on OSOR
  • Test report template: to be used by each participant to report the outcome of each test phase (the sum of all test executions for a certain test type)
  • OSOR forum:
    • report bugs (defect management)
    • share experiences on detailed technical questions (cross-WP)
  • Skype / Adobe Connect meetings: share and discuss lessons learned

  11. Status & Discussion #3
  • Status of implementation per participant
  • Status of individual testing per participant

  12. Experiences from the first test session #4
  Hands-on demonstration of findings

  13. Functional dimensions #5/1
  • Generate invoice to send
  • Transformation (process: optional; test: required if present)
    • transform outgoing invoice to UBL
    • transform incoming invoice from UBL
  • Validation (process: optional; test: required), following the "Temple of Validation":
    • technical structure (XSD)
    • core data set (CEN BII) (Schematron)
    • profile data set (profiles 04 and 06) (Schematron)
    • optional customisation: cross-border / domestic / industry-specific (Schematron)
    • optional bilateral schemes (Schematron)
  • Visualization (optional)
    • display on screen
    • display on print-out
    • translation of texts (where applicable)
  • Verify that all BII core fields are present
  • Process received invoice
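  To make the layering concrete, the sketch below chains an XSD check with successive Schematron layers, assuming each Schematron layer has already been compiled to XSLT; the file names and the simplified SVRL check are illustrative assumptions, not the WP5 reference setup.

```java
import javax.xml.XMLConstants;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;
import java.io.File;
import java.io.StringWriter;

public class LayeredInvoiceValidation {

    public static void main(String[] args) throws Exception {
        File invoice = new File("invoice.xml");

        // Layer 1: technical structure against the UBL invoice XSD.
        Validator xsd = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI)
                .newSchema(new File("xsd/UBL-Invoice-2.0.xsd"))
                .newValidator();
        xsd.validate(new StreamSource(invoice)); // throws SAXException if invalid

        // Layers 2..n: Schematron rules (core data set, profile, customisations),
        // each applied here as a pre-compiled XSLT producing an SVRL report.
        String[] layers = {"sch/BII-core.xslt", "sch/profile-04.xslt"};
        TransformerFactory tf = TransformerFactory.newInstance();
        for (String layer : layers) {
            Transformer t = tf.newTransformer(new StreamSource(new File(layer)));
            StringWriter svrl = new StringWriter();
            t.transform(new StreamSource(invoice), new StreamResult(svrl));
            // Simplified pass/fail check on the SVRL output.
            boolean failed = svrl.toString().contains("failed-assert");
            System.out.println(layer + ": " + (failed ? "invalid" : "valid"));
        }
    }
}
```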

  14. Object-to-test dimensions #5/2
  • Data structure / involvement of partners (applicable for all functionalities):
    • dummy data / dummy partners
    • real-life simulation / demo client with simulated CA/EO
    • real-life data / pilot partner
  • Transformation (optional):
    • transform valid and invalid documents
    • transformation success or failure
  • Validation (required):
    • test each validation layer with valid and invalid files
  • Document creation (optional):
    • manual
    • application generated
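  One way to cover the "valid and invalid files per validation layer" dimension is a parameterised test over a small file/layer matrix, as sketched below; the validate(...) helper and the sample file names are purely illustrative stand-ins.

```java
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.CsvSource;
import static org.junit.jupiter.api.Assertions.assertEquals;

class ValidationLayerMatrixTest {

    @ParameterizedTest
    @CsvSource({
            "samples/xsd-valid.xml, XSD, true",
            "samples/xsd-broken.xml, XSD, false",
            "samples/core-valid.xml, BII-CORE, true",
            "samples/core-missing-id.xml, BII-CORE, false"
    })
    void eachLayerAcceptsValidAndRejectsInvalidFiles(String file, String layer, boolean expected) {
        assertEquals(expected, validate(file, layer));
    }

    // Hypothetical helper; in practice this would delegate to the layered
    // validation (XSD + Schematron) of the individual implementation.
    private boolean validate(String file, String layer) {
        return !file.contains("broken") && !file.contains("missing");
    }
}
```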

  15. Test scenario 1 #5/3
  Sending process:
  • Generate invoice (a. domestic, b. cross-border) → expected result: electronic invoice in any format
  • Optional transformation to PEPPOL Invoice (according to the individual implementation) → expected result: PEPPOL Invoice
  • Validate PEPPOL Invoice (all layers) → expected result: validation report with a clear "valid" or "invalid"
  • Route invoice → expected result: document is retrieved by the sending AP implementation

  16. Test scenario 2 #5/4
  Receiving process:
  • Route invoice (a. domestic, b. cross-border) → expected result: document is retrieved by the destination AP implementation
  • Validate PEPPOL Invoice (all relevant layers) → expected result: validation report with a clear "valid" or "invalid"
  • Optional transformation from UBL (according to the individual implementation) → expected result: electronic invoice in any format
  • Process invoice in the backend system → expected result: backend-system-specific behaviour
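  As an illustration of how either scenario could be automated by a participant, the sketch below walks through scenario 1 (sending) as a single JUnit 5 test with an assertion per expected result; all helper methods are hypothetical stubs standing in for the participant's own components.

```java
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.*;

class SendingProcessScenarioTest {

    @Test
    void domesticInvoiceReachesTheSendingAccessPoint() {
        // Step 1: generate invoice -> expected: electronic invoice in any format.
        String nationalInvoice = generateInvoice("domestic");
        assertNotNull(nationalInvoice);

        // Step 2 (optional): transformation -> expected: PEPPOL (UBL) invoice.
        String peppolInvoice = transformToPeppol(nationalInvoice);

        // Step 3: validate all layers -> expected: a clear "valid" report.
        assertTrue(validateAllLayers(peppolInvoice), "validation report must say 'valid'");

        // Step 4: route invoice -> expected: retrieved by the sending AP implementation.
        assertTrue(routeToSendingAccessPoint(peppolInvoice));
    }

    // Hypothetical stubs standing in for the participant's own implementation.
    private String generateInvoice(String kind)       { return "<Invoice/>"; }
    private String transformToPeppol(String invoice)  { return invoice; }
    private boolean validateAllLayers(String invoice) { return invoice.startsWith("<Invoice"); }
    private boolean routeToSendingAccessPoint(String invoice) { return true; }
}
```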

  17. Test packages #5/5
  These are the test packages each participant should perform, based on the previously defined dimensions:
  • Generate invoice to send
  • Transformation to UBL
  • Validation of outgoing invoice
  • Route outgoing invoice to sending AP → send on infrastructure
  • Route incoming invoice from sending AP
  • Validate incoming invoice
  • Transformation from UBL
  • Process invoice in backend system
  • Perform full process (with a partner) as a sender
  • Perform full process (with a partner) as a receiver

  18. Process for each test track #5/6
  • Choose the test packages you want to test
  • Decide on the profiles to test
  • Choose the object-to-test dimensions
  • Choose and/or design the test cases you want to execute
  • Optional: choose a partner to execute the test cases with
  • Execute the test cases
  • Document the test executions (test case tool, test report); report, fix and re-test bugs
  • Share lessons learned
  General: the pilot requirements as presented in the phone meeting of 30th of August have to be met, so each requirement has to be tested!

  19. Test Tool #6
  Live demo of the Test case management Tool: www.phloc.com/peppol/config

  20. Test planning #7
  Who will do which test package, and when:
  • on an individual level
  • together with a (dummy) WP5 partner
  • together with a real pilot partner
  • using the demo client
  • also using the WP8 transport infrastructure
  • ...
  -> planning template to be completed in the meeting and published after the meeting

  21. Wrap up & Feedback, To-dos #8
  Feedback:
  • it might be tricky to get test data (invoice samples)
  • How to deal with issues found during the test? -> report and discuss at OSOR; if no solution can be found -> escalation to the WP manager
  To-dos:
  • Carry on working out the reference model (Georg + beneficiaries)
  • Deliver test invoices for Denmark, Greece, (France)
  • Validate existing test invoices (all beneficiaries)
  • Publish documentation of the meeting (Kamptner) – 24.9.2010
  • Establish overall documentation on testing for new PEPPOL partners (Kamptner, Helger, Lefteris, Arianna, Juhani / Gabriel)
  • Do the detailed test planning (each beneficiary; xls) – 27.9.2010
  • Report a detailed statement of the individual testing process to be executed (each beneficiary; reporting template) – 1.10.2010
  • Do the detailed test execution reporting (each beneficiary)
