
Fast and Thorough: Quality Assurance for Agile Data Warehousing Projects


Presentation Transcript


  1. Fast and Thorough: Quality Assurance for Agile Data Warehousing Projects

  2. What?

  3. Test Types (1 of 4)
  • Unit
    • Evaluate the quality of a single developer story (see the sketch after this list)
    • Perhaps a single ETL mapping or a session
    • Mostly functional, some system metadata
  • Component
    • Unit-test-style verification of an assembly of units, perhaps a workflow
  • Integration
    • Evaluate the coherence of the full application, as far as it exists to date
    • Add data scenarios, e.g., nominal, dirty data, missing data, EOP processing
  • System
    • All of the above, but most likely a subset conducted formally as a final certification
    • Add very technical tests of operational topics
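As a rough illustration of the unit level, here is a minimal sketch of a test for a single ETL mapping. The transform_customer function and the sample rows are hypothetical stand-ins, not anything from the deck; the point is simply that one mapping's output can be asserted against an SME-style example.

    # Minimal sketch of a unit test for one ETL mapping (hypothetical names).
    # transform_customer stands in for a single mapping/session under test.
    def transform_customer(row):
        """Hypothetical mapping: trim name fields and derive full_name."""
        first = row["first_name"].strip()
        last = row["last_name"].strip()
        return {"first_name": first, "last_name": last, "full_name": f"{first} {last}"}

    def test_transform_customer_nominal():
        # Input and expected output modeled on an SME-supplied example
        source_row = {"first_name": " Ada ", "last_name": "Lovelace"}
        expected = {"first_name": "Ada", "last_name": "Lovelace", "full_name": "Ada Lovelace"}
        assert transform_customer(source_row) == expected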

  4. Test Types (2 of 4)
  • Functional: Does an object meet its business requirements?
  • Examples: Does it transform a particular set of input as predicted by an example from an SME?
  • Story Tests: Did the product owner accept a user story during the user demo at an iteration's end?
  • Simulations: Does it transform an entire set of data as predicted by the project's lead roles?
  • Alpha Tests: Can the team get the app to behave when they take the role of users?

  5. Test Types (3 of 4)
  • Scenarios: Consider a compound business situation: can the app support all needs at once?
  • Exploratory Testing: Consider the edges of the business requirements: is other functionality needed?
  • Usability Testing: Will the app sustain the business functions of each "user persona" we intend to support?
  • UAT: Does the app deliver on every case of a structured, business-run system appraisal?
  • Beta Tests: Does the system perform when real users interact with it and when it's loaded with realistic data?

  6. Test Types (4 of 4)
  • Performance Tests: Does the app have acceptable response times under a normal load? (see the timing sketch after this list)
  • Load Tests: …under the highest conceivable load?
  • Security Tests: Is the system sufficiently resistant to a wide range of techniques to reveal and/or compromise its data?
  • Non-Functional Requirements Tests: Does the application meet a wide range of criteria for long-term inclusion in the department's IT platform and low total cost of ownership?
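As one hedged sketch of how a performance check can be automated, the snippet below times a representative query and fails if it exceeds a response-time budget. The database file, table name, and five-second budget are illustrative assumptions, not figures from this deck.

    # Sketch of an automated performance check: time one representative query
    # and compare it to a response-time budget. The database file, table, and
    # budget below are illustrative assumptions.
    import sqlite3  # stand-in for the warehouse's actual driver
    import time

    RESPONSE_BUDGET_SECONDS = 5.0

    def test_report_query_under_budget():
        conn = sqlite3.connect("warehouse.db")  # hypothetical warehouse file
        start = time.perf_counter()
        conn.execute("SELECT COUNT(*) FROM sales_fact").fetchone()  # representative query
        elapsed = time.perf_counter() - start
        conn.close()
        assert elapsed <= RESPONSE_BUDGET_SECONDS, f"query took {elapsed:.2f}s"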

  7. Target Column Test Types

  8. Who?

  9. Project Artifact: QA Role & Responsibilities

  10. Where?

  11. Techniques to Choose From
  • Unit testing: the developer's standard testing technique
  • Systems analyst inspection: manual data inspection by a systems analyst, especially on matters concerning business rules and source-to-target mappings
  • Actual-data analytics: scripts, usually of SQL commands, run against actual results
  • Actual-to-expected result comparisons: usually scripts employing SQL "minus" commands (see the sketch after this list)
  • UAT subset: the appropriate items from the evolving user acceptance script that the solution designer and product owner are accumulating
  • Full UAT: execution of the full user acceptance test script for the release
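A minimal sketch of an actual-to-expected comparison along these lines: the script below runs set-difference queries in both directions between an actual target table and an expected-results table. The table names and database file are hypothetical, and EXCEPT stands in for MINUS because the example uses SQLite, which lacks the MINUS keyword.

    # Sketch of an actual-to-expected comparison via set-difference queries.
    # Table names (customer_dim, customer_dim_expected) and the database file
    # are hypothetical; EXCEPT substitutes for MINUS on SQLite.
    import sqlite3

    def rows_only_in(conn, left, right):
        """Rows present in table `left` but absent from table `right`."""
        return conn.execute(f"SELECT * FROM {left} EXCEPT SELECT * FROM {right}").fetchall()

    def compare_actual_to_expected(db_path="warehouse.db"):
        conn = sqlite3.connect(db_path)
        unexpected = rows_only_in(conn, "customer_dim", "customer_dim_expected")
        missing = rows_only_in(conn, "customer_dim_expected", "customer_dim")
        conn.close()
        if unexpected or missing:
            raise AssertionError(
                f"{len(unexpected)} unexpected row(s), {len(missing)} missing row(s)")

    if __name__ == "__main__":
        compare_actual_to_expected()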

  12. How?

  13. Data-Based Testing Scenarios
  • Nominal ("happy path")
  • Dirty Data (human-visible syntax flaws)
  • Corrupted Data (machine-visible syntax flaws)
  • Missing Rows / Tables (e.g., no customer records or file)
  • Incoherent Data (skipped or mis-sequenced files)
  • Duplicate Data (e.g., overlap between extracts; see the sketch after this list)
  • End-of-Period (e.g., month, quarter, year)
  • Archiving (purging of data that's too old)
  • Catch-Up (usually three days per run day)
  • Restart (test operation instructions)
  • High- or Full-Volume (performance and one-in-a-million errors)
  • Resource Outage (e.g., FTP node goes down)
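As one concrete instance of these scenarios, here is a minimal sketch of a check for the Duplicate Data case: after overlapping extracts are loaded, it asserts that no duplicate business keys survive in the target. The table name, key column, and database file are hypothetical placeholders.

    # Sketch of a Duplicate Data scenario check: after loading two overlapping
    # extracts, the target dimension should still hold each business key once.
    # Table, key column, and database file names are hypothetical.
    import sqlite3

    def find_duplicate_keys(conn, table="customer_dim", key="customer_id"):
        """Business-key values that appear more than once in the target table."""
        sql = f"SELECT {key}, COUNT(*) FROM {table} GROUP BY {key} HAVING COUNT(*) > 1"
        return conn.execute(sql).fetchall()

    def test_overlapping_extracts_deduplicated():
        conn = sqlite3.connect("warehouse.db")  # hypothetical warehouse file
        dupes = find_duplicate_keys(conn)
        conn.close()
        assert dupes == [], f"duplicate business keys found: {dupes}"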

  14. Where?

  15. When?
