
  1. QAPP outline

  2. Element 1: Title Page with Approval Signatures • Title of QAPP • Name(s) of organizations implementing project • Approval personnel • Assistance agreement or contract number(s)

  3. Element 2: Table of Contents • List of all required elements and their page numbers • Appendices • References

  4. Element 3: Distribution List Lists the people who will receive the original and revised QAPP: • Everyone who does the work • Everyone who manages them • The funding agency

  5. Element 4: Organization of Project • Governmental entities, contractors, and key individuals • Roles and responsibilities • How often will these tasks be done? • How will each person do their job? • To whom will they report?

  6. Examples of Agencies • Required Agencies • Tribal Environmental Program • USEPA • Other Agencies • TAMS Center • State and Local Partners

  7. Possible Contractors • Sampling • Laboratory • Data Analysis • QA/QC Audit

  8. Examples of Key Individuals • Required Tribal employees • Air Quality Program Manager • Quality Assurance Coordinator • Optional positions (may be contractors) • Environmental Specialist • Environmental Technician

  9. Program Manager: Roles & Responsibilities • Oversees monitoring project • Prepares or reviews quarterly & annual reports for submittal to EPA • Ensures staff are hired and trained

  10. Program Manager: Roles & Responsibilities (cont.) • Prepares & maintains project work plan & budget • Communicates with Environmental Director & EPA Project Officer • Responsible for approval & modification of project QAPP

  11. QA Coordinator: Roles & Responsibilities • Prepares or coordinates preparation of QAPPs • Reviews and approves corrective actions • Conducts system audits

  12. QA Coordinator: Roles & Responsibilities (cont.) • Oversees or conducts method performance audits • Prepares QA reports • Conducts or oversees data verification, validation, and assessment

  13. Environmental Specialist: Roles & Responsibilities • Conducts sample transport, handling & exchange • Delivers samples to laboratory (by mail or actual drop off) • Signs off on chain of custody forms

  14. Environmental Specialist: Roles & Responsibilities (cont.) • Conducts quarterly calibrations, quarterly audits • Records sample information on data forms • Reports all aspects of monitoring project to Program Manager

  15. Project Organizational Chart (Example) • John W. Smith, Director, Navajo Tribe • Alexandria Washington, Air Quality Program Manager • Sue Jones, QA Officer • Tom Lamb, Air Quality Technician • Samuel Vaughn, Air Quality Specialist • Michelle Winston, Air Quality Specialist • Also shown: Laboratory; EPA (as appropriate)

  16. Element 5: Project Background • History • Context • Assume an “ignorant” reader (e.g., a member of the public)

  17. Why is this work important? • Are there health effects in your community that may result from this problem (asthma, bronchitis)? • Reduced visibility?

  18. More reasons why this work is important: • Concern about possible regional transport of pollutant (ozone precursors)? • Increased development, more roads, businesses, residents? • Concern about children’s exposure?

  19. Summarize Existing Information • Previous results from earlier studies? • Results from nearby areas? (If you did not gather the data yourself, you may not be able to combine them with your own, but they can still be useful in planning.) • Results from emissions inventory? • Results from compliance monitoring of nearby facilities?

  20. Who are the Decision-makers? • Tribal Council • Tribal Environmental Office director • EPA Region

  21. Element 6: Project Description (Ondrea Barber, Salt River Pima-Maricopa Indian Community)

  22. Summarize Purpose of Project Why are we making these measurements?

  23. Standards What standard will the measurement results be compared against (if applicable)?

  24. Field Work-Summary • What kinds of measurements are being made? • What kind of samplers are being used? • How many measurements over what time period? • Site locations

  25. Field Activities-Summary • Routine field sampling • Sample collection • Monthly calibrations/audits • Instrument maintenance

  26. Laboratory Work • How are the samples being analyzed? • State the method • May refer to a standard method • Who is doing the analysis?

  27. Schedule 1. Hiring deadlines 2. Training • Dry runs with equipment 3. Field measurements 4. Analysis 5. Reporting

  28. Assessments • How will you check on your work and the lab’s work to ensure the data are good (summarize)? • Who is involved and what are their roles (summarize)?

  29. Records of Assessment • Internal assessments (readiness review) • External assessments • PEP audits for PM2.5, NPAP for others • Technical Systems Audits

  30. YOUR Assessments of the Analysis Lab • Initial review of their QAPP and calibration certificates when you agree to the contract • Onsite visit during the contract • Ongoing review of their QC results

  31. Records • BRIEF description of the project’s records • Information on where they are stored • Ensure that detailed information is in Section 19 • What reports are required?

  32. Element 7: Project Quality Objectives (Mathew Plate, US EPA Region 9)

  33. Project Objectives • Why are we making these measurements? • Conformance with NAAQS • Obtain baseline data • To determine need for additional monitoring • Health risk evaluation

  34. Project Objectives • What will we do with the results? • Compare with NAAQS • Report to community, EPA, health officials

  35. Systematic Planning • Required by grant regs: 48 CFR 46 • Performance criteria • QAPP or equivalent • Data assessment • Corrective action • QA training for management and staff

  36. Why Systematic Planning • Quality is the extent to which our data are sufficient for the purposes for which they are used • We need a process that defines objectives for our monitoring data and ensures that we know when those objectives are met • Program objectives should be developed in consultation with decision makers

  37. Decision Maker(s) Those who use data for decisions or conclusions, such as • Is a standard violated? • Should we take action to improve the air quality? • What is the air quality now, so that we will know if it gets worse or better? • Should we be taking more measurements?

  38. Who are the Decision Makers for Air Monitoring Data? • Required Decision Makers for EPA grants • EPA • The Tribe’s Environmental Program • Other Decision Makers the Tribe may consider in quality planning • State and local organizations • Researchers

  39. Types of Objectives • Project and Program Objectives • Data Quality Objectives (DQOs) • Based on Program Objectives • Qualitative and quantitative • Measurement Quality Objectives (MQOs) • Specific criteria which when met should produce data of acceptable quality • Quantitative

  40. Information in DQOs • How data will be used • Type of data needed • How data should be collected

  41. DQOs Work Backwards... • From: the degree of uncertainty you can tolerate in the answer • To: the acceptable degree of uncertainty in each measurement and the number of measurements to take
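
The "working backwards" step on the slide above can be sketched numerically: given a tolerable margin of error in the answer and an assumed measurement variability, a standard normal-approximation formula yields the number of measurements needed. The variability and error figures below are assumptions for illustration, not values from the presentation.

```python
import math

def samples_needed(sigma, margin_of_error, z=1.96):
    """Normal-approximation sample size: n = (z * sigma / E)^2,
    rounded up. z = 1.96 corresponds to 95% confidence."""
    return math.ceil((z * sigma / margin_of_error) ** 2)

# Assumed (hypothetical): day-to-day variability of 6 ug/m3,
# tolerable margin of error of 2 ug/m3 in the estimated mean
n = samples_needed(sigma=6.0, margin_of_error=2.0)
print(f"Measurements needed: {n}")  # prints 35
```

Tightening the tolerable error (smaller `margin_of_error`) drives the required sample count up quadratically, which is exactly the cost-versus-uncertainty trade-off discussed later in the deck.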

  42. DQO Functions DQOs • Link answers to actual measurements • Set limits on uncertainty so that data produce required uncertainty in the answer

  43. Example "We know that we meet the standard with 80% confidence; this means there is a 20% chance that we are wrong and are actually above the standard."
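
The confidence statement in the example above can be sketched with a normal approximation: the probability that the true mean lies below the standard, given a measured mean and its standard error. The annual mean, standard, and standard error below are hypothetical numbers chosen so the result comes out near 80%.

```python
import math

def confidence_below_standard(mean, standard_error, standard):
    """Probability, under a normal approximation, that the true mean
    is below the standard, given the measured mean and its standard error."""
    z = (standard - mean) / standard_error
    # Standard normal CDF expressed via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical: measured annual mean 11.2 ug/m3, standard 12.0 ug/m3,
# standard error of the mean 0.95 ug/m3
conf = confidence_below_standard(mean=11.2, standard_error=0.95, standard=12.0)
print(f"Confidence of meeting the standard: {conf:.0%}")  # prints 80%
```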

  44. Balancing Cost vs. Degree of Uncertainty • Balance the cost of sampling against the desired uncertainty in the decision • Taking many samples with expensive devices yields low decision errors • Taking few samples at low cost yields high decision errors • Result: you may have to change objectives, e.g., use minivols to see whether you need to monitor, then ask for more $

  45. Accuracy • DQOs are concerned with determining the accuracy of measurements

  46. Accuracy Accuracy = total error • Includes both bias and precision • Measured by audits against a known true value and/or by evaluating measurement quality objectives
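
As a minimal sketch of how the bias component is evaluated from an audit against a known true value: each instrument reading is expressed as a percent difference from the audit standard. The audit concentration and readings below are hypothetical.

```python
def percent_bias(measured, true_value):
    """Percent difference of a measured value from the audit (true) value."""
    return 100.0 * (measured - true_value) / true_value

# Hypothetical audit: three readings against a certified 100 ppb standard
readings = [96.0, 103.0, 99.0]
biases = [percent_bias(r, 100.0) for r in readings]
mean_bias = sum(biases) / len(biases)
print(f"Individual biases: {biases}")   # prints [-4.0, 3.0, -1.0]
print(f"Mean bias: {mean_bias:.1f}%")   # prints -0.7%
```

The mean of the percent differences estimates bias (systematic error); the scatter of the individual differences around that mean reflects precision (random error), so the two components of total error come from the same audit data.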

  47. Translating DQOs into Usable Criteria • DQOs should be defined in terms of data quality indicators • The criteria set for the data quality indicators are the measurement quality objectives (MQOs) • MQOs are set using empirical data, conservative assumptions, statistical assumptions, and/or common sense

  48. Data Quality Indicators • These are sometimes called the PARCCS • Precision (P) • Bias (A) (bias is sometimes called accuracy) • Representativeness (R) • Completeness (C) • Comparability (C) • Detectability (S) (also called sensitivity)

  49. Precision How well different measurements of the same thing, made under prescribed similar conditions, agree with each other • The “random” component of error: sometimes high, sometimes low

  50. Precision Precision =“wiggle” (variability within many measurements of the same thing) • You are trying to estimate the variability within the population of “all” your measurements of the same thing (concentration) • Two ways to estimate precision for a single instrument • If you have enough equipment, side-by-side, can be two or more devices measuring the same concentration • If you have only one continuous instrument, you must estimate precision by how much the measurement fluctuates over time when it is measuring the same concentration?