
Presentation Transcript


  1. The Do’s and Don’ts of MDL Verification Studies FSEA 2009 Spring Conference St. Petersburg, Florida May 21, 2009 Rock J. Vitale, CEAC, CPC Patrick A. Conlon Environmental Standards, Inc.

  2. MDL Historical Highlights
  The procedure for Method Detection Limit (MDL) determination was promulgated on October 25, 1984, in Appendix B to 40 CFR Part 136.
  • The regulated community and laboratories have technically criticized the procedure over the decades.
  • The procedure can represent an inaccurate expression of analyte/method sensitivity (and is labor intensive).
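For context on the procedure criticized above, the sketch below illustrates the classic Appendix B calculation: the sample standard deviation of at least seven replicate low-level spike results, carried through the entire method, multiplied by the one-sided Student's t value at 99% confidence. This is a minimal illustration; the replicate values and units are placeholders, not data from any study.

```python
# Minimal sketch of the classic 40 CFR Part 136 Appendix B MDL calculation.
# Assumes >= 7 replicate low-level spikes processed through the full method.
import statistics
from scipy import stats  # for the one-sided Student's t value

def calc_mdl(replicates, confidence=0.99):
    n = len(replicates)
    if n < 7:
        raise ValueError("Appendix B calls for at least 7 replicates")
    s = statistics.stdev(replicates)       # sample standard deviation (n-1)
    t = stats.t.ppf(confidence, df=n - 1)  # one-sided t at 99% confidence
    return t * s

# Illustrative replicate results in ug/L (placeholders only)
replicates = [0.52, 0.47, 0.55, 0.49, 0.51, 0.46, 0.53]
print(f"MDL = {calc_mdl(replicates):.3f} ug/L")  # t ~ 3.143 for 7 replicates
```

Because the result is driven almost entirely by the spiking level chosen and the scatter of the replicates, the calculated MDL can be pushed lower by careful optimization, which is the criticism raised on the next slide.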

  3. Other Issues With 40 CFR Part 136
  • The Optimization Factor – MDL studies can be, and have been, artificially optimized using instrument conditions and spiking and preparation factors to provide "amazingly low MDLs" for laboratories attempting to gain a competitive edge.
  • The Reality Factor – The reporting of positive results based on "amazingly low" MDL studies has resulted in NPDES permit violations when, in reality, the reported detection is instrument noise (viz., a false positive).

  4. MDL Historical Highlights: 2000 – Present
  • October 2000 – US EPA agreed to review the MDL procedure in response to lawsuits by several industry groups.
  • June 2003 – NELAC standards requiring laboratories to calculate an annual Limit of Detection (LOD; viz., MDL) for each analyte, method, and matrix performed were approved. LOD calculations are only required when the laboratory reports below the Limit of Quantitation (LOQ).
  • June 2003 – The NELAC requirement that accredited laboratories must perform annual LOD (MDL) and LOQ verification studies on each instrument used to report sample data was approved. LOQ verification is only required when the laboratory does not calculate an LOD.

  5. MDL Historical Highlights (Cont.)
  • October/November 2004 – The US EPA decided that stakeholder meetings (eventually called FACDQ) should be held to discuss the issue; the Agency withdrew a proposed MDL procedure that was no different from the original.
  • February 2005 – SW-846 MICE line MDL requirements were deleted from SW-846 methods and chapters because "it is not a true indication of method sensitivity."
  • 2005 through 2007 – FACDQ held numerous meetings and scores of conference calls and initiated pilot studies.
  • December 2007 – FACDQ released its Final Report.

  6. 2003 NELAC Requirements
  Appendix C.3.1, Limit of Detection (LOD)
  "The laboratory shall determine the LOD for the method for each target analyte of concern in the quality system matrices. All sample-processing steps of the analytical method shall be included in the determination of the LOD." (bold added for emphasis)

  7. Let’s Consider
  • A good requirement, but further clarity is needed given the many different sample preparation types, matrix factors, and cleanup procedures.
  • Is a continuous liquid/liquid LOD verification acceptable for reporting separatory funnel analyses?
  • Where in the processing of Method 5035 volatile samples should "sample processing" begin? If this is not defined, variability among laboratories will result.
  • Method 5035 volatile preservatives are known to affect method performance.
  • Should separate volatile sodium bisulfate and methanol extraction LOD verifications be performed?

  8. 2003 NELAC Requirements
  Appendix C.3.1, Limit of Detection (LOD)
  "The validity of the LOD shall be confirmed by qualitative identification (bold added for emphasis) of the analyte(s) in a QC sample in each quality system matrix containing the analyte at no more than 2-3X the LOD for single analyte tests and 1-4X the LOD for multiple analyte tests. This verification must be performed on every instrument that is to be used for analysis of samples and reporting of data."
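As one concrete reading of the quoted spiking requirement, the sketch below checks whether a proposed verification spike concentration falls within the allowed multiple of the claimed LOD. Only the 2-3X and 1-4X multipliers come from the quoted standard; the function name and example concentrations are illustrative assumptions.

```python
def spike_within_nelac_range(spike_conc, lod, multi_analyte=False):
    """Check an LOD-verification spike level against the 2003 NELAC
    Appendix C.3.1 language: no more than 2-3X the LOD for single-analyte
    tests, and 1-4X the LOD for multiple-analyte tests."""
    low, high = (1.0, 4.0) if multi_analyte else (2.0, 3.0)
    ratio = spike_conc / lod
    return low <= ratio <= high

# Illustrative example: claimed LOD of 0.5 ug/L, spike prepared at 1.2 ug/L
print(spike_within_nelac_range(1.2, 0.5, multi_analyte=True))   # True (2.4X)
print(spike_within_nelac_range(2.5, 0.5, multi_analyte=False))  # False (5X)
```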

  9. 2003 NELAC Requirements
  "shall be confirmed by qualitative identification of the analyte(s)"
  Laboratories frequently have very different definitions of "qualitative identification." Does the signal have to represent an acceptable qualitative identification (i.e., a valid mass spectrum with all appropriate mass ions present, the same as for a reportable sample identification)?

  10. 2007 NELAC Interim Standard
  V1 M4 1.5.2.1: Limit of Detection (LOD)
  "The validity of the LOD shall be verified by detection (a value above zero) of the analyte(s) in a QC sample in each quality system matrix."
  • So detection below the LOD is considered an acceptable verification.
  • Can detection be accomplished manually when the analyte is not detected by automated systems?
  • Many laboratories believe that manual searching is acceptable for purposes of analyte detection verification – but as a practical matter, a manual search is not performed for every sample.

  11. 2007 NELAC Interim Standard
  • There is clearly difficulty in setting a standard for a valid verification.
  • Both qualitative and quantitative criteria are normally required for sample reporting!
  • Does spiking at 2-3 times the calculated MDL tell the user that the method is capable of seeing the analytes at the MDL?
  • Three times the MDL is the traditional lower-end guidance for establishing an LOQ (formerly known as a PQL).
  • Is this an LOD verification or an LOQ verification?

  12. NELAC LOQ Verification
  2003 Appendix C.3.2, Limit of Quantitation (LOQ)
  "The validity of the LOQ shall be confirmed by successful analysis of a QC sample containing the analytes of concern in each quality system matrix 1-2 times the claimed LOQ. A successful analysis is one where the recovery of each analyte is within the established test method acceptance criteria or client data quality objectives for accuracy. This single analysis is not required if the bias and precision of the measurement system is evaluated at the LOQ" (bold added for emphasis).
  2007 NELAC Interim Standard, Volume 1, Module 4, 1.5.2
  "A successful analysis is one where the recovery of each analyte is within the laboratory established test method acceptance criteria or client data quality objectives for accuracy." (bold added for emphasis)
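To make the "successful analysis" language concrete, here is a minimal sketch of the recovery check both versions describe: the percent recovery of each analyte in the LOQ-level QC sample must fall within the laboratory-established (or client DQO) accuracy limits. The analyte names, measured values, and 70-130% limits are placeholders, not criteria from the standard.

```python
def percent_recovery(measured, true_value):
    """Percent recovery of an analyte in the LOQ-level QC sample."""
    return 100.0 * measured / true_value

def loq_verification_passes(results, acceptance_limits):
    """results: {analyte: (measured, spiked_true_value)}
    acceptance_limits: {analyte: (low_pct, high_pct)}
    The verification succeeds only if every analyte's recovery is within
    its laboratory-established (or client DQO) accuracy limits."""
    for analyte, (measured, true_value) in results.items():
        low, high = acceptance_limits[analyte]
        if not low <= percent_recovery(measured, true_value) <= high:
            return False
    return True

# Placeholder example: QC sample spiked at 1-2 times the claimed LOQ
results = {"benzene": (0.92, 1.0), "toluene": (1.15, 1.0)}   # ug/L
limits = {"benzene": (70, 130), "toluene": (70, 130)}        # percent
print(loq_verification_passes(results, limits))  # True (92% and 115%)
```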

  13. Let’s Review
  • The 2003 standard provided for the determination of precision and bias at the LOQ as a verification of the LOQ.
  • Some methods have LCS/QC sample criteria embedded, but it is clear that these criteria are often not applicable to the LOQ.
  • Client DQOs must be properly addressed in the general requirements for tenders and contracts.
  • Clearly, a standard and uniform approach is needed for general method validation.

  14. NELAC Requirements
  2007 NELAC Interim Standard, Volume 1, Module 4, 1.5.2
  "An LOD shall be performed each time there is a change in the test method that affects how the test is performed, or when a change in instrumentation occurs that affects the sensitivity of the analysis" (bold added for emphasis).
  "The LOD, if required, shall be verified annually for each quality system matrix, technology, and analyte."

  15. Let’s Consider
  "or when a change in instrumentation occurs that affects the sensitivity of the analysis" – which of the following qualify?
  • Updating an ICP inter-element correction factor?
  • Changing a GC or LC column?
  • Cleaning the MS source?
  • Increasing the photomultiplier signal?
  • Changing the instrument configuration, then changing it back?

  16. If Rock Were King, …
  • Eliminate all the "bad actor" compounds/analytes that simply do not work with the method with which they are paired on the methods lists.
  • Eliminate having laboratories report down to the LOD (never report below the LOQ), and make that a requirement for accreditation to take the heat off laboratories from their clients/requestors.
  • Focus on defining proper (and universal) quantitative and qualitative requirements for establishing and verifying LOQs.
  • Require an LOQ LCS in every batch of samples and apply proper universal quantitative criteria.

  17. The Need for Further Definition
  • Properly define the "all sample-processing steps" that need to be applied to LOD verifications.
  • Properly define "qualitative detection."
  • Where the LOD verification spike is > 2 times the LOD, shouldn’t we require that the analyte be detected at or above the LOD?
  • Properly define the preparation/analysis items that could alter sensitivity and that would require a new LOD verification.

  18. Thank You “Setting the Standards for Innovative Environmental Solutions” Rock J. Vitale, CEAC, CPC Technical Director of Chemistry/Principal Environmental Standards, Inc. 1140 Valley Forge Road P.O. Box 810 Valley Forge, PA 19482 610.935.5577 rvitale@envstd.com www.envstd.com
