Development of Chemistry Indicators Steven Bay Southern California Coastal Water Research Project (SCCWRP) steveb@sccwrp.org
Presentation Overview • Workplan update and response to comments • Project status • Preliminary results • Data screening • Normalization • SQG comparison
Chemistry Indicators • A methodology for interpreting sediment chemistry data relative to impacts on benthic organisms (e.g., an SQG approach with numeric values) • Link to pollutants of concern • Familiar approach • Abundant available data • Several challenges to effective use • Bioavailability • Unmeasured chemicals • Mixtures
Objectives • Identify important geographic, geochemical, or other factors that affect the relationship between chemistry and effects • Develop indicator(s) that reflect contaminant exposure • Develop indicator(s) that are protective and predictive of impacts • Develop thresholds for use in MLOE framework
Approach • Develop a database of CA sediment quality information for use in developing and validating indicators • Address concerns and uncertainty regarding influence of regional factors • Document performance of recommended indicators • Develop both empirical and mechanistic indicators, if possible • Both types have desirable attributes for SQO use • Investigate existing and new approaches • Emphasis is on priority chemicals identified as likely causes of impairment
Approach • Evaluate SQG performance • Use CA data • Use quantitative and consistent approach • Select methods with best performance for expected applications • Describe response levels (thresholds) • Consistent with needs of MLOE framework • Based on observed relationships with biological effects
SSC Comments
• More detail needed regarding data screening, matching, and establishment of the validation dataset
Response: an evolving and thorough process; an overview is included in this presentation
• Lack of clarity regarding the respective roles of empirical and mechanistic guidelines
• Approaches not interchangeable
• How will mechanistic guidelines be developed/validated?
• Should use all available approaches, but how?
Response: a conceptual plan is included in this presentation; your input is welcome
SSC Comments
• Clarify how metals normalization results will be used
Response: will explore utility in improving guideline performance and establishing background concentrations
• Provide greater independence of the chemistry line of evidence
Response: agree this is an important goal; part of the motivation for using mechanistic guidelines and metal normalization
• More detail needed regarding calibration of guidelines and comparison of performance (within CA and nationally)
Response: a revised comparison approach is proposed that is more consistent with the MLOE framework
Tasks • Prepare development and validation datasets • Develop and refine SQGs • Evaluate SQGs • Describe response levels
Task 1: Prepare Datasets. Status: substantial progress made. Goal: create high-quality, standardized datasets for development and validation activities • Evaluate data quality and completeness • Matched chemistry and biology • Appropriate habitat • Data quality, nondetects • Calculate derived values • e.g., sums, means, quotients • Normalize data • e.g., metals, TOC • Stratify and subset data • Independent validation data • Address geographic or mixture patterns
Data Screening • Appropriate habitat and geographic range • Subtidal, embayment, surface sediment samples • Chemistry data screening • Valid data (from qualifier information) • Estimated nondetect values • Completeness (metals and PAHs) • Toxicity data screening • Target test method selection • Valid data (control performance) • Lack of ammonia interference • Selection of matched data • Same station, same sampling event
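The final screening step, selecting matched data from the same station and sampling event, can be sketched as a simple key join. The record layout and field names (station, event, and the measurement fields) are hypothetical illustrations, not the actual SQO database schema.

```python
# Pair chemistry and toxicity records collected at the same station during
# the same sampling event. Field names here are illustrative assumptions.

def match_records(chemistry, toxicity):
    """Return (chem, tox) pairs sharing a (station, event) key."""
    tox_index = {(t["station"], t["event"]): t for t in toxicity}
    matched = []
    for c in chemistry:
        key = (c["station"], c["event"])
        if key in tox_index:
            matched.append((c, tox_index[key]))
    return matched

chemistry = [
    {"station": "B08", "event": "1998-07", "copper_mg_kg": 112.0},
    {"station": "B09", "event": "1998-07", "copper_mg_kg": 45.0},
]
toxicity = [
    {"station": "B08", "event": "1998-07", "survival_pct": 62.0},
    {"station": "B09", "event": "1999-07", "survival_pct": 95.0},  # different event: no match
]

pairs = match_records(chemistry, toxicity)  # only B08 / 1998-07 matches
```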
Validation Dataset • Used to confirm performance of recommended SQGs • Independent subset of SQO database • Approximately 30% of data, selected randomly to represent contamination gradient • Includes acute and chronic toxicity tests
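A random draw that still represents the contamination gradient can be sketched as stratified sampling: rank stations by an overall contamination score, split the ranking into strata, and randomly select about 30% of each stratum. The score field, stratum count, and station records below are illustrative assumptions, not the project's actual procedure.

```python
import random

def split_validation(stations, frac=0.30, n_strata=4, seed=1):
    """Stratify stations by a contamination score, then randomly draw
    `frac` of each stratum so the validation subset spans the gradient.
    The "score" field and stratum count are illustrative assumptions."""
    rng = random.Random(seed)
    ranked = sorted(stations, key=lambda s: s["score"])
    k = max(1, len(ranked) // n_strata)
    validation = []
    for i in range(0, len(ranked), k):
        stratum = ranked[i:i + k]
        n_pick = round(frac * len(stratum))
        validation.extend(rng.sample(stratum, n_pick))
    val_ids = {s["id"] for s in validation}
    development = [s for s in stations if s["id"] not in val_ids]
    return development, validation

# 100 hypothetical stations with scores spanning a contamination gradient
stations = [{"id": i, "score": i * 0.1} for i in range(100)]
dev, val = split_validation(stations)
```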
Metal Normalization • Metals occur naturally in the environment • Silts and clays have higher metal content • Source of uncertainty in identifying anthropogenic impact • Background varies due to sediment type and regional differences in geology • Need to differentiate between natural background levels and anthropogenic input • Investigate utility for empirical guideline development • Potential use for establishing regional background levels
Reference Element Normalization • Established methodology applied by geologists and environmental scientists • Reference element covaries with natural sediment metals and is insensitive to anthropogenic inputs • Use of iron as reference element validated for southern California • 1994 and 1998 Bight regional surveys
[Figure: scatter plots of chromium (mg/kg) versus iron (%), illustrating the reference element normalization relationship]
[Figures: nickel and copper versus iron relationships] Reference Element Normalization. Use iron:metal relationships to: • Estimate the amount of anthropogenic metal for use in SQG development • Identify background metal concentrations
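The iron:metal approach above can be sketched as a regression of a metal on iron fitted to reference samples; a test sample's positive residual then estimates its anthropogenic metal fraction. All concentration values below are made-up illustrations, not Bight survey data.

```python
# Minimal sketch of reference element normalization, assuming iron covaries
# with natural metal content. Numbers are illustrative, not survey data.

def fit_line(x, y):
    """Ordinary least squares: return (intercept, slope) of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return my - slope * mx, slope

# Reference samples: chromium (mg/kg) rising with iron (%) in finer sediment
iron_ref = [1.0, 2.0, 3.0, 4.0, 5.0]
cr_ref = [12.0, 22.0, 32.0, 42.0, 52.0]   # lies on cr = 2 + 10 * iron

intercept, slope = fit_line(iron_ref, cr_ref)

def anthropogenic_excess(iron_pct, metal_obs):
    """Observed metal minus the level expected from iron alone."""
    expected = intercept + slope * iron_pct
    return metal_obs - expected

excess = anthropogenic_excess(3.0, 75.0)  # 75 observed vs. 32 expected from iron
```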
Task 2: Develop/Refine SQGs. Status: work in progress. Goal: investigate a variety of approaches or refinements and pursue those with the best potential for success; focus on mixture models, both empirical and mechanistic • Apply existing approaches (off the shelf) • Refine existing approaches • Calibrate existing approaches • Develop new approaches
Mechanistic vs. Empirical SQGs • Differences in utility for predicting impacts and determining causation • Both types of information needed for interpretation of chemistry data • Mechanistic SQG results will be useful for subsequent applications needing to identify cause of impairment • Anticipate chemistry LOE score will be based on combination of SQGs • Complementary, not interchangeable • Several strategies possible, looking for input on recommended approach
Guideline Calibration • Use of CA chemistry/effects data to adjust empirical guideline models or thresholds • LRM: model and thresholds • Effects range: CA-specific values and thresholds • AET: CA-specific values • Consensus & SQGQ-1: thresholds • Comparisons between existing and calibrated SQG results used to guide recommendations • Only use calibrated values if improved performance can be demonstrated
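The LRM approach mentioned above models the probability of toxicity as a logistic function of log10 concentration; calibration would refit its coefficients to California chemistry/effects data. The coefficients b0 and b1 below are hypothetical placeholders, not the published LRM values.

```python
import math

# Illustrative sketch of the logistic regression model (LRM) form.
# b0 and b1 are hypothetical; calibration would refit them to CA data.

def lrm_probability(conc_mg_kg, b0=-4.0, b1=2.0):
    """P(toxic) for a single chemical under an illustrative LRM fit."""
    logit = b0 + b1 * math.log10(conc_mg_kg)
    return 1.0 / (1.0 + math.exp(-logit))

p_low = lrm_probability(10.0)     # logit = -2: low predicted risk
p_high = lrm_probability(1000.0)  # logit = +2: high predicted risk
```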
Task 3: Evaluate Approaches. Status: work in progress. Goal: document and compare performance of candidate SQG approaches in a manner relevant to desired applications • Compare overall discriminatory ability • Identify applications • Quantify performance • Validation dataset • Standardized measures • Compare performance and identify the most suitable approaches
Performance Comparison • Approach • Focus on empirical guidelines • Compare among candidates to select a short list • Compare to existing approaches to evaluate need for new/calibrated approaches • Previous comparison strategy (current work plan): binary evaluation (effect/no effect); calculate several measures of performance
Performance Measures
Classification of each sample by SQG prediction (hit/no hit) versus observed toxicity:

             Toxic   Nontoxic
SQG hit        B        D
SQG no hit     A        C

• Sensitivity = B/(B+A) x 100 (percent of all toxic samples that are classified as a hit)
• Specificity = C/(C+D) x 100 (percent of all nontoxic samples that are classified as a no hit)
• Positive Predictive Value = B/(B+D) x 100 (percent of hits that are toxic)
• Negative Predictive Value = C/(C+A) x 100 (percent of no hits that are nontoxic)
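The four measures can be computed directly from the 2x2 cell counts, using the slide's labels: B = toxic samples classified as hits, A = toxic samples missed, C = nontoxic no hits, D = nontoxic samples falsely flagged. The counts below are illustrative, not SQO data.

```python
# Compute the slide's four performance measures from 2x2 cell counts.
# Example counts are illustrative, not SQO data.

def performance(A, B, C, D):
    """Return the four measures, each as a percentage."""
    return {
        "sensitivity": 100.0 * B / (B + A),  # toxic samples caught
        "specificity": 100.0 * C / (C + D),  # nontoxic samples cleared
        "ppv": 100.0 * B / (B + D),          # hits that are truly toxic
        "npv": 100.0 * C / (C + A),          # no hits that are truly nontoxic
    }

m = performance(A=10, B=40, C=35, D=15)
# sensitivity 80%, specificity 70%, PPV ~72.7%, NPV ~77.8%
```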
Performance Comparison Proposed revised strategy • Evaluate ability to classify stations into multiple categories • More consistent with MLOE approach • Less reliance on a single threshold • Magnitude of error affects score • Utilize both toxicity and benthic impact data
SQG 1: predicted effect (rows) versus observed toxicity (columns)

                  Observed Toxicity
Predicted      High  Moderate  Marginal  Reference
High            60      30        20         1
Moderate        33      50        25         0
Marginal        10      14        65         6
Reference        3       7        20        25
Kappa Statistic • Developed in the 1960s and 1970s • Used in medicine, epidemiology, and psychology to evaluate observer agreement/reliability • Similar problem to SQG assessment • Can incorporate a penalty for extreme disagreement • Sediment quality assessment is a new application
SQG 1: Kappa = 0.48 (good association between adjacent categories)

                  Observed Toxicity
Predicted      High  Moderate  Marginal  Reference
High            60      30        20         1
Moderate        33      50        25         0
Marginal        10      14        65         6
Reference        3       7        20        25
SQG 2: Kappa = 0.27 (poor association between adjacent categories)

                  Observed Toxicity
Predicted      High  Moderate  Marginal  Reference
High            60       1        20        30
Moderate        33      50         0        25
Marginal        14      10        65         6
Reference       20       7         3        25
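The slides report Kappa = 0.48 for SQG 1 and 0.27 for SQG 2 but do not state the weighting scheme. Assuming linear weights (a standard choice that penalizes disagreement in proportion to category distance, and which reproduces both reported values), the computation can be sketched as:

```python
# Cohen's kappa with linear weights w(i, j) = 1 - |i - j| / (k - 1).
# Rows: predicted effect (High..Reference); columns: observed toxicity.

def weighted_kappa(matrix):
    """Linear-weighted Cohen's kappa for a k x k agreement matrix."""
    k = len(matrix)
    n = sum(sum(row) for row in matrix)
    rows = [sum(row) for row in matrix]
    cols = [sum(matrix[i][j] for i in range(k)) for j in range(k)]

    def w(i, j):
        return 1.0 - abs(i - j) / (k - 1)

    # Weighted observed and chance-expected agreement
    po = sum(w(i, j) * matrix[i][j] for i in range(k) for j in range(k)) / n
    pe = sum(w(i, j) * rows[i] * cols[j] for i in range(k) for j in range(k)) / n ** 2
    return (po - pe) / (1.0 - pe)

sqg1 = [[60, 30, 20, 1], [33, 50, 25, 0], [10, 14, 65, 6], [3, 7, 20, 25]]
sqg2 = [[60, 1, 20, 30], [33, 50, 0, 25], [14, 10, 65, 6], [20, 7, 3, 25]]

k1 = weighted_kappa(sqg1)  # ~0.48: disagreements mostly in adjacent categories
k2 = weighted_kappa(sqg2)  # ~0.27: same diagonal, but extreme disagreements
```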
Task 4: Describe Response Levels. Status: methodology under development. Goal: determine levels of response for the recommended SQG approaches • Relate SQGs to biological effect indicator responses (benthos & toxicity) • May use statistical methods to optimize thresholds • Select response levels that correspond to objectives for performance and beneficial use protection
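One example of a statistical method for optimizing thresholds is scanning candidate SQG-score cutoffs and keeping the one that maximizes Youden's J (sensitivity + specificity - 1) against observed effects. The scores and toxicity flags below are illustrative, and this is only one of several possible optimization criteria.

```python
# Scan candidate thresholds and pick the one maximizing Youden's J.
# Scores and toxicity flags are illustrative, not project data.

def best_threshold(scores, toxic):
    """Return (threshold, J) maximizing sensitivity + specificity - 1."""
    best_t, best_j = None, -1.0
    for t in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, toxic) if s >= t and y)
        fn = sum(1 for s, y in zip(scores, toxic) if s < t and y)
        tn = sum(1 for s, y in zip(scores, toxic) if s < t and not y)
        fp = sum(1 for s, y in zip(scores, toxic) if s >= t and not y)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

scores = [0.1, 0.2, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9]
toxic  = [False, False, False, False, True, True, True, True]
t, j = best_threshold(scores, toxic)  # perfectly separable example
```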
Summary • Work on many key elements underway • Priority is to build upon existing approaches • Many of the technical obstacles have been dealt with • Overall approach is consistent with SSC recommendations • Include empirical and mechanistic approaches • Expect to succeed in selecting recommended SQGs for use in MLOE framework • Much work remains, especially for development of thresholds