
Presentation Transcript


1. “Benchmarking” Exposure Priority Setting Tools. U.S. EPA Exposure Based Chemical Prioritization Workshop, April 6, 2010. M.E. (Bette) Meek, McLaughlin Centre, University of Ottawa

  2. Outline • Context • Increasing Efficiency • What We’ve Learned from Exposure Profiling • Drawing on Canadian experience • “Benchmarking” • “Groundtruthing” • Some Potential Options in Moving Forward

3. The 4-Step Risk Assessment Paradigm • Hazard Identification • Dose Response Assessment & Characterization • Exposure Assessment & Characterization • Risk Characterization • Introduced in the mid-1980s by the U.S. National Academy of Sciences

  4. Risk Assessment – Qualitative/Quantitative • Hazard Identification (qualitative - is it toxic?) • Effects seen in animals and/or humans • Dose Response Analyses (quantitative – how toxic to humans?) • Shape of the dose response curve • Range of observation & inference • Quantitative relevance to humans • Exposure Estimation • Risk Characterization • Comparison of Exposure and Effect

5. Increased Efficiency (diagram): Risk Assessment (organizing & analyzing to set priorities & guide management) comprises hazard identification/characterization, dose-response, exposure estimation and risk characterization; it feeds Risk Management (decision & action), which also weighs political, social, economic and engineering considerations.

6. Keep in Mind: • The appropriate performance indicator for chemicals programs is not priority setting, testing or assessment, but rather: • Effective and efficient management of risk • Now

7. Evolving Mandates for Existing Chemicals: Dealing with “grandfathered” chemicals • Canada • “Categorization” (i.e., systematic priority setting) of 23,000 chemicals by September 2006 under the Canadian Environmental Protection Act (CEPA) • Environmental, Consumer • Europe • Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) (2007) • Volume trigger and hazard based • Consistency between Existing and New Chemicals • Industry Responsibility • U.S. • Voluntary Testing Initiatives • Renewal of the Toxic Substances Control Act (TSCA)

  8. Implications of Regulatory Developments to Consider All Chemicals • Need for increased efficiency in risk assessment & management • Processing much larger numbers of substances • Requires more integrated approach across compounds (less chemical-specific; more contextually relevant) • More/better predictive tools & more efficient testing • Drawing on new technologies • Requires a shift in approaches, testing and assessment strategies

9. The NAS 4-Step Paradigm: The Need to Move On • Hazard Characterization • Dose Response Assessment & Characterization • Exposure Assessment & Characterization • Risk Characterization • Earlier focus on exposure & how effects are induced (mode of action)

10. The Challenge to the Risk Assessment Community: We need to be much more: • Efficient • Predictive • Application Driven (i.e., risk management) • Responsive, & • Communicative

  11. The Challenge to the Risk Assessment Community (Cont/d) • Broadly drawing upon the existing experience on relatively limited numbers of chemicals, to “inform” efficient assessment and management of the remainder • Essential basis for “benchmarking” • CEPA “Categorization” of Existing Chemicals offered unique opportunity • Required consideration of “all”

12. CEPA 1999 Existing Substances Program (flow diagram): First phase, CATEGORIZATION of the Domestic Substances List (DSL) (n=23,000), based on decisions of other jurisdictions, public nominations, greatest potential for human exposure, substances that are persistent or bioaccumulative, and “inherent toxicity” to humans or to non-human organisms. Second phase, SCREENING ASSESSMENT. Third phase, IN-DEPTH ASSESSMENT (Priority Substances List). After each assessment phase, a substance either requires no further action under this program or is found CEPA-toxic and proceeds to risk management. Across the phases: increasing refinement of priorities and complexity of assessment, decreasing numbers of substances, decreasing uncertainty.

13. Simple and Complex Priority Setting Tools • EXPOSURE • Simple Exposure Tool (SimET): relative ranking of all DSL substances based on submitters (S), quantity (Q) and expert ranked use (ERU) • Complex Exposure Tool (ComET): quantitative plausible-maximum, age-specific estimates of environmental and consumer exposure for individuals, based on use scenario (sentinel products), phys/chem properties & bioavailability • Potential for exposure was influential in setting priorities; included simple use profiling for all 23,000 chemicals and more complex use profiling for priorities • HAZARD • Simple Hazard Tool (SimHaz): identification of high or low hazard compounds by various agencies, based on weight of evidence and expert opinion/consensus • Complex Hazard Tool (ComHaz): hierarchical approach for multiple endpoints & data sources (e.g., (Q)SAR), including a preliminary weight of evidence framework

14. The Simple Exposure Tool (SimET) • SimET is the tool by which we “binned” and relatively ranked all 23,000 substances • Based on three lines of evidence derived from the limited information provided for all substances on the DSL: • quantity (estimated annual quantity of use, Q) • number of submitters (S) • use (sum of normalized expert ranked use codes, U), reflecting two workshops • “Ground-truthed” against more robust and recent data on use • Commercial chemical profiles • Mandated use surveys • Use was far more important than volume as the critical driver

15. Potential for Exposure (Greatest, Intermediate & Lowest) • Score for each substance = ∑ (use × relative ranking for PE), e.g., direct consumer use, dispersive environmental, industrial, etc.
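
To make the scoring concrete, here is a minimal Python sketch of how a SimET-style potential-for-exposure (PE) score could be computed. The use categories, relative PE rankings and quantities below are illustrative placeholders, not the expert-ranked values actually used for the DSL.

```python
# Minimal sketch of a SimET-style potential-for-exposure (PE) score.
# The use categories and their relative PE rankings are illustrative
# placeholders, not the expert-ranked values used in the actual categorization.

PE_RANKING = {
    "direct_consumer_use": 3.0,       # highest potential for exposure
    "dispersive_environmental": 2.0,
    "industrial_site_limited": 1.0,   # lowest potential for exposure
}

def pe_score(use_profile):
    """Sum of (quantity in a use category x relative PE ranking of that use).

    use_profile: dict mapping a use category to the quantity (or normalized
    use-code weight) reported for that category.
    """
    return sum(qty * PE_RANKING[use] for use, qty in use_profile.items())

# Example: a substance reported mostly in consumer products scores higher
# than one of comparable volume confined to industrial sites.
consumer_heavy = {"direct_consumer_use": 100, "industrial_site_limited": 10}
industrial_only = {"industrial_site_limited": 110}

print(pe_score(consumer_heavy))   # 310.0
print(pe_score(industrial_only))  # 110.0
```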

  16. Post Categorization Refinement of Exposure (Complex Tool) Objective: • Multi-tiered approach for consumer and environmental exposure • generic scenarios, defaults and most common use/product categories in early stages • E.g., sentinel products - consumer product that yields the highest exposure for one of its component substances • Increasing refinement in subsequent stages Methodology: • Comparison of algorithms and default values in consumer exposure tools as basis for iterative approach • 8 models/algorithms • Range of sources of default values • Development of use profiles for hundreds of chemicals based on robust search strategies

17. Overview of Early Tier (flow diagram): substance profile (chemical identity, physical/chemical properties, measures of dose-response for critical effects, production quantity) → production quantity bin + release factor → near-field and far-field emissions → human exposure, combining far-field estimates and sentinel products (SP1, SP2, SP3, … SPn) with age-specific variables → priority for assessment.
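
The early-tier flow above lends itself to a simple deterministic calculation. The sketch below shows one plausible way a screening-level, far-field intake estimate might be wired together (production quantity bin × release factor, spread across a population and divided by body weight); all parameter names and default values are assumptions for illustration, not the algorithms or defaults actually used in ComET or any other tool listed here.

```python
# Illustrative early-tier (screening-level) far-field intake estimate.
# All defaults below are assumptions for illustration only; they are not
# the algorithms or default values used in ComET or any other listed tool.

def screening_far_field_intake(
    production_quantity_kg_per_year,  # mid-point of the production quantity bin
    release_factor,                   # fraction released to the environment (assumed)
    population=35_000_000,            # exposed population (assumed)
    intake_fraction=1e-6,             # fraction of releases reaching people (assumed)
    body_weight_kg=70.0,              # adult default; made age-specific in later tiers
):
    """Plausible upper-bound daily intake in mg/kg body weight per day."""
    released_mg_per_day = (
        production_quantity_kg_per_year * release_factor * 1e6 / 365.0
    )
    intake_per_person_mg_per_day = released_mg_per_day * intake_fraction / population
    return intake_per_person_mg_per_day / body_weight_kg

# Example: 1,000 t/y bin mid-point with an assumed 10% release factor.
print(screening_far_field_intake(1_000_000, 0.10))
```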

  18. Tier 1 Model Comparison – Sample of Most Common Product Categories

19. What Did We Learn re Exposure “Surrogates”? • Simple use profiling can be discriminating • Importance of consumer vs. environmental exposure • Persistence/bioaccumulation ≠ exposure • Volume ≠ exposure; use profiling more influential • implications for selection criteria for current testing programs • Importance of early and iterative use profiling • Much of the information is publicly available

  20. What Did We Learn re Exposure Tools? Criteria for consideration of algorithms and default values: • Transparency, Defensibility • “Validation”/Acceptance and Use • Scope • Relevance • Complexity for various iterations Observations: • Transparency was limited • “expert judgment” • No consistency in default values, or algorithms for “screening” vs. more robust models • Limited incentive to harmonize or draw upon the work of others • Range of coverage is limited • Early tiers for consumer exposure provided limited additional discrimination

  21. The Challenge to the Risk Assessment Community (Cont/d) So, how do we get there? • Broadly drawing upon the existing experience on relatively limited numbers of chemicals, to “inform” efficient assessment and management of the remainder • Essential basis for “benchmarking”

22. Potential for “Benchmarking” • Relative rankings for “potential for exposure” for 23,000 substances (use “weighted”) • More detailed use profiling for several hundred substances • Detailed multimedia exposure estimates for a proportion (approx. 100 Priority Substances), including: • Compilation of phys/chem properties • Probabilistic estimates based on national monitoring data for a portion • Measured consumer exposure for a portion • Some with biomonitoring data

  23. Individual sublists are then rank ordered using the PE Score (the highest score = the highest priority)
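
A hypothetical sketch of the rank-ordering step described above: each sublist (bin) is sorted by PE score, highest score first. Substance identifiers and scores are made up for illustration.

```python
# Illustrative rank ordering of sublists by PE score (highest = highest priority).
# Substance identifiers and scores are hypothetical.

sublists = {
    "greatest_PE": {"substance_A": 310.0, "substance_B": 95.0, "substance_C": 480.0},
    "intermediate_PE": {"substance_D": 60.0, "substance_E": 72.0},
}

for bin_name, scores in sublists.items():
    ranked = sorted(scores.items(), key=lambda item: item[1], reverse=True)
    print(bin_name, [name for name, _ in ranked])
# greatest_PE ['substance_C', 'substance_A', 'substance_B']
# intermediate_PE ['substance_E', 'substance_D']
```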

  24. Considerations for “Benchmarking” • Select chemicals for quantitative “anchoring” of “surrogates” balancing: • broadest representation of “chemical space” • i.e., range and nature of uses, physical/chemical properties • Availability of data, as a basis for robust exposure estimates • Monitoring, biomonitoring • Objective is to select for the simplest and most discriminating “determinants” of exposure • “Designing” to limit exposure
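
One way to operationalize these selection considerations is a simple greedy pick that maximizes coverage of “chemical space” (crudely represented here by use categories) while requiring monitoring or biomonitoring data. This is only a sketch of the idea with hypothetical fields and records, not a method described in the presentation.

```python
# Hedged sketch: greedy selection of benchmark chemicals that (a) have robust
# exposure data and (b) together cover as much of the "chemical space"
# (represented here, crudely, by use categories) as possible.
# The records and fields are hypothetical.

candidates = [
    {"name": "chem_1", "uses": {"consumer", "dispersive"}, "has_monitoring": True},
    {"name": "chem_2", "uses": {"industrial"}, "has_monitoring": True},
    {"name": "chem_3", "uses": {"consumer"}, "has_monitoring": False},
    {"name": "chem_4", "uses": {"dispersive", "industrial"}, "has_monitoring": True},
]

def select_benchmarks(candidates, n):
    eligible = [c for c in candidates if c["has_monitoring"]]
    selected, covered = [], set()
    for _ in range(min(n, len(eligible))):
        # Pick the eligible chemical adding the most not-yet-covered use categories.
        best = max(eligible, key=lambda c: len(c["uses"] - covered))
        eligible.remove(best)
        selected.append(best["name"])
        covered |= best["uses"]
    return selected, covered

print(select_benchmarks(candidates, 2))
# e.g. (['chem_1', 'chem_2'], covering all three use categories)
```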

25. Evolution of Risk Assessment (timeline): 1980s: RA/RM Paradigm, Guidelines/Methods; 1990s: Dosimetry/PBPK, Mode of Action; 2000s: Susceptible Populations, Mixtures; 21st Century: Toxicity Pathways, Integrated Approaches, CompTox. Key milestones marked in 1983, 1994, 2007 and 2009.

26. How Quickly does Risk Assessment Evolve? • In 1984, the “benchmark dose” was introduced • Now receiving widespread acceptance, 25 yrs. later • Much guidance, many recommendations on analysis of uncertainty since the 1980s • Several NAS & EPA reports • Incorporation of PBPK modelling • Since the late 1980s • Formal consideration of mode of action; chemical specific adjustment factors (National & International) • Since the late 1990s • Tiered assessment (’94 NAS report) • Problem formulation • EPA Guidance on Risk Characterization (2000)

27. Barriers to Change • A function (in part) of: • The past focus on “Hazard” • Traditional approaches “codified”/“institutionalized” (requiring limited expert interpretation), easy to explain to stakeholders & somewhat consistent • Lack of long term planning for regulatory science • Meeting short term deadlines for numbers of assessments vs. longer term investment in predictive methodology • Communication: lack of “user friendly” tools • Lack of coordination of regulatory risk assessment/research

  28. The Future of Chemical Risk Assessment • Optimistic re the availability of more predictive tools drawing on greater exposure and biological knowledge • Less optimistic re the will and capability to efficiently implement change • Collective leadership

29. IPCS Framework for Combined Exposure to Multiple Chemicals: Early and Continuing Consideration of Exposure • Criteria for Considering an Assessment Group • What is the nature of exposure and are the components known? • Is exposure unlikely or very low, taking into account the context? Is there a likelihood of co-exposure within a relevant time frame? • What is the reason to believe that components act similarly or interact? • Information on chemical structure • Hazard or other biological data (tox or efficacy)

30. Sample Tiered Exposure and Hazard Considerations (mixture or component based; exposure and hazard assessments feed each other in an iterative process) • Tiered Exposure: Tier 0, simple semi-quantitative estimates of exposure; Tier 1, generic exposure scenarios using conservative point estimates; Tier 2, refined exposure assessment, increased use of actual measured data; Tier 3, probabilistic exposure estimates (increasing refinement of exposure models) • Tiered Hazard: Tier 0, dose addition for all components; Tier 1, refined potency based on individual POD, refinement of POD; Tier 2, more refined potency (RPF) and grouping based on MOA; Tier 3, PBPK or BBDR, probabilistic estimates of risk (increasing refinement of hazard models, MOA) • At each tier: is the margin of exposure adequate? If yes, no further action required; if no, continue to the next tier • See text for details
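
As an illustration of the lowest tier, the sketch below combines mixture components by dose addition (a hazard index) and checks whether the combined margin of exposure is adequate against an assumed target. The exposure and point-of-departure values are hypothetical, and the target MOE of 100 is only a placeholder for whatever uncertainty factors would actually apply.

```python
# Tier 0 style sketch: dose addition across mixture components (hazard index)
# and a margin-of-exposure (MOE) check. All numbers are hypothetical and the
# target MOE of 100 is a placeholder, not a recommended value.

components = [
    # exposure estimate and point of departure (POD), both in mg/kg bw/day
    {"name": "component_A", "exposure": 0.002, "pod": 10.0},
    {"name": "component_B", "exposure": 0.050, "pod": 25.0},
]

def hazard_index(components):
    """Dose addition: sum of exposure/POD ratios across all components."""
    return sum(c["exposure"] / c["pod"] for c in components)

def combined_moe(components):
    """Combined margin of exposure, the reciprocal of the hazard index."""
    return 1.0 / hazard_index(components)

TARGET_MOE = 100.0  # placeholder for the composite uncertainty factor

moe = combined_moe(components)
if moe >= TARGET_MOE:
    print(f"MOE = {moe:.0f}: adequate, no further refinement required")
else:
    print(f"MOE = {moe:.0f}: inadequate, refine exposure or hazard (next tier)")
```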

31. More Information? • Existing Substances Division Website: http://www.hc-sc.gc.ca/exsd-dse • IPCS Harmonization Website: http://www.who.int/ipcs/methods/harmonization/index.html

  32. A.1 Models and Sources of Algorithms Considered • ComET (developed by Health Canada and The LifeLine Group) • ConsExpo v. 4.0 • ECETOC (from Targeted Risk Assessment 2004 Technical Report No. 93) • Soap and Detergent Association 2005 • EAU – Cosmetic Workbooks (Health Canada’s Environmental Assessment Unit) • CEM v. 1.2 (from US EPA E-FAST) • U.S. EPA 1997 (Exposure Factors Handbook Volume I) • Health Canada 1995 (Handbook for Exposure Calculations)

  33. A.2 Default Values Considered • Versar Inc., 1986. Standard Scenarios for Estimating Exposure to Chemical Substances During Use of Consumer Products, Volume I and II • ConsExpo v 4.0 (and RIVM Factsheets) • SDA (Soap and Detergent Association) 2005. Exposure and Risk Screening Methods for Consumer Product Ingredients • ECETOC 2004. Targeted Risk Assessment, Technical Report No. 93 • ComET (Health Canada/LifeLine) • US EPA 1997. Exposure Factors Handbook Volume III – General Factors • EAU (Environmental Assessment Unit – Health Canada) 2005. The Cosmetics Exposure Workbook • Various books on product formulations to obtain weight fractions
