
A WORKABLE APPROACH FOR STATE COMPREHENSIVE ASSESSMENTS


Presentation Transcript


  1. A WORKABLE APPROACH FOR STATE COMPREHENSIVE ASSESSMENTS Jen Stamp, Vermont DEC

  2. Incorporation of probabilistic survey design into existing program
  • First attempt (2000)
  • Second attempt (2002)
  Rotational Hex Design (2002-2006)
  • Description
  • Results 2002-2004
  • Upcoming analysis of entire dataset
  Vision for future design (2007-2011) & ongoing questions about probabilistic design
  Background on VT’s ambient biomonitoring program (1982)

  3. Vermont’s Ambient Biomonitoring Program (started in 1982)
  • Assess the biological health of Vermont's aquatic environment through macroinvertebrate and fish communities
  • Sample sites in each basin once every 5 years
  • Targeted sampling

  4. US EPA Probabilistic Survey Design
  • Sites are selected randomly, with each sampling location representing a known length of stream
  • Advantages: statistics with known confidence levels can be used to assess the condition of large areas based on data collected from a relatively small, representative sample of locations
  • The other option, a census in which every stream unit in the state is examined, is not practical
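
For illustration only, here is a minimal Python sketch of the core idea on this slide: each randomly selected site stands in for a known number of stream miles, so the share of total miles in each condition class can be estimated from a small sample. All numbers below are invented.

    # Toy example: each sampled site represents a known number of stream miles
    # (invented values); condition shares are estimated from the sample alone.
    sample = [                     # (miles represented, condition at the site)
        (120, "good"), (95, "fair"), (140, "good"), (80, "poor"), (110, "good"),
    ]
    total = sum(miles for miles, _ in sample)
    for cls in ("good", "fair", "poor"):
        miles = sum(m for m, c in sample if c == cls)
        print(f"{cls}: ~{100 * miles / total:.0f}% of assessed stream miles")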

  5. Objective: create a sustainable random sampling program that can be integrated into our existing ambient biomonitoring framework, does not put an undue burden on our limited resources, and still provides results with reasonable statistical confidence

  6. First attempt (2000)
  Worked with the US EPA on an experimental design that blended Vermont’s existing site locations with random site selection
  • Compiled a database of bug and fish data from 1990-2000 (= 301 bug sites & 153 fish sites)
  • Selected random sites using the random tessellation stratified (RTS) survey design (Stevens 1997)
  • An association rule (nearest distance) was used to link the random sites with existing sites
  • The biological data from the selected sites were used to calculate the % of Vermont’s streams in excellent, good, fair & poor condition (bugs and fish were analyzed separately)
  • Experimented with different sample sizes (fish 50, 100 & 153; bugs 50, 200 & 301) to compare differences in results and confidence intervals
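
A rough Python sketch of the "nearest distance" association rule mentioned above, assuming simple planar coordinates; the site names and coordinates are invented for illustration.

    import math

    # Hypothetical existing monitoring sites and random draws (lat, lon pairs)
    existing_sites = {"site_1": (44.26, -72.58), "site_2": (44.48, -73.21),
                      "site_3": (43.61, -72.97)}
    random_points = [(44.30, -72.60), (43.70, -73.00)]

    def nearest_site(point, sites):
        # Link a random draw to the closest existing site (planar distance)
        return min(sites, key=lambda name: math.dist(point, sites[name]))

    for pt in random_points:
        print(pt, "->", nearest_site(pt, existing_sites))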

  7. Conclusions
  • EPA statisticians: interesting exercise, but statistically unacceptable
  • Bottom line – it’s not random
  • Conscious or not, existing sites were selected preferentially and this bias could not be defined
  • Back to the drawing board…

  8. Second attempt (2002-2006)
  Terms of agreement with the EPA statistician:
  • 15 sites per year = a reasonable workload (75 sites over 5 years)
  • Data from 15 sites per year have limited statistical worth, but when combined, the 75 total sites should provide data with more reasonable statistical validity
  • Each year the random sites will be located in the rotational basins that are being sampled as part of our existing biomonitoring program
  US EPA solution – rotational hex design:
  • Overlay the rotational basins with a hexagonal grid (15 hexes per basin)
  • Use GIS technology to randomly select stream segments to be sampled within each hexagon
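
A simplified Python sketch of the hex-based selection step, assuming the GIS work of tagging each stream segment with its hexagon has already been done; the segment and hex IDs are invented.

    import random
    from collections import defaultdict

    # (segment_id, hex_id) pairs produced by a GIS overlay (invented here)
    segments = [("s1", "hex01"), ("s2", "hex01"), ("s3", "hex02"),
                ("s4", "hex02"), ("s5", "hex03")]

    by_hex = defaultdict(list)
    for seg_id, hex_id in segments:
        by_hex[hex_id].append(seg_id)

    random.seed(2)
    # One randomly chosen candidate segment per hexagon in the rotational basin
    picks = {hex_id: random.choice(segs) for hex_id, segs in by_hex.items()}
    print(picks)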

  9. Vermont’s Rotational Hex Design

  10. Site Selection
  • EPA provided us with a list with 3 random picks for each hex
  • Goal = sample at least 1 site within each hex
  • Common reasons for site elimination:
    • 1st order stream
    • lake or pond outlet
    • accessibility issue
  • Redraws were requested if we were unable to do any of the 3 sites
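
The screening rules above can be sketched as a small Python function; the attribute names and the example picks are hypothetical, not the actual site list.

    def screen_hex(picks):
        # picks: the 3 random draws for one hex, in the order provided;
        # return the first usable one, or None to signal a redraw request
        for site in picks:
            if site["order"] == 1:        # 1st order stream
                continue
            if site["outlet"]:            # lake or pond outlet
                continue
            if not site["accessible"]:    # accessibility issue
                continue
            return site
        return None

    hex_picks = [
        {"id": "A", "order": 1, "outlet": False, "accessible": True},
        {"id": "B", "order": 3, "outlet": True,  "accessible": True},
        {"id": "C", "order": 2, "outlet": False, "accessible": True},
    ]
    print(screen_hex(hex_picks))          # pick "C" survives the screening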

  11. 2004 Hex Stats
  • Total reach length and # of reaches differ from hex to hex
  • Therefore each hex is weighted differently – those with more stream miles receive more weight in the analysis
  • Hexagon weight = total reach length in hex ÷ total reach length within the entire basin (e.g. for Hex 1, weight = 50366.43 ÷ 1503668.75 = 0.0335)
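
A worked Python version of the weighting rule, using the Hex 1 numbers from this slide (the second hex is invented just to show more than one weight).

    # Hexagon weight = total reach length in hex / total reach length in basin
    hex_reach_length = {"hex01": 50366.43, "hex02": 120500.00}  # hex02 invented
    basin_reach_length = 1503668.75

    weights = {h: length / basin_reach_length
               for h, length in hex_reach_length.items()}
    print(round(weights["hex01"], 4))     # 0.0335, matching the example above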

  12. 2004 Rotational Basin Biological Assessments
  • Assumption: the biological rating assigned to a site represents the overall condition of the wadeable streams within that hex (excluding 1st order & non-wadeable)
  • 5 categories: excellent, very good, good, fair, poor
  • If the bug and fish ratings differed, we used the lower of the two ratings
  • Biocriteria were used for high-gradient streams; best professional judgment was used for slow winders/low-gradient streams
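
A minimal sketch of the "use the lower of the two ratings" rule, with the five categories ordered as listed on this slide; the example ratings are invented.

    # Higher rank = better condition
    RANK = {"excellent": 5, "very good": 4, "good": 3, "fair": 2, "poor": 1}

    def combined_rating(bug, fish=None):
        # Take the lower (worse) of the bug and fish ratings when both exist
        ratings = [r for r in (bug, fish) if r is not None]
        return min(ratings, key=RANK.get)

    print(combined_rating("good", "fair"))   # -> "fair"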

  13. Rotational basins sampled by year: mix of agricultural, urban & forested areas (2002); mostly forested with scattered urban areas (2003); mix of agricultural, urban & forested areas (2004)

  14. Confidence Intervals
  • Calculation of CIs was difficult because sites were not weighted equally
  • Two techniques: jackknife & the cdf.est function (R software)
  • As expected, the data had very large confidence intervals when analyzed on a yearly basis
  • Interpretation of graph: there is a 95% probability that between 0 and 46% of the assessed streams in the 2004 sampling frame were in fair or poor condition (not attaining our WQ standards); the CIs are about 25-30%
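
As a hedged illustration (not the cdf.est output referenced above), here is one way a delete-one jackknife interval could be put around a weighted "% fair or poor" estimate in Python; the weights and ratings below are invented.

    import math

    weights  = [0.03, 0.07, 0.10, 0.05, 0.12, 0.08, 0.09, 0.06, 0.25, 0.15]
    impaired = [0,    0,    1,    0,    1,    0,    0,    0,    1,    0   ]

    def weighted_pct(w, y):
        # Weighted percent of stream length rated fair or poor
        return 100 * sum(wi * yi for wi, yi in zip(w, y)) / sum(w)

    n = len(weights)
    estimate = weighted_pct(weights, impaired)
    # Delete-one jackknife: recompute the estimate with each site left out
    loo = [weighted_pct(weights[:i] + weights[i+1:], impaired[:i] + impaired[i+1:])
           for i in range(n)]
    mean_loo = sum(loo) / n
    se = math.sqrt((n - 1) / n * sum((t - mean_loo) ** 2 for t in loo))
    print(f"{estimate:.1f}% fair/poor, 95% CI roughly "
          f"{max(0, estimate - 1.96 * se):.1f} to {estimate + 1.96 * se:.1f}")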

  15. In the works… (Thiessen polygons)
  • How will we analyze the combined 75-site dataset?
    • Plot the sites, use GIS to create Thiessen polygons around each point, and derive new statistics (e.g. # of stream miles associated with each point) from these polygons
  • Accurately determine the # of stream miles in the 1:100,000 NHD layer that were excluded from assessments (1st order, non-wadeable, lake/pond outlets)
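
A toy Python sketch of what the Thiessen-polygon step accomplishes: every stream segment is credited to its nearest sampling point and the segment lengths are summed per point. Coordinates and lengths are invented; the real analysis would use the GIS layers named above.

    import math
    from collections import defaultdict

    sites = {"P1": (44.2, -72.6), "P2": (44.5, -73.2)}     # sampling points
    segments = [((44.25, -72.65), 3.2),                    # (midpoint, miles)
                ((44.45, -73.10), 5.1),
                ((44.30, -72.90), 2.4)]

    miles_per_site = defaultdict(float)
    for midpoint, miles in segments:
        nearest = min(sites, key=lambda s: math.dist(midpoint, sites[s]))
        miles_per_site[nearest] += miles

    print(dict(miles_per_site))   # stream miles associated with each point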

  16. Vision for future design (2007-2011)
  • Stop using the hex design & start using the Generalized Random Tessellation Stratified (GRTS) design (Stevens & Olsen 2004) to randomly select sites so that each site is weighted equally (= easier analysis)
  • These randomly selected sites will continue to be located within the appropriate rotational basins for a given year
  • Use the updated 1:5,000 VHD stream layer (a big improvement over the 1:100,000 NHD layer)
    • Will be able to differentiate between perennial and intermittent streams
  • Explore other ways to further refine site selection by creating groups (e.g. based on geomorphic traits, land use, gradient), then randomly selecting sites from within these groups
  • Determination of sample size will be a balancing act between the resources available and the attainment of reasonable confidence intervals
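
GRTS itself is commonly run with EPA's spsurvey tools in R, so the Python below is only a sketch of the equal-weighting idea: lay the stream segments end to end and take a systematic sample with a random start, so each selected site represents the same number of stream miles. Segment lengths are invented.

    import random

    segments = {"s1": 12.0, "s2": 4.5, "s3": 9.3, "s4": 20.2, "s5": 7.0}  # miles
    n_sites = 3

    total = sum(segments.values())
    interval = total / n_sites            # each site represents this many miles
    random.seed(3)
    start = random.uniform(0, interval)
    targets = [start + k * interval for k in range(n_sites)]

    picks, cum = [], 0.0
    seg_iter = iter(segments.items())
    seg_id, length = next(seg_iter)
    for t in targets:
        while cum + length < t:           # walk the line until the target
            cum += length                 # falls inside the current segment
            seg_id, length = next(seg_iter)
        picks.append(seg_id)
    print(picks)                          # equal-probability, length-based picks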

  17. Questions we are struggling to resolve regarding probabilistic surveys…
  • How valuable is the data that we are collecting? What will the data be used for?
    • State of the state’s waters report 305(b)
    • Tracking long-term, large-scale trends (useful at the federal level)
  • Is it truly representative of overall stream conditions throughout our state?
  • How does the random sampling fit in with our existing program? What should be our priority?
  • How reproducible are the results?

  18. The End
  Contact info: Doug Burnham, Vermont DEC, Waterbury, VT, (802) 241-3784, doug.burnham@state.vt.us

  19. Strahler order (RF3 layer)
  7100 miles of rivers and streams in Vermont; minus 1st & 5th orders = 2719 miles assessed / 4380 miles unassessed

  20. Scale – 1:24,000 vs 1:100,000

  21. Confidence Limits
  • Normal approximation using the t-distribution
  • Sample size = 75
  • Assumes equal weighting (unequal weighting will make the intervals wider)
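
A worked Python version of the normal-approximation interval described on this slide, using an illustrative 30% "fair or poor" value (not an actual result) and n = 75 equally weighted sites.

    import math

    n = 75
    p = 0.30                    # illustrative proportion rated fair or poor
    t_crit = 1.993              # t(0.975, df = 74) from a standard t table
    half_width = t_crit * math.sqrt(p * (1 - p) / n)
    print(f"95% CI: {100 * (p - half_width):.0f}% to {100 * (p + half_width):.0f}%")
    # Roughly +/- 10 percentage points at n = 75; unequal weights (as in the
    # hex design) would make the interval wider, as noted above.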
