
Wednesday Show and Tell


Presentation Transcript


  1. Wednesday Show and Tell Theresa Crimmins Christina Wright Cuyler Smith Dave Press Gareth Rowell John Boetsch Lisa Nelson John Gross

  2. Theresa Crimmins – USA National Phenology Network

  3. www.usanpn.org Theresa Crimmins Network Liaison, USA-NPN (520) 792-0481 theresam@u.arizona.edu

  4. USA-NPN Progress to Date • Plant protocols available online (200+ species) • Phenology data/information system functional • Data entry available online • Data visualization tools in development • Animal protocols in development (available 2010) • Mammals, birds, amphibians, reptiles, fish, insects (>150 species)

  5. Example USA-NPN Plant Phenology Protocol (≈ I&M ‘how to observe’ SOP)

  6. NPS-NPN Partnership Strategy • Monitoring and research: Establish wide-spread monitoring of phenology in NPS units (NPS scientists, staff, visitors…) – develop training materials, encourage research in parks • Information management: Develop databases, data transfer mechanism, metadata requirements, “one-stop shopping” site for NPS • Communications: Widely disseminate information and knowledge of the importance and management implications of phenology – develop interp & education messages, fact sheets…

  7. Pilot in NETN Parks – Summer 2009 • Goal: Find the appropriate fit for phenology monitoring and use of phenology data within NETN parks. • Status: Currently identifying target audiences, locations, and methods; identifying major ecological and methodological questions • Products: Draft field methods standard operating procedures for phenology protocols following Oakley et al. (2003) guidelines; evaluation of pilot program – end of 2009 • Participants: NETN, Appalachian Trail, Acadia NP, Appalachian Mtn Club, Appalachian Trail Conservancy, The Wildlife Society, USA-NPN

  8. CA/PWR Cooperative Efforts • Recent call to engage interested parties • I&M, park biologists and resource managers, interpreters represented (as well as NPN, UCSB, Project BudBurst) • Working group established to pilot plant phenology monitoring • Next steps: • Select plant species & develop protocol for pilot monitoring • Collaborate with UCSB on two phenology monitoring training sessions – fall 2009

  9. Christina Wright – Southeast Coast Network

  10. National Park Service, U.S. Department of the Interior – SECN SharePoint – 2009 Data Management Conference Show and Tell – Experience Your America

  11. News and Updates

  12. Project Tracking

  13. Completed Projects Repository

  14. Data Management

  15. Data Entry using InfoPath Forms

  16. Generate Reports

  17. Links to Monitoring Data

  18. Resource Briefs

  19. Cuyler Smith – Southwest Alaska Network

  20. Data Manager’s Conference 2009 Bear-A-Where…Aerial Bear Survey Software Tools at the Southwest Alaska Network Cuyler Smith Data Manager - SWAN

  21. “Beaerial” survey events… • Generate random transects (initial flight paths) • Fly random transects with 1–2 observers and 1 pilot • Spot bear! • Deviate from flight path to count/identify attributes • Return to transect • Continue survey

  22. Software Involved • Existing • ArcMap 9.3 • ArcPad 7.1 • Custom (GeoNorth) • ArcMap Random Transect Generator (pre-survey transects to be flown) • ArcPad Bear Tracking Applet (during survey, to catalog attributes)

  23. Random Transect Generator (ArcMap)…
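The GeoNorth ArcMap tool itself is not shown here, but its pre-survey step — drawing random initial flight paths inside a survey area — can be sketched generically. The `random_transects` helper, its parameters, and the bounding-box coordinates below are hypothetical illustrations, not part of the SWAN software:

```python
import random

def random_transects(bbox, n, seed=None):
    """Generate n random east-west transect lines inside a bounding box
    (xmin, ymin, xmax, ymax).  Each transect is a pair of (x, y) endpoints
    spanning the full box width at a randomly drawn northing.
    A generic sketch only; the actual GeoNorth generator may differ."""
    xmin, ymin, xmax, ymax = bbox
    rng = random.Random(seed)   # seeded for reproducible flight plans
    transects = []
    for _ in range(n):
        y = rng.uniform(ymin, ymax)              # random northing
        transects.append(((xmin, y), (xmax, y))) # west and east endpoints
    return transects

# Example: five transects over a hypothetical 10 km x 10 km survey block
lines = random_transects((0.0, 0.0, 10_000.0, 10_000.0), 5, seed=42)
print(len(lines))  # prints 5
```

Each returned pair of endpoints could then be loaded onto the navigation device as an initial flight path, with deviations (to count bears) logged separately, as the survey-event list above describes.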

  24. Aerial Survey Tracking (ArcPad)…

  25. For more info… • Contact Bill Thompson, PI • Bill_Thompson@nps.gov “No matter how much you push the envelope, it’ll still be stationery!”

  26. Dave Press – San Francisco Area Network

  27. Streamflow Monitoring Database: A FAB Example David Press Ecologist / Data Manager San Francisco Area Network

  28. SFAN Streamflow Monitoring The specific monitoring objectives are to: • 1. Monitor the variability and long-term trends in streamflow using fixed, continuous, water stage recording stations by producing annual mean daily and monthly discharge estimates for core streamflow monitoring stations in GOGA, PRES, PINN, and PORE. • 2. Monitor the frequency, magnitude and duration of peak flow events at fixed, surface water level monitoring stations by producing peak and daily summaries of stage height and discharge for core streamflow monitoring stations in GOGA, PORE and PINN. • 3. Monitor the timing, frequency, magnitude and duration of unnatural or extreme low water/low flow events in stream reaches known to support threatened and endangered aquatic species in the dry season in GOGA, PORE, and PINN watersheds.

  29. Simplified Data Work Flow • Download raw datalogger files in field • Transfer data to recording gaging station spreadsheet template • Import mean gage height & measured discharge • Develop stage-discharge rating • Correct recorded stage (instantaneous) • Calculate discharge (instantaneous) • Perform QA/QC measures • Identify data gaps, erroneous data
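The "develop stage-discharge rating" and "calculate discharge" steps in the workflow above can be illustrated with a common power-law rating curve, Q = c·(h − h0)^b, fit from paired gage-height and measured-discharge observations. The function names and coefficient values below are illustrative assumptions, not SFAN's actual ratings:

```python
def rating_discharge(stage, c=25.0, h0=0.15, b=1.8):
    """Power-law stage-discharge rating: Q = c * (stage - h0)**b.
    c, h0 (stage of zero flow), and b would be fit from paired field
    measurements of gage height and discharge; values here are made up."""
    if stage <= h0:
        return 0.0                      # at or below stage of zero flow
    return c * (stage - h0) ** b

def daily_mean(discharges):
    """Mean of the instantaneous discharge values for one day."""
    return sum(discharges) / len(discharges)

# Corrected instantaneous stage readings (m) for one day, then the
# mean daily discharge estimate the monitoring objectives call for:
stages = [0.42, 0.45, 0.44, 0.40]
q = [rating_discharge(h) for h in stages]
print(round(daily_mean(q), 2))
```

Aggregating the daily means by month would yield the monthly discharge estimates listed in the monitoring objectives.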

  30. Streamflow Measurements • Once a month at each sampling site • During high streamflow events • Also monthly at water quality sampling sites

  31. Gareth Rowell – Heartland Network

  32. Monitoring Databases, ODBC and R (diagram: Access bird monitoring database → ODBC connection → qry_BirdSppRich → R)

  33. Some R code…
> library(RODBC)
> channel <- odbcDriverConnect()
> BRich <- sqlFetch(channel, "qry_BirdSppRich")
> attach(BRich)
> names(BRich)
> summary(SppRich_Yr+1)
> hist(main="Bird Species Richness, All Parks x Years", SppRich_Yr+1)
• Connection to Access • Data from ODBC • Data displayed in R

  34. Why use ODBC and R • Easy to “re-select” database for data exploration • Data are clean (not corrupt) going from database to R • R provides extensive collection of statistical tests and graphics • R and ODBC are widely used -- excellent documentation • R and ODBC are free

  35. John Boetsch – North Coast and Cascades Network (Olympic NP)

  36. 2009 I&M Data Management Conference Tucson, Arizona – April 2009 Reusable Project Application Components – John Boetsch - NCCN / Olympic NP

  37. What and Why? • Multiple project database applications • Faster development & more consistent look & feel for project staff • Why not?

  38. Quality Assurance Tool • Uses pre-built queries to check the completeness, structural integrity and logical consistency of the data set

  39. Quality Assurance Tool • What you don’t know CAN hurt you(r analysis)! • Missing records • List of scheduled sites that weren't visited … were they simply not entered? • Missing values • Field coordinate source = 'GPS' but missing GPS model or GPS file name • Duplicate records • Locations with more than one coordinate record per sampling date - verify that these are intended • Illogical data combinations • Marker status = 'removed' but no removal date, or with a removal date and status <> 'removed'
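The classes of checks above — missing values, duplicate records, illogical combinations — can be sketched as simple rules run over exported records. The `qa_checks` helper and the field names (`coord_source`, `gps_model`, `removal_date`, and so on) are hypothetical stand-ins for the tool's pre-built queries, not its actual schema:

```python
def qa_checks(records):
    """Run a few project-style QA rules over a list of record dicts and
    return (rule, record id) flags.  Field names are hypothetical."""
    flags = []
    seen = set()
    for r in records:
        rid = r["id"]
        # Missing values: a GPS coordinate source should name a GPS model
        if r.get("coord_source") == "GPS" and not r.get("gps_model"):
            flags.append(("missing GPS model", rid))
        # Duplicate records: one coordinate record per location and date
        key = (r.get("location"), r.get("sample_date"))
        if key in seen:
            flags.append(("duplicate location/date", rid))
        seen.add(key)
        # Illogical combination: a removed marker needs a removal date
        if r.get("status") == "removed" and not r.get("removal_date"):
            flags.append(("removed without removal date", rid))
    return flags

# Small made-up data set exercising all three rules
sample = [
    {"id": 1, "coord_source": "GPS", "location": "P1",
     "sample_date": "2009-06-01", "status": "active"},
    {"id": 2, "location": "P1",
     "sample_date": "2009-06-01", "status": "removed"},
]
print(qa_checks(sample))
```

In the real tool these rules live in pre-built database queries rather than application code, but the idea is the same: each rule flags records for a person to verify, since some flagged cases (such as intentional duplicate coordinates) may be legitimate.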

  40. Quality Assurance Tool • We talk a lot about how relational databases are useful for this type of quality assurance … • This is a tool for rigorously checking data against a set of project-specific rules
