This document outlines the CHPS data archiving system developed by CNRFC. It describes how the archive process saves a readme of key version information, un-tars and launches a selected .tgz datastore, and maintains daily archives on the F1 desk, with backups of CHPS partitions to external storage. Related components include HEC-RAS surge modeling with an SSTG upstream boundary and surge guidance from sources such as ADCIRC, SLOSH, ESTOFS, and MDL ETSURGE, plus HEC-DSS conversion and automated quality-control utilities, including Python tools that streamline the archiving workflow.
Archiving CHPS Data
• Developed by CNRFC.
• Saves a readme with information on the FEWS bin, Java bin, HEC-RAS version, OHD-CORE version, and CHPS patch number (see the sketch after this list).
• Un-tars and launches the selected .tgz datastore in a stand-alone (SA).
• Archived daily on the F1 desk.
• Starting to use the \chps_archive partition.
• rsync of \chps_share, \awips\rep, and \data\flatfiles to a TB drive as backup.
• Open ticket with NCF/HSB on LTO2 backup.
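Below is a minimal Python sketch of the archiving steps listed above. It is not the CNRFC script: the directory paths, environment-variable names for version information, and rsync options are assumptions for illustration only.

```python
#!/usr/bin/env python
# Hypothetical sketch of the archiving steps described above; the actual
# CNRFC script, paths, and version-detection method are assumptions.
import os
import subprocess
import tarfile

ARCHIVE_DIR = "/chps_archive/daily"          # assumed archive partition layout
SA_DIR = "/awips/chps_local/sa"              # assumed stand-alone (SA) location
BACKUP_SOURCES = ["/chps_share", "/awips/rep", "/data/flatfiles"]
BACKUP_DEST = "/mnt/tb_drive/chps_backup/"   # assumed TB-drive mount point

def write_readme(archive_dir):
    """Record the software versions that produced the archived datastore."""
    info = {
        "fews_bin": os.environ.get("FEWS_BIN", "unknown"),
        "java_bin": os.environ.get("JAVA_HOME", "unknown"),
        "hecras_version": os.environ.get("HECRAS_VERSION", "unknown"),
        "ohd_core_version": os.environ.get("OHD_CORE_VERSION", "unknown"),
        "chps_patch": os.environ.get("CHPS_PATCH", "unknown"),
    }
    with open(os.path.join(archive_dir, "readme.txt"), "w") as f:
        for key, value in sorted(info.items()):
            f.write("%s: %s\n" % (key, value))

def restore_datastore(tgz_path):
    """Un-tar the selected .tgz datastore into the stand-alone directory."""
    with tarfile.open(tgz_path, "r:gz") as tar:
        tar.extractall(SA_DIR)

def backup_to_tb_drive():
    """Mirror the CHPS partitions to the TB drive with rsync."""
    for src in BACKUP_SOURCES:
        subprocess.call(["rsync", "-a", "--delete", src, BACKUP_DEST])

if __name__ == "__main__":
    if not os.path.isdir(ARCHIVE_DIR):
        os.makedirs(ARCHIVE_DIR)
    write_readme(ARCHIVE_DIR)
    backup_to_tb_drive()
```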
HEC-RAS Surge Model
• SSTG US (upstream) boundary
• Stage DS (downstream) boundary
• ADCIRC
• SLOSH
• ESTOFS (via SBN now)
• MDL ETSURGE
• Surge scenarios (2007 MEOWs); a configuration sketch follows this list
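As a rough illustration of how these boundary inputs might be organized, here is a small Python sketch; the names, structure, and selection logic are assumptions and not part of the actual CNRFC configuration.

```python
# Hypothetical organization of the HEC-RAS surge model boundary inputs
# listed above; keys, labels, and preference order are illustrative only.
SURGE_MODEL_BOUNDARIES = {
    "upstream": {"type": "stage", "source": "SSTG"},
    "downstream": {"type": "stage", "source": "surge guidance"},
}

# Candidate surge guidance that could supply the downstream boundary.
SURGE_SOURCES = [
    "ADCIRC",
    "SLOSH",
    "ESTOFS (via SBN)",
    "MDL ETSURGE",
    "Surge scenarios (2007 MEOWs)",
]

def select_surge_source(preferred_order, available):
    """Pick the first preferred surge source that is currently available."""
    for name in preferred_order:
        if name in available:
            return name
    return None

if __name__ == "__main__":
    # Example: only ESTOFS and ETSURGE guidance available this cycle.
    available_now = {"ESTOFS (via SBN)", "MDL ETSURGE"}
    print(select_surge_source(SURGE_SOURCES, available_now))
```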
HEC-DSS Utilities
• HEC-DSSVue Python
  • OFS Calbard, NOS tide to HEC-DSS.
  • Automated QC limits, fill missing, shift TS, merges, and add/sub/mult conversions (see the sketch after this list).
• Sheftodss Python
  • Older HEC Sheftodss conversion (HEC).
  • Archiving MAP, MAPX (1, 6 hr), QINE, STG.
  • Converted historical MAP cards to DSS.
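The fragment below is a minimal HEC-DSSVue Jython sketch of the automated QC-limit and fill-missing step. The DSS file name, pathname, and limits are assumptions, the exact HecDss calls may differ between HEC-DSSVue versions, and the script is meant to run inside HEC-DSSVue's scripting environment rather than standalone CPython.

```python
# Minimal HEC-DSSVue Jython sketch of QC limits plus fill-missing;
# file name, pathname, and limit values are assumptions for illustration.
from hec.script import Constants
from hec.heclib.dss import HecDss

DSS_FILE = "/data/dss/archive.dss"          # assumed DSS file
PATHNAME = "//GAGE1/STAGE//1HOUR/OBS/"      # assumed DSS pathname
MIN_VALUE, MAX_VALUE = 0.0, 50.0            # assumed QC limits

dss = HecDss.open(DSS_FILE)
try:
    tsc = dss.get(PATHNAME)                 # TimeSeriesContainer
    values = list(tsc.values)

    # Flag values outside the QC limits as missing.
    for i, v in enumerate(values):
        if v != Constants.UNDEFINED and not (MIN_VALUE <= v <= MAX_VALUE):
            values[i] = Constants.UNDEFINED

    # Fill gaps by carrying the last good value forward (simple example).
    last_good = Constants.UNDEFINED
    for i, v in enumerate(values):
        if v == Constants.UNDEFINED and last_good != Constants.UNDEFINED:
            values[i] = last_good
        else:
            last_good = values[i]

    tsc.values = values
    dss.put(tsc)                            # write the QC'd series back
finally:
    dss.done()
```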
Simplified LagK TS View
• Courtesy of Katelyn Costanza / Scott Lincoln.