
Science Data System – Data Requirements and Conceptual Design DESDynI








  1. Science Data System – Data Requirements and Conceptual Design, DESDynI. Science Data Systems in the Decadal Survey Era Workshop, June 25-26, 2009. David Cuddy, DESDynI SDS Study Lead, JPL

  2. Overview of Presentation
  • Mission Overview
  • End-to-End Data Flow Diagram
  • Key Driving SDS Design Requirements
  • Science Data Product Suite – Radar
  • SDS Architecture & Functional Diagram
  • Algorithm-to-Production Software Process
  • Data Storage and Archive Strategy
  • Data Distribution Challenge to DAACs

  3. DESDynI Mission Overview
  • DESDynI: Deformation, Ecosystem Structure and Dynamics of Ice
  • Time-frame: Launch readiness 2016-2018, depending on budget; operation 3 years nominal, extendable to 5 years
  • Mission Objectives:
  • Determine the likelihood of earthquakes, volcanic eruptions, and landslides
  • Predict the response of ice sheets to climate change and the impact on sea level
  • Characterize the effects of changing climate and land use on species habitats and the carbon budget
  • Monitor the migration of fluids associated with hydrocarbon production and groundwater resources
  • Orbit: LEO, SSO
  • Mission Architecture/Instrument:
  • One spacecraft with an L-band radar at 761 km, 8-day repeat
  • Another spacecraft with a multi-beam lidar at 400 km, 91-day repeat
  • DLR collaboration in progress: one more L-band radar spacecraft flying in formation with the above radar spacecraft*
  * The information presented herein does not factor in the data system requirements or products generated from this tandem spacecraft; how NASA and DLR would divide responsibility and work is TBD if the collaboration is consummated.

  4. DESDynI End-to-End Data Flow – Radar
  [End-to-end data flow diagram. Radar telemetry reaches the ground via S-band/Ka-band NASA TDRSS stations and the GSFC comm cloud; the MOS/GDS handles flight operations monitoring & control, spacecraft and instrument commands, and real-time (RT) and playback (Plb) housekeeping (HK). The SDS performs radar instrument data processing (L0b/L1 data products) and L2/3 and L3/4 science data processing, with information management, life-of-mission data storage, project & instrument team data access, and science data analysis. Validated science data products (L0b, L1, L2/3, L3/4) and lidar data products from an external provider flow to an external archive & distribution center (DAAC). The legend distinguishes NASA-supplied and project-supplied elements.]

  5. Key Driving SDS Design Requirements
  • Data Acquisition Volumes: Radar – 4.9 TB per day*
  • Total Mission Data Volume**: 30 TB/day (10 TB L0b, 20 TB L1, 29 GB L2+), or 10.8 PB/year
  • Data Product Types: 35 standard products (1 L0b, 10 Level 1's, 18 Level 2/3's, 6 L3/4's)
  • Data Product Availability:
  • Processing Loading: Sized to meet respective product availability (latency) requirements with no backlog and with margin to include TBD reprocessing
  Notes:
  * Based on an average 10% duty cycle of quad-pol data acquisition @ 2.6 Gbps and a 15% duty cycle of dual-pol data acquisition @ 1.3 Gbps. Studies are underway to examine mission scenarios that would reduce this number by upwards of 50%.
  ** Not including data for priority science, operational/decision support, or reprocessing; assuming all L0 is processed to L1 and ~1/4 to L2/3/4. TB – terabyte (10**12 bytes); PB – petabyte (10**15 bytes)
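The volume figures on this slide can be cross-checked with simple arithmetic. The sketch below uses only numbers stated on the slide (duty cycles, data rates, daily product volumes) and the slide's own decimal definitions of TB and PB:

```python
# Back-of-the-envelope check of the volume figures on this slide.
# All inputs are taken from the slide and its notes; TB and PB are
# powers of ten (10**12 and 10**15 bytes), as the slide defines them.

TB, PB = 10**12, 10**15
SECONDS_PER_DAY = 86400

# Acquisition: 10% duty cycle of quad-pol @ 2.6 Gbps,
#              15% duty cycle of dual-pol @ 1.3 Gbps
quad_pol = 2.6e9 / 8 * 0.10 * SECONDS_PER_DAY   # bytes/day
dual_pol = 1.3e9 / 8 * 0.15 * SECONDS_PER_DAY   # bytes/day
acquisition = quad_pol + dual_pol
print(f"Acquisition: {acquisition / TB:.1f} TB/day")   # ~4.9, matching the slide

# Total mission volume: 10 TB L0b + 20 TB L1 + 29 GB L2+ per day
total_per_day = 10 * TB + 20 * TB + 29e9
total_per_year = total_per_day * 365
print(f"Total: {total_per_day / TB:.1f} TB/day, "
      f"{total_per_year / PB:.1f} PB/year")
```

A straight 365-day year gives ~11.0 PB/year, slightly above the slide's 10.8 PB figure, which evidently uses a more conservative rounding of the daily volume.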

  6. DESDynI Science Data Product Suite – Radar
  • Level 0 Radar Instrument Products (1)
  • Level 0b – Level 0a formatted for processing/archiving
  • Level 1 Radar Instrument Products (10)
  • SLC – single-look complex (I/Q)
  • SLD – detected SLC (amplitude)
  • Quicklook product – uncalibrated detected image from L0b
  • MLC – multi-look complex created from SLC by 16:1 averaging* (4:1 in each dimension)
  • MLD – detected MLC (amplitude)
  • MLD browse – browse product derived from MLD via 16:1 averaging
  • Interferogram (I/Q; 16:1 averaging* from the full-resolution interferogram)
  • Phase browse (4:1 averaging of the interferogram phase)
  • Coherence (32-bit; the same dimensions* as the interferogram)
  • Coherence browse (coherence data averaged 4:1)
  * Multilook/averaging is range-bandwidth and application dependent
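The 16:1 multilook step above (4:1 in each dimension) is just non-overlapping block averaging of the complex image. A minimal NumPy sketch, with illustrative array sizes and random data (not actual DESDynI processing code):

```python
import numpy as np

# Sketch of the 16:1 multilook step: average non-overlapping 4x4
# blocks of a single-look complex (SLC) image to form a multi-look
# complex (MLC) image. Shapes and data here are illustrative only.

def multilook(slc: np.ndarray, looks: int = 4) -> np.ndarray:
    """Average `looks` x `looks` blocks of a complex image."""
    rows, cols = slc.shape
    rows -= rows % looks          # drop edge pixels that don't fill a block
    cols -= cols % looks
    blocks = slc[:rows, :cols].reshape(rows // looks, looks,
                                       cols // looks, looks)
    return blocks.mean(axis=(1, 3))

rng = np.random.default_rng(0)
slc = rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64))
mlc = multilook(slc)              # 16:1 reduction: 64x64 -> 16x16
print(mlc.shape)                  # (16, 16)
```

The detected MLD product would then be `np.abs(mlc)`, and the MLD browse a further 16:1 averaging of that amplitude image.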

  7. DESDynI Science Data Product Suite – Radar
  • Level 2/3 Science Products (18)
  • (4) Deformation & error maps: 2D, 3D, mosaicked 2D, and mosaicked 3D
  • (10) Velocity & error maps: 2D, 3D, DDInSAR, speckle tracking, and feature tracking, plus mosaicked versions of these product types for regional mapping
  • (1) Geocoded PolSAR map (Stokes matrix from quad-pol MLC data)
  • (1) Geocoded SLD – for instrument CalVal support
  • (1) Geocoded MLD – for instrument CalVal and operational/decision support
  • (1) Geocoded Quicklook – for operational/decision support
  • Level 3/4 Science Products (radar+lidar data fusion products) (6)
  • (1) Sea ice thickness map
  • (1) Canopy closure map
  • (2) Forest structure and forest structure change maps
  • (2) Biomass and biomass change maps
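The deformation products above are derived via standard repeat-pass InSAR: two SLC acquisitions are combined into an interferogram, and the interferometric phase is scaled by the radar wavelength to get line-of-sight displacement. A highly simplified sketch with random illustrative data (real processing adds coregistration, flattening, phase unwrapping, and error estimation, none of which is shown here):

```python
import numpy as np

# Simplified repeat-pass InSAR sketch: interferogram formation and
# phase-to-displacement scaling. Data and sizes are illustrative.

wavelength_m = 0.24            # approximate L-band radar wavelength

rng = np.random.default_rng(1)
slc1 = rng.standard_normal((8, 8)) + 1j * rng.standard_normal((8, 8))
slc2 = rng.standard_normal((8, 8)) + 1j * rng.standard_normal((8, 8))

interferogram = slc1 * np.conj(slc2)   # complex interferogram (I/Q)
phase = np.angle(interferogram)        # wrapped phase, radians

# Two-way path: one phase cycle (2*pi) corresponds to lambda/2 of
# line-of-sight motion, so displacement = phase * lambda / (4*pi).
deformation_m = phase * wavelength_m / (4 * np.pi)
print(deformation_m.shape)             # (8, 8)
```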

  8. SDS Architecture & Functional Diagram
  • SDS Central Node –
  • manages information and resources (information on data, computing, services, etc.) across the SDS
  • provides a web-based portal for discovery and access to data and services
  • SDS Instrument Product (Radar) & Expert ('Solid Earth Deformation', 'Ecosystem Structure', and 'Dynamics of Ice') Nodes –
  • provide catalog, testbed, processing (L0b & up), 'Life-of-Mission' storage, and distribution functions
  [Diagram shows real-time access to the catalog, L0b and higher products, ancillary data, and science analysis software, as well as the pipeline data flow. DAAC – Distributed Active Archive Center]

  9. SDS Node Architecture – Instrument & Expert Nodes
  [Diagram components: data and services registration with SDS-Central; receipt/delivery of data products; query/receipt of metadata; Processing Control System (file management, resource management, workflow management); Profile/Registry Server; Product Receipt/Delivery Server; Product Catalog; Product Repository; Algorithm Testbed; and computational processing with product-specific PGEs (L0b, L1, L2, ...).]
  Notes:
  • The Processing Control System provides a pipeline for cataloging, process management, computational processing, and workflow management
  • The Profile/Registry Server allows distributed queries of data within the node
  • The Product Receipt/Delivery Server allows on-the-fly processing and delivery of data products from the node
  • The Algorithm Testbed/PGEs support instrument or science algorithm development, science product generation, and data analysis functions
  • The Product Repository includes Life-of-Mission storage and a data product staging area
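The Processing Control System described above chains product-specific PGEs (L0b → L1 → L2) and catalogs each product it generates. A minimal sketch of that pipeline idea; all class, function, and product names here are hypothetical placeholders, not the actual DESDynI SDS interfaces:

```python
from typing import Callable

# Hypothetical sketch of a node's processing control system: run
# product-specific PGEs in order and catalog each output product.

class PipelineNode:
    def __init__(self):
        self.catalog: dict[str, object] = {}   # product level -> product
        self.pges: list[tuple[str, Callable]] = []

    def register_pge(self, level: str, pge: Callable):
        """Register a product generation executable (PGE) for one level."""
        self.pges.append((level, pge))

    def run(self, raw_input):
        """Run PGEs in order, cataloging each intermediate product."""
        product = raw_input
        for level, pge in self.pges:
            product = pge(product)
            self.catalog[level] = product      # stand-in for repository storage
        return product

node = PipelineNode()
node.register_pge("L0b", lambda raw: f"formatted({raw})")
node.register_pge("L1",  lambda l0b: f"slc({l0b})")
node.register_pge("L2",  lambda l1: f"deformation({l1})")

final = node.run("telemetry")
print(final)    # deformation(slc(formatted(telemetry)))
```

A real control system would add the file, resource, and workflow management functions listed in the notes; this sketch only shows the pipeline-plus-catalog shape.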

  10. Algorithm-to-Production Software Process
  [Process flow: algorithm software formulation & prototyping (producing ATBDs*) on the SDS Testbed → Algorithm Software Delivery Review → production software development (single-executable testing, multiple-executable pipeline testing, system I&T) → Production Software Release Review → production software deployment on the SDS operational system]
  * ATBD: Algorithm Theoretical Basis Document

  11. Data Storage and Archive Strategy
  • Distributed Storage Strategy
  • Data resides where it is generated
  • Data exists for project use
  • Centralized Information Management
  • Manages information and resources across the SDS
  • Provides a web-based portal for discovery and access to data and services
  • Delivers validated products to DAACs for long-term archive
  • DAACs to provide archive and distribution to the public
  • DAACs are TBD
  • Investigate trade-off of data storage vs. data re-generation
  • Rolling archive/storage for high-volume products
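The storage-vs-regeneration trade-off above comes down to comparing the cost of holding a high-volume product on disk against the cost of reprocessing it from its inputs on demand. A toy illustration; all cost figures and request rates below are hypothetical placeholders, not mission estimates:

```python
# Toy model of the storage-vs-regeneration trade-off. All numbers
# are hypothetical placeholders chosen only to illustrate the comparison.

def keep_stored(volume_tb: float, storage_cost_per_tb_yr: float,
                years: float) -> float:
    """Cost of holding the product on disk for the whole period."""
    return volume_tb * storage_cost_per_tb_yr * years

def regenerate(requests_per_yr: float, cpu_cost_per_request: float,
               years: float) -> float:
    """Cost of reprocessing the product each time it is requested."""
    return requests_per_yr * cpu_cost_per_request * years

# Hypothetical: one year of a 20 TB/day product kept on disk, vs.
# regenerating it for an assumed 500 requests per year.
stored = keep_stored(volume_tb=20 * 365, storage_cost_per_tb_yr=10.0, years=1)
regen = regenerate(requests_per_yr=500, cpu_cost_per_request=25.0, years=1)
print("store" if stored < regen else "regenerate")
```

Under these placeholder numbers regeneration wins, which is the intuition behind the slide's rolling archive for high-volume products: keep inputs and recent outputs, and recompute rarely-requested older products.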

  12. Data Distribution Challenge to DAACs
  • Data volume: multi-terabytes per day
  • Total number of data sets: 35
  • DESDynI addresses three major disciplines:
  • Solid Earth Deformation
  • Ecosystem Structure
  • Dynamics of Ice
  • Interest in DESDynI data will span scientific investigators, government agencies, and corporate and public users
  • DESDynI data may have much greater public interest
  • Will ESDS be ready?
  • DAACs have a formidable job ahead for DESDynI data
