
The ESA Scientific TESTBED Scenarios


Presentation Transcript


  1. The ESA Scientific TESTBED Scenarios F. MARELLI (ACS) Fulvio.Marelli@acsys.it Roma 16.09.09

  2. The ESA testbed in CASPAR
The main objective: to preserve the “ability” to process data (in our case: GOME Level 1b to Level 1c)
• Several Level 1c products can be obtained on demand from a single Level 1b (by applying different calibrations)
• We need to preserve not only the data and related knowledge but also the data processing workflow
• Large, complex and interrelated datasets and RepInfo
• Similar issues affect many other Earth Observation instrument datasets
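A minimal sketch of the "several Level 1c from one Level 1b" idea: the precompiled Level 1 extraction software is run repeatedly on the same Level 1b file with different calibration options. The executable name gdp01_ex, the -c flag and the file names are illustrative assumptions, not the documented GDP command line.

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;

    public class Level1cOnDemand {
        public static void main(String[] args) throws IOException, InterruptedException {
            String level1bFile = "1998/05/17/orbit_16224.lv1";            // assumed input path
            String[] calibrationSets = { "leakage", "polarisation", "all" }; // assumed option values

            for (String calibrations : calibrationSets) {
                String outFile = level1bFile.replace(".lv1", "_" + calibrations + ".lv1c");
                ProcessBuilder pb = new ProcessBuilder(
                        "./gdp01_ex",        // precompiled Level 1 extractor (assumed name)
                        "-c", calibrations,  // calibrations to apply (assumed flag)
                        level1bFile, outFile);
                pb.redirectErrorStream(true);
                Process p = pb.start();
                // drain the extractor's output so the child process cannot block on a full pipe
                BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()));
                String line;
                while ((line = r.readLine()) != null) System.out.println(line);
                System.out.println(calibrations + " -> " + outFile + " (exit " + p.waitFor() + ")");
            }
        }
    }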

  3. The web application

  4. Deployment
• CASPAR – NAS

  5. Client
• The testbed client runs as a web application
• Requirements:
• Windows and Linux compliant
• Tomcat 6
• Java 6

  6. The Testbed
Scenario 0 – Preparation phase
Scenario 1 – Data ingestion
Scenario 2 – Data access
Scenario 3 – Preservation

  7. Data to preserve
• What we have to preserve to be able to process data from L1B to L1C:
• GOME Level 1 products (YYYY/MM/DD/*.lv1)
• PSD.pdf: Level 1 and Level 2 Product Specification Document
• license.doc: License for the GOME data products on FTP and CD
• disclaimer.pdf: Disclaimer for GOME Level 1 and Level 2 data products, summarizing the status of the current GDP data quality
• ERS-Products.pdf: ERS Ground Stations Products specification
• ProductSpecification.pdf: Product Specification Document of the GOME Data Processor
• The ozone, the ERS-2 satellite, the GOME sensor, etc.
• Processors (precompiled versions of the Level 1 extraction software)
• 'C' files with the Level 1 extraction software (source code)
• readme_1st.doc: Summary of files and documents
• readme.doc: OS details for the precompiled versions of the software
• release_l01.doc: GDP Level 0->1 Processing Release Notes
• user_manual.pdf: GDP Level 1 and Level 2 Extraction Software Manual
• howtouse_l01.doc: Brief explanation of how to use the software
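A minimal sketch of how this inventory could be grouped into a submission package before ingestion: data, documents and software RepInfo written to a plain-text manifest. The manifest layout and file name are illustrative assumptions, not a CASPAR format.

    import java.io.FileWriter;
    import java.io.IOException;
    import java.io.PrintWriter;

    public class SipManifest {
        public static void main(String[] args) throws IOException {
            PrintWriter out = new PrintWriter(new FileWriter("gome_sip_manifest.txt")); // assumed name
            out.println("[data]");
            out.println("YYYY/MM/DD/*.lv1");                       // GOME Level 1 products
            out.println("[repinfo.documents]");
            String[] docs = { "PSD.pdf", "license.doc", "disclaimer.pdf",
                    "ERS-Products.pdf", "ProductSpecification.pdf" };
            for (String d : docs) out.println(d);
            out.println("[repinfo.software]");
            String[] sw = { "gdp01_ex (precompiled)", "source/*.c", "readme_1st.doc",
                    "readme.doc", "release_l01.doc", "user_manual.pdf", "howtouse_l01.doc" };
            for (String s : sw) out.println(s);
            out.close();
        }
    }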

  8. Preparation phase
[Diagram: the GOME User, GOME Expert and GOME Scientist communities; RepInfo modules Satellite_ERS_Introduction, Satellite_ERS_Instrument and GOME L1 Spec; GOME L1b data, GOME L1C data and the L1b → L1c processor with its manual and source code; CASPAR components PIR, Registry, KM, DAMS and PACK.]

  9. Phase 1 – Data ingestion
[Diagram: the Data Producer submits SIPs built from the GOME L1b data and the L1 processor; AIPs for the Level 1b product, the Level 1C proxy, the processor executable, the processor source code, the processor help docs and the Level 1 docs; CASPAR components PDS, PACK, FIND, RepInfo Registry and KM.]
• The data producer ingests into CASPAR the whole Level 1B product
• and the processor (if not already ingested)
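A minimal sketch of this ingestion step, written against hypothetical interfaces (PackagingService, RegistryService); the real CASPAR component APIs are not reproduced here.

    public class IngestLevel1b {

        interface PackagingService {            // stands in for the PDS/PACK components
            String ingest(String sipPath);      // returns the identifier of the stored AIP
            boolean contains(String aipId);
        }

        interface RegistryService {             // stands in for the RepInfo Registry
            void link(String dataAipId, String repInfoAipId);
        }

        static String ingestProduct(PackagingService pack, RegistryService registry,
                                    String level1bSip, String processorSip) {
            String level1bAip = pack.ingest(level1bSip);         // whole Level 1B product
            String processorAip = "AIP:gome-l1-processor";       // assumed identifier
            if (!pack.contains(processorAip)) {                  // only if not already ingested
                processorAip = pack.ingest(processorSip);
            }
            registry.link(level1bAip, processorAip);             // processor kept as RepInfo for the data
            return level1bAip;
        }
    }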

  10. Phase 2 – Search and retrieval
[Diagram: AIPs for the Level 1C proxy, the Level 1b product, the processor executable, the processor source code, the processor help docs and the Level 1 docs; the GOME Expert and the GOME User query through FIND, PDS, KM, Registry and PACK and receive additional RepInfo.]
• Two different CASPAR users, with different knowledge, browse the archive searching for a specific Level 1C product. They retrieve from CASPAR the requested data and the appropriate Representation Information needed to fully understand the data, depending on their knowledge.
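A minimal sketch of the idea that the RepInfo delivered with a product depends on what the user already knows: only the modules missing from the user's knowledge profile are returned. The module names and the two profiles are illustrative assumptions.

    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.HashSet;
    import java.util.List;
    import java.util.Set;

    public class RepInfoSelection {

        static List<String> repInfoFor(Set<String> userKnowledge) {
            String[] required = { "GOME L1 Spec", "GDP user manual",
                    "ERS-2/GOME instrument introduction", "Ozone background" };
            List<String> toDeliver = new ArrayList<String>();
            for (String module : required) {
                if (!userKnowledge.contains(module)) toDeliver.add(module);  // fill the knowledge gap
            }
            return toDeliver;
        }

        public static void main(String[] args) {
            Set<String> gomeExpert = new HashSet<String>(Arrays.asList(
                    "GOME L1 Spec", "GDP user manual", "ERS-2/GOME instrument introduction"));
            Set<String> gomeUser = new HashSet<String>();                    // knows little about GOME
            System.out.println("Expert needs: " + repInfoFor(gomeExpert));
            System.out.println("User needs:   " + repInfoFor(gomeUser));
        }
    }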

  11. Data browsing ontology

  12. Phase 2.1 – L1C creation
• The users need to re-create the Level 1C product
• The CASPAR system allows three different solutions, depending on the user's knowledge:
• L1B + processor executable download
• L1B + processor source code (and RepInfo) download and customized processor execution
• On-demand L1C AIP creation and ingestion, using the PDS transforming module procedure
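A minimal sketch of the three options as a selection step; the option names and printed actions are illustrative, not the testbed's actual interface.

    public class Level1cCreation {

        enum Option { DOWNLOAD_L1B_AND_EXECUTABLE, DOWNLOAD_L1B_AND_SOURCE, ON_DEMAND_IN_ARCHIVE }

        static void create(Option option, String level1bAip) {
            switch (option) {
                case DOWNLOAD_L1B_AND_EXECUTABLE:
                    // 1. download the L1B AIP plus the precompiled processor and run it locally
                    System.out.println("download " + level1bAip + " + processor executable");
                    break;
                case DOWNLOAD_L1B_AND_SOURCE:
                    // 2. download the L1B AIP plus the 'C' source and its RepInfo,
                    //    then recompile and run a customized processor
                    System.out.println("download " + level1bAip + " + source code + RepInfo");
                    break;
                case ON_DEMAND_IN_ARCHIVE:
                    // 3. ask the PDS transforming module to create and ingest the L1C AIP inside CASPAR
                    System.out.println("request on-demand L1C creation for " + level1bAip);
                    break;
            }
        }
    }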

  13. Phase 4 – System update
• An event affecting the L1B → L1C processor occurs (typically a new release using different libraries). This event makes it necessary to ingest a new, proper version of the processor into the system.
• At the same time, all the Representation Information needed to retrieve the appropriate processor version must be updated. CASPAR should update all the links between processor and data, in order to keep track of both the old and the new version of the processor and to guarantee that the user can still process the L1B product.
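A minimal sketch of the link update described above: when a new processor release is ingested, the links from the Level 1B data are extended so that both the old and the new processor versions stay reachable. The data structure and identifiers are illustrative assumptions, not the CASPAR Registry model.

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    public class ProcessorUpdate {

        // dataAipId -> ordered list of processor versions able to process it
        static Map<String, List<String>> processorLinks = new HashMap<String, List<String>>();

        static void registerNewProcessorVersion(String dataAipId, String newProcessorAipId) {
            List<String> versions = processorLinks.get(dataAipId);
            if (versions == null) {
                versions = new ArrayList<String>();
                processorLinks.put(dataAipId, versions);
            }
            versions.add(newProcessorAipId);   // older versions are kept, not replaced
        }

        public static void main(String[] args) {
            registerNewProcessorVersion("AIP:gome-l1b-16224", "AIP:l1-processor-v2.0");
            registerNewProcessorVersion("AIP:gome-l1b-16224", "AIP:l1-processor-v3.0-new-libs");
            System.out.println(processorLinks); // both releases stay linked to the data
        }
    }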

  14. Preservation Scenario
[Diagram: the GOME L1 dataset (L1 products, the L1B→L1C processor and its source code, related documents), the user community which uses the GOME data, and CASPAR components PDS, FIND and POM.]
Events chain:
1. OS change: CASPAR notifies the user community, which gets the processor source code
2. Alert
3. Processor recompiled (new processor); the community notifies CASPAR
4. Processor re-ingested
5. Documents and links updated; PDI / Provenance RepInfo updated

  15. Sys Admin Ontology

  16. Which components are used

  17. Which components are used

  18. Testbed login procedure

  19. User interface

  20. Phase 1 • Ingesting data

  21. Ingesting L1B file

  22. Ingestion Report

  23. Ingesting the Processor

  24. Phase 2 • Searching and retrieving data

  25. Searching data

  26. Search Results

  27. Search Results – Different user

  28. L1C on demand creation

  29. L1C created and Ingested

  30. Phase 3 • Update procedures

  31. Creating and reading messages

  32. The control panel

  33. Retrieving what is needed

  34. Retrieving what is needed

  35. Upgrading the processor

  36. Validation procedure
• Comparing the results of processes performed by different processors on the same Level 1B file, using the Level 1C on-demand creation function
• Today the CASPAR system stores and gives access to all the data needed to re-create the L1B → L1C processor:
• OS emulator
• FFTW library
• ANSI C compiler
• Processor source code
• This Representation Information was used to perform the upgrade procedure and to port the on-demand Level 1C creation process to SUN Solaris (SPARC chipset)
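A minimal sketch of the comparison step: the same Level 1B file is processed by the original and by the recompiled (Solaris/SPARC) processor, and the two on-demand Level 1C outputs are compared byte by byte. The file names are illustrative assumptions.

    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStream;

    public class ValidateLevel1c {

        static long firstDifference(String fileA, String fileB) throws IOException {
            InputStream a = new FileInputStream(fileA);
            InputStream b = new FileInputStream(fileB);
            try {
                long offset = 0;
                int byteA, byteB;
                do {
                    byteA = a.read();
                    byteB = b.read();
                    if (byteA != byteB) return offset;   // first differing byte offset
                    offset++;
                } while (byteA != -1);
                return -1;                               // -1 means the outputs are identical
            } finally {
                a.close();
                b.close();
            }
        }

        public static void main(String[] args) throws IOException {
            long diff = firstDifference("orbit_16224_original.lv1c", "orbit_16224_sparc.lv1c");
            System.out.println(diff < 0 ? "outputs identical" : "outputs differ at byte " + diff);
        }
    }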

  37. Validation procedure
• The current Representation Information level of detail can be enhanced by:
• Refining the granularity of the existing RepInfo
• Adding supplementary RepInfo to cover the whole operational field:
• Processor source code
• FFTW library
• ANSI C compiler
• OS emulator
• Processor emulator (e.g. QEMU)
• Chipset architecture (e.g. IA-32; RISC)
• These additions would reach the deepest level of inspection, allowing the CASPAR system to provide everything needed to re-create the executable when facing a change at the higher levels of the RepInfo hierarchy
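A minimal sketch of such a finer-grained RepInfo network: each item depends on the items below it, so a change at a higher level (for example the chipset) tells the archive which pieces are needed to re-create the executable. The dependency edges are illustrative assumptions.

    import java.util.Arrays;
    import java.util.HashMap;
    import java.util.LinkedHashSet;
    import java.util.List;
    import java.util.Map;
    import java.util.Set;

    public class RepInfoNetwork {

        static Map<String, List<String>> dependsOn = new HashMap<String, List<String>>();

        static Set<String> neededToRebuild(String item) {
            Set<String> result = new LinkedHashSet<String>();
            collect(item, result);
            return result;
        }

        private static void collect(String item, Set<String> result) {
            List<String> deps = dependsOn.get(item);
            if (deps == null) return;
            for (String dep : deps) {
                if (result.add(dep)) collect(dep, result);   // follow the RepInfo chain downwards
            }
        }

        public static void main(String[] args) {
            dependsOn.put("Processor executable",
                    Arrays.asList("Processor source code", "FFTW library", "ANSI C compiler"));
            dependsOn.put("ANSI C compiler", Arrays.asList("OS / OS emulator"));
            dependsOn.put("OS / OS emulator", Arrays.asList("Chipset architecture / processor emulator"));
            System.out.println(neededToRebuild("Processor executable"));
        }
    }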

  38. Going beyond the Testbed
• The ACS role: responsible both for the ESA scientific testbed technical implementation and for system integration activities and unit/system testing
• Creation of a CasparLab for evaluating the project's results using GOME data
• Installation and use of all the system components and functionalities
• Evaluation and testing of single CASPAR features
• Ability to test the system in a distributed environment
• Ability to exchange information with third-party ESA LTDP components

  39. THANKS!
