
Microwave Integrated Retrieval System for NPOESS Preparatory Project: MiRS NPP/ATMS Integration into NDE. Test Readiness Review, January 19, 2011. Prepared By: Kevin Garrett (1), Chris Grassotti (1), Sid-Ahmed Boukabara (2), Limin Zhao (4), Flavio Iturbide-Sanchez (1)


Presentation Transcript


  1. Microwave Integrated Retrieval System for NPOESS Preparatory Project: MiRS NPP/ATMS Integration into NDE. Test Readiness Review, January 19, 2011. Prepared By: Kevin Garrett (1), Chris Grassotti (1), Sid-Ahmed Boukabara (2), Limin Zhao (4), Flavio Iturbide-Sanchez (1), Wanchun Chen (3), Leslie Moy (1). Affiliations: (1) I.M. Systems Group, (2) NOAA/NESDIS/STAR, (3) Dell, (4) NOAA/NESDIS/OSPO/SSD

  2. Review Agenda • 1. Introduction (1:30 – 1:40) K. Garrett • 2. CDR Report (1:40 – 2:05) C. Grassotti • 3. Software Architecture: Context and System Layer (2:05 – 2:35) C. Grassotti • 4. Unit Test Readiness (2:35 – 3:15): MiRS DAP Test Readiness (K. Garrett), QC DAP Test Readiness (C. Grassotti) • 5. Risks/Actions (3:15 – 3:25) K. Garrett • 6. Summary and Conclusions (3:25 – 3:30) K. Garrett • Discussion (3:30 – 4:00) All

  3. INTRODUCTION • CDR Report • Software Architecture • Unit Test Readiness • Test Readiness of Individual Units: MiRS • Risks/Actions • Summary and Conclusions • Discussion

  4. Section 1 – Introduction Presented by K. Garrett

  5. Project Objectives • Technical Objective • Adaptation of MiRS to NPP ATMS and integration within NDE and OSPO • Science Objectives • Improved temperature and moisture profile retrievals • Extension of the retrieved products to non-standard surfaces, including sea ice and snow-covered land • Retrieval in all-weather conditions, including cloudy and precipitating scenes • An improved set of retrieved surface properties derived from the retrieved emissivities instead of directly from the brightness temperatures

  6. TRR Objectives • Objectives of the Test Readiness Review • Goal #1: Gather all MiRS stakeholders to review the overall system integration of MiRS into the NPOESS Data Exploitation (NDE) environment • Goal #2: Review of MiRS Software Architecture • Goal #3: Review of MiRS Software Unit Test Readiness • Goal #4: Review of CDR action items and actions taken • Goal #5: Identify new or outstanding risks w/mitigation strategies • Follow the STAR EPL Guidelines for TRR

  7. MiRS Stakeholders • Development Team • S.-A. Boukabara, K. Garrett, F. Iturbide-Sanchez, C. Grassotti, W. Chen, L. Moy • OSPO Partners • L. Zhao, J. Zhao, T. Conrad • NDE Partners • P. MacHarrie, L. Fenichel, D. Powell, J. Silva, G. Goodrum • MiRS Oversight Board • F. Weng (chair), R. Ferraro (STAR), L. Zhao (OSPO), J. Silva (NDE), T. Schott (OSD) • Oversight Panels • SPOP, PREPOP, ICAPOP, LSPOP • MiRS Users • Dan Pisut (NOAA EVL), Tony Reale (STAR), Joe Turk (JPL), Ben Ruston (NRL), Sheldon Kusselson (SAB), Stan Kidder (CIRA), Kevin Schrab and Andy Adman (NWS), Denise Hobson (AFWA), B. Yan and M. Kim (JCSDA), G. Huffman and W. McCarty (NASA/GSFC), J. Tesmar (FNMOC), P. Wang (Taiwan Weather Bureau), J. Janowiak (UMD), Paul Field (UKMET), K. Okamoto (JMA), M. V. Engel (IAO SB), B. Lambrigtsen (JPL), Peiming Dong, Qui Hong and Hu Yang (CMA), Universities (Graduate School of Chinese Academy of Sciences, Università degli Studi di Roma), Franklin Robertson and Clay Blankenship (NASA/MSFC), Tom Auligne (NCAR), D. Vila (CPTEC, Brazil), W. Han (Chinese Met. Admin.), D. Cimini (IMAA/CNR), M. Itkin (MPI-M, Hamburg), T. Greenwald (SSEC)

  8. MiRS Timeline/History (1/3) • Development Phase Begins (Sep. ’05 – Sep. ’06) • Go-ahead with product development (Sep. ’05) • Preliminary Design Review (Oct. ’05) • Critical Design Review (Sep. ’06) • Pre-Operational Phase Begins (Jul. ’07) • Operational/backup processing capabilities in place • SPSRB approves product to go operational • Operational Phase Begins (Aug. ’07) • Operational/backup processing capabilities reach ops status • Code transitions to operations (MiRS Release I) • SPSRB updates product metrics web pages • OSD updates Satellite Products database • Transitioned to Operations (Aug ’07 – Jun ’10) • N18, N19, Metop-A, F16, F18 • Also running in STAR: NPP/ATMS, AMSR-E, FY3 • Recently processing TRMM/TMI, Simulated GPM/GMI

  9. MiRS Timeline/History (2/3) • NPP ATMS Preliminary Design Review (Sept ’09) • Running with proxy data in daily processing at STAR • Identified Risks/Actions • NPP ATMS Critical Design Review (June ’10) • Provided technical data regarding overall system (operations, requirements, ATB, system design, architecture, quality assurance) • Identified Risks/Actions • QC DAP Delivered: Dec 2010 • Testing in Progress • Product Oversight Panel Meetings: • SPOP: S. Boukabara • PREPOP: F. Iturbide • ICAPOP: K. Garrett • Requirements for MiRS NPP/ATMS and NDE integration outlined in the “Research to Operations Project Level 1 Requirements Document for Microwave Integrated Retrieval System for NPOESS Preparatory Project” (Updated April 2010)

  10. MiRS in NDE Timeline (3/3) • Feb ’09: Integration into NDE work begins • Mar ’09: Initial version of MiRS integrated at NDE • May ’09: Official version of MiRS integrated • Jun ’09: Development with proxy data begins • Aug ’09: Updated version of MiRS implemented at NDE and STAR (full ATMS functionality) • Sep ’09: PDR for NPP ATMS • Oct ’09: Process multiple sample data granules from IDPS • Nov ’09 to Dec ’10: Enhanced MiRS development (advanced footprint matching, proxy data SDRs, QC DAP) • Jan ’10: QC for NPP Review • Jun ’10: CDR for NPP ATMS • Dec ’10: QC DAP Delivery • Jan ’11: TRR for NPP ATMS • Mar ’11: CUTR for NPP ATMS • Oct ’11: NPP scheduled for launch (was Jan ’11) • L+90: Official DAP delivery to NDE • Oct ’11 to L+90: Post-launch activities (calibration, validation, FG/bias/preclassifier, SRR (Jan ’12), etc.)

  11. Introduction • CDR REPORT • Software Architecture • Unit Test Readiness • Test Readiness of Individual Units: MiRS • Risks/Actions • Summary and Conclusions • Discussion

  12. Section 2 – CDR Report Presented by C. Grassotti • Review of CDR Action Items • TRR Entry Criteria • TRR Exit Criteria • CDR Summary

  13. CDR Action Items • Total of 7 Action Items • For each AI, a response was drafted (when available) describing the item, the action(s) (to be) taken, and status • Response sent to AI author(s) • At present: • 5 Items Closed • 2 Remain Open

  14. CDR Action Items

  15. Action Item # 1: MiRS Operations Manual Submitted by: A. K. Sharma Description: Determine who will be responsible for writing the Operations Manual, one of the required SPSRB documents. When will this document be delivered to OSPO? Lead Organization: STAR Response (From STAR): Members of STAR, OSPO, NDE attended the SPSRB Documentation Guidelines Meeting on 2 Nov 2010. Based on discussions the following guidance was obtained: • Total of 4 Documents to be generated: Algorithm Theoretical Basis Doc (ATBD), System Maintenance Manual (SMM), Internal Users Manual (optional, depending on need), External Users Manual • The Operations Manual is no longer a required document, as it provided redundant information already contained in the ESPC Operations Procedures documentation. However, information relevant to the operation of a specific science algorithm (installation, monitoring, etc.) will be contained in the SMM. • SMM contents to be based on “Document Objects” (section reuse across some documents); SPSRB document guidelines outline which organizations (STAR/OSPO/NDE) are responsible for each document object. http://projects.osd.noaa.gov/spsrb/doc/System_Maintenance_Manual_Guidelines.doc • STAR is lead organization for all docs up until delivery to operations, but responsibility for individual sections is shared between STAR/OSPO/NDE. After delivery, OSPO/NDE will manage docs • For MiRS, MS Word may be used, following SPSRB version 2 guidelines. May be ported to Foswiki upon SPSRB approval of the software. • Document delivery upon promotion of NPP to operational status Status: CLOSED

  16. Action Item # 2: Mitigation for missing or bad ancillary data Submitted by: A. K. Sharma Description: Explain the mitigation plan for missing or bad ancillary data (e.g. GFS required for the MIRS QC DAP) in NDE. Lead Organization: NDE/STAR Response (From STAR): The MiRS core retrieval algorithm does not rely on any ancillary data; however, the QC DAP will require a GFS forecast file for the Tier-2 QC. If the latest GFS forecast file is unavailable, then the NDE system will provide the latest available GFS file. The QC DAP software will have the ability to extract forecast fields valid for the periods around the observation (initial, 6-hr, 12-hr forecast, etc.), up to a 24-hr forecast. If the GFS data are more than 24 hours old (highly unlikely), then the QC DAP should not be executed due to uncertainty in the forecast fields. If the GFS data are available but are bad for given collocated ATMS Fields-of-View (FOVs), MiRS will set quality flags, and those FOVs will not be included when generating statistical metrics to monitor. If all GFS data are bad within a whole collocated NPP ATMS granule, processing will still commence, but QC metrics will be set to fill values (-999), signifying to the operator that the ancillary data, and not the retrieval, are bad. Status: CLOSED
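The GFS freshness check described in the response above lends itself to a simple script-level guard. The sketch below is illustrative only: the file and script names (including the qc_dap.bash entry point) are hypothetical, and the GFS file timestamp is used as a stand-in for the forecast reference time, which the real system would read from the GRIB metadata or the PCF.

```bash
#!/bin/bash
# Illustrative guard for the Tier-2 QC: skip the QC DAP if the GFS forecast is
# missing or more than 24 hours older than the granule time. All names are hypothetical.

GFS_FILE="$1"        # latest available GFS GRIB file provided by NDE
GRANULE_EPOCH="$2"   # ATMS granule observation time (seconds since epoch)
MAX_AGE_SEC=$((24 * 3600))

if [[ ! -s "${GFS_FILE}" ]]; then
    echo "GFS file missing or empty; QC DAP not executed" >&2
    exit 1
fi

# Use the file modification time as a proxy for the forecast reference time.
gfs_epoch=$(stat -c %Y "${GFS_FILE}")
age=$(( GRANULE_EPOCH - gfs_epoch ))

if (( age > MAX_AGE_SEC )); then
    echo "GFS forecast older than 24 hours; QC DAP not executed" >&2
    exit 1
fi

# Otherwise run the Tier-2 QC (entry point name is hypothetical). Bad FOVs are
# flagged downstream, and fully bad granules produce -999 fill values in the metrics.
exec ./qc_dap.bash "${GFS_FILE}" "${GRANULE_EPOCH}"
```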

  17. Action Item # 3: MiRS QC DAP for trend analysis and QC monitoring Submitted by: A. K. Sharma Description: Explain how the MIRS QC DAP will be useful for trend analysis for monitoring data quality. Lead Organization: STAR Response: The MiRS QC consists of several components. For trend analysis, the more useful metrics would be: QC flag monitoring, which looks at rates of QC=0 and QC=2 occurrence; channel NEDT monitoring, which looks at radiometric noise as a function of channel and time; and radiometric monitoring, which looks at differences between observed and simulated brightness temperatures as a function of channel, scan angle, and time. Time series of these metrics are produced and updated each day within the STAR environment. In the STAR environment, geophysical performance monitoring (part of so-called Tier-3) is also being done in daily processing along with the radiometric QC. If it is deemed feasible by OSPO and NDE, and added as a requirement, then this level of QC monitoring, Tier-3, may be included with the QC DAP. This would include geophysical performance time series of temperature and water vapor (bias and standard deviation compared to NWP) at selected atmospheric layers. Slides #75 and #81 of the MiRS NPP/ATMS CDR show time series examples of NEDT and radiometric bias monitoring, respectively. Many more examples may be found on the STAR MiRS website itself under "Products Monitoring: Data Quality", "Radiometric Performance: Calibration Bias Monitoring", "Sensors Quality: NEDT", and "Geophysical Performance: Performance Time Series". In the context of NDE, the time series for all QC monitoring will be constructed as granules are processed by MiRS and the QC DAP initiated, with the first point of the time series dependent on NDE data retention policies. Status: CLOSED. Geophysical trend monitoring capability at NDE will depend on decisions by OSPO/NDE on whether to implement Tier-3 QC at NDE and add it as a requirement (see AI #7)

  18. Action Item # 4: NDE and OSPO product tailoring responsibilities Submitted by: Geof Goodrum Description: NDE and OSPO to coordinate and determine product tailoring responsibility. Meet with NDE and ESPC architects to clarify concepts and answer the following: • What is the ESPC architecture approach to sharing and blending products? • Use NDE distribution or the shared SAN to hand off products to other OSPO systems? • What is the schedule for the ESPC “To Be” architecture? Lead Organization: NDE/OSPO Response: (1) Limin, Geof and Joel met and discussed the above questions. Data could be made available to ESPC through the NDE DDS, but eventually the data should be made available through the shared SAN in the ESPC “to be” architecture, for which no final decision has been made yet. The decision was expected in the Dec 2010 time frame, but there has been no final word so far. (2) Limin presented a briefing on the detailed tailoring capability that is provided by ESPC for the current POES/Metop/DMSP version of MIRS products, and discussed the costs and benefits of different tailoring scenarios that NDE could provide. The NDE team will finalize their responsibility and present their final tailoring solution to STAR/OSPO. (3) As for the blended products that need the MIRS ATMS products, since they involve ingesting products from multiple satellites, NDE will feed the MIRS ATMS products to ESPC to allow the heritage application to merge them with those from other satellites and continue to generate and distribute the blended products. Status: OPEN

  19. Action Item # 5: Risk Analysis and Probability Matrix Summary Submitted by: J. Silva Description: MIRS team to get the Risk Analysis Probability and Impact matrix template from Jim Silva to summarize project risks. Lead Organization: STAR Response: Obtained template for Risk Analysis/Probability Matrix and updated it to reflect the MiRS CDR Risk Analysis. This analysis has been added to the risks/actions section of the CDR slides. Status: CLOSED

  20. Action Item # 6: Data Submission Agreement for MiRS NPP/ATMS in CLASS Submitted by: T. Schott Description: OSPO to review the current ESPC/CLASS Submission Agreement (SA) with NDE to prepare for an updated SA. Lead Organization: OSPO Response: Limin checked with Phil Jones at NCDC, and confirmed that a new SA is needed for the NDE MIRS products. Options are: (1) the existing MIRS SA can be referenced and/or provided as an attachment; (2) write a completely new SA, starting by deleting all legacy references in the existing MiRS DSA. There is a question as to whether NCDC requires restarting the process at Step 1 with a “request for archive”. Given that archive data will be produced by NDE for the first 18 months after launch, and that NDE has information about how and in what form the products will be made available to CLASS, the SA will need to be coordinated between NDE and OSPO. OSPO can take the lead on this, but Limin needs to know whom she can work with on the NDE team on this matter. The action is recommended for closing, as the current SA has been reviewed. Status: CLOSED

  21. Action Item # 7: Assess feasibility of implementing Tier-3 QC processing within operations Submitted by: T. Schott and L. Zhao Description: OSPO to draft an RFA to assess the feasibility of implementing Tier-3 (science quality) QC into operations if it is to be included in the MiRS QC DAP. Lead Organization: OSPO Response: OSPO needs detailed information from STAR, regarding effort, cost, data resources, CPU and storage requirements, to determine if it is feasible to implement the Tier-3 QC monitoring into operations. STAR is available to discuss and provide additional information off-line on Tier-3 QC to OSPO. Status: OPEN

  22. CDR Action Items

  23. MiRS for NPP/ATMS TRR Entry Criteria • Entry # 1 – Review of CDR Report with Action Items, Responses, Status • Entry # 2 – Review of the TRR for MiRS NPP/ATMS in NDE • Software Architecture • Unit Test Readiness • Risks/Actions

  24. MiRS for NPP/ATMS TRR Exit Criteria • Exit # 1 – Test Readiness Review Report • TRR Report will be compiled and delivered after TRR • TRR Report to contain: • TRR Presentation • Actions • Comments

  25. CDR Report Summary • This CDR Report closes the CDR • Total of 7 Action Items: • At present, 5 of 7 items are closed, 2 remain open • 2 TRR Entry Criteria have been established • 1 TRR Exit Criterion has been established

  26. Introduction • CDR Report • SOFTWARE ARCHITECTURE • Unit Test Readiness • Test Readiness of Individual Units: MiRS • Risks/Actions • Summary and Conclusions • Discussion

  27. Section 3 – Software Architecture Presented by C. Grassotti

  28. Software Architecture: MiRS Background • MiRS software already tested, implemented for OSPO (N18, N19, Metop, F16, F18) • High code maturity • A risk reduction for NPP/ATMS • Code unit testing was also done during MiRS development for POES and DMSP before delivery to OSPO • Also running daily for AMSR-E and TRMM/TMI • Implemented for NPP/ATMS within STAR • Runs daily using proxy data • Installed and tested using sample and proxy data at NDE • TBD: integration with “live” data flow, daily processing, retention of multiple days/granules for QC testing. • Modular design facilitates unit testing down to each major processing unit of the system-level data flow (flow diagram to follow) • QC DAP components treated separately. MiRS retrieval codes are independent from the QC, but the QC assumes that all MiRS products have been generated

  29. Software Architecture: Standards Compliance (1/3) • ISO Fortran 95 standards adherence is assessed using the Forcheck utility • Manual verification that SPSRB F95 coding standards and guidelines are adhered to as well • http://projects.osd.noaa.gov/spsrb/standards_software_coding.htm • Standards upheld: • Use Implicit None in all modules and main programs. • Well documented: all modules/main programs/subroutines/functions have a header containing: • Description of the code • Modules used • Subroutines or functions contained • Data types included • Intrinsic functions used • Argument description (input/output) • Code history • In the code, major sections have their own comments describing what that section of the code is doing. • No hard-coded numbers. All constants are defined in module Const.

  30. Software Architecture: Standards Compliance (2/3) • Standards upheld (continued) • All variables are initialized. • Free-format syntax is used, but the maximum line length is kept below 128 characters (g95 will cut off any line longer than 128 characters). • No tab characters in any Fortran code. • No GOTO statements in any code. • All unused variables are removed. • Use private to ensure proper encapsulation. • No conditional statements using == or /= with floating-point values. • No numbered DO loops; labeled DO loops are used. • All intrinsic functions are declared before use.

  31. Software Architecture: Standards Compliance (3/3) • Use valgrind to help detect any memory leaks. Valgrind is a Linux program used to “detect many memory management and threading bugs, avoiding hours of frustrating bug-hunting, making your programs more stable”. Valgrind detailed information and usage: http://www.valgrind.org/ • Since the last version, there are no memory leaks in MiRS (v7) • All dynamically allocated memory is deallocated when it is no longer used. • No array indices out of bounds; an out-of-bounds array index will usually cause a “segmentation fault”. • All code changes are managed by Subversion, a source version control system • Full complement of code review results related to coding standards to be provided in the Code Unit Test Review (March 2011)
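As a concrete illustration of the memory-leak checks mentioned above, a MiRS Fortran executable can be run under valgrind on the Linux servers. The executable and namelist names below are placeholders; the valgrind options shown are standard ones.

```bash
# Hypothetical example: check one MiRS processing executable for leaks.
# --leak-check=full reports individual leaks; --error-exitcode=1 makes the
# wrapper fail if valgrind detects any memory errors.
valgrind --leak-check=full --error-exitcode=1 \
    ./mirs_1dvar < namelist_1dvar.in > valgrind_1dvar.log 2>&1
echo "valgrind exit status: $?"
```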

  32. Software Architecture: Resource Requirements • Hardware Requirements (table not reproduced in this transcript; one item is noted as not necessary for standard outputs but required for the QC DAP) • Software Requirements (table not reproduced in this transcript) • *This is the minimum required. The number of CPUs will affect the speed of execution; 8 CPUs are used at STAR to run three sensors daily.

  33. Software Architecture: Overview • The software architecture describes the structure of the system software elements and the external and internal data flows between software elements. • The MiRS software architecture is described in the MiRS System Description Document created for POES and DMSP and delivered to OSPO. Updated documentation will be delivered by STAR as specified in the CDR Report. • 3 layers of design will be presented (STAR EPL Guidelines): • Context Layer (0): External Interfaces and System Layer (1): Flow Between Units, covered in Section 3 (this section) • Unit Layer (2): Flow Within Units, covered in Section 4 (next section)

  34. The Context-Layer: STAR EPL Guidelines • The Context-Layer describes the flows between the system and its external interfaces • An external input is defined as a data source needed by the system that is produced or made available by a process external to the system • An external output is defined as a data sink that is produced by the system for an external user • External interfaces must meet standard criteria to be included in the system architecture

  35. MiRS Context Layer: External Interfaces (diagram) • The diagram shows MiRS/QC product generation within the NDE DHS boundary and its external interfaces. Elements include: the NDE Production Manager (system configurations, process requirements, product rule sets, product generation specifications, invocation of MiRS/QC generation, working directory, PCF as MiRS input, PSF as MiRS output, return code); the IDPS (input HDF5 files); ESPC (GFS, ECMWF and GDAS GRIB files; NWP data are not necessary for core products, QC only); the SAN data areas and NDE Distribution Server (DDS) for output product files; and the forensics/DAP repository.

  36. External Interfaces: Assumptions • Tailoring and (re)formatting done by NDE and OSPO (Context Layer) • Interactions with operational users and CLASS handled by NDE and OSPO (Context Layer) • No incompatibility between NDE and OSPO ESPC environment • Single version of MiRS to maintain

  37. MiRS External Interfaces • File format requirements for NPP ATMS • ATMS level 1b granules/geo formatted in HDF5 • PCF ASCII file generated by NDE DHS • MiRS product outputs formatted in netCDF4 (CF conventions) • MiRS output PSF ASCII file listing output files • MiRS readers and encoders support these formats and have been tested in NDE • Metadata: • Current MiRS Collection Level metadata available in ISO 19115 at CLASS (for POES/Metop/DMSP) • Expectation is for a similar Collection Level file for MIRS NPP ATMS products to be stored at CLASS • Metadata requirements for MiRS NPP ATMS will be outlined by an updated Submission Agreement in the future • MiRS NPP ATMS Granule Level metadata to be contained inside the MiRS netCDF4 output header (following the STAR metadata template)

  38. External Interfaces: Status • All external interfaces have been reviewed and approved; no changes since CDR • No open actions on external interfaces

  39. The System-Layer: STAR EPL Guidelines • The System-Layer data flow expands upon the Context-Layer data flow, showing the first layer of decomposition. • In addition to the System-Layer inputs and outputs, the major processing units are shown along with their inputs and outputs. • Each unit is designed as a stand-alone program for ease of testing and integration into a System-Layer scheduler.

  40. The MiRS System Layer: Processing Units • Each major step in the MiRS processing sequence is implemented as a stand-alone bash script that drives a corresponding Fortran 95 executable with a namelist file, and constitutes a Layer-2 Test Unit

  41. The MiRS System-Layer: npp_scs.bash • Some or all units in the system layer may be invoked by the top-level driver script npp_scs.bash (in operations all 9 units run from the same invocation of the driver script). • The system layer is where NDE will invoke the MiRS software units. • Each unit is a bash script function that drives a low-level Fortran processing program. • When the system’s input data are available (ATMS granule), the NDE PGM will run the top-level driver script. • Execute npp_scs.bash passing the “working directory” path as the argument • The NDE PGM will generate a Process Configuration File (PCF), which contains all input file locations and parameters required for processing and is read in by the driver script. • All code units process sequentially, one ATMS granule at a time. • The NDE DHS must be able to run multiple instances of these units to process concurrently available granules. • Each instance will produce a PSF file containing a list of output product files, if they were created successfully. • In STAR, the driver script is invoked through crontab and the PCF variables are specific to the STAR environment.
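The driver pattern described on this slide can be summarized with a short sketch. This is not the delivered npp_scs.bash: the PCF/PSF file names, the sourcing of the PCF as KEY=VALUE pairs, and the per-unit wrapper names are assumptions; the nine unit names follow the system-layer flow diagram on the next slide.

```bash
#!/bin/bash
# Minimal sketch of the system-layer driver, assuming a working directory argument,
# a sourceable PCF, and one wrapper script per processing unit (names hypothetical).

WORKDIR="$1"                      # working directory path passed by the NDE PGM
PCF="${WORKDIR}/npp_atms.pcf"     # Process Configuration File written by the PGM
PSF="${WORKDIR}/npp_atms.psf"     # Process Status File listing created products
LOG="${WORKDIR}/npp_scs.log"

source "${PCF}"                   # assumed KEY=VALUE pairs with input file locations
: > "${PSF}"

run_unit () {
    # Each unit is a bash wrapper driving a Fortran 95 executable and namelist.
    local unit="$1"
    echo "Running ${unit}" >> "${LOG}"
    "./${unit}.bash" "${WORKDIR}" >> "${LOG}" 2>&1
}

# One ATMS granule is processed through all nine units, sequentially.
for unit in rdr2tdr tdr2sdr fm chopp applyRegress 1dvar mergeEdr vipp mirs2nc; do
    run_unit "${unit}" || exit 1  # non-zero return value is passed back to the PGM
done

# Record the netCDF4 products in the PSF if they were created successfully.
for product in "${WORKDIR}"/SND_*.nc "${WORKDIR}"/IMG_*.nc; do
    [[ -s "${product}" ]] && echo "${product}" >> "${PSF}"
done
exit 0
```

In the STAR environment the same script would be started from crontab with STAR-specific PCF variables, whereas in NDE the PGM supplies the working directory and PCF for each available granule.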

  42. MiRS System-Layer Process Flow: NDE Environment (diagram) • The NDE PGM provides the PCF and the local processing (working) directories; npp_scs.bash writes a log file and returns a value to the PGM. The nine processing units run in sequence: • rdr2tdr: L1B sensor data (HDF5) to TDRs • tdr2sdr: TDRs to SDRs • fm: SDRs to FMSDRs • chopp: FMSDRs to chopped FMSDRs • applyRegress: chopped FMSDRs to regression retrievals • 1dvar: chopped FMSDRs to EDRs • mergeEdr: EDRs to merged EDR • vipp: EDRs plus ancillary data to DEPs • mirs2nc: EDRs and DEPs to SND (netCDF4 EDR) and IMG (netCDF4 DEP) files • Output products are written to the NDE SAN and listed in the Process Status File (PSF).

  43. System Layer Architecture: Status • All System Layer components have been reviewed and approved; no changes since CDR • No open actions on system layer components

  44. Introduction • CDR Report • Software Architecture • UNIT TEST READINESS • Test Readiness of Individual Units: MiRS • Risks/Actions • Summary and Conclusions • Discussion

  45. Section 4 – Unit Test Readiness Presented by K. Garrett (MiRS DAP) and C. Grassotti (QC DAP)

  46. The Unit-Layer: STAR EPL Guidelines • The Unit-Layer data flow expands upon the System-Layer data flow, showing the second layer of decomposition. • This layer shows the detailed data flow into and out of each software unit. • And, it shows data flow within the software units (if applicable).

  47. Test Readiness: Environment and Configuration • For all unit tests the testing environment will be both the STAR Linux servers and the NDE computational systems • STAR Environment: Full end-to-end testing of MiRS DAP (RDRs to SND/IMG netCDF4) and QC DAP using proxy data for ~1 day of data and sample data using 1 granule • NDE Environment: • MiRS DAP: Testing of ~1 day of proxy data independent of DHS (minimum); testing of ~1 day of proxy data interfaced with DHS; 1 granule of sample data independent of DHS • QC DAP: Testing of ~1 day of proxy data independent of DHS • For testing in STAR, the configuration is the same as that used in the daily processing for all other sensors (e.g. f95, IDL, and bash scripts) • For testing in NDE, the configuration is identical, except for high-level process management (interaction with DHS). Output files for all processing steps to be compared against benchmark data produced in the STAR test.

  48. Test Readiness: Test Data and Truth Data • Test data will include: • Proxy data generated using MIT/LL software from Metop-A (1 day 2011-01-15) • Proxy data generated using GRAVITE (1 day 2011-01-15) • Sample data generated by IDPS (1 granule TBD) • All input test data in HDF5 format following NPOESS Common Data Format Control Book (vol. III) • Truth data: NWP analysis from GDAS as well as EDRs and DEPs obtained from the operational sensors valid at the same time and location (i.e. not all parameters available for validation in NWP data; e.g. sea ice)

  49. All Units: Test Method, Sequence • The test method is to run using ATMS proxy data in stand-alone mode (single execution of bash script and f95/IDL code) with all the required input/output files defined at the script level. • For many steps the upstream input data files are simply the output generated by the previous required Layer-2 MiRS processing unit (e.g. fm → fmsdr2edr) • Confirmation of a successful test will be determined by verifying creation of intermediate and output data (comparison with benchmark files), using log files, and/or diagnostic graphical tools (IDL) available in the STAR and/or NDE environment (maps, error plots, etc.)
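One way to automate the benchmark comparison mentioned above is a simple file-by-file check of a unit's intermediate and output files against the STAR reference run. The directory layout and the byte-identical pass criterion below are assumptions; in practice, small numerical differences may instead be examined with the IDL diagnostic tools (maps, error plots) or the log files.

```bash
#!/bin/bash
# Illustrative benchmark comparison for one unit test (directory names hypothetical).

TEST_DIR="$1"     # outputs produced by the unit under test (STAR or NDE environment)
BENCH_DIR="$2"    # benchmark outputs from the STAR reference run
status=0

for bench in "${BENCH_DIR}"/*; do
    f=$(basename "${bench}")
    if [[ ! -f "${TEST_DIR}/${f}" ]]; then
        echo "MISSING  ${f}"
        status=1
    elif cmp -s "${bench}" "${TEST_DIR}/${f}"; then
        echo "MATCH    ${f}"
    else
        echo "DIFFERS  ${f}  (inspect with IDL diagnostics or log files)"
        status=1
    fi
done
exit ${status}
```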

  50. Unit Tests • The following will be presented for each of the 9 units: • Unit Description: Purpose and Function • Process Flow • Test Items • Input/Output Data • Test Risks
