
Microwave Integrated Retrieval System for NPOESS Preparatory Project: MiRS NPP/ATMS Integration into NDE Code Unit Test


Presentation Transcript


  1. Microwave Integrated Retrieval System for NPOESS Preparatory Project: MiRS NPP/ATMS Integration into NDE Code Unit Test Review. May 26, 2011. Prepared By: Kevin Garrett 1, Chris Grassotti 1, Sid-Ahmed Boukabara 2, Flavio Iturbide-Sanchez 1, Wanchun Chen 3, Leslie Moy 1. 1 I.M. Systems Group, 2 NOAA/NESDIS/STAR, 3 Dell

  2. Review Agenda 1. Introduction 1:30 – 1:40 K. Garrett 2. TRR Report 1:40 – 2:00 C. Grassotti 3. Unit Test Plan 2:00 – 2:10 C. Grassotti 4. Code Unit Tests 2:10 – 3:20 (Context and System Layer, C. Grassotti; MiRS DAP Unit Tests, C. Grassotti; QC DAP Unit Tests, K. Garrett) 5. System Test Plan 3:20 – 3:25 K. Garrett 6. Risks/Actions 3:25 – 3:35 K. Garrett 7. Summary and Conclusions 3:35 – 3:40 K. Garrett Discussion 3:40 – 4:00 All

  3. INTRODUCTION • TRR Report • Unit Test Plan • Code Unit Tests • Code Unit Tests of Individual Units: MiRS DAP and QC DAP • System Test Plan • Risks/Actions • Summary and Conclusions • Discussion

  4. Section 1 – Introduction Presented by K. Garrett

  5. Project Objectives • Technical Objective • Adaptation of MiRS to NPP ATMS and integration within NDE • Science Objectives • Improved temperature and moisture profile retrievals • The extension of the retrieved products to non-standard surfaces including sea-ice and snow-covered land • The retrieval in all-weather conditions including cloudy and precipitating conditions • An improved set of retrieved surface properties whose derivation is based on the retrieved emissivities instead of directly from the brightness temperatures

  6. CUTR Objectives • Objectives of the Code Unit Test Review • Goal #1: Gather all MiRS stakeholders to review the overall system integration of MiRS into the NPOESS Data Exploitation (NDE) environment • Goal #2: Review of MiRS Software Code Unit Tests • Goal #3: Review of MiRS System Test Plan • Goal #4: Review of TRR action items and actions taken • Goal #5: Identify new or outstanding risks w/mitigation strategies • Follow the STAR EPL Guidelines for CUTR

  7. MiRS Stakeholders • Development Team • S.-A. Boukabara, K. Garrett, F. Iturbide-Sanchez, C. Grassotti, W. Chen, L. Moy • OSPO Partners • L. Zhao, J. Zhao, T. Conrad • NDE Partners • P. MacHarrie, L. Fenichel, D. Powell, J. Silva, G. Goodrum • MiRS Oversight Board • F. Weng (chair), R. Ferraro (STAR), L. Zhao (OSPO), J. Silva (NDE), T. Schott (OSD) • Oversight Panels • SPOP, PREPOP, ICAPOP, LSPOP • MiRS Users • Dan Pisut (NOAA EVL), Tony Reale (STAR), Joe Turk (JPL), Ben Ruston (NRL), Sheldon Kusselson (SAB), Stan Kidder (CIRA), Kevin Schrab and Andy Adman (NWS), Denise Hobson (AFWA), M. Kim (JCSDA), G. Huffman and W. McCarty (NASA/GSFC), J. Tesmar (FNMOC), P. Wang (Taiwan Weather Bureau), J. Janowiak (UMD), Paul Field (UKMET), K. Okamoto (JMA), M. V. Engel (IAO SB), B. Lambrigtsen (JPL), Peiming Dong, Qui Hong and Hu Yang (CMA), Universities (Graduate School of Chinese Academy of Sciences, Università degli Studi di Roma), Franklin Robertson and Clay Blankenship (NASA/MSFC), Tom Auligne (NCAR), D. Vila (CPTEC, Brazil), W. Han (Chinese Met. Admin.), D. Cimini (IMAA/CNR), M. Itkin (MPI-M, Hamburg), T. Greenwald (SSEC), + several new users since TRR (Jan 2011)

  8. MiRS in NDE Timeline (chronological): Feb '09 Integration into NDE work begins; Mar '09 Initial version of MiRS integrated at NDE; May '09 Official version of MiRS integrated; Jun '09 Development with proxy data begins; Aug '09 Updated version of MiRS implemented at NDE and STAR (full ATMS functionality); Sep '09 PDR for NPP ATMS; Oct '09 Process multiple sample data granules from IDPS; Nov '09 – Dec '10 Enhanced MiRS development (advanced footprint matching, proxy data SDRs, QC DAP); Jan '10 QC for NPP Review; Jun '10 CDR for NPP ATMS; Dec '10 QC DAP Initial Delivery; Jan '11 TRR for NPP ATMS; May '11 CUTR for NPP ATMS; Oct '11 NPP scheduled for launch (was Jan '11); L+90 Official DAP delivery to NDE; Oct '11 – L+90 Post-launch activities: calibration, validation, FG/bias/preclassifier, SRR (Jan '12), etc.

  9. Introduction • TRR REPORT • Unit Test Plan • Code Unit Tests • Code Unit Tests of Individual Units: MiRS DAP and QC DAP • System Test Plan • Risks/Actions • Summary and Conclusions • Discussion

  10. Section 2 – TRR Report Presented by C. Grassotti • Review of TRR Action Items • CUTR Entry Criteria • CUTR Exit Criteria • TRR Summary

  11. TRR Action Items • Total of 5 Action Items • For each AI, a response was drafted (when available) describing the item, the action(s) taken or to be taken, and the status • Response sent to AI author(s) • At present: • 4 Items Closed • 1 Item Remains Open

  12. TRR Action Items

  13. Action Item # 1: Include Test Procedures in Code Unit Test Review Submitted by: G. Goodrum Description: STAR to include test sequences and procedures at the Code Unit Test Review. This will encapsulate testing of MiRS individual code units and verification of corresponding input/output. Lead Organization: STAR Response: These actions to be implemented in this CUTR Status: OPEN (expected to close after the CUTR)

  14. Action Item # 2: Request Footprint Matching Code From IPO Submitted by: STAR Description: Pursue footprint matching (FM) code for NPP/ATMS from the IPO through T. Schott. The FM code optimizes the spatial resolution and sampling characteristics of ATMS SDRs, taking advantage of the ATMS sensor oversampling at many frequencies. This addresses Risk #3 from the MiRS CDR. Lead Organization: STAR/OSD Response: T. Schott submitted a formal request for the FM code. Code has been received. Code is in MATLAB (no documentation provided), and only computes the Backus-Gilbert weighting coefficients. Following up with MIT to inquire about code that applies the coefficients/FM code. Status: CLOSED

  15. Action Item # 3: Meet with OSPO to Discuss MiRS Tier-3 QC Feasibility Submitted by: L. Zhao Description: MiRS IPT to meet with OSPO (L. Zhao) to discuss MiRS Tier-3 QC feasibility, resource requirements, etc. This relates to CDR AI #7, which requires OSPO to draft an RFA for implementing Tier-3 QC into operations, if deemed feasible. Lead Organization: STAR/OSPO Response: MiRS IPT met with OSPO on 16 Feb 2011 to discuss Tier-3 feasibility. (Tier-3: geophysical monitoring) Status: CLOSED

  16. Action Item # 4: Meet with NDE to Discuss MiRS QC DAP Implementation Submitted by: D. Powell Description: MiRS IPT to meet with NDE (D. Powell) to discuss the MiRS QC DAP implementation and integration plan, and capabilities. Outcome to determine the utility of the various monitoring tools of the QC DAP. Lead Organization: STAR/NDE Response: MiRS IPT (K. Garrett) met with NDE to outline capabilities and requirements of the QC DAP within the NDE system Status: CLOSED

  17. Action Item # 5: Assessment of MSPPS Algorithms With ATMS Proxy Data Submitted by: R. Ferraro Description: Apply the MSPPS algorithms (primarily rain rate and TPW) in MiRS to ATMS proxy data to assess the performance of the products and their applicability to the ATMS sensor. Lead Organization: STAR Response: This AI was based on a miscommunication and was later withdrawn Status: CLOSED

  18. TRR Action Items

  19. MiRS for NPP/ATMS CUTR Entry Criteria • Entry # 1 – Review of TRR Report with Action Items, Responses, Status • Entry # 2 – Review of the CUTR for MiRS NPP/ATMS in NDE • Unit Test Plan • Unit Test Results • System Test Plan • Risks/Actions

  20. MiRS for NPP/ATMS CUTR Exit Criteria • Exit # 1 – Code Unit Test Review Report • CUTR Report will be compiled and delivered after CUTR • CUTR Report to contain: • CUTR Presentation • Actions • Comments

  21. TRR Report Summary • This TRR Report closes the TRR • Total of 5 Action Items: • At present, 4 of 5 items are closed, 1 remains open • 2 CUTR Entry Criteria have been established • 1 CUTR Exit Criterion has been established

  22. Introduction • TRR Report • UNIT TEST PLAN • Unit Test Readiness • Code Unit Tests of Individual Units: MiRS DAP and QC DAP • System Test Plan • Risks/Actions • Summary and Conclusions • Discussion

  23. Section 3 – Unit Test Plan Presented by C. Grassotti

  24. Unit Test Plan: Environment and Configuration • For all MiRS DAP unit tests the testing environment will be both the STAR Linux servers and the NDE AIX systems • STAR Environment: Full end-to-end test (RDRs to SND/IMG netCDF4) using the Platinum-72 (P72 - proxy data in correct format) data set (single and multiple granules); selected visualization. STAR environment configured as expected in NDE. • NDE Environment: Testing of several granules of P72 data independent of the DHS • For all MiRS QC DAP unit tests the testing environment will be the STAR Linux servers • STAR Environment: Full end-to-end test (collocation with GFS, Tier-1, Tier-2 QC, e-mail alert and time series). STAR environment is configured as expected in NDE. • For testing in STAR, configuration is the same as that used in the daily processing for all other sensors (e.g. f95, C, IDL, and bash scripts) • For testing in NDE, configuration is identical. • Most detailed results here will be shown from tests in the STAR environment • However, final SND/IMG output files from all processing steps to be compared against benchmark data produced in the STAR test.

  25. Unit Test Plan: Configuration in STAR Environment • Top level directory listing of MiRS DAP: • To create binary executables: cd src; make • Binary executables • Ancillary data • Documentation • README • Bash scripts • PCFs • Source code

  26. Unit Test Plan: Data Test data will include: • Input data: • P72 proxy data generated from IDPS (currently 1 orbit, 2010-09-06) • GFS forecast data (2002-09-06) time shifted to match P72 • Intermediate and Output data: • Data produced by MiRS processing units • Image data: • STAR: Image maps of select products (intermediate/final) to confirm successful test • Benchmark data: • STAR/NDE: summary statistics (npoints, mean, variance, min, max) of select EDRs from final output SND/IMG files created in both STAR and NDE environments to be compared
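A minimal bash sketch of the benchmark comparison described above, assuming the summary statistics (npoints, mean, variance, min, max) are dumped to plain-text files; the file names and tolerance handling are hypothetical, not part of the MiRS DAP:

    #!/bin/bash
    # Compare per-EDR summary statistics produced in the STAR and NDE
    # environments. File names below are invented for illustration.
    star_stats="stats_snd_img_star.txt"
    nde_stats="stats_snd_img_nde.txt"

    if diff -q "$star_stats" "$nde_stats" > /dev/null; then
        echo "PASS: STAR and NDE summary statistics match"
    else
        echo "CHECK: statistics differ (small numerical differences"
        echo "       across compilers/architectures may be acceptable):"
        diff "$star_stats" "$nde_stats"
    fi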

  27. Unit Test Plan: Test Method, Sequence • The test method is to run using proxy data in stand-alone mode (single execution of bash script and f95/IDL code) with all the required input/output files defined at the script level. • For many steps the upstream input data files are simply the output generated by the previous required Layer 2 MiRS processing unit (e.g. fm→fmsdr2edr) • Confirmation of a successful test will be determined by: • verifying creation of intermediate and output data (file listings) • contents of the standard output file (accumulated from scripts, f90 code, etc.) • MiRS DAP: statistics of intermediate files; comparison with benchmark files produced in both STAR and NDE (final IMG/SND products) • QC DAP: examination of NEDT, QC, and BIAS (radiometric) files and alert files, time series PNGs • log files and PSF • diagnostic graphical tools (IDL) available in the STAR environment • To the extent that the top level scripts and lower level codes are integrated with one another, these unit tests also constitute a test of the overall software system readiness (cf. System Test Plan)

  28. Unit Test Plan:Test Procedures for MiRS DAP • Go to /home/pub/chrisg/mirs_working_npp_nde/src • Type ‘make’ • Confirm compilation without errors and all binaries in bin/ • Create working directory in /home/pub/chrisg • mkdir wd_npp_p72_01 (for example) • Copy PCF and input data to working directory • Copy setup/npp_pcf.bash • Copy ATMS granule (for example): SATMS_npp_d20100906_t0435407_e0436121_b00003_c20110321195442103526_noaa_ops.h5 and GATMO_npp_d20100906_t0435407_e0436121_b00003_c20110321195442103526_noaa_ops.h5 • Run driver script scripts/npp_scs_nde.bash <working_directory> • Run with one step turned on in the PCF at a time to test each individual unit. • Verify output from each step was created successfully • Files exist • Statistics of data in file • Images (if applicable)
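The steps on slide 28 can be consolidated into a single bash sketch; the paths, PCF name, and granule file names are taken verbatim from the slide, while the staging location of the P72 granules (P72_DIR) is an assumption:

    #!/bin/bash
    # Sketch of the MiRS DAP unit test procedure (slide 28).
    set -e
    cd /home/pub/chrisg/mirs_working_npp_nde/src
    make                      # confirm compilation without errors
    ls ../bin/                # confirm all binaries are present

    cd ..
    wd=/home/pub/chrisg/wd_npp_p72_01
    mkdir -p "$wd"            # create the working directory

    cp setup/npp_pcf.bash "$wd"/   # copy the PCF
    # copy the ATMS granule pair; P72_DIR is an assumed staging area
    cp "${P72_DIR:?}/SATMS_npp_d20100906_t0435407_e0436121_b00003_c20110321195442103526_noaa_ops.h5" "$wd"/
    cp "${P72_DIR:?}/GATMO_npp_d20100906_t0435407_e0436121_b00003_c20110321195442103526_noaa_ops.h5" "$wd"/

    # run the driver with one PCF step turned on at a time, then verify
    scripts/npp_scs_nde.bash "$wd"
    ls -l "$wd"               # files exist? check stats/images next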

  29. Introduction • TRR Report • Unit Test Plan • CODE UNIT TESTS • Code Unit Tests of Individual Units: MiRS DAP and QC DAP • System Test Plan • Risks/Actions • Summary and Conclusions • Discussion

  30. Section 4 – Code Unit Tests Presented by: C. Grassotti – Context and System Layer; C. Grassotti – MiRS DAP; K. Garrett – QC DAP

  31. Software Architecture:Overview • The software architecture describes the structure of the system software elements and the external and internal data flows between software elements. • 3 Layers of design (STAR EPL Guidelines): • Context Layer - 0: External Interfaces • System Layer - 1: Flow Between Units • Unit Layer - 2: Flow Within Units

  32. MiRS Context Layer: External Interfaces • [Context-layer diagram; the recoverable content is summarized here.] Within the NDE DHS boundary, the NDE Production Manager / Product Generation Manager invokes the MiRS/QC system with a working directory and a PCF (MiRS input), and receives back a return code and a PSF (MiRS output) listing the product files. Configurations, process requirements, product generation rule sets, and specifications come from the DAP repository and forensics data areas. Input files (HDF5 from IDPS/ESPC; GRIB from GFS, GDAS, and ECMWF, with NWP data not necessary for core products, QC only) are staged on the SAN, and output product files flow to the NDE Distribution Server (DDS).

  33. MiRS External Interfaces • File format requirements for NPP ATMS • ATMS level 1b granules/geo formatted in HDF5 • PCF ASCII file generated by the NDE DHS • MiRS product outputs formatted in netCDF4 (CF conventions) • MiRS output PSF ASCII file listing output files • MiRS readers and encoders support these formats and have been tested in NDE • Metadata: • Current MiRS Collection Level metadata available in ISO 19115 at CLASS (for POES/Metop/DMSP) • Expectation is for a similar Collection Level file for MiRS NPP ATMS products to be stored at CLASS • Metadata requirements for MiRS NPP ATMS will be outlined by an updated Submission Agreement in the future • MiRS NPP ATMS Granule Level metadata to be contained inside the MiRS netCDF4 output header (following the STAR metadata template finalized 5/18/2011)
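Since the PSF is an ASCII file listing the output product files, a downstream check can verify each listed product. This is an illustrative sketch only, assuming one file path per line (the exact PSF layout is not shown in this transcript):

    #!/bin/bash
    # Verify that every product listed in the PSF exists and is non-empty.
    psf="$1"   # path to the PSF written by the MiRS run
    status=0
    while read -r product; do
        [ -z "$product" ] && continue
        if [ -s "$product" ]; then
            echo "OK:      $product"
        else
            echo "MISSING: $product"
            status=1
        fi
    done < "$psf"
    exit $status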

  34. MiRS System Layer: Processing Units • Each major step in the MiRS processing sequence is a stand-alone bash script with a corresponding Fortran 95 executable and namelist file, and constitutes a Layer-2 Test Unit

  35. The MiRS System-Layer: npp_scs_nde.bash • Some or all units in the system layer may be invoked by the top level driver script npp_scs_nde.bash (in operations all 9 units run from the same invocation of the driver script). • The system layer is where NDE will invoke the MiRS software units. • Each unit is a bash script function that drives a low-level Fortran processing program. • When the system's input data is available (ATMS granule), the NDE PGM will run the top level driver script. • Execute npp_scs_nde.bash passing the "working directory" path as the argument • The NDE PGM will generate a Process Configuration File (PCF) which contains all input file locations and parameters required for processing, and is read in by the driver script. • All code units process sequentially, one ATMS granule at a time. • The NDE DHS must be able to run multiple instances of these units to process concurrently available granules. • Each instance will produce a PSF file containing a list of output product files if they were created successfully. • In STAR, the driver script may be invoked through crontab or run manually, and the PCF variables are specific to the STAR environment.
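A hypothetical skeleton of the control flow described above; this is not the actual npp_scs_nde.bash, and the RUN_* switch names and bin/ invocation are assumptions:

    #!/bin/bash
    # Sketch: working directory as sole argument, PCF sourced for
    # switches and paths, units run in sequence, status returned to PGM.
    wd="$1"
    cd "$wd" || exit 1
    . ./npp_pcf.bash                  # PCF as bash-sourceable variables

    for unit in rdr2tdr tdr2sdr fm chopp applyRegress \
                fmsdr2edr mergeEdr vipp mirs2nc; do
        switch="RUN_${unit}"          # assumed switch-naming scheme
        [ "${!switch}" != "1" ] && continue
        echo "$(date): running ${unit}" >> mirs.log
        # in the real driver each unit is a bash function wrapping a
        # Fortran 95 executable and its namelist; placeholder call here
        "${MIRS_ROOT:?}/bin/${unit}" >> mirs.log 2>&1 || exit 1
    done
    # (PSF generation omitted from this sketch)
    exit 0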

  36. MiRS System-Layer Process Flow: NDE Environment • [Process-flow diagram; the recoverable content is summarized here.] The PGM supplies the PCF, creates the local processing directories (working directory) on the NDE SAN, and invokes npp_scs_nde.bash, which writes a log file and a Process Status File (PSF) and passes a return value back to the PGM. Data flow through the nine units in sequence: L1B sensor data (HDF5) → rdr2tdr → TDRs → tdr2sdr → SDRs → fm → FMSDRs → chopp → chopped FMSDRs → applyRegress (regression retrieval) → fmsdr2edr → EDRs → mergeEdr → merged EDR (+ ancillary data) → vipp → DEPs → mirs2nc → SND (netCDF4 EDR) and IMG (netCDF4 DEP).

  37. Unit Test Results: MiRS DAP

  38. MiRS Unit Tests: STAR Environment • Run on standard Linux machine (e.g. orbit272L) • Codes precompiled using Linux compilers and libraries (ifort, HDF5, netCDF4) • Required: • Working directory (in NDE this will be created at run time by the PGM): this is where all file input/output takes place • Working directory contents: • Single ATMS granule: 2 HDF5 files containing geolocation (GATMO_*.h5) and radiometric data (SATMS_*.h5) • PCF: contains directory and variable specifications, and flags that control execution of the MiRS script (npp_scs_nde.bash): which steps to run, etc.

  39. MiRS Unit Tests: STAR Environment • In operations all processing steps will be run in the same invocation of npp_scs_nde.bash • For unit tests the script will be run multiple times with only the step for the unit under test turned on • Advantage: tests both the underlying processing software and the top level invocation scripts • Working directory is the sole argument to the bash script • Unit tests will be performed with P72 proxy data valid on 2010-09-06
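The one-step-at-a-time method can be automated along these lines; the RUN_* switch names in the PCF are an assumed naming scheme, and the script is illustrative rather than part of the DAP:

    #!/bin/bash
    # Run each Layer-2 unit in isolation by toggling PCF switches.
    wd=/home/pub/chrisg/wd_npp_p72_01
    units="rdr2tdr tdr2sdr fm chopp applyRegress fmsdr2edr mergeEdr vipp mirs2nc"

    for u in $units; do
        for v in $units; do          # turn every switch off...
            sed -i "s/^RUN_${v}=.*/RUN_${v}=0/" "$wd/npp_pcf.bash"
        done
        sed -i "s/^RUN_${u}=.*/RUN_${u}=1/" "$wd/npp_pcf.bash"   # ...then one on

        echo "=== unit test: $u ==="
        scripts/npp_scs_nde.bash "$wd"
        ls -l "$wd"                  # confirm the unit's outputs were created
    done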

  40. MiRS Unit Tests: STAR Environment • Example showing initial listing of the working directory before first invocation of the bash script • Test ATMS granule: SATMS_npp_d20100906_t0435407_e0436121_b00003_c20110321195442103526_noaa_ops.h5 (radiometric file) and GATMO_npp_d20100906_t0435407_e0436121_b00003_c20110321195442103526_noaa_ops.h5 (geolocation file), plus the PCF • At NDE, the working directory and its contents are created by the DHS each time a new granule is available for processing • File naming and format convention for simulated data identical to real data

  41. MiRS Unit Tests: STAR Environment • Example showing process switches portion of PCF: • Steps turned on

  42. MiRS Unit Tests: STAR Environment • Example showing paths portion of PCF: • Working directory • Root path (location of MiRS DAP)
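Slides 41 and 42 show screenshots of the PCF. Since the PCF is read by a bash driver script, the switches and paths portions plausibly look like the fragment below; all variable names here are invented for illustration and are not the real PCF keys:

    # --- process switches: one per Layer-2 unit (1 = run, 0 = skip) ---
    RUN_rdr2tdr=1
    RUN_tdr2sdr=0
    RUN_fm=0
    # ... remaining unit switches ...
    RUN_mirs2nc=0

    # --- paths ---
    WORKING_DIR=/home/pub/chrisg/wd_npp_p72_01        # working directory
    MIRS_ROOT=/home/pub/chrisg/mirs_working_npp_nde   # root path of MiRS DAP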

  43. MiRS Unit Tests • The following will be presented for each of the 9 units: • Unit Description: Purpose and Function • Test Sequence/Results

  44. Unit rdr2tdr: Purpose and Function • Convert raw sensor data records to temperature data records (antenna temperatures) • Reformatted into MiRS internal format • Computes sensor radiometric noise values (used to update the instrument noise matrix used in the 1dvar minimization) • Input: Level 1b in HDF5 format produced by an external process at NDE (current testing with the proxy data generator uses HDF5). For NPP ATMS, the Level 1b data may be actual TDRs or SDRs, rather than RDRs. • Output: TDR and NEDT files in internal format produced by rdr2tdr • It reads in a namelist which specifies operating parameters (passed from the PCF by npp_scs_nde.bash); a sketch of this handoff follows below.
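One plausible form of the namelist handoff, sketched in bash: the driver writes a namelist from PCF variables, then invokes the Fortran executable. The group/variable names, file names, and argument convention are all invented; the real rdr2tdr namelist keys are not shown in this transcript:

    #!/bin/bash
    # Generate a namelist for rdr2tdr from PCF variables (illustrative).
    cat > "${WORKING_DIR:?}/rdr2tdr.nml" <<EOF
    &rdr2tdr_nml
      rdr_file      = '${WORKING_DIR}/SATMS_granule.h5',
      geo_file      = '${WORKING_DIR}/GATMO_granule.h5',
      tdr_out_file  = '${WORKING_DIR}/TDR_granule.dat',
      nedt_out_file = '${WORKING_DIR}/NEDT_granule.dat'
    /
    EOF
    # assumed convention: namelist path passed as the first argument
    "${MIRS_ROOT:?}/bin/rdr2tdr" "${WORKING_DIR}/rdr2tdr.nml"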

  45. Unit rdr2tdr Test Sequence – Script Invocation • Step 1: Example showing invocation of the bash script (same for subsequent unit tests) • Working directory • Bash script
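For reference, the invocation shown in the screenshot follows the pattern given on slide 28, with the working directory as the sole argument:

    # run the driver from the MiRS DAP root against the test working directory
    scripts/npp_scs_nde.bash /home/pub/chrisg/wd_npp_p72_01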

  46. Unit rdr2tdr Test Sequence – Working Directory Listing • Step 2: Listing of the working directory after invocation of the bash script • NEDT file created (needed for QC DAP) • TDR file created

  47. Unit rdr2tdr Test Sequence – Standard Output • Step 3: Portion of the standard output produced • working directory • Output confirms successful completion of rdr2tdr

  48. Unit rdr2tdr Test Sequence – Statistical Summary of Output File Contents • Step 4: IDL utility scans the TDR file and outputs basic statistics of the TDRs, showing values within physically realistic limits • Columns: Channel, Nobs, Mean, Stdev, Min, Max
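The range check itself is done with the IDL utility; as a rough, hypothetical stand-in (not the IDL tool, which reads the binary TDR file directly), the same sanity check could be expressed over a plain-text statistics dump with the columns listed above:

    #!/bin/bash
    # Flag channels whose antenna temperatures fall outside loose
    # physical limits (thresholds are illustrative, not MiRS QC limits).
    awk 'NR > 1 {
            if ($5 < 50 || $6 > 350)
                printf "channel %s: min=%s max=%s OUT OF RANGE\n", $1, $5, $6
            else
                printf "channel %s: OK (min=%s max=%s)\n", $1, $5, $6
         }' tdr_stats.txt   # tdr_stats.txt is an assumed dump file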

  49. Unit tdr2sdr: Purpose and Function • Apply antenna pattern correction (if selected) to convert TDR files to sensor data records (TBs or radiances) • In operations, no correction is applied and the step is simply a reformatting to MiRS internal format (differences accounted for in radiometric bias corrections) • Input: TDR files in internal format produced by rdr2tdr • Output: SDR files in internal MiRS format • It reads in a namelist which specifies operating parameters (passed from the PCF by npp_scs_nde.bash)

  50. Unit tdr2sdr Test Sequence – Working Directory Listing • Step 1: npp_scs_nde.bash script invocation (as before) • Step 2: Listing of the working directory after invocation of the script • MiRS formatted SDR file created
