
NAEFS WORKSHOP SLIDES





Presentation Transcript


  1. Zoltan Toth Oct. 6-8, 2008 NAEFS WORKSHOP SLIDES

  2. OUTLINE • Ensemble data processing • Statistical corrections • Proxy for truth • Derived variables / products • Verification • Application areas • Long term priorities • GIFS

  3. ENSEMBLE DATA PROCESSING FROM NOAA’S PERSPECTIVE • Objective • Provide ensemble data that is • Statistically consistent with “truth” • Combines all sources of information into unified guidance • Seamless across time and spatial scales • Users • NCEP Service Centers • NWS WFOs, RFCs • Broader user community • Drivers • NAEFS • NUOPC, GIFS • National & international collaboration • NDFD expansion • Digital high resolution gridded ensemble data base • All probabilistic products can be derived from this

  4. REQUIREMENTS • Quality • Ensemble forecasts with highest possible • Statistical reliability & resolution • Fast convergence of statistical estimators • To reduce forecast sample size requirements • Computational efficiency • For operational applications • Operational considerations • Ease of implementation, maintenance, etc • Development environment • Modular design • To facilitate collaboration with other groups

  5. ENSEMBLE DATA PROCESSING STEPS • Bias correction • Remove lead time dependent model bias on model grid • Calibrate higher moments of ensemble • Combine information from all sources into single set of ensemble • Forecaster modification • Subjective changes to ensemble data (over US only?) • Proxy for truth • Create observationally based fine resolution analysis for use in downscaling • Downscaling • Interpret bias corrected ensemble on user relevant grid – NDFD • Additional variables • Derive further variables from bias corrected / downscaled NWP output • Derived products • Interrogate bias corrected / downscaled ensemble dataset

  6. BIAS CORRECTION • Method • Bayesian processor • Combines information from all sources • Fuses forecast data with climatological distribution (“prior”) • Adjusts “spread” according to skill observed in forecast sample • Outputs statistically corrected distribution (“posterior”) • Ensemble members adjusted to represent posterior distribution • Data sources • Reanalysis as prior (use new reanalysis when available) • Sample of past forecasts - most recent 60-90 days • Include control hind-casts when available • Latest analyzed or observed data • Current status • 35 NAEFS & SREF variables bias corrected • 1st moment corrected only • NAEFS & SREF processed separately • CMC + NCEP ensembles + GFS hires control combined • Plan • Bias correct all model output variables (~200, 2-3 yrs) • Include precipitation (use observationally based analysis as truth – 1-2 yrs) • Add hind-casts for NCEP ensemble control (1-2 yrs) • Process ensembles & hires forecasts from FNMOC, ECMWF (1-2 yrs) • Combine all forecasts into single guidance (2-3 yrs) • Ensemble & hires from NCEP, CMC, FNMOC, ECMWF, and SREF
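The first-moment correction over the most recent 60-90 days of forecasts can be illustrated with a simple decaying-average bias estimate. This is a minimal sketch, not the operational Bayesian processor; the weight value and the function names are assumptions for illustration:

```python
def update_bias(prev_bias, forecast, analysis, w=0.02):
    """Decaying-average estimate of lead-time-dependent bias.

    prev_bias, forecast, analysis: values (or NumPy arrays) on the
    model grid for one lead time; w: weight given to the newest
    forecast-minus-analysis difference (w=0.02 roughly corresponds
    to a 50-day memory, i.e. a trailing 60-90 day sample).
    """
    return (1.0 - w) * prev_bias + w * (forecast - analysis)

def correct(forecast, bias):
    """Remove the estimated bias from a raw ensemble member."""
    return forecast - bias

# toy example: a forecast that is persistently 1.5 units too warm
bias = 0.0
for _ in range(200):
    bias = update_bias(bias, forecast=11.5, analysis=10.0)
# bias converges toward the true offset of 1.5
```

The same update can be run per grid point and per lead time, which is why the slide stresses fast convergence of the statistical estimators: a smaller weight gives a smoother estimate but adapts more slowly to model changes.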

  7. USE OF RESTRICTED DATA • Background • Use of ensemble data from certain centers may be restricted • Example • NOAA can use ECMWF ensemble in operational duties • Can use ECMWF ensemble to prepare NOAA products • Cannot redistribute ECMWF ensemble data • Three possible NAEFS scenarios • CMC & NMSM request ensemble centers to grant same privileges • Preferred solution • All centers can access and process same data • NCEP uses ECMWF ensemble as part of NAEFS bias correction / combination • ECMWF, CMC, NCEP ensembles readjusted to reflect NAEFS posterior cdf • Distribute entire adjusted ensemble • Acceptable to ECMWF? • As above, except distribute only adjusted CMC & NCEP ensembles • Consistent with bilateral agreement • Need confirmation from ECMWF • Latter two approaches limit ECMWF ensemble processing to NCEP • Single center for combination of all NAEFS data OR • Divergence in NAEFS products?

  8. PROXY FOR TRUTH • RTMA • Manuel Pondeca • Combined RFC Stage-4 & CPC precipitation analysis • Mike Charles

  9. DOWNSCALING • Method • Perfect prog • Establish relationship (“downscaling vector”) between • Proxy for truth (high resolution observationally based analysis) & • NWP analysis (used as reference in bias correction step) • Level of sophistication • Climatological (statistical) • Regime dependent (statistical) • Case dependent (dynamical, using LAM) – most expensive • Sub-NWP-grid resolution variance • Needs to be stochastically added in statistical methods • Outputs ensemble members statistically consistent with • Bias corrected forecasts on NWP grid • Proxy for truth on fine resolution grid • Data sources • Sample of • Fine resolution observationally based analysis fields • Corresponding NWP analysis fields • Current status • 4 NDFD variables downscaled using regime dependent downscaling vector • 2m temp, 10m u&v, surface pressure • RTMA used as proxy for truth • Plan • Add more NDFD variables by • Expanding RTMA analysis • Using SMARTINIT + downscaled NDFD variables
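The climatological (perfect prog) variant above can be sketched as follows: the downscaling vector is the mean difference between the fine-resolution proxy-for-truth analysis and the NWP analysis over a training sample, added to each bias-corrected member. The names and the plain averaging are illustrative assumptions, not the operational regime-dependent method:

```python
import numpy as np

def downscaling_vector(hires_analyses, nwp_analyses):
    """Climatological downscaling vector: mean difference between the
    fine-resolution (proxy-for-truth) analysis and the NWP analysis,
    both given on the target (e.g. NDFD) grid, over a training sample."""
    return np.mean(np.asarray(hires_analyses) - np.asarray(nwp_analyses), axis=0)

def downscale(member, vector):
    """Apply the downscaling vector to one bias-corrected member."""
    return member + vector

# toy sample: the fine grid sits persistently 2.0 units above the NWP grid
hires = [np.full(4, 12.0) for _ in range(30)]
nwp = [np.full(4, 10.0) for _ in range(30)]
vec = downscaling_vector(hires, nwp)
```

A regime-dependent version would stratify the training sample (e.g. by flow pattern) before averaging, and, as the slide notes, statistical methods still need sub-grid variance added stochastically on top of this shift.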

  10. DERIVED VARIABLES • Objective • Generation of variables not carried in NWP models • Input data • Bias corrected and downscaled ensemble data (NWP model output) • Methods • Model “post-processing” algorithms • Apply after downscaling for variables affected by surface processes • SMARTINIT • NDFD weather element generator • Other tools? • Text generation, etc?

  11. DERIVED PRODUCTS • Objective • Answer any weather related question • Interrogate dataset • Input data • Bias corrected and downscaled ensemble data • Including all derived variables • Methods • NAWIPS ensemble functionalities • Other packages?

  12. ENSEMBLE FUNCTIONALITIES • Lists of centrally/locally/interactively generated products required by NCEP Service Centers for each functionality are provided in attached tables (e.g., MSLP, Z, T, U, V, RH, etc, at 925, 850, 700, 500, 400, 300, 250, 100, etc hPa) • Additional basic GUI functionalities: • Ability to manually select/identify members – Done • Ability to weight selected members – Done, Sept. 05 • Potentially useful functionalities that need further development: • Mean/Spread/Median/Ranges for amplitude of specific features (TBS) • Mean/Spread/Median/Ranges for phase of specific features (TBS)
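The mean/spread/median/range functionalities listed above reduce to simple statistics over the member dimension of the ensemble array. A generic sketch, not NAWIPS code; names are illustrative:

```python
import numpy as np

def ensemble_stats(members):
    """Basic ensemble products from a (n_members, ...) array."""
    m = np.asarray(members, dtype=float)
    return {
        "mean": m.mean(axis=0),
        "spread": m.std(axis=0, ddof=1),   # sample standard deviation
        "median": np.median(m, axis=0),
        "p10": np.percentile(m, 10, axis=0),
        "p90": np.percentile(m, 90, axis=0),
    }

# toy 5-member ensemble with one grid point
stats = ensemble_stats([[1.0], [2.0], [3.0], [4.0], [5.0]])
```

Weighting selected members, as in the GUI functionality above, would replace the plain mean with `np.average(m, axis=0, weights=w)` for a user-supplied weight vector `w`.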

  13. VERIFICATION • Critical for • Development purposes (internal) • Demonstrating user relevant improvements (external) • Not contributing directly to improvements in operations • Lower priority, this is what gets delayed • Metrics to measure • Statistical reliability • Statistical consistency with verification data (proxy for truth) • Statistical resolution • Skill in distinguishing between different verification outcomes ahead of time • Independent of reliability • Practical consideration • Use recursive time filter to capture spatial variations in skill • Suited for operational use • Easy maintenance (once set up, runs on its own) • Reflects latest statistics, once a month snapshot saved for quick archive
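Statistical reliability and resolution, as defined above, can be made concrete via the Murphy decomposition of the Brier score for probabilistic forecasts of a binary event. The sketch below is a generic verification illustration; function names and the binning choice are assumptions, not the EMC unified package:

```python
import numpy as np

def brier_decomposition(probs, outcomes, n_bins=10):
    """Murphy decomposition of the Brier score: BS = REL - RES + UNC.
    probs: forecast probabilities in [0, 1]; outcomes: 0/1 observations
    (the proxy for truth)."""
    p = np.asarray(probs, dtype=float)
    o = np.asarray(outcomes, dtype=float)
    n = len(p)
    obar = o.mean()                              # sample base rate
    bins = np.clip((p * n_bins).astype(int), 0, n_bins - 1)
    rel = res = 0.0
    for k in range(n_bins):
        idx = bins == k
        nk = int(idx.sum())
        if nk == 0:
            continue
        pk, ok = p[idx].mean(), o[idx].mean()
        rel += nk * (pk - ok) ** 2               # reliability: forecast prob vs observed freq
        res += nk * (ok - obar) ** 2             # resolution: bin outcomes vs base rate
    return rel / n, res / n, obar * (1.0 - obar)
```

Reliability measures consistency with the verification data (smaller is better), while resolution measures skill in distinguishing outcomes ahead of time (larger is better), independently of reliability, matching the two metrics the slide separates.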

  14. VERIFICATION • Current status • Separate packages used for global & regional • Global – against analysis only • SREF partially integrated with EMC unified system • Next implementation • Unified ensemble/probabilistic package against analysis • Integrated with EMC unified system • Long term plans • Verification against observations • Use same statistical package as that used against analysis • Introduce user specific high impact metrics • Needed for THORPEX, NUOPC

  15. VERIFICATION • Internal links • Unified verification • Wave, ice, river, hurricane, etc downstream applications • All should use standard ensemble/probabilistic package when available • Outreach • OHD – Julie Demargne • Developed their own software – need to unify • MDL – Matt Peroutka • Developed their own software • NCAR – Development Test Bed - Barbara Brown • Developing their own software • NAEFS / NUOPC • Need common metrics • THORPEX TIGGE • Discussion about common metrics, how to share software • ESRL – Jennifer Mahoney – not successful

  16. THORPEX APPLICATION AREAS - 1 • Hydrometeorology • Plan THORPEX – Hydro Plan • Participants OHD, NCEP, ESRL • Testbed HMT, CTB • Suggested performance measures • Extreme hydro-meteorological events, incl. dry and wet spells (CONUS) • Quantitative extreme river flow forecasting (Global & CONUS) • Severe weather • Plan Needs to be developed/formalized • Participants NCEP SPC, EMC, ESRL/GSD, NSSL? • Testbed HWTB • Suggested performance measures • Fire weather index • Combined severe weather index

  17. THORPEX APPLICATION AREAS - 2 • Aviation • Plan No formal engagement yet, informal contacts only • Participants AWC, EMC, GSD, ? • Testbed NEXGEN • Suggested performance measures • Flight restrictions • Icing, visibility, fog, clear air & convective turbulence • Marine forecasting • Plan Initial contact, no formal plan yet • Participants OPC, EMC, ESRL/GSD, ? • Testbed ? • Suggested performance measures • High 2m winds • Significant wave height • Note – Funding is lacking on user’s side to engage with THORPEX

  18. THORPEX APPLICATION AREAS - 3 • Tropical Cyclones • Plan No formal engagement yet, informal contacts only • Participants TPC, EMC, PSD, CIRA, HRD? • Testbed JHT • Suggested performance measures • Extreme surface wind speed • Storm tracks • Precipitation: type & extreme amounts (related to wet periods) • Storm surge & wave height • Winter Weather • Plan T-PARC plans • Participants HPC, EMC, ESRL/GSD, ? • Testbed JCSDA, NCEP Alaska Desk • Suggested performance measures • Same as for Tropical storms • Resilient coastal communities • See Tropical and Winter storm applications

  19. THORPEX APPLICATION AREAS - 4 • Health / public safety / economic impact • Plan NOAA THORPEX Science & Implementation Plan • Participants EMC, ARL, DTRA, ? • Testbed ? • Suggested performance measures • Air quality • Hot and cold spells • Intra-seasonal forecasting • Plan To be developed based on White Paper, Workshop • Participants CPC, EMC, PSD • Testbed CTB • Suggested performance measures • Weeks 2-4 temperature, precipitation • Drought / flooding potential

  20. LONG TERM PRIORITIES – AS PERCEIVED • Transition to probabilistic forecasting - NFUSE • Ensemble seen as backbone • Increased emphasis on ensembles • More resources • Multi-center approach – NAEFS, NUOPC, GIFS • Increased resolution • More thorough processing • Statistical processing • Process all variables • Combine all info into single guidance • Link with GIFS • New NOAA forecast database • Statistically and subjectively corrected ensemble • Bias corrected, downscaled, calibrated • All variables • Interrogation tools to extract any forecast info from database • Multiple user groups • Priority application areas • Tropical cyclone forecasting • Hydrologic forecasting • Aviation forecasting

  21. OBJECTIVES OF GIFS • Produce internationally coordinated advance warnings and forecasts • For high impact weather events • To mitigate loss of life and property • To contribute to the welfare of all WMO nations • With a particular emphasis on least developed and developing countries • Use ensemble prediction systems for • Assessing and mitigating weather and climate related risks by • Quantifying forecast uncertainty • Provide guidance on & coordinate use of • Observational • Numerical data assimilation • Forecasting • User application resources • To ensure the highest quality guidance for high impact weather events

  22. STAGES OF DEVELOPMENT • TIGGE Development (2005-2007) • TIGGE Implementation (2008-) • GIFS Development (2007-) • GIFS Prototypes (2008-) • GIFS Products Phase Implementation (2012-) • GIFS End-to-End Phase Implementation (2014-)

  23. GIFS – CONCEPT OF OPERATIONS FOR PRODUCTS • Ensemble data access • Real time, directly from ensemble producing centers • For product generating centers • Flexible processing methods to handle missing data • Product generation • Distributed & coordinated among ensemble producing centers and RSMCs (DCPCs) • Major challenge – control change process, etc • Product distribution • Common web interface using WIS concepts (GISCs) • Ensemble data • Real time • Archived • Probabilistic forecast products • Predesigned • On demand • User applications • External support critical – GIFS-TIGGE WG has no expertise or resources • RCs, SERA, CBS, SWFDPs, etc

  24. FOCUS GROUPS • Charge • Develop detailed technical plans • Contribute to operational implementation of plans • Organization • Report to GIFS-TIGGE WG • Membership • GIFS-TIGGE WG members, colleagues, interested external experts • Critical links • SIMDAT, GO-ESSP, NOMADS, CHPS, RISA, GRADS, etc • 3 Topics, 2 groups • #1 Access to & distribution of real-time & archived ensemble data • #2 Ensemble-based product & service generation for high impact events • Joint – Common web interface for data and product distribution

  25. GIFS PARTNERS – OPERATIONS • Global NWP Centers • Global ensemble forecasts • Statistical correction of their ensemble • Product generation (combine ensembles from multiple centers, etc) • Regional Specialized Meteorological Centers (RSMCs) • Coordination / resource organization for high impact event related activities in region • Observationally based hires analysis • Collect relevant forecast data • Feedback to Global Centers on utility of their data/products • LAM ensemble integrations • Prepare special products • Training • National HydroMet Services (NHMSs) • Collect observations • Set & communicate forecast product and service requirements • Interpret climate & meteorological guidance • Special product generation & user outreach

  26. GIFS PARTNERS & LINKS – DEVELOPMENT • THORPEX DAOS WG research on • Adaptive observations • Adaptive DA techniques • THORPEX PDP WG research on • Ensemble generation • Statistical correction of ensembles • Adaptive methods • THORPEX Regional Committees – Regional focus on • Product design / requirements • Operational configuration • Training • WWRP Nowcasting WG • Statistical downscaling of ensemble forecasts • WWRP Mesoscale WG • Seamless prediction from hours to weeks • WWRP CHFP (formerly TFSP) • Seamless prediction from days to seasons

  27. GIFS PARTNERS & LINKS – DEVELOPMENT - 2 • WGNE Verification Subgroup • Verification of TIGGE forecasts • WWRP SERA WG • Measuring value added by and cost of GIFS – for forecasters and others • Training • Equitable use of adaptively allocatable forecast resources • CBS ET-EPS • Training for new GIFS products • HEPEX • Hydrologic user applications • CBS • Operational systems and requirements • North American Ensemble Forecast System (NAEFS) • Experience with fast operational implementation of multi-center ensemble system

  28. PATH TO OPERATIONS • Build on success of Southern Africa SWFDP • CBS project to expand from 5 to 16 countries • Capacity building • Empower regions to tackle their unique forecast problems • Special consideration to IT & limitations • CBS links critical • Consider operational systems and requirements • CBS interest • What is possible today • GIFS interest • What is possible tomorrow • Transition THORPEX research into operations

  29. PROPOSED REGIONAL APPROACH • Concurrent development in 4 regions • Form separate subgroups in 4 regions from 10 Global centers and RSMC(s) in each region • Use identical data from 10 Global centers • Inter-comparability • Develop products specifically tailored for each region • Periodically exchange experience to cross-fertilize efforts • Grant real time data access to GIFS partners to • Global ensemble data • Products for • Testing operational data feed • Engaging forecasters and other experts at global centers & regions • Engage Global centers in • Product development • Provide regular feedback from RSMCs on product design/quality • Contribute to forecaster training • Ensures fast and high quality product development for all 4 regions • Best use of regional data sources • Best service for special product needs

  30. PROTOTYPES FOR GIFS • Tropical cyclone forecasting – CXML data from multiple centers • Focus group involvement • Common web interface for CXML data • Access • Display • Combination / product generation • Probabilistic precipitation forecasting • Subgroups to address special product needs in each region • Regional observationally based analysis • Real time data/product exchange among participants requested • Product development / testing in parallel with Focus group technical developments • Probabilistic 10m winds, 2m temperature next

  31. [Figure: 8+ days gain of NAEFS final products over the NCEP/GEFS raw forecast • From • Bias correction (NCEP, CMC) • Dual-resolution (NCEP only) • Down-scaling (NCEP, CMC) • Combination of NCEP and CMC]

  32. NAEFS WORKSHOP OUTCOME • Estimate expected growth in data exchange volume • Work under EC – NOAA big pipe • Alternative until EC-NOAA pipe • Continue using ftp (2-3 yrs?) • Set priorities for different data types (NA first) • Establish high level (EC-NOAA) targets for ensemble resolution • Coordinate operational timelines at CMC & NCEP • Use downscaling methods • For US & highly populated portions of Canada & Mexico, RTMA applications • Joint development of observationally based precip and other analyses • Collaborate on algorithms/software for worded uncertainty info generator for public • Invitation from Mexico for next NAEFS workshop to be held there • Possibility of holding jointly with THORPEX NA Regional Committee meeting

  33. NAEFS WORKSHOP OUTCOME - 2 • Detailed request from Mexico for improved products / collaboration • Explore joint aviation related product development • Connection with NEXGEN • 5D-cube • Statistically reliable ensemble database • Interrogation tool set to answer questions • Pursue hydrological applications • Comparisons and complementary coverage of NA domain • Pursue joint wave ensemble application • NCEP, FNMOC use WAVEWATCH-3 • NCEP to provide FNMOC with wind bias correction algorithm • NCEP to consider running WAVEWATCH-3 with bias corrected CMC ensemble • Until EC develops own wave model • Collaborate in development of meteorological and hydrologic verification systems • Address shortcomings in statistical corrections • Improve bias correction methods • Bayesian considerations • Include higher moment corrections • Consider benefits from hind-casts

  34. NAEFS WORKSHOP OUTCOME - 3 • Consider extension to regional ensemble forecasting • CMC parallel testing to start in 2009 • Experimental data exchange / evaluation • Evaluate potential for operational implementation (2010-11?) • Collaborate on intra-seasonal forecasting • Weeks 3-4 temp/precip • MJO • Consider extending NAEFS integrations to 30 days (2009-10?) • Establish NAEFS standards for • Quality – verification metrics against • Observations & analyses • Computational efficiency • Operational considerations – ease of implementation / maintenance • Etc • Test / evaluate FNMOC inclusion into NAEFS • Test by Aug 2009 • Implement, subject to positive results, by Aug 2010 • Discuss collaboration with CMA • Coordinate NAEFS with other multi-center efforts • NUOPC in US (FNMOC, AFWA, NCEP) • GIFS internationally

  35. BACKGROUND

  36. COMPONENTS OF SYSTEM – NCEP • Bias correction • Objective • Remove lead time dependent model bias on model grid • Calibrate higher moments of ensemble • Combine information from all sources into single set of ensemble • Data • NWP analysis as reference (“truth”) • Method • Bayesian processor • Proxy for truth • Objective • Create observationally based fine resolution analysis • To be used for downscaling, verification - RTMA • Data • All available observations • Fine resolution NWP forecasts (incl. hires land surface) as guess (no cycling of forecasts) • Method • Numerical analysis techniques • Downscaling • Objective • Bring bias corrected ensemble information to user relevant grid - NDFD • Data • Bias corrected ensemble (incl. hires land surface) • NWP analysis & fine resolution observationally based analysis • Method • Perfect prog + realistic (stochastic) fine resol. variability (considering hires land surface forecast)

  37. PROXY FOR TRUTH • Method • RTMA analysis • Based on • All available observations • Dynamical forecast as first guess • First guess not cycled on RTMA analysis => RTMA can draw close to data • 2D GSI used to combine info every hour • Data sources • List of observational data used • First guess • 1-hr 10 km RUC forecast • Current status • CONUS • 5 variables operational since 2006 • 2m temp, dew point temp, 10m u,v • 5 km resolution • Alaska

  38. DEVELOPMENT ENVIRONMENT • Goal • Ensure best methods are used in NOAA operations • Means • R2O • Best research results guide developments for operations • O2R • Articulate operational requirements to research community • Open source software development • Rules for engagement of community defined • Well defined functionalities, links, etc • Allows all parties to contribute • Eg, National Environmental Modeling System (NEMS) • Fair competition and/or complementary development among teams • Modular design of algorithms • Components to address each distinct problem can be developed independently • Best solution for each problem/component selected for operational system • Lumping separate problems together limits collaboration, would lead to sub-optimal choices • Operational implementation decisions • Version for each module chosen based on quality & other requirements • Single version selected from among candidate methods • Community input, and quality of final product maximized

  39. COMPONENTS OF SYSTEM – MDL • Bias correction • Bias correction & downscaling in one step • Lead time dependent bias removed wrt observing sites • Information downscaled to observing sites • Different numerical guidance products processed separately • GFS, NAM, etc • Experimental processing of ensemble spread • Proxy for truth • No explicit fine resolution analysis generated or used • Verification etc needs not met • Inherent “analysis” created when information spread onto NDFD grid • Downscaling • Bias corrected forecast info at observing sites spread onto NDFD grid • “Gridded MOS”

  40. EMC-MDL COLLABORATION • Compare quality of current operational / experimental products • Gridded MOS vs. Downscaled NAEFS • Ongoing • Kathy Gilbert, Val Dragostano – Zoltan Toth, Bo Cui • Proxy for truth issue unresolved • Need observations independent of MOS • MDL experimental ensemble guidance vs. Downscaled MOS • 10/50/90 percentiles to be evaluated • Matt Peroutka & Zoltan Toth • Proxy for truth issue • Proxy for truth? • Agree on best proxy for truth • Collaborate on • Improving RTMA • Creating best CONUS precipitation analysis & archive • Joint research into best downscaling methods? • Climate, regime, case dependent methods • Addition of fine temporal/spatial variability into ensemble

  41. BACKGROUND

  42. DISCUSSION POINTS - 2 • Practical implementation • Pre-compute • Freeze forecast system for years (3?) • Would be new to weather forecasting • Estimate loss of skill • Large human / cpu effort each time • Quantum effort • On the fly • As part of operations • One-time implementation effort • Easier maintenance afterward • Forecast system updates possible any time • Smaller sample with same cpu usage as with pre-computation • Due to continual update of hind-cast set (factor of 3) • Estimate loss of skill • Compare with that due to less frequent forecast updates in pre-computed hind-casts

  43. POSSIBLE FUTURE CONFIGURATION • Use a single DA / forecast system for all ranges • Ensemble size of ~20 • Drop in steps with increasing lead time • Resolution (horizontal, vertical, time frequency of output) • Frequency of issuance • Frequency of model updates for extensions • Execute control hind-casts as part of operations • Cost is 25% (up to 100%?) of forecasts • Example:

      LEAD TIME   ISSUANCE FREQUENCY   RATIO
      3-hour      30 mins              6
      12-hr       hourly               12
      36-hr       3-hrly               12
      7 days      6-hrly               28
      15 days     12-hrly              30
      35 days     daily                35
      1 year      weekly               52
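The RATIO column in the example is the lead time divided by the issuance interval, i.e. how many forecast cycles span each range; a quick check (function name is illustrative):

```python
def issuance_ratio(lead_hours, interval_hours):
    """Number of forecast issuances spanning one lead time."""
    return lead_hours / interval_hours

# 3-hour lead issued every 30 minutes -> 6 concurrent cycles
r_short = issuance_ratio(3, 0.5)
# 7-day lead issued 6-hourly -> 28 concurrent cycles
r_week = issuance_ratio(7 * 24, 6)
```

The 1-year/weekly row rounds 365/7 (about 52.1) down to 52, consistent with the same rule.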

  44. 2nd DISCUSSION ON HINDCAST GENERATION • Zoltan Toth • Environmental Modeling Center, NOAA/NWS/NCEP • Acknowledgements: Steve Lord, Roman Krzysztofowicz, Yuejian Zhu, Dingchen Hou, Bo Cui, Malaquias Pena, Mike Charles, DJ Seo, Hua-Lu Pan • http://wwwt.emc.ncep.noaa.gov/gmb/ens/index.html

  45. OUTLINE • Why do we need hind-casts? • Background on hind-casting • New reanalysis / CFS reforecast • Proposed GEFS hind-casts • Seamless weather-climate forecasts

  46. WHAT DO WE NEED THE HINDCASTS FOR? • End goal • Bias-free ensemble / probabilistic forecasts • Statistically consistent with proxy for truth • For user relevant variables • Observations or observationally based hires analysis • What’s available? • Ensemble forecasts • Model (not user) variables on coarse model resolution • Lead-time dependent bias in forecasts (model drift) • Problems • How to remove lead-time dependent bias (on model grid)? • Compare NWP forecast with analysis • Hind-cast sample needed • How to connect model vs. user variables? • Compare NWP analysis with observationally based analysis • Perfect prog, or downscaling • NO HINDCASTS ARE NEEDED

  47. BACKGROUND • Focus on 3+ days (NAEFS) first • Bias is larger at longer range, coarser resolution • Bias at short lead well estimated with most recent forecasts • Bo et al, Hamill et al • What is more important, bias correction or downscaling? • For bias correcting 1st moment of 2m temperature • Downscaling adds more value (Bo et al) • Effect from hind-casts limited • NAEFS is multi-center system • Combining different ensembles adds value • We control only NCEP part • Canadians skeptical of need for large hind-cast dataset • Believe in power of perfect prog applications • Effect from NCEP-only hind-casts limited

  48. WHAT WE UNDERSTAND • Larger sample, better bias estimation/correction • Urge to generate large hind-cast sample • Must use same (analysis/)forecast system for for- & hind-casts (Bo) • Generation of hind-casts is cpu-intensive • Gain from increased sample asymptotically tapers off to zero • Convergence rates of different bias estimation methods differ • Errors in Bayesian methods converge faster to zero – Son et al • Operational forecast systems continually improve • Freezing a system reduces skill – Caplan • Skill of forecast system can be improved with cpu • Cpu needs competing with hind-casting for increased skill • Increased resolution • Improved physics • Larger ensemble
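The asymptotic taper in the gain from a larger hind-cast sample can be illustrated with the standard error of a mean-bias estimate, which shrinks only as one over the square root of the sample size (a textbook illustration, assuming independent forecast-minus-analysis samples; names are illustrative):

```python
def bias_stderr(n, sigma=1.0):
    """Standard error of a mean-bias estimate from n independent
    forecast-minus-analysis samples with error spread sigma."""
    return sigma / n ** 0.5

# doubling the training sample from 30 to 60 cases shrinks the
# estimation error by only ~29% (i.e. 1 - 1/sqrt(2))
gain_30_to_60 = 1.0 - bias_stderr(60) / bias_stderr(30)
```

This is why cpu spent on ever-larger hind-cast sets eventually competes unfavorably with cpu spent on resolution, physics, or ensemble size, and why faster-converging (e.g. Bayesian) estimators matter.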

  49. WHAT WILL BE AVAILABLE • New coupled ocean/atmosphere reanalysis/reforecast • Designed for next generation CFS • Resource intensive (human, cpu) • Frozen for ~8 yrs • Infrastructure • GSI, GFS close to 2008 operational • New MOM4 • Time period covered • ~30 yrs • Frequency / configuration • Reanalysis every 6 hrs / T382L64, coupled • Reforecast every 30 hrs / T126L64, coupled • For bias correcting frozen CFS system

  50. PROPOSED APPROACH • Results driven • Strive for best overall performance • Balanced distribution of resources for • Better forecast system vs. • Larger hind-cast sample • Incremental • Implement • What has proven value • For which resources are available • Expand later based on additional/new evidence • Sustainable • Resource conscious • Implement / transition forecast system once into operations • Hind-casting uses same forecast system, run as part of operations • Single maintenance / transition process
