
Joint Ensemble Forecast System, November 2004. Maj Tony Eckel, PhD, HQ AFWA, Chief, Air & Space Models Branch


Presentation Transcript


  1. The Proposed Joint Ensemble Forecast System, November 2004. Maj Tony Eckel, PhD, HQ AFWA, Chief, Air & Space Models Branch

  2. Deterministic Forecasting vs. Ensemble Forecasting
  Deterministic Forecasting:
  • Ignores forecast uncertainty
  • Potentially very misleading
  • Oversells forecast capability
  Ensemble Forecasting:
  • Reveals forecast uncertainty
  • Yields probabilistic information
  • Enables optimal decision making
  JEFS' Goal: Prove the value, utility, and operational feasibility of mesoscale, short-range ensemble forecasting to DoD operations.

  3. FY04 HPCMP Distributed Center Award
  • Apr 03: FNMOC and AFWA proposed a split distributed center to the DoD High Performance Computing Modernization Program (HPCMP) as a DoD Joint Operational Test Bed for the Weather Research and Forecast (WRF) modeling framework
  • Apr 04: Installation began of $4.2M in IBM HPC hardware, split equally between FNMOC and AFWA (two 96-processor IBM Cluster 1600 p655+ systems)
  • Fosters significant Navy/Air Force collaboration in NWP for:
    1) Testing and optimizing WRF configurations to meet unique Navy and Air Force NWP requirements
    2) Developing and testing mesoscale ensembles based on multiple WRF configurations to meet DoD needs
    3) Testing of Grid Computing concepts and tools for NWP
  • Apr 08: Project completion

  4. Ensemble Forecast Requirements: Air Force (and Army)
  AFW Strategic Plan and Vision, FY2008-2032, Issue #3/4-3: Use of multi-scale (kilometer to meter resolution), ensemble, and consensus model forecasts, combined with automation of local techniques, to support planning and execution of military operations. "Ensembles have the potential to help quantify the certainty of a prediction, which is something that users have been interested in for years. The military applications of ensemble forecasting are only at their beginnings; there are years' worth of research waiting to be done."
  Operational Requirements Document, USAF 003-94-I/II/III-D, Centralized Aerospace Weather Capability (CAWC ORD): ...will support ensemble forecasting with the following capabilities:
    1) The creation of sets of perturbed initial conditions of the fine-scale model initialized fields in selected regional windows.
    2) Assembly of ensemble forecasts either from model output sets derived from the multiple sets of perturbed initial conditions or from sets assembled from the output from different models.
    3) Evaluation of forecasting skill of ensemble forecasts compared to single forecast model outputs.
  Air Force Weather, FY 06-30, Mission Area Plan (AFW MAP), Deficiency: Mesoscale Ensemble Forecasting: "The key to successful ensemble forecasting is many different realizations of the same forecast events. Studies using different models - or the same model with different configurations - consistently yield better overall forecasts. This demonstrates a definite need for multiple model runs."
  R&D Matrix MSA Shortfall D-08-07K: Insufficient ensemble forecasting capability for AFWA's theater-scale model

  5. Ensemble Forecast Requirements: Navy
  • No documented requirement or supporting Fleet request for ensemble prediction.
  • Navy 'requirements' are written in terms of warfighting capabilities. The current (draft) METOC ICD (old MNS) only specifies parameters required for support. However, ensembles present a solution for the following specified warfighter requirements:
    - Long-range prediction for mission planning, optimum track ship routing, severe weather avoidance
    - Tropical cyclone prediction for safety of operations, personnel safety
    - Winds, turbulence, boundary layer structure for chem/bio/nuclear dispersion (WMD support)
    - Cloud base, fog, aerosol for slant-range visibility (aerial recon, flight operations, targeting)
    - Boundary layer structure/atmospheric refractivity (T, q) for EM propagation (detection, tracking, communications)
    - Surface winds (ASW, mine drift, SAR, flight operations in enclosed/narrow waterways)
    - Surf and sea heights (SOF, small boat ops, logistics)
    - Turbulence, cloud base/tops (OPARS, safety of flight)
  • Whenever the uncertainty of the weather phenomena exceeds operational sensitivity, either a reliable probabilistic or a range-of-variability prediction is required.

  6. Joint Global Ensemble
  • Description: Combination of current GFS and NOGAPS global, medium-range ensemble data
  • Initial Conditions: Breeding of Growing Modes [1]
  • Model Variations/Perturbations: Two unique models, but no model perturbations
  • Window: Global
  • Resolution: 1.25° x 1.25° (~100 km)
  • Number of Members: 40 at 00Z, 30 at 12Z
  • Forecast Length/Interval: 10 days / 12 hours
  • Timing: Cycle times 00Z and 12Z; products by 07Z and 19Z
  [1] Toth, Zoltan, and Eugenia Kalnay, 1997: Ensemble Forecasting at NCEP and the Breeding Method. Monthly Weather Review, Vol. 125, No. 12, pp. 3297-3319.
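  Conceptually, the JGE is just the two medium-range ensembles pooled into one member set on a common grid. The sketch below illustrates that pooling under the assumption that both member sets have already been interpolated to the same ~1.25-degree grid; the function and array names are illustrative, not JEFS code.

```python
import numpy as np

def pool_joint_global_ensemble(gfs_members: np.ndarray,
                               nogaps_members: np.ndarray) -> np.ndarray:
    """Pool GFS and NOGAPS ensemble members into one joint member stack.

    gfs_members, nogaps_members: (n_members, n_lat, n_lon) arrays of one
    forecast field, assumed already interpolated to a common grid.
    Returns the pooled JGE member stack (e.g., 40 members at 00Z,
    30 at 12Z, per the member counts on this slide).
    """
    assert gfs_members.shape[1:] == nogaps_members.shape[1:], "grids must match"
    return np.concatenate([gfs_members, nogaps_members], axis=0)
```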

  7. Joint Mesoscale Ensemble
  • Description: Multiple high-resolution, mesoscale model runs generated at FNMOC and AFWA
  • Initial Conditions: Ensemble Transform Kalman Filter [2] run on a short-range (6-h), mesoscale data assimilation cycle using high-resolution global models (T254 GFS and T239 NOGAPS)
  • Model Variations/Perturbations:
    - Multimodel: WRF-EM, WRF-NMM, COAMPS
    - Varied-model: various configurations of physics packages
    - Perturbed-model: randomly perturbed surface boundary conditions (e.g., SST)
  • Window: East Asia (as directed by COPC)
  • Resolution: 15 km for baseline JME (early 2006); 5 km nest later in project
  • Number of Members: 30 (15 run at each site)
  • Forecast Length/Interval: 60 hours / 3 hours
  • Timing: Cycle times 06Z and 18Z; products by 14Z and 02Z (~7 h production per cycle)
  [2] Wang, Xuguang, and Craig H. Bishop, 2003: A Comparison of Breeding and Ensemble Transform Kalman Filter Ensemble Forecast Schemes. Journal of the Atmospheric Sciences, Vol. 60, No. 9, pp. 1140-1158.
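  One way to picture the 30-member design is as a roster that crosses model cores with physics configurations and perturbed surface boundary conditions, split between the two sites. The enumeration below is purely hypothetical bookkeeping to illustrate the idea; the actual JME member assignments, physics suites, and site split are set by the project, not by this sketch.

```python
from dataclasses import dataclass
from itertools import cycle, product

@dataclass(frozen=True)
class JmeMember:
    """One hypothetical JME member: model core + physics suite + SST perturbation."""
    member_id: int
    site: str          # "FNMOC" or "AFWA" (15 members run at each site)
    model: str         # "WRF-EM", "WRF-NMM", or "COAMPS"
    physics: str       # placeholder label for a physics configuration
    perturb_sst: bool  # randomly perturbed surface boundary conditions

# Illustrative labels only; the real physics suites are model-specific.
MODELS = ["WRF-EM", "WRF-NMM", "COAMPS"]
PHYSICS = ["physA", "physB", "physC", "physD", "physE"]

def build_roster(n_members: int = 30) -> list:
    """Enumerate a toy 30-member roster alternating runs between sites."""
    combos = cycle(product(MODELS, PHYSICS))   # 15 unique model/physics pairs
    sites = cycle(["FNMOC", "AFWA"])
    roster = []
    for i in range(n_members):
        model, phys = next(combos)
        roster.append(JmeMember(member_id=i + 1, site=next(sites),
                                model=model, physics=phys,
                                perturb_sst=(i % 2 == 1)))
    return roster

if __name__ == "__main__":
    for member in build_roster():
        print(member)
```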

  8. Joint Ensemble Forecast System (system flow between NCEP, FNMOC, and AFWA)
  • NCEP Medium-Range Ensemble: 44 staggered GFS runs, T126, 15 d; analysis perturbations: bred modes; model perturbations: in design
  • FNMOC Medium-Range Ensemble: 18 members at 00Z, 8 at 12Z; NOGAPS, T119, 10 d; analysis perturbations: bred modes; model perturbations: none
  • Joint Global Ensemble (JGE) products: storage of principal fields; postprocessing calibration applied; long-range products tailored to support warfighter planning
  • Data assimilation (3DVAR / NAVDAS) at FNMOC and AFWA: uses observations, a high-resolution first guess, and lateral boundary conditions from the global ensembles, and provides a "warm start" for the mesoscale runs
  • Ensemble Transform Kalman Filter (ETKF): generates initial condition perturbations from observations and analyses
  • Joint Mesoscale Ensemble (JME): 30 members, 15/5 km, 60 h, 2/day; one "demonstration" theater; multi-model (WRF, COAMPS); perturbed model: varied physics and surface boundary conditions
  • JME products: storage of principal fields; postprocessing calibration applied; short-range products tailored to support warfighter operations
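  The ETKF step turns a set of forecast perturbations into analysis (initial condition) perturbations using observation-space statistics. Below is a minimal numpy sketch in the spirit of Wang and Bishop (2003); the function name, the toy inputs, and the symmetric square-root form of the transform are assumptions for illustration, not the operational JEFS implementation.

```python
import numpy as np

def etkf_analysis_perturbations(Xf, H, R):
    """ETKF-style transform of forecast perturbations into analysis perturbations.

    Xf: (n_state, k) forecast perturbations (deviations from the ensemble mean)
    H:  (n_obs, n_state) linear observation operator
    R:  (n_obs, n_obs) observation-error covariance
    """
    k = Xf.shape[1]
    L = np.linalg.cholesky(R)                           # R = L L^T
    S = np.linalg.solve(L, H @ Xf) / np.sqrt(k - 1)     # whitened obs-space perturbations
    gamma, C = np.linalg.eigh(S.T @ S)                  # eigenpairs of the k x k matrix
    T = C @ np.diag(1.0 / np.sqrt(gamma + 1.0)) @ C.T   # symmetric square-root transform
    return Xf @ T                                       # analysis perturbations (n_state, k)

# Toy usage with random numbers, purely to show the shapes involved.
rng = np.random.default_rng(0)
Xf = rng.standard_normal((100, 10))          # 100 state variables, 10 members
Xf -= Xf.mean(axis=1, keepdims=True)         # perturbations about the ensemble mean
H = np.eye(20, 100)                          # observe the first 20 state variables
R = 0.5 * np.eye(20)
Xa = etkf_analysis_perturbations(Xf, H, R)
print(Xa.shape)                              # (100, 10)
```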

  9. JEFS Production Schedule (timeline of the 06Z and 18Z production cycles, 00-24Z)
  • Inputs: 00Z, 06Z, 12Z, and 18Z cycle data; GFS ensemble grids to AFWA and FNMOC; NOGAPS ensemble grids to AFWA
  • JGE steps: transform and calibrate the JGE; make/distribute JGE products; obtain global analyses; update JGE calibration
  • JME steps: data assimilation; exchange data and run the ETKF; run the JME models; exchange output; make/distribute JME products; update JME calibration

  10. Product Design Approach
  Tailor products to the customer's needs and weather sensitivities.
  • Forecaster Products: designed to help the transition from deterministic to stochastic thinking
  • Warfighter Products: designed to aid critical decision making (Operational Risk Management)

  11. JEFS Users for Product Design and Testing: 20th Operational Weather Squadron (OWS), 607th Weather Squadron, FIFTH Air Force

  12. JEFS Users for Product Design and Testing: Yokosuka's Naval Pacific Meteorological and Oceanographic Center, SEVENTH Fleet

  13. Consensus Plot with Model Confidence Information
  • Consensus (isopleths): shows the "best guess" forecast (ensemble mean or median)
  • Model Confidence (shaded): increased spread among the different forecasts means less predictability and decreased confidence in the forecast
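  A minimal sketch of the two fields behind such a plot, assuming the members are stacked in a numpy array: the consensus is the member mean (or median), and the confidence shading comes from the member-to-member spread. Names and array layout are illustrative.

```python
import numpy as np

def consensus_and_spread(members: np.ndarray, use_median: bool = False):
    """members: (n_members, n_lat, n_lon) array of one forecast field.

    Returns (consensus, spread): the "best guess" field and the standard
    deviation across members (larger spread = lower model confidence).
    """
    consensus = np.median(members, axis=0) if use_median else members.mean(axis=0)
    spread = members.std(axis=0, ddof=1)
    return consensus, spread
```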

  14. Probability Plot
  • Probability of occurrence of any weather variable/threshold (e.g., surface winds > 25 kt)
  • Can be tailored to critical sensitivities, or interactive (as in IGRADS on JAAWIN)
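  Before calibration, the ensemble probability at each grid point is simply the fraction of members exceeding the threshold. A sketch, again assuming the members are stacked in a numpy array (names illustrative):

```python
import numpy as np

def exceedance_probability(members: np.ndarray, threshold: float) -> np.ndarray:
    """members: (n_members, n_lat, n_lon) field, e.g. surface wind speed in kt.

    Returns the percentage of members exceeding `threshold` at each grid
    point, i.e. the raw (uncalibrated) ensemble probability, 0-100%.
    """
    return 100.0 * (members > threshold).mean(axis=0)

# e.g., probability of surface winds > 25 kt:
# prob = exceedance_probability(sfc_wind_members, 25.0)
```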

  15. Multimeteogram (vs. the current deterministic meteogram)
  • Shows the range of possibilities for all meteogram-type variables
  • A box-and-whisker plot is more appropriate for a large ensemble
  • Excellent tool for point forecasting (deterministic or stochastic)
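  For a box-and-whisker multimeteogram at a station, each lead time reduces to a handful of percentiles across the members. A sketch under the assumption that the point forecasts sit in a (n_members, n_leadtimes) array; the percentile choices are illustrative.

```python
import numpy as np

def box_whisker_stats(point_forecasts: np.ndarray) -> np.ndarray:
    """point_forecasts: (n_members, n_leadtimes) values of one variable at a station.

    Returns the 5th/25th/50th/75th/95th percentiles per lead time, the usual
    ingredients of one box-and-whisker multimeteogram panel (shape: 5 x n_leadtimes).
    """
    return np.percentile(point_forecasts, [5, 25, 50, 75, 95], axis=0)
```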

  16. Probagram (Probabilistic Meteogram): Probability of Warning Criteria at Osan AB (JME 15/06)
  • "Heads up" for the forecaster on what to focus on and when a warning may be required
  • Encourages probabilistic thinking
  • Could be displayed in conjunction with an Extreme Forecast Index

  17. Integrated Weather Effects Decision Aid (IWEDA): Bridging the Gap
  Stochastic forecasts (e.g., AR route clear & 7, T-storm within 5, GPS scintillation, crosswinds, flight hazards) must be bridged to binary decisions/actions (e.g., Go / No Go, IFR / VFR, bombs on target, in / out of limits).
  Decision Theory: Minimize the cost of operating (in the long run) by choosing an optimal threshold of probability for taking protective action.
  - What is the cost of protecting?
  - What is the loss if the event occurs and I did not protect?
  Probabilistic IWEDA: a tool for Operational Risk Management (ORM)

  18. Decision Theory Example
  Critical event: surface winds > 50 kt. Cost (of protecting): $150K. Loss (if damage occurs unprotected): $1M.

                        Forecast YES (protect)    Forecast NO (no action)
  Observed YES          Hit: $150K                Miss: $1,000K
  Observed NO           False Alarm: $150K        Correct Rejection: $0K

  Optimal threshold = Cost / Loss = $150K / $1M = 15%: protect whenever the forecast probability of the event exceeds 15%.
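  The 15% threshold falls straight out of the cost-loss comparison: protecting costs $150K, while doing nothing carries an expected loss of p x $1M, so protection pays once p > 150/1000 = 0.15. A small sketch of that comparison (values from the slide; the function name is illustrative):

```python
def should_protect(p_event: float,
                   cost: float = 150_000,
                   loss: float = 1_000_000) -> bool:
    """Protect when the expected loss of not protecting exceeds the cost of protecting.

    The break-even probability is cost / loss (here 150K / 1M = 0.15).
    """
    expected_loss_unprotected = p_event * loss
    return expected_loss_unprotected > cost

# e.g. should_protect(0.20) -> True  (20% > 15%: pay $150K to protect)
#      should_protect(0.10) -> False (10% < 15%: accept the risk)
```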

  19. Challenges and Players
  • Model Perturbations
  • Ensemble Calibration
  • Products
  • Training, Education, and Sales
  • Analysis and Verification

  20. JEFS Milestones (timeline, Jan 05 through Oct 06; a 1-yr span is marked on the chart)
  • JGE track: GFS ensemble grids; NOGAPS ensemble grids to AFWA; build JGE archive; uncalibrated JGE products; calibrate JGE; JGE
  • JME track: JME model perturbations; data assimilation and ETKF; build JME archive; uncalibrated JME products; calibrate JME; JME; implement 5-km nest and updated model perturbations
