RFC Verification Workshop: Plans for expanding probabilistic forecast verification
James Brown, James.D.Brown@noaa.gov

Presentation Transcript


  1. RFC Verification Workshop Plans for expanding probabilistic forecast verification James Brown James.D.Brown@noaa.gov

  2. Contents
  • 1. Future of EVS
    • Release schedule
    • Planned improvements
  • 2. New verification techniques
    • Real-time forecasting
    • Screening verification datasets
  • 3. Discussion and feedback survey

  3. 1. Future of EVS

  4. Release schedule
  • XEFS is outside of AWIPS
  • First limited release of EVS 1.0 beta
    • MARFC and CNRFC to conduct initial tests
    • Beginning 09/07
  • Other RFCs?
    • After initial tests complete (1-2 months)
    • Depends on expressions of interest

  5. Planned improvements
  • Managing workload
    • Batch processing (for forecast groups, etc.)
    • Tailor interface for different users
    • Functionality for screening results
  • Improved documentation/help
    • Use cases for different metrics
    • Improved user’s manual and online help
  • Confidence intervals for metrics (see the sketch below)
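Why confidence intervals matter here: a metric computed from a finite verification sample is itself uncertain. Below is a minimal sketch of one common approach, a percentile bootstrap around the Brier score. This is an illustration only, not the planned EVS implementation; the function names, synthetic data, and 95% level are assumptions.

    import numpy as np

    def brier_score(p, o):
        # Mean squared difference between forecast probabilities and 0/1 outcomes.
        return np.mean((p - o) ** 2)

    def bootstrap_ci(p, o, n_boot=1000, alpha=0.05, seed=1):
        # Percentile bootstrap: resample forecast/observation pairs with
        # replacement and take quantiles of the recomputed metric.
        rng = np.random.default_rng(seed)
        n = len(p)
        scores = np.empty(n_boot)
        for i in range(n_boot):
            idx = rng.integers(0, n, size=n)
            scores[i] = brier_score(p[idx], o[idx])
        return np.quantile(scores, [alpha / 2, 1 - alpha / 2])

    # Synthetic demo data: 500 probability forecasts of a binary event.
    rng = np.random.default_rng(0)
    p = rng.uniform(0, 1, 500)
    o = (rng.uniform(0, 1, 500) < p).astype(float)
    print(brier_score(p, o), bootstrap_ci(p, o))

The interval narrows as the verification sample grows, which is exactly the information a single point estimate of a metric hides.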

  6. Planned improvements
  • Longer-term goals
    • Common platform for ensemble forecasting: XEFS, with a common appearance and functions
    • Led by HSEB, Steve Shumate, and others
    • Common platform for verification (with IVP)

  7. 2. New verification methods

  8. Prognostic verification
  • The principle
    • A live forecast is issued X days into the future
    • How well is it likely to perform?
    • How did similar forecasts perform in the past?
  • Three sources of information (sketched below):
    • a) The forecaster: what determines ‘similar’?
    • b) The past forecasts that are ‘similar’
    • c) Past observations corresponding to (b)
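One way to read the principle in code: treat the forecast archive as a lookup table, find the k past forecasts most similar to the live one, and use their verified errors as an estimate of likely performance. This is a hedged sketch, not the method from the slides; the feature vector, Euclidean distance, and k = 50 are illustrative assumptions.

    import numpy as np

    def analog_errors(live_features, past_features, past_errors, k=50):
        # Distance from the live forecast to every archived forecast, using
        # whatever features define 'similar' (e.g. ensemble mean, spread, season).
        dist = np.linalg.norm(past_features - live_features, axis=1)
        nearest = np.argsort(dist)[:k]        # indices of the k closest analogs
        return past_errors[nearest]           # their verified errors

    # Synthetic demo: 1000 archived forecasts described by 3 features each.
    rng = np.random.default_rng(0)
    archive = rng.normal(size=(1000, 3))
    errors = rng.normal(size=1000)
    live = np.zeros(3)
    print(np.percentile(analog_errors(live, archive, errors), [10, 50, 90]))

The empirical distribution of analog errors then stands in for the likely performance of the live forecast.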

  9. Huntingdon, PA: 1st June 1994

  10. Lead day 1 from previous example

  11. North Fork, CA: 13th June 2003

  12. North Fork, CA: 13th June 2003 Lead day 1

  13. Current status
  • Very early stage
    • Preparing a manuscript on methods
    • Generated several examples (with code)
  • Critical questions
    • How to select ‘similar’ forecasts?
    • Conditioning may be simple or complex (see the sketch below)
    • What types of products (graphics, etc.)?
    • What functionality to include in tools?
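To make the ‘simple or complex’ distinction concrete, here is a hypothetical example of simple conditioning: hard filtering of the archive on calendar month and a forecast-magnitude bin. The field names and values are invented for illustration only.

    import numpy as np

    # Hypothetical forecast archive: month issued, magnitude bin, verified error.
    archive = np.array(
        [(6, 2, 0.12), (6, 2, 0.30), (1, 0, 0.05), (6, 1, 0.22)],
        dtype=[("month", "i4"), ("flow_bin", "i4"), ("error", "f8")],
    )

    def simple_condition(arc, month, flow_bin):
        # 'Simple' conditioning: exact match on month and magnitude bin.
        return arc[(arc["month"] == month) & (arc["flow_bin"] == flow_bin)]

    print(simple_condition(archive, month=6, flow_bin=2)["error"])

A more complex scheme would replace the hard bins with, say, a kernel weighting over several continuous attributes at once, which is the trade-off the question on this slide points at.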

  14. ‘Meta-verification’
  • Screening large verification datasets
    • Large volumes of data produced by EVS
    • End-users need condensed data
    • But a single summary metric gives a biased view
    • Better to build rules using several metrics (see the sketch below)
    • Make rules ‘aware’ of forecasting situations
  • Status
    • Will investigate possibilities soon (e.g. AI)
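As a sketch of what rule-based screening could look like: flag a case only when several metrics agree it is poor, rather than ranking on one summary score. The metric names, values, and thresholds below are assumptions, not EVS output.

    import numpy as np

    # One entry per (location, lead time); values are illustrative.
    results = {
        "correlation": np.array([0.90, 0.40, 0.80]),
        "rel_bias":    np.array([0.05, 0.50, 0.02]),
        "crpss":       np.array([0.30, -0.10, 0.25]),
    }

    # Flag only cases that fail on all three metrics, so that no single
    # metric's biased view drives the screening.
    flagged = (
        (results["correlation"] < 0.5)
        & (np.abs(results["rel_bias"]) > 0.2)
        & (results["crpss"] < 0.0)
    )
    print(np.where(flagged)[0])   # -> [1]

Making such rules ‘aware’ of the forecasting situation would amount to letting the thresholds vary with, for example, season or flow regime.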

  15. 3. Discussion and feedback

  16. Questions
  • What plans for ensemble verification?
  • What priorities for diagnostic verification?
  • What priorities for real-time verification?
  • Ideas on specific products for each?
