NWS TAF Verification


Presentation Transcript


  1. NWS TAF Verification Brandi Richardson NWS Shreveport, LA

  2. Do we care how our forecasts verify? NO!

  3. Do we care how our forecasts verify?
  • Yes!
  • The NWS measures verification by many means
    • Probability of Detection (POD)
    • False Alarm Ratio (FAR)
    • Critical Success Index (CSI)
    • Percent Improvement
  • Set goals for verification
  • Local offices add their own flavor
    • Total IFR (IFR, LIFR, VLIFR)

  4. Why is verification important?
  • Need to know what to improve
    • Lose credibility if too many forecasts are wrong
    • Lose customers
    • Lose jobs
  • Additional training
  • New techniques
  • Improved model guidance
  • Need to know what we are doing well

  5. NWS TAF Verification
  • TAFs evaluated 12 times per hour (every five minutes), or 288 times per 24-hour period
  • TAFs compared to ASOS five-minute observations
    • ASOS = Automated Surface Observing System, located at TAF airports
  • Stats calculated by flight category
    • i.e., VFR, MVFR, IFR, LIFR, VLIFR
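  As a concrete illustration of the flight categories, here is a minimal Python sketch mapping a ceiling and visibility to a category. The LIFR/IFR/MVFR/VFR breakpoints are the standard aviation ones; the VLIFR cutoff is an assumption, since the slides do not define it and verification software may draw that line differently.

```python
def flight_category(ceiling_ft, visibility_mi):
    """Map ceiling (ft AGL) and visibility (statute miles) to a flight category.

    Standard aviation breakpoints for LIFR/IFR/MVFR/VFR; the VLIFR
    threshold is an assumption (the slides do not define it).
    """
    if ceiling_ft < 200 or visibility_mi < 0.5:
        return "VLIFR"  # assumed cutoff
    if ceiling_ft < 500 or visibility_mi < 1.0:
        return "LIFR"
    if ceiling_ft < 1000 or visibility_mi < 3.0:
        return "IFR"
    if ceiling_ft <= 3000 or visibility_mi <= 5.0:
        return "MVFR"
    return "VFR"
```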

  6. Probability of Detection
  • How often did we correctly forecast a particular flight category to occur?
  • Also known as “Accuracy”
  • POD = V / (V + M)
    • V = forecasted and verified events
      • Ex: IFR conditions forecasted…IFR conditions occurred
    • M = missed events
      • Ex: VFR conditions forecasted…IFR conditions occurred
  • Ranges from 0 – 1, 1 being perfect
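  A minimal sketch of the POD computation, assuming we already have paired forecast/observed flight categories, one pair per five-minute evaluation; the names and data layout are illustrative, not the actual NWS verification software.

```python
def pod(pairs, category="IFR"):
    """POD = V / (V + M) for one flight category.

    pairs: iterable of (forecast, observed) category strings,
    one pair per five-minute evaluation time.
    """
    pairs = list(pairs)
    v = sum(1 for f, o in pairs if f == category and o == category)  # forecasted and verified
    m = sum(1 for f, o in pairs if f != category and o == category)  # missed events
    return v / (v + m) if (v + m) else None
```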

  7. False Alarm Ratio
  • How often did we forecast a particular flight category that did not occur?
    • i.e., how often did we “cry wolf”?
  • FAR = U / (U + V)
    • U = forecasted and unverified events
      • Ex: IFR conditions forecasted…VFR conditions occurred
    • V = forecasted and verified events
      • Ex: IFR conditions forecasted…IFR conditions occurred
  • Ranges from 0 – 1, 0 being perfect
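  Continuing the same illustrative setup, FAR counts the forecasts that did not verify:

```python
def far(pairs, category="IFR"):
    """FAR = U / (U + V) for one flight category."""
    pairs = list(pairs)
    u = sum(1 for f, o in pairs if f == category and o != category)  # forecasted, unverified
    v = sum(1 for f, o in pairs if f == category and o == category)  # forecasted, verified
    return u / (u + v) if (u + v) else None
```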

  8. Critical Success Index
  • CSI = V / (V + M + U)
    • V = forecasted and verified events
      • Ex: IFR conditions forecasted…IFR conditions occurred
    • M = missed events
      • Ex: VFR conditions forecasted…IFR conditions occurred
    • U = forecasted and unverified events
      • Ex: IFR conditions forecasted…VFR conditions occurred
  • Ranges from 0 – 1, 1 being perfect
  • Incorporates both POD and FAR
  • Overall score of performance
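  In the same sketch style, CSI combines hits, misses, and false alarms in one score:

```python
def csi(pairs, category="IFR"):
    """CSI = V / (V + M + U): hits over hits + misses + false alarms."""
    pairs = list(pairs)
    v = sum(1 for f, o in pairs if f == category and o == category)  # verified
    m = sum(1 for f, o in pairs if f != category and o == category)  # missed
    u = sum(1 for f, o in pairs if f == category and o != category)  # unverified
    return v / (v + m + u) if (v + m + u) else None
```

  Over a full day, each of these functions would see the 288 (forecast, observed) pairs per airport described in slide 5.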

  9. Percent Improvement
  • Forecaster CSI vs. Model Guidance CSI
  • Did we beat the model?
  [Cartoon — GFS: “IFR will prevail…” Forecaster: “IFR?! It’s July and dew points are in the 20s! Take that!”]
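  The slide does not give a formula, but percent improvement over guidance is commonly computed as the relative change in CSI; the sketch below uses that common convention, which may not match the exact NWS definition.

```python
def percent_improvement(csi_forecaster, csi_guidance):
    """Relative change of forecaster CSI over model guidance CSI, in percent.

    Common relative-change convention; the official NWS formula may differ.
    """
    return 100.0 * (csi_forecaster - csi_guidance) / csi_guidance
```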

  10. 2009 NWS Goals
  • The NWS has set goals for TAF forecasts
  • For total IFR (includes IFR, LIFR, and VLIFR):
    • POD ≥ 0.640 (64%)
    • FAR ≤ 0.430 (43%)
  • How do we measure up?...
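  For completeness, a tiny sketch of checking a station's computed total-IFR stats against these goals; the function and variable names are illustrative.

```python
# 2009 NWS goals for total IFR (IFR + LIFR + VLIFR), from the slide.
POD_GOAL = 0.640
FAR_GOAL = 0.430

def meets_2009_goals(pod_total_ifr, far_total_ifr):
    """True if a station's total-IFR stats meet both goals."""
    return pod_total_ifr >= POD_GOAL and far_total_ifr <= FAR_GOAL
```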

  11. Examples of Local TAF Verification

  12. Examples of Local TAF Verification

  13. Examples of Local TAF Verification

  14. The Bottom Line
  • Sometimes we do get the forecast wrong.
  • Examining TAF verification statistics helps us find our weaknesses and find ways to improve our forecasts.
  • The NWS strives to provide quality products and services to our aviation customers and partners.
