  1. WMO/CLIPS training module Predictability and prediction 4.4 Verification of SST forecasts Michael Davey and Andrew Colman Seasonal Prediction Group version 21 November 2000

  2. Verification of SST forecasts Introduction 1 There are several ways of predicting sea surface temperature (SST) for several months ahead. The skill of such schemes is assessed by verifying the predictions against observations. In this module some common verification methods are described and illustrated. It is assumed that monthly average SST anomalies are predicted, as that is what most schemes forecast at present.

3. Verification of SST forecasts
Introduction 2
• It is assumed that the SST predictions are deterministic: i.e. each forecast provides a specific SST value. (An extension to probabilistic predictions will be added when this module is revised.)
• Verification examples are provided for:
• a regional average (the Niño3 region: 5°N-5°S, 150°W-90°W)
• a field of values on a global latitude-longitude grid.
• The examples are taken from a statistical scheme for global SST prediction.

4. Verification of SST forecasts
Introduction 3
Common types of verification in use for SST:
• graphs of timeseries
• scatterplots
• anomaly correlation
• root mean square error
Some verification measures for time series will be described.
Note: significance and error bars are important aspects that are not discussed in this module.

5. Verification of SST forecasts
Observed SST - gridded datasets
By analysing and combining various observational sources (direct temperature measurements and remote sensing), several centres routinely produce globally complete gridded SST datasets that are convenient to use for verification.
Examples:
• Hadley Centre Sea Ice and SST (HadISST) from the Met Office
• 2DVarSST from NCEP
Monthly fields are available on 1° x 1° latitude-longitude grids.

6. Verification of SST forecasts
Observed SST - climatology (denoted SSTclim)
Climatological fields are also available for each month, obtained by averaging conditions over several years. Climatologies are currently provided based on the 1961-90 period. For example, the January 1961-90 climatology is the average of all January SST fields in 1961-90, as in the sketch below.
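For concreteness, a minimal numpy sketch of this monthly averaging, assuming a continuous monthly record that starts in January; the function and array names are illustrative, not part of the module:

```python
import numpy as np

def monthly_climatology(sst, years, base=(1961, 1990)):
    # sst:   array of shape (n_months, nlat, nlon), first field = January of years[0]
    # years: calendar year of each monthly field, shape (n_months,)
    years = np.asarray(years)
    months = np.arange(len(years)) % 12                  # 0 = Jan, ..., 11 = Dec
    in_base = (years >= base[0]) & (years <= base[1])    # e.g. the 1961-90 period
    return np.stack([sst[(months == m) & in_base].mean(axis=0)
                     for m in range(12)])                # shape (12, nlat, nlon)

# anomaly of the field for month i of the record:
#   ssta = sst[i] - clim[i % 12]
```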

  7. Verification of SST forecasts Observed SST example - SST anomaly (SST - SSTclim) December 1997

8. Verification of SST forecasts Observed SST example - timeseries of Niño3 SST anomaly (SST - SSTclim) from HadISST

9. Verification of SST forecasts
Predictions - lead time
Definition: a lead time of M months means that there are M months between the time the prediction is made and the time that the prediction is for.
Examples:
• For a forecast of SST in April that is issued in January, the lead time is 2 months.
• For a forecast of Oct-Nov-Dec SST that is issued in September, the lead time is 0 months.

10. Verification of SST forecasts Predictions - example - forecasts of Niño3 SST anomaly for lead times 0 to 6 months, starting from end January, for 1990-1999

  11. Verification of SST forecasts Predictions - example Forecast of global SST anomaly for December 1997 at lead time 3 months

12. Verification of SST forecasts
Predictions - persistence
One particularly simple prediction strategy is to assume that the observed SST anomaly remains unchanged (simple persistence). A similar strategy is to damp the observed SST anomaly as lead time increases, using a scale factor derived from e.g. the lag correlation (damped persistence).
For many regions SST anomaly persistence is an effective strategy at a range of a few months, and it is common to compare SST forecasts with persistence forecasts. Damped persistence is used for the comparisons in this module, as in the sketch below.
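A minimal sketch of both strategies, assuming the lag correlations have already been estimated from the historical record; names and sample values are illustrative:

```python
import numpy as np

def persistence_forecast(anom_now, max_lead, lag_corr=None):
    # anom_now: latest observed SST anomaly (e.g. Niño3 area average)
    # Returns forecasts for lead times 0 .. max_lead months.
    if lag_corr is None:
        # simple persistence: the anomaly is kept unchanged
        return np.full(max_lead + 1, anom_now)
    # damped persistence: the anomaly decays following the lag correlation
    # of the observed record (lag_corr[m] = correlation at lag m months)
    return anom_now * np.asarray(lag_corr)[:max_lead + 1]

# example: a +1.5 K anomaly damped over 3 months of lead time
print(persistence_forecast(1.5, 3, lag_corr=[1.0, 0.9, 0.75, 0.6]))
```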

13. Verification of SST forecasts Predictions - persistence example (panels: damped persistence; simple persistence)

14. Verification of SST forecasts
Verification - notation
mean value (average) of N datapoints xi:
  <x> = (1/N) Σ xi   (sum over i = 1, ..., N)
variance:
  var = (1/N) Σ (xi - <x>)²
standard deviation:
  σ = var^(1/2)
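These definitions translate directly into code; a small numpy check, with arbitrary sample values:

```python
import numpy as np

x = np.array([0.3, -0.1, 0.8, 0.2])     # N = 4 sample datapoints
mean = x.mean()                          # <x> = (1/N) * sum of xi
var = ((x - mean) ** 2).mean()           # var = (1/N) * sum of (xi - <x>)^2
sigma = np.sqrt(var)                     # σ = var^(1/2)
# numpy's built-ins use the same 1/N convention by default:
assert np.isclose(var, x.var()) and np.isclose(sigma, x.std())
```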

15. Verification of SST forecasts
Verification - notation
SSTobs: observed sea surface temperature
SSTpred: predicted sea surface temperature
Note: an anomaly may be relative to climatology (SST - SSTclim) or relative to a mean value (SST - <SST>).
We will use the notation SSTA = SST - SSTclim and SST´ = SST - <SST>.

  16. Verification of SST forecasts Verification - visual The timeseries of Niño3 forecasts in earlier slides included the observed SST anomalies for the same period, thus providing a simple visual comparison. A similar comparison is to plot the forecasts for a particular lead time vs observations:

17. Verification of SST forecasts Verification - visual Example - side-by-side global maps of forecast and observed SST anomalies (panels: predicted for December 1997; observed December 1997)

  18. Verification of SST forecasts Verification - scatterplot

19. Verification of SST forecasts
Verification - categories
Example - numbers of Niño3 area-average SST anomalies in three categories (Below/Near/Above normal), from the data on the previous slide:

              Observed
              B    N    A
Forecast  B  15    5    0
          N   7   13    6
          A   1    4   17
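A sketch of how such a table can be built; the category thresholds here are illustrative (terciles of the observed record are another common choice):

```python
import numpy as np

def contingency_table(pred, obs, lower=-0.5, upper=0.5):
    # Assign each anomaly to a category: 0 = Below, 1 = Near, 2 = Above.
    def category(x):
        return np.digitize(np.asarray(x), [lower, upper])
    table = np.zeros((3, 3), dtype=int)
    for f, o in zip(category(pred), category(obs)):
        table[f, o] += 1               # rows: forecast, columns: observed
    return table
```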

20. Verification of SST forecasts
Verification - correlation
anomaly correlation:
  r = < (SSTobs - <SSTobs>) (SSTpred - <SSTpred>) > / (σobs σpred)
    = < SST´obs SST´pred > / (σobs σpred)
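In code this is simply the product-moment correlation of the two anomaly timeseries; a minimal numpy version of the formula above:

```python
import numpy as np

def anomaly_correlation(pred, obs):
    p = np.asarray(pred) - np.mean(pred)    # SST´pred
    o = np.asarray(obs) - np.mean(obs)      # SST´obs
    return np.mean(p * o) / (p.std() * o.std())

# equivalent to np.corrcoef(pred, obs)[0, 1]
```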

  21. Verification of SST forecasts Verification - anomaly correlation - Niño3 example

  22. Verification of SST forecasts Verification - anomaly correlation example - Niño3 - contour map

  23. Verification of SST forecasts Verification - anomaly correlation - seasonal Niño3 example

24. Verification of SST forecasts Verification - anomaly correlation - global map example (panels: 6 month lead; 3 month lead)

25. Verification of SST forecasts
Verification - mean square error - definitions
mean square error:
  mse = < (SSTpred - SSTobs)² > = < (SSTApred - SSTAobs)² >
      = < (SST´pred - SST´obs)² > + ( <SSTpred - SSTobs> )²
mean square anomaly error:
  msae = < (SST´pred - SST´obs)² >
root mean square anomaly error:
  rmsae = msae^(1/2)
normalised root mean square anomaly error:
  nrmsae = rmsae / σobs
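A numpy sketch of these definitions, including a numerical check that mse decomposes into msae plus the squared mean bias, as in the second line above:

```python
import numpy as np

def error_scores(pred, obs):
    pred, obs = np.asarray(pred, float), np.asarray(obs, float)
    mse = np.mean((pred - obs) ** 2)
    bias = np.mean(pred - obs)
    msae = np.mean(((pred - pred.mean()) - (obs - obs.mean())) ** 2)
    assert np.isclose(mse, msae + bias ** 2)   # the decomposition above
    rmsae = np.sqrt(msae)
    nrmsae = rmsae / obs.std()                 # normalised by σobs
    return mse, msae, rmsae, nrmsae
```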

26. Verification of SST forecasts Verification - rmsae and nrmsae - Niño3 example (panels: rmsae; normalised rmsae, scaled by σobs)

27. Verification of SST forecasts Verification - normalised rmsae - global example (panels: 3 month lead; 6 month lead)

28. Verification of SST forecasts
Verification - skill score
A skill score measures the skill of one prediction scheme against another, reference scheme.
root mean square skill score:
  rmsss = 1 - rmsae(forecast) / rmsae(reference)
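A minimal sketch, taking the reference forecasts (e.g. damped persistence) as a second input series verified over the same cases:

```python
import numpy as np

def rmsae(pred, obs):
    p = np.asarray(pred) - np.mean(pred)
    o = np.asarray(obs) - np.mean(obs)
    return np.sqrt(np.mean((p - o) ** 2))

def rmsss(pred, ref, obs):
    # rmsss > 0: the forecast beats the reference; rmsss = 1: perfect forecast
    return 1.0 - rmsae(pred, obs) / rmsae(ref, obs)
```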

  29. Verification of SST forecasts Verification - rmsss - Niño3 example - reference is damped persistence

30. Verification of SST forecasts Verification - rmsss - global example (panels: 3 month lead; 6 month lead)

31. Verification of SST forecasts
References
Extensive further references can be found in these sources:
• Experimental Long-Lead Forecast Bulletin, http://www.iges.org/ellfb
• Latif, M., et al., 1998: A review of the predictability and prediction of ENSO. J. Geophys. Res., 103(C7), 14375-14393.
END
