
Mesoscale Atmospheric Model Validation Using Self-Organizing Maps (SOMs)


Presentation Transcript


  1. Mesoscale Atmospheric Model Validation Using Self-Organizing Maps (SOMs)
  John J. Cassano and Amanda H. Lynch
  Cooperative Institute for Research in Environmental Sciences, University of Colorado at Boulder

  2. Outline
  • ARCMIP Model simulations
  • “Typical” Model Validation Strategy
  • What are SOMs?
  • Examples of using SOMs for model validation
  • Use of SOMs for AMPS Validation
  • Conclusions

  3. Overview of ARCMIP
  • ARCMIP
    • Arctic Regional Climate Model Intercomparison Project
  • The goal of ARCMIP is to improve the simulation of Arctic regional climate in numerical models
  • Initial ARCMIP experiments focus on atmospheric simulations
    • Specify sea ice concentration, sea surface temperature, and sea ice temperature
  • Evaluate models based on:
    • SHEBA column observations
    • Large-scale atmospheric analyses
    • Satellite observations

  4. ARCMIP SHEBA Domain

  5. ARCMIP Model Runs
  • ARCSyM
    • Original ARCSyM simulation
  • ARCSyM cld/pbl
  • HIRHAM
  • Polar MM5

  6. “Typical” Model Evaluation Strategy
  • Compare model and observed fields directly
    • Time series of observed and modeled variables
    • Model validation statistics (bias, RMSE, correlation, etc.)
    • Case study evaluations
  • Compare model data with observational analyses
    • e.g., difference of monthly or seasonal mean sea-level pressure
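
As a concrete illustration of the point-validation statistics named on this slide, the sketch below computes bias, RMSE, and Pearson correlation for a paired model/observation time series with NumPy. The synthetic sea-level pressure series is purely illustrative and is not data from the study.

```python
import numpy as np

def validation_stats(model, obs):
    """Standard point-validation statistics for paired model/observation series."""
    model = np.asarray(model, dtype=float)
    obs = np.asarray(obs, dtype=float)
    bias = np.mean(model - obs)                    # mean error
    rmse = np.sqrt(np.mean((model - obs) ** 2))    # root-mean-square error
    corr = np.corrcoef(model, obs)[0, 1]           # Pearson correlation
    return bias, rmse, corr

# Illustrative use with a synthetic sea-level pressure series (hPa)
obs = 1000.0 + 10.0 * np.sin(np.linspace(0, 6 * np.pi, 240))
model = obs + np.random.default_rng(0).normal(1.5, 2.0, obs.size)  # biased, noisy "forecast"
print(validation_stats(model, obs))
```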

  7. “Typical” Model Evaluation Strategy
  • Advantages
    • Simple techniques with easy interpretation
    • Highlights differences between models and analyses as well as inter-model differences
  • Disadvantages
    • Neglects differences in synoptic events
      • These events are the items of interest for operational weather forecasting applications
    • Similar seasonal mean SLP may mask differences in simulated synoptic climatology
    • Can be difficult to gain physical insight into the source of model errors

  8. What are SOMs?
  • SOM: Self-Organizing Map
  • The SOM technique uses an unsupervised learning algorithm (neural net)
    • More details about the technique are given in the extended abstract
  • Clusters data into a user-selected number of nodes
  • The SOM algorithm attempts to find nodes that are representative of the data in the training set
    • More nodes in areas of observation space with many data points
    • Fewer nodes in areas of observation space with few data points
  • SOMs are in use across a wide range of disciplines
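
To make the unsupervised training step concrete, the NumPy sketch below trains a small SOM on synthetic flattened fields. The grid size, learning rate, and neighborhood schedule are illustrative assumptions, not the settings used by the authors, and the code is a minimal sketch rather than their implementation.

```python
import numpy as np

def train_som(data, rows=3, cols=4, n_iter=2000, lr0=0.5, sigma0=1.5, seed=0):
    """Train a small SOM: each node holds a reference vector the size of one input field."""
    rng = np.random.default_rng(seed)
    n_samples, n_features = data.shape
    nodes = rng.normal(size=(rows * cols, n_features))           # node reference vectors
    grid = np.array([(r, c) for r in range(rows) for c in range(cols)], dtype=float)

    for t in range(n_iter):
        x = data[rng.integers(n_samples)]                        # random training sample
        bmu = np.argmin(((nodes - x) ** 2).sum(axis=1))          # best-matching unit
        lr = lr0 * (1.0 - t / n_iter)                            # decaying learning rate
        sigma = sigma0 * (1.0 - t / n_iter) + 0.5                # decaying neighborhood radius
        dist2 = ((grid - grid[bmu]) ** 2).sum(axis=1)            # grid distance to the BMU
        h = np.exp(-dist2 / (2.0 * sigma ** 2))                  # neighborhood function
        nodes += lr * h[:, None] * (x - nodes)                   # pull neighbors toward the sample

    return nodes.reshape(rows, cols, n_features)

# Illustrative training set: 500 flattened "SLP anomaly fields" of 100 grid points each
fields = np.random.default_rng(1).normal(size=(500, 100))
som = train_som(fields)
print(som.shape)  # (3, 4, 100): a 3x4 map of representative patterns
```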

  9. Synoptic Pattern Classification: A Sample SOM

  10. Application of SOMs for Model Validation Studies
  • Synoptic pattern classification
  • Frequency of occurrence of synoptic patterns
  • Determine model errors for different synoptic patterns

  11. Application of SOMs for Model Validation Studies
  • Use model or analysis data as a training set for the SOM algorithm
  • Once the SOM is trained, data can be mapped to the nodes
    • Can do this with:
      • Training data
      • Other analyses
      • Model predictions
  • Determine frequency of occurrence of each node (synoptic pattern)
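
A minimal sketch of the mapping and frequency-of-occurrence step described above: each field is assigned to its best-matching node, and the per-node frequencies of two data sets are compared. The random "trained" node array and the field arrays are placeholders, not data from the study.

```python
import numpy as np

def map_to_nodes(fields, nodes):
    """Assign each field to its best-matching SOM node (index into the flattened node array)."""
    # fields: (n_samples, n_features); nodes: (n_nodes, n_features)
    d2 = ((fields[:, None, :] - nodes[None, :, :]) ** 2).sum(axis=2)
    return d2.argmin(axis=1)

def frequency_of_occurrence(assignments, n_nodes):
    """Percent of samples mapped to each node (the synoptic-pattern frequency distribution)."""
    counts = np.bincount(assignments, minlength=n_nodes)
    return 100.0 * counts / counts.sum()

# Illustrative: a stand-in "trained" SOM of 20 nodes and two data sets mapped onto it
rng = np.random.default_rng(2)
nodes = rng.normal(size=(20, 100))            # placeholder for trained reference vectors
analysis = rng.normal(size=(240, 100))        # e.g., 6-hourly analysis fields
model = rng.normal(size=(240, 100))           # matching model fields

freq_analysis = frequency_of_occurrence(map_to_nodes(analysis, nodes), 20)
freq_model = frequency_of_occurrence(map_to_nodes(model, nodes), 20)
print(freq_model - freq_analysis)             # frequency-of-occurrence error per node
```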

  12. Frequency of Occurrence: ECMWF Analysis

  13. Frequency of Occurrence: Model Errors
  ECMWF frequency, with errors for ARCSyM / ARCSyM cld/pbl / Polar MM5 / HIRHAM:
  • 42: +5 / +1 / +6 / +5
  • 86: -5 / -7 / +2 / -3
  • 43: -6 / 0 / +4 / +2
  • 49: -2 / -5 / -6 / -4
  • 91: +10 / +4 / -2 / +4
  • 49: -2 / +7 / -4 / -4

  14. Model Errors: Frequency of Occurrence

  15. Model Errors: Frequency of Occurrence
  • All models underpredict the frequency of occurrence of the strongest Aleutian low node and overpredict the frequency of occurrence of the weaker Aleutian low node
  • Polar MM5 and HIRHAM simulate nearly correct distributions of the strong Aleutian low, moderate Arctic high, and strong Arctic high synoptic patterns
  • The ARCSyM simulation underestimates strong Arctic high synoptic patterns
  • The ARCSyM cld/pbl simulation overestimates the strong Arctic high synoptic pattern

  16. Misprediction of Synoptic Patterns
  • Consider all of the time periods in which the validation data (analyses) map to a particular node
  • For these time periods, determine which nodes the model predictions map to
  • From this analysis we can determine biases in the model prediction of specific synoptic patterns
    • Percent of cases that map to the correct node
    • Mis-mapping of model predictions between nodes
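
The mis-mapping analysis can be summarized as a node-to-node confusion matrix, as in the sketch below. The node assignments here are synthetic stand-ins, and the 60% "hit" probability is an arbitrary choice for illustration, not a result from the study.

```python
import numpy as np

def node_confusion(analysis_nodes, model_nodes, n_nodes):
    """Rows: node the analysis mapped to; columns: node the model mapped to at the same time."""
    conf = np.zeros((n_nodes, n_nodes), dtype=int)
    for a, m in zip(analysis_nodes, model_nodes):
        conf[a, m] += 1
    return conf

# Illustrative node assignments for 240 matched analysis/model times on a 20-node SOM
rng = np.random.default_rng(3)
analysis_nodes = rng.integers(0, 20, 240)
model_nodes = np.where(rng.random(240) < 0.6, analysis_nodes, rng.integers(0, 20, 240))

conf = node_confusion(analysis_nodes, model_nodes, 20)
hit_rate = 100.0 * np.trace(conf) / conf.sum()                 # percent mapped to the correct node
per_node = 100.0 * np.diag(conf) / np.maximum(conf.sum(axis=1), 1)
print(hit_rate)
print(per_node)                                                # which synoptic patterns map correctly
```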

  17. ARCSyM cld/pbl

  18. Model Errors for Synoptic Patterns
  • Compare model predictions to in-situ atmospheric measurements at the SHEBA site
  • Calculate model validation statistics for all time periods that map to each node
  • Look for model errors that vary from node to node
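
One possible way to compute per-node validation statistics, assuming node assignments and a matched model/observation series at a single site are already in hand; the 2 m temperature values below are synthetic placeholders, not SHEBA observations.

```python
import numpy as np

def per_node_bias_rmse(node_ids, model, obs, n_nodes):
    """Bias and RMSE of a point variable, computed separately for times mapped to each node."""
    err = np.asarray(model, dtype=float) - np.asarray(obs, dtype=float)
    stats = {}
    for node in range(n_nodes):
        e = err[node_ids == node]
        if e.size == 0:
            continue                                   # node never occurred in this period
        stats[node] = (e.mean(), np.sqrt((e ** 2).mean()))
    return stats

# Illustrative: 2 m temperature at a single site, grouped by the SOM node of each time step
rng = np.random.default_rng(4)
node_ids = rng.integers(0, 20, 240)                    # node assignment of the analysis at each time
obs = rng.normal(-20.0, 5.0, 240)                      # stand-in observed values (deg C)
model = obs + rng.normal(1.0, 2.0, 240)                # stand-in model values with a warm bias

for node, (bias, rmse) in sorted(per_node_bias_rmse(node_ids, model, obs, 20).items()):
    print(f"node {node:2d}: bias {bias:+5.2f}  rmse {rmse:5.2f}")
```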

  19. Antarctic Mesoscale Prediction System (AMPS)
  • Experimental real-time mesoscale modeling system that covers Antarctica
  • Nested model domains with grid spacing from 90 km to 3.3 km
  • Has provided NWP support to U.S. Antarctic Program forecasters as well as other international Antarctic interests

  20. Use of SOMs for AMPS Validation
  • Evaluate frequency of occurrence of synoptic patterns predicted by AMPS as a function of forecast duration
    • Map 6h, 12h, 18h, … forecasts to a SOM trained with 0h analysis fields
  • Mis-mapping of AMPS forecasts
    • Provide guidance to USAP forecasters as to which synoptic patterns are or are not well forecast
  • Point validation of forecasts
    • Calculate error statistics at points of interest (e.g., the Williams Field skiway) for different synoptic patterns
    • Are certain synoptic patterns prone to bias (e.g., errors in predicted wind speed or direction)?
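
One way the lead-time evaluation could be organized, sketched below: node assignments for forecasts at each lead time are tallied and compared with the 0h analysis frequencies. The lead times, node counts, and drifting assignments are all illustrative assumptions, not AMPS output.

```python
import numpy as np

def lead_time_frequencies(forecast_nodes_by_lead, n_nodes):
    """Frequency of occurrence (%) of each SOM node for each forecast lead time."""
    return {lead: 100.0 * np.bincount(nodes, minlength=n_nodes) / len(nodes)
            for lead, nodes in forecast_nodes_by_lead.items()}

# Illustrative: node assignments of forecasts at several lead times on a 20-node SOM
rng = np.random.default_rng(5)
analysis_nodes = rng.integers(0, 20, 240)                    # 0h analyses used to train the SOM
forecast_nodes_by_lead = {
    lead: np.where(rng.random(240) < 1.0 - 0.05 * (lead // 6),  # forecasts drift from the analysis
                   analysis_nodes, rng.integers(0, 20, 240))
    for lead in (6, 12, 18, 24)
}

freq_analysis = 100.0 * np.bincount(analysis_nodes, minlength=20) / 240
for lead, freq in lead_time_frequencies(forecast_nodes_by_lead, 20).items():
    print(lead, np.abs(freq - freq_analysis).mean())         # mean absolute frequency error vs. lead
```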

  21. Conclusions
  • The use of SOMs provides an alternate method of evaluating model performance
    • Identify synoptic patterns which are over- or underpredicted
    • Determine model tendency for misprediction of certain synoptic types
    • Provide information on model errors related to specific synoptic patterns
  • Validation results from this technique are useful for both climate simulations and NWP
    • Validation results from SOMs can be applied to operational weather forecasting concerns
