
Monitoring Climate: Understanding Past, Present, and Future Changes

This extended version of the 2007 Margary Lecture explores the importance of monitoring climate, the uncertainties in climate models, and the challenges in representing complex processes. It also discusses predictions for future climate changes based on different scenarios.



  1. SAGES: Scottish Alliance for Geoscience, Environment & Society. Why Monitor Climate? An extended version of the 2007 Margary Lecture. Prof. Simon Tett, Chair of Earth System Dynamics & Modelling, The University of Edinburgh

  2. Margary • Margary died in 1976 and his obituary was written by Manley. • He published several papers in the QJ, but his major work was running the Phenological network. • One of the last gentleman amateurs, he also wrote serious books on Roman roads… • The lecture was established for the “broader interest which he and many others found in the manifestation of the British weather”. From Sparks et al., 2000

  3. “Reconstructions of climate data for the last 1000 years ... indicate this warming was unusual and is unlikely to be entirely natural in origin” Reconstructions of past temperatures from several different investigators. Graphic supplied by Tim Osborn, UEA

  4. Global temperatures

  5. Outline • Climate models and why they are uncertain • The case for monitoring climate • Ozone depletion as an example of how nature surprised us • Issues with past climate records • How to monitor • An example model/data comparison

  6. Models are not the real world. • Despite the increasing complexity of Earth System models, they are not the real world. • Choices are made about how and what to model. • These choices lead to different outcomes. • We care about the “emergent properties” of the models, not their detailed evolution.

  7. Modelling the Climate System Main Message: Lots of things going on! Karl and Trenberth 2003

  8. Meteorology is (roughly) fluid dynamics on a rotating sphere: the Navier-Stokes equations on a rotating sphere, plus continuity, thermodynamics, moisture, radiation… with some simplifications to remove sound and other fast waves.
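As a compact statement of the equations the slide refers to (written in vector form, before the sphere-specific approximations are applied):

```latex
% Momentum (Navier-Stokes in a rotating frame) and continuity:
\begin{align}
  \frac{D\mathbf{u}}{Dt} + 2\boldsymbol{\Omega}\times\mathbf{u}
    &= -\frac{1}{\rho}\nabla p + \mathbf{g} + \nu\nabla^{2}\mathbf{u}, \\
  \frac{\partial \rho}{\partial t} + \nabla\cdot(\rho\,\mathbf{u}) &= 0.
\end{align}
```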

  9. Representing the fields: gridpoint models. Represent space as a grid that is regular in longitude/latitude coordinates.

  10. Sub-grid. • Recall the equations of motion. • Split each field into a large-scale average and a residual (Reynolds averaging). • Averaging yields large-scale terms that result from sub-grid-scale motions…
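The Reynolds averaging the slide describes can be written out for a single advection term; the product of residuals that survives the average is the sub-grid term a parameterisation must supply:

```latex
\begin{align}
  u &= \bar{u} + u', \qquad \overline{u'} = 0, \\
  \overline{u\,\frac{\partial u}{\partial x}}
    &= \bar{u}\,\frac{\partial \bar{u}}{\partial x}
     + \overline{u'\,\frac{\partial u'}{\partial x}}.
\end{align}
```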

  11. Parameterisation • Like the closure problem in fluid dynamics. • Key processes: • Convection (which involves latent-heat release from water vapour condensing) • Clouds in general • Boundary layers • Radiation: the calculations need to be simplified into a relatively small number of broad bands, assuming radiation only goes up and down; they can be verified by comparison with line-by-line calculations. • Friction… • Many specialists work in each area. An atmospheric (weather) model is a complex piece of software: the numerical methods for the dynamics are complex, as are the parameterisations. • Parameterisations also contain many empirically defined constants which need to be “tuned”. Model tuning is quite time-consuming and aims to get a reasonable simulation of the current climate.

  12. Parameterized Processes. Models do not have enough resolution to resolve these processes, so they are represented in terms of the large-scale flow (what gets simulated). Many of these processes act at scales of 1-10 km. Slingo; from Kevin E. Trenberth, NCAR

  13. “Mass-flux” parameterization. Figure components: uplift in the cloud, entrainment into the cloud, detrainment, environmental subsidence, and rain (& snow).
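A minimal sketch of the budget behind such a scheme (standard mass-flux notation, not taken from the slide): the updraught mass flux $M$ changes with height through entrainment $E$ into the cloud and detrainment $D$ out of it:

```latex
\frac{\partial M}{\partial z} = E - D,
\qquad M = \rho\, a\, w_c ,
```

where $a$ is the fractional area of the updraught and $w_c$ its vertical velocity.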

  14. What are we trying to parameterize? The figure contrasts “what is there” with “how we parameterise” it.

  15. Future modelling • Since the 1960s, super-computer performance has doubled every 18 months (or so). • This implies we can double the resolution of models every 10 years. • It would still take many decades to get to 1-10 km global modelling. • Bottom line: we will need to parameterize processes for many decades to come.
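The slide's arithmetic can be sketched in a few lines; the starting and target grid spacings below are illustrative assumptions, not figures from the lecture:

```python
import math

# If model resolution doubles roughly every 10 years (the slide's rule
# of thumb), how long until a ~100 km global model reaches ~5 km?
years_per_doubling = 10      # from the slide
current_km = 100             # assumed present-day grid spacing
target_km = 5                # convection-permitting target

doublings = math.log2(current_km / target_km)
years = doublings * years_per_doubling
print(f"{doublings:.1f} doublings -> about {years:.0f} years")
```

Just over four doublings, i.e. roughly four decades: "many decades", as the slide says.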

  16. Chaos • Numerical models of the atmosphere (and ocean) show sensitivity to initial conditions. • For the atmosphere the practical limit of deterministic forecasts is about 10 days. • Small uncertainties amplify and affect the evolution of the large-scale state. • For climate purposes this means that future forecasts are probabilistic and the detailed evolution of the system is unknowable.
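Sensitivity to initial conditions is easy to demonstrate with the Lorenz (1963) system, the classic toy model of atmospheric chaos (an illustration, not one of the lecture's models):

```python
# Two trajectories of the Lorenz-63 system, started a millionth apart,
# integrated with a simple Euler step: the tiny difference amplifies.
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-6, 1.0, 1.0)   # perturbed initial condition
for _ in range(2000):        # ~20 model time units
    a, b = lorenz_step(a), lorenz_step(b)

separation = abs(a[0] - b[0])  # grows by many orders of magnitude
```

Both trajectories stay on the same bounded attractor, yet their detailed paths diverge completely: exactly the situation that makes climate forecasts probabilistic.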

  17. Predicting the Future. Results are based on a multi-model archive; typically the average across all model simulations is shown, with uncertainties from the range. Scenarios used to drive the models are self-consistent atmospheric concentrations of CO2 and other greenhouse gases, based on different human development paths.

  18. Projections of Future Changes in Climate Best estimate for low scenario (B1) is 1.8°C (likely range is 1.1°C to 2.9°C), and for high scenario (A1FI) is 4.0°C (likely range is 2.4°C to 6.4°C).

  19. Projections of Future Changes in Climate Projected warming in 21st century expected to be greatest over land and at most high northern latitudes and least over the Southern Ocean and parts of the North Atlantic Ocean

  20. Projections of Future Changes in Climate Precipitation increases very likely in high latitudes Decreases likely in most subtropical land regions

  21. Models are not the real world. • Despite the increasing complexity of Earth System models, they are not the real world. • Choices are made about how and what to model. • These choices lead to different outcomes. • We care about the “emergent properties” of the models, not their detailed evolution (as we have learnt that models are chaotic and thus their detailed evolution is unpredictable).

  22. Ozone depletion as an example of a failure of environmental modelling. • In the early 1980s, theory (and models) suggested that CFCs would cause only moderate stratospheric O3 depletion. • “… United States National Research Council report projected that continued use of CFCs at then-current rates would … depletion of the total global ozone layer by only about three percent in about a century. ...”

  23. Ozone Depletion – Observations (Halley Bay; Farman et al., Nature 1985). Ozone depletion over Antarctica was much larger than expected. Reason: models only used gas-phase chemistry, but ozone depletion was also occurring on polar stratospheric clouds. Figure legend: 1957-73, 1980-84, Oct ’84.

  24. The Discovery of the Ozone Hole • 1985: British Antarctic Survey balloon measurements show much less ozone than normal at 10-20 km altitude in spring. • 1999: ozone at 15-20 km, where it normally peaks, was almost completely depleted.

  25. 2007 Sea-ice (it’s ½ of what it should be). Is this unexpected? Are we missing something fundamental in our understanding of the Earth system? Is this the “ozone” moment?

  26. Observations are direct evidence of change. • The public believes that observations of climate change are very direct evidence of change, and seems to use them as a view of what is to come. • This drives the need for monitoring, as the public wants answers soon after “interesting” events. • Observations are apparently more trustworthy than models (though in some cases models are more reliable than observations!). • Communicating uncertainties (of which the general public is unaware) is a challenge. • I.e. we need to escape from the sterile debate about which year is the warmest.

  27. What is the problem with observations? • The observing system is not stable. • Climate changes slowly compared with changes in the observing system. • Examples:

  28. Bias corrections • As observing practice or location changes, this introduces biases. • For example, a move of a temperature sensor can cause a change in the average temperature recorded, due to the sensor being in a different micro-climate. • The key issue is systematic biases – lots of small random changes will just introduce a small amount of uncertainty when averaged over a large number of observations. • Biases are estimated in a variety of ways and are thus themselves uncertain: relatively small uncertainty for SST; very large for changes in tropospheric temperature.
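One standard way to estimate such a bias (and the reason the GCOS principles below ask for an overlap period between old and new systems) is to difference the two instruments while both are running; the numbers here are invented for illustration:

```python
# Old and new sensors observed in parallel over an overlap period;
# their mean difference estimates the systematic bias of the new one.
old_obs = [14.8, 15.1, 15.0, 14.9, 15.2]   # old sensor (made-up data)
new_obs = [15.3, 15.6, 15.5, 15.4, 15.7]   # new sensor, same dates

bias = sum(n - o for n, o in zip(new_obs, old_obs)) / len(old_obs)

# Remove the bias from later new-sensor data to keep the record
# homogeneous with the old one:
later = [15.8, 16.0]
adjusted = [x - bias for x in later]
```

The bias estimate itself carries sampling uncertainty, which shrinks with a longer overlap: the reason short or missing overlaps make homogenisation hard.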

  29. Examples: • The change from buckets to engine intakes as a way of measuring Sea Surface Temperature; this affected many sensors. • An increasingly large number of buoy SST measurements. • Orbit drift in polar orbiters.

  30. Uncertainties – incomplete coverage. SST example

  31. Uncertainties in observations • Sampling • Depends on the variable (annual-mean temperature anomalies vs daily rainfall). • Depends on where the observations are and on their correlation scales. • Temperature, with its long correlation scales, is less uncertain than extreme daily rainfall, with its very short correlation scales.
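A toy Monte Carlo experiment makes the correlation-scale point concrete: a single station estimates a regional mean well when the field is strongly correlated across the region, and poorly when it is not (all numbers below are illustrative assumptions, not real data):

```python
import random
random.seed(42)

def sampling_error(r, m=50, trials=2000):
    """RMS error of a one-station estimate of the regional mean, for a
    region of m points sharing a common signal (weight r) plus
    independent local noise, so each point has unit variance."""
    total = 0.0
    for _ in range(trials):
        common = random.gauss(0, 1)
        field = [r * common + (1 - r ** 2) ** 0.5 * random.gauss(0, 1)
                 for _ in range(m)]
        regional_mean = sum(field) / m
        station = field[0]              # a single observing site
        total += (station - regional_mean) ** 2
    return (total / trials) ** 0.5

err_smooth = sampling_error(r=0.9)   # temperature-like, long scales
err_patchy = sampling_error(r=0.1)   # convective-rain-like, short scales
```

The smooth field's one-station error comes out several times smaller, which is why a sparse network constrains large-scale temperature far better than local rainfall extremes.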

  32. Communicating Uncertainties John Kennedy, Met Office Hadley Centre

  33. Extreme events have consequences. Tewkesbury 2007. Photograph: Daniel Berehulak/Getty Images. Met Office provisional figures show that May to July 2007 is the wettest in the England and Wales Precipitation record, which began in 1766. “We must learn from the events of recent days. These rains were unprecedented, but it would be wrong to suppose that such an event could never happen again….” (Hazel Blears, House of Commons, July 2007)

  34. Models can generally reproduce what has happened… SPM-4 likely shows a significant anthropogenic contribution over the past 50 years. Figure legend: observations; all forcings; natural forcings only.

  35. So Why Observe? • The rate of climate change is unprecedented, so past climate conditions will no longer be a guide to future climate conditions; we need a combination of models and observations to provide data for decadal infrastructure planning. • How do we determine which modelling choices are right (or best)? • That depends on the purpose of the model. • Test models’ ability to simulate observed change, as that is directly relevant to what is to come. • Provide evidence of change to support policy action. • Allow Detection & Attribution of climate change (to support policy action…) [“It’s the sun wot did it” and other silliness]

  36. How (GCOS monitoring principles)? Effective monitoring systems for climate should adhere to the following principles: 1. The impact of new systems or changes to existing systems should be assessed prior to implementation. So we know what the change did. 2. A suitable period of overlap for new and old observing systems is required. Ditto. 3. The details and history of local conditions, instruments, operating procedures, data processing algorithms and other factors pertinent to interpreting data (i.e., metadata) should be documented and treated with the same care as the data themselves. So we can figure out when changes happened rather than hunting for break points. 4. The quality and homogeneity of data should be regularly assessed as a part of routine operations. So the data are homogeneous. 5. Consideration of the needs for environmental and climate-monitoring products and assessments, such as IPCC assessments, should be integrated into national, regional and global observing priorities. The observing system is not just to estimate the mean climate or for weather forecasting, but to look for relatively small changes early.

  37. How (GCOS monitoring principles) – cont. 6. Operation of historically-uninterrupted stations and observing systems should be maintained. Long homogeneous records are valuable. 7. High priority for additional observations should be focused on data-poor regions, poorly-observed parameters, regions sensitive to change, and key measurements with inadequate temporal resolution. Observe where we don’t have data and where new data would help most. 8. Long-term requirements should be specified to network designers, operators and instrument engineers at the outset of system design and implementation. Don’t spec; don’t get! 9. The conversion of research observing systems to long-term operations in a carefully-planned manner should be promoted. Research data has a short lifetime relative to climate change. 10. Data management systems that facilitate access, use and interpretation of data and products should be included as essential elements of climate monitoring systems. No point collecting the data unless people can use it.

  38. Digitisation as a source of new data • Available observed weather data are limited before 1950 and almost non-existent before 1850. • Many more observations exist, in logbooks, reports and other paper records (mostly in the UK). If we digitised them we could improve the climate record and extend it back to 1800. • Hadley Centre digitised observations from Royal Navy Ships logbooks for WW2. These give a much-improved picture of 1940s climate.

  39. Example of model-data comparison • Changes in European precipitation and temperature. • Source data: • CRUTEMP land-surface temperatures • GPCC precipitation data • Multi-model archive. Regions: N-Euro, W-Euro, Med.

  40. Expected change in the annual cycle from the multi-model ensemble. Dashed lines are precipitation; solid lines are temperature. Split into warm (May-Oct) and cold (Nov-Apr) seasons.

  41. Expected changes with time

  42. Do Models agree with Observations? 20th century – warm season

  43. 20th century – cold season

  44. Late 20th century – warm season

  45. Late 20th century – cold season

  46. Do Models agree with Observations? The changes are not natural. There are problems with capturing the change in Nov-Apr precipitation.

  47. Summary • I hope I convinced you that climate models are uncertain. • The ozone hole shows the possibility of surprise. • Are Arctic sea-ice changes a surprise too? • We need observations so we know what the climate system is doing. • An example comparison between observations and models showed some issues with simulations of winter precipitation change.
