Mathematical Science of Understanding and Predicting Regional Climate: A School and Workshop
(28 Feb - 11 Mar 2011)

Jointly organized with the National Center for Atmospheric Research (USA), the Singapore-Delft Water Alliance,
and the Tropical Marine Science Institute, NUS

~ Abstracts ~


Assessment and selection of members from global model ensembles for downscaling experiments with PRECIS in south-east Asia
Richard Jones, Met Office Hadley Centre, UK

PRECIS was developed at the Met Office Hadley Centre to make high-resolution climate change information available for any region of the world. It provides, free of charge, a regional climate modelling system with a simple user interface, together with appropriate training, to climate centres in developing countries. A recent development in the PRECIS modelling system allows users to downscale 17 members of the Hadley Centre's QUMP (Quantifying Uncertainty in Model Predictions) perturbed physics ensemble (HadCM3Q0-Q16), in addition to several other GCMs. Limitations in the computing facilities available to many climate centres mean that it is often not feasible to run a large ensemble, so we examine criteria that might be used for selecting sub-sets of three or more models.

We describe here the approach taken in recommending a sub-set of 5-7 members from the 17 QUMP models for downscaling over South East Asia at Vietnam's Institute for Meteorology, Hydrology and Environment (IMHEN). In this case study, we use analyses of the QUMP GCM simulations to:

1) Eliminate any ensemble members that perform poorly in their simulations of the major features of the South East Asian climate (specifically, the Asian summer monsoon);
2) Select, from those remaining, a sub-set that captures a broad range of responses in temperature, monsoon characteristics and precipitation;
3) Assess whether the complete QUMP ensemble, or the selected sub-set, represents the range of responses from the multi-model CMIP3 ensemble.

We find that, in this case, the QUMP models perform similarly well in replicating the key characteristics of the onset, position and strength of the Asian summer monsoon, and the associated wet-season rainfalls. We therefore do not find strong grounds for elimination of any models, and base our selection solely on sampling the range of responses, in terms of magnitude and patterns of change, recommending a sub-set of 5 members which represent the range of projections across the full ensemble.
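
A selection of this kind, sampling the range of responses rather than model skill, can be sketched as a simple spread-maximising sampler over each member's projected responses. The sketch below is purely illustrative (the 17 response values are synthetic, not the actual QUMP results, and the two-variable response space is an assumption):

```python
import numpy as np

# Hypothetical (temperature change in K, precipitation change in %) responses
# for 17 ensemble members; values are illustrative, not the QUMP analysis.
rng = np.random.default_rng(0)
responses = rng.normal([2.5, 5.0], [0.6, 8.0], size=(17, 2))

def select_subset(responses, k):
    """Greedy farthest-point sampling in standardised response space,
    so the chosen members span the ensemble's range of projections."""
    z = (responses - responses.mean(0)) / responses.std(0)
    # start from the member farthest from the ensemble centroid
    chosen = [int(np.argmax(np.linalg.norm(z, axis=1)))]
    while len(chosen) < k:
        # distance of every member to its nearest already-chosen member
        d = np.min(np.linalg.norm(z[:, None] - z[chosen], axis=2), axis=1)
        chosen.append(int(np.argmax(d)))
    return chosen

subset = select_subset(responses, k=5)
```

Farthest-point sampling is only one way to span a response range; the paper's actual recommendation also weighs patterns of change, which a two-number summary cannot capture.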

Finally, we discuss the implications for the implementation of similar methodologies for different regions, and interpretation of the systematically-sampled small-ensemble simulations of regional climate. Due to the differences in spread of regional precipitation changes between the two ensembles, we recommend that those using the PPE ensemble to explore the range of plausible outcomes at the regional level should not attach confidence to projections solely on the basis of consensus in either the QUMP or CMIP3 ensembles. QUMP can be used with PRECIS to generate a range of plausible outcomes which should be interpreted in light of additional information from projections from CMIP3, or in future, CMIP5.



Managing uncertainty in complex models: An introductory course
Tony O'Hagan, University of Sheffield, UK

Uncertainty Quantification for the outputs of complex simulation models is becoming increasingly important. When we have uncertainty concerning the values of simulator inputs, uncertainty about the structural accuracy of the simulation model itself, plus uncertainty about the numerical accuracy of the computer code, how does this feed into uncertainty about the outputs? How is uncertainty reduced if we can calibrate the simulator against observations of the real system?

These are technically difficult questions, and are made more difficult in practice by the fact that leading-edge simulators are highly computer-intensive. A large research project in the UK has been addressing these issues using novel statistical methods. A major component of that approach is the use of emulators to build statistically valid surrogates for computationally demanding simulators. This short course will be an introduction to those methods. The material will be generic but illustrated using two climate case studies.
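
The emulators referred to here are typically Gaussian-process surrogates fitted to a small design of simulator runs. A minimal sketch of the idea, assuming a one-dimensional input, a toy stand-in for the simulator, and a fixed RBF kernel (none of which come from the MUCM implementation itself):

```python
import numpy as np

def simulator(x):
    # placeholder for an expensive model run (illustrative only)
    return np.sin(3 * x) + x

def rbf(a, b, length=0.3):
    # squared-exponential covariance between input sets a and b
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / length**2)

X = np.linspace(0, 1, 8)                  # small design of training runs
y = simulator(X)
K = rbf(X, X) + 1e-8 * np.eye(len(X))     # jitter for numerical stability
alpha = np.linalg.solve(K, y)

def emulate(x_new):
    """Gaussian-process posterior mean and variance at new inputs."""
    Ks = rbf(x_new, X)
    mean = Ks @ alpha
    var = 1.0 - np.einsum('ij,ij->i', Ks, np.linalg.solve(K, Ks.T).T)
    return mean, np.maximum(var, 0.0)

mean, var = emulate(np.array([0.5]))
```

The posterior variance is what distinguishes an emulator from a plain interpolant: it quantifies the extra uncertainty introduced by using the surrogate instead of the simulator, which is central to the validation step the course covers.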

Learning objectives

After attending the course, delegates should have the following knowledge and skills.

* Awareness of the objectives of the MUCM research project.

* How to find advice, guidance and procedures in the MUCM toolkit.

* Understanding what an emulator is and the key steps in building one.

* Appreciation of the importance of validating an emulator and the basics of validation.

* Understanding what is meant by uncertainty analysis, sensitivity analysis and calibration, and how those tasks are addressed using emulators.



Simulators and emulators
Tony O'Hagan, University of Sheffield, UK

Simulation models are used in many fields to understand, predict and control complex processes. Implemented in computer codes, these models are often highly computationally expensive to run, taking hours, days or even longer for a single run. Leading climate simulators certainly fit this description. With such simulators we cannot make large numbers of runs. And yet there is increasing demand to be open about the uncertainties in the predictions of such models, and this means making many runs to explore the implications of uncertainty concerning the required inputs.

An important statistical tool is emulation, which consists of using a moderate number of simulator runs to build a statistical representation of the simulator - the emulator. Having built an emulator, we can use it to carry out tasks such as uncertainty analysis, sensitivity analysis or calibration. An emulator is more than just a quick surrogate for the simulator, and allows us to account for additional uncertainty involved in using the emulator instead of the simulator.
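
Once a cheap surrogate is available, tasks such as uncertainty analysis reduce to pushing a large Monte Carlo sample of uncertain inputs through it. A hedged sketch, with an invented toy simulator and an assumed input distribution (a simple least-squares quadratic stands in for the emulator here):

```python
import numpy as np

rng = np.random.default_rng(1)

def simulator(x):
    # placeholder for an expensive model run (illustrative only)
    return x**2 + 0.5 * x

X = np.linspace(-2, 2, 9)            # only nine affordable training runs
y = simulator(X)
coeffs = np.polyfit(X, y, deg=2)     # cheap surrogate fitted to the runs

# uncertainty analysis: propagate the assumed input uncertainty N(0, 0.5^2)
inputs = rng.normal(0.0, 0.5, size=100_000)
outputs = np.polyval(coeffs, inputs)  # essentially free per evaluation

mean, sd = outputs.mean(), outputs.std()
```

With 100,000 surrogate evaluations costing less than one real run, the output mean and spread are obtained at a precision that would be unreachable by running the simulator directly.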

This talk will present an overview of emulation, what is involved in building and validating an emulator and how an emulator is used for tasks such as sensitivity analysis. I will also address some more leading edge topics such as emulating dynamic simulators and simulators with random outputs.



Singapore vulnerability study: overview of ocean regional signals of climate change and variability
Pavel Tkalich, National University of Singapore

The talk reviews the implications of global climate change and variability for some of the most important regional ocean parameters, including sea surface temperature, mean sea level, storm surges and wind waves. Dynamical and statistical downscaling techniques are applied for quantitative estimation of past and projected sea level trends and extremes.
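
The simplest statistical estimate of a past sea-level trend is an ordinary least-squares fit to the record. The sketch below uses a synthetic series with an assumed 3 mm/yr trend (the data and numbers are illustrative, not the Singapore record):

```python
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1970, 2011)
# synthetic monthly-mean-style record: assumed 3 mm/yr trend plus noise
sea_level = 3.0 * (years - years[0]) + rng.normal(0, 15, len(years))  # mm

slope, intercept = np.polyfit(years, sea_level, deg=1)
# slope is the estimated trend in mm per year
```

Real analyses would also account for serial correlation and interannual signals (e.g. ENSO) before quoting a trend, which a plain least-squares fit ignores.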



Coastal analyses and scenarios at the Helmholtz-Zentrum Geesthacht - The coastDat data set and approach
Ralf Weisse, Helmholtz-Zentrum Geesthacht, Germany

The coastDat data set is a compilation of coastal analyses and scenarios for the future from various sources. It contains no direct measurements but results from numerical models that have been driven either by observed data, in order to achieve the best possible representation of observed past conditions, or by climate change scenarios for the near future. In contrast to direct measurements, which are often sparse and incomplete, coastDat offers a unique combination of consistent atmospheric, oceanic, sea state and other parameters at high spatial and temporal detail, even for places and variables for which no measurements have been made. In addition, coastal scenarios for the near future are available which complement the numerical analyses of past conditions.

The backbones of coastDat are regional wind, wave and storm surge hindcasts and scenarios, mainly for the North Sea and the Baltic Sea. I will discuss the methodology used to derive these data, and their quality and limitations in comparison with observations. Long-term changes in the wind, wave and storm surge climate will be discussed and potential future changes will be assessed. I will conclude with a number of coastal and offshore applications of coastDat, demonstrating some of the potential of the data set in hazard assessment. Examples will include applications of coastDat in ship design, oil risk modelling and assessment, and the construction and operation of offshore wind farms.
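
A typical hazard-assessment use of a multi-decadal surge hindcast is estimating return levels from annual maxima. A minimal sketch using a method-of-moments Gumbel fit (the data are synthetic and the numbers illustrative; coastDat itself is not reproduced here, and operational studies would use more careful extreme-value methods):

```python
import numpy as np

rng = np.random.default_rng(3)
# 50 years of synthetic annual surge maxima, in metres (illustrative only)
annual_max = rng.gumbel(loc=2.0, scale=0.3, size=50)

# method-of-moments estimates of the Gumbel parameters
beta = annual_max.std() * np.sqrt(6) / np.pi     # scale
mu = annual_max.mean() - 0.5772 * beta           # location

def return_level(T):
    """Surge level expected to be exceeded on average once every T years."""
    return mu - beta * np.log(-np.log(1 - 1 / T))

level_100 = return_level(100)   # estimated 100-year surge level, metres
```

Estimates of this kind feed directly into design heights for coastal defences and offshore structures, which is one reason consistent long hindcasts such as coastDat are valuable where tide-gauge records are short.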



Marine climate and climate change
Ralf Weisse, Helmholtz-Zentrum Geesthacht, Germany

An overview of marine climate phenomena such as storms, wind waves and storm surges, their long-term changes and their assessment, will be provided. I will start with a brief and general introduction to climate variability. The focus will be on internally driven variability and the interplay between regional and larger scales. Subsequently, an overview of how to determine long-term changes in the marine environment is provided. Problems related to data homogeneity and data quality are discussed, and approaches such as proxies or regional reanalyses are addressed. I will conclude with a brief overview of past and potential future changes of the wind, wave and storm surge climate.
