Earth System Dynamics – an interactive open-access journal of the European Geosciences Union

Journal metrics

  • IF: 3.769
  • IF 5-year: 4.522
  • CiteScore: 4.14
  • SNIP: 1.170
  • SJR: 2.253
  • IPP: 3.86
  • h5-index: 26
  • Scimago H index: 22
Volume 7, issue 4
Earth Syst. Dynam., 7, 813-830, 2016
https://doi.org/10.5194/esd-7-813-2016
© Author(s) 2016. This work is distributed under
the Creative Commons Attribution 3.0 License.

ESD Reviews | 01 Nov 2016

Towards improved and more routine Earth system model evaluation in CMIP

Veronika Eyring1, Peter J. Gleckler2, Christoph Heinze3,4, Ronald J. Stouffer5, Karl E. Taylor2, V. Balaji5,6, Eric Guilyardi7,8, Sylvie Joussaume9, Stephan Kindermann10, Bryan N. Lawrence8,11, Gerald A. Meehl12, Mattia Righi1, and Dean N. Williams2
  • 1Deutsches Zentrum für Luft- und Raumfahrt (DLR), Institut für Physik der Atmosphäre, Oberpfaffenhofen, Germany
  • 2Program for Climate Model Diagnosis and Intercomparison (PCMDI), Lawrence Livermore National Laboratory, Livermore, CA, USA
  • 3University of Bergen, Geophysical Institute and Bjerknes Centre for Climate Research, Bergen, Norway
  • 4Uni Research Climate, Bergen, Norway
  • 5Geophysical Fluid Dynamics Laboratory/NOAA, Princeton, NJ, USA
  • 6Cooperative Institute for Climate Science, Princeton University, NJ, USA
  • 7Institut Pierre Simon Laplace, Laboratoire d'Océanographie et du Climat, UPMC/CNRS, Paris, France
  • 8National Centre for Atmospheric Science, University of Reading, Reading, UK
  • 9Institut Pierre Simon Laplace, Laboratoire des Sciences du Climat et de l'Environnement, CNRS/CEA/UVSQ, Saclay, France
  • 10Deutsches Klimarechenzentrum, Hamburg, Germany
  • 11Centre for Environmental Data Analysis, STFC Rutherford Appleton Laboratory, Didcot, UK
  • 12National Center for Atmospheric Research (NCAR), Boulder, CO, USA

Abstract. The Coupled Model Intercomparison Project (CMIP) has successfully provided the climate community with a rich collection of simulation output from Earth system models (ESMs) that can be used to understand past climate changes and make projections and uncertainty estimates of the future. Confidence in ESMs can be gained because the models are based on physical principles and reproduce many important aspects of observed climate. More research is required to identify the processes that are most responsible for systematic biases and the magnitude and uncertainty of future projections so that more relevant performance tests can be developed. At the same time, there are many aspects of ESM evaluation that are well established and considered an essential part of systematic evaluation but have been implemented ad hoc with little community coordination. Given the diversity and complexity of ESM analysis, we argue that the CMIP community has reached a critical juncture at which many baseline aspects of model evaluation need to be performed much more efficiently and consistently. Here, we provide a perspective and viewpoint on how a more systematic, open, and rapid performance assessment of the large and diverse number of models that will participate in current and future phases of CMIP can be achieved, and announce our intention to implement such a system for CMIP6. Accomplishing this could also free up valuable resources, as many scientists are frequently "re-inventing the wheel" by re-writing analysis routines for well-established analysis methods. A more systematic approach for the community would be to develop and apply evaluation tools that are based on the latest scientific knowledge and observational references, are well suited for routine use, and provide a wide range of diagnostics and performance metrics that comprehensively characterize model behaviour as soon as the output is published to the Earth System Grid Federation (ESGF).
The CMIP infrastructure enforces data standards and conventions for model output and documentation accessible via the ESGF, additionally publishing observations (obs4MIPs) and reanalyses (ana4MIPs) for model intercomparison projects using the same data structure and organization as the ESM output. This greatly facilitates routine evaluation of the ESMs, but to process the data automatically within the ESGF, the infrastructure needs to be extended with processing capabilities at the ESGF data nodes, where the evaluation tools can be executed on a routine basis. Efforts are already underway to develop community-based evaluation tools, and we encourage experts to provide additional diagnostic codes that would enhance this capability for CMIP. At the same time, we encourage the community to contribute observations and reanalyses for model evaluation to the obs4MIPs and ana4MIPs archives. The intention is to produce through the ESGF a widely accepted quasi-operational evaluation framework for CMIP6 that would routinely execute a series of standardized evaluation tasks. Over time, as this capability matures, we expect to produce an increasingly systematic characterization of models which, compared with early phases of CMIP, will more quickly and openly identify the strengths and weaknesses of the simulations. This will also reveal whether long-standing model errors remain evident in newer models and will assist modelling groups in improving their models. This framework will be designed to readily incorporate updates, including new observations and additional diagnostics and metrics as they become available from the research community.
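The standardized evaluation tasks described above typically reduce to computing diagnostics and performance metrics over gridded model output against an observational reference. As a purely illustrative sketch of one such metric (this is not the tooling referenced in the paper; the function name and the synthetic temperature fields are invented for the example), a latitude-weighted root-mean-square error on a regular lat–lon grid could look like:

```python
import numpy as np

def area_weighted_rmse(model, ref, lats):
    """Latitude-weighted RMSE between a model field and a reference
    field on a regular lat-lon grid (arrays of shape [lat, lon])."""
    # cos(lat) approximates the relative area of each grid cell
    w = np.cos(np.deg2rad(lats))[:, np.newaxis]
    w = np.broadcast_to(w, model.shape)
    diff2 = (model - ref) ** 2
    return float(np.sqrt(np.sum(w * diff2) / np.sum(w)))

# Tiny synthetic example: on a 3x4 grid, a model that is uniformly
# 1 K warmer than the reference has a weighted RMSE of exactly 1.0,
# independent of the weights.
lats = np.array([-45.0, 0.0, 45.0])
ref = np.full((3, 4), 288.0)   # synthetic reference temperature field (K)
model = ref + 1.0              # synthetic model field with a uniform +1 K bias
print(area_weighted_rmse(model, ref, lats))  # 1.0
```

In a real evaluation workflow, `model` would come from ESGF-published CMIP output and `ref` from an obs4MIPs or ana4MIPs dataset on the same grid; such scalar metrics are what a routine framework would compute and publish for each model as soon as its output appears on the ESGF.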

Short summary
We argue that the CMIP community has reached a critical juncture at which many baseline aspects of model evaluation need to be performed much more efficiently to enable a systematic and rapid performance assessment of the large number of models participating in CMIP, and we announce our intention to implement such a system for CMIP6. At the same time, continuous scientific research is required to develop innovative metrics and diagnostics that help narrow the spread in climate projections.