Monday, July 2, 2018

Uncertain about Climate Models? Why? Can you Explain???


I'd like to start with this short reality check, "How temperature has changed in each country since 1900." Notice the nation names listed on the outside ring.



Published by Carbon Brief on Aug 2, 2017
https://www.youtube.com/watch?v=-yIHxOui9nQ


I've started a short engagement with one John Smith, a YouTube Climate Science "Skeptic" of the glad-hand persuasion.  In preparing an examination of his game, I got to thinking that a review of Climate Model success stories would make a good preamble.  Not to mention it might encourage him to unload some anti-climate-model nuggets to look at in detail.

If John or anyone else out there disputes the veracity of the following information, please do share your thoughts - focused on real-world details, specifics, concise explanations and supporting evidence.  I'm always up for a Constructive Debate.


Analysis: How well have climate models projected global warming?
https://www.carbonbrief.org/analysis-how-well-have-climate-models-projected-global-warming

ZEKE HAUSFATHER, May 10, 2017

{  The following screenshots are of Carbon Brief’s fascinating ‘interactive’ graphs, which were created with "Highcharts 6.1.1" - I recommend linking to their webpage if you want a closer look. }

Scientists have been making projections of future global warming using climate models of increasing complexity for the past four decades.

These models, driven by atmospheric physics and biogeochemistry, play an important role in our understanding of the Earth’s climate and how it will likely change in the future.

Carbon Brief has collected prominent climate model projections since 1973 to see how well they project both past and future global temperatures, as shown in the animation below (sorry, didn't reproduce that one). While some models projected less warming than we’ve experienced and some projected more, all showed surface temperature increases between 1970 and 2016 that were not too far off from what actually occurred, particularly when differences in assumed future emissions are taken into account.

How have past climate models fared?

While climate model projections of the past benefit from knowledge of atmospheric greenhouse gas concentrations, volcanic eruptions and other radiative forcings affecting the Earth’s climate, casting forward into the future is understandably more uncertain. Climate models can be evaluated both on their ability to hindcast past temperatures and forecast future ones.

Hindcasts – testing models against past temperatures – are useful because they can control for radiative forcings. Forecasts are useful because models cannot be implicitly tuned to be similar to observations. Climate models are not fit to historical temperatures, but modellers do have some knowledge of observations that can inform their choice of model parameterisations, such as cloud physics and aerosol effects.

In the examples below, climate model projections published between 1973 and 2013 are compared with observed temperatures from five different organizations. The models used in the projections vary in complexity, from simple energy balance models to fully-coupled Earth System Models.

(Note, these model/observation comparisons use a baseline period of 1970-1990 to align observations and models during the early years of the analysis, which shows how temperatures have evolved over time more clearly.)
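For readers who like to see the mechanics, here is a minimal sketch of what that re-baselining step does. The series names and numbers are made up for illustration; this is not Carbon Brief's actual code:

```python
import numpy as np

def rebaseline(temps, years, base_start=1970, base_end=1990):
    """Return a series as anomalies relative to its own mean
    over the baseline period (1970-1990 by default)."""
    temps = np.asarray(temps, dtype=float)
    years = np.asarray(years)
    in_base = (years >= base_start) & (years <= base_end)
    return temps - temps[in_base].mean()

# Hypothetical toy series (degrees C), 1970-2016
years = np.arange(1970, 2017)
model_run = 14.0 + 0.020 * (years - 1970)  # made-up model temperatures
observed = 14.3 + 0.018 * (years - 1970)   # made-up observations

# After rebaselining, both series sit near zero over 1970-1990, so their
# warming rates can be compared despite the 0.3C offset in absolute temperature.
model_anom = rebaseline(model_run, years)
obs_anom = rebaseline(observed, years)
```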

Sawyer, 1973
One of the first projections of future warming came from John Sawyer at the UK’s Met Office in 1973. In a paper published in Nature, he hypothesised that the world would warm 0.6C between 1969 and 2000, and that atmospheric CO2 would increase by 25%. Sawyer argued for a climate sensitivity – how much long-term warming will occur per doubling of atmospheric CO2 levels – of 2.4C, which is not too far off the best estimate of 3C used by the Intergovernmental Panel on Climate Change (IPCC) today.
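To make the sensitivity arithmetic concrete: assuming the standard logarithmic CO2-temperature relationship (my assumption, not something spelled out in the excerpt), equilibrium warming is the sensitivity times the number of CO2 doublings. A quick sketch:

```python
import math

def equilibrium_warming(sensitivity_c, co2_ratio):
    """Equilibrium warming = sensitivity x number of CO2 doublings,
    where doublings = log2(C / C0)."""
    return sensitivity_c * math.log2(co2_ratio)

# Sawyer's figures: 2.4C per doubling, CO2 up 25% (a ratio of 1.25)
print(round(equilibrium_warming(2.4, 1.25), 2))  # ~0.77C at full equilibrium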

Unlike the other projections examined in this article, Sawyer did not provide an estimated warming for each year, just an expected 2000 value. His warming estimate of 0.6C was nearly spot on – the observed warming over that period was between 0.51C and 0.56C. He overestimated the year 2000’s atmospheric CO2 concentrations, however, assuming that they would be 375-400ppm – compared to the actual value of 370ppm.

Broecker, 1975
The first available projection of future temperatures due to global warming appeared in an article in Science in 1975 published by Columbia University scientist Prof Wally Broecker. Broecker used a simple energy balance model to estimate what would happen to the Earth’s temperature if atmospheric CO2 continued to increase rapidly after 1975. Broecker’s projected warming was reasonably close to observations for a few decades, but has recently been considerably higher.

This is mostly due to Broecker overestimating how CO2 emissions and atmospheric concentrations would increase after his article was published. He was fairly accurate up to 2000, predicting 373ppm of CO2 – compared to actual Mauna Loa observations of 370ppm. In 2016, however, he estimated that CO2 would be 424ppm, whereas only 404ppm has been observed.

Broecker also did not take other greenhouse gases into account in his model. However, as the warming impact from methane, nitrous oxide and halocarbons has been largely cancelled out by the overall cooling influence of aerosols since 1970, this does not make that large a difference (though estimates of aerosol forcings have large uncertainties).

As with Sawyer, Broecker used an equilibrium climate sensitivity of 2.4C per doubling of CO2. Broecker assumed that the Earth instantly warms up to match atmospheric CO2, while modern models account for the lag between how quickly the atmosphere and oceans warm up. (The slower heat uptake by the oceans is often referred to as the “thermal inertia” of the climate system.)
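Here is a generic, zero-dimensional energy balance sketch of that thermal-inertia point – a textbook toy, not a reconstruction of Broecker's model or any modern one. All parameter values are illustrative:

```python
import math

# C * dT/dt = F(t) - lam * T : a zero-dimensional energy balance model.
F_2X = 3.7                 # forcing per CO2 doubling, W/m2
SENSITIVITY = 2.4          # equilibrium warming per doubling, C
LAM = F_2X / SENSITIVITY   # feedback parameter, W/m2 per C
HEAT_CAP = 14.0            # effective heat capacity, W yr m-2 C-1
                           # (roughly a 100m ocean mixed layer, illustrative)

def run_ebm(forcings, dt=1.0):
    """Integrate the model with a simple Euler step (dt in years)."""
    temp, series = 0.0, []
    for f in forcings:
        temp += dt * (f - LAM * temp) / HEAT_CAP
        series.append(temp)
    return series

# Illustrative forcing path: CO2 rising 0.5% per year for a century
forcings = [F_2X * math.log2(1.005 ** t) for t in range(100)]
transient = run_ebm(forcings)
instant = [f / LAM for f in forcings]  # Broecker-style instant equilibrium
# transient[-1] < instant[-1]: the ocean's thermal inertia delays the warming
# by roughly HEAT_CAP / LAM years (about a decade with these toy numbers).
```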

1975: Wally Broecker

Broecker made his projection at a time when scientists widely thought that the observations showed a modest cooling of the Earth. He began his article by presciently stating that “a strong case can be made that the present cooling trend will, within a decade or so, give way to a pronounced warming induced by carbon dioxide”.

Hansen et al, 1981

NASA’s Dr James Hansen and colleagues published a paper in 1981 that also used a simple energy balance model to project future warming, but accounted for thermal inertia due to ocean heat uptake. They assumed a climate sensitivity of 2.8C per doubling CO2, but also looked at a range of 1.4-5.6C per doubling.

1981: Hansen et al
Hansen and colleagues presented a number of different scenarios, varying future emissions and climate sensitivity. In the chart above, you can see both the “fast-growth” scenario (thick black line), where CO2 emissions increase by 4% annually after 1981, and a slow-growth scenario where emissions increase by 2% annually (thin grey line). The fast-growth scenario somewhat overestimates current emissions, but when combined with a slightly lower climate sensitivity it provides an estimate of early-2000s warming close to observed values.
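As a side note on how quickly those scenarios diverge, compound growth does most of the work. A trivial check (the starting value is arbitrary):

```python
# Compound emissions growth from an arbitrary 1981 value of 1.0
years = 35                    # 1981 through 2016
fast = 1.04 ** years          # ~3.9x the 1981 level at 4%/yr
slow = 1.02 ** years          # ~2.0x the 1981 level at 2%/yr
print(round(fast, 2), round(slow, 2), round(fast / slow, 2))  # fast is ~2x slow by 2016
```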

The overall rate of warming between 1970 and 2016 projected by Hansen et al in 1981 in the fast-growth scenario has been about 20% lower than observations.

Hansen et al, 1988

The paper published by Hansen and colleagues in 1988 represented one of the first modern climate models. It divided the world into discrete grid cells of eight degrees latitude by 10 degrees longitude, with nine vertical layers of the atmosphere. It included aerosols, various greenhouse gases in addition to CO2, and basic cloud dynamics.

Hansen et al presented three different scenarios associated with different future greenhouse gas emissions. Scenario B is shown in the chart below as a thick black line, while scenarios A and C are shown by thin grey lines. Scenario A had exponential growth in emissions, with CO2 and other GHG concentrations considerably higher than today.

1988: Hansen et al
Scenario B assumed a gradual slowdown in CO2 emissions, but had concentrations of 401ppm in 2016 that were pretty close to the 404ppm observed. However, scenario B assumed the continued growth of emissions of various halocarbons that are powerful greenhouse gases, but were subsequently restricted under the Montreal Protocol of 1987. Scenario C had emissions going to near-zero after the year 2000.

Of the three, scenario B was closest to actual radiative forcing, though still about 10% too high. Hansen et al also used a model with a climate sensitivity of 4.2C per doubling CO2 – on the high end of most modern climate models. Due to the combination of these factors, scenario B projected a rate of warming between 1970 and 2016 that was approximately 30% higher than what has been observed.

IPCC First Assessment Report, 1990

The IPCC’s First Assessment Report (FAR) in 1990 featured relatively simple energy balance/upwelling diffusion ocean models to estimate changes in global air temperatures. Their featured business-as-usual (BAU) scenario assumed rapid growth of atmospheric CO2, reaching 418ppm CO2 in 2016, compared to 404ppm in observations. The FAR also assumed continued growth of atmospheric halocarbon concentrations much faster than has actually occurred.

The FAR gave a best estimate of climate sensitivity as 2.5C warming for doubled CO2, with a range of 1.5-4.5C. These estimates are applied to the BAU scenario in the figure below, with the thick black line representing the best estimate and the thin dashed black lines representing the high and low end of the climate sensitivity range.

1990: IPCC First Assessment Report
Despite a best estimate of climate sensitivity a tad lower than the 3C used today, the FAR overestimated the rate of warming between 1970 and 2016 by around 17% in their BAU scenario, showing 1C warming over that period vs 0.85C observed. This is mostly due to the projection of much higher atmospheric CO2 concentrations than has actually occurred.

IPCC Second Assessment Report, 1995

The IPCC’s Second Assessment Report (SAR) published readily-available projections only from 1990 onward. They used a climate sensitivity of 2.5C, with a range of 1.5-4.5C. Their mid-range emissions scenario, “IS92a”, projected CO2 levels of 405ppm in 2016, nearly identical to observed concentrations. SAR also included much better treatment of anthropogenic aerosols, which have a cooling effect on the climate.

1995: IPCC Second Assessment Report

As you can see in the chart above, SAR’s projections ended up being notably lower than observations, warming about 28% more slowly over the period from 1990 to 2016. This was likely due to a combination of two factors: a lower climate sensitivity than found in modern estimates (2.5C vs. 3C) and an overestimate of the radiative forcing of CO2 (4.37 watts per square meter versus 3.7 used in the subsequent IPCC report and still used today).
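Those two forcing numbers follow from the standard logarithmic fit for CO2, F = α·ln(C/C0), with α ≈ 6.3 in SAR-era work and α ≈ 5.35 from the TAR onwards. A quick check of the arithmetic:

```python
import math

def co2_forcing(c, c0, alpha):
    """Radiative forcing (W/m2) from the logarithmic CO2 fit F = alpha * ln(C/C0)."""
    return alpha * math.log(c / c0)

# Forcing for a doubling of CO2 (C/C0 = 2) under each coefficient:
print(round(co2_forcing(2, 1, alpha=6.30), 2))  # 4.37 W/m2, the SAR-era figure
print(round(co2_forcing(2, 1, alpha=5.35), 2))  # 3.71 W/m2, used since the TAR
```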

IPCC Third Assessment Report, 2001

The IPCC Third Assessment Report (TAR) relied on atmosphere-ocean general circulation models (GCMs) from seven different modeling groups. They also introduced a new set of socioeconomic emission scenarios, called SRES, which included four different future emission trajectories.

Here, Carbon Brief examines the A2 scenario, though all have fairly similar emissions and warming trajectories up to 2020. The A2 scenario projected a 2016 atmospheric CO2 concentration of 406ppm, nearly the same as what was observed. The SRES scenarios cover 2000 onward, with models prior to the year 2000 using estimated historical forcings. The dashed grey line in the figure below shows the point at which models transition from using observed emissions and concentrations to projected future ones.

2001: IPCC Third Assessment Report

TAR’s headline projection used a simple climate model that was configured to match the average outputs of seven more sophisticated GCMs, as no specific multimodel average was published in TAR and data for individual model runs are not readily available. It has a climate sensitivity of 2.8C per doubling CO2, with a range of 1.5-4.5C. As shown in the chart above, the rate of warming between 1970 and 2016 in the TAR was about 14% lower than what has actually been observed.

IPCC Fourth Assessment Report, 2007

The IPCC’s Fourth Assessment Report (AR4) featured models with significantly improved atmospheric dynamics and model resolution. It made greater use of Earth System Models – which incorporate the biogeochemistry of carbon cycles – as well as improved simulations of land surface and ice processes.

AR4 used the same SRES scenarios as the TAR, with historical emissions and atmospheric concentrations up to the year 2000 and projections thereafter. Models used in AR4 had a mean climate sensitivity of 3.26C, with a range of 2.1C to 4.4C.

2007: IPCC Fourth Assessment Report
The figure above shows model runs for the A1B scenario (which is the only scenario with model runs readily available, though its 2016 CO2 concentrations are nearly identical to those of the A2 scenario). AR4 projections between 1970 and 2016 show warming quite close to observations, only 8% higher.

IPCC Fifth Assessment Report, 2013

The most recent IPCC report – the Fifth Assessment (AR5) – featured additional refinements on climate models, as well as a modest reduction in future model uncertainty compared to AR4. The climate models in the latest IPCC report were part of the Coupled Model Intercomparison Project 5 (CMIP5), where dozens of different modeling groups all around the world ran climate models using the same set of inputs and scenarios.
2013: IPCC Fifth Assessment Report

AR5 introduced a new set of future greenhouse gas concentration scenarios, known as the Representative Concentration Pathways (RCPs). These have future projections from 2006 onwards, with historical data prior to 2006. The grey dashed line in the figure above shows where models transition from using observed forcings to projected future forcings.

Comparing these models with observations can be a somewhat tricky exercise. The most often used fields from climate models are global surface air temperatures. However, observed temperatures come from surface air temperatures over land and sea surface temperatures over the ocean.

To account for this, more recently, researchers have created blended model fields, which include sea surface temperatures over the oceans and surface air temperatures over land, in order to match what is actually measured in the observations. These blended fields, shown by the dashed line in the figure above, show slightly less warming than global surface air temperatures, as models have the air over the ocean warming faster than sea surface temperatures in recent years.
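Here is a bare-bones sketch of what that blending amounts to, with a hypothetical land fraction per grid cell (real analyses also weight by cell area and handle sea ice and missing data, all ignored here):

```python
import numpy as np

def blend(sat, sst, land_frac):
    """Blend surface air temperature (SAT) over land with sea surface
    temperature (SST) over ocean, weighting each grid cell by its
    land fraction, then average globally (area weights omitted)."""
    blended = land_frac * sat + (1.0 - land_frac) * sst
    return blended.mean()

# Hypothetical anomalies (C) for a tiny 2x2 grid
sat = np.array([[1.2, 1.0], [0.9, 1.1]])        # air temperature anomalies
sst = np.array([[0.8, 0.7], [0.75, 0.85]])      # sea surface anomalies
land_frac = np.array([[1.0, 0.4], [0.0, 0.3]])  # fraction of each cell that is land

print(blend(sat, sst, land_frac))  # lower than sat.mean(), as the article describes
```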

Global surface air temperatures in CMIP5 models have warmed about 16% faster than observations since 1970. About 40% of this difference is due to air temperatures over the ocean warming faster than sea surface temperatures in the models; blended model fields only show warming 9% faster than observations.

A recent paper in Nature by Iselin Medhaug and colleagues suggests that the remainder of the divergence can be accounted for by a combination of short-term natural variability (mainly in the Pacific Ocean), small volcanoes and lower-than-expected solar output that was not included in models in their post-2005 projections.

Below is a summary of all the models Carbon Brief has looked at. The table below shows the difference in the rate of warming between each model or set of models and NASA’s temperature observations. All the observational temperature records are fairly similar, but NASA’s is among the group that includes more complete global coverage in recent years and is thus more directly comparable to climate model data.

{ Summary table not reproduced here – see the Carbon Brief article for the full comparison. }

* SAR trend differences are calculated over the period from 1990-2016, as estimates prior to 1990 are not readily available.
# Differences in parentheses are based on blended model land/ocean fields.
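For anyone who wants to reproduce the kind of percentages in that table, the comparison boils down to a ratio of least-squares trends. A sketch with made-up series (not the actual model or observational data):

```python
import numpy as np

def trend_per_decade(years, temps):
    """Least-squares linear trend, in C per decade."""
    return np.polyfit(years, temps, 1)[0] * 10.0

def trend_difference_pct(years, model, obs):
    """Model trend relative to observed trend, in percent."""
    m = trend_per_decade(years, model)
    o = trend_per_decade(years, obs)
    return 100.0 * (m - o) / o

# Hypothetical anomaly series, 1970-2016
years = np.arange(1970, 2017)
rng = np.random.default_rng(0)
obs = 0.018 * (years - 1970) + rng.normal(0, 0.05, years.size)
model = 0.021 * (years - 1970)

print(trend_difference_pct(years, model, obs))  # roughly +15-20% for these toy series
```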

Conclusion

Climate models published since 1973 have generally been quite skillful in projecting future warming. While some were too low and some too high, they all show outcomes reasonably close to what has actually occurred, especially when discrepancies between predicted and actual CO2 concentrations and other climate forcings are taken into account.

Models are far from perfect and will continue to be improved over time. They also show a fairly large range of future warming that cannot easily be narrowed using just the changes in climate that we have observed.

Nevertheless, the close match between projected and observed warming since 1970 suggests that estimates of future warming may prove similarly accurate.

Methodological note

Environmental scientist Dana Nuccitelli helpfully provided a list of past model/observation comparisons, available here. The PlotDigitizer software was used to obtain values from older figures when data was not otherwise available. CMIP3 and CMIP5 model data was obtained from KNMI Climate Explorer.


cc: Thank you, Carbon Brief, for all the important work you accomplish!

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~


Climate model projections compared to observations

Filed under: — group @ 11 April 2017 - RealClimate.org


Since we have been periodically posting updates (e.g. 2009, 2010, 2011, 2012, 2015, 2016) of model output comparisons to observations across a range of variables, we have now set up this page as a permanent placeholder for the most up-to-date comparisons. We include surface temperature projections from 1981, 1988, CMIP3, CMIP5, and satellite products (MSU) from CMIP5, and we will update this on an annual basis, or as new observational products become available. For each comparison, we note the last update date. ...
__________________________________

Climate models are even more accurate than you thought

The difference between modeled and observed global surface temperature changes is 38% smaller than previously thought.

Dana Nuccitelli, July 31, 2015

__________________________________

Climate change computer model vindicated 30 years later by what has actually happened

Ian Johnston, Environment Correspondent, @montaukian  -  March 3, 2017


Sceptics have long sneered at climate models but one made in the late 1980s has proved remarkably prophetic

Nearly 30 years ago, scientists developed a computer model of the Earth’s climate that predicted the level of global warming – to the ridicule of ‘sceptics’ at a time when there still seemed to be a debate over the issue. …

For example, when it was first made, the model came up with an unexpected forecast of little or no warming in the Southern Ocean. …

Writing in the journal Nature Climate Change, Dr Ronald Stouffer, head of the climate and ecosystem group at Princeton University, and Dr Syukuro Manabe, a senior meteorologist at the same US college, said they had not expected the model to be so accurate. 
“It is quite surprising that the observed and projected pattern of surface temperature change are very similar to each other,” they wrote.

“It … suggests that the model likely contains the key physical processes that control the geographical pattern of global warming at the earth surface.” …
________________________________

How reliable are climate models?

While there are uncertainties with climate models, they successfully reproduce the past and have made predictions that have been subsequently confirmed by observations.

There are two major questions in climate modeling - can models accurately reproduce the past (hindcasting) and can they successfully predict the future? To answer the first question, here is a summary of the IPCC model results of surface temperature from the 1800s - both with and without man-made forcings. None of the models can reproduce the recent warming without taking rising CO2 levels into account. Nobody has created a general circulation model that can explain climate's behavior over the past century without CO2 warming. …
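That with/without-forcings logic can be mimicked with the same kind of toy energy balance model sketched earlier. The forcings below are synthetic and purely illustrative, not the IPCC's actual runs:

```python
import math

LAM = 1.3        # feedback parameter, W/m2 per C (illustrative)
HEAT_CAP = 14.0  # effective heat capacity, W yr m-2 C-1 (illustrative)

def run(forcings):
    """Euler-step a zero-dimensional energy balance model, one step per year."""
    temp, series = 0.0, []
    for f in forcings:
        temp += (f - LAM * temp) / HEAT_CAP
        series.append(temp)
    return series

n = 100
co2 = [3.7 * math.log2(1.004 ** t) for t in range(n)]   # synthetic CO2 forcing ramp
natural = [0.1 * math.sin(t / 11.0) for t in range(n)]  # synthetic natural variability

with_co2 = run([c + nat for c, nat in zip(co2, natural)])
natural_only = run(natural)
# with_co2 trends steadily upward; natural_only just oscillates near zero -
# the qualitative point behind the with/without-forcings hindcast comparison.
```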
__________________________________

What Are Climate Models and How Accurate Are They?
By Lauren Harper, May 18, 2018
__________________________________

General Circulation Models of Climate
__________________________________

It wouldn't be right to leave this background collection without introducing Syukuro Manabe, the granddaddy of them all, who coupled atmosphere and ocean.

The model scientist who fixed the greenhouse effect

November 23, 2013 — andyextance
Syukuro (“Suki”) Manabe in the 1960s at Princeton University

In 1963, using one of the world’s first transistor-based supercomputers, Syukuro Manabe was supposed to be simulating how Earth’s atmosphere behaves in more detail than ever before. Instead, the young US Weather Bureau scientist felt the frustration, far more common today, of a crashed system. But resolving that problem would lead ‘Suki’ Manabe to produce the first computerised greenhouse effect simulations, and lay the foundations for some of today’s most widely used climate models. …

Suki went on to get a PhD, but when he finished, the kinds of jobs in meteorology he was qualified for were hard to find in Japan. He had, however, applied his interests to rainfall, in an approach known as numerical weather prediction pioneered by scientists like John von Neumann, Carl-Gustaf Rossby and Bert Bolin. Another leader in the field, Joe Smagorinsky, was looking at rainfall in a similar way, and had read Suki’s research. Joe was setting up a numerical weather prediction team at the US Weather Bureau in Washington, DC, and in 1958 invited Suki to join him. …

… The problem, Suki realised, was that the models needed to look at more than just heat energy in the form of radiation. Scientists had recently realised that processes related to clouds, called cumulus convection, moved heat upwards from the Earth’s surface into the low atmosphere above, the troposphere. When water vapour in clouds turns into rain it releases latent heat, which can cause a temperature rise known as sensible heat, in the upper troposphere. Suki therefore included cumulus convection and large-scale atmospheric circulation, which also helps heat flow upwards in similar ways, in his models. …
___________________________________________

Fighting for useful climate models
November 30, 2013 — andyextance
  • This is part two of a two-part post. Read part one here.
When Princeton University’s Syukuro Manabe first studied global warming with general circulation models (GCMs), few other researchers approved. It was the 1970s, computing power was scarce, and the GCMs had grown out of mathematical weather forecasting to become the most complex models available. “Most people thought that it was premature to use a GCM,” ‘Suki’ Manabe told interviewer Paul Edwards in 1998. But over following decades Suki would exploit GCMs widely to examine climate changes ancient and modern, helping make them the vital research tool they are today. …

Suki and his GFDL teammate Richard Wetherald had worked out how to push their global warming study onto whole world-scale ocean-coupled GCMs. They could now consider geographical differences and indirect effects, for example those due to changes of the distribution of snow and sea ice. Though the oceans in the world they simulated resembled a swamp, shallow and unmoving, they got a reasonably realistic picture of the difference between land and sea temperatures. Their model predicted the Earth’s surface would warm 2.9°C if the amount of CO2 in the air doubled, a figure known as climate sensitivity. That’s right in the middle of today’s very latest 1.5-4.5°C range estimate.

… Though Suki has since been less involved in global climate assessments, and flux adjustment is less widely used today, he has remained active in improving climate models. In March this year he published a paper in Proceedings of the National Academy of Sciences of the USA aimed at reducing uncertainty over climate sensitivity. …


Further reading:
Apart from the quotes in the last two paragraphs, which are from questions I asked Suki directly, all the quotes in this blog entry are from an interview with Suki Manabe by Paul Edwards on March 14-15 1998, published by the Archives for the History of Quantum Physics Collection, Niels Bohr Library & Archives, American Institute of Physics, College Park, MD USA. They are copyrighted by the AIP, and used here with its permission.

This year I’ve already written about the following pivotal climate scientists who came before Suki Manabe, or were around at the same time: Svante Arrhenius, Milutin Milanković, Guy Callendar part I, Guy Callendar part II, Hans Suess, Willi Dansgaard, Dave Keeling part I, Dave Keeling part II, Wally Broecker part I, Wally Broecker part II, Bert Bolin part I, Bert Bolin part II, Suki Manabe part I

Syukuro Manabe and Richard T. Wetherald (1975). The effects of doubling CO2 concentration on the climate of a general circulation model. Journal of the Atmospheric Sciences DOI: 10.1175/1520-0469(1975)0322.0.CO;2

Syukuro Manabe (1975). The use of comprehensive general circulation modelling for studies of the climate and climate variation The Physical Basis of Climate and Climate Modelling, Report of the International Study Conference, GARP Publications Series No. 16, World Meteorological Organization

Wallace S. Broecker, Dorothy M. Peteet & David Rind (1985). Does the ocean–atmosphere system have more than one stable mode of operation? Nature DOI: 10.1038/315021a0

Syukuro Manabe and Ronald J Stouffer (1988). Two stable equilibria of a coupled ocean-atmosphere model Journal of Climate DOI: 10.1175/1520-0442(1988)001

Syukuro Manabe and Ronald J Stouffer (1995). Simulation of abrupt climate change induced by freshwater input to the North Atlantic Nature DOI: 10.1038/378165a0

Syukuro Manabe and Ronald J Stouffer (1999). Are two modes of thermohaline circulation stable? Tellus A DOI: 10.3402/tellusa.v51i3.13461

Syukuro Manabe and Ronald J Stouffer (2000). Study of abrupt climate change by a coupled ocean-atmosphere model Quaternary Science Reviews DOI: 10.1016/S0277-3791(99)00066-9

Yoko Tsushima and Syukuro Manabe (2013). Assessment of radiative feedback in climate models using satellite observations of annual flux variation. Proceedings of the National Academy of Sciences of the United States of America, 110 (19), 7568-73 PMID: 23613585

Spencer Weart’s book, ‘The Discovery of Global Warming’ has been the starting point for this series of blog posts on scientists who played leading roles in climate science.
____________________

USA’s National Science Foundation weighs in,

… Climate is a fundamental factor in ecosystem health: while most species can survive a sudden change in the weather, such as a heat wave, flood or cold snap, they often cannot survive a long-term change in climate. Global climate is the average of all regional trends, and researchers have concluded that Earth's climate, as a whole, is warming. …

Researchers know that human activities including fossil fuel use, agriculture and land use have been the dominant causes of increased concentrations of greenhouse gases in the atmosphere over the past 250 years. In addition, aerosols and land surface changes are also altering the Earth's climate, making it extremely likely [1] that human activities have had a net warming effect since 1750 [2]. These human-caused changes to the climate system, and their consequences, provide much of the impetus for the National Science Foundation's (NSF) climate change research.

[1] According to the Treatment of Uncertainties (scroll to Box TS.1, pg. 22) in the Working Group I Assessment of the Intergovernmental Panel on Climate Change, 'extremely likely' corresponds to a likelihood of >95% probability. Solomon, S., et al. Technical Summary, in Climate Change 2007: The Physical Science Basis. Contribution of Working Group I to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change, 2007. Cambridge University Press. Box TS.1.

[2] Solomon, S., et al. Technical Summary, in Climate Change 2007: The Physical Science Basis. Contribution of Working Group I to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change, 2007. Cambridge University Press. pp. 81.
