Naturally occurring carbon dioxide in the atmosphere provided a buffer and delayed the start of global climate change, according to a new UChicago study. Photo by John McConnico/The Associated Press
In a paper published July 20 in the journal Climatic Change, David Archer, professor in geophysical sciences, theorizes that had CO2 levels been lower at the time of the Industrial Revolution, the climate impacts of releasing industrial carbon dioxide would have been felt sooner and more intensely. Under those circumstances, understanding what was going on and changing our energy system in response would have been much more challenging.
“Odd as it may seem, the naturally occurring CO2 in the atmosphere provided a buffer which has stabilized the climate until now,” Archer said.
The concentration of carbon dioxide in the atmosphere is measured in parts per million of dry air, or ppm. Over past glacial cycles, this level fluctuated between 180 ppm and 260 ppm. Measurements of air trapped in Antarctic ice cores show that the concentration of naturally occurring carbon dioxide in the atmosphere had already reached 278 ppm by the 1750s, before industrialization began in earnest.
“If the initial atmospheric carbon dioxide concentration were half its actual value, we would currently be experiencing the climate expected for the year 2050,” said Archer, setting out one possible scenario. “If there were only one-tenth as much carbon dioxide in the atmosphere initially, the climate forcing we are experiencing today would have already happened in the year 1900.”
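The buffering effect behind these scenarios follows from the roughly logarithmic dependence of CO2's radiative forcing on its concentration: the same added increment of CO2 produces a much larger forcing when the starting concentration is lower. A minimal sketch using the standard logarithmic approximation of Myhre et al. (1998); the 130 ppm increment and the baseline values are illustrative assumptions, not figures taken from Archer's paper:

```python
import math

def forcing(c_ppm, c0_ppm, alpha=5.35):
    """Radiative forcing in W/m^2 from the standard logarithmic
    approximation dF = alpha * ln(C / C0) (Myhre et al. 1998)."""
    return alpha * math.log(c_ppm / c0_ppm)

# Illustrative increment: roughly the ppm added since pre-industrial times.
added = 130.0

# Actual pre-industrial baseline, half, and one-tenth (Archer's scenarios).
for c0 in (278.0, 139.0, 27.8):
    df = forcing(c0 + added, c0)
    print(f"C0 = {c0:6.1f} ppm -> forcing from +{added:.0f} ppm = {df:.2f} W/m^2")
```

With the actual 278 ppm baseline the added CO2 yields roughly 2 W/m^2 of forcing; halving or decimating the baseline more than doubles or quadruples it, which is why the natural concentration acted as a buffer.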
Archer therefore describes the climatic changes currently being experienced on Earth as “moderate,” thanks to the blanketing effect that naturally occurring carbon dioxide in the atmosphere has had. This has given scientists the time to piece together an understanding of Earth’s climate system and the effects of fossil fuel emissions.
The first ideas about radiative balance and the greenhouse effect date to 1827, and predictions of the climate's sensitivity to carbon dioxide were made by 1896. Only after the advent of the computer, however, did a modern understanding emerge of how fossil fuels would affect the climate, and that understanding had matured enough to prompt public warnings by the 1970s.
“If the natural concentration had been a factor of two or more lower, the climate impacts of fossil fuel carbon dioxide release would have occurred about 50 or more years sooner, making it much more challenging for the developing human society to scientifically understand the phenomenon of manmade climate change in time to prevent it,” Archer said.
“To the extent that a thorough scientific understanding is also a requisite for making a decision to abandon fossil fuels, the outlook for humanity would have been considerably darker in this altered world than it has turned out in actuality.”
“Near Miss: The importance of the natural atmospheric CO2 concentration to human historical evolution,” by David Archer, published online July 20, 2016, in Climatic Change, DOI: 10.1007/s10584-016-1725-y.