So, we’ve examined the history of the Earth’s climate and how climatic changes impacted earlier civilizations. But when did we first begin to actually learn about and understand our climate?
By the late 18th century, scientists had begun to find evidence of cyclic changes in the climate over long periods of time. Joseph Fourier was the first person to study the Earth's temperature from a mathematical perspective. In 1824, Fourier recognized that our atmosphere kept the planet warmer than it would be with no atmosphere at all, and concluded that the Earth's atmosphere acts like an insulator. Today, this is known as the Greenhouse Effect.
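Fourier's point can be made concrete with the standard bare-planet energy-balance calculation. Here is a minimal sketch in Python, using textbook values for the solar constant and Earth's albedo (neither figure comes from this article):

```python
# A minimal sketch of the arithmetic behind Fourier's insight: a planet
# with no atmosphere can only be as warm as incoming sunlight allows.
# All values are standard textbook figures, not from the article.

SOLAR_CONSTANT = 1361.0   # W/m^2, sunlight arriving at Earth's orbit
ALBEDO = 0.30             # fraction of sunlight reflected back to space
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W/(m^2 K^4)

# Absorbed sunlight, averaged over the whole spherical surface (divide by 4).
absorbed = SOLAR_CONSTANT * (1 - ALBEDO) / 4

# Temperature at which a bare blackbody planet radiates that energy back out.
t_no_atmosphere = (absorbed / SIGMA) ** 0.25

print(f"Bare-planet temperature: {t_no_atmosphere:.0f} K "
      f"({t_no_atmosphere - 273.15:.0f} C)")   # ~255 K, about -18 C
print("Observed mean surface temperature: ~288 K (about 15 C)")
# The ~33 C gap is the insulating effect Fourier described:
# what we now call the greenhouse effect.
```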
In 1856, Eunice Newton Foote examined the warming effect of sunlight on different gases, and theorized that higher concentrations of carbon dioxide would raise the Earth's temperature.
In 1859, John Tyndall linked certain gases (such as methane and carbon dioxide) with the absorption of infrared radiation, and found that they strongly block it.
During the 1890s, Samuel Pierpont Langley attempted to measure the surface temperature of the moon, and recorded weaker infrared signals when the moon was lower on the horizon, further supporting the earlier scholars' work on infrared radiation absorption.
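The geometry behind Langley's observation is simple: lower in the sky, the moon's radiation travels a longer path through the absorbing atmosphere. A rough sketch of that reasoning, assuming Beer-Lambert attenuation and an illustrative infrared optical depth (this is not Langley's actual method):

```python
# A rough sketch of why lunar readings weaken near the horizon: lower in the
# sky, the moon's infrared crosses more atmosphere, and absorbing gases
# attenuate it exponentially (Beer-Lambert law). The optical depth here is
# an illustrative assumed value, not a measured one.
import math

TAU_ZENITH = 0.3  # assumed infrared optical depth looking straight up

def transmission(elevation_deg: float) -> float:
    """Fraction of infrared surviving the trip at a given moon elevation."""
    zenith = math.radians(90.0 - elevation_deg)
    airmass = 1.0 / math.cos(zenith)      # plane-parallel approximation
    return math.exp(-TAU_ZENITH * airmass)

for elevation in (90, 60, 30, 10):
    print(f"Moon at {elevation:2d} deg: {transmission(elevation):.0%} transmitted")
# Transmission drops from ~74% overhead to ~18% at 10 degrees elevation:
# the same signal, dimmed by a longer absorbing path.
```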
In 1899, Thomas Chrowder Chamberlin published a paper, "An Attempt to Frame a Working Hypothesis of the Cause of Glacial Periods on an Atmospheric Basis," which argued that changes in atmospheric carbon dioxide would result in changes to the climate.
At the start of the 20th century, several scientists and engineers continued gathering measurements to support the greenhouse effect theories; however, most of the scientific community continued to dispute or ignore them.
By the 1950s, concern about the potential for man-made climate change was slowly growing. Scientists studying CO2 absorption rates found that the ocean had only a finite ability to sequester carbon. In the 1960s, evidence from deep-sea cores and corals showed that our climate system was sensitive to small changes in carbon and could easily shift out of a stable state.
[Image: Deep sea core samples]
In 1969, NATO planned to examine climate change by establishing a research center devoted to the greenhouse effect, among other environmental topics. The first United Nations Conference on the Human Environment took place in 1972, and later that decade the academic literature shifted decisively toward predictions of global warming rather than global cooling.
The first World Climate Conference, held in Geneva in 1979, concluded that it was plausible that an increase in atmospheric carbon would contribute to global warming, that the warming would be detectable by the end of the century, and that it would be significant by the middle of the next. A US National Research Council report reached similar conclusions: if atmospheric carbon doubled, global temperatures would increase by 2-3.5 degrees Celsius, and no known physical effects could reverse the estimated warming.
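The doubled-CO2 figure can be sketched with the logarithmic radiative-forcing approximation later formalized by Myhre et al. (1998). This is not the 1979 report's method, and the sensitivity parameter below is an assumed value chosen to land inside the reported range:

```python
# A back-of-envelope sketch of the "doubled CO2" estimate. The logarithmic
# forcing fit is a standard later approximation (Myhre et al., 1998), not
# the 1979 report's own calculation, and the sensitivity parameter is an
# assumed value chosen to illustrate the reported 2-3.5 C range.
import math

FORCING_COEFF = 5.35   # W/m^2, standard log-forcing coefficient
SENSITIVITY = 0.8      # K per (W/m^2), assumed climate sensitivity

def warming(co2_ratio: float) -> float:
    """Equilibrium warming for a given ratio of CO2 to its baseline level."""
    forcing = FORCING_COEFF * math.log(co2_ratio)   # radiative forcing
    return SENSITIVITY * forcing

print(f"Doubled CO2: ~{warming(2.0):.1f} C of warming")   # ~3.0 C
```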
Finally, by the 1980s, a consensus began to form within the scientific community. Chlorofluorocarbons (CFCs) were found not only to deplete the ozone layer but also, together with methane and other gases, to contribute significantly to climate change.
Most scientists today agree that our changing climate will affect the planet for generations to come. Some 3.7 billion years ago, the world's climate was completely different from what it is today. Methane-producing microbes thrived in that environment, and the 230 million years it took for the atmosphere to shift toward one habitable for humans can make climatic change seem slow and manageable, until we look at the rate of change. Humans have been accelerating the change in climatic conditions, and no historical data can provide an estimate of the consequences.
Next week we look into the most significant causes, and the resulting consequences, of climate change from our anthropogenic activities.
Part III of our Climate Change Series, which looks at climate history, consequences, policy, and future outlook.