Environmental Law 101: Global Climate Change, Part IV

Part I of this series of posts on Global Climate Change discussed Greenhouse Gases (what they are, where they come from and why they are important to life on Earth).  Part II discussed the positions of various experts on whether Earth is going through a period of global warming (it is) without getting into a debate about how much the industrial revolution is likely to have contributed to the current warming trend and whether changing human conduct, particularly with respect to CO₂, can mitigate the trend.  Part III discussed the complexity of Planet Earth and our solar system; and provided background for the discussion in this post about the use of predictive computer modeling to project future temperature increases.

As pointed out in Part III, many reputable national and international organizations have created climate models.  Within the United States, the following organizations have such models:  (a) the National Center for Atmospheric Research; (b) the United States Department of Commerce; (c) the National Oceanic and Atmospheric Administration (NOAA); and (d) the National Aeronautics and Space Administration (NASA).  One of the most influential groups associated with the climate change debate, however, does no modeling—the Intergovernmental Panel on Climate Change (IPCC).  It was established in 1988 by the World Meteorological Organization and the United Nations to assess scientific information regarding climate change.  Rather than modeling, it provides a regular assessment of the modeling done by others.

The IPCC issued its Fourth Assessment Report in 2007, which considered 23 global climate models.  The IPCC expressed considerable confidence that the models provided credible quantitative estimates of future climate change, although it acknowledged that all of the models contained uncertainties, which is really no great surprise inasmuch as the purpose of modeling is to project something that cannot be quantified with certainty: the future.  The IPCC’s confidence levels were based on three traditional factors used to evaluate the ability of a model to project future outcomes.  These are the same factors used in judicial proceedings today to determine whether scientific evidence can be presented to a jury; they include:

–Is the model based on established physical laws and on an adequate quantity and quality of data?

–Can the model simulate important aspects of the current climate?

–Can the model reproduce/replicate past climate conditions and changes?
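The third factor above, reproducing past conditions, is often called "hindcasting."  As a rough illustration of the idea (with made-up numbers, not real climate data), one can compare a model's hindcast of past global temperature anomalies against the observed record and measure the error:

```python
# Illustrative sketch of a hindcast check: compare a model's reconstruction
# of past global mean temperature anomalies to observations.
# All numbers below are hypothetical, for illustration only.
observed = [0.10, 0.18, 0.32, 0.45, 0.62]   # observed anomalies, degrees C
hindcast = [0.12, 0.15, 0.30, 0.48, 0.58]   # model's hindcast of the same years

# Root-mean-square error between the hindcast and the observations.
rmse = (sum((o - h) ** 2 for o, h in zip(observed, hindcast)) / len(observed)) ** 0.5

# A model whose hindcast error is small relative to the trend it is asked
# to reproduce earns more confidence for its future projections.
print(f"hindcast RMSE: {rmse:.3f} degrees C")
```

In practice, of course, the evaluation is far more elaborate (many variables, many regions, many time scales), but the underlying question is the same: how closely does the model track what actually happened?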

These are also the same considerations used in contaminated-groundwater “fate and transport” modeling, where computer programs are used to predict the movement of contaminants with groundwater through all types of soils over time, for purposes of estimating where the contamination will go and when it will arrive at a drinking water source.  I have been involved in some of these cases, and my general role has been to challenge the adequacy of these models against the factors stated above.  I spent an important part of my career doing just that and, in the process, learned about the strengths, weaknesses, allure and manipulability of computer modeling.  Usually, however, court cases deal with a limited universe of two or three models.  In that context, my feeling is that it would be difficult for 23 different groups to all manipulate their models in a way that led them to the same general conclusions.  Here there tends to be “strength in numbers,” because the conclusions support each other.
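To give a flavor of what such fate-and-transport programs compute, here is a minimal one-dimensional sketch, using only simple advection under Darcy's law, of estimating when a contaminant might reach a well.  Every number is hypothetical; real models add dispersion, decay, heterogeneous soils and three dimensions:

```python
# Hedged sketch of the core of "fate and transport" modeling: a 1-D
# advection estimate of when a contaminant plume reaches a well.
# All parameter values are hypothetical, for illustration only.
hydraulic_conductivity = 1e-4   # m/s, assumed sandy aquifer
hydraulic_gradient = 0.005      # dimensionless, assumed
porosity = 0.30                 # effective porosity, assumed
retardation = 2.0               # assumed sorption slows the solute 2x

# Darcy's law gives the seepage (pore-water) velocity; sorption onto
# soil retards the contaminant relative to the water itself.
seepage_velocity = hydraulic_conductivity * hydraulic_gradient / porosity
solute_velocity = seepage_velocity / retardation

distance_to_well = 500.0        # meters, hypothetical
seconds_per_year = 365.25 * 24 * 3600
travel_time_years = distance_to_well / solute_velocity / seconds_per_year
print(f"estimated arrival at the well: about {travel_time_years:.0f} years")
```

Each assumed parameter above is exactly the kind of input an opposing expert attacks in court: change the conductivity or the retardation factor and the predicted arrival time swings by years or decades.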

The IPCC noted, in its Fourth Assessment Report, several improvements in climate modeling since it issued its Third Assessment Report in 2001 and came to several conclusions including:

— “Warming of the climate system is unequivocal, as is now evident from observations of increases in global average air and ocean temperatures, widespread melting of snow and ice, and rising global average sea level.”

— “Most of the observed increase in globally-averaged temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic GHG concentrations.  It is likely there has been significant anthropogenic warming over the past 50 years.”

— “Continued GHG emissions at or above current rates would cause further warming and induce many changes in the global climate system during the 21st century that would very likely be larger than those observed during the 20th century.”

— “Overall projected temperature increases by 2100 range from about 1˚F to 11˚F, depending on the scenario assumed.”

Critics of the IPCC Fourth Assessment Report cited uncertainty as to specific elements in the various global climate models including uncertainty as to whether warmer air will hold additional water vapor; uncertainty regarding the amount of cloudiness in a changing environment; uncertainty regarding the overall effect of aerosols (warming or cooling); and uncertainty regarding the indirect effect of aerosols on cloud formation.

Other stated concerns by critics included arguments that climate models are based on surface temperature records that are inadequate to determine the average annual temperature of Earth because recording stations are concentrated in North America and western Europe; that there are few ocean weather stations; that there are few stations in deserts, tropical forests, mountains and the poles; that most temperature records go back only 50 years; and that climate models are based on relatively few above-surface temperature records.  This bundle of issues also bothered me when I began my research, but as I point out in the final post of this series, much of the temperature data collected is for water, which covers 70% of the Earth’s surface and which determines the speed with which the Greenland and Antarctic ice caps melt and the oceans rise.  We are in the age of satellites, which constantly monitor Earth and its oceans and which can even determine the salinity of the water they observe.  So the traditional notion that scientists can only track temperatures from land stations is no longer valid.

Regardless, critics also argue that climate models are based on questionable assumptions regarding future economic growth, population growth and energy demand.  An overall stated concern was that the Earth’s climate is continuous, fluid and dynamic and cannot be represented by arbitrary horizontal and vertical grids.

In November 2014, the IPCC issued its Fifth Assessment Report (a/k/a the “synthesis report”).  Its principal conclusion was that human activity is a significant contributor to accelerating temperatures and that:  “If left unchecked, climate change will increase the likelihood of severe, pervasive and irreversible impacts for people and ecosystems.”  The Fifth Assessment was based on 30,000 scientific papers studied by about 830 authors and 2,000 reviewers.  The IPCC stated that about half of the CO₂ released into the atmosphere since the start of the industrial revolution has been produced since 1990 (because of increased population and energy/industrial development); and it projected that global average temperatures will rise by around 6.5˚F to 8.5˚F by 2100.  The IPCC projects that, in order to have a material impact on this warming trend, GHG emissions need to be decreased by around 70% by 2050 and be reduced, literally, to zero by 2100.
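Those reduction targets imply a very steep yearly decline.  A rough back-of-envelope calculation (my own, assuming a constant percentage cut each year from a 2014 baseline, which the IPCC does not specify) shows what “70% by 2050” would mean annually:

```python
# Back-of-envelope: what constant annual cut in GHG emissions achieves
# a 70% total reduction between 2014 and 2050?  Illustrative only; the
# constant-rate assumption is mine, not the IPCC's.
baseline_year, target_year = 2014, 2050
remaining_fraction = 0.30               # a 70% reduction leaves 30%
years = target_year - baseline_year     # 36 years

# Solve (1 - r)^years = remaining_fraction for the annual rate r.
annual_cut = 1 - remaining_fraction ** (1 / years)
print(f"required cut: about {annual_cut:.1%} per year, every year")
```

The answer, a bit over 3% per year sustained for decades, puts the scale of the IPCC's prescription in perspective: this is not a one-time adjustment but a permanent change in trajectory.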

In the process of researching the material in this article, I came across a column written by Dr. Piers Sellers which was printed in the New York Times on November 11, 2014.  Dr. Sellers is a former NASA astronaut (he flew on three space shuttle missions) and has a doctorate in biometeorology.  He is a British-born Anglo-American meteorologist who, among other recognitions, was appointed Officer of the Order of the British Empire in 2011.  He certainly has the right background and the right experience to provide useful insights on this issue, which is why I am including some of his comments in this post.  Among other things he wrote:

“. . . (W)e are not clueless about what is happening to the climate, thanks in part to a small fleet of satellites that fly above our heads, measuring the pulse of the earth.  Without them we would have no useful weather forecasts beyond a couple of days.  These satellite data are fed into computer models that use the laws of motion—Sir Isaac Newton’s theories—to figure out where the world’s air currents will flow, where clouds will form and rain will fall.  And—voila—you can plan your weekend, an airline can plan a flight and a city can prepare for a hurricane.

Satellites also keep track of other important variables:  polar ice, sea level rise, changes in vegetation, ocean currents, sea surface temperature and ocean salinity (that’s right—you can accurately measure salinity from space), cloudiness and so on.  These data are crucial for assessing and understanding changes in the earth system and determining whether they are natural or connected to human activities.  They are also used to challenge and correct climate models, which are mostly based on the same theories used in weather forecast models.  This whole system of observation, theory and prediction is tested daily in forecast models and almost continuously in climate models.  So, if you have no faith in the predictive capability of climate models, you should also discard your faith in weather forecasts and any other predictions based on Newtonian mechanics.  The earth has warmed nearly .8˚C over the last century and we are confident that the biggest factor in this increase is the release of CO₂ from fossil fuel burning.  It is almost certain that we will see a rise of 3.6˚F before 2100, and a 5.4˚F increase or higher is a possibility.  The impacts over such a short period would be huge.  The longer we put off corrective action, the more disruptive the outcome is likely to be.

It is my pleasure and duty as a scientist and civil servant to discuss the challenge of climate change with elected officials.  My colleagues and I do our best to transmit what we know and what we think is likely to happen.  The facts and accepted theories are fundamental to understanding climate change and they are too important to get wrong or trivialize.  Some difficult decisions lie ahead for us humans.  We should debate our options armed with the best information and ideas that science can provide.”

I was very impressed with Dr. Sellers’s comments because, to me, he has credibility; his comments do not strike me as those of a NASA scientist saying them because he “has something to gain.”  NOAA, a sister agency to NASA, has stated that 2014 will be the hottest year ever recorded, and that levels of CO₂ have reached the highest levels in at least 800,000 years.  Ironically, even the Pentagon, in a report entitled “Climate Change Adaptation Roadmap,” has concluded that weather change is the “real deal” (whatever the causes are) and expressed the following concern:  “Climate change will affect the Department of Defense’s ability to defend the Nation and poses immediate risks.”  In citing the Pentagon report, Bloomberg Businessweek (October 27 – November 2, 2014) described the Pentagon as “a cautious institution run by some of the most conservative people in the U.S. government” and, yet, Congressmen who “continue to deny that climate change is real, let alone that it requires action” are attempting to block the Pentagon from spending funds to adapt our military facilities to the new reality of rising seas and increased storm surges.  Again, in terms of judging the credibility of the source of information, it is hard to imagine that the Defense Department (which, understandably, always wants more ships, tanks and airplanes) would be part of a conspiracy to waste limited funds on protecting its facilities from threats such as rising seas and storm surges if it did not believe that the threat were real.

In terms of looking at this issue from the big perspective, the constant intense debate on “saving our grandchildren” from the burden of our national debt comes to mind.  Yale economist William Nordhaus has written an interesting book (“The Climate Casino”) which examines the argument that, because future generations will presumably be richer than the current generation, we should leave it to them to deal with the costs of adapting to climate change.  The contrast between our obsession with debt reduction and the suggestion that we should let future generations deal with problems that we don’t want to deal with is a great example of the irrationality of the debate over climate change.  This paradox reminded me of the quip by David Stockman, President Reagan’s former budget director, when he asked:  “How does an economist get out of a nine-foot hole?”  The answer is that he assumes a ten-foot ladder.  It also reminds me of Galileo, who was charged with heresy because he argued that the Earth rotates around the Sun rather than the other way around (which was official religious doctrine at the time).  The Roman Inquisition investigated his scientific position beginning in 1616, and after his trial in 1633 he was placed under house arrest for the rest of his life.  That’s what happens when science and religion (or, in the case of climate change, science and politics) don’t mix.

