Archive for January, 2009

Another NASA Defection to the Skeptics’ Camp

Thursday, January 29th, 2009

Something about retirement apparently frees people up to say what they really believe. I retired early from NASA over seven years ago to have more freedom to speak my mind on global warming.

You might recall that after Dr. Joanne Simpson retired from NASA, she admitted to a long-held skepticism regarding the role of mankind in global warming.

And who can forget NASA’s Administrator, Michael Griffin, admitting that he was skeptical of the urgency of the global warming problem? After the outrage that ensued, I suspect he wishes he had never brought it up.

And now my old boss when I was at NASA (as well as James Hansen’s old boss), John Theon, has stated very clearly that he doesn’t believe global warming is manmade…and adding “climate models are useless” for good measure. Even I wouldn’t go quite that far, since I use simple ones in my published research.

I remember the old days at NASA, when even John Theon was singing the same tune as most people at NASA were. Manmade global warming was a potentially serious threat, and NASA wanted Congress to fund new satellites to study the problem. It was a team effort to get that accomplished.

Global warming research was a relatively new field back then. Was Theon always skeptical, and just being a team player at the time? I don’t know. It could be that Dr. Theon, after watching 15 years of climate research go by, decided that he was no longer convinced that mankind was at fault for warming.

After all, there is some precedent for scientists changing their minds. One of today’s leading global warming alarmists is Stephen Schneider, who did a major about-face from the 1970s when global cooling was all the rage. At least Theon didn’t write a book back then about how serious the global warming issue was, as Schneider did on global cooling.

And how many defections have we seen in the other direction — from the skeptics’ camp to the alarmists’ camp? Seems like it’s been a one-way street so far.

Theon now also supports what I have repeatedly said over the years: that NASA’s James Hansen routinely ignored NASA policy and said whatever he wanted to the press and to Congress without getting approval first. The reason everyone at NASA looked the other way was that we were trying to get congressional funding for satellite missions to study climate. I personally don’t think we needed Hansen’s extremist views to get that accomplished, but it probably helped to some extent.

I asked NASA managers at the time how Hansen could get away with saying whatever he wanted. The answer was, “well…he’s not supposed to”.

You might think it’s OK for the lone scientist to warn everyone of impending planetary doom. But I consider it much closer to someone who makes a habit of yelling “Fire!” in a crowded theatre. Forcing expensive energy on people will lead to death and suffering. These are very real threats, not theoretical like manmade global warming, and they exist today. I personally don’t care where our energy comes from — but I do care that a maximum number of people can afford it.

In truth, it wasn’t Hansen who was muzzled, but it was me in the Clinton-Gore years, who was asked to keep my mouth shut about my skeptical views. That was fine…if a little annoying. At least the flap Hansen caused has managed to force NASA to say that their scientists no longer have to march in lock-step on scientific issues. That’s a good thing.

I have to wonder…how many more scientists will be outing themselves as skeptics? While we may never constitute a majority, and many of us have differing views on the real causes of climate change, it only takes one of us to be right for the global warming house of cards to collapse.

Al Gore’s Propaganda

Tuesday, January 27th, 2009

The methods used by global warming alarmists to convince you that more carbon dioxide is going to ruin the Earth are increasingly laced with insults and attacks directed toward anyone who might disagree with them. For instance, one of the many intellectually lazy (& false) claims is that I am paid by Big Oil.

Mr. Gore’s tactics have been a little more subtle, and reminiscent of propaganda methods which have proved to be effective throughout history at influencing public opinion. One should keep in mind that his main scientific adviser, NASA’s James Hansen, has the most extreme views of any climate researcher when it comes to predicting a global warming induced Armageddon.

Listed below are ten propaganda techniques I have excerpted from Wikipedia. Beneath each are one or more examples of Mr. Gore’s rhetoric as he has attempted to goad the rest of us into reducing our CO2 emissions. Except where indicated, most quotes are from his testimony before the U.S. Senate Environment and Public Works Committee, March 21, 2007. (Mr. Gore is scheduled to testify again tomorrow, January 28, 2009, before the Senate’s Foreign Relations Committee…if the cold and snowy weather doesn’t cause them to reschedule.)

Appeal to fear: Appeals to fear seek to build support by instilling anxieties and panic in the general population.

“I want to testify today about what I believe is a planetary emergency—a crisis that threatens the survival of our civilization and the habitability of the Earth.”

Appeal to authority: Appeals to authority cite prominent figures to support a position, idea, argument, or course of action. Also, Testimonial: Testimonials are quotations, in or out of context, especially cited to support or reject a given policy, action, program, or personality. The reputation or the role (expert, respected public figure, etc.) of the individual giving the statement is exploited.

“Just six weeks ago, the scientific community, in its strongest statement to date, confirmed that the evidence of warming is unequivocal. Global warming is real and human activity is the main cause.”

“The scientists are virtually screaming from the rooftops now. The debate is over! There’s no longer any debate in the scientific community about this.” (from An Inconvenient Truth)

Bandwagon: Bandwagon and “inevitable-victory” appeals attempt to persuade the target audience to join in and take the course of action that “everyone else is taking”. Also, Join the crowd: This technique reinforces people’s natural desire to be on the winning side. This technique is used to convince the audience that a program is an expression of an irresistible mass movement and that it is in their best interest to join.

“Today, I am here to deliver more than a half million messages to Congress asking for real action on global warming. More than 420 Mayors have now adopted Kyoto-style commitments in their cities and have urged strong federal action. The evangelical and faith communities have begun to take the lead, calling for measures to protect God’s creation. The State of California, under a Republican Governor and a Democratic legislature, passed strong, economy wide legislation mandating cuts in carbon dioxide. Twenty-two states and the District of Columbia have passed renewable energy standards for the electricity sector.”

Flag-waving: An attempt to justify an action on the grounds that doing so will make one more patriotic, or in some way benefit a group, country, or idea. Also, Inevitable victory: invites those not already on the bandwagon to join those already on the road to certain victory. Those already or at least partially on the bandwagon are reassured that staying aboard is their best course of action.

“After all, we have taken on problems of this scope before. When England and then America and our allies rose to meet the threat of global Fascism, together we won two wars simultaneously in Europe and the Pacific.”

Ad Hominem attacks: A Latin phrase which has come to mean attacking your opponent, as opposed to attacking their arguments. Also Demonizing the “enemy”: Making individuals from the opposing nation, from a different ethnic group, or those who support the opposing viewpoint appear to be subhuman.

“You know, 15 percent of people believe the moon landing was staged on some movie lot and a somewhat smaller number still believe the Earth is flat. They get together on Saturday night and party with the global-warming deniers.” (October 24, 2006, Seattle University)

Appeal to Prejudice: Using loaded or emotive terms to attach value or moral goodness to believing the proposition.

“And to solve this crisis we can develop a shared sense of moral purpose.” (June 21, 2006, London, England)

Black-and-White fallacy: Presenting only two choices, with the product or idea being propagated as the better choice.

“It is not a question of left vs. right; it is a question of right vs. wrong.” (July 1, 2007, New York Times op-ed)

Euphoria: The use of an event that generates euphoria or happiness, or using an appealing event to boost morale:

Live Earth concerts organized worldwide in 2007 by Al Gore.

Falsifying information: The creation or deletion of information from public records, in the purpose of making a false record of an event or the actions of a person or organization. Pseudo-sciences are often used to falsify information.

“Nobody is interested in solutions if they don’t think there’s a problem. Given that starting point, I believe it is appropriate to have an over-representation of factual presentations on how dangerous (global warming) is, as a predicate for opening up the audience to listen to what the solutions are, and how hopeful it is that we are going to solve this crisis.” (May 9, 2006 Grist interview)

Stereotyping or Name Calling or Labeling: This technique attempts to arouse prejudices in an audience by labeling the object of the propaganda campaign as something the target audience fears, hates, loathes, or finds undesirable. Also, Obtain disapproval: This technique is used to persuade a target audience to disapprove of an action or idea by suggesting that the idea is popular with groups hated, feared, or held in contempt by the target audience.

“There are many who still do not believe that global warming is a problem at all. And it’s no wonder: because they are the targets of a massive and well-organized campaign of disinformation lavishly funded by polluters who are determined to prevent any action to reduce the greenhouse gas emissions that cause global warming out of a fear that their profits might be affected if they had to stop dumping so much pollution into the atmosphere.” (January 15, 2004, New York City)

Global Warming, Ulcers, and Gas Pains

Friday, January 23rd, 2009

You have probably already heard about a Pew survey released yesterday, which shows that out of a choice of 20 primary concerns the American public has, the economy and jobs rank at the very top, and global warming ranks at the very bottom. Here are the stats:


A couple of blogs I’ve visited this morning show considerable fretting over this situation, with calls for reframing the global warming issue in terms that hit home with people, or reducing the alarmist rhetoric, etc. In other words, the public just isn’t getting it, and the problem lies with the communicators of the global warming message.

Well, maybe the problem doesn’t lie with the communicators…but instead with the long tradition science has of overselling issues that the scientific community knows relatively little about. The public already knows that science has a history of being spectacularly wrong with long-term predictions of doom. Paul Ehrlich’s Population Bomb bombed, and yet many people still fret over the Earth being overpopulated. (In my view, the only thing we are overpopulated with is stupidity).

And the time it takes for science to realize that the ‘scientific consensus’ is wrong can be very, very long indeed. For instance, as early as 1948 it was learned that administering antibiotics to farm animals eliminated the incidence of stomach ulcers. The medical community was told about this and considered the idea preposterous. Finally, in 2005, two Australian scientists were awarded the Nobel Prize for “discovering” the bacterial basis of peptic ulcers. The truth is, one of them was tipped off by a rancher many years earlier to look into using antibiotics.

Gee, I wonder how much more difficult it is to come to a sufficient theoretical understanding of the sensitivity of the climate system, compared to figuring out empirically what works for the treatment of stomach ulcers…when we have had literally millions of patients to test treatments on over the years? And stomach ulcer research doesn’t carry all of the baggage that accompanies global warming research, like economics, politics, the role of government in peoples’ lives, and even our religious beliefs regarding mankind’s relationship to the environment.

And what is particularly fascinating about the Pew study is that addressing global warming (at the bottom of the list of priorities) could supposedly help the economy (at the top of the list). If Obama really believes his own rhetoric about green energy creating green jobs as a way of invigorating the economy, he should be making alternative energy a top priority.

But the dirty little secret is that renewable energy is very expensive and requires large areas of land to be used for its generation…which then has its own environmental impact. I’m attending a PowerSouth electric utility cooperative meeting where the trustees are fretting over where our future energy will come from. While most of the energy producers in America are investor-owned, and will simply pass higher prices on to consumers (with investors continuing to take a piece of the pie), the electric co-ops are more interested in keeping prices low because they are looking out for their customers.

The most interesting presentation I’ve seen at the meeting so far compared the energy content of various energy sources to how much real estate it takes to create that energy. Ethanol was absolutely the worst, with huge amounts of land required to generate relatively little energy. Nuclear power is, of course, at the other end of the spectrum, with a huge amount of energy being generated with very little land.

But the future (it was claimed by the presenter) is with methane and natural gas, large deposits of which are being found under the ocean floor. If one compares the hydrogen energy content of various fuels to their carbon content (and thus how much CO2 is produced when they are burned), the order from worst to best runs: wood & biomass, coal, petroleum, natural gas, methane, and pure hydrogen. (Since there are no natural deposits of hydrogen, its inclusion is somewhat irrelevant.)
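That worst-to-best ordering can be sketched numerically. The emission factors below are round numbers I have assumed for illustration (grams of CO2 released per megajoule of heat), in the range commonly cited for each fuel class; they are not figures from the presentation.

```python
# Approximate CO2 emission factors in g CO2 per MJ of heat released.
# These are illustrative round numbers; exact values vary by fuel grade.
emission_factors = {
    "wood & biomass": 110.0,
    "coal": 95.0,
    "petroleum": 73.0,
    "natural gas": 56.0,   # mostly methane, plus heavier hydrocarbons
    "methane": 55.0,
    "pure hydrogen": 0.0,  # no carbon, so no CO2 at the point of use
}

# Rank fuels from most to least CO2 per unit of energy.
ranked = sorted(emission_factors.items(), key=lambda kv: -kv[1])
for fuel, g_per_mj in ranked:
    print(f"{fuel:15s} ~{g_per_mj:5.1f} g CO2 / MJ")
```

With these assumed numbers, the ranking reproduces the worst-to-best order given above.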

And the infrastructure required to retrieve and use these gases to fire electric turbines takes up very little space. His view was that methane IS renewable, continuously being generated by microbes in the Earth’s crust.

Whether these sources will help solve our energy problems, I have no idea. But if for no other reason than the physical limits on how much energy we can get out of one acre of wind, solar, biomass, etc., I predict we will never substantially reduce CO2 emissions until we embrace nuclear once again, or develop some other new energy technology which we, at this point, can only dream about.

(Of course, if you have read my writings, you already know my research suggests that the climate system is relatively insensitive to the CO2 we produce, anyway.)

The energy needs of humanity are so vast (and growing, especially in India and China) that we can only stay economically competitive if we abandon ideas which provide a poor return on investment, and remain open to ideas that might at first seem preposterous…such as using antibiotics to treat ulcers.

The Origin of Increasing Atmospheric CO2 – a Response from Ferdinand Engelbeen

Thursday, January 22nd, 2009

After yesterday’s post about manmade vs. natural sources of CO2, I received the following e-mail from Ferdinand Engelbeen. I’ve reproduced that e-mail below, and made a couple of comments (also in italics)….I’m at a conference, so I posted this quickly…sorry for any typos… and thanks to Ferdinand for taking the time to respond. – Roy

Dear Dr. Spencer,

I have responded a few times, via Anthony Watts’ weblog, to your various thoughts about the origin of the increase of CO2 in the atmosphere. Regardless of whether that increase is manmade or not, I think we agree that the influence of the increase itself on temperature/climate is limited, if observable at all. But we disagree about the origin of the increase. I am pretty sure that the increase is man-made, and have made a comprehensive page presenting all the arguments for that at:

Thus here follows my critique on your blog page:

Besides the mass balance (which excludes any net natural addition as long as the increase in the atmosphere per year is less than the emissions), there are two essential points:

The d13C reduction:
Indeed it is not directly possible to make a distinction between 13C depleted fossil fuel burning and 13C depleted vegetation decay. The fingerprint of d13C changes by vegetation over the seasons is much larger than from fossil fuel burning (~60 GtC vs. 8 GtC, with about the same average d13C level). But vegetation changes are two-way, while fossil fuel burning is one-way addition. Thus the absolute height of the seasonal variation is not important, only the difference at the end of the year with the previous year is important for the year-by-year and multiyear trend. (Yes…I was making no claim regarding the seasonal cycle as being contributed to by mankind…although a change from one year to the next could still be due to a small imbalance between the huge natural sources and sinks. – Roy)

To know if the biosphere as a whole (sea algae + vegetation + soil bacteria) is a net absorber or emitter, we have a nice way to distinguish between the two opposing actions of growth and decay: oxygen use. The types of fuels used are known with reasonable accuracy, and thus their oxygen use when burned is known with reasonable accuracy too. Since about 1990, oxygen measurements have had sufficient accuracy to see the very small changes that fossil fuel use causes, and the deficit/surplus that biosphere decay/growth adds to the fossil fuel use of oxygen. This revealed that since about 1990 the biosphere has been a net source of oxygen and thus a net sink for CO2 (preferentially of 12CO2), enriching the atmosphere in 13CO2. The biosphere as a whole thus enriches the atmosphere in 13CO2 and can’t be the origin of the decreasing d13C levels. The oxygen use and d13C changes are used to estimate the partitioning of CO2 sequestering between land biosphere and oceans:
and: (up to 2002) (But isn’t this just one component of the total budget? You are talking specifically about what happens to the extra manmade CO2, and how it affects O2 and C13…I’m talking about some sort of oceanic, temperature dependent source…say a decrease in phytoplankton growth associated with warming.-Roy)

Other sources of low d13C don’t exist in fast/large quantities: (deep) oceans, calcite deposits/weathering, volcanic degassing,… in general have high(er) d13C levels (around zero per mil) compared to the atmosphere (-8 per mil). Thus the human use of low d13C (and zero d14C) fossil fuels is the sole cause of decreasing d13C levels. (But Behrenfeld, 2006 showed huge Climate Driven Trends in Contemporary Ocean Productivity…wouldn’t these have a substantial depleted C13 signature? – Roy)

d13C levels decrease in the atmosphere as well as in the upper oceans, in line with the use of fossil fuels: relatively stable from 600 to 150 years ago, then decreasing faster and faster over the past 150 years; see e.g. the decrease of d13C in coralline sponges and ice cores/firn/atmosphere at:

Although fossil fuel burning is the sole cause of the d13C decrease, the observed decrease is only about 1/3rd of what would be expected from the burning of fossil fuels if every molecule remained in the atmosphere. But that is not the case, as nearly 20% of the atmospheric CO2 is exchanged each year with vegetation (little influence, as most comes back next season), the upper oceans (idem), and the deep oceans (that is important, as that CO2 disappears into a large mass). With an exchange of about 40 GtC with the deep oceans, the calculated reduction of d13C fits most of the (recent) trend of d13C in the atmosphere…
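The dilution effect described here can be mimicked with a toy one-box mixing model. Everything below is an illustrative sketch, not Engelbeen’s calculation: the reservoir size, fluxes, and d13C values are round numbers, and isotopic fractionation at the air-sea interface is crudely folded into an effective returned d13C held at the pre-industrial atmospheric value.

```python
# Toy one-box model of atmospheric d13C (all numbers illustrative).
# Fossil CO2 (very negative d13C) is added each year; a fixed mass is
# exchanged with a large deep-ocean reservoir.

def simulate(years, exchange_gtc):
    mass, d13c = 800.0, -8.0            # atmosphere: GtC, per mil
    fossil_in, fossil_d13c = 8.0, -28.0
    returned_d13c = -8.0                # effective d13C of the return flux
    for _ in range(years):
        # Mass-weighted mixing of the fossil input and the return flux.
        total = mass + fossil_in + exchange_gtc
        d13c = (mass * d13c + fossil_in * fossil_d13c
                + exchange_gtc * returned_d13c) / total
        # The exchange mass leaves at the new atmospheric d13C, and
        # natural sinks remove half of the fossil addition.
        mass = total - exchange_gtc - fossil_in / 2.0
    return d13c

no_exchange = simulate(50, 0.0)
with_exchange = simulate(50, 40.0)
print(f"50-yr d13C, no deep-ocean exchange:   {no_exchange:.2f} per mil")
print(f"50-yr d13C, 40 GtC/yr ocean exchange: {with_exchange:.2f} per mil")
```

With the exchange switched on, the d13C decline is substantially damped, which is the flavor of the dilution argument above.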

Further: the sentence:
“And since most of the cycling of CO2 between the ocean, land, and atmosphere is due to biological processes, this alone does not make a decreasing C13/C12 ratio a unique marker of an anthropogenic source.”
is not accurate, as most of the seasonal cycling between the oceans and atmosphere (~90 GtC) is abiogenic: a matter of the solubility of CO2 at different temperatures and the salt/bi/carbonate concentrations and pH of the ocean surface layer.
Feely et al. have made several interesting pages on that: (I’m surprised at this claim…I thought the seasonal cycle is dominated by Northern Hemispheric vegetation growing and dying each year? Indeed, my first figure from yesterday’s post clearly shows the seasonal cycle has the largest dC13 signature…? – Roy)

Thus the main conclusion is that the d13C trend is not caused by the biosphere and strengthens the case for a human origin.

The temperature-CO2 connection:
I have had several discussions about this subject. The first point is that most people look at too-short time intervals. We are interested in the (non-)role of increased CO2 accumulation in the atmosphere on climate. Thus let us start with the trend of accumulations: temperature vs. accumulated emissions and accumulation in the atmosphere:


Well, the temperature trend shows distinct periods with flat to negative trends (1900-1910, 1945-1975 and 1998-current) and rising trends (1910-1945, 1975-1998). There is some correlation with CO2 levels, and it works as well for CO2 causing temperature, as the “warmers” assure us, as for the opposite direction, but neither is really convincing. On the other hand, the accumulation in the atmosphere is a near-perfect replica of the accumulated emissions, only at a lower level: 55% of the emissions over the whole 100+ year trend, or over any multi-year part of that trend. One can see that more clearly in the two correlation trends:


A huge short-term temperature change of half the scale has an influence of only a few ppmv/°C, while the “long”-term temperature change should have an influence of 80 ppmv/°C? That seems rather impossible, especially when compared to the Vostok (and other ice core) records, where a relatively constant ratio of 8 ppmv/°C is recorded over the full past 420,000 years.

Compare that to the emissions-atmospheric accumulation correlation trend:


A near perfect fit…

Thus what you see if you look at the year-by-year accumulation rate is a mix of a trend (at about 55% of the emission, or about 1.5 ppmv/yr nowadays) and the direct, short-term influence of (mainly) temperature.

Your formula for the direct temperature influence (both detrended) is:
dCO2/dt = 1.71 ppmv/yr/°C
but there is a problem with this. The detrended plot for dCO2/dt should average around zero, but the plot shows an average of 1.24 ppmv/yr which is attributed to the temperature changes, and thus is not really detrended. And even with a negative temperature anomaly there is still a continuous increase of CO2, which only ends at about a -0.7°C temperature anomaly.

The next, more important problem is that the formula has no time limit. Even at a constant temperature, CO2 increases indefinitely. If that is applied to the past, then the ice ages should show zero CO2 and the interglacials several thousands of ppmv… Even in the more nearby past: the LIA lasted several decades, but shows a decrease of only about 8 ppmv/°C, near-identical to the long-term ratio seen in the Vostok ice core. And the current variability around the trend is on the order of 3 ppmv/°C. That is a lot different from an unlimited trend of 1.71 ppmv/yr/°C.

If we look again at the graph of the temperature and the accumulation in the atmosphere, then according to your interpretation the CO2 levels would be increasing at a decreasing rate 1900-1920, at an increasing rate 1920-1940, at a constant rate 1940-1975, at an increasing rate 1975-1998, and at a constant-to-declining rate 1998-current. Although resembling, that is not really what is seen in the trends. Still, the trend from accumulated emissions is far superior…

My formula looks more like:
dCO2/dt = 3 * dT + 0.55 * d(emissions)
where dCO2/dt holds for any time period (past or present), 3*dT is for short term changes and may expand to 8*dT for long term sustained changes and dT is for any time period, but always the difference over the full period.
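The practical difference between this formula and the 1.71 ppmv/yr/°C relation can be seen by stepping both forward under a sustained but constant temperature anomaly with emissions switched off. This is a sketch with arbitrary illustrative values (+0.5 deg C, no emissions):

```python
# Contrast the two rate laws under a constant +0.5 deg C anomaly.
years = 50
t_anom = 0.5        # constant temperature anomaly (deg C), illustrative
emissions = 0.0     # ppmv/yr of human emissions, switched off here

# Spencer's detrended relation: dCO2/dt = 1.71 ppmv/yr per deg C.
co2_spencer = 0.0
# Engelbeen's relation: dCO2 = 3 * dT + 0.55 * emissions. With dT taken
# over the full period, a one-time step of +0.5 deg C adds 1.5 ppmv, once.
co2_engelbeen = 3.0 * t_anom

for _ in range(years):
    co2_spencer += 1.71 * t_anom        # keeps accumulating indefinitely
    co2_engelbeen += 0.55 * emissions   # nothing more without emissions

print(f"Spencer-style rise after {years} yr:   {co2_spencer:.1f} ppmv")
print(f"Engelbeen-style rise after {years} yr: {co2_engelbeen:.1f} ppmv")
```

The unbounded accumulation under the first relation is exactly the “no time limit” objection raised above.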

Pieter Tans has a better fit than mine by including precipitation. See:

Last but not least, the mass balance is a little difficult to explain with your attribution of a large part of the increase to temperature. As long as no matter disappears to space we have according to your attribution over the past 50 years:

attributed to temperature: +52 ppmv
added by humans: +110 ppmv
observed: +60 ppmv
to be removed by unknown (natural) sinks: 102 ppmv

Thus some natural sinks must remove more CO2 from the atmosphere than the increase of CO2 which is attributed to the temperature increase…
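The bookkeeping behind that conclusion can be checked in a few lines (values in ppmv, from the list above):

```python
# Engelbeen's 50-year CO2 bookkeeping, in ppmv.
attributed_to_temperature = 52
added_by_humans = 110
observed_increase = 60

# Whatever entered the atmosphere but does not show up in the observed
# increase must have been removed by natural sinks.
required_removal = attributed_to_temperature + added_by_humans - observed_increase
print(f"natural sinks must have removed {required_removal} ppmv")
```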

Moreover, as the oceans are the main source of extra CO2 from increasing temperatures, according to this scheme the sole fast sink remaining is the biosphere. Thus the terrestrial biosphere should have removed 102 ppmv in 50 years, which means the equivalent of about 33% of the total terrestrial vegetation… That, in times of disappearing tropical forests, seems rather unlikely…

That is the kernel of my objections…


Ferdinand Engelbeen
retired engineer, process automation

Increasing Atmospheric CO2: Manmade…or Natural?

Wednesday, January 21st, 2009

I’ve usually accepted the premise that increasing atmospheric carbon dioxide concentrations are due to the burning of fossil fuels by humans. After all, human emissions average around twice that which is needed to explain the observed rate of increase in the atmosphere. In other words, mankind emits more than enough CO2 to explain the observed increase in the atmosphere.

Furthermore, the ratio of the C13 isotope of carbon to the normal C12 form in atmospheric CO2 has been observed to be decreasing at the same time CO2 has been increasing. Since CO2 produced by fossil fuel burning is depleted in C13 (so the argument goes) this also suggests a manmade source.

But when we start examining the details, an anthropogenic explanation for increasing atmospheric CO2 becomes less obvious.

For example, a decrease in the relative amount of C13 in the atmosphere is also consistent with other biological sources. And since most of the cycling of CO2 between the ocean, land, and atmosphere is due to biological processes, this alone does not make a decreasing C13/C12 ratio a unique marker of an anthropogenic source.

This is shown in the following figure, which I put together based upon my analysis of C13 data from a variety of monitoring stations from the Arctic to the Antarctic. I isolated the seasonal cycle, interannual (year-to-year) variability, and trend signals in the C13 data.

The seasonal cycle clearly shows a terrestrial biomass (vegetation) source, as we expect from the seasonal cycle in Northern Hemispheric vegetation growth. The interannual variability looks more like it is driven by the oceans. The trends, however, are weaker than we would expect from either of these sources or from fossil fuels (which have a C13 signature similar to vegetation).

C13/C12 isotope ratios measured at various latitudes show that CO2 trends are not necessarily from fossil fuel burning.

Secondly, the year-to-year increase in atmospheric CO2 does not look very much like the yearly rate of manmade CO2 emissions. The following figure, a version of which appears in the IPCC’s 2007 report, clearly shows that nature has a huge influence over the amount of CO2 that accumulates in the atmosphere every year.

The yearly increase of CO2 measured at Mauna Loa shows huge natural fluctuations which are caused by temperature changes.

In fact, it turns out that these large year-to-year fluctuations in the rate of atmospheric accumulation are tied to temperature changes, which are in turn due mostly to El Nino, La Nina, and volcanic eruptions. And as shown in the next figure, the CO2 changes tend to follow the temperature changes, by an average of 9 months. This is opposite to the direction of causation presumed to be occurring with manmade global warming, where increasing CO2 is followed by warming.

Year to year CO2 fluctuations at Mauna Loa show that the temperature changes tend to precede the CO2 changes.

If temperature is indeed forcing CO2 changes, either directly or indirectly, then there should be a maximum correlation at zero months lag for the change of CO2 with time versus temperature (dCO2/dt = a + b*T would be the basic rate equation). And as can be seen in the above graph, the peak correlation between these two variables does indeed occur close to zero months.
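The zero-lag argument can be illustrated on synthetic data: build a temperature-like series, form a CO2 growth rate from the rate equation dCO2/dt = a + b*T, and find where the correlation of temperature with that growth rate peaks. This is a sketch on made-up numbers (the coefficients and noise model are arbitrary), not the actual Mauna Loa analysis:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly "temperature anomaly": smoothed noise standing in for
# ENSO-driven variability (illustrative, not real data).
n = 600
t = np.convolve(rng.standard_normal(n + 24), np.ones(12) / 12, mode="same")[:n]

# CO2 growth rate from the rate equation dCO2/dt = a + b*T.
a, b = 0.125, 0.15  # arbitrary ppm/month baseline and sensitivity
dco2 = a + b * t

def lag_corr(x, y, lag):
    """Correlation of x with y shifted `lag` samples later."""
    if lag == 0:
        return np.corrcoef(x, y)[0, 1]
    return np.corrcoef(x[:-lag], y[lag:])[0, 1]

corrs = [lag_corr(t, dco2, k) for k in range(24)]
best_lag = int(np.argmax(corrs))
print(f"corr(T, dCO2/dt) peaks at lag {best_lag} months")
```

Because the growth rate is constructed directly from T, the correlation peaks at zero lag, which is the signature described in the text.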

And this raises an intriguing question:

If natural temperature changes can drive natural CO2 changes (directly or indirectly) on a year-to-year basis, is it possible that some portion of the long term upward trend (that is always attributed to fossil fuel burning) is ALSO due to a natural source?

After all, we already know that the rate of human emissions is very small in magnitude compared to the average rate of CO2 exchange between the atmosphere and the surface (land + ocean): somewhere in the 5% to 10% range. But it has always been assumed that these huge natural yearly exchanges between the surface and atmosphere have been in a long term balance. In that view, the natural balance has only been disrupted in the last 100 years or so as humans started consuming fossil fuel, thus causing the observed long-term increase.
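For a sense of scale, here is that comparison with illustrative round numbers (the gross flux figures are assumptions in the range commonly quoted, not precise inventories):

```python
# Illustrative round numbers (GtC/yr) for gross annual surface-atmosphere
# carbon exchange; exact inventory figures vary by source.
ocean_exchange = 90.0
land_exchange = 60.0
human_emissions = 8.0

gross_natural = ocean_exchange + land_exchange
fraction = human_emissions / gross_natural

# A natural source/sink imbalance of only this few percent would rival
# the human input in magnitude.
print(f"human emissions ~ {100 * fraction:.0f}% of gross natural exchange")
```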

But since the natural fluxes in and out of the atmosphere are so huge, this means that a small natural imbalance between them can rival in magnitude the human CO2 input. And this clearly happens, as is obvious from the second plot shown above!

So, the question is, does long-term warming also cause a CO2 increase, like the one we see in the short term?

Let’s look more closely at just how large these natural, year-to-year changes in CO2 are. Specifically, how much CO2 is emitted for a certain amount of warming? This can be estimated by detrending both the temperature and CO2 accumulation rate data, and comparing the resulting year-to-year fluctuations (see figure below).
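The detrend-and-regress step just described can be sketched with numpy. The series below are synthetic stand-ins with a known sensitivity built in, so this only illustrates the mechanics of the method, not the real Mauna Loa result:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for 50 years of annual data (illustrative only): a
# warming trend plus noise, and a CO2 growth rate built with a known 1.71
# ppm/yr per deg C sensitivity to the temperature fluctuations.
years = np.arange(50, dtype=float)
noise = 0.1 * rng.standard_normal(50)
temp = 0.01 * years + noise
co2_rate = 1.0 + 0.015 * years + 1.71 * noise

def detrend(x):
    """Remove the linear least-squares trend from a series."""
    return x - np.polyval(np.polyfit(years, x, 1), years)

# Regress the detrended year-to-year fluctuations against each other; the
# slope recovers the built-in sensitivity.
slope = np.polyfit(detrend(temp), detrend(co2_rate), 1)[0]
print(f"fluctuation sensitivity: {slope:.2f} ppm/yr per deg C")
```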


Although there is considerable scatter in the above figure, we see an average relationship of 1.71 ppm/yr for every 1 deg. C change in temperature. So, how does this compare to the same relationship for the long-term trends? This is shown in the next figure, where we see 1.98 ppm/yr for every 1 deg. C of temperature change.


This means that most (1.71/1.98 = 86%) of the upward trend in carbon dioxide since CO2 monitoring began at Mauna Loa 50 years ago could indeed be explained as a result of the warming, rather than the other way around.

So, there is at least empirical evidence that increasing temperatures are causing some portion of the recent rise in atmospheric CO2, in which case CO2 is not the only cause of the warming.

Now, the experts will claim that this is all bogus, because they have computer models of the carbon budget that can explain all of the long-term rise in CO2 as a result of fossil fuel burning alone.

But, is that the ONLY possible model explanation? Or just the one they wanted their models to support? Did they investigate other model configurations that allowed nature to play a role in long term CO2 increase? Or did those model simulations show that nature couldn’t have played a role?

This is the trouble with model simulations. The ones that get published are usually the ones that support the modeler’s preconceived notions, while alternative model solutions are ignored.

If an expert in this subject sees a major mistake I’ve made in the above analysis, e-mail me and I’ll post an update, so that we might all better understand this issue.

Does Nature’s Thermostat Exist? A Global Warming Debate Challenge

Tuesday, January 13th, 2009

Scientific disagreements over just how much mankind’s carbon dioxide emissions will warm the planet can be described with the analogy of the thermostat in your home. You set the thermostat to a certain temperature, and if it senses (for example) that the temperature is rising too much above that preset level, a cooling mechanism (air conditioning) kicks in and works to push the temperature back down.

I, and a number of other scientists, believe that nature has a thermostatic control mechanism that “pushes back” against a warming influence, such as the relatively weak warming from more atmospheric carbon dioxide. (The direct warming effect of more CO2 would amount to little more than 1 deg. F by late in this century, and is generally not the subject of debate.)

In climate research (and engineering, and physics) a thermostatic control mechanism is called ‘negative feedback’, and as discussed elsewhere on my web site there are a number of studies that suggest it really does exist in the climate system. At this point my own research suggests that the natural cooling mechanism is most likely due to the response of clouds to warming. While it is a bit technical, the issue is introduced in this peer-reviewed publication.
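The standard textbook way to quantify this is the feedback-gain expression: a no-feedback warming dT0 becomes dT = dT0 / (1 - f), where f < 0 is net negative feedback and 0 < f < 1 is net positive feedback. This little sketch is just that formula; the numbers are illustrative, not from any model:

```python
def equilibrium_warming(dT0, f):
    """Equilibrium response given no-feedback warming dT0 and feedback factor f."""
    if f >= 1.0:
        raise ValueError("f >= 1 gives a runaway (unstable) response")
    return dT0 / (1.0 - f)

dT0 = 1.1  # deg C, roughly the no-feedback response to doubled CO2
print(equilibrium_warming(dT0, -0.5))  # negative feedback damps:     ~0.73 C
print(equilibrium_warming(dT0,  0.5))  # positive feedback amplifies:  2.2 C
```

The whole debate over catastrophic versus benign warming is, in effect, a debate over the sign and size of f.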

To be sure, positive feedbacks also exist in nature, such as the enhanced solar heating of the ocean that accompanies the melting of sea ice. But these are regional and relatively weak compared to the dominating, global influence of cloud feedbacks.

The reason why we keep hearing about how serious global warming will be is that all twenty-something of the computer climate models tracked by the IPCC now have net positive feedbacks. They enhance the small direct warming effect of extra CO2…by near-catastrophic amounts for a couple of the models.

Of course, the models are only behaving the way they are programmed to behave, and here I discuss why I think the modelers have seriously misinterpreted the role of clouds in climate change when building those models. While the modelers do not realize it, their tests of the models’ behavior with satellite observations have not been specific enough to validate the models’ feedback behavior. In effect, their tests are yielding ‘false positives’.

Obviously, the thermostat (feedback) issue is the most critical one that determines whether manmade global warming will be catastrophic or benign. In this context, it is critical for the public and politicians to understand that the vast majority of climate researchers do not work on feedbacks.

In popular political parlance, most climate researchers do not appreciate the nuanced details of how one estimates feedbacks in nature, and therefore they are not qualified to pass judgment on this issue. Therefore, any claims about how many thousands of scientists agree with the IPCC’s official position on global warming are meaningless.

I would challenge any IPCC scientist who considers himself or herself an expert on feedbacks to debate this issue against me. We can invite a variety of physicists and engineers who understand the concept of ‘feedback’, and who do not already have strong philosophical or political biases on the issue, and ask them to judge whether the IPCC models are behaving in realistic ways when it comes to cloud feedbacks.

The debate could be in either oral or written form, with our best arguments presented for evaluation by others. Then, those judges can summarize in lay terms for politicians whether “the science is settled” on this issue.

And I’m particularly interested to see whether anyone can respond to this challenge without using phrases like “this issue is settled”, “the cloud claim is bogus”, or without ad hominem attacks.

New Study Doesn’t Support Climate Models (But You’ll Never Hear About It)

Sunday, January 11th, 2009

A new study just published in the January 2009 issue of Journal of Climate uses a model to study the effect of warming oceans on the extensive low-level stratocumulus cloud layers that cover substantial parts of the global oceans. This study, entitled “Response of a Subtropical Stratocumulus-Capped Mixed Layer to Climate and Aerosol Changes”, by Peter Caldwell and Christopher Bretherton, is important because it represents a test of climate models, all of which now cause low level clouds to decrease with warming.

And since less low cloud cover means more sunlight reaching the surface, the small amount of direct warming from extra CO2 in climate models gets amplified – greatly amplified in some models. And the greater the strength of this ‘positive cloud feedback’, the worse manmade global warming and associated climate change will be.

But everyone agrees that clouds are complicated beasts…and it is not at all clear to me that positive cloud feedback really exists in nature. (See here and here for such evidence).

The new Journal of Climate study addressed the marine stratocumulus clouds that form just beneath the temperature inversion (warm air layer) capping the relatively cool boundary layer west of the continents. In the marine boundary layer, water vapor evaporated from the ocean surface is mixed turbulently and trapped, and some of that vapor condenses into cloud just below the inversion.

That warm temperature inversion, in turn, is caused by rising air in thunderstorms – usually far away — forcing the air above the inversion to sink, and sinking air always warms. The inversion forms at a relatively low altitude where the air is ‘prevented’ from sinking any farther. This relationship is shown in their Figure 1, which I have reproduced below.

Conceptual model of how marine stratocumulus clouds are formed, from Fig. 1 in Caldwell and Bretherton (2009).


The authors used a fairly detailed model to study the behavior of these clouds in response to warming of the ocean and found that the cloud liquid water content increased with warming, under all simulated conditions. This, by itself, would be a negative feedback (natural cooling effect) in response to the warming since denser clouds will reflect more sunlight. At face value, then, these results would not be supportive of positive cloud feedback in the climate models.

But what is interesting is that the authors do not explicitly make this connection. Even though they mention in the Introduction the importance of their study to testing the behavior of climate models, in their Conclusions they don’t mention whether the results support – or don’t support — the climate models.

And I would imagine they will not be happy with me making that connection for them, either. They would probably say that their study is just one part of a giant puzzle that doesn’t necessarily prove anything about the climate models that predict so much global warming.

Fair enough. But a double standard has clearly been established when it comes to publishing studies related to global warming. Published studies that support climate model predictions of substantial manmade global warming are clearly preferred over those that do not support the models, and explicitly stating that support in the studies is permitted.

But results that appear to contradict the models either cannot get published…or (as in this study) the contradiction cannot be explicitly stated without upsetting one or more of the peer reviewers.

For instance, a paper I recently submitted to Geophysical Research Letters was very rapidly rejected on the basis of a single reviewer. (I have never heard of a paper’s fate being left up to one reviewer, unless no other reviewers could be found, which clearly was not the case here.) That reviewer was quite hostile to our satellite-based results, which implied the climate models were wrong in their cloud feedbacks.

One wonders whether support of climate models would have been mentioned in the Caldwell and Bretherton paper if their results were just the opposite, and supported the models. Of course, we will never know.

Daily Monitoring of Global Average Temperatures

Saturday, January 10th, 2009

For those who are interested in monitoring how the current month’s global average tropospheric temperature is shaping up, we have a website where you can plot daily global average satellite-based temperatures since August 1998. The data come from the AMSU instrument flying on the NOAA-15 satellite, and the updates are made automatically once a day in the late afternoon; they run about 1-2 days behind real-time.

The following screenshot shows an example I took from the website today.

NASA Discover project website screenshot of tool to monitor daily global-average temperatures measured by the NOAA-15 satellite.


The webpage tool uses Java, which can be kind of slow loading (clearing your browser’s cache first seems to help…e.g., in Firefox use Tools => Clear private data).

Use the drop-down menu to pick “ch5” (AMSU channel 5) which is the channel John Christy and I use to monitor mid-tropospheric temperatures. The fairly large fluctuations seen within individual months are usually due to increases (warming) or decreases (cooling) in tropical rainfall activity, called “intraseasonal oscillations”.

Channel 9 is the channel we use for lower stratospheric temperatures. Channel 13 is in the upper stratosphere, and is kind of interesting since it is expected to cool with increasing atmospheric carbon dioxide concentrations.

The check boxes allow you to choose which years to display. You can also display the daily record highs and lows measured during the first 20 years of our MSU-based record, 1979-1998, but only for AMSU channels 5 (mid-troposphere) and 9 (lower stratosphere).

What we usually watch is how the current month is shaping up compared to the same calendar month in the previous year. For instance, in the screenshot above we see that early January 2009 is running roughly the same as January 2008, which ended up being close to the 1979-98 average.

This web page should be used only as a rough guide, because some data adjustments are made before we officially post the UAH monthly updated data. (I post a plot of those data here.) The biggest difference is that we don’t even use NOAA-15 right now…we use the AMSU data from NASA’s Aqua satellite in the final UAH product.

Brutal Cold in the IPCC Models versus Nature

Friday, January 9th, 2009

Every winter I start thinking about the processes that lead to brutally cold air masses over regions far removed from the warming influence of the oceans…Siberia, interior Canada, etc. Temperatures in interior Alaska have been routinely dipping into the -50s F over the past week or so. Europe has been hard hit by unusually cold weather, and Vladimir Putin decided this would be a good time to cut off natural gas supplies to a number of European countries.

As of this writing (January 9), it looks like the coldest temperatures in the Lower 48 are yet to come, as the coldest airmass over northwest Canada finds its way down into the central and eastern U.S. starting around next Wednesday (January 14) or so. Gee, where is global warming when you really need it?

The ‘scientific consensus’ is that these frigid air masses are the ones that should warm the most with manmade global warming. The reasoning goes that since they contain very little water vapor (Earth’s main greenhouse gas), the warming effect of the extra carbon dioxide should be proportionately greater there.

But what causes these air masses to get so cold in the first place? Well, little or no sunlight is the most direct reason, which means they radiatively cool to outer space without any solar heating to offset that infrared cooling.

But what limits how cold they can get? Why do these temperatures seldom fall below -60 or -70 deg. F….temperatures reached fairly early in the winter, but which then level off? The answer is mostly related to the water vapor content of the air.

There is an interesting issue of causation involved with these cold and dry air masses. Contrary to what some meteorologists think, the air doesn’t become dry because of the cold. If that were the case, the air would remain saturated as it cooled, producing continuous clouds and fog, rather than becoming clear and relatively dry as is observed.

No, rather than being dry because it is cold, the air instead becomes cold because it is dry. And the reason the air is so dry is because it has been slowly sinking from high in the atmosphere, where there is very little water vapor. And why is THAT air so dry? Because precipitation processes have removed the water vapor as relatively warmer and moister air ascends in low pressure areas — snowstorms — which move around the periphery of the high pressure zones that are created by the strong cooling.

So, ultimately, it is precipitation processes in regions remote from these cold high pressure areas that mostly determine how cold surface temperatures will get. And since we have little understanding of how these precipitation processes in the upper atmosphere might change with ‘global warming’, there is (in my mind) more uncertainty about water vapor feedbacks than the IPCC has led us to believe.
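The role water vapor plays in limiting the cooling can be put in rough numbers with a back-of-envelope sketch. All values here are illustrative assumptions of mine, not measurements: the surface loses upwelling infrared radiation but gets some back as downwelling sky radiation, and a dry atmosphere (low effective sky emissivity) returns much less of it:

```python
# Back-of-envelope sketch of wintertime surface cooling under a dry
# versus a moist atmosphere. ALL numbers are illustrative assumptions.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def net_surface_loss(T_sfc, T_atm, atm_emissivity):
    """Net IR loss from the surface: upwelling minus downwelling sky radiation."""
    return SIGMA * T_sfc**4 - atm_emissivity * SIGMA * T_atm**4

T_sfc, T_atm = 250.0, 245.0  # assumed surface / effective sky temperatures, K

dry = net_surface_loss(T_sfc, T_atm, 0.6)    # very dry air: weak downwelling IR
moist = net_surface_loss(T_sfc, T_atm, 0.9)  # moister air: strong downwelling IR
print(f"dry sky: {dry:.0f} W/m^2 net loss; moist sky: {moist:.0f} W/m^2")
```

Even with these crude numbers, the dry sky lets the surface shed radiation at roughly twice the rate of the moist one, which is the sense in which the air becomes cold because it is dry.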

I thought it would be interesting to examine the behavior of the coldest temperatures in the climate models that are tracked by the IPCC. Monthly gridded data are archived at PCMDI for transient CO2 simulations from these models, and we simply took the minimum monthly average temperature anywhere in the Northern Hemisphere for each month in the first 70 years of those simulations. Some of the results are shown in the figure below (note the temperature scaling is relative, not absolute). The red lines are 12-month running averages.

Warming of the coldest monthly temperatures observed anywhere in the N. Hemisphere for the first 70 years of transient CO2 integrations from 22 IPCC climate models.

Across those 22 models, the minimum monthly temperatures warmed by an average of 0.43 deg. C per decade (0.77 deg. F per decade), which is somewhat less than double the average global warming rate in those models. Thus, the coldest air masses in the models do warm much faster than the global average temperature does.
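The extraction-and-trend calculation described above is simple to reproduce in outline. The field below is a synthetic stand-in for the PCMDI model output (monthly NH grids over 70 years) with a trend of my choosing imposed, so only the procedure, not the numbers, reflects the actual analysis:

```python
import numpy as np

# Synthetic monthly NH temperature field: 70 years x 12 months, 45 x 90 grid.
# ALL values are placeholders, not real model output.
rng = np.random.default_rng(1)
n_months = 70 * 12
field = rng.normal(-30.0, 15.0, size=(n_months, 45, 90))        # deg C
field += 0.43 / 120.0 * np.arange(n_months)[:, None, None]      # imposed trend

# Coldest grid-point monthly mean anywhere in the hemisphere, each month
t_min = field.reshape(n_months, -1).min(axis=1)

# 12-month running average (the red lines in the figure)
running = np.convolve(t_min, np.ones(12) / 12.0, mode="valid")

# Linear trend of the monthly minima, converted to deg C per decade
slope_per_month, _ = np.polyfit(np.arange(n_months), t_min, 1)
print(f"trend of coldest temperatures: {slope_per_month * 120:.2f} deg C/decade")
```

The regression recovers the imposed trend to within sampling noise; run against the real model archives, this is the calculation that produced the per-model warming rates quoted above.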

Also, the average minimum monthly temperature for the first February across all 22 models was -50 deg. C (-58 deg. F), indicating that the models do indeed create very cold airmasses.

The model with the greatest rate of warming is also shown: the INM CM3.0 model warmed at an average of 0.80 deg. C per decade (1.44 deg. F per decade). The model with the least warming (actually, it had a zero warming trend) was the FGOALS 1.0, which is also the least sensitive of all the IPCC models analyzed by Forster & Taylor (2006 J. Climate).

What does all this mean for the theory of manmade global warming? How fast have these coldest airmasses warmed, compared to the IPCC models? Well, the location in Siberia that is traditionally the coldest, Ojmjakon, hit -60 deg. C (-76 deg. F) twice last month (December, 2008), a temperature that has been reached only one other time in the last 25 years. So, I suspect that global warming isn’t happening nearly fast enough for the folks who live there.

50 Years of CO2: Time for a Vision Test

Thursday, January 1st, 2009

(Jan. 10 update: A few people seem to have missed the point of this satirical post. It is a counterpoint to Al Gore’s use of “millions of tons” when talking about CO2 emissions. I’m pointing out that, relative to the total atmosphere, millions of tons of CO2 is minuscule. And even a 50% increase in a very small number [the CO2 content of the atmosphere] is still a very small number.)
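The arithmetic behind that point is easy to check. The concentrations below are the familiar approximate Mauna Loa annual means for the endpoints of the record:

```python
# Approximate Mauna Loa annual mean CO2 concentrations (ppm by volume)
co2_1958 = 315.0
co2_2008 = 385.0

# Express each as a percentage of the atmosphere (1 ppm = 0.0001%)
pct_1958 = co2_1958 / 1e6 * 100   # 0.0315 %
pct_2008 = co2_2008 / 1e6 * 100   # 0.0385 %

increase = (co2_2008 - co2_1958) / co2_1958 * 100
print(f"{pct_1958:.4f}% -> {pct_2008:.4f}% of the atmosphere ({increase:.0f}% rise)")
```

A 22% rise over 50 years, in a constituent that makes up a few hundredths of one percent of the air, is exactly why the curve below is invisible on an honest 0-to-100% axis.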

Now that there have been 50 full years of atmospheric carbon dioxide concentration monitoring at Mauna Loa, Hawaii, I thought January 1, 2009 would be an appropriate time to take a nostalgic look back.

As you well know from Al Gore’s movie (remember? It’s the one you were required to come to English class and watch or the teacher would fail your kid), we are now pumping 70 million tons of carbon dioxide into the atmosphere every day as if it’s an “open sewer”.

Well, 50 years of that kind of pollution is really taking its toll. So, without further ado, here’s what 50 years of increasing levels of CO2 looks like on the Big Island:

As you can see, there has been a rapid…what? You can’t see it?…oh, I’m sorry. It’s that flat line at the bottom of the graph…here let me change the vertical scale so it runs from 0 to 10% of the atmosphere, rather than 0 to 100%….

Now, as I was saying…you can see there has been a rapid increase…what? what NOW? You still can’t see it?? It’s that blue line at the bottom! Are you color blind?

Obviously, you had too much to drink at the New Year’s party last night, and your eyes are a little blurry. Here, I’ll change the scale to run from 0 to 1% of the atmosphere….

Now can you see it? Good. As I was saying, 50 years of carbon dioxide emissions by humanity has really caused the CO2 content of the atmosphere to surge upward. It might not look like much, but trust me, Mr. Gore says….

NOW what?? Carbon dioxide is what? Necessary for life on Earth?

What are you, some kind of global warming denying right-wing extremist wacko? The polar bears are drowning!!

I can see I’m just wasting my time…sheesh.