Archive for July, 2011

Rise of the 1st Law Deniers

Sunday, July 31st, 2011

So, we continue to be treated to news articles (e.g., here and here) quoting esteemed scientists who claim to have found problems with our paper published in Remote Sensing, which shows huge discrepancies between the real, measured climate system and the virtual climate system imagined by U.N.-affiliated climate modelers and George Soros-affiliated pundits (James Hansen, Joe Romm, et al.)

Their objections verge on the bizarre, and so I have to wonder whether any of them actually read our paper. I eagerly await their published papers which show any errors in our analysis.

Apparently, all they need to know is that our paper makes the U.N. IPCC climate models look bad. And we sure can’t have that!

What’s weird is that these scientists, whether they know it or not, are denying the 1st Law of Thermodynamics: simple energy conservation. We show it actually holds for global-average temperature changes: a radiative accumulation of energy leads to a temperature maximum…later. Just like when you put a pot of water on the stove, it takes time to warm.

But while it only takes 10 minutes for a few inches of water to warm, the time lag of many months we find in the real climate system is the time it takes for several tens of meters of the upper ocean to warm.

We showed unequivocal satellite evidence of these episodes of radiant energy accumulation before temperature peaks…and then energy loss afterward. Energy conservation cannot be denied by any reasonably sane physicist.

We then showed (sigh…again…as we did in 2010) that when this kind of radiant forcing of temperature change occurs, you cannot diagnose feedback, at least not at zero time lag as Dessler and others claim to have done.

If you try, you will get a “false positive” even if feedback is strongly negative!
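For readers who want to see the problem for themselves, here is a minimal sketch (my own illustrative numbers, not the values from our paper) of how a zero-lag regression goes wrong. A simple mixed-layer energy balance is driven by red-noise internal radiative forcing, the true feedback is made strongly negative, and then feedback is "diagnosed" the traditional way:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative values only, not the numbers from our paper.
lam_true = 6.0                        # true feedback parameter, W/m^2/K (strongly negative feedback)
depth    = 50.0                       # mixed-layer depth, m
C        = 1000.0 * 4180.0 * depth    # heat capacity, J/m^2/K
dt       = 30 * 86400.0               # monthly time step, s
n        = 1200                       # 100 years of monthly data

# Red-noise internal radiative forcing (e.g., non-feedback cloud variations)
N = np.zeros(n)
for i in range(1, n):
    N[i] = 0.9 * N[i-1] + rng.normal()

# Integrate the energy balance: C dT/dt = N - lam_true * T
T = np.zeros(n)
for i in range(1, n):
    T[i] = T[i-1] + (N[i-1] - lam_true * T[i-1]) * dt / C

# "Traditional" diagnosis: regress TOA net flux on temperature at zero time lag
R = N - lam_true * T                  # net flux anomaly, downward positive
lam_diagnosed = -np.polyfit(T, R, 1)[0]

print(f"true feedback parameter:      {lam_true:.2f} W/m^2/K")
print(f"zero-lag diagnosed parameter: {lam_diagnosed:.2f} W/m^2/K")
```

Because the internal forcing itself correlates with the temperature it causes, the zero-lag regression returns a feedback parameter far smaller than the true one, which would be misread as high climate sensitivity.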

The demonstration of this is simple and persuasive. It is understood by Dick Lindzen at MIT, Isaac Held at Princeton (who is far from a “skeptic”), and many others who have actually taken the time to understand it. You don’t even have to believe that “clouds can cause climate change” (as I do), because it’s the time lag – which is unequivocal – that causes the feedback estimation problem!

Did we “prove” that the IPCC climate models are wrong in their predictions of substantial future warming?

No, but the dirty little secret is that there is still no way to test those models for their warming predictions. And as long as the modelers insist on using short term climate variability to “validate” the long term warming in their models, I will continue to use that same short term variability to show how the modelers might well be fooling themselves into believing in positive feedback. And without net positive feedback, manmade global warming becomes for all practical purposes a non-issue. (e.g., negative cloud feedback could more than cancel out any positive feedback in the climate system).

If I’m a “denier” of the theory of dangerous anthropogenic climate change, so be it. But as a scientist I’d rather deny that theory than deny the 1st Law of Thermodynamics.

The Debt Crisis: Compromise is Not an Option

Friday, July 29th, 2011

We are used to politicians having to compromise in Washington. Compromise is viewed as a good thing. Both sides get some of what they want. What our country is facing with the current budget crisis, however, is a totally different situation.

We are not discussing how much this constituency gets out of tax revenue versus that constituency. We are instead dealing with the very real possibility that our economy will collapse due to excessive levels of debt and overspending, at which point no one is going to get much of anything they want.

An Example from the Real World

Let’s say a large family has gotten in the habit of spending more than it earns, borrowing more and more each year to the point where they are now spending 40% more than they are earning (which is where the federal government is now).

The family has loaded up its credit cards, and even opened up new credit card accounts each year in order to pay off the interest owed on the previous cards, as well as to support their lavish spending.

To make matters worse, creditors are about to raise the interest rates on those credit cards because they see the family as a high risk for not being able to pay off their cards. As it is, the parents in this family already know their children will be inheriting the problem they have created, since it will take many years of sacrifice to fix the problem.

What should they do? Should the family keep on the current path? Or, should they start to reduce their rate of spending, rather than increase it year after year?

The husband and wife agree there is a problem, but disagree about what they should do. One wants to start decreasing their rate of spending, even if it is painful in the short run. But the other wants to continue spending more than they earn…after all, the rest of the family has grown accustomed to clamoring for more and more cash to support their lifestyle. It would be cruel to not allow them to continue as before.

On an Unsustainable Path

The following chart dramatically illustrates the situation our country is in right now, which is far worse than any financial situation we have ever experienced before (the data come from here). True, almost every year the U.S. Federal Government has run a budget deficit (spending more than it takes in), but the last few years have seen an astronomical growth in that deficit due to the housing bubble, multiple wars being fought, “stimulus” spending, and an increasing proportion of the population willing to just sit back and live off their neighbors’ tax dollars: (click for the full-size version)

The brake this puts on economic growth is now making the budget deficit even worse because the amount of tax revenue coming in is a “percent of the action”, and the “action” (economic activity) has slowed to a trickle.

Clearly, the path we are on is unsustainable. I fear we will soon find ourselves in the same situation as Argentina, whose unsustainable rate of borrowing finally culminated in what amounted to economic collapse around 2001. Much of the country was suddenly poverty stricken, with rampant crime as people were just trying to survive.

Banks either closed, or only allowed customers to withdraw very small amounts of cash each week. Inflation skyrocketed. Many of the ruling elite fled the country with great amounts of wealth, since they saw the crisis coming.

In a matter of a couple of years, Argentina became virtually a Third World country.

Unfortunately, just like the family that could not rein in its spending, so much of our population has become dependent on government handouts (which means, taking from those taxpayers who help keep our economy going) that it will be difficult for politicians to do what needs to be done to put us back on a path toward prosperity.

We must reduce wasteful spending, and we must reduce the governmental tax and regulatory burdens on businesses which are keeping those businesses from growing. Politicians must make tough decisions that will save the country without regard for whether they will be re-elected or not.

The problem cannot be fixed by “taxing the rich more” because (1) there is not nearly enough money there to fix the problem, and even more importantly, (2) unless there is at least some incentive for people to financially benefit in proportion to their good ideas, there is no motivation to take the risks involved in bringing new and better products and services to market. After all, most of those attempts fail, and people who want more of what “the rich” have are not willing to share in the failures of those who tried and failed.

Remember, “the rich” have kept only a small fraction of the total wealth they have provided to our country in the form of a higher standard of living with innumerable products at reduced prices, along with the millions of jobs provided to bring those products to market.

We need to celebrate the rich, not demonize them.

We are now at a crossroads, and it is our way of life that is at stake. If you want to see what the future looks like, just look at surviving in Argentina.

Fallout from Our Paper: The Empire Strikes Back

Friday, July 29th, 2011

UPDATE: Due to the many questions I have received over the last 24 hours about the way in which our paper was characterized in the original Forbes article, please see the new discussion that follows the main post, below.


LiveScience.com posted an article yesterday where the usual IPCC suspects (Gavin Schmidt, Kevin Trenberth, and Andy Dessler) dissed our recent paper in the journal Remote Sensing.

Given their comments, I doubt any of them could actually state what the major conclusion of our paper was.

For example, Andy Dessler told LiveScience:

“He’s taken an incorrect model, he’s tweaked it to match observations, but the conclusions you get from that are not correct…”

Well, apparently Andy did not notice that those were OBSERVATIONS that disagreed with the IPCC climate models. And our model can quantitatively explain the disagreement.

Besides, is Andy implying the IPCC models he is so fond of DON’T have THEIR results tweaked to match the observations? Yeah, right.

Kevin Trenberth’s response to our paper, rather predictably, was:

“I cannot believe it got published”

Which when translated from IPCC-speak actually means, “Why didn’t I get the chance to deep-six Spencer’s paper, just like I’ve done with his other papers?”

Finally, Gavin Schmidt claims that it’s the paleoclimate record that tells us how sensitive the climate system is, not the current satellite data. Oh, really? Then why have so many papers been published over the years trying to figure out how sensitive today’s climate system is? When scientists appeal to unfalsifiable theories of ancient events which we have virtually no data on, and ignore many years of detailed global satellite observations of today’s climate system, *I* think they are giving science a bad name.

COMMENTS ON THE FORBES ARTICLE BY JAMES TAYLOR
I have received literally dozens of phone calls and e-mails asking basically the same question: did James Taylor’s Forbes article really represent what we published in our Remote Sensing journal article this week?

Several of those people, including AP science reporter Seth Borenstein, actually read our article and said that there seemed to be a disconnect.

The short answer is that, while the title of the Forbes article (New NASA Data Blow Gaping Hole In Global Warming Alarmism) is a little over the top (as are most mainstream media articles about global warming science), the body of his article is — upon my re-reading of it — actually pretty good.

About the only disconnect I can see is that we state in our paper that, while the discrepancy between the satellite observations and the models was in the direction of the models producing too much global warming, it is really not possible to say by how much. Taylor’s article makes it sound much more certain that we have shown that the models produce too much warming in the long term. (Which I think is true…we just did not actually ‘prove’ it.)

But how is this any different than the reporting we see on the other side of the issue? Heck, how different is it than the misrepresentation of the certainty of the science in the IPCC’s own summaries for policymakers, versus what the scientists write in the body of those IPCC reports?

I am quite frankly getting tired of the climate ‘alarmists’ demanding that we ‘skeptics’ be held to a higher standard than they are held to. They claim our results don’t prove their models are wrong in their predictions of strong future warming, yet fail to mention they have no good, independent evidence their models are right.

For example….

…while our detractors correctly point out that the feedbacks we see in short-term (year-to-year) climate variability might not indicate what the long-term feedbacks are in response to increasing CO2, the IPCC still compares the short-term variability in their models to satellite observations in order to support the claimed realism of the long-term behavior of those models.

Well, they can’t have it both ways.

If they are going to validate their models with short term variability as some sort of indication that their models can be believed for long-term global warming, then they are going to HAVE to explain why there is such a huge discrepancy (see Fig. 3 in our paper) between the models and the satellite observations in what is the most fundamental issue: How fast do the models lose excess radiant energy in response to warming?

That is essentially the definition of “feedback”, and feedbacks determine climate sensitivity.
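For reference, the standard equilibrium relation connecting the two is dT_2x = F_2x / lambda, where F_2x (roughly 3.7 W/m^2) is the usual estimate of the forcing from doubled CO2 and lambda is the net feedback parameter. A quick back-of-envelope check, with illustrative lambda values:

```python
# Standard equilibrium relation: dT_2x = F_2x / lambda.
# F_2x ~ 3.7 W/m^2 is the commonly used forcing from doubled CO2;
# the lambda values below are illustrative, not diagnosed from data.
F_2x = 3.7  # W/m^2

for lam in (1.2, 3.6, 6.0):   # feedback parameter, W/m^2/K
    print(f"lambda = {lam:.1f} W/m^2/K  ->  dT_2x = {F_2x / lam:.2f} deg C")
```

A small lambda (weak loss of energy to space per degree of warming) means high sensitivity; a large lambda means low sensitivity, which is why getting the feedback parameter right matters so much.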

I’m sorry, but if this is the best they can do in the way of rebuttal to our study, they are going to have to become a little more creative.

Our Feedback Diagnosis Paper is Published Today

Monday, July 25th, 2011

UPDATE: Since it appears the web traffic trying to access our paper has overloaded the publisher’s server, you can get a copy here.

On the Misdiagnosis of Surface Temperature Feedbacks from Variations in Earth’s Radiant Energy Balance was published today in the journal Remote Sensing, and a pdf is available. I discussed the findings here.

Modeled Ocean Temperatures from 1880 through 2010

Friday, July 22nd, 2011

This is an update of my last post, where I described the results of a Forcing-Feedback-Diffusion (FFD) model of ocean temperature variations to 2,000 meters deep.

The model uses the GISS-assumed forcings since 1880, including greenhouse gases, volcanoes, and manmade aerosol pollution. To those, I added a forcing term proportional to El Nino/La Nina activity, equal to 0.9 x MEI (Multivariate ENSO Index), in Watts per sq. meter. I adjusted the ocean turbulent heat diffusion coefficients and the El Nino term until I got a correlation of 0.95 between the modeled and observed temperature variations for the surface-50 meter ocean layer for the period 1955 through 2010. The model also matches the warming trend in the 650-700 m layer during 1955-2010, which is the deepest layer for which we have long-term data to compare to.

I’ve now extended the model simulation back to 1880, since GISS forcings go back that far, and NOAA has an MEI reconstruction of El Nino/La Nina activity that goes back even before that. The optimum climate sensitivity was 1.1 deg. C for a doubling of atmospheric CO2, a climate sensitivity so low that the IPCC considers it very unlikely.

The results look like this (click on image for full size version):

Note I have also added the HadSST2 sea surface temperatures, scaled to match the variability of the Levitus 0-50 meter layer during 1955-2010.

The match is pretty good, except the model does not capture the exceptionally cool conditions during 1900-1935. Since this was a period of low sunspot activity, it could be that this is a cosmic ray effect on global cloud cover.

The important message here is that this simulation was done with a very low climate sensitivity, corresponding to only about 1/3 the warming rate the IPCC projects for the future in response to increasing atmospheric CO2.

A Note on Deep Ocean Heat Storage
For those interested in the deep-ocean heat storage issue, the assumed model diffusivities I used (which average 3.7 x 10^-4 m^2/sec down to 2,000 meters depth) are considerably larger than what is usually assumed in climate model simulations. If I use a diffusivity much closer to what is traditionally used (1.2 x 10^-4 m^2/sec), then I have to reduce the model sensitivity to 0.9 deg. C for 2XCO2 in order to still match the near-surface warming trend; otherwise the model warms too much compared to observations.

Oh, the Insensitivity! More on Ocean Warming 1955-2010

Thursday, July 21st, 2011

The evidence for anthropogenic global warming being a false alarm does not get much more convincing than this, folks.

Using a combination of the GISS-assumed external forcings for long-term temperature changes, and an El Nino/La Nina internal forcing term for year-to-year variability, a simple Forcing-Feedback-Diffusion (FFD) model explains 90% of the variance in ocean heat content variations in the surface-to-50 meter depth layer since 1955 (click for full-size version):

The dashed lines are 3rd order polynomial fits; I have included a small offset between the model and observation data so you can see those trend curves, otherwise they would lie on top of each other. Note that the model captures the lack of warming since about 2003.

Here are the model-vs-observations warming trends as a function of ocean depth; I have plotted them for both the full period (56 years, 1955-2010), and also for the 2nd half of the period (1983-2010) which is when almost all of the warming below 200 meters occurred (click for full-size version):

Now, the important news is that the model fits to the data were accomplished with a climate sensitivity of only 1.1 deg. C for a doubling of CO2 (a feedback parameter of 3.6 W m^-2 K^-1). This is well below what the IPCC claims for future warming rates.

Also, without the El Nino/La Nina term, the model produces far too much warming late in the period. So, a lack of ocean warming since about 2003 could be La Nina’s ‘canceling out’ CO2 warming. (I found using the Pacific Decadal Oscillation, PDO, as a natural forcing term did basically the same thing, but it could not capture the large El Nino/La Nina temperature swings.)

Of course, there could be other natural forcings at work, too. To the extent that a portion of warming since the 1950s was due to those processes, this would mean climate sensitivity is lower still.

Where’s that Darn Missing Heat?
Now, there are Trenberthian claims out there that a recent lack of warming is due to the ‘missing heat’ hiding in the deep ocean somewhere, just waiting to pounce on us when we aren’t looking. To address this possibility, the model mixes a substantial amount of extra heat down to 2000 meters depth (even though a 2009 Levitus presentation suggested that there has been essentially no warming below 1500 meters depth…see slide 14 here).

In fact, even with low climate sensitivity, the model solution for 1955-2010 pumps 18% more heat into the 0-700 meter layer than the Levitus observations show for that 56 year period. Furthermore, 30% of the total heat pumped into the ocean by the model is below 700 meters deep. So, it appears that even deep ocean heating cannot explain away a recent lack of surface warming.

One of the things you find with these models is that, if the heat cannot mix below the thermocline, then there is no way for the ocean below the thermocline to warm. The Levitus data suggests an average thermocline depth of 100 meters or so….notice the “bend” in the warming profiles around that depth, suggesting resistance to turbulent heat diffusion. Instead, much of the extra heat in the mixed layer is lost to space through negative feedback processes which the IPCC claims are instead positive.

[For a little entertainment, here's an animation of the three-monthly changes in the temperature profile down to 700 meters, 1955 through early 2011, put together for me by Danny Braswell]:
Levitus temperature profile animation

Conclusion
The bottom line is that the relatively weak warming of the ocean since the 1950s is consistent with negative feedback (low climate sensitivity), not positive feedback. The ocean mixed layer and the atmosphere convectively coupled to it lose excess heat to outer space before it can be mixed into the deep ocean.

In other words, Trenberth’s missing heat is not in the deep ocean…it’s instead lost in outer space.

Model Details:
- 40 model layers, each 50 m thick, to a depth of 2,000 meters (the atmosphere is assumed to be in convective equilibrium with the top layer, and its heat capacity is small enough to be ignored).
- Energy is conserved in each model layer through a combination of forcing, feedback (top layer only), and turbulent heat diffusion between layers.
- Feedback (loss of excess energy to space from implicit changes in the atmosphere with temperature) is proportional to the top layer temperature departure from average.
- Heat diffusion coefficients between layers are manually adjusted to give an approximate fit to the warming profiles with depth. We are working on further optimizing those fits with an iterative adjustment procedure.
- Yearly ‘external’ forcing estimates are from GISS, interpolated to monthly, and extrapolated thru 2010.
- El Nino/La Nina ‘internal’ forcing is empirically derived based upon a best fit to observed year-to-year temperature variations: forcing = 0.85 times the Multivariate ENSO Index (MEI), in Watts per sq. meter, with a time lag of 3 months.
- Model is initialized in 1900 using GISS forcings; MEI forcing is included starting in 1955 (which is when the ocean temperature observations become available for comparison).
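To make the layer scheme concrete, here is a stripped-down sketch of that kind of column model. It uses a uniform diffusivity and a constant forcing purely for illustration; the actual model uses depth-dependent diffusivities tuned to the observed warming profiles and the GISS + MEI forcing series:

```python
import numpy as np

# Illustrative sketch only: uniform diffusivity, constant forcing.
n_layers = 40
dz       = 50.0                   # layer thickness, m (40 x 50 m = 2000 m)
rho_cp   = 1000.0 * 4180.0        # volumetric heat capacity of seawater, J/m^3/K
C        = rho_cp * dz            # heat capacity per layer, J/m^2/K
lam      = 3.6                    # feedback parameter, W/m^2/K (top layer only)
kappa    = 3.7e-4                 # turbulent heat diffusivity, m^2/s
dt       = 30 * 86400.0           # monthly time step, s

def step(T, forcing):
    """Advance all layer temperatures one time step, conserving energy."""
    flux = np.zeros(n_layers + 1)                 # downward flux at interfaces, W/m^2
    flux[1:-1] = -kappa * rho_cp * (T[1:] - T[:-1]) / dz
    dT = (flux[:-1] - flux[1:]) * dt / C          # turbulent exchange between layers
    dT[0] += (forcing - lam * T[0]) * dt / C      # forcing + feedback, top layer only
    return T + dT

# Example: 50 years of a constant 1 W/m^2 forcing
T = np.zeros(n_layers)
for month in range(600):
    T = step(T, 1.0)

print(f"surface-layer warming after 50 yr: {T[0]:.3f} K")
print(f"bottom (2000 m) layer warming:     {T[-1]:.5f} K")
```

Note how the surface layer stays below its equilibrium warming of forcing/lambda because some of the heat is continually diffused downward, while the deepest layers have barely warmed at all after 50 years.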

Our Refutation of Dessler (2010) is Accepted for Publication

Friday, July 15th, 2011

Some of you might remember last year’s little dust-up between Andy Dessler and me over feedbacks in the climate system.

Our early-2010 paper showed extensive evidence of why previous attempts to diagnose feedbacks (which determine climate sensitivity) have likely led to overestimates of how sensitive the climate system is to forcings like that from increasing CO2. The basic reason is that internal radiative forcing from natural cloud variations causes a temperature-radiation relationship in the data which gives the illusion of high climate sensitivity, even if climate sensitivity is very low.

Dessler’s late-2010 paper basically blew off our arguments and proceeded to diagnose cloud feedback from satellite data in the traditional manner. His justification for ignoring our arguments was that since: 1) most of the temperature variability during the satellite record was due to El Nino and La Nina (which is true), and 2) no one has published evidence that ‘clouds cause El Nino and La Nina’, then he could ignore our arguments.

Well, our paper entitled On the Misdiagnosis of Surface Temperature Feedbacks from Variations in Earth’s Radiant Energy Balance which refutes Dessler’s claim, has just been accepted for publication. In it we show clear evidence that cloud changes DO cause a large amount of temperature variability during the satellite period of record, which then obscures the identification of temperature-causing-cloud changes (cloud feedback).

Along with that evidence, we also show the large discrepancy between the satellite observations and IPCC models in their co-variations between radiation and temperature:

Given the history of the IPCC gatekeepers in trying to kill journal papers that don’t agree with their politically-skewed interpretations of science (also see here, here, here, here), I hope you will forgive me for holding off on giving the name of the journal until it is actually published.

But I did want to give them plenty of time to work on ignoring our published research as they write the next IPCC report. :)

And this is not over…I am now writing up what I consider to be our most convincing evidence yet that the climate system is relatively insensitive.

Atlantis Launch SRB Camera Videos

Thursday, July 14th, 2011

This is SO cool. Cameras mounted on both of the solid rocket boosters during the final Shuttle launch show what it would be like to ride one from the launch pad, into space, falling back through the atmosphere, and then splashing into the ocean. The total video runs 32 minutes, with multiple camera views from both boosters. I watched the whole thing, and it was worth it.

Understanding James Hansen’s View of Our Climate Future

Wednesday, July 13th, 2011

I’ve been wading through James Hansen’s recent 52-page unpublished paper explaining why he thinks the cooling effect of manmade sulfate aerosols has been underestimated by climate modelers.

This is the same theme as the “cooling from Chinese pollution is canceling out carbon dioxide warming” you might have heard about recently.

As I read Hansen’s paper, I stumbled upon a sentence on page 23 that sounds like one I just wrote in a new paper we are preparing. I’m going to use Hansen’s statement — which I agree with — because it provides a good introduction to understanding the basics of climate change theory:

“…surface temperature change depends upon three factors: (1) the net climate forcing, (2) the equilibrium climate sensitivity, and (3)…..the rate at which heat is transported into the deeper ocean (beneath the mixed layer).”

To better understand those 3 factors, consider what controls how much a pot of water on the stove will warm over a short period of time. The temperature rise depends upon (1) how much you turn up the stove, or cover up the pot with a lid (“forcing”); (2) how fast the pot can lose extra heat to its surroundings as it warms, thus limiting the temperature rise (“climate sensitivity”); and (3) the depth of the water in the pot.
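Those three factors can be captured in a one-line energy balance, C dT/dt = F - lambda*T, where C plays the role of the water depth. Here is a toy sketch (all numbers are my illustrative assumptions, not values from Hansen’s paper) showing how turning each knob changes the warming:

```python
# Toy single-layer energy balance for Hansen's three factors:
#   C dT/dt = F - lam * T
# F   = net forcing (factor 1)
# lam = feedback parameter, which sets climate sensitivity (factor 2)
# depth -> heat capacity C, standing in for heat carried below
#          the mixed layer (factor 3)
# All parameter values are illustrative assumptions.

def warming_after(years, F=1.0, lam=3.6, depth=50.0):
    """Temperature rise (K) after `years` of constant forcing F (W/m^2)."""
    C = 1000.0 * 4180.0 * depth      # heat capacity, J/m^2/K
    dt = 86400.0                     # daily time step, s
    T = 0.0
    for _ in range(int(years * 365)):
        T += (F - lam * T) * dt / C
    return T

print(warming_after(10))                 # baseline
print(warming_after(10, F=0.5))          # weaker net forcing (more aerosol cooling)
print(warming_after(10, lam=1.2))        # higher climate sensitivity
print(warming_after(10, depth=500.0))    # more heat buried below the mixed layer
```

The same observed warming can thus be “explained” by trading the three factors against each other, which is exactly the freedom Hansen is exploiting.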

Most people working in this business (including the IPCC) agree that probably the biggest uncertainty in determining the extent to which manmade global warming is something we need to worry about is #2, climate sensitivity.

But not Hansen.

Hansen believes he knows climate sensitivity very accurately, based upon paleoclimate theories of what caused temperature changes hundreds of thousands to millions of years ago, and how large those temperature changes were. Most of Hansen’s climate sensitivity claims are based upon the Ice Ages and the Interglacial periods.

I must admit, it astounds me how some scientists can be so sure of theories which involve events in the distant past that we cannot measure directly. Yet we measure the entire Earth every day with a variety of satellite instruments, and we are still trying to figure out from that abundance of data how today’s climate system works!

In Hansen’s case, in order to explain the amount of warming in recent decades, he thinks he knows #2 (the climate sensitivity) is quite high, and so he has been experimenting with various realistic values for #3 (the assumed rate of heat diffusion into the deep ocean) and has decided that factor #1 (the radiative forcing of the climate system) has not been as large as everyone has been assuming. Again, this is in order to explain why surface warming has not been as strong as expected.

Now, I tend to agree with Hansen that the main portion of that forcing, from increasing CO2 (a warming effect), is known pretty well. (Please, no flaming from the sky dragon slayers out there…I already know about your arguments). So in his view there MUST be some cooling influence canceling it out. That’s where the extra dose of aerosol cooling comes in.

All modelers have already fudged in various amounts of cooling from sulfate aerosols in order to prevent their climate models from warming more than has been observed. But Hansen ALSO thinks that the real climate system does not mix heat into the deep ocean as fast as the IPCC climate models do.

Unfortunately, correcting this error (if it exists, which I think it does) would push the models in the direction of too much surface warming. Therefore, Hansen thinks this must then mean that the IPCC models are assuming too much forcing (the only remaining possibility of the 3 factors).

Of course, as Hansen correctly points out, accurate global measurements of the cooling effect of aerosols are essentially non-existent. How convenient. This means that modelers can continue to use increasing amounts of sulfate aerosols as an “excuse” for a lack of recent warming, despite the lack of quantitative evidence that this is actually occurring.

As I sometimes point out, this line of reasoning verges on blaming NO climate change on humans, too.

It is unfortunate that so few of us (me, Lindzen, Douglass, and a few others) are actively researching the OTHER possibility: that climate sensitivity has been greatly overestimated. Lindzen and Choi have a new paper in press on the subject, and Braswell and I have another that I expect to be accepted for publication in the next few days.

I sort of understand the reluctance to research the possibility, though. If climate sensitivity is low, then global warming, climate disruption, tipping points, and carbon footprints all suddenly lose their interest. And we can’t have that.

I agree with Hansen that our best line of observational evidence is how fast the oceans warm — at all depths. Our recent work on estimating climate sensitivity from the rate of warming between 1955-2010 at different depths has been very encouraging, something which I hope to provide an update on soon.

Global SST Update: Still No Sign of Resumed Warming

Friday, July 8th, 2011

Here’s the global average sea surface temperature (SST) update from AMSR-E on NASA’s Aqua satellite, updated through yesterday, July 7, 2011:

The anomalies are relative to the existing period of record, which begins in June 2002.

As can be seen, the SSTs have not quite recovered from the coolness of the recent La Nina.

Something else I track is the ocean cloud water anomalies, also from AMSR-E, which I have calibrated in terms of anomalies in reflected sunlight based upon Aqua CERES data:

I watch this because it often predicts future SST behavior. For instance, the circled portion in 2010 shows a period of enhanced reflection of sunlight (thus reduced solar input into the ocean), and this corresponded to strong cooling of SSTs during 2010, as seen in the first graph.

So, the recent new enhancement of cloudiness (smaller circle) suggests a fall of SST in the next month or so. After that, it generally takes another month or so before ocean changes are transferred to the global oceanic atmosphere through enhanced or decreased convective overturning and precipitation.