Archive for February, 2011

On the House Vote to Defund the IPCC

Saturday, February 19th, 2011

The climate change deniers have no one but themselves to blame for last night’s vote.

I’m talking about those who deny NATURAL climate change. Like Al Gore, John Holdren, and everyone else who thinks climate change was only invented since they were born.

Politicians formed the IPCC over 20 years ago with an endgame in mind: to regulate CO2 emissions. I know, because I witnessed some of the behind-the-scenes planning. It is not a scientific organization. It was organized to use the government-funded scientific research establishment to achieve policy goals.

Now, that’s not necessarily a bad thing. But when they are portrayed as representing unbiased science, that IS a bad thing. If anthropogenic global warming – and ocean ‘acidification’ (now there’s a biased and totally incorrect term) — ends up being largely a false alarm, those who have run the IPCC are out of a job. More on that later.

I don’t want to be misunderstood on this. IF we are destroying the planet with our fossil fuel burning, then something SHOULD be done about it.

But the climate science community has allowed itself to be used on this issue, and as a result, politicians, activists, and the media have successfully portrayed the biased science as settled.

They apparently do not realize that ‘settled science’ is an oxymoron.

The most vocal climate scientists defending the IPCC have lost their objectivity. Yes, they have what I consider to be a plausible theory. But they actively suppress evidence to the contrary, for instance attempts to study natural explanations for recent warming.

That’s one reason why the public was so outraged about the ClimateGate e-mails. ClimateGate doesn’t prove their science is wrong…but it does reveal their bias. Science progresses by investigating alternative explanations for things. Long ago, the IPCC all but abandoned that search.

Oh, they have noted (correctly I believe) that a change in the total output of the sun is not to blame. But there are SO many other possibilities, and all they do is dismiss those possibilities out of hand. They have a theory — more CO2 is to blame — and they religiously stick to it. It guides all of the research they do.

The climate models are indeed great accomplishments. It’s what they are being used for that is suspect. A total of 23 models cover a wide range of warming estimates for our future, and yet there is no way to test them for what they are being used for: climate change predictions.

Virtually all of the models produce decadal time scale warming that exceeds what we have observed in the last 15 years. That fact has been known for years, but its publication in the peer reviewed literature continues to be blocked.

My theory is that a natural change in cloud cover has caused most of the recent warming. Temperature proxy data from around the world suggests that just about every century in the last 2,000 years has experienced warming or cooling. Why should today’s warmth be manmade, when the Medieval Warm Period was not? Just because we finally have one potential explanation – CO2?

This only shows how LITTLE we understand about climate change…not how MUCH we know.

Why would scientists allow themselves to be used in this way? When I have pressed them on the science over the years, they all retreat to the position that getting away from fossil fuels is the ‘right thing to do anyway’.

In other words, they have let their worldviews, their politics, their economic understanding (or lack thereof) affect their scientific judgment. I am ashamed for our scientific discipline and embarrassed by their behavior.

Is it any wonder that scientists have such a bad reputation among the taxpayers who pay them to play in their ivory tower sandboxes? They can make gloom and doom predictions all day long of events far in the future without ever having to suffer any consequences of being wrong.

The perpetual supply of climate change research money also biases them. Everyone in my business knows that as long as manmade climate change remains a serious threat, the money will continue to flow, and climate programs will continue to grow.

Now, I do agree the supply of fossil fuels is not endless. But we will never actually “run out”…we will just slowly stop trying to extract them as they become increasingly scarce (translation – more expensive). That’s the way the world works.

People who claim we are going to wake up one morning and our fossil fuels will be gone are either pandering, or stupid, or both.

But how you transition from fossil fuels to other sources of energy makes all the difference in the world. Making our most abundant and affordable sources of energy artificially more expensive with laws and regulations will end up killing millions of people.

And that’s why I speak out. Poverty kills. Those who argue otherwise from their positions of fossil-fueled health and wealth are like spoiled children.

The truly objective scientist should be asking whether MORE, not less, atmospheric carbon dioxide is what we should be trying to achieve. There is more published real-world evidence for the benefits of more carbon dioxide, than for any damage caused by it. The benefits have been measured, and are real-world. The risks still remain theoretical.

Carbon dioxide is necessary for life on Earth. That it has been so successfully demonized with so little hard evidence is truly a testament to the scientific illiteracy of modern society. If humans were destroying CO2 — rather than creating more — imagine the outrage there would be at THAT!

I would love the opportunity to cross examine these (natural) climate change deniers in a court of law. They have gotten away with too much, for too long. Might they be right? Sure. But the public has no idea how flimsy – and circumstantial – their evidence is.

In the end, I doubt the IPCC will ever be defunded. Last night’s vote in the House is just a warning shot across the bow. But unless the IPCC starts to change its ways, it runs the risk of being totally marginalized. It has almost reached that point, anyway.

And maybe the IPCC leadership doesn’t really care if its pronouncements are ignored, as long as they can jet around the world to meet in exotic destinations and plan where their next meeting should be held. I hear it’s a pretty good gig.

Fire & Water: Some Thoughts on Wood Stove Design and Efficiency

Friday, February 18th, 2011


Sometimes I have to get away from the climate stuff for a while. This is one of those times.

Also, each year at this time my wife asks how we can get our swimming pool to warm up quicker this spring. Even after 20 years, global warming hasn’t helped a darn bit.

She also always mentions wood heat as a possibility. I have always discounted the idea as too involved a project.

Well, this year we’re gonna git ‘er done. Last year I built a homemade solar pool heater. This year we are going to add some of that concentrated, carbon-based fuel to our energy portfolio.

After all, we DO have lots of wood available to us behind our house. Mature hardwoods, and the old trees just fall over and rot. I believe one of our white oaks dates to before our country WAS a country.

So, how to make a wood stove that can heat swimming pool water? Over the years, I’ve had enough experience with wood burning fireplaces, free-standing wood stoves, thermodynamics, radiative and convective heat transfer, buoyancy of heated air, etc., that I think I could help come up with a good stove design.

And ‘Uncle Lou’ (my wife’s sister’s husband) up in Sault Sainte Marie, Michigan has a lifetime of building and welding and fixing and fabricating. So, he’s helping me design a stainless steel wood stove with an outer water jacket that I’ll pump pool water through to heat the pool. We will use stainless steel to help keep iron out of the pool water.
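Before cutting any steel, it is worth a back-of-the-envelope check on the whole idea. The sketch below estimates how much wood it takes to warm a pool; the pool volume, wood energy content, and stove-to-water efficiency are all assumed, illustrative numbers, not measurements of our pool or our design:

```python
# A rough feasibility estimate for heating pool water with wood.
# Pool volume, wood energy density, and efficiency are ASSUMED values.

POOL_VOLUME_L = 60_000.0   # assumed in-ground pool, ~60,000 liters
C_WATER = 4186.0           # specific heat of water, J/(kg K)
WOOD_MJ_PER_KG = 16.0      # seasoned hardwood, rough figure
EFFICIENCY = 0.6           # assumed fraction of heat reaching the water

def wood_needed_kg(delta_t_c: float) -> float:
    """Kilograms of wood to raise the pool temperature by delta_t_c."""
    mass_kg = POOL_VOLUME_L              # 1 liter of water is about 1 kg
    energy_j = mass_kg * C_WATER * delta_t_c
    return energy_j / (WOOD_MJ_PER_KG * 1e6 * EFFICIENCY)

print(f"{wood_needed_kg(5.0):.0f} kg of wood to warm the pool by 5 C")
```

The answer comes out on the order of a hundred kilograms per 5 deg C of warming, which is a lot of splitting, but not crazy given a back yard full of fallen hardwood.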

Meanwhile, I’ve been reading about the newer EPA-certified stove designs – which is all you can buy anymore — that provide a hotter fire with more complete combustion of wood, rather than losing the gases and smoke out the chimney like the older “smoke dragon” designs do. I had no idea that (dry) wood could be so completely burned that there is little or no smoke at all. Cool!

The modern advance in wood stove technology is, simply put, to create a hotter fire with sufficient oxygen supply to burn all the wood and its byproducts.

To achieve this, the firebox is better insulated, and a pre-heated supply of air is made available in the upper portion of the firebox through perforated stainless steel secondary burn tubes so the wood gases and smoke can burn.

I’m sure many of you have these stoves, which are the only ones sold for inside residential use anymore. The secondary burn tubes produce beautiful, “ghost” flames, helping to ignite the wood gases and smoke that used to just go up the chimney.

So, this got me to thinking about the optimum stove design that would provide maximum efficiency, that is the maximum amount of heat energy from burning the wood transferred into your home (or my swimming pool water).

The goal is pretty simple: burn the wood and its gases as completely as possible and let as little heat escape out the chimney as possible. But even after hundreds of years of experience, people are still debating the best way to accomplish that.

I was thinking about the efficiency of a car engine as an analogy…but it is totally wrong. 100% efficiency for a car engine would be for all of the energy created by burning fuel to go into the mechanical work of pushing the pistons, turning the engine, and creating motion, with zero waste heat.

The wood stove is just the opposite, though. We want to create as much “waste” heat as possible, with as little mechanical energy as possible used to “push” the air through the system.

So, what are the limits to a 100% efficient wood stove?

First, you must recognize that you have to lose SOME heat out the chimney. It is the warm air in the chimney which provides the buoyancy (lift) needed to draw more air into the firebox. But the greater the volume of air flowing out the chimney, and the higher its temperature, the lower the efficiency of the stove for heating purposes.
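The draft itself is just the stack effect, and it can be sketched in a few lines. This treats the flue gas as dry air and uses illustrative chimney heights and temperatures, so it is a toy model, not a design tool:

```python
# Stack-effect ("draft") pressure for a chimney -- a minimal sketch.
# Warm flue gas is less dense than outside air, so the column of it in
# the chimney creates a pressure difference that draws fresh air into
# the firebox. Flue gas is treated as dry air (a simplification).

G = 9.81           # gravitational acceleration, m/s^2
P_ATM = 101_325.0  # sea-level pressure, Pa (assumed)
R_AIR = 287.0      # gas constant for dry air, J/(kg K)

def draft_pa(height_m: float, t_out_c: float, t_flue_c: float) -> float:
    """Pressure difference driving the draft, in pascals."""
    rho_out = P_ATM / (R_AIR * (t_out_c + 273.15))    # outside air density
    rho_flue = P_ATM / (R_AIR * (t_flue_c + 273.15))  # flue gas density
    return G * height_m * (rho_out - rho_flue)

# A 5 m chimney on a 0 C day with 150 C flue gas:
print(f"{draft_pa(5.0, 0.0, 150.0):.1f} Pa of draft")
```

Note what the formula implies: cool the exhaust too far and the density difference (and thus the draft) shrinks, which is exactly the limit on efficiency described above.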

Next, the higher temperatures required in the firebox for more complete combustion mean more insulation, which means a reduction in the rate of heat flow to the room — which is opposite to the whole point of heating with a wood stove in the first place.

Now, I realize a hotter fire which is burning fuel more completely might actually lead to an increase in heat transferred to the room….but, all other things being equal, more insulation MUST, by itself, reduce the rate of heat flow compared to less insulation. Simple thermodynamics.
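That “simple thermodynamics” is just Fourier’s law of conduction, and the effect is easy to put in numbers. The wall area, firebrick conductivity, and temperature difference below are assumed, illustrative values:

```python
# Fourier's law for a flat wall: for the same area and temperature
# difference, doubling the insulation thickness halves the heat flow.
# Conductivity and temperatures here are illustrative assumptions.

def heat_flow_w(area_m2: float, k_w_per_m_k: float,
                thickness_m: float, delta_t_c: float) -> float:
    """Steady-state conductive heat flow through a flat wall, in watts."""
    return k_w_per_m_k * area_m2 * delta_t_c / thickness_m

thin = heat_flow_w(1.0, 1.0, 0.05, 500.0)   # 5 cm firebrick wall
thick = heat_flow_w(1.0, 1.0, 0.10, 500.0)  # 10 cm firebrick wall
print(f"5 cm wall: {thin:.0f} W, 10 cm wall: {thick:.0f} W")
```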

It’s an interesting dichotomy, trying to increase the efficiency of these stoves. On the one hand you need to MINIMIZE the loss of heat from the firebox in order to attain the high temperatures required for more complete combustion. But you also want to MAXIMIZE the loss of heat by the stove to the room. That’s the whole point of using the stove.

But this really isn’t a dichotomy if you realize that you are only insulating a portion of the stove – the firebox – to achieve the high temperatures and more complete combustion. If you can then route the hot gases leaving the firebox through a different part of the stove before going up the chimney, you then have the opportunity to extract the extra heat you generated from more complete combustion at the higher temperatures created within the (well insulated) firebox.

In other words, the firebox portion of the stove is primarily the energy generation portion of the system, and the rest of the stove that the hot gases pass through is the heat recovery portion of the system.

What is needed is a way to provide the hot gases leaving the firebox a greater opportunity to transfer their heat through the stove to its surroundings. A longer path through the stove, with multiple baffles conducting heat to the outside of the stove, would be one way to accomplish this.

Another would be to have a system of fins inside. Either way, you need to get the hot gas to come in contact with as much stove inner surface as possible, to maximize conduction of the heat to the outside of the stove, before all the heat goes up the chimney.
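The payoff from baffles or fins can be sketched by treating the flue path as a simple heat exchanger with an overall gas-to-room conductance UA (in W/K); more internal surface contact means a larger UA. The mass flow, temperatures, and UA values below are all assumed, illustrative numbers:

```python
import math

# Toy heat-recovery model: the gas temperature decays exponentially
# toward room temperature along the flue path; whatever heat the gas
# loses before the chimney is heat delivered to the room.

CP_GAS = 1100.0  # specific heat of hot flue gas, J/(kg K), rough figure

def heat_recovered_w(t_gas_in_c: float, t_room_c: float,
                     m_dot_kg_s: float, ua_w_per_k: float) -> float:
    """Heat transferred to the room before the gas reaches the chimney."""
    t_exit = t_room_c + (t_gas_in_c - t_room_c) * math.exp(
        -ua_w_per_k / (m_dot_kg_s * CP_GAS))
    return m_dot_kg_s * CP_GAS * (t_gas_in_c - t_exit)

plain = heat_recovered_w(600.0, 20.0, 0.01, 5.0)    # short, direct flue path
finned = heat_recovered_w(600.0, 20.0, 0.01, 15.0)  # tripled gas-to-wall contact
print(f"plain: {plain:.0f} W, finned/baffled: {finned:.0f} W")
```

The point of the sketch is the direction of the effect, not the numbers: tripling the gas-to-wall contact roughly doubles the recovered heat in this toy case, while the exit gas stays warm enough to keep some draft.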

Now, obviously, you can’t remove so much heat from the exhaust that the air in the chimney is no longer buoyant, because then you will lose the stove’s “sucking” power for the fresh air it needs to burn the wood. An insulated chimney will help keep those gases as warm as possible through the entire path length of the chimney.

The air supply is of particular interest to me. (After all, I am a meteorologist. We know air.) Why doesn’t a bonfire, with an unlimited supply of fresh air, burn all of the wood gases and smoke completely? It’s because as soon as a flame develops, it gets turbulently mixed with cooler ambient air, reducing the temperature of the mixture below what is necessary to burn the wood gases and smoke.
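The quenching effect of that mixing can be illustrated with a simple mass-weighted mixing model; the flame, ambient, and ignition temperatures here are assumed, round-number values for illustration only:

```python
# Why the bonfire's flames quench: as a flame parcel entrains cool
# ambient air, the mixture temperature falls; once it drops below the
# ignition temperature of the wood volatiles, combustion of the gases
# and smoke stops. All numbers are illustrative assumptions.

IGNITION_C = 540.0  # rough ignition temperature for wood gases (assumed)

def mix_temp_c(t_flame_c: float, t_ambient_c: float,
               entrained_fraction: float) -> float:
    """Temperature after mixing a flame parcel with ambient air, by mass."""
    return (1.0 - entrained_fraction) * t_flame_c + entrained_fraction * t_ambient_c

# An 800 C flame parcel that entrains an equal mass of 10 C air:
mixed = mix_temp_c(800.0, 10.0, 0.5)
print(f"{mixed:.0f} C after mixing")
```

With 50% entrainment of cold air the mixture falls well below the assumed ignition temperature, which is why the firebox design tries to supply *pre-heated* air instead of letting the flames gulp cold ambient air.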

An analogy is the entrainment of environmental air into a convective cloud, which reduces the cloud’s ability to produce precipitation…a key component of the atmosphere’s heat engine.

So, in the modern wood stove they put tubes heated by the fire in the firebox to deliver an additional “secondary” source of air – very hot air — to the upper part of the firebox where the hot gases and smoke naturally collect. The pre-heating of the air is necessary for combustion of those gases to occur.

But after thinking and reading about this, I don’t really see the need for a distinction between “primary” and “secondary” air sources for a wood stove. All that is needed is a sufficient supply of pre-heated air to the whole fire. The secondary burn technology seems to me to be a retrofit to fix a problem that could just as easily have been fixed by reworking the primary air supply.

So, Uncle Lou and I have been discussing a way to preheat ALL of the air that enters the firebox, one that includes as its first ‘stop’ the window in the door, since a steady stream of hot fresh air is also needed to keep the window clean.

Of course, this is all in the design phase right now. Unfortunately, as Bert once told Ernie on Sesame Street regarding building a lemonade stand, “It’s easy to have ideas. It’s not so easy to make them work.”

So, if you don’t hear a progress report in a month or two, you’ll know the project was a failure. At least I don’t have to worry about burning the swimming pool down.

So, now the REAL stove experts out there can chime in and tell me where I’m wrong in my newbie analysis of wood stoves. It’s OK…I’m used to it.

Radiative Changes Over the Global Oceans During Warm and Cool Events

Wednesday, February 9th, 2011

In my continuing efforts to use satellite observations to test climate models that predict global warming, I keep trying different ways to analyze the data.

Here I’ll show how the global oceanic radiative budget changes during warm and cool events, which are mostly due to El Niño and La Niña (respectively). By ‘radiative budget’ I am talking about top-of-atmosphere absorbed sunlight and emitted infrared radiation.

I’ve condensed the results down to a single plot, which is actually a pretty good learning tool. It shows how radiative energy accumulates in the ocean-atmosphere system during warming, and how it is then lost again during cooling.

[If you are wondering how radiative ‘feedback’ fits into all this — oh, and I KNOW you are — imbalances in the net radiative flux at the top of the atmosphere can be thought of as some combination of forcing and feedback, which always act to oppose each other. A radiative imbalance of 2 Watts per sq. meter could be due to 3 Watts of forcing and -1 Watt of feedback, or 7 Watts of forcing and -5 Watts of feedback (where ‘feedback’ here includes the direct Planck temperature response of infrared radiation to temperature). Unfortunately, we have no good way of knowing the proportions of forcing and feedback, and it is feedback that will determine how much global warming we can expect from forcing agents like more atmospheric carbon dioxide.]
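For concreteness, the two decompositions mentioned in the brackets can be written out as a toy calculation (illustrative numbers only, no real data involved):

```python
# The ambiguity in the bracketed note above, in numbers: the satellite
# measures only the net imbalance N = forcing + feedback, and very
# different splits produce the same N. Values mirror the text's example.

splits = [
    (3.0, -1.0),  # 3 W/m^2 of forcing, -1 W/m^2 of feedback
    (7.0, -5.0),  # 7 W/m^2 of forcing, -5 W/m^2 of feedback
]

for forcing, feedback in splits:
    net = forcing + feedback
    print(f"forcing {forcing:+.0f}, feedback {feedback:+.0f} "
          f"-> net {net:+.0f} W/m^2")
```

Both cases produce the same 2 Watts per sq. meter of net imbalance, which is all the measurement sees.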

But for now let’s ignore that conceptual distinction, and just talk about radiative imbalances. This simplifies things since more energy input should be accompanied by a temperature rise, and more energy loss should be accompanied by a temperature fall. Conservation of energy.

And, as we will see from the data, that is exactly what happens.

We analyzed the 20th Century runs from a total of 14 IPCC climate models for which Forster & Taylor (2006, J. Climate) provided a diagnosed long-term climate sensitivity. In order to isolate the variability in the models on time scales less than ten years or so, I removed the low-frequency variations with a 6th order polynomial fit to the surface temperature and radiative flux anomalies. It’s the short-term variability we can test with short-term satellite datasets.
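The detrending step looks like the following sketch; the anomaly series here is synthetic and purely illustrative, not the model output itself:

```python
import numpy as np

# Remove low-frequency variability with a 6th-order polynomial fit,
# leaving the short-term variability for comparison with short
# satellite records. The series below is synthetic, for illustration.

rng = np.random.default_rng(0)
months = np.arange(240, dtype=float)             # 20 years of monthly anomalies
slow = 0.5 * np.sin(2.0 * np.pi * months / 240)  # slow, multidecadal-like signal
fast = 0.2 * rng.standard_normal(240)            # short-term variability
series = slow + fast

coeffs = np.polyfit(months, series, deg=6)       # 6th-order polynomial fit
trend = np.polyval(coeffs, months)
detrended = series - trend                       # short-term residuals remain
```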

I’ve already averaged the results for the 5 models that had below-average climate sensitivity, and the 9 models that had above-average climate sensitivity.

The curves in the following plot are lag regression coefficients, which can be interpreted as the rate of radiative energy gain (or loss) per degree C of temperature change, at various time lags. A time lag of zero months can be thought of as the month of temperature maximum (or minimum). I actually verified this interpretation by examining composite warm and cold events from the CNRM-CM3 climate model run, which exhibits strong El Niño and La Niña activity.
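For those wanting to reproduce the idea, a lag regression can be sketched as follows; the temperature and flux series here are synthetic, for illustration only:

```python
import numpy as np

# Lag regression sketch: for each lead/lag, regress radiative flux
# anomalies on temperature anomalies. Positive lag means the flux
# change FOLLOWS the temperature change.

def lag_regression(temp: np.ndarray, flux: np.ndarray, max_lag: int) -> dict:
    """Regression coefficient of flux on temp at each lag, in months."""
    coeffs = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag > 0:
            t, f = temp[:-lag], flux[lag:]   # flux lags temperature
        elif lag < 0:
            t, f = temp[-lag:], flux[:lag]   # flux leads temperature
        else:
            t, f = temp, flux
        t = t - t.mean()
        f = f - f.mean()
        coeffs[lag] = float(np.dot(t, f) / np.dot(t, t))
    return coeffs

# Synthetic check: a flux anomaly that follows temperature by 3 months
rng = np.random.default_rng(1)
temp = rng.standard_normal(200)
flux = np.zeros(200)
flux[3:] = 2.0 * temp[:-3]
coeffs = lag_regression(temp, flux, max_lag=6)  # coeffs[3] recovers the slope
```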

Also shown are satellite-based results, from detrended HadSST2 global sea surface temperature anomalies and satellite-measured anomalies in radiative fluxes from the Terra CERES instrument, for the roughly 10-year period from March 2000 through June 2010.

The most obvious thing to note is that in the months running up to a temperature maximum (minimum), the global oceans are gaining (losing) extra radiative energy. This is true of all of the climate models, and in the satellite observations.

The above plot is possibly a more intuitive way to look at the data than the ‘phase space’ plots I’ve been pushing the last few years. One of the KEY things it shows is that doing these regressions only at ZERO time lag (as Dessler recently did in his 2010 cloud feedback paper, and as all previous researchers have done) really has very little meaning. Because of the time lags involved in the temperature response to radiative imbalances, one MUST do these analyses taking into account the time lag behavior if one is to have any hope of diagnosing feedback. At zero time lag, there is very little signal at all to analyze.

So, What Does This Tell Us About the Climate Models Used to Predict Global Warming?

Of course, what I am ultimately interested in is whether the satellite data can tell us anything that might allow us to determine which of the climate models are closer to reality in terms of their global warming predictions.

And, as usual, the results shown above do not provide a clear answer to that question.

Now, the satellite observations DO suggest that there are larger radiative imbalances associated with a given surface temperature change than the climate models exhibit. But the physical reason why this is the case cannot be determined without other information.

It could be due to a greater depth of water being involved in temperature changes in the real climate system, versus in climate models, on these time scales. Or, maybe the extra radiative input seen in the satellite data during warming is being offset by greater surface evaporation rates than the models produce.

But remember, conceptually these radiative changes are some combination of forcing and feedback, in unknown amounts. What I call forcing is what some people call “unforced internal variability” – radiative changes not due to feedback (by definition, the direct or indirect result of surface temperature changes). They are probably dominated by quasi-chaotic, circulation-induced variations in cloud cover, but could also be due to changes in free-tropospheric humidity.

Now, if we assume that the radiative changes AFTER the temperature maximum (or minimum) are mostly a feedback response, then one might argue that the satellite data shows more negative feedback (lower climate sensitivity) than the models do. The only trouble with that is that I am showing averages across models in the above plot. One of the MORE sensitive models actually had larger excursions than the satellite data exhibit!

So, while the conclusion might be true…the evidence is not exactly ironclad.

Also, while I won’t show the results here, there are other analyses that can be done. For instance: How much total energy do the models (versus observations) accumulate over time during the warming episodes? During the cooling episodes? and does that tell us anything? So far, based upon the analysis I’ve done, there is no clear answer. But I will keep looking.

In the meantime, you are free to interpret the above graph in any way you want. Maybe you will see something I missed.

UAH Update for January 2011: Global Temperatures in Freefall

Wednesday, February 2nd, 2011

…although this, too, shall pass, when La Nina goes away.

UAH_LT_1979_thru_Jan_2011


YR MON GLOBE NH SH TROPICS
2010 1 0.542 0.675 0.410 0.635
2010 2 0.510 0.553 0.466 0.759
2010 3 0.554 0.665 0.443 0.721
2010 4 0.400 0.606 0.193 0.633
2010 5 0.454 0.642 0.265 0.706
2010 6 0.385 0.482 0.287 0.485
2010 7 0.419 0.558 0.280 0.370
2010 8 0.441 0.579 0.304 0.321
2010 9 0.477 0.410 0.545 0.237
2010 10 0.306 0.257 0.356 0.106
2010 11 0.273 0.372 0.173 -0.117
2010 12 0.181 0.217 0.145 -0.222
2011 1 -0.009 -0.055 0.038 -0.369

LA NINA FINALLY BEING FELT IN TROPOSPHERIC TEMPERATURES
January 2011 experienced a precipitous drop in lower tropospheric temperatures over the tropics, Northern Hemisphere, and Southern Hemisphere. This was not unexpected, since global average sea surface temperatures have been falling for many months, and (as is usually the case with La Nina) the sea surface had a head start on the troposphere.

This is shown in the following plot (note the shorter period of record, and different zero-baseline):

SO WHY ALL THE SNOWSTORMS?
While we would like to think our own personal experience of the snowiest winter ever in our entire, Methuselah-ian lifespan has some sort of cosmic — or even just global — significance, I would like to offer this plot of global oceanic precipitation variations from the same instrument that measured the above sea surface temperatures (AMSR-E on NASA’s Aqua satellite):

Note that precipitation amounts over the global-average oceans vary by only a few percent. What this means is that when one area gets unusually large amounts of precipitation, another area must get less.

Precipitation is always associated with rising air, and so a large vigorous precipitation system in one location means surrounding regions must have enhanced sinking air (with no precipitation).

In the winter, of course, the relatively warmer oceans next to cold continental air masses lead to snowstorm development in coastal areas. If the cold air mass over the midwest and eastern U.S. is not dislodged by warmer Pacific air flowing in from the west, then the warm oceanic air from the Gulf of Mexico and western Atlantic keeps flowing up and over the cold dome of air, producing more snow and rain. The “storm track” and jet stream location follow that boundary between the cold and warm air masses.


A Challenge to the Climate Research Community

Wednesday, February 2nd, 2011

I’ve been picking up a lot of chatter in the last few days about the ‘settled science’ of global warming. What most people don’t realize is that the vast majority of published research on the topic simply assumes that warming is manmade. It in no way “proves” it.

If the science really is that settled, then this challenge should be easy:

Show me one peer-reviewed paper that has ruled out natural, internal climate cycles as the cause of most of the recent warming in the thermometer record.

Studies that have suggested an increase in the total output of the sun cannot be blamed do not count…the sun is an external driver. I’m talking about natural, internal variability.

The fact is that the ‘null hypothesis’ of global warming has never been rejected: That natural climate variability can explain everything we see in the climate system.