Kevin Trenberth has a response over at Roger Pielke, Sr’s blog to my comments about his and John Fasullo’s recent Science Perspectives article about “missing energy” in the climate system.
In their original Science Perspectives article, Trenberth and Fasullo discuss the observational evidence that energy is being lost somewhere in the climate system. Satellite radiation budget measurements of the Earth suggest that extra energy has been accumulating in the climate system for about the last 10 years, but without the appreciable warming of the upper ocean and atmosphere that would be expected to accompany it.
I posted some comments here about my view that the missing energy does not really exist. I also pointed out that they failed to mention that the missing energy over the period since about 2000 was in the reflected sunlight component, not the emitted infrared. This now makes two “missing energy” sources…the other one being the lack of expected warming from increasing carbon dioxide concentrations, which causes a steadily increasing global radiative imbalance in the infrared.
So, Kevin’s response on Pielke Sr’s blog begins with, “I saw Roy Spencer’s comment for the first time and it is not correct”, but I see no specific refutation of any of the points I made.
To further support my comments, here are the global-average CERES ERBE-like ES-4 Edition 2 radiative flux anomalies for reflected solar (1st graph) and outgoing longwave radiation (OLR, 2nd graph) for the period 2000 through 2008…these are 91-day running averages of daily data:
Clearly, the long-term “trend” during 2000 through 2008 was in the reflected solar (SW), not OLR (LW).
What is important for global warming or cooling is the sum of the global SW and LW, shown in the following graph (note I have flipped the y-axis, to correspond to the sense of the plot Kevin and John Fasullo showed in their Science Perspectives article):
But rather than address my points, Kevin instead focuses on the anomalous drop in OLR around the beginning of 2008. While he makes it sound like this event is currently inexplicable, he should recognize that there is indeed a very simple explanation for it: global-average temperatures were quite low at that time, as seen in the next graph:
After all, OLR is THERMALLY emitted radiation, and so it depends upon temperature. What would be the expected OLR response to such a drop in temperature? Well, we know that OLR should change by about 3.2 Watts per sq. meter for every 1 deg. C change in global average temperature. The above temperature plot shows a fall of about 0.4 deg. C from early 2007 to early 2008, which should then reduce OLR by about (0.4 x 3.2), or roughly 1.3 Watts per sq. meter.
And indeed, as seen in the LW plot above, the LW (OLR) fell by about 1 Watt per sq. meter during the same time. The fact that the drop in OLR with cooling was not quite as large as expected could be due to a small positive feedback in high clouds and/or water vapor. These are just rough estimates, anyway…the point is, one must take temperature changes into account when diagnosing the reasons for changes in OLR. This fact is seldom mentioned.
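The back-of-the-envelope arithmetic above can be written out explicitly. This is just a sketch of the calculation as described, using the ~3.2 Watts per sq. meter per deg. C no-feedback value and the ~0.4 deg. C cooling quoted above:

```python
# Expected OLR response to the 2007-08 global cooling event, using the
# numbers discussed in the text (rough estimates only).

planck_response = 3.2   # W/m^2 per deg C: no-feedback sensitivity of OLR to temperature
delta_T = -0.4          # deg C: approximate global-average cooling, early 2007 to early 2008

expected_olr_change = planck_response * delta_T
print(f"Expected OLR change: {expected_olr_change:.1f} W/m^2")  # about -1.3 W/m^2

# The CERES LW anomaly actually fell by only ~1 W/m^2 over the same period.
# The shortfall relative to the expected ~1.3 W/m^2 is what could be due to a
# small positive feedback in high clouds and/or water vapor.
observed_olr_change = -1.0
feedback_residual = observed_olr_change - expected_olr_change
print(f"Residual attributable to feedbacks: {feedback_residual:+.2f} W/m^2")
```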
In our new paper accepted for publication in JGR, we show that the 2007-08 cooling event Kevin Trenberth discussed was due to a temporary increase in low cloud cover, evidence of which is clearly seen as the large spike in reflected sunlight in the first plot above. The lead-lag relationship between the cloud changes and the temperature changes clearly indicates the primary direction of causation.
And, as discussed in our JGR paper, this fact makes diagnosing feedback from natural climate variations much more difficult than previous researchers have realized.