Siberian Express to Bring -30 deg. F to Wyoming

November 8th, 2014

Even though it’s still early November, a January-like cold wave just entering Montana and the Dakotas on Sunday will bring 30 below zero temperatures to scattered locations in Montana and Wyoming by Wednesday morning.

The cold will fill the nation’s midsection by midweek, with no letup in sight. The coldest air, arriving in a series of reinforcing surges, is still a week away.

Temperatures are forecast to run 15 to 30 deg. F below normal for at least 5 days over a large portion of the central U.S. starting late in the coming week (graphic courtesy of Weatherbell.com, click to enlarge):

Forecast temperature departures from normal for the five day period Thursday Nov. 13 to Monday Nov. 18, 2014.

On individual days the temperatures will be as much as 50 deg. F below normal for this time of year, which is quite exceptional. The air mass temperature (at 850 mb, ~5,000 ft. altitude) will be as much as 4.5 standard deviations below normal, which in probability terms is rarer than 1 in 100,000.
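For those who want to check that probability figure, here is a one-line calculation, assuming the 850 mb anomalies are approximately normally distributed:

```python
from scipy.stats import norm

# One-sided probability of an anomaly 4.5 standard deviations below normal,
# assuming the 850 mb temperature anomalies are approximately Gaussian.
p = norm.cdf(-4.5)
print(f"p = {p:.1e}, i.e. about 1 in {1/p:,.0f}")  # ~3.4e-06, rarer than 1 in 100,000
```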

Frigid Week Ahead for Most of the U.S.

November 7th, 2014

Brace yourself for more “polar vortex” news stories by the middle of next week.

An unusually widespread and persistent cold air mass will grip all but the U.S. Southwest and Florida by late in the week.

Its origins can be traced back to eastern Siberia a week ago; from there the air mass crossed the Arctic Ocean and northwest Canada. It will enter Montana and the Dakotas on Sunday, then gradually sink south and east as the week progresses.

The latest forecast for 7-day average temperature departures from normal shows widespread 10 to 15 deg. F below normal over much of the nation for the seven days starting next Wednesday (click for full-size, graphics courtesy of Weatherbell.com):

GFS model forecast of 7-day average departures from normal temperature for Nov. 12-19.

That’s a whole week of unusually cold weather, folks. Recent cold events have been very short-lived, sweeping through rapidly, lasting not much more than a day or so. This one is going to stick around.

The Arctic intrusion will be accompanied by a swath of snow across the Northern Plains and Great Lakes on Monday, then snow for Virginia and D.C. by Friday:

Eight-day total forecast snowfall from Friday Nov. 7 to Saturday Nov. 15.

My friend Joe Bastardi at WeatherBell tells me the ocean temperature and weather patterns right now are reminiscent of the epic winters of 1976-77 and 1977-78. I remember those winters. I was taking graduate meteorology courses at UW-Madison at the time, and the meteorology professors were all saying the early cold air outbreaks we were experiencing would surely end.

Except they didn’t.

UAH Global Temperature Update for October, 2014: +0.37 deg. C

November 3rd, 2014

The Version 5.6 global average lower tropospheric temperature (LT) anomaly for October, 2014 is +0.37 deg. C, up from the September value of +0.29 deg. C (click for full size version):

UAH globally-averaged lower tropospheric temperature anomalies, 1979 through October 2014 (Version 5.6).

The global, hemispheric, and tropical LT anomalies from the 30-year (1981-2010) average for the last 22 months are:

YR MON GLOBAL NH SH TROPICS
2013 1 +0.497 +0.517 +0.478 +0.386
2013 2 +0.203 +0.372 +0.033 +0.195
2013 3 +0.200 +0.333 +0.067 +0.243
2013 4 +0.114 +0.128 +0.101 +0.165
2013 5 +0.082 +0.180 -0.015 +0.112
2013 6 +0.295 +0.335 +0.255 +0.220
2013 7 +0.173 +0.134 +0.211 +0.074
2013 8 +0.158 +0.111 +0.206 +0.009
2013 9 +0.365 +0.339 +0.390 +0.190
2013 10 +0.290 +0.331 +0.249 +0.031
2013 11 +0.193 +0.160 +0.226 +0.020
2013 12 +0.266 +0.272 +0.260 +0.057
2014 1 +0.291 +0.387 +0.194 -0.029
2014 2 +0.170 +0.320 +0.020 -0.103
2014 3 +0.170 +0.338 +0.002 -0.001
2014 4 +0.190 +0.358 +0.022 +0.092
2014 5 +0.326 +0.325 +0.328 +0.175
2014 6 +0.305 +0.315 +0.295 +0.510
2014 7 +0.304 +0.289 +0.319 +0.451
2014 8 +0.199 +0.244 +0.153 +0.061
2014 9 +0.294 +0.187 +0.401 +0.181
2014 10 +0.367 +0.335 +0.399 +0.191

It should be remembered that during ENSO, tropospheric temperatures lag sea surface temperature changes by 1-2 months, so the tropospheric temperature anomaly will take a month or two to reflect what global SSTs have recently been doing.

The global image for October should be available in the next day or so here.

Popular monthly data files (these might take a few days to update):

uahncdc_lt_5.6.txt (Lower Troposphere)
uahncdc_mt_5.6.txt (Mid-Troposphere)
uahncdc_ls_5.6.txt (Lower Stratosphere)
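For anyone who wants to work with the monthly anomalies quoted above, here is a minimal parsing sketch based on the simple YR/MON/GLOBAL/NH/SH/TROPICS layout of the table in this post; the uahncdc_*.txt files have their own (wider) column layout, so treat this only as an illustration:

```python
# Minimal sketch: parse rows laid out like the table in this post
# (YR MON GLOBAL NH SH TROPICS) and compute a simple average.
# The uahncdc_*.txt data files use a different, wider column layout,
# so this is an illustration of the table above, not a file reader.

sample = """
2013  1 +0.497 +0.517 +0.478 +0.386
2013  2 +0.203 +0.372 +0.033 +0.195
2014 10 +0.367 +0.335 +0.399 +0.191
"""

rows = []
for line in sample.strip().splitlines():
    yr, mon, glob, nh, sh, trop = line.split()
    rows.append({"year": int(yr), "month": int(mon), "global": float(glob),
                 "nh": float(nh), "sh": float(sh), "tropics": float(trop)})

mean_global = sum(r["global"] for r in rows) / len(rows)
print(f"mean global anomaly of parsed rows: {mean_global:+.3f} deg. C")
```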

Super Typhoon Nuri to Become a Bering Sea Bomb

November 3rd, 2014

When a tropical cyclone moves poleward and merges with a frontal system, and then draws upon the extra energy that exists between cold and warm air masses, a Sandy-type storm can result.

Sometimes the result is explosive cyclogenesis, what we call a “bomb” (conventionally, a central pressure fall of roughly 24 mb in 24 hours), with rapidly deepening surface low pressure.

I’ve been watching Super Typhoon Nuri in the West Pacific, one of the strongest storms of the year. Each run of the GFS model continues to show this system becoming a spectacular extratropical storm in the Bering Sea, with hurricane-force winds and near-record-low barometric pressure in about 5 days’ time, after it just misses Japan.

As of today, this is what Nuri looks like in the latest MODIS imagery…it’s not a particularly large storm, but it has an intense core, with maximum surface winds estimated at 180 mph with gusts to 220 mph (these are satellite-estimated…they do not routinely fly into typhoons to measure them like we do in the West Atlantic):

Super Typhoon Nuri over the tropical West Pacific on Nov. 3, 2014.

Here’s some nice video of Nuri from the International Space Station from yesterday:

The GFS forecast model run from this morning shows Nuri as an extratropical low with an exceedingly low central pressure of 924 mb (27.29 inches) by Friday evening (graphic courtesy of Weatherbell.com, click image for full-size):

GFS model forecast surface pressures and winds when extratropical cyclone Nuri reaches peak intensity, Friday evening Nov. 7, 2014.

The previous two model runs had the low bottoming out at 919 mb. By comparison, the lowest pressures recorded in extratropical storms have been in the range of 912-920 mb, in the North Atlantic, so it looks like Nuri might be one of the strongest on record. The lowest surface pressure ever recorded in the U.S. was from one of these Bering Sea systems: 927 mb (27.35 inches) at Dutch Harbor, Alaska, on October 25, 1977.
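For reference, the millibar-to-inches-of-mercury conversions quoted above come from a single constant factor; a quick check:

```python
# Pressure conversion used in the text: 1 mb (hPa) = 0.02953 inches of mercury.
def mb_to_inhg(p_mb):
    return p_mb * 0.02953

for p in (924, 919):
    print(f"{p} mb = {mb_to_inhg(p):.2f} inches")   # 27.29 and 27.14 inches
```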

Earliest Snow in Columbia, SC

November 1st, 2014

As I predicted two days ago, the earliest snows on record are beginning to occur in the Carolinas.

Columbia, SC, has just experienced its earliest snow in 125 years of weather records, beating the previous earliest-snow record of Nov. 9, 1913, by 8 days. Current South Carolina weather observations show it is still snowing in Greenville, SC.

The Christian Science Monitor is reporting that Greenville was especially hard hit, with downed trees and power outages. The Smoky Mountains received up to 16 inches overnight. The current U.S. snow cover map shows 18 states with some amount of snow this morning.

Here is the latest model forecast of total snowfall ending at midnight tonight (graphic courtesy of Weatherbell.com):
High-resolution model forecast of total snowfall ending at midnight tonight.

Early indications are that next Sunday the “polar express” will arrive in the Northern Plains and Great Lakes, bringing bitterly cold air that is currently sitting over northern Siberia.

Do Satellite Temperature Trends Have a Spurious Cooling from Clouds?

October 30th, 2014

The validity of the satellite record of global temperature is sometimes questioned, especially since it shows only about 50% of the warming trend seen in surface thermometers over the 36+ year satellite period of record.

The satellite measurements are based upon thermal microwave emissions by oxygen in the atmosphere. But like any remote sensing technique, the measurements include small contaminating effects, in this case cloud water, precipitation systems, and variations in surface emissivity.

A new paper by Weng et al. has been published in Climate Dynamics, entitled “Uncertainty of AMSU-A derived temperature trends in relationship with clouds and precipitation over ocean”, which examines the influence of clouds on the satellite measurements.

To see how clouds and precipitation can affect the satellite temperatures, here’s an example of one day (August 6, 1998) of AMSU ch. 5 data (which is used in both our mid-tropospheric and lower-tropospheric temperature products), and the corresponding SSM/I-derived cloud water for the same day:

Fig. 1. One day of AMSU limb-corrected ch. 5 brightness temperatures (top), and the corresponding SSM/I cloud water retrievals centered on the same day (August 6, 1998).

As can be seen, the contamination of AMSU5 by cloud and precipitation systems is small, with slight cooling in deep convective areas, and no obvious cloud water contamination elsewhere (cirrus clouds are essentially transparent at this microwave frequency).

And even if there is contamination, what matters for tropospheric temperature trends isn’t the average level of contamination, but whether there are trends in that contamination. Below I will discuss new estimates of both the average contamination and its effect on tropospheric temperature trends.
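The distinction is easy to demonstrate with synthetic numbers: a constant contamination offset leaves a linear trend untouched, while a drift in the contamination changes it. The series below are made up purely for illustration; they are not satellite data.

```python
import numpy as np

# Synthetic illustration: a constant contamination offset does not change a
# linear trend, but a drifting contamination does.  All numbers are made up.
months = np.arange(360)                                   # 30 years of monthly data
truth = (0.12 / 120.0) * months                           # 0.12 deg. C/decade underlying trend

constant_bias = truth - 0.05                              # uniform -0.05 deg. C contamination
drifting_bias = truth - 0.05 - (0.02 / 120.0) * months    # contamination drifting by 0.02 deg/decade

def trend_per_decade(series):
    return np.polyfit(months, series, 1)[0] * 120.0       # slope (deg/month) -> deg/decade

print(trend_per_decade(truth))          # ~0.12 deg. C/decade
print(trend_per_decade(constant_bias))  # ~0.12 (a constant offset is harmless)
print(trend_per_decade(drifting_bias))  # ~0.10 (the drift is what matters)
```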

The fact that our monthly gridpoint radiosonde validation shows an extremely high level of agreement with the satellite further supports our assumption that such contamination is small. Nevertheless, it is probably worth revisiting the cloud-contamination issue, since the satellite temperature trends are significantly lower than the surface temperature trends, and any potential source of error is worth investigating.

What Weng et al. add to the discussion is the potential for spurious warming of AMSU ch. 5 by cloud water not associated with heavy precipitation, something we did not address 18 years ago. While these warming influences are much weaker than the cooling effects of precipitation systems (as can be seen in the imagery above), cloud water is much more widespread, and so its influence on global averages might not be negligible.

The Weng et al Results Versus Ours (UAH)

I’m going to go ahead and give the final result up front for those who don’t want to wade through the details.

Weng et al. restrict their analysis to 13 years (1998-2010) of data from one satellite, NOAA-15, and find a spurious cooling effect from cloud contamination in the middle latitudes, with little effect in the tropics. (They do not explain how a result based upon 13 years of data, even if correct, can be applied to 35+ years of satellite data.) I’ve digitized the data in their Fig. 8, so that I can compare to our results (click image for full size):

Fig. 2. Oceanic trends by latitude band in AMSU5 during late 1998 to mid-2010 in the Weng et al. study (top) and our own calculations (bottom), for “all-weather” and “clear-sky” conditions.

There are two main points to take away from this figure. First, the temperature trends they get at different latitudes for 1998-2010 are VERY different from what we get, even in the “all-weather” case, which simply includes all ocean data whether cloud-contaminated or not. The large warming signal we get in the tropics is fully expected for this limited period, which starts during a very cool La Nina event and ends during a very warm El Nino event.

I have spent most of this week slicing and dicing the data different ways, and I simply do not see how they could have gotten the near-zero trends they did in the tropics and subtropics. I suspect some sort of data processing error.

The second point (which was the main point of their paper) is the difference between the “clear-sky” and “all-weather” trends they get in the middle latitudes, a difference which is almost non-existent in our (UAH) results. While they estimate that cloud contamination spuriously reduces warming trends by up to 30%, we estimate a global-ocean-average spurious cooling of only -0.006 deg. C/decade for 1998-2010 from not adjusting for cloud-contaminated data in our operational product. Most of this signal is probably related to the large change in cloud conditions going from La Nina to El Nino, so it would likely be even smaller for the 36+ year satellite record.

While I used a different method for identifying and removing cloud contamination (I use localized warm spots in AMSU ch. 3, while they use a retrieval scheme based on AMSU ch. 1 & 2), I screen out about the same fraction of data (40%) as they do (20%-50%), and the geographic distribution of my identified cloud and precipitation systems matches known regional distributions. So I don’t see how different cloud identification methodologies can explain the differences. I also used AMSU footprints 10-21 (as in our operational processing), as well as their restricted use of just footprints 15 & 16, and got nearly the same results, so that can’t explain the discrepancy, either.
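For readers who want a concrete (though heavily simplified) picture of what a “localized warm spot” screen can look like, here is a sketch that flags footprints whose ch. 3 brightness temperature exceeds a local neighborhood background. The 0.5 K threshold and the 5x5 footprint window are arbitrary placeholders, not the values used in our operational processing.

```python
import numpy as np

# Heavily simplified sketch of a "localized warm spot" cloud/precip screen.
# The 0.5 K threshold and the 5x5 neighborhood are arbitrary placeholders,
# not the values used in the operational UAH processing.

def flag_warm_spots(tb_ch3, threshold=0.5, half_win=2):
    """Flag footprints whose ch. 3 brightness temperature exceeds the local
    background (neighborhood median) by more than `threshold` K."""
    ny, nx = tb_ch3.shape
    cloudy = np.zeros((ny, nx), dtype=bool)
    for j in range(ny):
        for i in range(nx):
            sub = tb_ch3[max(0, j - half_win):j + half_win + 1,
                         max(0, i - half_win):i + half_win + 1]
            cloudy[j, i] = tb_ch3[j, i] - np.median(sub) > threshold
    return cloudy

# usage sketch: mask the corresponding ch. 5 footprints before gridding
# tb_ch5_screened = np.where(flag_warm_spots(tb_ch3), np.nan, tb_ch5)
```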

I have many more plots (not shown here) relating to cloud systems in general: (1) they do indeed cause a small average warming of the AMSU5 measurements (by up to 0.1 deg. C); (2) the less frequent precipitation systems cause localized cooling of about 1 deg. C; (3) these effects average out to much smaller influences when combined with non-contaminated data; and, most importantly, (4) the trends in these effects are near zero anyway, which is what matters for climate monitoring.

We are considering adding an adjustment for cloud contaminated data to a later version of the satellite data. I’ve found that a simple data replacement scheme can eliminate an average of 50% of the trend contamination (you shouldn’t simply throw away all cloud-influenced data…we don’t do that for thermometer data, and it could cause serious sampling problems); the question we are struggling with is whether the small level of contamination is even worth adjusting for.
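As one example of what a simple data-replacement scheme could look like (as opposed to discarding flagged data), the sketch below fills cloud-flagged footprints with the clear-sky mean of their latitude row. This is just one plausible choice for illustration, not necessarily the scheme we are evaluating.

```python
import numpy as np

# Illustrative sketch only: fill cloud-flagged footprints with the clear-sky
# mean of their latitude row, instead of discarding them (which could cause
# sampling problems).  Not necessarily the scheme under consideration.

def fill_flagged(tb_ch5, cloudy):
    """Replace flagged values, row by row, with that row's clear-sky mean."""
    filled = tb_ch5.astype(float).copy()
    for j in range(tb_ch5.shape[0]):
        clear = ~cloudy[j]
        if clear.any() and cloudy[j].any():
            filled[j, cloudy[j]] = tb_ch5[j, clear].mean()
    return filled
```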

Polar Vortex Charleston

October 30th, 2014

A cold airmass plunging out of Canada and a coastal low developing off the Carolinas by Saturday make for the kind of weather event you expect in January, not November 1.

Snow is expected to fall over portions of 18 eastern states over the next 3 days, with the potential for earliest-ever snowfall in portions of the Carolinas by noon on Sunday (all forecast graphics courtesy of WeatherBell.com):

Total snowfall forecast by noon, Sunday, Nov. 2, 2014.

The forecast surface air temperature departures from normal show this cold event pushing unusually far south for this time of year, with readings 20 deg. F below normal over much of the Southeast, including all of Florida, by Sunday morning:

Surface temperature departures from normal forecast at sunrise, Sunday, Nov. 2, 2014.

The deep, cold airmass is what causes the “polar vortex”, which is the swirling of upper-air winds around the airmass. By noon on Saturday, the rapidly moving vortex will be centered near Charleston, SC:

"Polar vortex" pattern at 18,000 ft altitude forecast to be centered near Charleston at noon, Saturday, Nov. 1, 2014.


Luckily, the deepening low pressure off the coast is expected to stay offshore, with northerly winds at Cape Hatteras around 50 mph Saturday night:

Surface pressure and wind patterns forecast for Saturday night.


By Sunday evening, the low is forecast to be centered over the Gulf of St. Lawrence, and by Tuesday morning total snow accumulations of 1 to 2 feet or more are expected over portions of Maine and the Canadian Maritimes.

UPDATE (11:30 a.m. EDT Oct. 30):
Here’s the latest high-resolution model forecast of snowfall ending Saturday evening, showing flurries reaching scattered coastal areas of the Carolinas, and 6″-12″ snowfalls in the Smoky Mountains and to the lee of Lake Michigan in NW Indiana:
High-resolution model forecast of snowfall ending Saturday evening.

Sunspot 2192 Time Lapse Video

October 25th, 2014

I missed Thursday’s solar eclipse due to clouds, but here’s a sunset time-lapse video I created last evening which clearly shows sunspot 2192, the largest sunspot group in 18 years. This was taken 1 hour after the sunspot released an X-class solar flare:

Sunset time lapse with giant sunspot 2192 from Roy Spencer on Vimeo.

Green Meme Friday

October 24th, 2014

In commemoration of green hypocrisy.

(Meme images: one-does-not-simply, so-your-actors-can-have-jets, NYC-where-greens-live, green-energy-doo-doo, take-rich-peoples-money-thatd-be-great.)

Our Initial Comments on the Abraham et al. Critique of the Spencer & Braswell 1D model

October 23rd, 2014

Our 1D forcing-feedback-mixing model published in January 2014 (and not paywalled, but also here) addressed the global average ocean temperature changes observed from the surface to 700 m depth, with the model extending to 2,000 m depth.

We used the 1D model to obtain a consensus-supporting climate sensitivity when traditional forcings were used (mostly anthropogenic GHGs, aerosols, and volcanoes), but a much smaller 1.3 deg. C climate sensitivity if the observed history of ENSO was included, which was shown from CERES satellite measurements to modulate the Earth’s radiative budget naturally (what we called “internal radiative forcing” of the climate system).

Abraham et al. recently published an open-access paper addressing the various assumptions in our model. While we have only had a couple of days to look at it, in response to multiple requests for comment I am posting some initial reactions now.

Abraham et al. take great pains to fault the validity of a simple 1D climate model to examine climate sensitivity. But as we state in our paper (and as James Hansen has even written), in the global average all that really matters for the rate of rise of temperature is (1) forcing, (2) feedback, and (3) ocean mixing. These three basic processes can be addressed in a 1D model. Advective processes (horizontal transports) vanish in the global ocean average.
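To make those three ingredients concrete, here is a minimal sketch of a forcing-feedback-diffusion ocean column of the general kind described above. The layer thickness, effective diffusivity, feedback parameter, and forcing in this sketch are illustrative placeholders only; they are not the values, the forcing history, or the full formulation of the Spencer & Braswell (2014) model.

```python
import numpy as np

# Minimal sketch of a forcing-feedback-diffusion ocean column.  All constants
# below are illustrative placeholders, not the published model's values.
RHO_CP  = 4.19e6      # volumetric heat capacity of seawater (J m-3 K-1)
DZ      = 50.0        # layer thickness (m); 40 layers span 0-2000 m
NLAYERS = 40
KAPPA   = 1.0e-4      # effective vertical diffusivity (m2 s-1), placeholder
LAM     = 3.2         # net feedback parameter (W m-2 K-1), placeholder
DT      = 30 * 86400  # time step of about one month (s)

def run(forcing_wm2, nsteps):
    """Integrate layer temperature anomalies (K) under a constant forcing."""
    T = np.zeros(NLAYERS)
    r = KAPPA * DT / DZ**2                                  # diffusion number (stable if <= 0.5)
    for _ in range(nsteps):
        dT = np.zeros(NLAYERS)
        dT[1:-1] = r * (T[2:] - 2.0*T[1:-1] + T[:-2])       # interior mixing
        dT[0]  = (forcing_wm2 - LAM*T[0]) * DT / (RHO_CP*DZ) + r*(T[1] - T[0])  # forcing minus feedback at the top
        dT[-1] = r * (T[-2] - T[-1])                        # insulated bottom boundary
        T += dT
    return T

profile = run(3.7, 1200)   # a constant 3.7 W m-2 forcing for ~100 years
print(f"surface layer anomaly: {profile[0]:.2f} K "
      f"(equilibrium would be {3.7/LAM:.2f} K)")
```

The point of the sketch is only that the global-average problem reduces to a net flux at the top (forcing minus feedback) plus a rule for mixing heat downward, which is exactly the three-ingredient picture described above.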

They further ignore the evidence we present (Fig. 1 in Spencer & Braswell, 2014) that a 1D model might actually be preferable from the standpoint of energy conservation, since the 3D models do not appear to conserve energy, a basic requirement of virtually any physical modeling enterprise. In some of the CMIP3 models, the deep-ocean temperature changes in apparent contradiction to the radiative forcing imposed from above. Since the 3D models do not include a changing geothermal heat flux, this suggests a violation of the 1st Law of Thermodynamics. (Three of the 13 models we examined cooled most of the deep ocean since 1955, despite increasing energy input from above. How does that happen?)

On this point, how is it that Abraham et al. nitpick a 1D model that CAN explain the observations, but the authors do not fault the IPCC 3D models which CANNOT explain the observations, and possibly don’t even conserve energy in the deep ocean?

Regarding their specific summary points (numbered below):

1. The model treats the entire Earth as ocean-covered.
Not true, and a red herring anyway. We model the observed change in ocean heat content since 1955, and it doesn’t matter if the ocean covers 20% of the globe or 100%. They incorrectly state that ignoring the 30% land mass of the Earth will bias the sensitivity estimates. This is wrong. All energy fluxes are per sq. meter, and the calculations are independent of the area covered by the ocean. We are surprised the authors (and the reviewers) did not grasp this basic point.

2. The model assigns an ocean process (El Nino cycle) which covers a limited geographic region in the Pacific Ocean as a global phenomenon…
This is irrelevant. We modeled the OBSERVED change in global average ocean heat content, including the observed GLOBAL average expression of ENSO in the upper 200 m of the GLOBAL average ocean temperature.

3. The model incorrectly simulates the upper layer of the ocean in the numerical calculation.
There are indeed different assumptions which can be made regarding how the surface temperature relates to the average temperature of the first layer, which is assumed to be 50 m thick. How these various assumptions change the final conclusion will require additional work on our part.

4. The model incorrectly insulates the ocean bottom at 2000 meters depth.
This approximation should not substantially matter for the purpose the model is being used. We stopped at 2,000 m depth because the results did not substantially depend upon it going any deeper.

5. The model leads to diffusivity values that are significantly larger than those used in the literature.

We are very surprised this is even an issue, since we took great pains to point out in our paper that the *effective* diffusivity values we used in the model are meant to represent *all* modes of vertical mixing, not just diffusivity per se. If the authors read our paper, they should know this. And why did the reviewers not catch this basic oversight? Did the reviewers even read our paper to see whether Abraham et al. were misrepresenting what it claimed? Again, the *effective* diffusivity is meant to represent all modes of vertical heat transport (this is also related to point #8, below). All the model requires is a way to distribute heat vertically, and a diffusion-type operator is one convenient method for doing that.

6. The model incorrectly uses an asymmetric diffusivity to calculate heat transfer between adjacent layers, and
7. The model contains incorrect determination of element interface diffusivity.

The authors discuss ways in which the diffusion operator could be implemented more accurately. This might well be the case (we need to study it more). But it should not impact the final conclusions, because we adjust the assumed effective diffusivities to best match the observations of how the ocean warms and cools at various depths. If there were a bias in the numerical implementation of the diffusion operator (even one off by a factor of 10), the effective diffusivity values would simply adjust until the model matches the observations. The important thing is that, as the surface warms, the extra heat is mixed downward in a fashion which matches the observations. Arguing over the numerical implementation obscures this basic fact. Finally, a better implementation of the diffusivity calculation must still be run with a variety of effective diffusivities (and climate sensitivities) until a match with the observations is obtained, which as far as we can tell the authors did not do. The same would apply to a 3D model simulation: when one major change is implemented, other model changes are often necessary to get realistic results.
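To illustrate that last point, here is a toy sketch of the kind of parameter sweep involved: any revised mixing scheme still has to be run over a range of effective diffusivities and feedback strengths and matched against the observed ocean warming. The crude two-layer model, the “observed” heat uptake value, and the parameter ranges below are placeholders for illustration only; they are not the actual model, data, or values from our paper.

```python
import numpy as np

# Toy illustration only: after any change to the mixing scheme, the effective
# diffusivity (and sensitivity) must be re-swept against the observed ocean
# warming.  The two-layer model, the "observed" uptake, and the parameter
# ranges are placeholders, not the published model or data.
RHO_CP = 4.19e6                 # J m-3 K-1
SEC_PER_MONTH = 2.63e6

def mean_heat_uptake(kappa, lam, forcing=0.5, h_up=100.0, h_deep=1900.0, years=50):
    """Time-mean net flux into a two-layer ocean column (W m-2)."""
    Tu = Td = 0.0
    uptake = []
    for _ in range(years * 12):
        exchange = kappa * RHO_CP * (Tu - Td) / 1000.0      # W m-2 mixed into the deep layer
        Tu += (forcing - lam*Tu - exchange) * SEC_PER_MONTH / (RHO_CP * h_up)
        Td += exchange * SEC_PER_MONTH / (RHO_CP * h_deep)
        uptake.append(forcing - lam*Tu)                     # net flux into the column
    return float(np.mean(uptake))

target_uptake = 0.3             # W m-2, placeholder "observation"
candidates = [(k, l) for k in np.logspace(-5, -3, 9) for l in np.linspace(1.0, 4.0, 7)]
best = min(candidates, key=lambda p: abs(mean_heat_uptake(*p) - target_uptake))
print("best-fit (diffusivity, feedback):", best)
```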

8. The model neglects advection (water flow) on heat transfer.
Again, there is no advection in the global average ocean. The authors should know this, and so should the reviewers of their paper. Our *effective* diffusivity, as we state in the paper, is meant to represent all processes that cause vertical mixing of heat in the ocean, including the formation of cold deep water at high latitudes. Why did neither the authors nor the reviewers catch this basic oversight? Again, we wonder how closely anyone read our paper.

9. The model neglects latent heat transfer between the atmosphere and the ocean surface.
Not true. As we said in our paper, processes like surface evaporation, convective heat transfer, and latent heat release, while not explicitly included, are implicitly included because the atmosphere is assumed to be in convective equilibrium with the surface. Our use of a 3.2 W/m2 change in OLR per 1 deg. C of surface temperature change is the generally assumed global-average value for the effective radiating temperature of the surface-atmosphere system. This is the way in which a surface temperature change is realistically translated into a change in top-of-atmosphere OLR, without having to explicitly include latent heat transfer, atmospheric convection, the temperature lapse rate, etc.

Final Comments
If our model is so far from reality, maybe Abraham et al. can tell us why the model works when we run it in the non-ENSO mode (mainly greenhouse gas, aerosol, and volcanic forcing), yielding a climate sensitivity similar to many of the CMIP models (2.2 deg. C). If the model deficiencies are that great, shouldn’t the model lead to a biased result for this simple case? Again, they cannot obtain a “corrected” model run by changing only one thing (e.g., the numerical diffusion scheme) without sweeping the other model parameters (e.g., the effective diffusivities) to get a best match to the observations.

These are our initial reactions after only a quick look at the paper. It will take a while to examine a couple of the criticisms in more detail. For now, the only one we can see which might change our conclusions in a significant way is our assumption that surface temperature changes have the same magnitude as the average temperature change in the top (50 m) layer of the model. In reality, surface changes should be a little larger, which will change the feedback strength. It will take time to address such issues, and we are now under a new DOE contract to do climate model validation.