Author Archive

Trump’s NOAA Administrator Must Address the Temperature Record Controversy

Wednesday, January 18th, 2017

The official USHCN temperature monitoring site in Roseburg, Oregon, shows examples of spurious heat influences that accumulate over the years, exaggerating the “global warming” signal.

An article appeared in the Washington Post yesterday entitled, “Who Will Lead NOAA Under President Trump?”. Written by the Capital Weather Gang’s Jason Samenow, it lists three top contenders:

Scott Rayder, senior adviser for development and partnerships at the University Corporation for Atmospheric Research
Barry Myers, chief executive of AccuWeather in State College, Pa.
Jonathan White, president and chief executive of the Consortium for Ocean Leadership

The article addresses important issues facing NOAA in the coming years, such as making our weather forecasting capability the best in the world while still respecting the private sector’s role in adding value to the data collection and modeling the government leads.

Yet, something is missing….

You see, the names mentioned are part of the existing establishment, and we all know that President Trump is interested in “draining the swamp”.

They might be perfectly fine candidates — if Hillary Clinton had won the election.

What is missing is NOAA’s controversial role in promoting the U.N. plan to use global climate change as a vehicle for overseeing the redistribution of the world’s wealth and deindustrializing the West. (Note that’s not my claim…it’s their claim). It is well known that most of the countries that signed on to the Paris Agreement did so because they hope to gain from those transfers of wealth.

We also know that the result of CO2 emissions reduction will be a huge amount of pain (up to a $100 trillion loss of wealth this century) for no measurable impact on global temperatures, even using the U.N.’s over-inflated warming predictions.

NOAA has been actively “adjusting” the thermometer record of global temperatures over the years, making the present warmer and the past colder, leading to an ever-increasing upward temperature trend. This supports the global warming narrative that the current administration and the U.N. favor.

In my opinion, NOAA needs leadership that will reexamine these procedures. It took a TV meteorologist, Anthony Watts, to spearhead a site inspection of nearly all of the temperature monitoring locations in the U.S., even forcing NOAA to admit that many of its temperature monitoring stations were simply of no use for monitoring climate trends, because parking lots and air conditioning exhaust fans had gradually encroached on those sites, causing spurious warming. Watts’ research has suggested that, after the contaminated stations are removed, a substantial fraction of the reported warming in the U.S. simply disappears.

Why did it take an outsider — with no funding — to do what NOAA should have done to begin with?

Yes, providing data and analysis addressing the global warming issue is only one part of NOAA’s responsibility (which includes ocean research as well).

But it is by far the most important part of NOAA’s mission when it comes to the future health of the U.S. economy.

The new NOAA Administrator needs to address this issue head on, and not whitewash it. I seriously doubt any of the three candidates listed above will do that.

Satellite Reveals End of “Unending” N. California Drought

Saturday, January 14th, 2017

With more rain and snow on the way, the supposed “unending drought” that the New York Times reported on last year has, in a matter of weeks, ended — at least in Northern California.

Yesterday’s color satellite imagery from NASA shows the dramatic changes which have occurred since the same date three years ago:

– Widespread and deep snowpack
– Greening vegetation
– Rivers overflowing their banks
– Strong river discharge into the Pacific Ocean

NASA Aqua MODIS color satellite imagery of N. California separated by exactly three years, showing dramatic snowpack increase, vegetation greening, and river discharge into the Pacific Ocean.

Here’s a zoomed version of the NASA Terra MODIS image yesterday covering the San Francisco Bay area northeastward toward Sacramento:

NASA Terra MODIS zoomed image on 13 January 2017 covering San Francisco to Sacramento.

The latest GFS model forecast for the next 10 days predicts another 2 to 10 inches of rain, depending on location, with several more feet of snow at higher elevations.

The Frigid 48: U.S. Average Temperature 11 deg. F

Saturday, January 7th, 2017

As predicted here ten days ago, portions of all of the Lower 48 states are below 32 deg. F at 6 a.m. EST this morning (animation here):

Surface temperature and wind patterns at 6 a.m. January 7, 2017.

The spatial average temperature over the Lower 48 at 6 a.m. is 11 deg. F, which is fully 9 deg. (!) colder than the coldest reading at any time last winter (20 deg. F), a value reached twice in January 2016.
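For those curious how a single spatial-average number like this is constructed, here is a minimal sketch of cosine-latitude-weighted averaging over a gridded temperature field. The grid, the Lower-48 mask, and the temperature values below are hypothetical placeholders; the 11 deg. F figure above comes from an analysis of observed data, not from this code:

import numpy as np

lats = np.arange(25.0, 50.0, 0.5)                                # hypothetical grid latitudes, deg N
lons = np.arange(-125.0, -66.5, 0.5)                             # hypothetical grid longitudes, deg E
rng = np.random.default_rng(0)
temps = rng.uniform(-10.0, 40.0, size=(lats.size, lons.size))    # placeholder temperatures, deg F
conus = np.ones(temps.shape, dtype=bool)                         # placeholder Lower-48 land mask

# Weight each cell by cos(latitude): grid cells shrink toward the pole,
# so an unweighted mean would over-count the northern cells.
weights = np.cos(np.deg2rad(lats))[:, None] * np.ones((1, lons.size))
avg = np.average(temps[conus], weights=weights[conus])
print(f"Area-weighted Lower-48 average: {avg:.1f} deg. F")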

Global Satellites: 2016 not Statistically Warmer than 1998

Tuesday, January 3rd, 2017

Strong December Cooling Leads to 2016 Being Statistically Indistinguishable from 1998

The Version 6.0 global average lower tropospheric temperature (LT) anomaly for December 2016 was +0.24 deg. C, down substantially from the November value of +0.45 deg. C (click for full size version):

The resulting 2016 annual average global temperature anomaly is +0.50 deg. C, which is a statistically insignificant 0.02 deg. C warmer than 1998 at +0.48 deg. C. We estimate that 2016 would have had to be 0.10 deg. C warmer than 1998 to be significantly different at the 95% confidence level. Both 2016 and 1998 were strong El Niño years.
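As a rough illustration of where a ~0.10 deg. C significance threshold can come from, here is a minimal sketch that assumes (purely hypothetically) each annual-mean anomaly carries an independent 1-sigma uncertainty of about 0.036 deg. C; that assumed uncertainty is for illustration only and is not the published UAH error estimate:

import math

# Hypothetical 1-sigma uncertainty on each annual-mean anomaly (deg. C);
# chosen for illustration only, not the published UAH figure.
sigma_year = 0.036

t2016, t1998 = 0.50, 0.48                 # annual anomalies from this post, deg. C
diff = t2016 - t1998                      # observed difference: 0.02 deg. C

sigma_diff = math.sqrt(2.0) * sigma_year  # uncertainty of the difference of two independent means
threshold_95 = 1.96 * sigma_diff          # two-sided 95% threshold (normal approximation)

print(f"difference = {diff:+.2f} C, 95% threshold = {threshold_95:.2f} C")
print("significant" if abs(diff) > threshold_95 else "not statistically significant")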

The 38 years in the satellite record, ranked from warmest to coolest (and ignoring statistical uncertainty) are:

RANK YEAR deg.C.
01 2016 +0.50
02 1998 +0.48
03 2010 +0.34
04 2015 +0.26
05 2002 +0.22
06 2005 +0.20
07 2003 +0.19
08 2014 +0.18
09 2007 +0.16
10 2013 +0.13
11 2001 +0.12
12 2006 +0.11
13 2009 +0.10
14 2004 +0.08
15 1995 +0.07
16 2012 +0.06
17 1987 +0.05
18 1988 +0.04
19 2011 +0.02
20 1991 +0.02
21 1990 +0.01
22 1997 -0.01
23 1996 -0.01
24 1999 -0.02
25 2000 -0.02
26 1983 -0.04
27 1980 -0.04
28 1994 -0.06
29 2008 -0.10
30 1981 -0.11
31 1993 -0.20
32 1989 -0.21
33 1979 -0.21
34 1986 -0.22
35 1984 -0.24
36 1992 -0.28
37 1982 -0.30
38 1985 -0.36

The global, hemispheric, and tropical LT anomalies from the 30-year (1981-2010) average for the last 24 months are:

YEAR MO GLOBE NHEM. SHEM. TROPICS
2015 01 +0.30 +0.44 +0.15 +0.13
2015 02 +0.19 +0.34 +0.04 -0.07
2015 03 +0.18 +0.28 +0.07 +0.04
2015 04 +0.09 +0.19 -0.01 +0.08
2015 05 +0.27 +0.34 +0.20 +0.27
2015 06 +0.31 +0.38 +0.25 +0.46
2015 07 +0.16 +0.29 +0.03 +0.48
2015 08 +0.25 +0.20 +0.30 +0.53
2015 09 +0.23 +0.30 +0.16 +0.55
2015 10 +0.41 +0.63 +0.20 +0.53
2015 11 +0.33 +0.44 +0.22 +0.52
2015 12 +0.45 +0.53 +0.37 +0.61
2016 01 +0.54 +0.69 +0.39 +0.84
2016 02 +0.83 +1.16 +0.50 +0.99
2016 03 +0.73 +0.94 +0.52 +1.09
2016 04 +0.71 +0.85 +0.58 +0.93
2016 05 +0.54 +0.65 +0.44 +0.71
2016 06 +0.34 +0.51 +0.17 +0.37
2016 07 +0.39 +0.48 +0.30 +0.48
2016 08 +0.43 +0.55 +0.32 +0.49
2016 09 +0.44 +0.49 +0.39 +0.37
2016 10 +0.41 +0.42 +0.39 +0.46
2016 11 +0.45 +0.41 +0.50 +0.37
2016 12 +0.24 +0.19 +0.30 +0.21

The UAH global image for December, 2016 (and annual image for 2016) should be available in the next several days here.

The new Version 6 files should be updated soon, and are located here:

Lower Troposphere: http://vortex.nsstc.uah.edu/data/msu/v6.0/tlt/uahncdc_lt_6.0.txt
Mid-Troposphere: http://vortex.nsstc.uah.edu/data/msu/v6.0/tmt/uahncdc_mt_6.0.txt
Tropopause: http://vortex.nsstc.uah.edu/data/msu/v6.0/ttp/uahncdc_tp_6.0.txt
Lower Stratosphere: http://vortex.nsstc.uah.edu/data/msu/v6.0/tls/uahncdc_ls_6.0.txt
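For anyone who wants to reproduce the annual ranking above from the lower-troposphere file just listed, here is a minimal parsing sketch (not an official UAH tool). It assumes a single header row followed by whitespace-separated columns with the year, month, and global anomaly in the first three columns; if the actual file layout differs, the column indices will need adjusting:

import urllib.request
from collections import defaultdict

URL = "http://vortex.nsstc.uah.edu/data/msu/v6.0/tlt/uahncdc_lt_6.0.txt"

with urllib.request.urlopen(URL) as resp:
    lines = resp.read().decode("utf-8").splitlines()

monthly = defaultdict(list)
for line in lines[1:]:                     # skip the header row
    parts = line.split()
    if len(parts) < 3 or not parts[0].isdigit():
        continue                           # skip any trailing trend/header lines
    year, globe = int(parts[0]), float(parts[2])
    monthly[year].append(globe)

# Annual means for complete years only, ranked warmest to coolest
annual = {y: sum(v) / 12.0 for y, v in monthly.items() if len(v) == 12}
for rank, (year, anom) in enumerate(sorted(annual.items(), key=lambda kv: -kv[1]), start=1):
    print(f"{rank:02d} {year} {anom:+.2f}")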

Cold to be Followed by Southern Snowstorm

Monday, January 2nd, 2017

The coast-to-coast cold that will be spreading across the U.S. this week will be accompanied by the development of a Gulf Coast low pressure center that will threaten the South and Southeast with substantial snowfall by the weekend.

The low is just now approaching N. California and will intensify as it travels across the Intermountain region and the Texas panhandle, then moves eastward along the Gulf Coast by the weekend.

The latest GFS model forecast total snowfall by midday Sunday shows the possibility of 6-12 inch snowfalls across portions of about ten southern states, including Oklahoma, Missouri, Arkansas, Tennessee, Mississippi, Alabama, Georgia, the Carolinas, and Virginia (graphic courtesy of Weatherbell.com):

Total snowfall accumulation by midday Sunday, Jan. 8, 2017 as forecast by the NWS GFS forecast model.

It’s still too early to tell what areas will get the greatest snowfall, which in the current forecast approaches two feet in the higher elevations of North Carolina. It’s also possible that a wintry mix including freezing rain will exist along the southern edge of the frozen precipitation region.

UPDATE:
As of Tuesday morning (Jan. 3, 2017), the snow path looks like it will be farther south than depicted above, with lesser snow totals: 6-12 inches only over eastern N. Carolina, and generally 3 to 6 inches elsewhere.

First Week of 2017: Record Cold, 48 States Going Below Freezing

Wednesday, December 28th, 2016

It is increasingly looking like the first full week of 2017 will be greeted with a cold air outbreak over the Lower 48 states that will be widespread and persistent.

Early next week the cold air will enter the U.S. through Montana and the Dakotas, where temperatures will likely plunge into the minus 30 deg F (or colder) range.

By the end of the week, single digits could extend into the southeast U.S., and a hard freeze could push into central Florida (graphic courtesy of Weatherbell.com):

GFS model forecast surface temperatures for Friday morning, Jan. 6, 2017.

As can be seen, substantial portions of all 48 states might well be below 32 deg. F.

At the longer range, there appears to be a reinforcing plunge of even more frigid air heading south out of northwest Canada in the second week of January.

Science Under President Trump: End the Bias in Government-Funded Research

Wednesday, December 21st, 2016

You might expect that my background in climate research would mean my suggestions to a Trump Administration would be all climate-related. And there’s no question that climate would be a primary focus, especially neutering the EPA’s Endangerment Finding which, if left unchecked, will weaken our economy and destroy jobs, with no measurable benefit to the climate system.

But there’s a bigger problem in U.S. government funded research of which the climate issue is just one example. It involves bias in the way that government agencies fund science.

Government funds science to support pre-determined policy outcomes

So, you thought government-funded science is objective?

Oh, that’s adorable.

Since politicians are ultimately in charge of deciding how much money agencies receive to dole out to the research community, it is inevitable that politics and desired outcomes influence the science the public pays for.

Using climate as an example, around thirty years ago various agencies started issuing requests for proposals (RFPs) for scientists to research the ways in which humans are affecting climate. Climate research up until that time had mostly looked into natural climate fluctuations, since the coupled ocean-atmosphere system is a nonlinear dynamical system capable of producing climate change without any external forcing whatsoever.

Giddy from the regulatory success in limiting the production of ozone-destroying chemicals in the atmosphere with the 1987 Montreal Protocol, the government turned its sights on carbon dioxide and global warming.

While ozone was a relatively minor issue with minor regulatory impact, CO2 is the Big Kahuna. Everything humans do requires energy, and for decades to come that energy will mostly come from fossil fuels, the burning of which produces CO2.

The National Academies, which are supposed to provide independent advice to the nation on new directions in science, were asked by the government to tell the government to study human causes of climate change. (See how that works?)

Research RFPs were worded in such a way that researchers could blame virtually any change they saw on humans, not Mother Nature. And as I like to say, if you offer scientists billions of dollars to find something… they will do their best to find it. As a result, every change researchers saw in nature was suddenly mankind’s fault.

The problem with attribution in global warming research is that any source of warming will look about the same, whether human-caused or nature-caused. The land will warm faster than the ocean. The high northern latitudes will warm the most. Winters will warm somewhat more than summers. The warming will be somewhat greater at 10 km altitude than at the surface. It doesn’t matter what caused the warming. So, it’s easy for the experts to say the warming is “consistent with” human causation, without mentioning it could also be “consistent with” natural causation.

The result of this pernicious, incestuous relationship between government and the research community is biased findings by researchers tasked to find that which they were paid to find. The problem has been studied at the Cato Institute by Pat Michaels, among others; Judith Curry has provided a good summary of some of the related issues.

The problem is bigger than climate research

The overarching goal of every regulatory agency is to write regulations. That’s their reason for existence.

It’s not to strengthen the economy. Or protect jobs. It’s to regulate.

As a result, the EPA continues the push to make the environment cleaner and cleaner, no matter the cost to society.

How does the EPA justify, on scientific grounds, the effort to push our pollution levels to near-zero?

It comes from the widespread assumption that, if huge amounts of some substance are known to be a danger, then even tiny amounts must be a danger as well.

This is how the government can take, say, extreme radiation exposure, which is lethal, and extrapolate it to the claim that thousands of people die every year from even low levels of radiation exposure.

The only problem is that it is probably not true; it is the result of bad statistical analysis. The assumption that any amount of a potentially dangerous substance is also dangerous is the so-called linear no-threshold (LNT) assumption, which undergirds much of our over-regulated society.
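To make the linear no-threshold idea concrete, here is a toy comparison of the LNT extrapolation against a threshold model in which low doses carry no excess risk. The numbers are purely illustrative and are not real dose-response data:

def risk_lnt(dose, slope=0.05):
    # LNT: excess risk scales linearly with dose all the way down to zero.
    return slope * dose

def risk_threshold(dose, slope=0.05, threshold=10.0):
    # Alternative: no excess risk below a threshold dose.
    return slope * max(0.0, dose - threshold)

for dose in (0.5, 1.0, 5.0, 20.0, 100.0):
    print(f"dose={dose:6.1f}   LNT risk={risk_lnt(dose):.3f}   "
          f"threshold-model risk={risk_threshold(dose):.3f}")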

In fact, decades of research by people like Ed Calabrese has suggested that exposure to low levels of things which are considered toxic in large amounts actually strengthens the human body and makes it more resilient — even exposure to radiation. We let our children get sick because it will strengthen their immune systems later in life; protecting them from all illnesses could prove fatal later on. Read about the Russian family Lost in the Taiga for 40 years, and how their eventual exposure to other people led to their deaths from disease.

The situation in climate change is somewhat similar. It is assumed that any climate change is bad, as if climate never changed before, or as if there is some preferred climate state that keeps all forms of life in perpetual peace and harmony.

But, if anything, some small amount of warming is probably beneficial to most forms of life on Earth, including humans. The belief that all human influence on the environment is bad is not scientific, but religious, and is held by most researchers in the Earth sciences.

In my experience, it is unavoidable that scientists’ culture, worldview, and even religion impact the way they interpret data. But let that bias be balanced by other points of view. Since CO2 is necessary for life on Earth, an unbiased scientist would take that into account before pontificating on the supposed dangers of CO2 emissions. That level of balance is seldom seen in today’s research community. If you don’t toe the line by getting research results that support desired government policy outcomes, you won’t get funded.

Over-regulation kills people

You might ask, what’s wrong with making our environment ever-cleaner? Making our food ever-safer? Making our radiation exposure ever-lower?

The answer is that it is expensive. And as any economist will tell you (except maybe Paul Krugman), the money we spend on such efforts is not available to address more pressing problems.

Since poverty is arguably the most lethal of killers, I believe we have a moral obligation to critically examine any regulations which have the potential of making poverty worse.

And that’s what is wrong with the Precautionary Principle, a popular concept in environmental circles, which states that we should avoid technologies that carry a potential risk of harm.

The trouble is that you also add risk when you prevent society from enjoying a technology’s benefits, based upon a risk-averse view of its potential side effects. Costs always have to be weighed against benefits. That’s the way everyone lives their lives, every day.

Are you going to stop feeding your children because they might choke on food and die? Are you going to stop driving your car because there are 40,000 automobile deaths per year?

Oh, you don’t drive? Well, are you going to stop crossing the street? That’s also a dangerous activity.

Every decision humans make involves cost-vs-benefit tradeoffs. We do it consciously and subconsciously.

Conclusions & Recommendations

In my opinion, we are an over-regulated society. Over-regulation not only destroys prosperity and jobs, it ends up killing people. And political pressures in government to perform scientific research that favors biased policy outcomes are part of the problem.

Science is being misused, prostituted if you wish.

Yes, we need regulations to help keep our air, water, and food reasonably clean. But government agencies must be required to take into account the costs and risks their regulations impose upon society.

Just as too much pollution can kill people, so too can too much regulation of pollution.

I don’t believe that cutting off funding for research into human causes of climate change is the answer. Instead, require that a portion of existing climate funding be put into investigating natural causes of climate change, too. Maybe call it a Red Team approach. This then removes the bias in the existing way such research programs are worded and funded.

I’ve found that the public is very supportive of the idea that climate changes naturally, and until we determine how much of the change we’ve seen is natural, we cannot say how much is human-caused.

While any efforts to reduce the regulatory burden will be met with claims that the new administration is out to kill your children, I would counter these objections with, “No, expensive regulations will kill our children, due to the increased poverty and societal decay they will cause. 22,000 children die each day in the world due to poverty; in contrast, we aren’t even sure if anyone has ever died due to human-caused global warming.”

Using a simple analogy, you can make your house 90% clean and safe relatively easily, but if you have to pay to make it 100% clean and safe (an impossible goal), you will no longer be able to afford food or health care. Is that what we want for our children?

The same is true of our government’s misguided efforts to reduce human pollution to near-zero.

U.S. Colder Now than All of Last Winter

Sunday, December 18th, 2016

If it seems like the current cold snap is unusual, you are right.

As of 7 a.m. EST this morning, Sunday, Dec. 18, the average temperature across the Lower 48 states of the U.S. is colder than at any time all last winter.

As this plot of hourly temperatures shows, the average temperature is 16 deg. F, which is 4 deg. colder than at any time last winter (graphic courtesy of Weatherbell.com):

Even the unusual warmth remaining in the Southeast U.S. is not enough to offset the frigid airmass much of the country is now experiencing. And the coldest part of winter is still six weeks away.

New Location for UAH Version 6 Text Files

Tuesday, December 13th, 2016

Now that our paper describing the UAH Version 6 methodology is in press (publication date unknown), the text files containing monthly global and regional deep-layer temperature anomalies are in a new location, without the “beta” identifier:

Lower Troposphere: http://vortex.nsstc.uah.edu/data/msu/v6.0/tlt/uahncdc_lt_6.0.txt
Mid-Troposphere: http://vortex.nsstc.uah.edu/data/msu/v6.0/tmt/uahncdc_mt_6.0.txt
Tropopause: http://vortex.nsstc.uah.edu/data/msu/v6.0/ttp/uahncdc_tp_6.0.txt
Lower Stratosphere: http://vortex.nsstc.uah.edu/data/msu/v6.0/tls/uahncdc_ls_6.0.txt

UAH Global Temperature Update for November 2016: +0.45 deg. C

Thursday, December 1st, 2016

November Temperature Up a Little from October; 2016 Almost Certain to be Warmest in the 38-Year Satellite Record

NOTE: This is the twentieth monthly update with our new Version 6.0 dataset. Differences versus the old Version 5.6 dataset are discussed here. The paper describing the methodology has been accepted for publication.

The Version 6.0 global average lower tropospheric temperature (LT) anomaly for November 2016 is +0.45 deg. C, up a little from the October value of +0.41 deg. C (click for full size version):

UAH V6.0 global average lower tropospheric temperature anomalies, 1979 through November 2016.

The global, hemispheric, and tropical LT anomalies from the 30-year (1981-2010) average for the last 23 months are:

YEAR MO GLOBE NHEM. SHEM. TROPICS
2015 01 +0.30 +0.44 +0.15 +0.13
2015 02 +0.19 +0.34 +0.04 -0.07
2015 03 +0.18 +0.28 +0.07 +0.04
2015 04 +0.09 +0.19 -0.01 +0.08
2015 05 +0.27 +0.34 +0.20 +0.27
2015 06 +0.31 +0.38 +0.25 +0.46
2015 07 +0.16 +0.29 +0.03 +0.48
2015 08 +0.25 +0.20 +0.30 +0.53
2015 09 +0.23 +0.30 +0.16 +0.55
2015 10 +0.41 +0.63 +0.20 +0.53
2015 11 +0.33 +0.44 +0.22 +0.52
2015 12 +0.45 +0.53 +0.37 +0.61
2016 01 +0.54 +0.69 +0.39 +0.84
2016 02 +0.83 +1.17 +0.50 +0.99
2016 03 +0.73 +0.94 +0.52 +1.09
2016 04 +0.71 +0.85 +0.58 +0.94
2016 05 +0.55 +0.65 +0.44 +0.72
2016 06 +0.34 +0.51 +0.17 +0.38
2016 07 +0.39 +0.48 +0.30 +0.48
2016 08 +0.43 +0.55 +0.32 +0.49
2016 09 +0.44 +0.49 +0.39 +0.37
2016 10 +0.41 +0.42 +0.39 +0.46
2016 11 +0.45 +0.41 +0.50 +0.37

To see how we are now progressing toward a record warm year in the satellite data, the following chart shows the average rate of cooling for the rest of 2016 that would be required to tie 1998 as warmest year in the 38-year satellite record:

UAH V6.0 LT anomalies, with the rate of cooling through the end of 2016 that would be required to tie 1998 as the warmest year.

Based upon this chart, it now seems virtually impossible for 2016 to not be a record warm year in the UAH dataset.
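A quick back-of-the-envelope check using the monthly anomalies tabulated above shows why. Taking 1998’s annual mean as approximately +0.48 deg. C, December 2016 would have to come in around -0.06 deg. C, a drop of about half a degree from November, merely to tie 1998:

# Monthly 2016 anomalies (deg. C) from the table above, January through November
jan_to_nov_2016 = [0.54, 0.83, 0.73, 0.71, 0.55, 0.34,
                   0.39, 0.43, 0.44, 0.41, 0.45]
target_1998 = 0.48            # 1998 annual mean, deg. C (approximate)

needed_december = 12 * target_1998 - sum(jan_to_nov_2016)
print(f"Jan-Nov 2016 mean: {sum(jan_to_nov_2016)/11:+.2f} deg. C")
print(f"December anomaly needed merely to tie 1998: {needed_december:+.2f} deg. C")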

UPDATE: It should be pointed out that 2016 will end up being 0.03-0.04 deg. C warmer than 1998, which is probably not a statistically significant difference given the uncertainties in the satellite dataset adjustments.

The “official” UAH global image for November, 2016 should be available in the next several days here.

The new Version 6 files (use the ones labeled “beta5”) should be updated soon, and are located here:

Lower Troposphere: http://vortex.nsstc.uah.edu/data/msu/v6.0beta/tlt/uahncdc_lt_6.0beta5.txt
Mid-Troposphere: http://vortex.nsstc.uah.edu/data/msu/v6.0beta/tmt/uahncdc_mt_6.0beta5.txt
Tropopause: http://vortex.nsstc.uah.edu/data/msu/v6.0beta/ttp/uahncdc_tp_6.0beta5.txt
Lower Stratosphere: http://vortex.nsstc.uah.edu/data/msu/v6.0beta/tls/uahncdc_ls_6.0beta5.txt