Even Though Warming Has Stopped, it Keeps Getting Worse?

March 9th, 2015 by Roy W. Spencer, Ph.D.

I was updating a U.S. Corn Belt summer temperature and precipitation dataset from the NCDC website, and all of a sudden the no-warming-trend-since-1900 turned into a significant warming trend. (Clarification: the new warming trend for 1900-2013 is still not significantly different from zero at the 90% confidence level. H/T, Pat Michaels)

As can be seen in the following chart, the largest adjustments were to earlier years in the dataset, which were made colder. The change in the linear trend goes from 0.2 deg F/century to 0.6 deg F/century.
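The trend comparison above can be reproduced in spirit with an ordinary least-squares fit. The sketch below is illustrative only (the series is synthetic, not the NCDC Corn Belt data): it shows how a slope in deg F/century, and a t-statistic against a zero-trend null, would be computed.

```python
import math

def trend_per_century(years, temps):
    """OLS slope (in deg per century) and t-statistic vs. a zero-slope null."""
    n = len(years)
    xbar = sum(years) / n
    ybar = sum(temps) / n
    sxx = sum((x - xbar) ** 2 for x in years)
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(years, temps))
    slope = sxy / sxx                                   # deg per year
    resid = [y - (ybar + slope * (x - xbar)) for x, y in zip(years, temps)]
    se = math.sqrt(sum(r * r for r in resid) / ((n - 2) * sxx))
    return slope * 100.0, slope / se

# Synthetic stand-in series: a 0.3 deg/century ramp plus small wiggles.
years = list(range(1900, 2014))
temps = [0.003 * (y - 1900) + 0.05 * math.sin(2.7 * y) for y in years]
slope100, t_stat = trend_per_century(years, temps)
```

A trend is "not significantly different from zero at the 90% level" when |t| stays below the critical value (about 1.66 for this many degrees of freedom); autocorrelation in real temperature series widens the interval further, making significance even harder to reach.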


I know others have commented on the tendency of thermometer data adjustments by NOAA always leading to greater warming.

As Dick Lindzen has noted, it seems highly improbable that successive revisions to the very same data would lead to ever greater warming trends. Being the co-developer of a climate dataset (UAH satellite temperatures) I understand the need to make adjustments for known errors in the data…when you can quantitatively demonstrate an error exists.

But a variety of errors in data measurement and collection would typically have both positive and negative signs. For example, orbit decay causes a spurious cooling trend in the satellite lower tropospheric temperatures (discovered by RSS), while the instrument body temperature effect causes a spurious warming trend (discovered by us). The two effects approximately cancel out over the long term, but we (and RSS) make corrections for them anyway since they affect different years differently.

Also, the drift in satellite local observation time associated with orbit decay causes spurious cooling in the 1:30 satellites, but spurious warming in the 7:30 satellites. Again this shows that a variety of errors typically have positive and negative signs.

In contrast, the thermometer data apparently need to be adjusted in such a way that almost always leads to greater and greater warming trends.

How odd.

176 Responses to “Even Though Warming Has Stopped, it Keeps Getting Worse?”


  1. MarkB says:

    Here is an excellent overview by Zeke Hausfather of why this is so:

    • Johan says:

      Not really. It merely explains why adjustments are needed (not contested by Dr. Spencer). It doesn’t explain why adjustments always have to lead to greater and greater warming trends. Neither does it prove that their methods are above criticism.

      • Kasuha says:

        It explains nicely why there’s a gradual 0.5 degree warming change due to the ToD adjustment. And it more or less corresponds to the difference shown here.

      • MarkB says:

        Time of observation is the dominant adjustment. Note that Zeke’s figure 6 is essentially the same as Dr Spencer’s second figure. The science behind this is well documented as in the links in Zeke’s article and is explained further in this post: http://judithcurry.com/2015/02/22/understanding-time-of-observation-bias/ Frankly I find it remarkable that Dr Spencer would be unaware of this.

        • Johan says:

          What about the quantitative uncertainty of the TOBs (see Roger Pielke Sr’s and also Judith Curry’s concerns)? Never saw a satisfactory answer to that.

          • MarkB says:

            The result from Karl et al, for example, suggests that the uncertainty in the TOBs adjustment is considerably less than the expected bias in unadjusted data. I’m not clear whether there is some objection to a particular result like that or if I’m missing your point.

          • Johan says:

            My point is explained a lot better below by Kevin Roche (March 9, 2015 at 12:45 PM):

            “If the raw data is that bad, not clear to me that there is any formula for adjustment that doesn’t end up with wider error ranges. Somehow, we never seem to see a strong and prominent statement in all the press releases, blog posts, comments, etc. that there is great uncertainty in these adjusted temperature records.”

          • David S says:

            MarkB is either a moron or a troll. Ignore him.

        • I’m not unaware of it Mark. The Tobs adjustment has been around for years. But why does the *latest* adjustment (in the last year) add yet *another* 0.4 deg F warming to the record?

          Frankly I find it remarkable that you don’t see the distinction.

          • Curious George says:

            Why? Because they are paid for it. Simple.

          • geran says:

            MarkB has his belief system, which is failing, so it is okay (in his mind) to “adjust” the observations to match his false science.

          • MarkB says:

            Mea culpa, I didn’t realize on the first reading that you were comparing ostensibly the same data drawn from the data base a year apart. It’s clear after looking more closely at the annotations on the graphs what you’ve done, and I agree that it is curious.

            Do you have similar data archived from other regions and does it show a similar effect?

          • BBould says:

            Perhaps this curiosity should be passed on to Congress; aren’t they looking into this?

        • KTM says:

          Zeke posted the following graph in a recent WUWT discussion.


          What that tells me is that they are making adjustments in the most error-prone way possible.

          For tracking temperature trends, they report temperature anomalies, not absolute temperatures. If they just want to track the long-term trend itself I don’t see why they couldn’t use only the minimum or maximum temperature anomalies rather than averaging them.

          If there was a station that was recording temperatures in the afternoon, that graph shows a nice wide plateau where the Minimum temperature reading is unaffected by the actual time in the afternoon that the reading was taken. The Maximum temperature, however, is very prone to TOBS error in the afternoon. In other words, they have one good dataset and another poor dataset.

          If this station then shifts its temperature reading to the morning, you have the opposite situation. The daily Maximum reading takes place in a nice broad plateau where it is unaffected by TOBS but the minimum temperature is very prone to TOBS errors. Again, one good dataset and one poor dataset.

          Their approach is to take both datasets, adjust them, then average them. They are taking good data and then polluting it with poor data they have tried to rehabilitate using a generic one-size-fits-all formula. That produces the average curve in the above figure, which swings wildly throughout the day and is more prone to errors than either Min or Max individually.

          I would love to see a chart where the anomaly from individual stations was generated by producing an anomaly from the good datasets only, Minimum during periods when afternoon readings were taken then Maximum during periods when morning readings were taken, and spliced together one time when the TOBS changed for that particular station. Then compare this “unadjusted” anomaly against the official TOBS adjusted one.
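KTM's proposed splice could be sketched roughly as follows. Everything here (the function name, the toy data, and the choice to offset the later segment so the series is continuous at the changeover) is my own illustration of the idea, not any agency's actual method.

```python
def splice_anomalies(tmin_anom, tmax_anom, switch_idx):
    """Use Tmin anomalies while afternoon readings were taken (Tmin is on
    its flat TOBS plateau then), and Tmax anomalies after the switch to
    morning readings, shifting the later segment so the spliced series is
    continuous at the changeover."""
    first = tmin_anom[:switch_idx]
    # Align the Tmax segment to the level of the last Tmin point.
    offset = first[-1] - tmax_anom[switch_idx - 1]
    second = [x + offset for x in tmax_anom[switch_idx:]]
    return first + second

# Toy station: TOBS changes at index 2 (afternoon -> morning readings).
spliced = splice_anomalies([0.0, 0.1, 0.2, 0.3], [1.0, 1.1, 1.5, 1.6], 2)
```

Since anomalies discard the absolute offset anyway, the splice only has to preserve year-to-year changes within each segment; the open question MarkB raises below it is whether Tmin and Tmax trends differ enough to bias the result.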

          • Dan W. says:


            Very well explained. As it concerns the reporting of an annual average temperature, this would be better done, as you explain, by computing an average annual Tmin and Tmax and then averaging those. It would also be of value for the average Tmin and Tmax to be reported with just as much emphasis as the Tavg, for this data is meaningful for explaining why the average is changing!

            Of course the whole exercise of constructing a historical temperature record is fraught with issues. Zeke knows this but he plows ahead with the attempt anyway. I find it interesting that in the financial world the composition of the stock indices (the DOW and S&P, etc.) changes regularly. When this happens the index calculation is adjusted so there is no discontinuity going forward. But the indices are never recalculated going back! When AT&T is removed from the DOW average no one is going to reconstruct the DOW for past years as if AT&T was never in it. The index / average is accepted for what it was at the time it was computed.

          • MarkB says:

            Creating a split when each station’s TOBs changes is the approach used in the BEST series, rather than applying an explicit adjustment. My understanding is that they are using average temperatures though rather than your concept of using the “better” of the min or max temperature’s anomaly. One downside of your proposed approach is that the trend is slightly different for the daily minimum and the daily maximum temperatures as per NCDC US data, so there will still be an implicit bias in the technique.

        • An Inquirer says:

          MarkB, your potshots such as March 9, 2015 at 10:48 AM might have some intended impact on your ego or on the uninformed. However, among those of us who understand the adjustment process and what is being done by whom, your potshot only undermines your credibility.

    • JohnKl says:

      Why massage two different data-sets into one? If the time of observation changed (say in 1950 or 1960; the link seemed uncertain, in that 1960 appears most often but 1950 appeared once in one of your links), simply report two different data sets, noting the change of observation time from afternoon to morning as a changed parameter. Nothing excuses altering the data.

      Btw, the link itself provides ample evidence one should not give this data-set much credence when it comes to climatic projections. Apparently, the time of observation followed little if any systematic rule other than being within a couple of hours (5-7 in the evening or 7-9 in the morning) of each other and recording the max and min temperature.

      The apparent delusion only seems to grow when the link states:

      “…assumes that the current set of instruments recording temperature is
      accurate, so any time of observation changes or PHA-adjustments are done
      relative to current temperatures. Because breakpoints are detected through
      pair-wise comparisons, new data coming in may slightly change the magnitude
      of recent adjustments by providing a more comprehensive difference series
      between neighboring stations.

      When breakpoints are removed, the entire record prior to the breakpoint is
      adjusted up or down depending on the size and direction of the breakpoint.
      This means that slight modifications of recent breakpoints will impact all
      past temperatures at the station in question though a constant offset. The
      alternative to this would be to assume that the original data is accurate,
      and adjusted any new data relative to the old data (e.g. adjust everything
      in front of breakpoints rather than behind them). From the perspective of
      calculating trends over time, these two approaches are identical, and its
      not clear that there is necessarily a preferred option.”

      Frankly this statement should concern any rational mortal over the age of 2, toddlers included. The best alternative would be to make NO ADJUSTMENTS AT ALL and ASSUME NOTHING ABOUT THE ACCURACY OR INACCURACY OF CURRENT OR PRIOR DATA!!!

      If you accept Zeke’s dross as an intelligible explanation you may wish to seek help.

      Have a great day!

      • Nefynhop says:

        JohnKl, “make NO ADJUSTMENTS AT ALL and ASSUME NOTHING ABOUT THE ACCURACY OR INACCURACY OF CURRENT OR PRIOR DATA!!!”….exactly! Would love to see the original unaltered data. Does anyone have this or has anyone produced a chart using this data?

        • JohnKl says:

          Hi Nefynhop,

          “Does anyone have this or has anyone produced a chart using this data?”

          Good question. Over a considerable period of time I’ve asked for such data and don’t recall ever seeing it, but if you add your voice to the chorus it may one day happen. The ice will eventually thaw, much to the chagrin of the warmists. Supposedly NOAA set up domestic temp monitors throughout the country in 2005. The data was not supposed to be “ADJUSTED” at all. In 2014 they announced the result: continental U.S. temps dropped some fraction of a degree centigrade. You may wish to look it up; my memory isn’t clear as to the exact amount. In any case, ACTUAL measurements could clear up a lot of BRAIN-WASHING!!!

          Have a great day!

    • George E Smith says:

      I’m supposed to believe from that second graph that Temperature is quantized, and can only change in integral steps of 0.1 deg F.

      Pray tell how quantum mechanics ever became familiar with the Fahrenheit Temperature scale, so that Temperons are exactly 0.1 deg. F ??

      Amazing Dr. Roy !!

    • Gordon Robertson says:

      @Mark B….a quote from the article, “…that critical analysis should start out from a position of assuming good faith and with an understanding of what exactly has been done”.

      That might have applied before the Climategate email scandal exposed top IPCC officials scheming behind the backs of the public to fix peer review and adjusting cooler temperatures to look warmer (Mike’s Nature trick). Or before Pachauri got caught sexually harassing a female employee.

      It might have applied before Hansen started raving madly that skeptics should be jailed. It might have applied before NOAA began denying its own satellite data as processed by UAH and manipulating the surface data to show a warming that is not there.

      More on NOAA/NCDC (use the links on the page to get further detail):


      If you have time, here is a video with the guy from the link (Michael Smith); you’ll have to find Part 2, which is normally on the same page as the video:


      A quote from the first link:

      “NOAA / NCDC have Fudged and Corrupted the Input Data Series”

      The guy making this statement has expertise in analyzing the kind of data NOAA/NCDC put out.

      Among other things, they have cut back surface reporting stations from 5000 to 1000, then applied interpolation to fill in the missing stations. Anyone with half a brain knows that will lead to unnatural anomalies if not outright lies, given that the stations used for interpolation are up to 1200 miles apart.

      When unnatural anomalies do show up, NOAA/NCDC ‘homogenize’ the data to smooth out the irregularities….using climate models.

      NOAA/NCDC has moved outside the realm of science into the social sciences, where past situations involving humans are guessed at using fossils and the like. Social scientists like anthropologists are forced to do that, but NOAA/NCDC has absolutely no excuse for adjusting past temperatures to enhance their diabolical and mythical claims about global warming.

    • Whenever some impossible to justify adjustment is pointed out to Zeke, Zeke will usually concede that it’s a problem, but then hand wave it away, as not being especially important. Or in other words, “trust me”.

      In most sciences, and certainly in engineering, if you make changes you can objectively test the consequences of those changes. And when errors are pointed out to you, you’re usually sent back to the drawing board. Not so with historical temperature adjustment. There is no way to know if the adjustments made things better or worse, or at least, there is limited interest in finding out. They can only argue that their adjustments confirm their assumptions or their model of what they think should have happened. It’s the perfect sort of government job.

    • An Inquirer says:

      Yes, Zeke did an overview, but I do not know if I would call it an excellent one. Zeke does not address fundamental problems or weaknesses with the adjustment process. Moreover, he has given zero response to the issues that I have raised.

      P.S. The motivation to adjust — or the belief that one can improve a dataset via adjustments — is understandable. But that motivation in itself does not mean that the adjusted data set will be more reliable than the unadjusted one. Adjustments to the satellite data set are in a different ball game than adjustments to the surface data set. There are tests that can be performed to see if the data set has been improved, and the tests that I have seen so far indicate the surface data set has not been improved — and most likely it is less reliable.

      • JohnKl says:

        Hi Inquirer,

        “P.S. The motivation to adjust — or the belief that one can improve a dataset via adjustments — is understandable.”

        Not really, since it takes FAITH to BELIEVE the measurements and the actions of those involved in taking the measurements and providing the DATA will produce results that bear any relation to reality. This seems especially true when you consider that so frequently the RAW UNADJUSTED DATA SET DOESN’T GET PRESENTED ALONGSIDE THE ADJUSTED DATA SET!!! When all one presents proves to be adjusted, massaged or in some way tidied up for the proper viewing of the credulous the ability to take it seriously drops exponentially.

        Have a great day!

      • JohnKl says:

        G.K. Chesterton — ‘Reason is itself a matter of faith. It is an act of faith to assert that our thoughts have any relation to reality at all.’

    • hunter says:

      Bunk on that. There is no legitimate need to readjust the historical data. The obvious goal is to fabricate support for the climate consensus.
      The problem with so-called “noble cause corruption” is that it is always corrupt, and eventually the “noble” cause gets forgotten, leaving only the corruption.

  2. Johan says:

    How so odd? How else could reality conform to models?

  3. With satellite data now in play, this nonsense can no longer take place in a void; other data sources will be able to expose the practice in the future, should it occur again.

    • Pete says:

      I find it interesting that interpretation of satellite data is farmed out to UAH and RSS. Why doesn’t NOAA provide its own in-house interpretation? This seems to be an arm’s-length relationship between NOAA and UAH/RSS, which allows some to point the finger at “them” when results don’t line up.

      Of course, if NOAA did take over complete responsibility, I then would no longer believe in what I was being told, so it is a dilemma.

      • NOAA has developed their own satellite dataset, but it’s not used that much for some reason. We have basic disagreements with them over their approach to how you intercalibrate satellites, and how you calibrate the older MSU instruments.

        • Gordon Robertson says:

          Roy…thanks for that clarification. My fear is that NOAA will allow the satellites to fail while not replacing them.

          • RWturner says:

            It would be very interesting to read the documents and/or testimony to congress explaining why the MSU satellites were needed in the first place. I assume that the lobbying for the expenditure came from NOAA or GISS and included something about the land-based data not being sufficient for obtaining an accurate global average temperature anomaly.

  4. Ben Palmer says:

    Dear Roy, let me explain it to you. Let’s pick a measurement point in the graphics above, say the measurement where your arrows point. Climatologists (or meteorologists?) periodically verify their databases to make sure they really have taken into account all the potential sources of errors. For this specific measurement they probably found, after March 2014 and before March 2015, a log indicating that the guy responsible for taking the measurements had often forgotten to put on his glasses and was systematically reading too high a number. A correction by say 0.2 degrees is certainly in order to keep the records on track.

    • sorry, Ben, but I refuse to give up my sarcasm crown to you.

    • Gordon Robertson says:

      @Ben Palmer …”A correction by say 0.2 degrees is certainly in order to keep the records on track”.

      Not as funny as you might think. There was such a 0.2 C warming circa 1977, which led to the discovery of the Pacific Decadal Oscillation. The PDO caused the 0.2 C shift.

      One scientist, who I cannot recall at the moment, claimed later that the 0.2 C shift made no sense and must be a mistake. He was for expunging the record to remove the 0.2 C sudden warming.

      We had another sudden shift of 0.2C circa 2001, which plainly shows up on the UAH graph. If I remember correctly, John Christy of UAH suggested it may have been due to a rebound effect from the 1998 El Nino extreme.

      That would make perfect sense if the atmosphere had harmonic properties of some kind whereby a sudden warming spike over a short period, like an impulse wave in electronics, could cause the system to oscillate over a period of years.

      The atmosphere is bound to have internal feedbacks combined with the effects of the oceans and the land. I’m talking about the kind of feedback one might find in a servomotor system, not a positive feedback requiring amplification.

      A servomotor system uses voltages that are fed back from a sensor to correct deviations from a desired outcome. I don’t see why the atmosphere could not have such a system in effect, where changes in one aspect of the atmosphere are fed back to adjust for deviations in another part.

      That warming spike in late 1997 from the El Nino raised global temperatures briefly more than any other force in the previous century. That degree of sudden warming may have led to the following 0.2C spike in 2001, which leveled off at about 0.25C above the 1980 – 2010 average.

      If you look at Roy’s running average (red line) it certainly has the appearance in places of an oscillating system. The thing with an oscillator in electronics is that it dies off exponentially…unless it receives another spike of voltage to re-enable it. El Ninos in between, like the 2010 spike, may be doing just that.
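The electronic analogy invoked here is the standard damped oscillation; as a generic sketch (the symbols are illustrative, not fitted to any temperature series):

```latex
x(t) = A\, e^{-t/\tau}\cos(\omega t + \phi)
```

where the time constant \tau sets how fast the ringing dies off; each fresh impulse (a new El Nino, in this analogy) resets the amplitude A, so the oscillation can persist as long as impulses keep arriving.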

      I find it frightening that people like NOAA/NCDC could go back into temperature history and rewrite it based on a purely mathematical homogenization. To do that completely ignores natural effects like the PDO and ENSO.

      • KevinK says:

        Gordon wrote;

        “The atmosphere is bound to have internal feedbacks combined with the effects of the oceans and the land. I’m talking about the kind of feedback one might find in a servomotor system, not a positive feedback requiring amplification.”

        Negative feedback as used in servomotors and the thermostat in your residence require amplification to function. The thermostat “calls for heat” which amplifies the energy content of your residence (by converting some other form of energy; oil, natural gas, electricity) into heat with a furnace (or baseboard heater). Once the temperature has risen the thermostat “cuts off the heat” which is a negative feedback. Your furnace amplifies the “call for heat” signal from the thermostat. Same concept for a servo motor, some sensor “calls for faster” and an amplifier supplies more energy to a motor.

        “A servomotor system uses voltages that are fed back from a sensor to correct deviations from a desired outcome. I don’t see why the atmosphere could not have such a system in effect, where changes in one aspect of the atmosphere are fed back to adjust for deviations in another part.”

        In my opinion the atmosphere has no gain mechanism. It has resonances and time delays. These cause periodic cycling of the temperature which might look like a feedback mechanism but they are not. The climate science community has misunderstood this basic concept.

        The wheel of your car resonates up/down (with respect to the frame) when you hit a pothole. This frequency is set by the spring between the frame and the wheel. Once energized the wheel “bounces” a few times before the shock absorber dampens out the resonant frequency. No feedback is present. There is no feedback in the atmosphere; positive, negative, or using imaginary numbers….

        Cheers, KevinK.

        • Gordon Robertson says:

          @Kevin…”In my opinion the atmosphere has no gain mechanism”.

          Agree totally.

          In the following I am preaching to the converted so please don’t take me as trying to tell you about engineering. You likely know far more than I’ll ever know.

          My point in referring to servomotors (actually servomechanisms) is that you don’t need the feedback signal to be part of a gain stage in a servo. With true positive feedback, an amplifier is required, the feedback signal is part of the gain stage and without gain it cannot function.

          An amplifier depends on external power to achieve gain (amplification). The transistor (or vacuum tube) operates by having a small input current or voltage control a larger output current which is supplied by a power supply external to the transistor circuit.

          You don’t get something for nothing. Amplification in electronics depends entirely on current (power) supplied from a wall socket (or battery…which is already DC), rectified to DC to drive the transistors. The power supply also supplies all the currents used in the amplification process. Turn off the power and amplification is not possible.

          How anyone can claim that back-radiation from GHGs in the atmosphere can possibly cause a positive feedback that increases surface temperature is beyond me.

          A mistake that is made about positive feedback, and the mistake is quite prevalent on the Net, is that positive feedback produces amplification. Not so. The amplification has to be provided independently of the feedback signal and it is entirely dependent on externally supplied power.

          I think it’s important to understand the difference in meaning between the positive feedback used as part of an amplified system and the feedback used in a servomechanism. They are not the same. The names are the same, but the two concepts are more homonyms than equivalents.

          I was implying in my comment that maybe systems similar to a servomechanism exist in the atmosphere that could explain the 1998 El Nino producing a 0.25C temperature spike in 2001. Such systems would involve negative feedbacks, however. Tsonis et al have already claimed that the major oceanic oscillations like the AMO, PDO, and ENSO can do something like that to control warming/cooling.

          That could tie into this article and it concerns me that warming/cooling periods are being written off as statistical errors whereas they may be due to natural oscillations, possibly related to negative feedbacks. Tsonis concluded that before we go wasting time on anthropogenic warming theory we should investigate natural oscillations in the oceans.

          Positive feedback in an amplified system involves a fractional sample from an amplified output signal being fed back and mixed ‘in-phase’ with the input signal of an amplifier. The summed signals are amplified again and in each subsequent cycle, the output signal increases exponentially till something blows, unless it is designed like an electronic oscillator to sustain rather than increase its amplitude.

          Feedback in a servomechanism does not require gain and is nothing more than a +ve or -ve control signal indicating an error. It is used to indicate to a controlling device which way to position something or to change a motor speed, etc. Such a feedback can never run away into destructive oscillation or behave like the tipping point suggested by Hansen for the atmosphere.
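The runaway loop described above can be put in numbers with a toy iteration (my own illustration, not from the thread): each pass amplifies the input plus a fraction beta of the previous output. With loop gain A*beta below 1 the output settles at the closed-loop value A/(1 - A*beta); at or above 1 it grows without bound, the "till something blows" case.

```python
def iterate_feedback(A, beta, x=1.0, steps=60):
    """Repeatedly feed a fraction beta of the output back, in phase,
    into an amplifier of gain A driven by a constant input x."""
    out = 0.0
    for _ in range(steps):
        out = A * (x + beta * out)
    return out

settled = iterate_feedback(A=2.0, beta=0.25)            # loop gain 0.5: converges to 4
runaway = iterate_feedback(A=2.0, beta=0.75, steps=20)  # loop gain 1.5: diverges
```

Note that the divergence requires the gain A, supplied by external power, exactly as the comment argues; a bare error signal with no amplifier (the servo case) has nothing to run away with.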

          Is it too far out to theorize that natural mechanisms similar to a servomechanism exist in the atmosphere-ocean-land system?

          I think it’s safe to claim that all processes in the atmosphere involve negative feedback. True positive feedback is not possible. I know I’ll be hated for this by some but the 2nd law of thermodynamics was produced by Clausius to prevent positive feedback in heat engines. Carnot had implied there were no losses in heat engines and the 1st law would allow perpetual motion type feedbacks in some cases.

          It’s erroneous to think that a feedback signal can simply amplify itself, causing amplification. It can in certain situations in nature, like the resonance associated with harmonic motion. The Tacoma Narrows Bridge disaster is a perfect example of that.

          Wind blowing through the under-damped suspension cables on the bridge caused them to vibrate like a guitar string. Because they were under-damped, the cables contributed to the natural resonance in the bridge, which increased till the bridge tore itself apart.


          The wind is not causing that bridge deck to vibrate with harmonic motion but the wind set it off by vibrating the suspension cables. The rest is due to a natural amplification related to resonance.

          There is no such natural mechanism in the atmosphere nor is there the amplification required to produce a positive feedback like the squeal one hears in a PA system when the mic is too close to the speakers. To prove that squealing is not producing the amplification, turn the amp off and it disappears instantly. You will never get such feedback with an acoustic guitar, although you can get good resonance and some sustain.

          You might even claim that servo-like mechanisms exist already in the atmosphere albeit with negative feedback and no positive feedback. People talk about effects of albedo, for example, but they seem to mistake such a system with positive feedback, which can run away destructively.

          I think a mistake has been made in the alarmist position that a tipping point is in the offing due to a positive feedback generated by anthropogenic CO2. I think people who claim that do not understand true positive feedback and the requirement of gain.

          Here is part of an exchange between engineer Jeffrey Glassman and Gavin Schmidt of NASA GISS. Half way down the page is a section titled “GAVIN SCHMIDT ON POSITIVE FEEDBACK”. It’s amazing that a mathematician programming climate models for NASA GISS does not understand positive feedback. Nor does it seem his former boss James Hansen did either.


          • Mike M. says:

            Gordon Robertson wrote: “Feedback in a servomechanism does not require gain and is nothing more than a +ve or -ve control signal indicating an error. It is used to indicate to a controlling device which way to position something or to change a motor speed, etc.”

            That is the way “feedback” is used by the climate modelers. Except, of course, that it is not a control signal created by an intelligent designer. I am not saying that is a good choice of terminology, only that it is the terminology used.

            “Such a feedback can never run away”

            What if you make a sign error in your programming or in wiring your circuit? For example, I might carelessly set things up so that a furnace temperature that is too high results in an increased current going to the heater. That would be a “positive feedback” in the sense used by the modelers.

            The dominant feedback is the Planck feedback (Stefan-Boltzmann Law). The total of all feedbacks is negative, although some of the individual terms are positive. To make things more confusing, they often use the term feedback to refer to all the contributions except the Planck feedback. Then “positive feedback” means “less negative overall than the Planck feedback alone” and “negative feedback” means “more negative overall than the Planck feedback alone”. Confusing language and possibly an indicator of muddled thinking, but not in itself wrong.
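The Planck feedback mentioned here follows directly from differentiating the Stefan-Boltzmann law; as a rough sketch, evaluated at the conventional effective emission temperature of about 255 K (a standard textbook value, not a number from this thread):

```latex
F = \sigma T^{4}
\quad\Longrightarrow\quad
\lambda_{\mathrm{Planck}} = -\left.\frac{dF}{dT}\right|_{T \approx 255\,\mathrm{K}}
= -4\sigma T^{3} \approx -3.8\ \mathrm{W\,m^{-2}\,K^{-1}}
```

The negative sign is what makes it stabilizing: a warmer emitter radiates more, opposing the perturbation.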

            So far as I can tell, the tipping point stuff is alarmist speculation, unsupported by either observation or models.
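The Planck (Stefan-Boltzmann) feedback mentioned above can be sketched with a back-of-envelope linearization. This is only an illustration: the ~255 K effective emission temperature is a common textbook assumption, not a figure taken from this thread.

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def planck_feedback(temp_k):
    """dF/dT of F = sigma * T^4: the extra thermal emission per degree
    of warming, i.e. the stabilizing (negative) feedback discussed above."""
    return 4 * SIGMA * temp_k ** 3

# At an effective emission temperature of ~255 K:
print(planck_feedback(255.0))  # ~3.76 W/m^2 per K of warming
```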

  5. Alan Poirier says:

    I find it extremely odd that the past temperatures should be repeatedly changed. There have been successive alterations to the land based data sets and each time the past cools. It is truly amazing. It is impossible to have any faith in these data sets.

  6. Thanks, Dr. Spencer.
    I don’t find the fact that data adjustments add up to more global warming. They give support to the dying hypothesis of catastrophic anthropogenic global warming. Intensive care is required.

  7. Sorry, I meant to write:
    I don’t find odd the fact that data adjustments add up to more global warming.
    I would find it odd if they did not.

  8. denny adams says:

    I don’t see just the adjustments as the problem. As Lindzen observes, why do they always seem to be adjusted for greater warming? On a very large time scale, it should even out but it seems not to.

  9. Steve Case says:

    There are some old GISTEMP pages available on the Internet Archive’s WayBack Machine. I found the meteorological-stations-only table data from August 2005 and compared the differences to the current version, December 2014. Looks like this:


    • Svend Ferdinandsen says:

      A good place to look at the adjustments is climate4you.com
      Look for “Change over time of global monthly temperature estimates:”

    • RB says:

      The thing that seals it for me that the adjustments were biased is the changes between 1910 and 1940. One of the biggest bits of evidence that the warming during the 20th century was natural is that the rate of warming pre-SUV was just as high as at the end of the century. It’s not a coincidence that they seem to have tried to flatten it out.

  10. Scott says:

    In addition, the surface data sets are making adjustments to temperatures that were recorded by others long before they were born. How can they possibly know, for sure, what went on in a park in 1900?

  11. Alan Davidson says:

    Simple explanation, it is co-ordinated fakery. Same thing has been observed and reported in Australia, New Zealand, Paraguay, Iceland, Arctic etc.

  12. dave says:

    ” …adjustments…”

    The philosopher turns away, and says “Oh! It is the old game.”

  13. Kristian says:

    “Being the co-developer of a climate dataset (UAH satellite temperatures) I understand the need to make adjustments for known errors in the data…when you can quantitatively demonstrate an error exists.”

    Glad to see you point this out, Roy:

    UAH need to adjust their tlt product

  14. Lewis says:

    Along that line,

    When I drive home from town at night, the automobile temperature gauge almost always drops 4 or 5 degrees F in the 10 miles it takes me to get home. Do I need to adjust my temperature at home up, or the one in town down, in order to get the correct temperature?

    On a side note, the airport would be about halfway home, so is the temperature there the ‘normal’?

  15. kevin roche says:

    I have read Zeke’s papers trying to explain the rationale for their adjustments that cool recorded temperatures in the earlier decades of the 20th century. What I have not seen is any explanation of why the adjustments in the later decades warm recorded temperatures, which presumably don’t have the supposed TOBS issue. And while Zeke, whose efforts and approach I generally appreciate, describes the recorded US temperatures as very bad, he is unfortunately very reticent about acknowledging how unlikely it is that anyone attempting to adjust this record is getting any closer to the “truth”. If the raw data are that bad, it is not clear to me that there is any formula for adjustment that doesn’t end up with wider error ranges. Somehow, we never seem to see a strong and prominent statement in all the press releases, blog posts, comments, etc. that there is great uncertainty in these adjusted temperature records. And that is just for TOBS; there are innumerable other issues, like UHI, that are just as troublesome. For the purpose of trends, it is not clear to me that the original recorded temperatures are any worse than the adjusted ones.

  16. Ulric Lyons says:

    Even without the adjustments there would be an upward trend starting from a cold AMO mode and ending on a warm AMO mode, because of changes in rainfall:
    Increased forcing of the climate (positive North Atlantic Oscillation) makes the region wetter and cools it. Negative NAO in summer months causes drought and heat like 2012. Decades of strongly negative NAO would put sand dunes across the great plains.

  17. Keith says:

    Dear Roy,

    As you mention, many bloggers have commented on this.

    Elsewhere, bloggers have noted that individual or groups of weather station data which show decreasing temperature trends over time have been converted into warming trends through the adjustments made by NCDC. Also, as you point out here, the adjustments exacerbate this phenomenon over time.

    Your UAH data set is independent of this. Why do you think the final NCDC / GISS temperature trend is relatively close to the UAH or RSS trend, when, as you and others mention, negative trends are often turned into positive trends by the adjustments?

    One would have thought that the satellite trends would see something similar to the original trends before adjustments, or at least the trends as shown 10 or 15 years ago.

    I would really appreciate your view.

    Thank you and best wishes, Keith

    • Gordon Robertson says:

      @Keith “Why do you think the final NCDC / GISS temperature trend is relatively close to the UAH or RSS trend…”

      You did not indicate the respective trends. A trend bandied about for UAH is 0.14 C/decade, which alarmists use to compare to trends like GISS. That 0.14 C represents a 33-year trend that began in a negative anomaly region and did not reach a positive anomaly region till two-thirds of the way through the range. At that point, the El Nino of the century struck, driving global temps as high in a few months as any other force in the previous century.

      This is all explained in the UAH 33-year report. In the report, it is explained that after the cooler part of the 33-year range is adjusted, the UAH warming trend is about 0.09 C/decade.

      Here’s the difference: 3.3 decades x 0.14 C/decade = 0.462 C over 33 years, whereas 3.3 decades x 0.09 C/decade = 0.297 C, which is about what you see on Roy’s graph.

      You have to be extremely careful with trends in that you also have to explain the context. For example, the first 17 years of the UAH trend were affected by the cooling of volcanic aerosols. The last 16 years have seen two extreme El Ninos (late 1997 and 2010).

      The GISS graph shows none of that. It shows a gradual positive trend increasing till 1998, then leveling off. There are no negative anomalies shown for the period involving volcanic aerosols. In other words, their trend suggests a continual, true warming since 1950, whereas UAH shows variable cooling from 1978 – 1997.

      Of course, the GISS graph uses a much different baseline but that does not explain how they came about a purely positive trend through the years 1978 – 1997.

      IMHO, there is no comparison between the trend of GISS and UAH.
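The arithmetic in the comment above is easy to check; the 0.14 and 0.09 C/decade figures are those quoted there, not independently verified.

```python
# Apply the quoted decadal trends (deg C per decade) over the 33-year record.
years = 33
for trend in (0.14, 0.09):
    total = (years / 10) * trend
    print(f"{trend} C/decade over {years} years -> {total:.3f} C")
```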

  18. Gunga Din says:

    For my little spot on the globe the 2002 and 2007 lists of record highs and lows show no new records set between 2002 and 2006. The list from 2013 shows 6 record highs and 4 record lows set in that same period.
    If we’re not sure of the past temperatures, how can they be sure of the future temperatures?

  19. Pat Michaels says:

    Roy–you use the word “significant” to apply to the JJA data, but actually it applies to the trend adjustment. I’d be very surprised if the trend in the JJA data itself has become significant.


  20. Joel Shore says:


    You use your own UAH data set as an example, saying there have been both positive and negative corrections. However, the fact is that the positive corrections have dominated.

    Several years ago now, I looked at how much of the change in the trend for the UAH data set was due to the longer record and how much was due to the changes in analysis. Here’s the result:

    Spencer & Christy’s pre-1998 analysis method gave a trend of -0.076 C / decade for the Jan 1979 – Apr 1997 data (as per their 1998 paper: http://ams.allenpress.com/archive/1520-0442/11/8/pdf/i1520-0442-11-8-2016.pdf ); their current analysis gives +0.029 C / decade for that same data. That is a change in trend of +0.105 C / decade due solely to changes in their analysis.

    Since the trend for the full data set we now have through Dec 2008 is +0.127 C / decade, the change due to the longer time series is +0.098 C / decade.

    Thus, a tiny bit over half of the change in trend is due to changes in the analysis, not the longer data series.
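The decomposition described in this comment can be restated as simple arithmetic. The trend figures are those quoted above; this is only a restatement of the commenter's numbers, not an independent check of the underlying datasets.

```python
# Trend figures quoted in the comment above (deg C per decade).
old_trend_same_period = -0.076  # Jan 1979 - Apr 1997, pre-1998 analysis
new_trend_same_period = +0.029  # same period, later analysis
full_record_trend     = +0.127  # full record through Dec 2008

# Split the total change in trend into the part due to reanalysis of
# the same data and the part due to the longer record.
analysis_change = new_trend_same_period - old_trend_same_period  # +0.105
length_change   = full_record_trend - new_trend_same_period      # +0.098
total_change    = full_record_trend - old_trend_same_period      # +0.203

# Fraction of the change attributable to analysis changes alone:
print(f"{analysis_change / total_change:.3f}")  # ~0.517, "a tiny bit over half"
```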

    • Kristian says:

      Don’t forget the considerable upward adjustment of the last two to three years of the UAH time series going from version 5.5 to the current 5.6.

    • Gordon Robertson says:

      @Joel “…their current analysis gives +0.029 C / decade for that same data. That is a change in trend of +0.105 C / decade due solely to changes in their analysis”.

      Joel, why would you not expect the trend to change between 1978 and 1998 with respect to a broader baseline and more data?

      The slope of a range in a trend line involving anomalies is going to change if the range is part of an ever-changing and broadening baseline.

      That’s why I don’t like trend lines or anomalies. I especially don’t like temperature trends calculated statistically; they tell you nothing. I don’t see how any kind of trend can be applied to atmospheric temperatures given the way they deviate all over the place.

      What we should have, if we are going to use trends, are short-term trends that apply only to the context in which they were determined.

      For example, the UAH trend over the period you mentioned was not about global warming at all. It was largely a recovery from cooling. That’s not the same thing as the post 1998 trend which involved true warming caused by a strong El Nino.

      I don’t even like the term “true warming”. All it means is that the average in a range was exceeded. Until we see a graph of absolute temperatures we can’t really see what is going on.

      Of course, if you look at absolute temps in the range of 15C, the piddly amount of warming we are talking about will be dwarfed by the absolutes and lost. What we call global warming will appear on an absolute scale as little more than a straight line.

      • Joel Shore says:

        “Joel, why would you not expect the trend to change between 1978 and 1998 with respect to a broader baseline and more data?

        The slope of a range in a trend line involving anomalies is going to change if the range is part of an ever-changing and broadening baseline.”

        You misread what I wrote. What I am saying is that the trend changed by +0.105 C / decade FOR THE EXACT SAME PERIOD. There is no more data “and broadening of the baseline” does not affect the trend. The only reason for the change in trend that I mentioned is adjustments of the data due to changes in the analysis. (The trend increased additionally because of an increase in data but that is not what I am talking about.)

        “For example, the UAH trend over the period you mentioned was not about global warming at all. It was largely a recovery from cooling.”

        This “recovery from cooling” claim is nonsense that we talked about before; it is because you don’t understand how a baseline works.

        “What we call global warming will appear on an absolute scale as little more than a straight line.”

        The change in global temperatures from the Ice Age to now would also appear as little more than a straight line. Am I supposed to conclude from that that the change has an insignificant effect?

        • JohnKl says:

          Hi Joel Shore,

          You stated:

          “The change in global temperatures from the Ice Age to now would also appear as little more than a straight line. Am I supposed to conclude from that that the change has an insignificant effect?”

          Of course not; good point. However, we still experience ice age conditions.

          Have a great day!

  21. RB says:

    It does not seem too far out of left field to accept that you cannot get changes in global average temperatures from thermometer records. There is UHI, operator error, missing data, poor coverage + effects on temperature of weather patterns, effects of humidity – and then you need corrections that turn a trend of 0.2 into 0.6 deg F/century? And this is after TOB correction? And many years after those who did it claimed that the science is settled, that AGW is real?

    • Gunga Din says:

      Well, it looks real on my monitor.
      (Then again, I’ve had Japan take out the Nazis and win WW2 on my monitor. Hmmmmmm…)

    • Gordon Robertson says:

      @RB…with respect to the surface record you forgot to mention that thermometers cover mainly the land surface and miss most of the oceans, which cover 72% of the planet.

      In another comment, I supplied a link to chiefio in which Michael Smith has done stellar work exposing the practices of NOAA/NCDC, who are responsible for collecting and analyzing surface data.

      I am likely biased, having grown up in an intimate association with electronics. I think the satellite telemetry is light years ahead of surface thermometers based on its 95% coverage of the planet and the number of oxygen data points it can cover in an instantaneous scanner position.

      I think the sat scanners can give a broad spectrum average of atmospheric temperatures that thermometers could never do. In fact, I think the thermometer surface method is so primitive it should be abandoned.

  22. Aaron S says:

    As the delta grows between earth’s temperature and a dataset’s measurements, it will be harder and harder to bluff… it can’t go on forever. If it is really happening systemically to produce warming, then it will become obvious in time. Thanks for sharing the observation. I’m curious about the motivation in this example… the ethanol political debate perhaps? Even some Republicans support the absurd green practice.

    • JohnKl says:

      Hi Aaron S,

      You state:

      “If it is really happening systemically to produce warming then it will become obvious in time.”

      The data set covers over a hundred years. How long before Captain Obvious makes a leap and acknowledges the UNDENIABLE!!!

      As to ethanol: Al Gore, some Dems, and Repubs representing agricultural interests support it for very obvious financial reasons. Who doesn’t like FREE TAXPAYER SUBSIDIES all designed to make you look like some CLIMATE HERO for accepting them? In reality, pseudo-science myths aside, if one contemplates the factors involved in ethanol production (especially from corn), one will easily see that if CO2 and supposed warming prove to be your concerns, this simply won’t work.

      Have a great day!

      • Aaron S says:

        Agreed, ethanol was absurd at $100 a barrel from an economic perspective, and at $60 a barrel of oil it is errrm… superabsurd! Of course it is not green either… it is, as you say, a subsidy. I just feel like they need the warming present in the belt to justify the practice, and it is very convenient for the upcoming election to clear the water with warming. I can’t see it going much longer. HadCRUT4 already made their correction, which produced a recent negative trend…. it is only a matter of time… just not before the election… haha.

  23. KevinK says:

    Dr. Spencer, this happens all the time in engineering. We now know that 120 volts AC back when Edison was inventing light bulbs was actually only about 87 volts AC. That explains why all those old interior photographs look so poorly illuminated. Volts have been getting slowly more energetic (energy trapped in the atmosphere) but luckily the resistance of modern copper wire is dropping, so you can hardly see the effect. /sarc off.

    All corrections going in the same direction indicate confirmation bias, in my opinion. They just believe that it is getting warmer, so when an adjustment algorithm confirms their belief they simply keep that adjustment and start looking for more adjustments. If an adjustment makes it cooler, they believe they have made a mistake and discard that adjustment.

    Cheers, KevinK

    • Gordon Robertson says:

      @KevinK….”We now know that 120 volts AC back when Edison was inventing light bulbs was actually only about 87 volts AC”.

      I got your /sarc off comment so I know what you say is TIC. I did not know that about voltage, however.

      What comes to mind is that back in those days they were undecided as to whether they should use AC or DC on power grids. Along the way, they started specifying AC as a DC equivalent, which is the RMS value (0.707 of the peak AC value). If they were specifying 120 VAC by its peak swing, then the DC-equivalent RMS would be around 84 volts.

      I don’t know when they started using the RMS value exclusively, but looking back at that is a far cry from going back into temperature history and adjusting it with a human-created mathematical formula. We are talking very simple thermometer readings.

      It was not so long ago that GISS totally screwed up temperature readings sent in from Russia. Based on that erroneous reading, they claimed it had been colder in September than October of that year (the inference being that it had warmed in October due to anthropogenic causes).

      When they got caught, they shrugged it off, claiming they did not have the funds to verify all temperature data sent to them.



      Climate alarmists should be stopped now from tampering with historical data. If they want to do that for their own jollies they can fill their boots. However, data that affects everyone should be left alone.

      • KevinK says:


        “Along the way, they started specifying AC as a DC equivalent, which is the RMS value (0.707 of the peak AC value). If they were specifying 120 VAC by it’s peak swing then the DC equivalent RMS would be around 84 volts.”

        Yes, a DC voltage of ~84 volts delivers the same energy as 120 Vac (peak). It gets more complicated with nasty things like “power factor”, etc.

        There are actually light bulbs that are designed for DC voltage versus those designed for AC voltage, it mostly has to do with the lifetime of the filament.

        Those electrical engineers are like wizards; “They’ve been studying electricity for years, nobody knows how the heck it works”….. Ha ha ha

        I can say that as a degree holding electrical engineer with several decades of practical experience.

        Cheers, KevinK.
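The RMS arithmetic in this exchange can be sketched in a few lines. This is a minimal illustration of the sine-wave relationship V_rms = V_peak / sqrt(2); the 170 V figure is the approximate peak of a modern 120 V RMS supply, an added illustration not taken from the comments.

```python
import math

def rms_from_peak(v_peak):
    """RMS of a pure sine wave with the given peak amplitude."""
    return v_peak / math.sqrt(2)  # equivalently ~0.707 * v_peak

# A sine wave specified as "120 V" by its peak swing is ~85 V RMS,
# which is the ~84-volt DC equivalent mentioned above.
print(rms_from_peak(120))  # ~84.85

# Conversely, today's 120 V RMS mains has a peak near 170 V.
print(120 * math.sqrt(2))  # ~169.7
```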

        • Gordon Robertson says:

          @KevinK “It gets more complicated with nasty things like “power factor”, etc”.

          You mean real power versus imaginary power? I’ve done my share of those calculations, thank you. 🙂

          I like the inference, that power can be imaginary. If you got it across you by interfering in the magnetic circuit of a 600 volt motor, you’d soon see how imaginary it is. 🙂

          Same with power correction capacitors. I have seen banks of them in factories operating at 600 volts. I steered well clear of them.

          The other thing that’s a hoot is complex number theory.

          i^2 = -1, therefore i = the root of -1.

          I had it out with a math prof who tried to convince me that under certain conditions you could take the root of -1. I asked him what times what equaled -1 (with regard to roots, not straight algebra) and he got a funny look on his face like he’d like to kick me. His ears actually turned a bit red.

          The fun thing about math is that you can invent a virtual reality where time dilates and time and space curves. Those are the same people programming climate models.

          • dave says:

            Indeed, the real number ” -1 ” never has a square root. BUT, the ordered PAIR of real numbers ” (-1,0) ” has something LIKE a square root when “multiplication” for such pairs is defined in a certain way (indicated by ” ** ” instead of ” * ”); and that “square root” is the ordered pair of real numbers ” (0,1) ” [Also, ” (0,-1) ” works]. Such ordered pairs are called complex numbers – i.e. just a complex, or association, of real numbers.

            The ordered pair (0,1) is shown by the letter ” i ” or sometimes ” j “.


            ” i^2 = -1 ” (1)

            ACTUALLY MEANS

            (0,1) ** (0,1) = (-1,0). (2)


            1^2 = 1 in the context of complex numbers ACTUALLY MEANS

            (1,0) ** (1,0) = (1,0) (3)

            which is LIKE

            1 * 1 = 1, but DIFFERENT.

            The fact that this is not made CLEAR is due to the fact that most mathematicians do not know their arses from their elbows when it comes to the foundations of mathematics.

            The confusion arises from BAD NOTATION. The signs for multiplication and exponentiation mean something different for complex numbers than for reals. It is just the old mistake of using the same symbol for different things. Actually, it helps a lot to use different colours when writing maths. Use red for the “multiplication” sign for complex numbers, and blue when you are using singleton reals. Note, again, that for “multiplication of pairs”, above, I used ” ** ”, not ” * ”. Similarly, use magenta for the equals sign when you mean that two things can be collapsed into one (1+2=3) and yellow when you mean the algebraic sense of “equivalent to” (a+b=c).

            et cetera, et cetera, et cetera…

            Some Cambridge mathematicians in the 19th Century had a different approach. They said that ” -4 ” and ” +4 ” were just bad notation for (-,4) and (+,4). The ” – ” and the ” + ” merely indicate an associated QUALITY, like owed or owing in a commercial relationship. QUANTITY as such is always positive in the sense of existent or conceivable.

            The only things, then, that have a ‘natural’ square root are objects without the quality, such as ” 1 “. Saying that “the square root” of ” 4 ” is ” +2 ” or “the square root” of ” +4 ” is ” 2 ” is then just a mistake. One should say the square root of (+,4) is DEFINED as (+,2) – and, actually, one can’t think of a useful definition for the square root of (-,4).

            Since there IS an easy definition for the CUBE root of “a negative number,” e.g. the cube root of (-,8) is (-,2), this was a little awkward, although perfectly valid as a scheme.

            A THIRD formulation is geometrical and involves the idea of replacing the real number line by a plane, with a particular line as the real number line’s analogue. Then – roughly – multiplying by -1 is turning the line a half turn and multiplying by sqrt(-1) is turning it by a quarter turn. So we define the square root of the minus SIGN as making a quarter turn and expand sqrt(-1) as sqrt(-)*sqrt(1) = “i”. This is the operator formulation. Of course, if you are going to use that same old symbol for a root in THIS WAY, you are going to need another colour! I can’t put any colours on this blog, unfortunately.

            It does not matter that some fool in the Middle Ages called certain numbers “imaginary”. It is a misnomer.
            Even now, in the complex number a+bi or (a,b), a is called the real part and b the imaginary part when this only means the first and the second parts. Just like calling some rocks “acid” and others “basic”. It is gibberish.

          • dave says:

            The (1), (2), (3) after the equations are just numbering of the three equations for convenience. They mean nothing else. The formatting on this blog sometimes goes screwy.
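The ordered-pair construction described above can be made concrete in a few lines. This is only a sketch, and `cmul` is a hypothetical name for the ” ** ” operation on pairs.

```python
def cmul(p, q):
    """The "**" product on ordered pairs of reals:
    (a,b) ** (c,d) = (a*c - b*d, a*d + b*c)."""
    a, b = p
    c, d = q
    return (a * c - b * d, a * d + b * c)

# (0,1) ** (0,1) = (-1,0): the pair notation behind "i^2 = -1".
print(cmul((0, 1), (0, 1)))     # (-1, 0)

# (1,0) ** (1,0) = (1,0): the pair analogue of 1 * 1 = 1.
print(cmul((1, 0), (1, 0)))     # (1, 0)

# (0,-1) also works as a "square root" of (-1,0).
print(cmul((0, -1), (0, -1)))   # (-1, 0)
```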

          • Alexej Buergin says:

            Dave: It is not gibberish, just some name. You need that. A “force” is not the same thing in religion and in physics, and a “group” is not the same thing in sociology and in mathematics.
            Should you start with the (not very complicated) complex numbers, and define an addition and a multiplication, then why not use the same notation when the complex numbers are of the form (a,0)?

  24. ossqss says:

    It would be comforting to see a paper disclosing the exact coding and due diligence, published for every instance of adjustment.

    If justified, there is no reason to not do so. Just sayin

    • Gordon Robertson says:

      @ossqss “It would be comforting to see a paper disclosing the exact coding and due diligence for such published for every instance of adjustment”.

      Steve McIntyre of climateaudit.org was trying to do something along that line. He had asked for the historical record kept by Phil Jones of Climategate fame and Jones refused. His excuse was that McIntyre wanted to use the data against him.

      Doh!! McIntyre and McKitrick had already destroyed the hockey stick of Jones’s friend Michael Mann, and I guess Jones was feeling a bit sheepish about the mess he had made of the record. He offered another excuse: that the record did not belong to him but to contributing countries, and that the old record had been lost after adjustments.


      McIntyre submitted an FOI request to the UK government to get access to the record and Jones was seen in the Climategate emails trying to interfere with the FOI.

      Later, when Jones was investigated by peers who were also climate alarmists, McIntyre was not called to testify. Instead, they called in Lord Blaby of something or other who knew nothing about the shenanigans of Jones.

      In other words, the Climategate scientific mischief was whitewashed.

      I don’t think there’s much chance that the current mob of climate hoodlums will investigate themselves. If an external investigator tries, you can bet they will all band together up to the level of the UN and the IPCC to block such an investigation.

      Look at the hockey stick investigation by the National Academy of Sciences and the statistics expert Wegman. Total whitewash. NAS used to be a reputable institution that only allowed serious academics in the door. They slackened their requirements, however, allowing climate alarmists to get in and take over.

      When Wegman labeled the hockey stick math as essentially useless, and pointed a finger at Chapter 9 in IPCC reviews as being nepotic (he claimed they cited only the work of each other…also, they are friends of Mann), Bradley of MBH (the hockey stick authors) retaliated by charging him with plagiarism.

      Plagiarism?? In an inquest investigating Bradley and his fellow authors??

      If these guys were politicians one might be able to understand somewhat. However, scientists are not supposed to carry on like immature snots.

      • Streetcred says:

        Gordon, the truth is that academics do carry on like “immature snots” and that these ‘climate scientists’® are from the same family. In the commercial world they would have their fannies kicked down the road.

        • Joel Shore says:

          No…In the commercial/corporate world, the immature snots are generally elevated to senior management positions (says someone who has actually been in the corporate world).

          • JohnKl says:

            Great point, the corporate world does often reward immature snots. However, to the extent the corporate world exists, it now increasingly seeks remuneration from the state. Who better to seek low-hanging fruit/patronage/largesse from government ladles than immature snots prepared to spout any pseudo-science bilge that fills their coffers? Immature snots may really make the best statist conformists and corporate brown-nosers that can be pounded, moulded, dyed and made to appear as a pound of flesh. Or do I exaggerate?

            Have a great day!

    • Frank K. says:

      It would be nice if those who believe fervently in the TOBS and other adjustments could provide links to the software being used to process the raw data. That way we could all see what algorithms are being employed and decide for ourselves if they are rational or not.

  25. boris says:

    Yesterday there was a report on the Republicans in Iowa putting heat on the presidential candidates to continue supporting the corn-ethanol subsidy. I think Roy was commenting a few weeks ago that he had been hearing there were some secret “warmists” among Republicans generally. What’s alarming is those “conservative” guys “free enterprising” themselves right into support for the green agenda as a matter of deed if not of word. There is no room for capitalism in a green world, and Republicans who think it “just business” better wake up. It’s not supply and demand when subsidy drives the demand. I know this is a bit off topic, but somebody opened the ethanol door up.

  26. Bill Hunter says:

    Hmmm, probably related to the lack of data archiving control in academia. Each new scientist discovers the temperature collection time database and adjusts the current temperature record to account for it. Then based upon lunchroom chatter about how skeptics will take the raw data and ruin your career over the methodology you choose, they suggest you lose the raw data.

    This would have the added bonus that when you get a BEST reconstruction of global temperatures, they start out with the “best” data.

    I mean why not? Weren’t there several investigations that gave their blessings to this approach?

  27. Chris Hanley says:

    The technique is called ‘incrementalism’.
    Like computer models, it’s just another of the helpful tools employed worldwide by Climate Change™ practitioners:
    “… [An] example would be small changes that make way for a bigger overall change to get past unnoticed. A series of small steps toward an agenda would be less likely to be questioned than a large and swift change …” (wiki).

  28. Streetcred says:

    Have they only discovered TOBS subsequent to 2014 (I think not) to require the data to be further tortured, or have they discovered that the only way to make the models more believable is to adjust the observed data to match the model output?

    Hand, cookie jar, caught ??

  29. Noblesse Oblige says:

    Alas, in the corrupt world of Global Warming, the past is just as hard to predict as the future, except that we know that the past will get steadily cooler and the future steadily warmer — no matter what really happens.

    • dave says:


      It used to be said, in the murky world of communist dictatorships, that the future was certain – because Marx had prophesied it – but the past was always changing – as various people became unpeople.

  30. dave says:

    Just to finish up…my point being always that mathematicians may be a hoot but mathematics isn’t…

    The true statement that

    ” There is no real number ‘a’ such that a*a = -1 ”

    is replaced in a complex scheme by the true statement that

    ” there is no real ‘a’ such that (a,0)**(a,0) = (-1,0), ”

    it always being understood that the ‘ * ‘ and the ‘ ** ‘ in the two sentences stand for analogous but formally distinct ideas of multiplication.

    • Gordon Robertson says:

      @dave from your other reply…”Indeed, the real number ” -1 ” never has a square root”.

      I get the point of your entire reply and I have no argument with it. The prof to whom I referred used to drive us nuts dragging us through error analysis on the tails of series that were not a requirement for the course. For example, if a series, or even an exponential, was approaching the x-axis, he would get us into analyzing the error remaining if the series/curve was cut off at a certain point.

      As if engineering students had the time to study superfluous material.

      That’s partly why I tried to pin him down on his claim that the root of -1 could be found in some cases. I realized that by doing so I was making my life far more difficult than it had to be. You don’t alienate a math prof unless you’re a genius, and I am not.

      My reply to KevinK, who is an engineer, was mainly intended as in-house humour. I would hardly regard myself as having expertise in math, or physics for that matter.

    • Alexej Buergin says:

      In (a,0) it is implied that a is real; thus you should just say:
      There is no (a,0) such that …
      And why should there be more than 2 solutions to that equation, which are (0,1) and (0,-1)?

  31. Tim Hammond says:

    There is a difference between adjusting a result for known changes in a moving instrument that is recording data indirectly, and adjusting a result from an instrument that is essentially static, unchanging and accurate.

    The idea that we can “know”, to within tenths of a degree what the temperature “was” decades ago, is simply nonsensical. I can understand the need to homogenize data to create data that covers a larger area, but that has to be based on the existing data.

    To take homogenized data and then use that to change the underlying data is utterly crazy.

  32. Travis Casey says:

    Dr. Spencer you are obviously a denier bought and paid for by Big Corn! /sarc.

  33. Frank K. says:

    Question: Is the TOBS adjustment only applied to U.S. data? If so, what does it say about the quality of the historical land temperature data throughout the rest of the world? How about the ocean temperature records? There’s quite a bit of arbitrary adjustment there…

  34. Mike says:

    It would be nice to see all of the adjustments made over the last few years in one list, along with a summary stating whether the slope went up, down, or stayed the same.

  35. Barnes says:

    The net of all this is simply that, at the end of the day, we don’t really know what the temperature has done. With all the slicing, dicing, homogenization, station moves, recalibration, etc. etc., the official temperature record is suspect at best. We have a lot to learn before we start using such information as the basis for forming any policy, much less a policy whose probability of doing real economic harm is far higher than the possibility that a degree or two of increase in GMT (as if we even know what that really is) will harm the climate system.

  36. Tab Numlock says:

    It’s just natural human optimism in wanting nicer weather.

  37. DHR says:

    Ole Humlum’s site, climate4you.com, cited in an earlier comment by Svend Ferdinandsen, keeps track of temperature adjustments made over time. I highly recommend the site for a source of all manner of climate data. In one of Dr. Humlum’s charts he shows the changes made since 2008 to the GISS global temperature dataset for the global temperature in January of 1910 and 2000. Before 2008, the difference was +0.45C; as of now, it is +0.69C. Most of the changes to the dataset were made in January of 2013, but the data are constantly being adjusted, several times per year. What discovery was made just two years ago that required a sudden large cooling of the past and warming of the present? Has not TOBs been known and addressed for many years? And why is the data “adjusted” several times per year, just since 2008?

    And as Dr Spencer points out, another discovery of some kind was made some time after March of 2014 leading to further cooling of the past in the US corn belt.

    One wonders. And perhaps others.

    • Johan says:

      They are obviously using self-learning algorithms leading to insights beyond human comprehension.

    • Kasuha says:

      TOD adjustment is a heuristic process, derived from USCRN data. As these data keep piling up, the total value of the adjustment might keep changing.
      Of course that’s not the only adjustment applied to the data. And I speculate it’s not TOD but rather UHI that’s responsible for the continued cooling of the past and warming of the present. But I know almost nothing about how the UHI effect is estimated and how its adjustments are applied.

      Errors in output are necessarily smarter than people in QA.
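      For readers unfamiliar with the time-of-observation problem itself, here is a toy Python illustration (an idealized sinusoidal station with alternating warm and cool days, not NCDC’s actual method): an observer who resets a min/max thermometer in the afternoon lets one hot afternoon dominate two successive 24-hour windows, so the afternoon-reset record reads warmer than the morning-reset record of the same temperatures.

```python
import math

# Hourly temperatures for 30 days, alternating warm (mean 24 C) and
# cool (mean 16 C) days, each with a diurnal cycle peaking at 15:00
# and bottoming out near 03:00.
hourly = []
for day in range(30):
    base = 24 if day % 2 == 0 else 16
    for hour in range(24):
        hourly.append(base + 8 * math.sin(math.pi * (hour - 9) / 12))

def mean_of_minmax(series, reset_hour):
    """Mean of (min+max)/2 over successive 24-h windows beginning at
    reset_hour, mimicking a min/max thermometer reset once a day."""
    mids = []
    start = reset_hour
    while start + 24 <= len(series):
        window = series[start:start + 24]
        mids.append((min(window) + max(window)) / 2)
        start += 24
    return sum(mids) / len(mids)

morning = mean_of_minmax(hourly, 7)     # reset at 07:00
afternoon = mean_of_minmax(hourly, 17)  # reset at 17:00

# The afternoon reset double-counts warm afternoons: the maximum set
# just before the reset also dominates the following window.
assert afternoon > morning
```

      Switching a station from afternoon to morning observation therefore introduces a step change, which is what the TOBS adjustment tries to undo.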

  38. pfwells says:

    A news item a few days ago reported that the February temperature in Canada was the coldest in 115 years. Obviously a lot more adjustments will be needed.

  39. JohnG says:

    I am a retired project engineer, having technically led several multi-disciplinary R&D projects. I’m really just getting started at looking into the adjustment rationale and methodology. Thinking about this latest occurrence, I am wondering if it might be related to “breakpoint” detection. If NCDC does look for breakpoints, and make adjustments for any detected, there might be an issue in algorithm sensitivity. If there is REAL cooling across many USHCN stations that is detected as a (false) breakpoint, I believe that they would likely make an adjustment as Dr Spencer has observed. Just a thought.
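    JohnG’s worry can be made concrete with a crude sketch of breakpoint detection (a simple mean-shift scan; the real pairwise homogenization algorithms are far more elaborate): a step in the record is flagged, and nothing in the test itself distinguishes an instrument change from a real climate shift.

```python
def largest_mean_shift(series, min_seg=5):
    """Scan candidate breakpoints; return (index, shift) where the
    difference between the mean before and after the split is largest.
    A crude stand-in for a homogenization breakpoint test."""
    best_i, best_shift = None, 0.0
    for i in range(min_seg, len(series) - min_seg):
        left = series[:i]
        right = series[i:]
        shift = abs(sum(right) / len(right) - sum(left) / len(left))
        if shift > best_shift:
            best_i, best_shift = i, shift
    return best_i, best_shift

# A station record with a 1-degree step at index 20 -- whether from a
# station move or genuine regional cooling, the scan cannot tell:
record = [15.0] * 20 + [14.0] * 20
i, shift = largest_mean_shift(record)
assert i == 20 and abs(shift - 1.0) < 1e-9
```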

  40. Svend Ferdinandsen says:

    Could anyone tell if they always start the homogenisation with the real original measurements?
    I mean, if they really believe the adjustments, then it would be intuitive to start the next round with these adjusted temperatures, that by definition must be the correct ones.

  41. Keith says:

    Gordon Robertson,

    Thanks for your response.

    The real background to my question is this. As Roy’s post, and those of others show, the adjustments seem wrong insofar as they invariably create warming. If that is so, where do we get a measure that would confirm the original measured surface station data are better than the adjusted data?

    Should the satellite data not then mirror the original measured data pre adjustment?

    The satellite data are different from the adjusted data for sure, as you point out, but they do not give cooling trends as is the case for many surface stations before adjustment.

  42. Mike M. says:

    Steve Mosher has claimed (backed up, I think, by Zeke) that in the global data the positive and negative corrections to the trends do very nearly cancel out although in some areas (U.S.) they are mainly in one direction while in others (Africa) they are mainly in the other direction. He has posted graphs to that effect at both Watts Up With That and …andthentheresphysics.

  43. Dr. Strangelove says:

    Land temperature data are contaminated with UHI. When the warming is not enough, warmists further contaminate it with their “adjustments.” I dismiss these data as unreliable and recommend using satellite and weather balloon data.

  44. dave says:

    “…the error remaining…”

    What is the difference between a one-in-a million chance and a one-in-a-billion chance?

    If you are going to be exposed billions of times to the situation (for example, to the chance that your next heart-beat will be your last), the answer is “A lot!”

    If you are going to be exposed once (for example, to the chance that a “dinosaur-busting” asteroid hits during your life-time), the answer is “Zilch!”

    I am flattered that you get the point of my reply. I must be getting brainier!

    • dave says:

      The above was a reply to Gordon Robertson, March 11 at 2:57 AM

    • Gordon Robertson says:

      @dave “What is the difference between a one-in-a million chance and a one-in-a-billion chance?”

      Our local lottery requiring 6 numbers out of 49 available has odds of something like 1 in 13 million. Someone wins it almost every week, so I have stopped believing in odds.

      “…the error remaining…”

      That was in reference to ball-parking the remainder of a series or function that was converging to zero, or to an asymptote, at infinity. Obviously you don’t want to compute the area under a curve all the way out to infinity, in practical terms, so you cut it off at an arbitrary point and estimate the error in doing so.

      It was drummed into us in lab classes that you always had to include error margins for measurements.
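      The “error remaining” idea is easy to demonstrate on a geometric series, where the tail left behind by truncation has an exact closed form:

```python
# Truncating a convergent series leaves a computable remainder.
# For the geometric series of (1/2)**k, k = 0, 1, 2, ..., the exact
# sum is 2, and the tail beyond k = N is (1/2)**N / (1 - 1/2).
N = 20
partial = sum(0.5 ** k for k in range(N))
exact = 2.0
tail_bound = 0.5 ** N / (1 - 0.5)   # closed form for the remainder

# The error from cutting the series off at N is within the bound:
assert abs(exact - partial) <= tail_bound + 1e-15
```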

  45. hunter says:

    Dr. Spencer,
    Congratulations. It seems from the new paper on albedo that at least some of your ideas about how the climate can be more accurately described are finally finding traction.

  46. Hi Roy,

    What I suspect is happening here is that the “Corn Belt” dataset is based on the nClimDiv dataset. This was transitioned last year from using raw coop data (with no TOBs, MMTS, or other bias corrections) to using homogenized GHCN-Daily data as the basis: http://www.ncdc.noaa.gov/news/transitioning-gridded-climate-divisional-dataset

    So this isn’t a “new” adjustment per se; rather, it’s replacing a dataset that used to be calculated from raw data with one using adjusted data.

    • For more details, see this paper: ftp://ftp.ncdc.noaa.gov/pub/data/cmb/GrDD-Transition.pdf

      “In the TCDD, many divisions experienced a systematic change in average station location and elevation during the 20th Century, resulting in spurious historical trends in some regions (Keim et al., 2003; Keim et al., 2005; Allard et al., 2009).

      Finally, none of the TCDD’s station-based temperature records contain adjustments for historical changes in observation time, station location, or temperature instrumentation, inhomogeneities which further bias temporal trends (Peterson et al.,

    • Alan Poirier says:

      Considering the importance of temperature data to so many agencies and institutions, I am totally mystified as to why these data sets have not been subjected to rigorous audits. By audits, I do not mean the cursory analyses that I see in scientific papers, but genuine, financial-style audits by independent teams of auditors who then sign off on the audits and swear to the validity of the data.

      • Gordon Robertson says:

        @Alan Poirier …”I am totally mystified as to why these data sets have not been subjected to rigorous audits”.

        Gavin Schmidt of NASA GISS, when caught using faulty data from Russia, admitted GISS did not have the budget to verify its data.

      • rooter says:

        Please do it Alan. Do your genuine financial-style audit.

        I guess you are independent.

        • Alan Poirier says:

          I take it, rooter, you’ve never participated in a full-fledged audit? They’re a wonder to behold. It’s amazing what a well-trained auditor can find. Sadly, it’s not my area of expertise, but I know enough when to call upon one. A forensic auditing of temperature data sets would settle much debate.

  47. scott says:

    Why the Corn Belt? Why the Summer only? Funniest cherry pick I’ve seen in a long time.

    What happened to the rust belt in Fall, or the Ural Mountains on prime number days? This is just crazy. Maybe next time you’ll want to look at the entire globe, because adjustments are making the warming trend less warm. But that would be too much of a shocker to your audience, so back to cherry picking.

    • lemiere jacques says:

      and so what?

      as long as there is a valid reason to adjust data there is no problem with it…

    • Alan Poirier says:

      Hmm, might have something to do with the growing season. Last time I looked we can’t grow corn in winter.

  48. Doug   Cotton says:


    Roy and others:


    This experiment is a real breakthrough (written up only last year) because it proves that a force field like centrifugal force or gravity does in fact create a temperature gradient at the molecular level, cooling at the top where potential energy is greatest.

    The Second Law of Thermodynamics is all about entropy maximization, and that is achieved when the sum of mean molecular kinetic energy and gravitational potential energy is homogeneous at all altitudes. This explains the temperature gradient, not back radiation, and, as I have told you several times, Roy, your assumption of isothermal temperatures (and that of Hansen and Pierrehumbert) is wrong. This is now proven empirically in this centrifuge machine and also in the vortex cooling machine which works on the same principle.

    I shall be posting comments about your error, Roy, on several hundred climate threads, mostly in social media. Don’t take it personally, for I understand how you have been misled by the AGW crowd and brainwashed with their false physics.

    • Mike M. says:

      In case anyone is in doubt, Doug Cotton is weakly coupled to reality, as usual. The Chervenkov et al. device appears to operate under near collision-free conditions, so that the energy of individual molecules is conserved. Then by forcing the molecules to do work against an external electric field, they reduce the temperature of the molecules. Very interesting, but of no relevance to the atmosphere.

  49. dave says:

    I think Gordon Robertson’s math prof was talking about the VARIANCE of variables in certain mathematical models of probability distribution.
    The series formulae do not converge and therefore the concept is not defined. He should have left it at that. It does not matter for inferences about reality, since no probability model is expected to be an explanation of everything ON ITS OWN.

    • Gordon Robertson says:

      @Dave “I think Gordon Robertson’s math prof was talking about the VARIANCE of variables in certain mathematical models of probability distribution”.

      The course was on multivariable calculus not on probability theory. We had a separate course on probability and statistics.

      After further thought I am sure we were discussing the convergence of a curve to a value because we were discussing the area under the curve. The error was the remainder when the curve was cut off as it went to infinity.

      It’s been a long time but I had remembered that a series can converge. I looked it up:


      The way we learned it, a curve can represent a polynomial, which can be represented by a series. Bit foggy, but that’s what I remember.

  50. dave says:

    As to “estimates of error in measurements,” in general:

    We did an experiment in school to illustrate how tricky it could be. We were asked to estimate the bulk modulus of a material by a method which would be certain – provided there were no errors; and we were asked to estimate the range of the inevitable error of the final estimate. Of course, it was obvious that the method was deliberately ridiculous, and so I put a range of + or – 60% on my final figure. Then I looked up the true figure in an Industry Handbook – and my error was more like 10,000 %

    The lesson intended was the necessity of inventing a GOOD, PRACTICAL approach to a problem. Another interesting point was that the several groups of students all had similar final estimates. So there was a particular BIAS built in to the method and not merely a large VARIANCE.

    It is instructive to read the descriptions by older scientists, such as Faraday, of their work. None of this “Here is my hat! Here is my rabbit!” It is all, “First I tried cat-gut but it was noticeable that it stretched immoderately under the weight I was investigating; and therefore, following a suggestion of Mr Davey, my assistant made for me…”

    • Gordon Robertson says:

      @Dave…”It is instructive to read the descriptions by older scientists, such as Faraday…”

      Agree 99.9999999% but you have to calculate the error remaining. 🙂

      I have learned a heck of a lot by reading the original works of Clausius and so on. The thing I like is that they tended to explain a lot subjectively, so you got a feeling for the physical reality. For example, he explained exactly what he meant by entropy. To hear a lot of people talking about it today I am left with the impression that they have no idea what it is as related to reality.

      David Bohm, a great physicist and a friend of Einstein once claimed that any equation with no reality to back it is garbage. Feynman claimed that quantum theory worked but that no one knew why. He was another one of those profs who claimed he would not give a lecture unless he could explain the reality behind it.

      I think physics has taken a turn for the worse with the ambiguity of quantum theory. Even Schrodinger, circa 1925, got out of it when Bohr started making ridiculous assumptions about reality, based on quantum theory.

      In programming, object-oriented languages have obfuscated things to the point where it seems there is no hardware underlying a computer. One of the basic structures in C++ is a class. I read book after book trying to understand what was meant by a class, all to no avail. As with entropy, a lot of authors talked around it but no one could explain what it was.

      Finally, I found a book by Bjarne Stroustrup, who invented C++. In a discussion of classes his opening line was “a class is a user-defined type”.

      Ding, ding…lights go on, bells ring. Of course, anyone studying programming is almost immediately introduced to a type, like char, int, or struct. All of them are already defined by the system but a class was inserted by Stroustrup into C++ to give the programmer a means to define his/her own type.
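      Stroustrup’s one-liner carries over to other languages too. A minimal Python illustration (the `Resistor` class is a made-up example): a class definition creates a new type on equal footing with the built-ins.

```python
# In Python, as in C++, a class is literally a user-defined type:
# instances report the class as their type, just as 3 reports int.
class Resistor:
    def __init__(self, ohms):
        self.ohms = ohms

r = Resistor(220)
assert type(3) is int               # built-in type
assert type(r) is Resistor          # user-defined type
assert isinstance(Resistor, type)   # the class itself is a type object
```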

      Final example. While studying transistors over the years I had studied the theory of holes in semiconductors. Some universities teach that holes have mass and some define them as positive test charges.

      Got an article by Thompson, who developed the notion of holes in transistors. One of the first things he stated was that a hole is nothing but a concept. It was developed as a bookkeeping device for the vacancy left when an electron vacates a valence position. Also, it’s much easier in p-type silicon to visualize hole flow as opposed to electron flow.

      • dave says:

        “…a class is a user-defined type.”

        That actually DOES clarify it for me, who does not program often.

        Whenever I get an “Aha!” moment – usually long after “the clever people” have moved on – I am left genuinely unsure whether:

        (a) The clever people saw it all along but thought it was too trivially obvious to need explaining, or,

        (b) the clever people actually were – and still are – not very clever at all.

        • Gordon Robertson says:

          @Dave…I opt for (b).

          When I mention class, please note that I am talking about C++, which Stroustrup wrote. Other languages may define a class differently.

          Still, I don’t see why a class should be anything other than a type.

          • RealityObserver says:

            You ran into the difference between programmers who think in terms of the (artificial) language, and those who think in terms of the (real) things they are symbolically manipulating.

            Never had a problem with it, myself – whether I wrote “typedef char StringThing[100]” or “class StringThing” (with a member “char[100]”), I was always thinking about “StringThing” – not how the language was being constructed.

            (I have a similar fight with almost every fellow developer I run into that uses C++. I write something like “int* SomeKindOfThingThatIsAnInteger”, while they write “int *SomeKindOfThingThatIsAnInteger”. I’m concentrating on the fact that I am manipulating one or more “SomeKindOfThingThatIsAnInteger”, not a “ThingThatPointsToSomeKindOfThingThatIsAnInteger” – maybe, maybe not.)

  51. Reziac says:

    Someone’s comment reminded me — on Another Forum[TM] someone remarked that the climate modeling software being used to make these predictions (and to massage the data) is all closed source. In other words, there’s no way for independent parties to vet the integrity of the modeling software by inspecting the source code, let alone check it for logic/math errors or other egregious bugs (which are all too likely in any large software project).

    If anyone knows to the contrary, please inform me.

    • Doug   Cotton says:

      What we do know is that the models start from the wrong assumption that a planet’s troposphere would be isothermal in the absence of any molecules that absorb anything in the range of frequencies emitted by the Sun. It would not be and it would be colder at higher levels and probably liquefy and rain to the surface.

      In other words, those like James Hansen (with little understanding of radiation, let alone thermodynamics) had absolutely no understanding that the Second Law is all about entropy increasing as unbalanced energy potentials dissipate.

      So of course the models are totally wrong.

      In fact water vapor causes surface temperatures to be lower – as proved in a study of 30 years of real world data in the paper linked here which I suggest you read if you have any interest in finding out what really happens in planetary tropospheres, crusts, mantles and cores.

      • Reziac says:

        That orbit-related plot is very interesting.

        I wonder what that would look like combined with the sunspot data that someone found correlates so well with climate cycles? Average the two and see how closely the combined data follows the temp records.

        • Doug   Cotton says:

          Yes Reziac

          We know that magnetic fields from the planets reach to the Sun and it seems feasible that the angle at which they do may affect Sunspots. Whatever the reason for the past correlation, it is so compelling that I feel confident about the implied predictions here up to the year 2200.

          Of course I am also 100% confident in the physics in that website which completely debunks the greenhouse conjecture and presents an hypothesis which is supported by more than adequate evidence.

    • Mike M. says:


      The NCAR Community Climate Model is open source: http://www2.cesm.ucar.edu/working-groups/wawg/code

      I think that many other models are as well.

  52. Doug   Cotton says:

    No Mike M:

    The molecules are working against the huge centrifugal force field gaining potential energy in that force field at the expense of kinetic energy. They would not get down to 1°K due to guiding electrodes if the whole system were not spinning rapidly. As to the density, that makes no difference you clot as you obviously don’t understand how kinetic energy is shared in collisions. It is well understood by others with better understanding of Kinetic Theory (as used by Einstein) than yourself, that mean kinetic energy has a gradient in a force field because the Second Law is all about minimizing unbalanced energy potentials. So thermodynamic equilibrium is attained when Kinetic Energy of each pair of molecules about to collide is equal. The system must thus be isentropic which means mean molecular (PE+KE) is homogeneous. Now you prove otherwise, but don’t give me garbage based on equations for thermodynamic potentials which are derived by assuming gravitational potential energy does not vary – such as are used to derive the Clausius “hot to cold” corollary which only applies in a horizontal plane. This comment is continued here: http://climate-change-theory.com

  53. KTM says:

    Since the land-based temperature record is reported as an average of the daily minimum and daily maximum, I started to wonder if the satellite temperatures collect similar minimum and maximum data.

    Do the satellites track daily min/max temperatures at all, and if not how do they handle the diurnal variations of the signal they’re measuring?

    • MarkB says:

      My understanding is that the satellites observe a particular point on the earth twice per day, so they likely wouldn’t be seeing the daily min and max temperature. The orbits nominally observe the same point on earth at the same two times each day but there is a drift over the lifetime of the satellite and the time of observation is different across the series of satellites. Hence there is a diurnal variation problem that must be addressed in processing the satellite data.

      You’ll find a relevant discussion in this paper: http://images.remss.com/papers/rsspubs/Mears_JTECH_2009_MSU_AMSU_construction.pdf
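      MarkB’s drift problem can be sketched in a few lines of Python (an idealized fixed diurnal cycle; the overpass time and drift rate are made-up illustrative numbers): even with an unchanging climate, a drifting local observation time manufactures a trend, which is exactly the spurious signal a diurnal-drift correction has to remove.

```python
import math

def temp(hour, mean=20.0, amp=8.0):
    """Fixed diurnal cycle: coolest ~03:00, warmest ~15:00."""
    return mean + amp * math.sin(math.pi * (hour - 9) / 12)

# An unchanging climate, but the satellite's afternoon overpass
# drifts from 13:30 toward 15:30 over ten years:
readings = [temp(13.5 + year * 0.2) for year in range(11)]

# The record warms even though the climate did not:
assert readings[-1] > readings[0]
```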

    • Gordon Robertson says:

      @KTM…”Since the land-based temperature record is reported as an average of the daily minimum and daily maximum, I started to wonder if the satellite temperatures collect similar minimum and maximum data”.

      Why would you need a min/max when you are sampling trillions and trillions of data points during an instantaneous position of the sat scanners? The max/min used on thermometers is part of the construction of the unit. There are sliders above and below the mercury, or whatever is used, that remain fixed at the max or min position.

      Here’s a max/min thermometer typical of what you’d find in a surface station container:


      I find that to be incredibly inaccurate as far as stating a global average. On top of that, the surface stations only cover about 30% of the planet’s surface (the land) and NOAA/NCDC has cut the number of reporting stations from about 5000 world wide to about 1000.

      The max/min hardly matters since they are filling in missing reporting stations with interpolation and homogenizing the resulting temperatures using climate models.

      I can see scanner position on the surface being an issue due to orbital variations but the significant errors, which were well within the stated error margin, were fixed circa 2005.

  54. pochas says:

    They’re doubling down in the face of the Senate data tampering hearings. They expect to be able to BS their way through these and continue to be licensed as data adjusters. Hence the globe will continue to warm right through the coming Grand Minimum. Take that, you DENIERS!

  55. The same effect of cooling the past to make the present warmer is also found in the LOTI data set. In discussions with NASA I was told that no past records are kept as the current issue is always the most accurate. The oldest LOTI I had was December 1998 but it was for the Meteorological data not the full set; so I compared it to the same Meteorological data for January 2015.

    What I found was that the cooling of the past was more evident in this data set than in the full LOTI data. But what I also found was that for the period 1880 to 1950 there was the most cooling in 1880 and the least in 1950. From 1951 to 1980 (the base period, 14.0 degrees C) there was no warming or cooling (there can’t be, as this is the base period, so the cooling in 1951 was exactly offset by the warming in 1980). From 1981 to the present there was an upward trend in temperatures. The link below is to the analysis I did last month.


    My opinion is that there will be a push by the administration to get a climate treaty at the Paris COP21 climate conference in November/December this year. To ensure that it is agreed to, NASA and NOAA have been instructed to make 2015 the hottest year ever by any means possible.

    It’s easier to make the past colder than the present warmer. So there is a lot of manipulation of the past downward and only a little of the present upward; and so long as it makes 2015 the hottest year on record it doesn’t matter where the numbers came from. I’ll be tracking this each month until the end of the year.

  56. Lewis Guignard says:


    You would lead a person to not trust government.

    The alarmist position is one of controlling the economy and thus the people. It has nothing to do with science or warming or changes to the ocean level, it is solely about controlling lives. The unfortunate part is that those who live without electricity are being condemned to 16th century lives while those who do the condemning live in air-conditioned and heated comfort, with all the associated amenities. I could go on….

  57. Robin McMeeking says:

    This discussion is way above my pay grade. I had 4 years of weather observing experience in the mid ’60s. This whole 24 hour min/max determination is new to me. Obviously it was an administrative function performed at a later date. This may have been done by an observer in his/her spare time. I used to prepare sunrise/sunset tables for our location.

    What I don’t understand is why these numbers would be used instead of the complete hourly record since the potential for discrepancies is obvious. Perhaps this is the only data that has been digitized.

    It should be fairly simple to take a few years of recent digitized data for a reasonable sample of reporting stations and analyze the impact of TOB on the calculation of annual averages and the relative accuracy of using min/max versus the full hourly readings.

    Pardon my butting in.
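    Robin’s proposed comparison is simple to prototype. A sketch with a deliberately asymmetric daily cycle (a clipped sine, not real station data) shows why the question matters: once the cycle is not symmetric, the min/max midpoint and the full hourly mean disagree.

```python
import math

# An asymmetric daily cycle: the cool side of a sine wave is clipped,
# as a stand-in for a station whose nights bottom out.
hourly = [20 + 8 * max(math.sin(math.pi * (h - 9) / 12), -0.5)
          for h in range(24)]

full_mean = sum(hourly) / 24                      # mean of 24 hourly readings
minmax_mean = (min(hourly) + max(hourly)) / 2     # min/max midpoint

# With any asymmetry the two disagree; (min+max)/2 is only a proxy:
assert abs(full_mean - minmax_mean) > 0.1
```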

    • Robin McMeeking says:

      Additional thoughts. If these min/max values were not important for some reason, such as forecasting accuracy measurements, there is added potential for human error in them as well as TOB.

      One thing seems clear, increases in atmospheric CO2 result in cooling in the past. Spooky!

      • Robin McMeeking says:

        Further research has resolved my initial confusion. Apparently the weather stations in question had a min/max thermometer that would have to be reset after taking a reading. However, that leads to additional questions. Why was this done? Was this a local weather station initiative or a requirement of the NWS or its equivalent? It seems that if it was part of a large program there would have been standards specifying the time of day to take readings. In that case the TOB would seem to be minimized. Is using the min/max midpoint a suitable proxy for the average daily temperature?

  58. John Robinson says:

    Has NOAA adjusted the USCRN dataset yet? If not I guess it’s just a matter of time before they do, the “narrative” requires it.

  59. Ron C. says:

    I have recently published a study on this subject.
    Adjustments Multiply Warming at US CRN1 Stations

    A study of US CRN1 stations, top-rated for their siting quality, shows that GHCN adjusted data produces warming trends several times larger than unadjusted data.

    The full text and supporting excel workbooks are available here:


  60. Ron C. says:

    Update to Adjustments Warming US CRN#1 Stations

    In response to a comment, this post shows the effect of GHCN adjustments on each of the 23 stations. The average station was warmed by +0.58 C/Century, from +0.18 to +0.76, comparing adjusted to unadjusted records.

    19 station records were warmed, 6 of them by more than +1 C/century. 4 stations were cooled, most of the total cooling coming at one station, Tallahassee.

    So for this set of stations, the chance of adjustments producing warming is 19/23 or 83%.

    Details are here: https://rclutz.wordpress.com/2015/03/19/update-to-adjustments-warming-us-crn1-stations/

  61. bobmaginnis says:

    If they considered the cooling effect of artificial irrigation, which in the cornbelt has increased in the last 100 years, they should probably adjust the trend even higher. The evaporation of just a few millimeters of water at the surface requires as much energy as heating the entire miles high column of atmosphere above it 1 degree C.
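    bobmaginnis’s energy comparison is roughly right, as a back-of-envelope check shows (standard textbook values for the latent heat of vaporization, the specific heat of air, and the mass of the atmospheric column):

```python
# Does evaporating a few millimetres of water take about as much
# energy as warming the whole air column above it by 1 C?
L_v = 2.5e6        # latent heat of vaporization, J per kg of water
c_p = 1004.0       # specific heat of air, J per (kg K)
col_mass = 1.0e4   # mass of the atmospheric column, ~10 tonnes per m^2

evap_energy = L_v * 4.0            # 4 mm of water = 4 kg per m^2
warm_energy = c_p * col_mass * 1.0 # warming the column by 1 K

# The two energies are the same order of magnitude:
assert 0.5 < evap_energy / warm_energy < 2.0
```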

