Corrected RCP Scenario Removal Fractions

February 6th, 2020 by Roy W. Spencer, Ph. D.

Well, as I suspected (and warned everyone) in my blog post yesterday, a portion of my calculations was in error regarding how much CO2 is taken out of the atmosphere in the global carbon cycle models used for the RCP (Representative Concentration Pathway) scenarios. A few comments there said it was hard to believe such a discrepancy existed, and I said so myself.

The error came from using the wrong baseline number for the “excess” CO2 (atmospheric CO2 content above 295 ppm) that I divided by in the RCP scenarios.

Here is the corrected Fig. 1 from yesterday’s post. We see that during the overlap between Mauna Loa CO2 observations (through 2019) and the RCP scenarios (starting in 2000), the RCP scenarios do approximately match the observations for the fraction of atmospheric CO2 above 295 ppm.

Fig. 1. (corrected) Computed average yearly rate of removal of atmospheric CO2 above a baseline value of 295 ppm from (1) historical emissions estimates compared to Mauna Loa CO2 data (red), (2) the RCP scenarios used by the IPCC CMIP5 climate models (lower right), and (3) a simple time-dependent CO2 budget model forced with historical emissions before, and EIA-based assumed emissions after, 2018 (blue). Note the time intervals change from 5 to 10 years in 2010.

But now, the RCP scenarios have a reduced rate of removal in the coming decades during which that same factor-of-4 discrepancy with the Mauna Loa observation period gradually develops. More on that in a minute.

First, I should point out that the CO2 sink (removal rate) in terms of ppm/yr in three of the four RCP scenarios does indeed increase in absolute terms from (for example) the 2000-2005 period to the 2040-2050 period: from 1.46 ppm/yr during 2000-2005 to 2.68 ppm/yr (RCP4.5), 3.07 ppm/yr (RCP6.0), and 3.56 ppm/yr (RCP8.5). RCP2.6 is difficult to compare to because it involves not only a reduction of emissions, but actual negative CO2 emissions in the future from enhanced CO2 uptake programs. So, the RCP curves in Fig. 1 should not be used to infer a reduced rate of CO2 uptake; it is only a reduced uptake relative to the atmospheric CO2 “overburden” above pre-industrial levels of CO2.

How Realistic are the Future RCP CO2 Removal Fractions?

I have been emphasizing that the Mauna Loa data are extremely closely matched by a simple model (blue line in Fig. 1) that assumes CO2 is removed from the atmosphere at a constant rate of 2.3%/yr of the atmospheric excess over a baseline value of 295 ppm.

OK, now actually look at that figure I just linked to, because the fit is amazingly good. I’ll wait….

Now, if I reduce the model-specified CO2 removal rate from 2.3 to 2.0%/yr, I cannot match the Mauna Loa data. Yet the RCP scenarios insist that value will decrease markedly in the coming decades.
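For readers who want to experiment, here is a minimal sketch of the simple budget model described above (this is not the post's actual code, and the emissions series below is only a placeholder to be replaced with the historical and EIA-based estimates):

```python
# Minimal sketch of the simple CO2 budget model described in the post
# (not the author's actual code): CO2 above a 295 ppm baseline is
# removed at a fixed fraction per year. The emissions series is a
# placeholder for historical/EIA estimates expressed in ppm/yr.

BASELINE_PPM = 295.0

def run_budget_model(start_ppm, emissions_ppm_per_yr, removal_rate=0.023):
    """Step atmospheric CO2 forward one year at a time."""
    ppm = start_ppm
    series = [ppm]
    for e in emissions_ppm_per_yr:
        sink = removal_rate * (ppm - BASELINE_PPM)  # ppm removed this year
        ppm += e - sink
        series.append(ppm)
    return series

# Illustrative run only: a constant 4 ppm/yr of emissions for 61 years,
# starting near the 1959 Mauna Loa value, at 2.3%/yr vs 2.0%/yr removal.
print(run_budget_model(316.0, [4.0] * 61, removal_rate=0.023)[-1])
print(run_budget_model(316.0, [4.0] * 61, removal_rate=0.020)[-1])
```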

Who is correct? Will nature continue to remove 2.0-2.3%/yr of the CO2 excess above 295 ppm, or will that removal rate drop precipitously? If it stays fairly constant, then the future RCP scenarios are overestimating future atmospheric CO2 concentrations, and as a result climate models are predicting too much future warming.

Unfortunately, as far as I can tell, this situation cannot be easily resolved. Since that removal fraction is MY metric (which seems physically reasonable to me), but is not how the carbon cycle models are built, it can be claimed that my model is too simple, and does not contain the physics necessary to address how CO2 sinks change in the future.

Which is true. All I can say is that there is no evidence from the past 60 years (1959-2019) of Mauna Loa data that the removal fraction is changing…yet.

There is no way for me to win that argument.


284 Responses to “Corrected RCP Scenario Removal Fractions”


  1. Sisyphus says:

    So what exactly is the reasoning behind the RCP reduced uptake curves? Therein resides the mechanism which needs cogent address…

    • Roy W. Spencer says:

      Good question. I have no idea.

      • wert says:

        I guess there has been speculation about reduced ocean uptake due to surface warming, and a disregard of global greening. This is an area where claims are possible and fears can be promoted as theory requiring action.

      • Leitwolf says:

        Excuse me, but I did answer it already in my post a few days ago.

        http://www.drroyspencer.com/2020/02/will-humanity-ever-reach-2xco2-possibly-not/#comment-427625

        The RCP uptake model is extremely simple, and in all scenarios they have a secondary reservoir which is exactly twice (!!) the size of the atmosphere. You just need to look at the accumulated amount of CO2 sinks to find this out. Let's go through it.

        First of all, the RCP model as I have it starts in the year 1765 with 278 ppm of CO2 in the atmosphere. Then again, what is the correct conversion factor for ppm to Gt of CO2? 7.8 or 8.1? I'll stick to the latter for the sake of argument. The figures are not perfectly precise, but good enough to show what goes on. So..

        RCP3: By 2034 the atmosphere holds 405 ppm, which at a cut-off of 278 ppm for “natural levels” means an additional 1030 Gt of CO2. Also by 2034, CO2 sinks have taken up an accumulated 2050 Gt of CO2. At this point in their model, CO2 sinks turn negative, as the secondary reservoir holds about 2 times as much CO2 as the atmosphere and thus is equally “full”.

        RCP4.5: Same story here. Since they expanded their models up to the year 2500(!?) things turn relatively simple. By the year 2500 atmospheric CO2 is at a stagnant 543 ppm, which translates into 2138 Gt of anthropogenic CO2. CO2 sinks at this point have taken up an accumulated 4097 Gt of CO2, slightly less than double that amount. So the secondary reservoir is not quite as full as the primary one, and thus CO2 sinks still take up some 2.47 Gt p.a.

        So even though I cannot name the exact parameters, that part with the 2x-sized secondary reservoir is quite obvious, I guess.
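A minimal sketch of the bookkeeping check the commenter describes, using the figures quoted in the comment as-is (the 8.1 Gt-per-ppm conversion and the RCP numbers are not independently verified here):

```python
# Sketch of the bookkeeping check described in the comment above, using
# the commenter's own numbers (conversion factor and RCP figures are
# taken from the comment as-is, not independently verified).

GT_PER_PPM = 8.1          # commenter's chosen conversion, Gt CO2 per ppm
BASELINE_PPM = 278.0      # pre-industrial level used in the RCP files

def sink_to_excess_ratio(ppm_now, accumulated_sink_gt):
    """Atmospheric excess (Gt CO2) above baseline and the sink/excess ratio."""
    excess_gt = (ppm_now - BASELINE_PPM) * GT_PER_PPM
    return excess_gt, accumulated_sink_gt / excess_gt

# RCP3, commenter's figures for 2034: 405 ppm, ~2050 Gt of accumulated sinks
print(sink_to_excess_ratio(405.0, 2050.0))   # excess ~1030 Gt, ratio close to 2
# RCP4.5, commenter's figures for 2500: 543 ppm, ~4097 Gt of accumulated sinks
print(sink_to_excess_ratio(543.0, 4097.0))   # excess ~2150 Gt, ratio just under 2
```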

    • barry says:

      Yes, I’ve read many times in various places that because gases are less soluble in warmer water, the ocean uptake will diminish in the future. This contention appears in the IPCC First Assessment Report. There is more detail in the Third Assessment Report.

      https://www.ipcc.ch/report/ar3/wg1/

      Chapter 3 – THE CARBON CYCLE AND ATMOSPHERIC CARBON DIOXIDE

      See section 3.2.3.3 – Future changes in ocean CO2 uptake – for this and other physical processes that could impact CO2 sink rate.

      • rah says:

        Ok. So can someone explain to me why the IPCC models are correct when, according to the accepted science, atmospheric CO2 levels during the Cambrian-Ordovician-Silurian periods ran around 5,000 to 6,000 ppm? That was the period when the greatest explosion of life in the oceans occurred, and mollusks first appear in the fossil record along with a plethora of hard corals.

        And then there were the Jurassic-Cretaceous periods, when atmospheric CO2 levels ran around 4,000 ppm.

        Were ocean temps cooler in those periods than now despite the high levels of CO2 and thus able to uptake more CO2?

        • bdgwx says:

          Solar forcing was about -12.0 W/m^2 600 million years ago. It would take about 3000 ppm of CO2 to create a +12.3 W/m^2 offsetting forcing back then just to keep the temperature the same as today, assuming all other things remained equal. The extra CO2 at 6000 ppm relative to 3000 ppm only generates +3.7 W/m^2 of forcing. It was several degrees warmer during the Cambrian period, so that matches up reasonably well. Although there are many forcing components to consider, just using the net effect of only solar and GHG provides a pretty close match through most of the paleoclimate record. The big point here…the Sun was dimmer in the past.
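A rough check of the numbers quoted above, assuming the commonly used simplified CO2 forcing expression dF = 5.35 ln(C/C0) W/m^2; the comment does not say which formula was actually used, so this is an assumption:

```python
# Rough check of the figures quoted above, assuming the commonly used
# simplified CO2 forcing expression dF = 5.35 * ln(C/C0) W/m^2
# (the comment does not specify a formula; this one is an assumption).
import math

def co2_forcing(c_ppm, c0_ppm):
    """Approximate change in CO2 radiative forcing, W/m^2."""
    return 5.35 * math.log(c_ppm / c0_ppm)

print(co2_forcing(6000, 3000))   # ~ +3.7 W/m^2 for the "extra" CO2 above 3000 ppm
print(co2_forcing(3000, 300))    # ~ +12.3 W/m^2, roughly offsetting a ~12 W/m^2 dimmer Sun
```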

    • Nate says:

      As noted in the last article, the CO2 pressure in the ocean mixed layer, Pml, is closely tracking Patm. Thus NEW annual emissions are almost immediately shared with ML, and to some extent the land biosphere.

      Thus the fraction removed may have little to do with the Patm - P(1895) that is in the model. Rather it is proportional to annual emissions. This explains the fraction removed being ~ constant.

      It just so happens the Patm - P(1895) history is similar in shape to the history of emissions. So the model works for now.

      That won't be true in the future, when emissions flatten but Patm - P(1895) continues to grow.

      In the longer term, Pml and Pdeep must equilibrate.

      • Nate says:

        Corr: ‘Rather the amount removed should be proportional to emissions, and the fraction removed should be ~ constant.’
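A small numerical illustration of the point being made here: if emissions grow roughly exponentially, a sink proportional to the excess above a fixed baseline ends up removing a nearly constant fraction of each year's emissions, so the two descriptions are hard to distinguish from the historical record alone (the growth and removal rates below are illustrative, not fitted to anything):

```python
# Small numerical illustration: if emissions grow roughly exponentially,
# a sink proportional to the excess above a fixed baseline removes a
# nearly constant fraction of each year's emissions, so the two pictures
# are hard to tell apart historically. Rates here are illustrative only.
import math

g = 0.02     # emissions growth rate per year (illustrative)
k = 0.023    # removal rate applied to the excess above the baseline
e0 = 1.0     # initial emissions in ppm/yr (arbitrary scale)

excess = 0.0
for year in range(150):
    emissions = e0 * math.exp(g * year)
    sink = k * excess                    # removal proportional to excess
    excess += emissions - sink
    if year in (49, 99, 149):
        # fraction of that year's emissions removed settles near a constant
        print(year + 1, round(sink / emissions, 3))
```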

      • Svante says:

        Nate says:
        “NEW annual emissions are almost immediately shared with ML”.

        That makes a lot of sense Nate.

    • TimTheToolMan says:

      So what exactly is the reasoning behind the RCP reduced uptake curves?

      Because they only consider the chemistry and not the life behind it.

  2. Perfecto says:

    Euan Mearns found a similar CO2 sequestration rate of 2.5% / year. He includes some discussion of the Bern model that seems to be the basis for IPCC work.

    http://euanmearns.com/the-half-life-of-co2-in-earths-atmosphere-part-1/

    Understanding “Carbon Feedbacks in the Climate System” is considered a Grand Challenge of climate research.

    https://www.wcrp-climate.org/grand-challenges/grand-challenges-overview

    Regarding land and ocean carbon sinks, the question of “How do they currently operate?” seems to be open.

  3. Elliott Bignell says:

    Carbon reuptake is a process which human activity can presumably modulate to some degree. Any physical model, equally presumably, attempts to factor in the response of vegetation and soils to climate change. The two cannot, however, be separated, so the model scenarios must contain implicit or explicit assumptions about policy. Policy is an open question and also an opportunity. Once again, it appears we can only hope to rely on an aggregate of many scenarios for projection.

    Even leaving aside the complexity of soil and vegetation systems, it is not surprising if this is an area open to considerable uncertainty.

  4. Steve McGee says:

    The budgets appear uncertain.

    Some of the uptake could be going into increased plant and phytoplankton growth, both predicted to increase with increased CO2.

    Some of the uptake could be going into the oceans, particularly the Southern Ocean which is also thought to be taking up heat.

    Ocean uptake of CO2 increases exponentially with wind speed:
    https://media.springernature.com/original/springer-static/image/chp%3A10.1007%2F978-3-642-25643-1_2/MediaObjects/978-3-642-25643-1_2_Fig10_HTML.png

    That would be consistent with CO2 induced cooling over Antarctica.

    And, this paper indicates an increase in oceanic speeds:
    https://www.the-scientist.com/news-opinion/global-ocean-circulation-is-speeding-up-67066

  5. A by-line to the above note: global ocean circulation speeding up, and being connected to growing sequestration of CO2 into the ocean, would probably result in increased upper-layer bioproductivity and also increased ocean-floor deposition of organic matter.

  6. Perfecto says:

    The main reference for the Bern carbon model seems to be Joos et al. 1996 “An efficient and accurate representation of complex oceanic and biospheric models of anthropogenic carbon uptake” (free online). The paper lists many decay constants and amplitudes in the appendix, but the only model validation appears to be comparison to other models and some cursory discussion of C14 concentration.

    Is there a validation paper for the consensus carbon model? What are the parameter uncertainties?

    • I don’t know. Yes, it looks like they have an approximation to the carbon cycle models so it can run fast. I wish I had more time to look into this issue; I’m sure the papers those models are based upon describe why the CO2 uptake changes on long time scales. It would have to be related to some quasi-“saturation” effect that causes a major CO2 sink to be less efficient over time.

  7. Greg Goodman says:

    Here is a quick comparison I did years ago between ICOADS SST and the *rate of change* of CO2 (MLO).

    https://climategrog.files.wordpress.com/2013/12/ddtco2_sst_15mlanc.png

    The similarity between ICOADS and d/dt(CO2) is striking. Also note the underlying rise in both. This may reflect Henry’s law relating to partial pressures.

    Basic physics would refute the idea that you can expect a constant flat-rate percentage like 2.3%. This will diminish both as SST rises (Henry) and as pCO2 in the oceans increases. Both of these are known facts. There may be some more realistic middle ground between Dr. Spencer’s fixed 2.3% and RCP4.5.

  8. barry says:

    IPCC references on physical processes that could affect CO2 uptake rate.

    Third Assessment Report

    https://www.ipcc.ch/report/ar3/wg1/

    Chapter 3 THE CARBON CYCLE AND ATMOSPHERIC CARBON DIOXIDE

    Section 3.2.3.3 Future changes in ocean CO2 uptake

    Fourth Assessment Report

    https://www.ipcc.ch/report/ar4/wg1/

    Chapter 7 – COUPLINGS BETWEEN CHANGES IN THE CLIMATE SYSTEM AND BIOGEOCHEMISTRY

    Section 7.3.4 – Ocean Carbon Cycle Processes and Feedbacks to Climate

    Fifth Assessment Report

    https://www.ipcc.ch/report/ar5/wg1/

    Chapter 6 – CARBON AND OTHER BIOGEOCHEMICAL CYCLES

    Section 6.3.2.4 – Carbon Dioxide Airborne Fraction [observations]

    Section 6.4 – Projections of Future Carbon and Other
    Biogeochemical Cycles

  9. Tim S says:

    I have a simple question. Are they using Henry’s Law? Good data should be available.

  10. Midas says:

    Watch these 5 videos for an analysis of the uptake of CO2.
    (1) https://tinyurl.com/Carbon-Cycle-Overview
    (2) https://tinyurl.com/Anthrop-Carbon-Keeling-Crve
    (3) https://tinyurl.com/Ralph-Keeling-Missing-Carbon
    (4) https://tinyurl.com/Conversation-w-Ralph-Keeling
    (5) https://tinyurl.com/Carbon-Uptake-Land-Ocean

    The first is only an overview of the carbon cycle. The rest are relevant to this discussion.

  11. Entropic man says:

    Tim S

    There was some discussion of this a while back at Realclimate. Henry’s Law was part of it.

    A number of papers suggested that the ocean carbon sink would take up less CO2 over time.

    http://www.realclimate.org/index.php/archives/2007/11/is-the-ocean-carbon-sink-sinking/

    • barry says:

      That appears to be the consensus view to date, whereas views on future land uptake are less uniform.

    • Entropic man says:

      Just spent a few minutes playing with Henry’s Law.

      Around 14C, the equilibrium CO2 concentration in surface waters increases in proportion to atmospheric CO2.

      An increase of 1K reduces dissolved CO2 by 5%.

      What would that do to conditions in 2050, when CO2 is expected to double from 1880 and temperature to increase by 3C?

      CO2 in surface waters would increase by 100% due to increased partial pressure and decrease by 15% due to increased temperature.

      That is an increase of 85%, an effective decrease in the % of CO2 stored in the ocean sink.
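A quick sketch of this back-of-envelope estimate, taking the commenter's figures (5% solubility loss per kelvin, 3 K of warming, a doubling of CO2) as-is and showing two readings of the arithmetic:

```python
# Quick sketch of the back-of-envelope Henry's law estimate above.
# The 5%-per-kelvin solubility loss, 3 K of warming, and CO2 doubling
# are the commenter's figures, taken as-is; two readings of the
# arithmetic are shown.

co2_ratio = 2.0      # atmospheric CO2 doubles relative to 1880
warming_k = 3.0      # assumed warming by 2050
loss_per_k = 0.05    # fractional solubility loss per kelvin of warming

# Commenter's additive reading: +100% from pressure, minus 15 points -> 1.85
additive = 1.0 + (co2_ratio - 1.0) - loss_per_k * warming_k
# Multiplicative reading: the 15% reduction applied to the doubled value -> 1.70
multiplicative = co2_ratio * (1.0 - loss_per_k * warming_k)

print(additive, multiplicative)   # surface-water CO2 relative to its 1880 value
```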

  12. bdgwx says:

    Another interesting angle is the duration of the pulse. We already know that the IPCC advocates for longer adjustment times for larger pulse sizes. But what about pulse duration? The reason I ask is because the ~100 ppm pulse from the glacial eras to the interglacial eras took thousands of years yet the depletion back toward the glacial level takes upwards of 100,000 years to play out sometimes. Additionally the PETM pulse was an order of magnitude less in terms of duration and yet it hung around for thousands of years.

    • Stephen Paul Anderson says:

      This is total garbage. A 100 ppm pulse that took thousands of years is not a pulse. Also, we know nothing about e time or even if the proxy data has any real validity. Instead of advocating why doesn’t the IPCC do some real science? You don’t even realize the continuity equation is a mass balance equation.

      • bdgwx says:

        A pulse is an increase in mass/ppm in the atmosphere. That increase can be quick or slow. It’s still an increase and still a pulse. If your challenge is with the terminology then please provide another word that is less offensive to you and I’ll use that instead.

        • Stephen Paul Anderson says:

          A pulse implies an instantaneous increase. We know this is not an instantaneous increase. That is very old proxy data you are trying to use.

          • Stephen Paul Anderson says:

            Also, you’re only using this odd argument because there’s no physical or mathematical basis for your arguments using contemporary data.

      • bdgwx says:

        Is “release” acceptable to you? I’m good with it if you are.

  13. Old Chemist says:

    CO2 + H2O ⇌ H2CO3 (carbonic acid); H2CO3 + CO3(2-) (carbonate) ⇌ 2 HCO3(-) (bicarbonate).

    The oceans are basically carbonate-buffered solutions. The amount of carbonate available to the oceans in the Earth’s crust is orders of magnitude greater than the CO2 in the oceans, which in turn is some 60x greater than the CO2 in the atmosphere. There are multiple dynamic equilibria going on. CO2 dissolves in water and is released from water into the air (Henry’s Law). Dissolved CO2 forms carbonic acid, which is a weak acid and ionized only to a small extent. Calcium carbonate, the predominant form of carbonate, is soluble only to a very low level (13 mg/L) but is very available. Rates of chemical reactions (greater than 1st order) are dependent on concentration and temperature. If the ocean warms, the equilibrium between CO2 in the air and in the water shifts towards the air, but the amount of soluble carbonate, the rate of its solubilization, and the rate of bicarbonate formation will increase. The oceans can easily sequester the amount of fossil-fuel-produced CO2 as bicarbonate. The pH of bicarbonate is around 8.2, which is why “ocean acidification” is a red herring.

    • Nate says:

      The measured reduction in pH up to now is 0.1, expected to be 0.2 with CO2 doubling.
      Is that wrong?

      In the distant past, in the hothouse Earth with 5-10x preindustrial CO2, the acidity wreaked havoc on the ocean ecosystem.

    • Bob says:

      OC
      Let me guess – you’re one of those nuts who believes “acidification” requires the pH to drop below 7.

    • Stephen Paul Anderson says:

      You’re right on, OC. Not only are the ocean surface and the atmosphere in equilibrium, but this doesn’t tell us the direction of flow. However, if the direction of flow is from deep ocean to surface ocean to atmosphere, you’d expect pH to increase. Going around measuring ocean pHs doesn’t tell you anything unless you know the pH of the ocean at every point, coupled with all the directions of flow, and have a mass balance at the same time.

      • bdgwx says:

        Ocean pH is declining so we know the direction of flow is from the atmosphere into the hydrosphere. Therefore the hydrosphere is not the source of mass for the increase in the atmosphere.

        • Stephen Paul Anderson says:

          Don’t think so. The flow could be from deep ocean to surface ocean to atmosphere. pH doesn’t really indicate direction of flow.

          • bdgwx says:

            Is deep ocean carbon C13 depleted like fossil carbon?

            Can deep ocean processes source a carbon increase in the shallow ocean that quickly?

            Do we have evidence of similar carbon increases in the shallow ocean that were known to be sourced by the deep ocean in the paleoclimate record?

            Why were these deep ocean processes dormant over the holocene and then suddenly activated at the exact same time as the industrial revolution?

            Where did all of that fossil carbon (I’m talking mass here; not isotope tracers) go if not into the atmosphere and hydrosphere?

            Why is ocean pH declining?

            What kind of tests can you think of that can falsify the deep ocean -> shallow ocean -> atmosphere hypothesis?

          • Stephen Paul Anderson says:

            I don’t think we know a lot about the deep ocean, but the IPCC believes carbon flows from the surface to the deep ocean and from the deep ocean to the surface, and that there is an equilibrium.

          • Stephen Paul Anderson says:

            What kind of tests? I’ll ask you the same question. But if most of the increase from 1750 to 2020 is due to natural emission, that would be a strong falsifier of the ocean being a net sink, wouldn’t it?

          • Stephen Paul Anderson says:

            Why is ocean pH declining? I’m not sure that it is. But it seems reasonable that it would be, since the atmosphere is in equilibrium with the surface ocean, and since atmospheric CO2 is increasing, surface ocean CO2 should be increasing.

          • bdgwx says:

            Tests of the deep ocean -> shallow ocean -> atmosphere hypothesis might include…

            Sample deep ocean carbon and see if it is C13 depleted.

            Sample deep ocean carbon and see if it is C14 depleted.

            Sample deep ocean pH and see if it is increasing.

            Sample shallow ocean pH and see if it is declining.

            See if the paleoclimate record has evidence of large atmospheric increases of equivalent magnitude and duration.

            See if the paleoclimate record has evidence of large shallow ocean increases of equivalent magnitude and duration.

            See if the deep ocean is warming at a rate that can explain the increase in mass of the shallow ocean.

            I don’t know what the expectation would be, but I am envisioning tests of oxygen levels in the deep ocean, shallow ocean, and atmosphere.

            I’m sure there are more tests that could be done. I just thought of these off the top of my head. It would be nice to have an expert weigh in on this topic.

          • bdgwx says:

            A decline of pH, at least in the shallow ocean, would be an expectation of both atmosphere -> shallow ocean and deep ocean -> shallow ocean net flows. An observation of an increase of pH in this layer would falsify both hypotheses. The evidence of declining pH in the shallow ocean is pretty convincing though.

            So really all that test suggests is that the net flux of mass into the shallow ocean is positive. As long as more carbon is moving deep ocean -> shallow ocean than is moving shallow ocean -> atmosphere, the complete flow deep ocean -> shallow ocean -> atmosphere is still consistent with shallow ocean pH observations.

            The problem is that pesky mass from the fossil/cement/land-use reservoirs. No matter what model you use to dispatch it, that mass got incorporated into the carbon cycle in a geological blink of an eye. Trying to pretend like that mass doesn’t matter is problematic to say the least. It definitely throws a monkey wrench into alternative hypotheses.

          • Stephen Paul Anderson says:

            Currently, with the information available, there is no way to know.

  14. Bjarne Bisballe says:

    All Gts are gigatonnes of CO2
    We are informed that fossile 36 Gt is released to the atmosphere every year right now. That figure will increase with 0.6% each year. From Mauna Loa we know that approx 2 ppm stays in the atmosphere and as a ppm is 7.81 Gt, approx. 16 Gt stays in the atmosphere, so 20 Gt of the 36 Gt disappear with 8 Gt in the ocean and 12 Gt in the plants. As there is 3200 Gt in the atmosphere and it goes up with 16 Gt in a year, the level is 0,5% up each year. Thus the traffic the to ocean and the plants is also 0.5% up every year. 36/06 up and 20/05 down ends up with 16 up with 0,2%/year (weighted) and that can be calculated with a ‘compound interest’ table. It shows a very linear increase in the concentration (approx. 2ppm/yr) BUT: Plants grow from that CO2, and the earth is greening at at rate of 0.4%. (NASA?)
    With repect to this (again calculated with a ‘compound interst’ table), the increase is slowing down so much, that after 100 years and ahead the curve for the atmosphere concentration will slowly go near horisontal at a level of 280 – 290 ppm above the level of today and that will happen after approv 150 years.
    Conclusion: If we continue as today (36/0.6%), the max CO2 level will be approx 700 ppm in approx 150 years and it will not go higher. – sorry for my english

  15. ren says:

    Sorry.
    Subzero temperatures combined with a brisk north wind will send AccuWeather RealFeel® Temperatures plummeting to dangerously cold levels Thursday morning.
    https://wordpress.accuweather.com/wp-content/uploads/2020/02/RFThursAM.jpeg?w=632

  16. Rcik Jafrate says:

    From a qualitative point of view…

    1. With such tiny CO2 concentrations, it’s not surprising that a simple model fits the data. (i.e. the bottoms of exponential curves can be approximated by a straight line)

    2. Old Chemist makes a compelling case for a very large reservoir which seems to be consistent with the simple model.

    3. If the growth rate, of plant and other CO2 gobblers, increases with CO2 concentration, it would seem to me that biological uptake would increase exponentially with CO2 concentration. At the very least one would expect a proportional increase in uptake.

    Thanks Dr. Spencer for hosting and stimulating interesting discussions such as this.

    • Scipio says:

      3. If the growth rate, of plant and other CO2 gobblers, increases with CO2 concentration, it would seem to me that biological uptake would increase exponentially with CO2 concentration.

      OK, is that true? Even if so, that hardly means CO2 concentration in the atmosphere isn’t also increasing exponentially.

      an exponential – an exponential = an exponential

      • Stephen Paul Anderson says:

        Show me your derivation that atmospheric CO2 is increasing exponentially and what exactly does that mean?

        • Svante says:

          It means that the derivative is increasing.
          You can ask Bart about it.
          https://tinyurl.com/yyfoompa

          • Stephen Paul Anderson says:

            So, based upon your chart it looks like it has increased exponentially before. And if you look at ice core data today’s temperatures are within the realm of natural variability.

          • bdgwx says:

            Look at those mass relaxation rates. What would you say is the e-time for the last glacial cycle? 50,000 years or so?

          • Stephen P Anderson says:

            Who knows what e time is during a glacial.

          • bdgwx says:

            SPA,

            This isn’t a trick question.

            How long does it take for the CO2 concentration to relax from its interglacial peak to its glacial trough? What is the time in terms of one e-fold?

            Almost everybody can answer this question. This only requires elementary school level skills. You don’t even need a calculator to estimate it. The only thing you need is the knowledge that one e-fold is about 37% of the original.

            Give it a shot.

        • Stephen P Anderson says:

          I think e times during a glacial must be very long but there is no way to calculate it from that chart.

          • bdgwx says:

            The adjustment time is easy to estimate. You just eyeball it. It is the point on the ppm relaxation where it reaches 37% of the original increase. It’s about 50,000 years give or take.

            The residence time is impossible to calculate because this chart does not present exchange rates or proxies (like isotope depletion) from which it can be calculated. I don’t think there is any compelling reason to believe exchange rates were substantially different in the past so it is not unreasonable to assume (or use as the null hypothesis) that RT was on the order of a decade or so.

          • Stephen Paul Anderson says:

            It is the way mother Earth adjusts and keeps plants alive during an ice age.

          • Stephen Paul Anderson says:

            No, e-time (or Harde’s residence time) is going to be very, very long. It is the time it takes to move 0.63 of the distance from the level to the balance level. Tens of thousands of years during a glacial.

          • Stephen Paul Anderson says:

            You actually might have an argument during a glacial that anthropogenic emission will raise CO2 levels very high. AE will be much higher than NE during a glacial, if any people are left.

  17. Entropic man says:

    “3. If the growth rate, of plant and other CO2 gobblers, increases with CO2 concentration, it would seem to me that biological uptake would increase exponentially with CO2 concentration. At the very least one would expect a proportional increase in uptake. ”

    You would also get a larger decay rate when the Northern Hemisphere growing season ends.

    To test this hypothesis look for an increase in the amplitude of the annual CO2 cycle.

    Unfortunately greening will take up more CO2 during the growing season, but most of it does not go into long term storage.

    • Sisyphus says:

      “Unfortunately greening will take up more CO2 during the growing season, but most of it does not go into long term storage.”

      Interesting. What might we define as “long term storage” in this context?

      • Entropic man says:

        Most of the growth each year is annual plants which decay at the end of the growing season and return their CO2 to the atmosphere.

        Perennials such as trees store some of the CO2 in their trunks for a few years or a few centuries, then decay.

        Peat bogs accumulate CO2 which may be stored for longer periods. The peat in Ireland is mostly 5000 years old.

        Siberian or Canadian permafrost is peat which may have been frozen since the Eemian interglacial 100,000 years ago.

  18. Sisyphus says:

    It would seem as though the core matter which undergirds most all of the current alarm message must therefore turn on two central issues:

    1) Substantive ocean warming is due to manmade CO2 emissions, entailing a marked reduction in the ocean’s capacity as a CO2 sink as time moves forward; and

    2) CO2 sequestration by the oceans has a reasonably short-term viability envelope in any case, due to both saturation of the sink and a proposed subsequent destruction of that biological system by a resultant pH lowering effect.

    Criteria 1 seems unlikely to be a prime factor when one surveys clean data sources. Natural warming appears to be the dominant phenomenon moving forward.

    Criteria 2 is the greater menace if viable, objective evidence can provide tangible insight to the extent necessary to reliably predict catastrophic outcomes for this system from forecast atmospheric CO2 emission levels.

    If criteria 2 cannot be sustained without recourse to unlikely scenarios and abstract theory, then the whole matter collapses; and we remain in the world of opinion, conjecture, and hysteria…

    • bdgwx says:

      In regards to criteria 1 what “clean” data sources are you referring to?

      • Sisyphus says:

        Any source which follows good practice and does not convey any of an assortment of agendized tenets in its aims.

        The core matter of criteria 1 is made null when one attempts to correlate global CO2 emissions with the observed climate warming phenomena over time. Take out the agendas and the contaminated datasets, and any reasonable pattern disappears.

        In sum, criteria 1, as a bludgeoning instrument which targets historic/present/future manmade CO2 emissions, falls apart with any sensible examination of the unvarnished facts. Relevantly, the actual mild natural warming of the climate system falls well outside of the alarm message scope.

        As I like to say, choose your own data source(s) without bias, and make up your own mind.

      • Nate says:

        “Criteria 1 seems unlikely to be a prime factor when one surveys clean data sources. Natural warming appears to be the dominant phenomenon moving forward”

        What natural sources explain the data? Please show us a natural-only model fitting the temp record.

        Which data sets are clean and how do you determine that?

        • Sisyphus says:

          Look to the long-term historical records which do not bear urban heat island effects, are properly sited, and show evidence of no substantial metrological aberrations throughout their record spans. Then, compare those data side-by-side over time with manmade global carbon emissions tracking from sources like the DOE.

          There’s your fishing pole, folks. You’re certainly smart enough to get the job done for yourselves…

        • Nate says:

          ‘Which data sets are clean?’

          There are a dozen or so compiled data sets. But you can’t point to one?

          ‘Urban heat island effects’: Berkeley Earth has looked carefully at this issue and finds that the data homogenization methods of the other main data sets are working well.

          Europe would be expected to have its urban heat island effects peak in the 19th or early 20th century, before Asia, and way before Africa.

          But if you look at the temp records of different continents (Americas, Africa, Asia, Europe), they all show the same pattern: a low trend until ~1980, then a sharp upturn with a high trend thereafter.

          This is a global effect. https://tinyurl.com/stwup9l

          • Sisyphus says:

            “There are a dozen or so compiled data sets. But you can’t point to one?”

            I most certainly can, but I deliberately won’t point to any one or more.

            The time has come for people to transition their energy from pointless argumentation and model promotion, and look to clean data for what nature has to say (and has been saying) in plain sight. It takes work to sort, but the data is there for anyone who is truly invested in this topic.

            So, survey those “dozen or so compiled data sets” for yourself, do the work, and make up your own mind.

          • bdgwx says:

            We have already surveyed known datasets. None support your assertion. So if you know of something no one else does then now is the time to lay your cards on the table.

          • Nate says:

            ‘ It takes work to sort, but the data is there for anyone who is truly invested in this topic.’

            ‘what nature has to say (and has been saying) in plain sight. ‘

            If it’s in plain sight, then not much work seems to be needed.

            Unspecified ‘sort’.

            So I assume this means cherry-pick to your liking until you get what nature is already plainly whispering in your ear, Sisyphus.

            Rather than use a systematic, published, homogenization method, which is the work that has been done to produce the main sets.

    • Entropic man says:

      Regarding Criterion 1), the Henry’s Law calculation for a doubling of CO2 and consequent 3C warming projects that while atmospheric CO2 doubles, the CO2 content of the surface ocean would only increase by 85%.

      That is a 15% decrease in the effectiveness of the ocean sink as a buffer of CO2 increase.

      There is no natural warming. Measure all the natural forcings and their net combined effect is about 0.05C since 1880. Since we observe 1C of warming over the same time, 105% of the warming is artificial.

      Criterion 2) depends on whether you accept the consensus science. If you accept it, then concern is the proper response.
      If you reject the science then my concern is “opinion, conjecture, and hysteria”

      • Entropic man says:

        curses! That should have been

        ” Measure all the natural forcings and their net combined effect is about -0.05C”

  19. Dan Pangburn says:

    Why all this fuss about CO2? Simple calculations using data from Hitran show that the increase of water vapor has been about 10 times more effective than the increase of CO2 at ground-level warming.
    The measured water vapor trend has been increasing faster than possible from feedback. https://watervaporandwarming.blogspot.com

    • Nate says:

      Dan,

      Simple calculations are alluring, but in this complex world often too simple to be reliable.

      • Dan Pangburn says:

        Nate,
        I am apparently on some sort of restriction on this blog. I just posted something which would not post previously.

        What you have posted reinforces an idea that I have had for several years and that is, often, there is a fundamental difference between the reasoning process of engineers vs scientists. I do not think that is a bad thing at all but might help explain where each of us is coming from.

        Engineers (or at least in my case) are aware that anything we thought we knew or any experiment or data can be faulty. As a consequence, consciously or not, we place a confidence assessment on each factor. I remember in the distant past I was running pressure decay experiments. In solid rockets, if the pressure decay is fast enough, combustion stops and, at the time, we thought we needed to know that. I had false confidence in the instrumentation because it had been used by others in the past, but my results did not make sense. What I eventually figured out (I derived the pressure vs time for a venting cavity with burning propellant) was that instead of pressure vs time measurements I was getting the response capability of the instrumentation. Lesson learned.

        As to climate, I am aware that a lot of peer reviewed stuff has turned out to be faulty. Also, a lot of assumptions and possibly mistakes went into the GCMs. Everything that I am aware of indicates that the Clausius-Clapeyron equation is misunderstood/misapplied, but perhaps it is merely the people I have heard from.

        What I have come up with is perhaps a different perception of how the atmosphere works and what is important in it. Things that I have not seen extensively pursued by others include thermalization and the 1200-to-1 decline in WV molecules from ground to tropopause. Another factor is data from Hitran which, with a little calculation, shows that the WV increase has been about 10 times more effective than the CO2 increase at warming the planet at ground level. Another, which you assert is inadequately proven, is that measured WV is greater than possible from feedback. The calculations show that it is.

        My presumption is that any of these factors might be faulty, even if they have been accepted as proven. But put together in a rational scenario they are consistent with all available observations. They are internally consistent, i.e. the Hitran assessment and the most probable WV measurements vs. possible feedback comparison; the hash between wave number 200 & 600 in TOA graphs of flux vs wavenumber is consistent with the 1200-to-1 decline in WV population; the notches at the top of flux vs wavenumber graphs are explained by thermalization and the above-mentioned emission to space by WV; the part of warming attributable to human activity is explained by the observed 1.47% per decade rise in WV, due mostly to the increase in irrigation. “Comparisons with small-island radiosonde measurements and ground-based GPS water vapor data demonstrate rms errors of ~1.0 mm” is from the RSS documentation of WV measurements. I am not sure what that means, but 1 mm is about 3.4%. The trend should be a lot better because the uncertainty in individual measurements gets smoothed out, so most of what is left is bias and instrument drift, neither of which is mentioned.

        The common observation that cloudless nights cool faster and farther where it is dry than where it is humid demonstrates that the misleadingly named greenhouse effect exists and that it is caused by water vapor.

        Perhaps science could treat this entire scenario as a hypothesis and devise experiments to prove/falsify any of the factors. Water vapor and CO2 are both still increasing, but there is a limit to how much WV the atmosphere can hold. If CO2 continues up and average global temperature does not, they are wrong. If average global temperature continues to go up but WV does not, I am wrong. About all I can do is wait and see.

        • Nate says:

          Dan,

          “Engineers (or at least in my case) are aware that anything we thought we knew or any experiment or data can be faulty.”

          Couple of things here.

          Scientists, particularly from my experimental side, are also aware of this. That is why we demand replication before accepting controversial results.

          Too many people come to this blog and too easily dismiss the value of scientists, their methods, findings, and expertise.

          In a subject as complex as this one it takes years to become expert enough to make an original contribution.

          It is hubris to assume that an amateur dabbling in such a field, is likely to find that the many experts have overlooked something simple.

          Much more likely is that the amateur dabbler is unaware of relevant facts and details that make his ‘simple’ idea wrong.

          • TimTheToolMan says:

            “Much more likely is that the amateur dabbler is unaware of relevant facts and details that make his simple idea wrong.”

            Much more likely again is that the “amateur dabblers” who take this seriously have a much broader view of the science because scientists who do this for a living necessarily have a narrow view concentrating more directly on their funded interests.

          • Nate says:

            Oh Puleez, let’s hear about all the recent scientific advances made by the ‘amateur dabblers’

          • TimTheToolMan says:

            “let’s hear about all the recent scientific advances made by the amateur dabblers”

            It’s less about making “scientific advances” and more about having a holistic understanding of the multidisciplinary sciences that make up climate science.

          • Nate says:

            “having a holistic understanding”

            I assume you choose auto mechanics or doctors who are amateur dabblers with a ‘holistic understanding’ rather than training and expertise.

            How’s that working out?

          • TimTheToolMan says:

            I’d choose a doctor who had an interest in, and kept up with, the entire field over one who was a GP who mostly deals with colds and flus, any day.

            It’s true of all professions: the broader someone’s experience, the more valuable they are. Deep experience in one area is valuable too. But often not for long, as the field moves on.

          • Nate says:

            Let’s get real, Toolman.

            Your child faces a serious health crisis, you and your wife won’t choose an amateur dabbler to treat her.

            Like all of us, you will go with the one who has professional training, expertise, and strong track record.

    • Scipio says:

      DanP, you must submit your work to peer review or you are wasting your time

      • Dan Pangburn says:

        I actually only got involved in this because of my own curiosity.
        There are a few problems with the “peer reviewed literature” re climate change, especially in the name journals:
        1. Only about 10% of submittals get published.
        2. It takes way too long from submittal to publish.
        3. If the paper appears to disagree with their existing beliefs it is rejected without review.
        4. Paper journals do not have hot links to references.
        5. I will not perish if I do not publish.
        I do not care what others think but appreciate constructive challenges.
        Mother Nature peer reviews my stuff.

        • Scipio says:

          so you’re afraid to submit your work to a real journal because you’re afraid it will get rejected.

          yeah that’s how it goes — bad work is rejected.

          • Dan Pangburn says:

            Sci,
            A lot of published stuff turns out to be wrong, so apparently sometimes bad work is also accepted.
            I do not consider satisfying my curiosity a waste of time.

        • Scipio says:

          5. I will not perish if I do not publish.

          no one who matters will ever see your work or take it seriously if you do not have the courage to submit it to a real scientific journal

          that’s the way it is

          • Dan Pangburn says:

            Sci,
            It is simple. If, when CO2 and WV stop increasing in parallel, average global temperature follows WV, I am right. If average global temperature follows CO2, I am wrong and Hitran is wrong.

          • Svante says:

            Not going to happen, CO2 drives WV (except for what is left in the pipeline).

        • Dan,

          This is exactly how I feel about it too. But if you are curious, as you state, some people like Prof. Herman Harde do publish on this discussion. See for example “What humans contribute to Atmospheric CO2: Comparison of Carbon cycle models with observations.”
          He also published in Elsevier “Scrutinizing the carbon cycle and CO2 residence time in the atmosphere,” but peer review pulled it down after publication. He was not allowed to react in Elsevier and needed to move to another publisher to publish his response to the reactions.
          He did, however, propose to update climate models with solar and cloud-formation feedbacks that are not part of previous climate models. Interesting contributions to the discussion, I would say.

          • bdgwx says:

            Harde conflates residence time with adjustment time. Kohler et al. provide excellent commentary on the differences and how Harde confuses the two and the profound effect it has on his conclusion.

            The journal Global and Planetary Change did not retract his paper though they probably should have.

            Models have been incorporating solar and cloud forcing/feedback for quite some time.

          • Stephen P Anderson says:

            No, Harde doesn’t conflate adjustment time with residence time. Harde’s residence time is the same as Berry’s e time. You are the one conflating.

          • bdgwx says:

            SPA,

            That doesn’t even make any sense. If I’m the one saying two things are different (regardless of whether they are or not) and you’re saying they aren’t then who is doing the conflation? Even if the two concepts were the same and they could be reasonably conflated I’m still not doing anything resembling conflation.

          • bdgwx says:

            SPA,

            Let me try to give you another illustration of residence time and adjustment time.

            Consider a store that has widgets for sale. The store currently has 100 widgets on stock. Each day 49 new widgets arrive from the supplier to restock the shelf and 50 widgets are sold.

            The residence time is the amount of time a specific widget remains in stock. The e-time for this is 1.4 days. After 1.4 days only 37% of the original stock remains.

            The adjustment time is the amount of time it takes for the stock to deplete. The e-time for this is 37 days. It takes 37 days to reduce the total stock to 37% of the original amount.

            I fear I cannot make this any simpler so hopefully this is the epiphany moment.

          • Stephen Paul Anderson says:

            You keep trying to change Berry’s physics model. If e time is 1.4 years then balance level is 49(1.4) or 68. Outflow would be level divided by e time. In 1.4 years it will decay to 80. In 2.8 years it will decay to 72. In 4.2 years it will decay to 69.5.

          • Stephen Paul Anderson says:

            I’m sorry. Change that to days if you want.

          • Stephen Paul Anderson says:

            Have you read Harde’s paper? His residence time is clearly the same as Berry’s e time.

          • Stephen Paul Anderson says:

            In 5.6 years it will decay to 68.5, essentially at the balance level. About 4 times e time is your “adjustment time.” 16 years. If all anthropogenic emission stopped in 16 years CO2 level would be close to natural balance level.
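A minimal sketch of the first-order "balance level" relaxation being described in this sub-thread (outflow = level / e-time, with a fixed inflow), using the numbers quoted above; this is not Berry's or Harde's actual code:

```python
# Minimal sketch of the first-order "balance level" relaxation described
# in the comments above: outflow = level / e_time, with a fixed inflow.
# Numbers follow the widget discussion (inflow 49/day, e-time 1.4 days,
# starting level 100); not Berry's or Harde's actual code.
import math

inflow, e_time, level0 = 49.0, 1.4, 100.0
balance = inflow * e_time                      # ~68.6, where outflow equals inflow

def level(t):
    """Analytic solution of dL/dt = inflow - L / e_time."""
    return balance + (level0 - balance) * math.exp(-t / e_time)

for t in (1.4, 2.8, 4.2, 5.6):
    print(t, round(level(t), 1))               # relaxes toward ~68.6 within a few e-times
```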

          • bdgwx says:

            SPA,

            I’m not critiquing Berry’s model…yet. I’m just trying to help you understand what residence time and adjustment time are. I still don’t think you’re understanding.

            In my store example this is how the numbers lay out.

            day…total stock…original stock
            day 0…100…100
            day 1…99…50
            day 2…98…25
            day 3…97…12
            day 4…96…6
            day 5…95…3
            day 6…94…1.5
            day 7…93…0.7
            day 8…92…0.3
            day 9…91…0.1

            There is no balance level. That was intentional to make the example as easy as it can possibly get.

            Do you see what’s happening here?
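A small simulation of the store example above, assuming the 50 widgets sold each day are drawn at random from the shelf before the day's restock; it is only meant to show the two timescales side by side:

```python
# Small simulation of the store example above, assuming the 50 widgets
# sold each day are drawn at random from the shelf before the day's
# restock of 49. It tracks how fast the *original* widgets leave
# (residence) versus how fast the *total* stock runs down (adjustment).
import random

stock = [True] * 100                       # True marks one of the original 100 widgets
for day in range(1, 10):
    for _ in range(50):                    # 50 widgets sold, drawn at random
        stock.pop(random.randrange(len(stock)))
    stock += [False] * 49                  # 49 new widgets arrive
    print(day, len(stock), sum(stock))     # day, total stock, originals remaining
```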

      • Nate says:

        Dan,

        You appreciate constructive feedback, but you will never get it from the real experts, who will be able to tell you exactly what you’ve done right and wrong, if you don’t submit for publication. They will not seek out your blog.

        Are non experts simply supposed to take your word that you have gotten the details right and not ignored some established facts?

        • Dan Pangburn says:

          This would not post on 2/13/20.
          Nate,
          I had hoped that you were expert enough to provide constructive feedback.

          I did not do any experiments. As far as I can tell, none of the data that I used are a product of peer review. It is stuff off the web from NASA, Hitran/Spectracalc, reported by various agencies, or common knowledge. I compare from various sources where possible. Some stuff, like thermalization, is just educated common sense. The end product explains all measurable aspects of climate change since reasonably accurate measurements of temperature have become available, about 1895. It demonstrates that CO2 has nothing to do with climate and the main driver of climate change has been water vapor increase.

          • Nate says:

            “It demonstrates that CO2 has nothing to do with climate and the main driver of climate change has been water vapor increase.”

            No, no it doesn’t. It demonstrates only that you have convinced yourself, but not any other scientifically literate people.

            Many previous papers showed that CO2 DOES affect climate. And they take into account things that you don’t, like general circulation. You haven’t bothered to address their findings, discuss them, or show what they have done wrong.

            Your analysis is not quantitative, with no error analysis, as I have explained. Yet that is essential to make any claim in science.

            It is the height of hubris for you to claim that your analysis ‘demonstrates’ that all previous results are wrong. You haven’t demonstrated that.

            You are simply expressing a belief.

            Water vapor condenses at ordinary surface temperatures, while co2 does not. It has been shown that removing all the co2 results in almost all water vapor becoming liquid or solid. Ice-albedo feedbacks take over and we get a snowball Earth.

          • Nate says:

            Dan, the problem with your HiTran calculation is that you are only looking at ground level: “On average at ground level, according to the comparatively low populations used by Hitran, WV molecules outnumber CO2 molecules by …”

            This ignores the enhancement of the GHE by CO2 which takes place high in the troposphere. That is where the action is, where added CO2 raises the effective radiating level, and temperature, of co2 molecules radiating to space.

          • Tim Casson-Medhurst says:

            “It has been shown that removing all the co2 results in almost all water vapor becoming liquid or solid. Ice-albedo feedbacks take over and we get a snowball Earth.”

            Shown? In models whose control runs never vary much until CO2 is varied? Speaking of hubris…

          • Nate says:

            Correct, shown in general circulation models that incorporate established atmospheric physics. There is no other way to know, other than looking at the paleo record, where we do see Iceball Earths.

            We have observed all the model components in action. We see that changes in atm co2 produce changes in radiative forcing that agree with theory. We know from 1LOT that the large negative rad forcing from losing all co2 will result in cooling. We observe that cooler air holds less water vapor, which results in further neg radiative forcing. We have observed expected ice-albedo feedback.

            Would the model results be quantitatively accurate about temp and % ice cover? Prob not.

            But basic physics and observations assure us of the general pattern: that there will be a large reduction in temp and a large increase in ice cover.

          • Dan Pangburn says:

            Nate,
            “Many previous papers showed that CO2 DOES affect climate.” Can you name even one? I am unaware of any. The observation that, during the last and previous glaciations, CO2 change FOLLOWED temperature change, and several other observations listed in Section 2 of http://globalclimatedrivers2.blogspot.com, refute this.

            “What they have done wrong” is failing to consider the quantum mechanics determinations by Hitran, which show that a WV molecule absorbs about 1.37 times as much energy as a CO2 molecule and that water vapor molecules have been increasing about 7 times faster than CO2 molecules. They also apparently did not even look at the measured WV increase, let alone account for the WV increasing faster than possible from temperature increase.

            “You are simply expressing a belief.” No, I am showing calculations and the conclusion that the calculations lead to.

            “It has been shown that removing all the co2 results in almost all water vapor becoming liquid or solid.” Perhaps in opinion or a faulty model. Water, even as ice, has a vapor pressure which forces WV into the atmosphere. WV has been increasing faster

          • Dan Pangburn says:

            Nate,
            “Problem with your HiTran calculation is that you are only looking at ground level”: I spent a lot of time looking at Hitran output at many different altitudes. “That is where the action is”: Not so. Comparison shows absorb/emit intensity to be more than 100 times greater at ground level than at 10 km, so any enhancement of the GHE by CO2 high in the troposphere would be tiny. Perhaps more important is the output from Modtran, which corroborates thermalization and emission to space from WV molecules, especially high in the tropopause. Added cooling from more CO2 in the stratosphere (more GHG present to emit to space) is, I believe, correctly accounted for in the GCMs. The increased temperature at the poles is consistent with more CO2 because the low temperature there means very little WV.

          • TimTheToolMan says:

            “shown in general circulation models that incorporate established atmospheric physics. ”

            You’re mistaken Nate. The GCMs have no physics for clouds, they’re fitted. And when you incorporate a fit into “physics”, no matter how accurate you might think that simplified physics is, it results in a fit.

          • Nate says:

            “You’re mistaken Nate. The GCMs have no physics for clouds, they’re fitted. And when you incorporate a fit into physics, no matter how accurate you might think that simplified physics is, it results in a fit.”

            All physics models are an approximation of reality, that averages over or neglects factors deemed unimportant.

            The question is whether they are able to predict.

            Parameterization for clouds is done in weather models. Their prediction track record is very good.

          • Nate says:

            Dan,

            You’ve ignored this portion of my post:

            “where added CO2 raises the effective radiating level, and temperature, of co2 molecules radiating to space.”

            This is a key point of how the enhanced GHE works, that your HiTran analysis at the surface is missing.

          • Nate says:

            Dan, Here is a decent explanation of why your HiTran calculation is missing the main CO2 effect:

            http://clivebest.com/blog/?p=1169

          • TimTheToolMan says:

            “Parameterization for clouds is done in weather models. Their prediction track record is very good.”

            Climate models are fundamentally different to weather models, Nate. Weather models aren't trying to resolve the minuscule changes in climate from timestep to timestep. Climate models have to resolve those minuscule climate changes at every 15-minute timestep.

            Modelers have most likely led you to believe they may not be able to predict the climate in 10 years but they can do it in a hundred years. Is that your expectation?

          • Nate says:

            Toolman, I get that you don't think it can be done, but not the why.

            In some ways modeling global warming over decades is easier than weather modeling over a week, where chaos is dominant.

            Modeling global warming is analogous to predicting the ave temp vs time of a pot of water put on a stove burner. While predicting weather is analogous to predicting the rapid local temp fluctuations throughout the pot.

          • TimTheToolMan says:

            “In some ways modeling global warming over decades is easier than weather modeling over a week, where chaos is dominant.”

            That's what you've been led to believe. But what you probably don't get is that GCMs have to calculate the right forcings and feedbacks over those decades for every 15 minutes they step, and they can't.

            The tiny climate signal must be accurately calculated so it can propagate from step to step over millions of steps.

            Weather is a different problem because it doesn’t matter if their energy calculations all go wrong over a few days because that’s all they’re capable of.

          • bdgwx says:

            It might be good to point out that GCMs are not the only kind of climate model. There are also energy budget models and heuristic/statistical models based on observations. A model does not have to be a GCM to be useful. The interesting thing is that different models/approaches produce similar results, albeit with varying details. I just want to make sure people understand that climate science is not dependent upon GCMs. In fact, GCMs weren't even a thing when the first climate models were developed in the 1800s.

          • Nate says:

            “But what you probably don't get is that GCMs have to calculate the right forcings and feedbacks over those decades for every 15 minutes”

            Sure, it's standard for a computer simulation to use time steps much shorter than the times of interest, so relevant changes are small. Why is that a problem?

          • TimTheToolMan says:

            “Why is that a problem?”

            Because they’re not capable of doing it. They’re necessarily a fit because they’re not actually physics based. The “physics” parts are simplified and low resolution and they’re combined at every time step with parameterised (ie fitted) components, not the least of which are clouds.

            You’ve been led to believe they’re modeling climate but in fact they’re “modeling” an expectation of climate.

          • Nate says:

            “Because they're not capable of doing it. They're necessarily a fit because they're not actually physics based.”

            False assertion without evidence, Toolman. Where are you getting this info?

            The models are MOSTLY physics based, they solve Newton’s laws and hydrodynamic equations to model the movement of air masses, and use thermodynamic laws to calculate the movement of heat into and out of air masses and surfaces. They can successfully model the general circulation pattern of the atmosphere with just physics!

            “The ‘physics’ parts are simplified and low resolution and they're combined at every time step with parameterised (ie fitted) components, not the least of which are clouds.”

            Some ‘parameterization’ is needed to avoid wasting computer power on determining, needlessly, where every bloody cloud is.

            You really don't seem to understand how simulations work. They are solving differential equations numerically on small time and spatial steps.

            Small time steps are standard and good. The smaller the better, generally, because in nature time is continuous!

            This is how modern weather models have improved so dramatically from 50 y ago, by using smaller space and time steps.

          • TimTheToolMan says:

            “False assertion without evidence, Toolman. Where are you getting this info?”

            It's well known clouds aren't based in physics and are parameterised. If you haven't already researched it, you should.

            “The models are MOSTLY physics based”

            But they need to be completely physics based to resolve the tiny amount of CO2 induced forcing and the associated feedback changes. As it is, clouds are responsible for hundreds of watts of energy flow in the atmosphere; to put that in perspective, measured ocean warming is about half a watt.

            “Some ‘parameterization’ is needed to avoid wasting computer power on determining, needlessly, where every bloody cloud is.”

            There is no physics that describes clouds. Parameterisation isn’t a choice. The amount of energy the atmosphere is conveying to resolve the CO2 forcing certainly does require an accurate model of clouds.

            But they don't have it. The modelers would have you believe they're resolving the CO2 forcing, but it's utterly swamped by the fitted cloud component as well as other parameterised components. Clouds are the easiest to understand.

            You don't believe any of this, of course. In fact you probably believe the tuning to get the TOA imbalance into the required range is fine too, and not indicative of a fitted model.

          • bdgwx says:

            Tim,

            Are you suggesting that a model has to be 100% perfect to be useful?

            Do you have a model that handles clouds better than what we already have?

            How would you solve this problem?

          • Nate says:

            Toolman,

            “But they need to be completely physics based to resolve the tiny amount of CO2 induced forcing and the associated feedback changes.”

            This is a feeling, not a fact backed by evidence, Toolman.

            The idea that if a model uses parameterization as well as physics then it will be useless, is contradicted by the numerical weather models, which successfully model chaotic weather patterns out to a week or so.

            https://en.wikipedia.org/wiki/Numerical_weather_prediction

            “In addition, the partial differential equations used in the model need to be supplemented with parameterizations for solar radiation, moist processes (clouds and precipitation), heat exchange, soil, vegetation, surface water, and the effects of terrain.”

          • TimTheToolMan says:

            “Are you suggesting that a model has to be 100% perfect to be useful?”

            The standard statement about models is they’re always wrong but sometimes useful. And so the question is what are the GCMs useful for?

            Actually projecting climate 100 years out? No.

            Describing a world that assumes warming progresses according to the assumptions built into them? Sure. Maybe.

            But is that really useful when we know those assumptions are unlikely to be true? Parameterised clouds won't be the same in a warmer world, and everything falls apart from there.

            Nate naively claims “This is a feeling, not a fact backed by evidence” but there is plenty of evidence that the models aren’t modelling climate.

            They're running too hot.
            They don't resolve regional climate change.
            They have common features that are wrong, like the tropical hotspot.
            They can't model a hiatus like we had at the turn of the century – Santer tested this and essentially told us the models couldn't (at the 95% confidence level) have 17 years without warming in the face of the CO2 forcing, but nature did just that.
            Recently Roy found discrepancies between regional heating around the tropics.
            The list goes on…and on.

          • Nate says:

            “They're running too hot.”

            Apparently that is a myth.

            https://pubs.giss.nasa.gov/docs/tbp/inp_Hausfather_ha08910q.pdf

            “They don't resolve regional climate change.”

            Oh, such as?

            “They have common features that are wrong like the tropical hotspot.”

            Jury's out on that one; some analyses show it has one.

            “They can't model a hiatus like we had at the turn of the century”

            Red herring. They are not capable of capturing short-term climate noise such as this, because ENSO, solar activity, and volcanoes are unpredictable. They are intended to capture long-term trends.

          • Nate says:

            ‘Actually projecting climate 100 years out? No.’

            They can only project that far for a given emissions scenario, that may or may not be realized.

            In any case, these projections still have healthy error bars.

          • TimTheToolMan says:

            “They can only project that far for a given emissions scenario, that may or may not be realized.”

            It's not emissions they can't predict. You need to read closely. From Hausfather et al:

            “While some models showed too much warming and a few showed too little, most models examined showed warming consistent with observations, particularly when mismatches between projected and observationally informed estimates of forcing were taken into account. ”

            They don't blame unknown emissions because that wouldn't stand up to scrutiny. They blame incorrect forcings. And it's accurately calculated forcings (which is really the radiative forcing plus feedbacks) that the GCMs need to calculate to be able to project climate.

            In other words they’ve said the GCMs didn’t get future climate right because they didn’t get the climate change part of the calculations right.

            That’s no surprise to me.

          • Nate says:

            Tim,

            “They don't blame unknown emissions because that wouldn't stand up to scrutiny. They blame incorrect forcings.”

            That means adjusting model results for the actual GHG concentrations and resulting forcings, which ARE due to actual emissions, rather than the projected values, which were invariably off.

            Absolutely the sensible thing to do.

            You are reading between the lines in ways that match your pre-conceived notions.

          • TimTheToolMan says:

            “That means adjusting model results for the actual GHG concentrations and resulting forcings, which ARE due to actual emissions, rather than the projected values, which were invariably off.”

            Do you believe they acquired the old models, supplied the actual emissions and re-ran them to get the changed climate?

            No of course they didn’t. They’d most certainly have mentioned it if they had.

            Between “emissions” and “forcings” is the calculation and it involves the feedbacks.

            So no, it's not valid to change the forcing independent of the calculation and call that the model's output under a different emission scenario.

            You've only got to look at the projected warming to see large discrepancies, much larger than any mispredicted emissions would have been. To put that in perspective, the difference between say 400 and 420 ppm is only about 5%, but the models are out by way more than that.

            Their calculations are wrong.

            “You are reading between the lines in ways that match your pre-conceived notions.”

            How ironic. I’m reading and quoting their words. You’re the one reading between the lines to match your pre-conceived notions.

          • Nate says:

            “Between ’emissions’ and ‘forcings’ is the calculation and it involves the feedbacks”.

            Nonsense! You ARE reading between the lines.

            Forcing and feedback are two different things.

            Show me in the text where it says feedbacks are adjusted.

          • Nate says:

            “see large discrepancies.”

            Where? Their charts show little discrepancy.

          • Nate says:

            “So no, its not valid to change the forcing independent of the calculation and call that the models output under a different emission scenario.”

            All indications are that for small changes in forcing the temperature response will be linearly proportional to forcing. That is a very reasonable assumption.

            It is not logical to assume feedbacks will be active for the first 2/3 of the forcing but inactive for the last 1/3, as you seem to believe.

          • Nate says:

            Toolman, please ignore my "read between the lines" comments.

            My last comment is more relevant: a linear relation between forcing and temperature response is expected for small changes in forcing, and found in models. Thus it is reasonable for the paper to use this approach.

          • bdgwx says:

            Tim,

            The question being answered is whether model vs observation discrepancies are the result of bad model physics or bad model inputs.

            When you see terminology like 'forcing adjusted', understand that this means the model is run with actual inputs instead of scenario-based inputs. The inputs in this context are emissions of GHGs, human aerosols, volcanic aerosols, solar TSI, etc. The actual amount of warming those agents cause is still a product of the internal model physics. The reason they use this terminology is because these emissions 'force' the climate.

            What publications like Hausfather's (and his isn't the only one) show is that most of the discrepancy is the result of bad projections of human behavior and volcanic activity.

          • bdgwx says:

            Tim,

            For example, Hansen's 1988 model study was run with 3 different scenarios: A, B, and C. Scenario A was a worst case emission scenario. Scenario C was a complete emission cessation scenario. And scenario B was considered Hansen's best guess at human and volcanic behavior. In reality none of those 3 scenarios actually happened. Due to higher volcanic activity and emission reductions (like from the Montreal Protocol) the actual scenario that played out was between B and C. When Hansen's model was run with the real volcanic and emission scenario the modeled warming was almost spot on with observations. And that was a 30 year old model.

            What this tells us is that if we want to improve future predictions of warming we need to get better at predicting volcanic activity and human behavior.

          • bdgwx says:

            Tim,

            BTW…please don’t hear what I’m not saying. I’m not saying that model physics is perfect. It isn’t. And it never will be.

            One problem is the so called mid troposphere tropical hotspot discrepancy. Models tend to overestimate the warming in this region. Though much of the discrepancy appears only when comparing with UAH.

            Another problem is the underestimation of cryosphere declines. This is particularly evident in the Arctic region and with sea ice which we know has a high albedo warming potential feedback.

          • TimTheToolMan says:

            “BTW…please don’t hear what I’m not saying. I’m not saying that model physics is perfect. It isn’t. And it never will be.”

            But what I believe you're saying is that it's a climate calculation, but an imperfect one. What I'm saying is that it's not a climate calculation at all. Due to its parameterisation (ie entirely non-physics) it's a projection of warming based on a fitted expectation of warming, tuned from the past.

            The fact that some of the calculation is based on physics (albeit simplified and coarse) is irrelevant to how the climate signal in the model changes.

          • Nate says:

            Toolman,


            “Due to its parameterisation (ie entirely non-physics) it's a projection of warming based on a fitted expectation of warming, tuned from the past.

            The fact that some of the calculation is based on physics (albeit simplified and coarse) is irrelevant to how the climate signal in the model changes.”

            We have shown you examples of modeling that is very successful with just such parameterization, but you persist in declaring your belief that it just cannot work, with little to back it up.

            Sorry it is just unconvincing.

          • TimTheToolMan says:

            “We have shown you examples of modeling that is very successful with just such parameterization”

            You have? Not that I’ve seen.

            What is the best piece of evidence you’d consider shows that GCMs are modeling climate and not just a fitted projection of warming based on warming during its tuning period?

            And just to head you off at the pass, any answer involving successful weather prediction really does show you’re entirely missing the point and indeed the difference between a weather prediction and a climate prediction.

          • Nate says:

            “just a fitted projection of warming based on warming during its tuning period?”

            Toolman this is your from the hip claim. Can you back it up with any evidence, anything at all?

            The models are largely physics based as we showed you already. There is no need to solve massive amounts of hydrodynamic equations, apply heat transfer or thermodynamic laws, for the whole Earth, if all the models are doing is fitting a projection of the past into the future.

        • TimTheToolMan says:

          “Toolman this is your from the hip claim. Can you back it up with any evidence, anything at all?”

          It's self-evident that if you have a calculation that has a component that is physics based and add a non-physics based component to it, then the result is non-physics based.

          Wouldn’t you agree?

          • Nate says:

            No, because all simulations work that way.

            'Self evident' means you have nothing but a feeling.

          • bdgwx says:

            I’m still not sure what you mean when you say they aren’t physics based.

            A parameterization scheme is physics based. It is a model within a model that is used to simulate a real physical process. In fact, parameterizations schemes are often referred to as the physics component of the model.

          • TimTheToolMan says:

            “A parameterization scheme is physics based. It is a model within a model that is used to simulate a real physical process.”

            No. Just no. Simulation doesn’t make it physics.

            A parameterisation scheme is a calculation based on non-physics “rules of thumb” and is built from historical observations.

            Compare this to “precise” physics like the equation used to describe the force required to accelerate a mass: F = ma,
            which just works (until you reach relativistic speeds, anyway).

            Clouds are approximated and that makes the climate projection useless when the CO2 forcings are so tiny compared to how the clouds impact the SW radiation getting to the earth’s surface.

            https://eos.org/research-spotlights/a-super-solution-for-modeling-clouds

            “This uncertainty with respect to clouds is the main source of uncertainty in model-based projections of future global warming; more clouds in a future climate will dampen global warming while fewer clouds will amplify the warming. Furthermore, uncertainty in cloud representations also contributes to systematic errors in simulated precipitation patterns.”

            and

            “The model more accurately reproduced cloud top height measurements observed by the Moderate Resolution Imaging Spectroradiometer aboard the Terra satellite compared with the standard parameterized version of OpenIFS. The superparameterized model also showed improvements in representing specific humidity.”

            …all from past measurements. They’re a fit and they apply to what we’ve seen.

          • Nate says:

            Toolman, if you would stop with the black and white thinking and tossing out terms like 'useless', and instead say that cloud evolution in models adds some uncertainty to their projections, you would find some agreement from us.

          • Nate says:

            “A parameterisation scheme is a calculation based on non-physics ‘rules of thumb’ and is built from historical observations.”

            Sure, that's called empirical knowledge. Much of science is based on this, and it works. Empirically, summer will come in June every year. This enabled humans to plan their agriculture for millennia without any physics model.

            “Compare this to ‘precise’ physics like the equation used to describe the force required to accelerate a mass : F=ma.”

            Great, but often we don't yet have that level of understanding; we still have useful empirical facts.

          • Nate says:

            Hurricane path projections use physics models and parameterization. They do much better projecting 72 hours than 30 y ago. We learned by comparing to observations.

            No one can reasonably say these models are useless. They can reasonably say they have uncertainty.

          • TimTheToolMan says:

            “if you would stop with the black and white thinking and tossing out terms like 'useless' and instead say that cloud evolution in models adds some uncertainty to their projections you would find some agreement from us.”

            Useless is the correct word. The GCMs are trying to resolve forcings and feedbacks to an accurately modeled couple of Watts every time step over 100 years and clouds add more than a little uncertainty!

            Clouds cover about a third of the earth at any one time. A tiny change in them that parameterisation doesn’t account for is likely to be more than a couple of Watts and so makes the climate calculation useless.

            Compare that to your examples, where the projection is for a few days only and the models are mainly about modeling large scale systems. If they're out by a few watts then it won't matter over the timescales we're interested in (and the models are good for, anyway).

            Climate projection is an entirely different problem to weather projection.

          • bdgwx says:

            Tim,

            That last post is something we can at least partially agree with.

            The uncertainty that arises as a result of how clouds are modeled is definitely debatable. I personally think clouds account for a large portion of the uncertainty as well.

            Our point is that just because there is uncertainty does not mean that the model used is not based on physics. Cloud microphysics parameterizations are physics based. It is just that they are approximations of a physical process that is really hard to model. This difficulty is partly due to having an imperfect understanding of the process and the limitations with the computing resources required to run the model.

            One interesting line of discussion is with the cloud physics utilized by the CMIP6 suite of models. It is an improvement upon what was used in CMIP5. This more realistic handling of cloud physics is believed to be the dominant reason behind the higher equilibrium climate sensitivities in CMIP6 vs CMIP5.

          • Nate says:

            “Useless is the correct word.”

            Toolman has no interest in finding common ground.

            “The GCMs are trying to resolve forcings and feedbacks to an accurately modeled couple of Watts every time step over 100 years”

            And you wrongly assume that such noise sums over space and time to become enormous.

            That's not how it works, Toolman. You are stating opinions, not facts. It's as if we've had no discussion at all.

            Just not credible.

          • Nate says:

            “Compare that to your examples where the projection is for a few days only and the models are mainly about modeling large scale systems.

            Climate projection is an entirely different problem to weather projection.”

            Oh so you think hurricane path prediction 72 h or more out is a piece of cake?

            We couldn't do it 30 years ago. It requires massive computation and a very good understanding of how to parameterize many things, even clouds.

            We learned how to do it from observation. You act as if doing that for climate modeling is somehow wrong.

            Climate projection is much longer so you falsely assume it is much harder.

            No not necessarily, it IS very different, because there is no need to include the intermediate timescale chaotic dynamics from weather.

            It is modelling the AVERAGE behavior over time. Cloud fluctuations produce noise on that trend.

          • TimTheToolMan says:

            “And you wrongly assume that such noise sums over space and time to become enormous.”

            No. I rightly understand that climate change resulting from anthropogenic CO2 is as a result of a small forcing over a long period. And the earth’s response to that forcing.

            And so if the model produces, say, 4 watts by the end of the 100 years when reality was no more than 2 watts, the climate change realised in the model won't be anything like the actual climate change we experience.

            And the models could easily be out more than that after 100 years.

          • bdgwx says:

            Tim, Yes, that’s called uncertainty. The sensitivity in units of C per W/m^2 is even more uncertain. And remember that uncertainty is a double edged sword. It’s just as likely we are underestimating climate sensitivity as we are overestimating it. In fact, the upper bound is less certain because it is not being constrained with time like how the lower bound is.

          • TimTheToolMan says:

            “Tim, Yes, that's called uncertainty.”

            I expect we’ll have to agree to disagree.

            Uncertainty is something you have when you have a sound, but imperfect, calculation. Since the cloud component is unsound (ie fitted), it's not uncertainty. It's a projection based on the assumption that what we've seen in the past applies in the future.

            Now you might feel that's an OK assumption, but I don't.

          • bdgwx says:

            Tim,

            Do you know of a better way to handle clouds?

          • TimTheToolMan says:

            “Do you know of a better way to handle clouds?”

            No. I think climate projection is simply out of reach for us currently, and I don't see that changing in the near or even mid term.

            If the modelers had given the right message from the outset then things would have been different.

            Their message should have been: if warming proceeds at the current rate, then the climate models describe what the earth may look like in the future.

            But instead their message was that the climate models are projecting climate change for the earth and prove CO2 is the majority cause rather than natural variation, and that's basically a scientific lie.

            It may be true (or may not), but either way it's not science through the models.

          • bdgwx says:

            Tim,

            So you don't have a better alternative?

            How do you want scientists to research climate sensitivity, equilibrium climate response, and regional effects?

          • Nate says:

            “Since the cloud component is unsound (ie fitted), it's not uncertainty”

            More unsound declarations from Toolman. Why is something empirical (ie based on observation) unsound?

            Makes little sense.

          • TimTheToolMan says:

            “How do you want scientists to research climate sensitivity, equilibrium climate response, and regional effects?”

            The truth is they can't. Sensitivity is fundamentally based on assumptions about CO2 as the cause and about the earth's response. They can model to their hearts' content based on assumptions, but at the end of the day none of their projections have any meaning that requires urgency and panic, and they should be up front about it.

            But it's probably too late for that. Too many people are too emotionally and financially invested in impending disastrous climate change to be able to let go.

            Meanwhile societies need to keep working to protect against weather events like droughts, floods and storm surges, because with or without anthropogenic climate change that's the prudent thing to do.

          • TimTheToolMan says:

            “Why is something empirical (ie based on observation) unsound?”

            Here’s an analogy for you.

            If you tossed a coin 10 times and it came up heads 7 times you would project that 70% of the time heads will come up.

            You'll rail against that because you "know" the truth, so let me rephrase it.

            If you nubbed the fangle 10 times and mish came up 7 times then you'd project mish will come up 70% of the time.

            The point is we don't know what the outcome should be, and past performance is no predictor of future performance.

          • Nate says:

            “If you tossed a coin 10 times and it came up heads 7 times you would project that 70% of the time heads will come up.”

            You seem determined to turn highly sophisticated computer modeling into something it is clearly not, a coin toss, because that better fits your beliefs.

            No point discussing further

  20. Scipio says:

    Here is the disgusting man that Roy praises:

    "So you're faced with a dyed-in-the-wool socialist who's not even a Democrat," the conservative talker declared. "A gay guy, 37 years old, loves kissing his husband on debate stages. Can you see Trump have fun with that?"

    https://www.thedailybeast.com/rush-limbaugh-claims-mr-man-trump-will-have-fun-with-gay-guy-buttigieg-kissing-his-husband?ref=home

    • Sisyphus says:

      “here is the disgusting man that Roy praises”

      Limbaugh is no prize, but some of the things he decries are worthy of mature consideration. Vice has turned 180 degrees to virtue in the public square; and that is a clean fact.

      We now are even led to associate people of color with vice to advance pernicious agendas which have torn the very fabric of society to the point of destruction.

      So, like him or not, at least he has the fortitude to put forth some objective matters of moral decency for reflection on the public stage.

      “We will remember in November…”

      • Stephen Paul Anderson says:

        Scipio,

        No matter how enlightened you think everyone should be, many Americans, including traditional Democrats, are going to have trouble watching Buttigieg kissing his husband on stage, just a fact. I would not vote for a gay man for President, even a conservative; it is abnormal.

      • Nate says:

        What a surprise that Stephen, so concerned about infringing on individual liberty, can't abide people loving whoever they choose.

        “It is abnormal”

        Sure just as black-white couples, female doctors, etc used to be abnormal.

        We adapt or die off, your choice.

        • Stephen Paul Anderson says:

          So, it’s a choice?

        • Nate says:

          No Stephen, it's pretty clear what I meant.

          Intolerance is a choice. Letting it affect who you hire is discriminatory, and not recommended.

          Homophobia is a choice, and sometimes indicative of repressed homosexual tendencies… You maybe?

    • Nate says:

      “So, like him or not, at least he has the fortitude to put forth some objective matters of moral decency for reflection on the public stage.”

      Puleez, when it comes to Trump’s prolific failures of moral decency where is his fortitude? Where is the fortitude from evangelical Christians?

  21. Eben says:

    For CO2 in atmosphere watch Prof Salby
    https://youtu.be/sGZqWMEpyUM

  22. ren says:

    If not for the Earth’s strong magnetic field (as of today), temperature changes on Earth would be extremely susceptible to changes in space weather. The decrease in the strength of the geomagnetic field in the Western Hemisphere clearly confirms this.

    • ren says:

      In June 2014, after just six months collecting data, Swarm confirmed the general trend of the field’s weakening, with the most dramatic declines over the Western Hemisphere. But in other areas, such as the southern Indian Ocean, the magnetic field had strengthened since January. The measurements also confirmed the movement of magnetic North towards Siberia. These changes are based on the magnetic signals stemming from Earth’s core.
      http://www.esa.int/ESA_Multimedia/Images/2014/06/Magnetic_field_changes

  23. m d mill says:

    Thanks Dr. Spencer for your work. I will continue to follow Occam's Razor and prefer your simple model as long as it holds true. So far so good (actually great). I will be highly surprised if it starts to fail significantly within the next 50 years.

  24. RG says:

    I tried to read all the comments, but… too many…
    Just a question: why does nobody even mention the role of calcareous plankton in this CO2 "war"?
    I mean, as far as I know CaCO3 shell formation is a process that throws CO2 into the water and thus into the atmosphere. Now, it seems the rate of calcification is decreasing (foram and nanno should do that), which means less CO2 to the ocean and more uptake from the atmosphere. Is that right?

  25. Dan Pangburn says:

    Apparently there is a way to post quotation marks here. How is that done?

  26. Entropic man says:

    RG

    More detail.

    When CO2 dissolves in water it reacts to form carbonic acid.

    CO2 + H2O ⇌ H2CO3

    Carbonic acid breaks down to bicarbonate and hydrogen.

    H2CO3 ⇌ HCO3- + H+

    Bicarbonate becomes carbonate and hydrogen.

    HCO3- ⇌ CO3-- + H+

    Forams and shellfish make their shells from carbonate.

    Ca++ + CO3-- ⇌ CaCO3

    First thought. When the atmosphere has more CO2, more CO2 dissolves in water, more carbonate forms and shellfish make more calcium carbonate. CO2 is permanently removed from the system!

    In practice it goes the other way.

    These are equilibrium reactions. When extra CO2 dissolves, the extra H+ released reduces the pH. This shifts the balance from carbonate back towards bicarbonate and carbonic acid.
    This limits further CO2 absorption, so a lower pH ocean is a less effective carbon sink.

    A lower pH ocean has less carbonate, so forams etc make less calcium carbonate. The lower pH also damages their metabolisms. In the end extra CO2 makes the ocean less favourable for life.
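
    A minimal numerical sketch of that equilibrium shift (my own illustration, using approximate freshwater dissociation constants at 25 C; seawater values differ):

    # Carbonate speciation fractions vs pH -- a rough illustration of why a
    # lower pH shifts the balance away from carbonate (CO3--) and back toward
    # bicarbonate and dissolved CO2. Constants are approximate freshwater
    # values at 25 C, not seawater values.
    K1 = 10 ** -6.35    # H2CO3 <-> HCO3- + H+
    K2 = 10 ** -10.33   # HCO3- <-> CO3-- + H+

    for pH in (7.5, 8.1, 8.5):
        H = 10 ** -pH
        denom = H * H + K1 * H + K1 * K2
        f_co2 = H * H / denom        # dissolved CO2 / carbonic acid
        f_hco3 = K1 * H / denom      # bicarbonate
        f_co3 = K1 * K2 / denom      # carbonate
        print(f"pH {pH}: CO2 {f_co2:.3f}  HCO3- {f_hco3:.3f}  CO3-- {f_co3:.3f}")

    Running it shows the carbonate fraction dropping steadily as pH falls, which is the point being made above.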

    • RG says:

      Actually, I found that the reaction involving calcite shell formation is usually depicted as:
      Ca2+ + 2HCO3- ⇌ CaCO3 + CO2 + H2O
      So the more acidic the water, the less calcification, which means less CO2 released into the upper mixed layer (UML) and a higher chance that CO2 moves from the atmosphere into the ocean.
      I read a few papers about forams and coccolithophores; foram shells seem to be thinner than before and coccos should also go in the same direction. This would increase the biological pump relative to the carbonate pump. This should be a negative feedback counteracting the amount of CO2 in the atmosphere.

  27. Rob Mitchell says:

    According to Multisensor Analyzed Sea Ice Extent (MASIE) data, the 2019 minimum Central Arctic region sea ice extent was 2,950,024.55 km^2 in September (Day 258).

    Question – will 274 K be enough to melt the Central Arctic Region ice below 1,000,000 km^2 this century? I say no. The reason is that the Arctic warming during this multi-decadal modern warming period has occurred in the NH winter, not in the summer. To melt the Arctic ice away, there has to be warming in the summer. That isn't happening.

    All of this concern about CO2 in the atmosphere is mostly hysterical. And our news media has done a disservice to the public by fanning the flames of this hysteria. I think Dr. Spencer’s comments about CO2 on his Global Warming: Natural or Manmade page are valid.

    What say you human-caused global warming alarmists?

  28. TimTheToolMan says:

    One would prefer to choose a doctor who kept up with the latest in medicine, even in fields one wouldn't expect a GP to need day to day, over a GP who didn't, wouldn't you think?

  29. Rob Mitchell says:

    Let me try that again. Hopefully this will take you right to the Central Arctic MASIE diagram.

    https://tinyurl.com/v6lkj2b

    • Rob Mitchell says:

      See those brief little dips in 2007, 2012, 2013, 2016, 2018? I seriously doubt that will be enough to make a dent in the Central Arctic region.

      • Svante says:

        Gee, the ice never reached its full extent in the winter of 2017/2018. In the central arctic!!!
        That’s alarming.

        • Rob Mitchell says:

          Anything alarming about the Northern Hemisphere sea ice area diagram Svante? Since 2005, the sea ice area has been quite stable.

          • bdgwx says:

            For Arctic sea ice area (in km^2)…

            – The trend of the annual mean is -0.23e6/decade and -0.22e6/decade since 1979 and 2005 respectively.

            – The trend of the September mean is -0.48e6/decade and -0.32e6/decade since 1979 and 2005 respectively.

            – The trend of the March mean is -0.04e6/decade and -0.16e6/decade since 1979 and 2005 respectively.

            For Arctic sea ice extent (in km^2)…

            – The trend of the annual mean is -0.55e6/decade and -0.42e6/decade since 1979 and 2005 respectively.

            – The trend of the September mean is -0.82e6/decade and -0.47e6/decade since 1979 and 2005 respectively.

            – The trend of the March mean is -0.42e6/decade and -0.30e6/decade since 1979 and 2005 respectively.
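
            (For anyone who wants to reproduce this kind of number, here is a minimal sketch of the calculation — my own illustration, not the code or data actually used above; the array below is a made-up placeholder, not real sea ice data.)

            import numpy as np

            # Least-squares linear trend of an annual-mean series, reported per decade.
            years = np.arange(1979, 2020)
            area_km2 = np.random.default_rng(0).normal(10.0e6, 0.3e6, years.size)  # placeholder values

            slope_per_year, intercept = np.polyfit(years, area_km2, 1)
            print(f"trend: {slope_per_year * 10:.2e} km^2 per decade")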

          • Svante says:

            bdgwx, he cherry picked the central arctic.

            You would think it should be the last bastion to fall, but his diagram shows it never reached its usual winter maximum 2017/2018.

          • bdgwx says:

            Oh my.

            Rob, that is the equivalent of drawing a conclusion regarding how fast an ice cube melts by observing only the center most portion of it. Of course it’ll appear stable…until it too starts to melt that is. See the problem?

      • Entropic man says:

        For physical reasons the high Arctic stays close to 273K even in Summer. Turning ice at 273K to water at 273K takes the same energy as warming that water by eighty degrees, so all the available energy goes into ice melt rather than raising the temperature.

        The sea ice is shrinking from the edges towards the centre.

        The central Arctic is not losing extent, but look at the graphs for the Bering Sea, the Baltic Sea and the Cook inlet in your link. All are showing reduction over time.

        Because of an old troll this site does not tolerate the letter D and the letter C next to each other. Hence the stars.
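
        A quick back-of-the-envelope check of that "eighty degrees" comparison, using round textbook values:

        # Latent heat of fusion vs. sensible heating of liquid water.
        latent_heat_fusion = 334.0   # kJ/kg to melt ice at 0 C
        c_water = 4.19               # kJ/(kg K) for liquid water
        print(latent_heat_fusion / c_water)   # ~80 K of warming per kg -- hence "eighty degrees"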

        • Gordon Robertson says:

          entropic…”The central Arctic is not losing extent, but look at the graphs for the Bering Sea, the Baltic Sea and the Cook inlet in your link. All are showing reduction over time”.

          During one month of summer. The rest of the year it’s business as usual.

          • Svante says:

            Yeah, and tell us about that book you read, you know that intrepid north pole explorer who reported solid ice all the way.

  30. Dan Pangburn says:

    Nate,
    Up-thread you asserted that I ignored a portion of your post relating to radiation above the tropopause. Apparently you have not looked very closely at Section 3 of (click my name). More CO2 above the tropopause enhances the cooling there. That might be the only place where I agree with the GCMs. The added cooling from increased CO2 above the tropopause counters the added warming from added CO2 at ground level.

    Apparently Clive is unaware of the 1.47% per decade increase in water vapor. He does not mention this or thermalization. Also he couldn't remember where he got the 25 m mean free path of photons from and did not refute a post of 1.5 m. I do not think it matters much in that range, but I do think the combination of absorbers, CO2 and WV, should be considered together. The important understanding is that the absorbed radiation energy is shared with surrounding molecules, i.e. thermalization.

    • Nate says:

      “Up-thread you asserted that I ignored a portion of your post relating to radiation above the tropopause.”

      Nope, never said that.

      It is all about the emission elevation WITHIN the troposphere of the CO2 and the increase in that elevation with increasing CO2 level.

      You still have not addressed this point.

      Here is another post by Clive Best that may be useful for you to look at and respond to.

      http://clivebest.com/blog/?p=4597

      He notes that:

      “Instead of calculating radiative transfer from the surface up through the atmosphere to space, exactly the opposite is done. IR photons originating from space are tracked downwards to Earth in order to derive for each wavelength the height at which more than half of them get absorbed within a 100 meter path length. This identifies the height where the atmosphere becomes opaque at a given wavelength. This also coincides with the effective emission height for photons to escape from the atmosphere to space….”

      And

      ” The calculation can then show how changes in CO2 concentrations affect the emission height and thereby reduce net outgoing radiation(OLR). The net reduction in OLR is found to be in agreement with far more complex radiative transfer models. This demonstrates how the greenhouse effect on Earth is determined by greenhouse gases in the upper atmosphere and not at the surface.”
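
      A minimal sketch of that downward-tracking procedure (my own toy version with made-up numbers, not Clive Best's code or HITRAN output):

      import math

      def emission_height(sigma_abs, n_surface, scale_height_m=8000.0,
                          z_top_m=50_000.0, dz_m=100.0):
          # Step down from the top of the atmosphere and return the first altitude (m)
          # where more than half of the incoming photons would be absorbed within a
          # 100 m path, assuming an exponential absorber density profile.
          z = z_top_m
          while z >= 0.0:
              n = n_surface * math.exp(-z / scale_height_m)   # molecules per m^3 at altitude z
              absorbed = 1.0 - math.exp(-sigma_abs * n * dz_m)
              if absorbed > 0.5:
                  return z
              z -= dz_m
          return 0.0   # never becomes opaque at this wavelength

      # Illustrative values only: n_surface roughly CO2 at ~400 ppm; the cross-section is made up.
      print(emission_height(sigma_abs=3e-24, n_surface=1e22))

      With a larger cross-section (a stronger line) the returned height rises, which is the emission-height shift being described.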

      • Dan Pangburn says:

        Nate,
        OK, that is how I interpreted what you did say, which was: "where added CO2 raises the effective radiating level, and temperature, of CO2 molecules radiating to space."
        My point is that I did not ignore high altitude radiation where more CO2 makes it cooler. This cooling counters the tiny part of the warming attributed to added CO2 at ground level. Many other observations, as listed in Section 2 of (click my name), indicate that the net effect of CO2 increase is negligible.

        Comparison of Hitran runs at ground level vs at 10 km (tropopause) shows absorb/emit intensity to be about 300 times higher at ground level. Thus the slowing of the energy flux is much greater at lower altitudes where WV dominates. That is consistent with WV increase being more important to GW than CO2 increase. The hash in wavenumber range 200 to 600 in TOA graphs of flux vs wavenumber shows that a lot of outward directed radiation from WV makes it all the way to space.

        • Nate says:

          ‘My point is that I did not ignore high altitude radiation where more CO2 makes it cooler. This cooling counters the tiny part of the warming attributed to added CO2 at ground level.”

          No Dan. No, it doesn't make it cooler. That is a declarative statement lacking evidence.

          Clive Best is explaining why, and you have not shown why he and all the others are wrong.

          With added CO2 the effective radiating level rises in the troposphere, and the outgoing radiation, OLR, is reduced, as you can see in his HITRAN-generated plots.

          • Dan Pangburn says:

            Nate,
            OK, I think I understand what Clive did and how he interpreted what he found. He reversed the direction of the flux and discovered that more ghg molecules absorbed the radiation sooner. Sooner means higher altitude when the flux is coming in. Common sense says that is true qualitatively, but HITRAN and the Standard Atmosphere put numbers on it.

            At this point it appears that Clive assumed that higher altitude means colder and/or that higher altitude means fewer CO2 molecules because of lower pressure. Higher means colder in the Standard Atmosphere (below the tropopause), but the Std Atm no longer applies. Lower pressure, yes, but the atmosphere is very thin there compared to ground level. This situation is ripe for confirmation bias to rear its ugly head: finding what looks like an explanation for a foregone conclusion that CO2 increase causes warming. Because of the low pressure, there are few molecules, so the elevation change is a tiny change in IR active molecules compared to ground level.

            I am describing a fundamentally different understanding of the cause of the human contribution to warming. It should be obvious (existence of the GHE) that molecule relaxation time must be greater than zero. The fact that air containing a ghg like water vapor gets warmer when IR radiated demonstrates that the relaxation time is much longer than the time between molecule contacts. Measurements of relaxation time demonstrate several orders of magnitude longer. While the absorbed energy is residing in a molecule and shared with surrounding molecules they are warmer. More ghg molecules means more cumulative relaxation time, more warming. It is the increase of ghg molecules, mostly WV, in the bulk of the atmosphere that contributes to GW.

          • Nate says:

            Dan, thanks for taking other people's ideas seriously.

            ” I am describing a fundamentally different understanding of the cause of the human contribution to warming.”

            Perhaps, but that should not give you free license to ignore or reject all previous understanding, which has been developed by many smart people over many decades.

            You must show why that previous understanding of the CO2 GHE and its magnitude is incorrect.

            You haven't done that.

            Your analysis of water vapor is a wholly separate issue.

          • Nate says:

            “It is the increase of ghg molecules, mostly WV, in the bulk of the atmosphere that contributes to GW.”

            That is an assertion, but not consistent with prior results. The bulk atmosphere is already largely opaque in the relevant spectral peaks of CO2. Yet in the upper troposphere it is not. This is where the added CO2 has its greatest effect.

        • Nate says:

          ‘Many previous papers showed that CO2 DOES affect climate.’

          “Can you name even one? I am unaware of any.”

            I should have said that many previous papers have calculated added CO2's effect on the GHE and find a different result than you do.

            See, e.g., the discussion surrounding Eq. 3 here:

          https://pubs.giss.nasa.gov/abs/ha04600x.html

          • Dan Pangburn says:

            You know very well that correlation does not prove causation.

          • Nate says:

            Non sequitur.

            The paper discusses and quantifies the mechanism for the CO2 GHE. It is the very same one discussed by Clive Best. It is describing action in the upper atmosphere and involving the lapse rate. Your quantification of only the surface CO2 effect is insufficient.

            It also accounts for the GHE on 3 planets, and correctly predicts the warming and its spatial pattern over the next 40 years.

          • Dan Pangburn says:

            Nate,
            Not a non sequitur at all.
            "…is consistent with…" means correlates with, and besides, the model had been trained to do that. The rest is hand waving and weasel words: possibly, appear, should, potential.

            The pathetic ignorance of Hansen and co-authors 39 years ago is exhibited with this statement: "Carbon dioxide absorbs in the atmospheric 'window' from 7 to 14 micrometers."

            If 1981 T was compared with 1941 T there was no T change. Sea level has been rising about 20 cm per century for millennia and the Northwest Passage never opened.

          • Dan Pangburn says:

            Nate,
            I have no intention of trying to directly refute blaming CO2 increase for GW. Mother Nature is doing that, with average GCM predictions (except the Russians') being about twice what is measured. (I get a lot of Russian visits to my blogs but I would be surprised if there is a connection.)

            One problem in separating CO2 effects from WV effects by means of warming performance is that they have been increasing pretty much in parallel. The HITRAN assessment separates them by showing that a WV molecule is 1.37 times more effective at absorb/emit than a CO2 molecule. At ground level, WV molecules have increased 441/59 = 7.47 times faster than CO2 molecules. Again according to HITRAN, absorb/emit intensity for WV declines to about the same as CO2 at the tropopause, but the intensity level there is about 1/300 of what WV is at ground level. That is the basis for the GHE caused by WV in the bulk of the atmosphere.

            The clincher is that WV has been increasing faster than possible from temperature increase.

            Saying that the atmosphere in a wavenumber range is opaque to CO2 is misleading. Opaque is a term that applies to light shining into dusty air, etc., but here it applies only to the photons emitted by the surface to ghg molecules. The flux is maintained (except for the energy redirected to WV molecules and radiated to space) by emission from ghg molecules below.

            (WV is imposed in various unnatural ways in the different models in Hansen's 1981 paper, but then WV had not yet been measured accurately worldwide.) Also, the paper presents the results of models. There is no way to show that the models are not faulty even if they apply the laws of physics correctly. To predict, they would need future TPW maps.

            A reason why my predictions in (click my name) are fuzzy is that the AMO (a major contributor to SST cycles) is not declining yet. Also, WV is still on the rise. I have no way of predicting when these will change.

            The importance of WV results from an understanding and application of thermalization: redirection of part of the energy absorbed by CO2 to WV and radiation of that energy from WV to space (which explains what happened to the energy missing from the notches in TOA graphs of flux vs wavenumber), and radiation from WV to space which explains the hash in the wavenumber range 200-600. The cooling from increased CO2 above the tropopause counters the warming from increased CO2 at ground level.

          • Nate says:

            Dan,

            You are incorrigible. No matter what I show you or explain, you return to the same talking points.

            I'm specifically addressing your CALCULATED CO2 GHE, NOT the realized GW.

            Your analysis of the magnitude of the CO2 GHE IGNORES the main effect in the upper troposphere that has been understood for 40 years!

            Therefore you keep repeating the FALSE claim that CO2 is 10x less effective.

            It is only 10 x less effective when you ignore its main effect, and simply assume that this main effect is absent!

            You need to be more self skeptical.

          • Nate says:

            “The pathetic ignorance of Hansen and co-authors 39 years ago is exhibited with this statement: 'Carbon dioxide absorbs in the atmospheric window from 7 to 14 micrometers'”

            Not seeing anything like that in the paper. They say they include 'weak' lines in 8-12 microns, but that 90% is absorbed at 15 microns.

            “If 1981 T was compared with 1941 T there was no T change. Sea level has been rising about 20 cm per century for millennia and the Northwest Passage never opened.”

            NONSENSE! 20 cm/century would put sea level 4 m lower in the Roman period. We KNOW from archaeology that that is preposterous!

          • Nate says:

            “Potential effects on climate in the 21st century include the creation of drought-prone regions in North America and central Asia as part of a shifting of climatic zones, erosion of the West Antarctic ice sheet with a consequent worldwide rise in sea level, and opening of the fabled Northwest Passage.”

            All darn close to what is happening. The NWP has opened in late summer several times, and we are just at the beginning of the 20th century. The W Antarctic ice sheet IS eroding and becoming unstable.

            Reread the paper.

          • Nate says:

            Arrgh. 21st century.

  31. Snape says:

    @Entropic Man

    [For physical reasons the high Arctic stays close to 273K even in Summer. Turning ice at 273K to water at 273K takes the same energy as warming that water by eighty degrees, so all the available energy goes into ice melt rather than raising the temperature.]

    For my own benefit, I try to put ideas like this into a more familiar, easy to understand context:

    – Shine a heat lamp onto a very cold slab of concrete. There is nothing to prevent the concrete from warming to a temperature above 32 F.

    – Shine a heat lamp onto a slab of ice. The ice heats up until it reaches 32 F, at which point it starts to melt instead of continuing to get warmer.

    – Air is mostly heated by the ground, and is mostly transparent to sunlight. Therefore the air temperature will not rise above 32 F if the ground temperature does not rise above 32 F.

    ******

    Is this an accurate way of looking at the situation?

  32. Gordon Robertson says:

    Good for you, Roy.

  33. Snape says:

    @Nate

    [The calculation can then show how changes in CO2 concentrations affect the emission height and thereby reduce net outgoing radiation(OLR). The net reduction in OLR is found to be in agreement with far more complex radiative transfer models. This demonstrates how the greenhouse effect on Earth is determined by greenhouse gases in the upper atmosphere and not at the surface.]

    I don't like that idea, at least in principle. Is it supported by climate model simulations?

  34. Snape says:

    Hi Nate

    It has become a standard description only the last few years, right? In any case I need to read up on it some more. Get back to you later.

  35. Snape says:

    Thanks Nate, I stand corrected.

  36. Snape says:

    Nate,
    This is the popular idea that for SEVERAL reasons has been driving me nuts. Maybe you could help?

    https://youtu.be/4PAbm1u1IVg

    For starters, he says the emission level to space needs to be about 255 K, the temperature needed to produce an upwelling flux of 240 w/m2, equal to the absorbed downwelling flux from the sun.

    Obviously, a thin layer of atmosphere at that temperature won't be enough to do the job, so he is really talking about a layer that is perhaps a kilometer or more thick, with the cumulative emissions adding up to 240 W/m2.
    Fine, I have no problem with that.

    What bugs me is why he thinks this layer would only radiate towards space, and not also towards the surface (for a total emission of 480 W/m2)??

    After all, does the green plate only radiate from one side?
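
    (For what it's worth, a one-line check of that 255 K figure:)

    # Blackbody temperature whose emission balances ~240 W/m^2 of absorbed sunlight.
    sigma = 5.67e-8                      # Stefan-Boltzmann constant, W/(m^2 K^4)
    print((240.0 / sigma) ** 0.25)       # ~255 K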

    • Nate says:

      Yeah, I think it is an ~opaque layer that radiates from top and bottom 'surfaces'. Like the green plate, the NET flux from below would be an INPUT to the layer, while the upward NET flux is OUTPUT and ~ equal to the cumulative emission of 240 W/m^2.

      • Svante says:

        If you look at radiation layer by layer, you get a pattern like this (unrealistic example) to send out 80 W/m^2 through four layers, each layer sending the same amount in all directions.

        Layer   Down    In    Up
          0        0   400   400
          1      320   640   320
          2      240   480   240
          3      160   320   160
          4       80   160    80

        It’s different for different frequencies so you have to put together an average spectrum from small bands.

        For Earth, the average number of layers is less than one.
        IIRC the average emission altitude is something like 6500 m, and the enhanced GHE raises it by something like 200 m.

        It’s also worth noting that radiation is a small player inside the troposphere because it has convection, and there is no GHE without the lapse rate that it is largely responsible for.

        More knowledgeable people such as Nate and barry can correct me.
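
        Here is a tiny script that reproduces the table above, assuming each opaque layer emits equally up and down and, in steady state, emits exactly what it absorbs (so the surface must emit (N+1) x 80 = 400 W/m^2):

        # Idealized N-layer greenhouse example: 80 W/m^2 escapes to space through 4 layers.
        N = 4          # number of fully opaque layers (layer 0 is the surface)
        olr = 80.0     # W/m^2 leaving the top layer to space

        for layer in range(N + 1):
            up = (N + 1 - layer) * olr        # upward emission from this layer
            down = up if layer > 0 else 0.0   # layers emit equally downward; the surface does not
            print(f"Layer {layer}: down {down:5.0f}  in {up + down:5.0f}  up {up:5.0f}")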

  37. TimTheToolMan says:

    “You are reading between the lines in ways that match your pre-conceived notions.”

    That’s ironic. I’m reading the carefully constructed words. You’re reading between the lines and it’s what they want you to do.

  38. Snape says:

    Nate
    Huh? The green plate radiates equally from both sides, just like every layer of atmosphere has an upward and downward flux (upwelling/downwelling).

    Andrew has calculated a temperature based only on the upwelling.

  39. Snape says:

    @Nate, Svante

    Thank you both. The example was very helpful.

    ******

    [Emits equally, but NET flows not equal up down.]

    This confuses me WRT the Stefan-Boltzmann law. Does the law assume the net flow of conduction is zero?

    Start with the Earth’s surface at 0 K, and give it a radiative input of 400 W/m^2. As the surface warms, it begins to emit upwards, but it also begins to conduct downwards.

    As the SUBsurface warms, the net downward flow of conduction would decrease, but why would it ever equal zero?

    My problem, then, is that if the conductive flow varies, this would cause the surface temperature to vary as well, and the surface temperature would not be predictable based solely on the radiative input.

  40. Svante says:

    This sort of thing, 400 W/m^2 up from the surface, 80 W/m^2 net through four layers:
    Layer   Down    In    Up
      0        0   400   400
      1      320   640   320
      2      240   480   240
      3      160   320   160
      4       80   160    80

  41. Svante says:

    You’re welcome.

    The Stefan-Boltzmann law does not apply to conduction; conduction depends linearly on temperature difference, conductivity and distance. Those factors make it tiny compared to solar input: thousands of degrees of difference between the core and the surface add up to something like 0.018 W/m^2.

    Starting at zero it would take a very long time, but once the centre has caught up the net flow would be zero, and either way you would hardly notice the difference.

    Kristian convinced me that radiation occurs inside solids, which made me wonder if part of conduction was radiation. Since conduction has no T^4 term the answer is no. My best guess is that there are so many “layers” in solids that net radiation is negligible.
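
    A back-of-envelope Fourier’s-law estimate (all numbers assumed, just to show the order of magnitude; the exact figure depends heavily on the conductivity profile) makes the same point about how small conduction from the interior is compared with sunlight:

      # Fourier's law: q = k * dT / dx, with round, assumed values.
      k_rock = 3.0       # W/(m K), assumed bulk conductivity of rock
      delta_t = 4000.0   # K, assumed core-to-surface temperature difference
      depth = 2.9e6      # m, assumed conduction path length

      q = k_rock * delta_t / depth
      print(f"{q:.4f} W/m^2")   # ~0.004 W/m^2, vastly smaller than ~240 W/m^2 of absorbed sunlight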

  42. Snape says:

    Svante, Nate
    I am still confused, but maybe that is because Eli misapplied the Stefan-Boltzmann Law (SBL) when he explained how the blue plate acquired its initial temperature?

    A) Imagine a 400 W/m^2 flux from the sun, shining on a slab of dark concrete. Assume the concrete is a perfect black body.

    B) If the slab is sitting atop a layer of molten lava, you could not find the temperature of the slab using the SBL. The upwelling conductive flow from the lava would mess up the result, right?

    C) Similarly, if the slab was sitting atop a thick layer of liquid nitrogen, a downwelling conductive flow would mess up the result, right?

    *****

    Conduction is just one of the three means of heat transfer, and can therefore be expressed in terms of W/m^2, just like the others.

    With this in mind, what if in example C we substituted radiation for conduction, with the same downwelling flux of energy?

    Why would the SBL temperature for the slab of concrete suddenly be accurate?

  43. Snape says:

    Mind you, I am thinking out loud. Already starting to answer the questions I just asked.

  44. Snape says:

    Nate, Svante

    [Using the Stefan-Boltzmann law you can calculate the temperature of the plate when it reaches equilibrium: 400 W/m^2 = 2 σ T_eq^4, where σ is the Stefan-Boltzmann constant, 5.67 × 10^-8 W/(m^2 K^4), and the factor of 2 is for a two-sided plate per m^2. Run the numbers: T_eq = 244 K.]

    – Eli

    *********

    Factor of 2 for the two-sided layers in the atmosphere, but not for the surface, which is a one-sided layer.

    Sound about right?
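
    A two-line check of that factor of 2 (my own sketch of the quoted numbers):

      # Equilibrium temperature for 400 W/m^2 absorbed: two-sided plate vs one-sided surface.
      SIGMA = 5.67e-8
      FLUX = 400.0

      t_plate = (FLUX / (2 * SIGMA)) ** 0.25   # radiates from both faces
      t_surface = (FLUX / SIGMA) ** 0.25       # radiates from one face only
      print(round(t_plate), round(t_surface))  # ~244 K vs ~290 K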

  45. Snape says:

    @Svante
    [It’s also worth noting that radiation is a small player inside the troposphere, because convection dominates there, and there is no GHE without the lapse rate that convection is largely responsible for.]

    Here are the numbers for convection versus radiation near the surface. The radiative portion obviously increases with altitude.

    Net upward flux, convection: ~105 W/m^2
    Net upward flux, radiation: ~58 W/m^2

    https://en.wikipedia.org/wiki/Earth%27s_energy_budget

    ******

    Also worth noting: when people describe the lower troposphere as being mostly saturated or opaque, this is from the viewpoint of a satellite. It is not true when viewed from within the troposphere.

  46. Snape says:

    @Svante

    [Thousands of degrees of difference between the core and the surface add up to something like 0.018 W/m^2.]

    I was not talking about energy from Earth’s core. During the day, there is a downwelling conductive flux as the sun warms the soil. At night, the flux is reversed.

    I realized the two fluxes therefore offset each other in a global average: net zero conduction, so the SB temperature is not affected.

  47. Svante says:

    You see the same opaqueness when you look up at the sky. You cannot see deep space in those bands; you see something warmer. Heat loss is driven by the T^4 difference, so add a little more opaqueness and you lose less.
    The saturation argument fails because you have to repeat the calculation through more layers.

  48. Svante says:

    Anything that evens out the surface temperature will reduce the energy loss rate a bit because of the T^4 asymmetry.

  49. Snape says:

    Svante,
    [You see the same opaqueness when you look up at the sky. You cannot see deep space in those bands; you see something warmer.]

    Yes. I didn’t explain it right.

    Much of the radiation detected by satellites has been found to originate from high in the troposphere. However, an IR detector located in the mid-troposphere and pointed down would also detect large amounts of radiation, this time emitted from the lower troposphere.

    So you need to be careful about what conclusions are drawn from satellite-based observations.

    *****

    Sorry, I don’t know what you mean by T^4 asymmetry.

    • Svante says:

      T^4: output power (energy per unit time) depends on temperature to the power of four, so a given increase in temperature adds more power than the same-sized decrease removes.
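
      A tiny illustration (my own numbers) of why evening out the temperature reduces emission: two patches at 250 K and 290 K emit more, on average, than a uniform surface at their mean of 270 K.

        SIGMA = 5.67e-8
        uneven = 0.5 * (SIGMA * 250**4 + SIGMA * 290**4)   # average of the two patches
        even = SIGMA * 270**4                              # uniform surface at the mean temperature
        print(round(uneven, 1), round(even, 1))            # ~311.3 vs ~301.3 W/m^2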

      You need a computer program to calculate up/down flux at different altitudes:
      http://climatemodels.uchicago.edu/modtran/

      Energy balance must be satisfied at both surface and TOA, but it is more easily understood at the TOA.

      If there is a TOA imbalance the surface must follow (unless the lapse rate changes).

  50. Snape says:

    Thanks for the cool link.

  51. Snape says:

    To my original point way upthread:

    given an unchanged lapse rate, any warming of the surface (GHE, less albedo, UHI, whatever) must result in a higher ERL.

  52. Svante says:

    No, that altitude is determined by the opaqueness; less albedo at the surface will do a parallel shift of the lapse rate curve.

  53. Svante says:

    In other words, less albedo means more output power at the same ERL.

    More GHGs mean the same output power at a higher ERL.
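
    A toy Python sketch of that picture (all numbers assumed, assuming an unchanged lapse rate so that T_s = T_ERL + lapse rate × ERL height):

      SIGMA = 5.67e-8
      LAPSE = 6.5e-3                                 # K/m, typical tropospheric lapse rate

      def surface_temp(olr, z_erl):
          t_erl = (olr / SIGMA) ** 0.25              # emission temperature at the ERL
          return t_erl + LAPSE * z_erl

      print(round(surface_temp(240.0, 5000.0), 1))   # baseline: ~287.6 K
      print(round(surface_temp(240.0, 5200.0), 1))   # more GHGs: same output, higher ERL -> warmer surface
      print(round(surface_temp(243.0, 5000.0), 1))   # less albedo: same ERL, more output -> warmer surface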

  54. Snape says:

    [In other words, less albedo means more output power at the same ERL].

    Understood!
    It is refreshing to argue with someone who is smart and well informed.

  55. Svante says:

    Thanks, I’m only slightly ahead of you here.

    Most of the criticism you see here is rubbish, but an interesting one is from Kristian, saying output power is not actually going down. Well-informed people like barry point out that the data uncertainty is too high.
    I think it may well be true due to a) feedbacks and b) the Earth having been playing catch-up for over two hundred years, as in fig. 1D here:
    https://www.pnas.org/content/111/47/16700

    To get further you need to sort out the different components like fig. 8 here:
    https://journals.ametsoc.org/doi/full/10.1175/JCLI-D-18-0045.1

    8b) does show a slight decline in the GHG output (gold).

    Critics will point out that this is modelling, but when you compare theoretical calculations with observations they are pretty good.

  56. Rah says:

    But today the IPCC doesn’t seem to believe total solar irradiance is much of a factor, and yet there was no cascade of positive feedbacks resulting in catastrophic warming.

    Instead, the oceans were apparently able to take up so much that there was an explosion in life forms to take advantage of the situation.

    Also, during the later part of the Cretaceous, insolation was apparently high enough to result in the algae blooms which eventually formed the White Cliffs of Dover.

  57. Snape says:

    @Svante

    I would like to add something to your dam analogy (other thread):

    A higher dam forces water entering the reservoir to take a less direct route to get to the other side. This is due to greater vertical mixing.

    A less direct route means it takes longer. Taking longer equals an increase in residence time. An increase in residence time increases the volume of water within the reservoir.

    You could widen the dam instead of making it taller – increasing the time spent in lateral mixing. Same result.

    ******

    When CO2 absorbs energy that was moving in the direction of space, much of the energy is forced to travel in a less direct route ……

    • Svante says:

      Temperature corresponded to water level in the analogy.
      So volume would correspond to thermal energy/heat capacity.

      We could send water through real fast with turbo pumps, but the dam would still have the same contents.

      In the atmosphere, residence time would depend on how long molecules hold on to vibrational energy, and how long they play billiards with other molecules. I don’t think that will affect the temperature distribution in the atmosphere. Once there is equilibrium there will be one quantum in and one quantum out, regardless of residence time.

      You should ask for a professional answer from Tim Folkerts next time he comes around!

  58. Snape says:

    Do you think downwelling infrared leaves the atmosphere just as soon as upwelling infrared?

  59. Snape says:

    I think Tim Folkerts understood the idea, but didn’t know how it could be measured WRT the atmosphere.

    ******

    One turtle per second enters a soccer field, and one turtle per second leaves the soccer field.
    How many turtles are on the field?

  60. Snape says:

    Better yet, think of a cardboard box, closed on all six sides.

    There are two little holes in the box, where bees are constantly entering and leaving. One bee per second enters the box through one of the holes, and one bee per second exits through the other.

    How many bees are in the box?
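
    Put numerically (with an assumed residence time), the steady-state count is just throughput times residence time:

      rate_in = 1.0                # bees per second entering (and, at steady state, leaving)
      residence = 600.0            # seconds the average bee spends inside (assumed)
      print(rate_in * residence)   # 600 bees inside, even at only 1 bee/s of throughput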

    • Svante says:

      Snape says:
      “Do you think downwelling infrared leaves the atmosphere just as soon as upwelling infrared?”

      No, I think DWIR will be stopped nearer the top on average, so a bigger fraction will exit immediately. There’s a lot more IR near the surface and it has a long way to the top.

      Does your tracking include time spent thermalized in gas molecules? That would be most of it, since EM travels at the speed of light.

  61. Snape says:

    Stopped nearer the top still takes longer than not stopped at all.

    As for thermalized…..

    With no GHGs, radiation travels spaceward in a straight line at the speed of light. If you think of the atmosphere as a box, then these joules leave the box in a fraction of a second.

    When thermalized, the same quantum of energy travels at maybe 2 or 3 mph in the direction of space. Some of the energy even travels laterally (advection), and some sinks back towards the surface or is returned as rain.
    So yes, there is a huge increase in residence time as a result of thermalization.
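
    Rough transit times for, say, 10 km of atmosphere (speeds assumed) make the contrast concrete:

      height = 1.0e4        # m, assumed depth of the "box"
      c = 3.0e8             # m/s, speed of light
      v_thermal = 1.0       # m/s, roughly 2-3 mph of bulk transport (assumed)

      print(height / c)          # ~3e-5 seconds straight out
      print(height / v_thermal)  # ~10,000 seconds (hours) when thermalized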

  62. Snape says:

    [If energy has no efficient way to escape, then even a weak rate of energy input can lead to exceedingly high temperatures, such as occurs in the sun. I have read that it takes thousands of years for energy created in the core of the sun from nuclear fusion to make its way to the sun’s surface.]

    Residence time, Svante. Are you ready to throw in the towel?

  63. Snape says:

    How many bees are in the box (steady state) if it takes each bee a thousand years to leave?

  64. Snape says:

    Tired of my rants?

    Take care, Svante. I hope you stay healthy.

    • Svante says:

      No, but it’s easier to mess around with crazy stuff from the brat et al.

      Half hoping I’ll catch it, it’s the only way to get good immunity.

  65. Quintus says:

    My friends have told me that smoking a pipe is healthier. I am thinking of buying myself a pipe on this website, which has good reviews. Do you know it?
