Tuesday, October 27, 2015

Hansen's 1988 predictions revisited.

Hansen's famous 1988 paper used runs of an early GISS GCM to forecast temperatures for the next thirty years. These forecasts are now often checked against observations. I wrote about them here. That post had an active plotter which allowed you to superimpose various observation data on Hansen's original model results.

We now have nearly four more years of results, so I thought it would be worth catching up. I've updated to Sept 2015, or latest available. Hansen's original plot was matched to GISS Ts (met stations only), and used a baseline of 1951-80. I have used that base where possible, but for the satellite measures UAH and RSS I have matched to GISS Ts (Hansen's original index) over the 1981-2010 mean. That is different from the earlier post, where I matched all the data to GISS Ts. But there is also a text window where you can enter your own offset if you have some other idea.
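For concreteness, here is roughly what the offset matching amounts to - a minimal Python sketch with made-up numbers (the plotter itself is Javascript; the function name and toy data are just for illustration):

```python
# Minimal sketch of baseline matching: shift one annual anomaly series so
# that its 1981-2010 mean equals that of a reference series.
# All numbers here are made up for illustration.

def rebaseline(series, target, start=1981, end=2010):
    """Offset `series` so its mean over [start, end] matches `target`'s."""
    years = [y for y in range(start, end + 1) if y in series and y in target]
    offset = (sum(target[y] for y in years)
              - sum(series[y] for y in years)) / len(years)
    return {y: v + offset for y, v in series.items()}

# Toy data: a "satellite" series running a constant 0.5 C below "surface".
surface = {y: 0.01 * (y - 1981) for y in range(1981, 2011)}
satellite = {y: surface[y] - 0.5 for y in surface}
matched = rebaseline(satellite, surface)
```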

A reminder that Hansen did his calculations subject to three scenarios, A, B and C. GCMs do not predict the future of GHG levels, etc - that must be supplied as input. People like to argue about what these scenarios meant, and which is to be preferred. The only test that matters is what actually occurred, and the measure of that is the actual GHG concentrations he used, relative to what we now measure. The actual numbers are in files here. Scenario A, highest emissions, has 410 ppm CO2 in 2015; Scenario B has 406, and Scenario C has 369.5. The differences between A and B mainly lie elsewhere - B allowed for a volcano (much like Pinatubo), and of course there are other gases, including CFCs, which were still being emitted in 1988. Measured CO2 fell a little short of Scenarios A and B, and methane fell quite a lot short, as did CFCs. So overall, the scenario that actually unfolded was between B and C.
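To put those scenario CO2 numbers on a comparable scale, one can use the standard simplified forcing expression ΔF ≈ 5.35 ln(C/C0) of Myhre et al. (1998). This is CO2 only, and not Hansen's own 1988 formulation, so it is only a rough sketch:

```python
from math import log

def co2_forcing(ppm, base=280.0):
    """Simplified CO2-only radiative forcing in W/m^2 (Myhre et al. 1998 form)."""
    return 5.35 * log(ppm / base)

# 2015 CO2 from the scenario files quoted above
scenarios = {"A": 410.0, "B": 406.0, "C": 369.5}
for name in "ABC":
    print(name, round(co2_forcing(scenarios[name]), 2))
```

On this measure Scenario C's 369.5 ppm sits roughly half a W/m2 below A and B.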

Remember, Hansen was not just predicting for the 2010-15 period. In fact, his GISS Ts index tracked Scenario B quite well until 2010; then his model warmed while the Earth didn't. But then the model stabilised while lately the Earth has warmed, so once again the Scenario B projections are coming close. Since the projections actually cool from now to 2017, it's likely that surface air observation series will be warmer than Scenario B. GISS Ts corresponds to the actual air measure that his model provided. Land/ocean indices include SST, which was not the practice in 1988.

So in the graphic below, you can choose with radio buttons which indices to plot. You can enter a prior offset if you wish. It's hard to erase on an HTML canvas, so there is a clear all button to let you start again. The data is annual average; 2015 is the average to date. You can check the earlier post for more detail.

Update - I have hopefully improved the Javascript to keep everything together.

66 comments:

  1. Nick,
    This is wonderful. In order to get the Hansen B&W raster, I needed to press the "clear all" button. Thank you for your many fascinating posts.

    1. Thanks, JF. The original post, from 4 years ago, had that fault, so I put in the warning. I could never find out why, being fairly new to JS. This time I didn't have the problem, but yes, that is the remedy.

  2. Hmm. I am not sure I agree that the land-only indices should be a better match to the Hansen chart. While in 1988 they may have used surface air temperature over the oceans rather than SSTs, my guess is that those over-ocean air temperatures are likely to be closer to the SSTs than they are to the land air temperatures. But you do give us the option of comparing whatever we want, which I appreciate.

    Anyway, very neat tool, thank you!

    -MMM

    1. Nick, I'm inclined to agree with Anon/MMM as to the reliability of a land-only index, which is what I thought GISS Ts was. I'm happy to be corrected on this, though.

      I also agree with anon/MMM: a very useful tool!

    2. Thanks, Bill and MMM,
      GISS Ts uses met station data. But it is the original index, from GISS's 1988 era. In effect, the land data is extrapolated over the ocean to get a global average. This of course puts a big premium on ocean stations.

      Of course it isn't an ideal index. But it is the only one that is air-based and claims to be global surface. On the general question of relating air-based GCM output to land/ocean indices, there is the recent paper of Cowtan et al 2015.

  3. My stupid Firefox on stupid Android is messing this up. Comments over graph. Hansen plot no longer showing. No, I won't clear the cache. :-(

    1. A trick I learnt recently was Ctrl-F5, which reloads just that page from source rather than cache. I don't know if it works on Android.

      Yes, I re-used the Javascript from the old post; if starting it again, I would do it differently. Sorry about the trouble.

    2. Revisited with a new/different Android + Google Chrome. Now it's not working at all. ?.. Ah, duh - after swiping to the left, some buttons appeared on the right which do have an effect: random unknown graphs appearing. Just the text at the buttons is missing.

  4. Three points Nick...
    1) Wouldn't it be better to start the satellite recordings at the TempLS (or GISS Ts) recorded value for 1978?
    2) As pointed out above, the previous messages are overwriting the graph in Firefox.
    3) In Firefox the buttons for presets don't work. Also they need to be in 10ths of a degree, as 1 - 10 degrees goes off the graph!
    Otherwise a superb tool.
    Dave UK

    1. Thanks, Dave, I guess the message is, needs more work. I'll see if I can rewrite using my more recent practices, which should fix these issues.

      The presets are meant to work just by writing in whatever you want as a decimal. Then it will be applied when you press the radio button.

      As to satellite, I don't think there is any best answer; really it isn't a comparison that should be made. If the offsets work, that is meant to let you make that choice.

    2. Dave,
      I've fixed the step size - thanks for noting it.

  5. Per this site: http://www.esrl.noaa.gov/gmd/aggi/aggi.html, added GHG forcing from the 5 gases specified in the linked file was 2.82 W/m2 in 2014. Using the NOAA site's methods, the estimated GHG forcings for Scenarios B and C are 3.61 and 2.74 W/m2. So current forcing is much closer to Scenario C than B.

    1. Forgot to mention that Scenario B and C values above are for 2014.

  6. Interesting - Scenarios B and C seem to produce Agung- or Chichon-size eruptions every 20 years or so. There seem to be such eruptions in 1996 and 2016, but no Pinatubo...

    I believe that Gistemp dTs would benefit from GHCN v4. V3 has become really poor with islands in recent years. The Gistemp 1200 km interpolation map based on land stations only doesn't include Hawaii and the Azores for instance.
    I think it is possible to construct better global 2 m observational indices, e.g. by taking a global reanalysis field and "anchoring" it in real observed station temps. A similar approach to the Cowtan and Way hybrid indices, but without SST data.

  7. Nick, Thanks for the reference to the Cowtan et al paper. I agree this does indicate use of GISS Ts as appropriate.

    I also note that Oct 2015 NCEP/NCAR anomaly values remain more than 0.2 degrees above September values as we near the end of October. This is quite a spike. Temperatures here in Twickenham, UK, are very mild indeed at the moment with 19 degrees forecast for the big Antipodean sporting event here on 31st Oct. Things look somewhat cooler than they have been where you are, though other parts of Oz are still toasty.

    1. Hansen himself commented on this in a paper from 2006:

      http://pubs.giss.nasa.gov/docs/2006/2006_Hansen_etal_1.pdf

      "Temperature change from climate models, including that reported in 1988 (12), usually refers to temperature of surface air over both land and ocean. Surface air temperature change in a warming climate is slightly larger than the SST change (4), especially in regions of sea ice. Therefore, the best temperature observation for comparison with climate models probably falls between the meteorological station surface air analysis and the land–ocean temperature index."

      At that time the match between model and temperature was spot on. Hansen commented:

      "Close agreement of observed temperature change with simulations for the most realistic climate forcing (scenario B) is accidental, given the large unforced variability in both model and real world. Indeed, moderate overestimate of global warming is likely because the sensitivity of the model used (12), 4.2°C for doubled CO2, is larger than our current estimate for actual climate sensitivity, which is 3°C for doubled CO2 , based mainly on paleoclimate data"

      Still applies. A good match is not good news considering the higher climate sensitivity of the 1988 model.

    2. Hi Bill,
      I think ehak's quote is relevant there. Basically, Ts is trying to measure the right thing for comparison, but misses some ocean. Land-ocean isn't measuring quite the right thing, but is more complete.

      It is warm here. In fact, there is an article here saying that Australia will be making its contribution to a record hot global month in October.

  8. Why are RSS and UAH plotting differently here than in woodfortrees? http://www.woodfortrees.org/plot/hadcrut4gl/from:1978/plot/rss/from:1978
    I would think that if you used the same baselines, the fit for the satellite data would be well below Hansen's Scenario C?

    1. Anon,
      WFT is still using UAH 5.6. I'm using V6, which is very different. RSS, I'm not sure about. The baselines are of course very different.

      The thing is, though, that the satellites are measuring a very different place, so the curves are very different. There is no completely satisfactory baseline setting to compare them. They simply are not at all what Hansen was modelling.

    2. Nick,

      Is it true that the balloon data show about the same as the satellites (slow warming compared to the models)?
      Is it also true that the troposphere needs to warm as quickly as the surface? If the surface heats more quickly, wouldn't increased convection bring the troposphere into alignment with the surface?

      If all this is true, then it is difficult to reconcile the surface and satellite warming rates.
      Probably all this is not true, then?

    3. Anon,
      No, balloon data show an unabated warming trend of 1.9 C/century since 1970:
      https://tamino.files.wordpress.com/2015/06/ratpac97.jpeg

      The satellite series show a clear trend break versus balloon data at the turn of the century
      https://tamino.files.wordpress.com/2015/09/rssminusrat.jpeg

      The same applies to UAH v6. Something happens at about the year 2000, probably some error in the merging of MSU and AMSU satellite data.
      If satellite-derived temperatures had followed weather balloon data, they would have been 0.2-0.3 C higher today.

      For those who want an alternative to troposphere temperatures by UAH and RSS,
      Ratpac 850-300 hPa can be found here:
      http://www1.ncdc.noaa.gov/pub/data/ratpac/ratpac-a/RATPAC-A-seasonal-layers.txt
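Incidentally, trend figures like "1.9 C/century" are just least-squares slopes fitted to the annual data. A minimal sketch, using synthetic numbers rather than the RATPAC data itself:

```python
def ols_trend(years, anoms):
    """Ordinary least-squares slope, in degrees per year."""
    n = len(years)
    my, ma = sum(years) / n, sum(anoms) / n
    num = sum((y - my) * (a - ma) for y, a in zip(years, anoms))
    den = sum((y - my) ** 2 for y in years)
    return num / den

# Synthetic series warming at exactly 0.019 C/yr, i.e. 1.9 C/century.
years = list(range(1970, 2016))
anoms = [0.019 * (y - 1970) for y in years]
trend_per_century = ols_trend(years, anoms) * 100
```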

    4. Po-Chedley's analysis of MSU/AMSU data finds tropical tropospheric warming consistent with models, as does Sherwood's radiosonde analysis.

      http://www.atmos.uw.edu/~qfu/Publications/jtech.pochedley.2015.pdf

      http://web.science.unsw.edu.au/%7Estevensherwood/IUK2_v3.pdf

  9. Interesting, indeed. Clean and clear. Thanks

    1. What's clear? A couple of years ago I read that Hansen is stupid and wrong. Is that still clearly the case?

  10. Hi,

    AFAIK the problem with going back to Hansen et al 1988 is that the climate sensitivity is too high and the forcing too low. The "good match" is a product of the two, so if you want to evaluate the model you have to start by rerunning it with updated forcings, in which case, over 25 years, it would probably run too hot.

    The really impressive parts of the paper are the 100 year control run, and the predictions of spatial variation of the warming which, as the paper said, would be the most powerful way of evaluating the model. Don't think this has been done, or at least not to Eli's knowledge, which ain't that much

    1. Eli,
      Hansen says the ECS is 4.2°C. That is within the IPCC range, though at the upper end. The question of forcing of course relates to the scenarios. As said in the post, for Scenario B, CO2 is a little high for 2015, and CH4 and CFCs much too high. His CH4 was 2170 ppb in 2010, vs an actual 1850.

      On spatial prediction, here is a comparison of change to end 2014, vs "2010's" in Hans88. I think it's about as good as can be expected, but not a major success. It's a bit hard to tell with the different color schemes.

    2. As I posted above, the GHG forcing for the 5 gases in the linked Hansen scenario file is slightly higher than Scenario C in 2014. Aerosols, though, are the missing piece and may be compensating for the relatively high model ECS.

  11. Talking about balloon measurements and the upper atmosphere, I am beginning to think scientists such as Richard Lindzen really screwed up the fundamental understanding of atmospheric patterns.

    I submitted a paper that clearly shows that the QBO of atmospheric winds is completely guided by the boundary conditions of the lunar gravitational potential.

    http://contextearth.com/2015/10/09/qbo-is-a-lunar-solar-forced-system/qbo_paper/

    It's the most bizarre situation where you read all these papers that claim QBO is not predictable beyond a year or two yet it's quite obvious that it aligned with the tidal forces, which is one of the most predictable phenomena that I am aware of. Too many scientists ran with Lindzen's original QBO theory, which is likely as wrong as all his other theories.

    At one time I was interested in closely tracking the temperature, but when you find these other interesting correlations, it seems kind of a waste of time. Much more important to show what the actual forcings are and start to build up a foundation for explaining natural variability. Like QBO, ENSO is also one of these phenomena that is guided by lunar gravitational potential boundary conditions.

    In straightening this out we would also get denier nutcases like David Young off your case who claim that realistic models are mathematically intractable.

    Scientists such as Lindzen and Young are only in it because they are born contrarians and enjoy over-complicating the science. It's really time to retrench and look at the simple patterns and apply these to natural variability.


    1. Tropical year of 265.242 days? Is that a typo?

    2. yes, it should be 365.242 days, thanks

  12. A fascinating discussion.

    Is there any reason why UAH (take your pick as to which of the numerous, er, corrections you use) and RSS have acquired "authoritative" status - not just among the haters of AGW theory, to use one possible alternative to the word d***ers?

  13. This comment has been removed by the author.

  14. WHT,

    Remarkable coincidence here:

    I notice in Nick's blog roll that the latest post on, of all blogs, Tallbloke's Talkshop is about lunar guidance of the QBO (and QDO). Tallbloke is keen on the idea that climate variation can be explained by the alignments of sun, moon and planets (once described by Watts as "transcendent rant", but maybe he is on to something here ;) ).

    1. Bill, The ones to pay attention to are NASA JPL, who had a fairly recent proposal
      "Importance of the Earth-Moon system for reducing uncertainties in climate modelling and monitoring"
      ftp://ftp.cerfacs.fr/pub/globc/exchanges/cassou/GOASIS/Fermat_2009.pdf

      I tracked this down and as it turned out the project never got funded and so the project lead went off on her own.

    2. One of the interesting themes that Perigault from JPL included in the proposal was the number of times that tidal signals were intentionally removed from time series data. This despite the fact that removing signals that might have an impact on the result is not necessarily the smartest path when one is trying to establish causality. It's as if someone removed a 60 Hz electrical signal from a voltage and then you were asked to find where the hum was coming from.

      Lindzen knew that the lunar signal could have been in the QBO and wrote this in one of his papers:

      " 5. Lunar semidiurnal tide : One rationale for studying tides is that they are motion systems for which we know the periods perfectly, and the forcing almost as well (this is certainly the case for gravitational tides). Thus, it is relatively easy to isolate tidal phenomena in the data, to calculate tidal responses in the atmosphere, and to compare the two. Briefly, conditions for comparing theory and observation are relatively ideal. Moreover, if theory is incapable of explaining observations for such a simple system, we may plausibly be concerned with our ability to explain more complicated systems. Lunar tides are especially well suited to such studies since it is unlikely that lunar periods could be produced by anything other than the lunar tidal potential." -- from Lindzen, Richard S., and Siu-Shung Hong. "Effects of mean winds and horizontal temperature gradients on solar and lunar semidiurnal tides in the atmosphere." Journal of the Atmospheric Sciences 31.5 (1974): 1421-1446.

      In fact the lunar signal was always in QBO, and hiding in plain sight. That's what JPL was trying to discover.

  15. Thanks, WHT. Perusing Tallbloke's blog posts I notice: that he has some familiarity with your work; that some sort of collaboration was considered; he then went off in a huff, claiming that you had stolen his ideas. All this chimes with my own attempts at "dialogue" with TB. Like a lot of people hostile to AGW theory he's an ardent fan of Richard Lindzen ( https://tallbloke.wordpress.com/2013/03/09/lunch-with-lindzen/ ), ironic in view of Lindzen's dismissal of the idea of predictable cycles in climatology.

    Perigault et al. are of a totally different order compared with the "citizen scientist", Tallbloke. I'll follow this with great interest.

    1. Bill said: "Perusing Tallbloke's blog posts I notice: that he has some familiarity with your work; that some sort of collaboration was considered; he then went off in a huff, claiming that you had stolen his ideas. "

      Tallbloke and his followers are delusional. He must have thought I was some sort of ally of his, as I don't wear an AGW badge at my blog, preferring to concentrate on the science. I do wonder how Tallbloke can spin his belief in lunar and solar cycles when his hero Lindzen is the last person who will back him on it. There is so much schizophrenia on the denier side, as none of them can square their beliefs with their erstwhile allies.

  16. Thanks for the interest, Bill; we are also working on this at the Azimuth Project forum:

    http://forum.azimuthproject.org/discussion/1640/predictability-of-the-quasi-biennial-oscillation#latest

    Lindzen is the interesting angle. He can't just dismiss the match of the lunar cycles with the aliased QBO cycles, largely because of what he said in 1974. Typically, you can dismiss an idea because the numbers don't match. In this case he may suggest that the forcing can't be strong enough to produce the result, but then he will have to claim the match is just a coincidence.

  17. Nick: Has anyone ever commented that Hansen's Scenario C shows no evidence for "committed warming"? We are told that once forcing plateaus - as it does in 2000 for Scenario C - an additional 0.5 degC or more of warming should be expected. With high model climate sensitivity, committed warming should be even larger than usual. Scenario C shows no warming after 2000. If the expected amount of committed warming did appear after 2000, observed warming would be less than the projection for Scenario C. I don't think we should be trying to draw any conclusions about the agreement (or lack thereof) between the projections of the early model and observed warming.

    The details of the mixed layer of the model ocean seem somewhat strange. The mixed layer is 125 m in depth and the diffusion coefficient beneath is 1 cm2/s. If these figures are unrealistic, then the transient response of the model will be flawed.

    Note that Hansen included a volcanic eruption in 2015 (and 1995) in Scenarios B and C.

    Frank

    1. Frank,
      I'm not aware of any comments on that. Of course, the time frame is short - temperature does rise till 2005, and then there is a 15-year pause. Well, we know that sort of thing is not impossible :). And the volcano in 2015 does seem to cause a dip.

      Yes, I think the mixed layer is acknowledged to be rough. It pushed people to do better with AOGCMs.

    2. Scenario C was run on to 2040 it seems (see Figure 3 from Hansen et al. 1988), with another eruption in 2025. It looks like there was about 0.25C warming between 2000 and 2040.

    3. Frank said
      "The details of the mixed layer of the model ocean seem somewhat strange. The mixed layer is 125 m in depth and the diffusion coefficient beneath is 1 cm2/s. If these figures are unrealistic, then the transient response of the model will be flawed."

      You are likely confused Frank. The chopping action of the waves together with the vertical eddies that occur in the open ocean can create an effective diffusion coefficient much greater than still water. I remember the loons on Curry's blog Climate Etc that ridiculed the idea that water could have as high a thermal diffusion coefficient as copper. Denialists are so desperate that they will contort the application of physical properties to suit their twisted agenda. I suggest that you don't listen to those fake talking points and instead read the scientific literature where you will find effective vertical eddy diffusion coefficients on the order of 1cm^2/sec.

    4. WHT, Nick: I understand why the mixed layer of the ocean responds rapidly to changes in surface temperature. 1 cm2/s refers to the rate at which heat is transported below the mixed layer. This rate obviously requires convection, not conduction, so comparisons to the thermal conductivity of copper aren't useful.

      If Hansen's model had shown the expected committed warming after forcing ended in 2000, the model would have predicted greater warming than we have observed. Observed forcing would have been greater than the forcing in Scenario C and warming would be less - demonstrating the failure of the model. So the success of Hansen's model touted by Nick appears to be due to the absence of committed warming - not its ECS. The rate at which committed warming develops should depend on the rate at which heat is moving below the mixed layer, hence my comment about this process and the unusual thickness of the mixed layer.

      We know from Otto (2013) that the observed warming and forcing from 1970-2010 are most consistent with an ECS of 2.0 K. Nick is trying to convince readers that a climate model with an ECS of 4.2 accurately predicted warming over the same period. Something is SERIOUSLY wrong with this conclusion.

      However, it would be more appropriate to analyze this situation in terms of TCR than ECS. ECS doesn't belong in any discussion of observed vs projected warming. Observed warming since 1970 and forcing fit best with a TCR of 1.4 K, which isn't dramatically different from the average of most climate models (1.8 K). A better analysis of this situation would start with the actual forcing change in Hansen's three scenarios and the TCR of Hansen's Model. It would also include confidence intervals (rather than aligning records by eye).

      The one period when TCR wouldn't be better than ECS is after 2000 in Scenario C, when committed warming is absent. Models with the highest ECS don't generally have TCR dramatically greater than average.

      Frank
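As a sketch of the TCR arithmetic Frank describes: transient warming is roughly TCR x ΔF / F_2x. The forcing change below is an illustrative number, not one taken from Hansen's scenarios:

```python
F2X = 3.7  # W/m^2 for doubled CO2, a commonly used value

def transient_warming(tcr, dF):
    """Warming implied by a transient climate response for a forcing change dF."""
    return tcr * dF / F2X

dF = 1.8  # illustrative forcing change in W/m^2, assumed for the example
print(transient_warming(1.4, dF))  # an observation-based TCR
print(transient_warming(1.8, dF))  # a typical model-average TCR
```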

    5. Frank said: "1 cm2/s refers to the rate at which heat is transported below the mixed layer."

      Any measure with the units distance^2/time is referring to a random walk phenomenon. Classical inter-molecular diffusion is one kind of random walk, but it doesn't have to be at that level. It could just as easily be a collection of eddies transporting heat in random directions.

      The observation is that the Pacific ocean's thermocline will slosh up and down, which results in the oscillating El Nino/La Nina pattern. This is enough to move heat all over the world, yet you refuse to believe that behavior this strong might push heat deeper through a random walk mechanism?

      And remember, even 1 cm^2/sec won't reach the depths over time that you imagine --- it goes as sqrt(D*t).
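A quick check of that sqrt(D*t) scaling, taking the quoted D = 1 cm^2/s at face value:

```python
from math import sqrt

D = 1e-4                 # 1 cm^2/s expressed in m^2/s
SECONDS_PER_YEAR = 3.156e7

def penetration_depth_m(years):
    """Characteristic diffusive length scale sqrt(D*t), in metres."""
    return sqrt(D * years * SECONDS_PER_YEAR)
```

Thirty years gives a diffusive length of roughly 300 m - well short of the deep ocean, and doubling the depth takes four times the time.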

  18. I'm trying to understand why actual CO2 ppm should be used rather than actual CO2 emissions. Is this for ease or for accuracy or for some other reason? I think if the objective is to test how well the model performed, then the actual emissions and land use changes should be used as inputs, if that's possible. Hansen says the global carbon sink responded much differently than expected, i.e., it strengthened significantly rather than weakened.

    1. At least in Hansen's day, a GCM just modelled the atmosphere, and used the atmosphere constituents - ie ppm. It takes another model to convert emissions to ppm. Hansen didn't have one, but in any case I'm not sure it is a good idea. It adds assumptions and uncertainty. Land use is particularly hard to get precision. Some modern AOGCMs do include that modelling.

    2. Wow, talk about a model T!
      LOL

      Thanks.

    3. Nick Stokes wrote:
      "...in Hanen's day"

      Sad to say, it still is Hansen's day and he's still getting ignored. Does this have anything to do with why this topic came up?

    4. Nick wrote: "It takes another model to convert emissions to ppm. It adds assumptions and uncertainty."

      Is this why the IPCC switched from emission scenarios to RCPs? They didn't want to add the uncertainty inherent in converting emissions into accumulation to all of the other uncertainties inherent in their projections. ... Especially the one they never talk about - parameter uncertainty. It is hard to tell political leaders just how much fossil fuel they can burn and still keep warming below an arbitrary 2 degC above a pre-industrial period lacking thermometers. /sarcasm

      Frank

    5. Frank,
      "Is this why the IPCC switched from emission scenarios to RCPs?"
      That isn't what they did. GCMs actually use concentrations as input, and that was what was provided. People may base expected concentrations on expected emissions, but that is external to the GCM.

      The RCPs formalised this. From van Vuuren et al, 2011, step 5 in RCP formulation:
      "The emission data were converted to concentration data, using a selected simple carbon-cycle climate model for well-mixed greenhouse gases and an atmospheric chemistry model for reactive short-lived substances."

      Some models do have the carbon modelling attached, and I believe they used emissions directly. But the RCP did include the conversion.

    6. Nick: We are emitting enough CO2 every year to increase atmospheric CO2 by 4 ppm/yr, but it is accumulating at only 2 ppm/yr. A 50% cut in emissions should stabilize atmospheric CO2. Instead, a need for an 80% cut in emissions is projected. A major change in the poorly understood relationship between emission and accumulation appears to be anticipated. The uncertainty associated with that relationship disappears when you change from emissions scenarios to RCPs. Perhaps that uncertainty was also being ignored when forecasts were reported for emission scenarios. In AR5 WG1, Figure SPM.10, "Global mean surface temperature increase as a function of cumulative total global CO2", seems grossly overconfident, but perhaps I don't fully understand how it was arrived at.

      Frank

    7. Frank,
      "A 50% cut in emissions should stabilize atmospheric CO2."
      Why do you think the cut in emissions would not reduce sea uptake? I plotted total emitted vs air ppm here (3rd and 4th graph). The airborne fraction is very steady.

      I think Fig 10 is interesting, but it doesn't express the error ranges very well. They do seem substantial. I think the point it is making is that the different time scales of the scenarios really don't much affect the dependence of temperature (in the model means) on total CO2.

  19. Nick asked: "Why do you think the cut in emissions would not reduce sea uptake?"

    Uptake processes respond to the concentration of CO2 in the air (sometimes linearly, sometimes not). The rate of transport of CO2 into the mixed layer depends on how much is in the atmosphere. The rate CO2 is released from sinks is independent of the amount in the atmosphere, but depends on the amount of CO2 stored inside the sink. At 280 ppm, natural uptake equaled natural release.

    Right now, natural uptake exceeds natural release by the equivalent of 2 ppm/yr, because 400 ppm, not 280 ppm, is in the atmosphere. The difference may be 2 ppm/yr because uptake has increased by 2 ppm/yr and release hasn't changed, or because uptake has increased by 5 ppm/yr and release from sinks (which are now holding more CO2) has increased by 3 ppm/yr.

    No natural mechanism of uptake or release has any idea of how much CO2 man releases in any particular year. If we stop releasing CO2 completely, CO2 will fall at an average rate of 2 ppm/yr. The "airborne fraction" is the net result of a very complicated set of uptake and release processes. The observed, relatively constant "airborne fraction" is not the result of nature somehow knowing how much CO2 man has released in any given year and then doing something with half of it.
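As a caricature of that argument, consider a one-box model in which uptake responds only to the atmospheric excess over 280 ppm. This is deliberately crude (the real carbon cycle has many reservoirs and timescales), but it shows how a 50% cut could stabilize CO2 without any process "knowing" the emissions:

```python
def step(C, E, k=2.0 / 120.0, dt=1.0):
    """One-box model: CO2 growth = emissions minus an uptake proportional
    to the excess over the 280 ppm preindustrial level. k is calibrated so
    that 4 ppm/yr of emissions at 400 ppm gives 2 ppm/yr of growth."""
    return C + dt * (E - k * (C - 280.0))

C = 400.0
for _ in range(200):   # halve emissions to 2 ppm/yr...
    C = step(C, E=2.0)
# ...and in this toy model CO2 holds at 400 ppm.
```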

    The IPCC is projecting - without discussing uncertainty - that the net result of changing natural uptake and release processes will be an increase in the "airborne fraction". As you can see in the linked poster, the airborne fraction appears to be rising, but the trend probably isn't statistically significant. The rest of the poster is marginally useful if this subject is unfamiliar to you.

    http://www.esrl.noaa.gov/gmd/co2conference/posters_pdf/jones1_poster.pdf

    Frank

    1. Frank,
      "The IPCC's is projecting - without discussing uncertainty - that the net result of changing natural uptake and release processes will be an increase in the "airborne fraction"."

      Really? Here's what they say on p 495 of AR5:

      "A positive trend in airborne fraction of ~0.3% yr–1 relative to the mean of 0.44 ±0.06 (or about 0.05 increase over 50 years) was found by all recent studies (Raupach et al., 2008, and related papers; Knorr, 2009; Gloor et al., 2010) using the airborne fraction of total anthropogenic CO2 emissions over the approximately 1960–2010 period (for which the most accurate atmospheric CO2 data are available). However, there is no consensus on the significance of the trend because of differences in the treatment of uncertainty and noise (Raupach et al., 2008; Knorr, 2009). There is also no consensus on the cause of the trend (Canadell et al., 2007b; Raupach et al., 2008; Gloor et al., 2010)."

      Not exactly a projection (just an observation), and plenty of uncertainty.

      Keeping AF constant doesn't imply that the sea calculates anthropogenic emissions. If you had a succession of equilibrium states between a gas and liquid, successively adding gas, then Henry's Law says that a prescribed fraction would go into the liquid. That is just dependent on the new state, not on a calculation of the increment. Of course we don't have equilibrium here, but the tendency for airborne fraction to be maintained will still be there.
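      A toy version of that partitioning argument (the ratio r = 1.2 is made up, not a measured value):

```python
# If dissolved:airborne settles to a fixed Henry's-law ratio r in a
# closed system, the airborne share of ANY increment is 1/(1+r) -
# no "knowledge" of the increment's size is needed. r = 1.2 is made up.
r = 1.2

def airborne_after(total):
    """Airborne amount once the gas/liquid system re-equilibrates."""
    return total / (1.0 + r)

base = 100.0
for added in (1.0, 10.0, 50.0):
    extra_air = airborne_after(base + added) - airborne_after(base)
    print(f"add {added:5.1f} -> airborne share of increment {extra_air / added:.3f}")
```

      The printed share is identical for every increment size, which is the point: a fixed partition ratio reproduces a constant "airborne fraction" with no bookkeeping of emissions.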

      Delete
    2. Nick: Nothing in the SPM warns policymakers about uncertainty in the chain from emissions to atmospheric accumulation to any particular RCP. The main reason that we are being told emissions need to be reduced by 80% to stabilize atmospheric CO2 is because the airborne fraction is expected to rise to 80%. As you can see from the technical details you kindly looked up, we aren't even sure the airborne fraction is rising today, and we certainly can't say with any accuracy how fast. Today the airborne fraction is about 50% and we can stabilize* CO2 with a 50% cut. How sure are we that the airborne fraction will rise to 80%, requiring an 80% cut to stabilize? Perhaps 70% or 60% will be adequate. Perhaps 90% will be required. I'm complaining about the lack of candor about this subject in the SPM - a subject that is extremely critical to policymakers, since every percent cut in emissions gets progressively more expensive. IMO, the change from emissions scenarios to RCPs (accumulation scenarios) obscures this issue.

      * When I say we could stabilize today with a 50% cut, that doesn't mean we can continue emitting at a 50% rate forever, because some sinks may saturate. As best I can tell, the same may be true for stabilization after an 80% cut. I haven't read enough about this subject to make any definitive statements. (Now that the IPCC is using RCPs, they don't have to say anything definitive either.)

      Frank

      Delete
    3. Frank,
      "Today the airborne fraction is about 50% and we can stabilize* CO2 with a 50% cut."
      I don't know where you get this arithmetic from. If the AF stays at 50%, and we keep adding 10 Gt C/year, it will accumulate at 5 Gt/yr. If we cut to 5 Gt/yr emission, that cuts accumulation to 2.5 Gt/yr. That isn't stabilizing.

      IPCC-related calculations often use the Bern model. That has an impulse function that does not vary with time. That implies a constant AF if the emissions rise exponentially, which they have been. But I don't know where you get your 80% figures.
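      Nick's Bern-model point can be sketched numerically. This toy model (coefficients and timescales are illustrative, not the fitted Bern values) convolves exponentially growing emissions with a time-invariant impulse response; the resulting airborne fraction drifts only slowly and stays near-constant:

```python
import numpy as np

# Toy Bern-style response: airborne remnant of a 1-unit pulse after
# t years. Coefficients are illustrative, not the fitted Bern values.
def remnant(t):
    return (0.22 + 0.28 * np.exp(-t / 250.0)
            + 0.28 * np.exp(-t / 30.0) + 0.22 * np.exp(-t / 5.0))

years = np.arange(150)
emissions = np.exp(0.02 * years)       # exponential emissions, 2 %/yr

# Atmospheric burden = convolution of past emissions with the remnant
burden = np.array([(emissions[:n + 1] * remnant(n - np.arange(n + 1))).sum()
                   for n in years])
af = burden / np.cumsum(emissions)     # airborne fraction
print(af[50], af[100], af[149])        # drifts slowly, stays near-constant
```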

      Delete
  20. This comment has been removed by the author.

    ReplyDelete
  21. Ha: I've just prepared my reply only to find that Nick has "scooped" me! Nevertheless, the additional information I provide below is still of value.

    Frank,
    I suggest you find out something about the relationship between solubility of a gaseous substance in a liquid and the partial pressure of that gas in contact with the liquid. These are related by Henry’s Law:
    P=kC
    for P the partial pressure, C the concentration of the dissolved gas at equilibrium between gaseous and dissolved phases and k a constant for the gas/liquid combination.
    Thus as CO2 concentration in the atmosphere increases so will the amount dissolved in the oceans, i.e. the ocean will take up some of the anthropogenic CO2. Your claim that if the rate of CO2 production is halved the ocean will then take up ALL this CO2 is contrary to Henry’s Law, since C would increase while P remains constant.
    CONCLUSION: Your claim is based on ignorance of science. Sorry if this sounds harsh, but since you have chosen to attack climate scientists for their supposed lack of scientific competence it seems reasonable to point out that you are falling into the same trap.
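    Bill's P = kC relationship can be put in numbers. The value k ≈ 29.4 L·atm/mol is the rough textbook Henry constant for CO2 in water at 25 °C, used here as an approximation:

```python
# Henry's law P = k*C at equilibrium: the dissolved CO2 concentration
# can only rise if the partial pressure above the water rises too.
# k ~ 29.4 L*atm/mol is the approximate value for CO2 at 25 C.
k = 29.4   # L*atm/mol

def dissolved(P_atm):
    """Equilibrium concentration (mol/L) at partial pressure P (atm)."""
    return P_atm / k

for ppm in (280, 400, 560):
    print(f"{ppm} ppm -> {dissolved(ppm * 1e-6):.2e} mol/L")
```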

    ReplyDelete
    Replies
    1. Bill: I do know something about Henry's Law. Your "constant" k depends on temperature, being about 50% smaller near 0 degC in the Arctic than at 20 degC.

      Henry's law is concerned with the EQUILIBRIUM solubility of a gas in water. The mixed layer equilibrates with the atmosphere thermally, and presumably with CO2, on a monthly time scale due to turbulent mixing. Below the mixed layer, convection - not equilibrium solubility - controls the rate of transport of both into the deeper ocean. We have some idea about the rate of this process from following the spread of CFCs and C14 (from atmospheric nuclear testing) into the deeper ocean. Ocean uptake is responsible for half of the 50% of emissions that aren't currently accumulating in the atmosphere, so I presume that means the equivalent of 1 ppm/yr more CO2 is reaching the deep ocean by convection (or, more accurately, the equivalent of 1 ppm/yr more CO2 entering the deep ocean than leaving it by upwelling of cold, CO2-rich deep water).

      I addressed some of the other issues that interested you in my reply to Nick. You might consider your ignorance of my ignorance before choosing to comment about the latter. Personally, I'm far more interested in reducing ignorance through sharing ideas. Unfortunately, I don't always write as clearly as I would prefer when commenting on blogs.

      Frank

      Delete
    2. Frank says:

      " Ocean uptake is responsible for half of the 50% of emissions that aren't currently accumulating in the atmosphere"

      A random walk in 1D will guarantee that it will be 50%. It's the diffusion, stupid.

      Please think logically about this. A random walk takes a step in one direction with 50% probability and a step in the other direction with 50% probability. That means that 50% of the CO2 that enters the ocean will likely pop right back out.

      Frank said : "Henry's law is concerned with the EQUILIBRIUM solubility of a gas in water."

      Like I have said before, I am from the semiconductor industry, where we live and breathe diffusion in a "non-equilibrium" environment. So your typical scare tactic of capitalizing EQUILIBRIUM has no effect on me.

      Delete
  22. Hi, Frank,

    Your point about my ignorance of your ignorance is a fair one, and I retract my assumption that your failure to introduce Henry's Law as an input to your model was due to ignorance. However, you should have made use of it, so your failure to do so seems due, if not to ignorance, then to carelessness, or if not to carelessness, then to deliberate obfuscation.

    You seem to be justifying your failure to use Henry's Law on the grounds that it describes equilibrium states, rendering it irrelevant to modelling non-equilibrium situations. This is not true: the rate of transfer of a substance from one phase to another will depend on how far from equilibrium the system is. For instance, if you have a 1.01% sucrose solution and a 0.99% sucrose solution separated by a porous membrane, then the net initial rate of flow of sucrose will be far less than if you have a 10% sucrose solution separated from a 0.1% solution by such a membrane, since in this particular case the equilibrium condition is equal concentrations on each side. Quite simply, you need to know the equilibrium conditions to know how far from equilibrium you are.
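    The sucrose example reduces to a simple rate law, with net flux proportional to the distance from equilibrium (D = 1.0 is an arbitrary illustrative coefficient, not a real diffusion constant):

```python
# Net flux across a membrane proportional to the distance from
# equilibrium (equal concentrations on both sides).
# D = 1.0 is an arbitrary illustrative coefficient.
D = 1.0

def net_flux(c_left, c_right):
    return D * (c_left - c_right)

print(net_flux(1.01, 0.99))   # nearly balanced: tiny net flow
print(net_flux(10.0, 0.1))    # far from equilibrium: large net flow
```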

    I agree with you entirely that Henry's Law applies under conditions of constant temperature, or to put it another way the Henry Constant is a function of temperature. As you point out, the higher the temperature the lower the Henry Constant. In making this point you actually undermine your thesis, since in a warming world the oceans will be warming, and thus for a constant partial pressure of CO2 the oceans will actually have less capacity to store carbon dioxide.
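    The temperature dependence can be sketched in the van 't Hoff form. The enthalpy parameter B ≈ 2400 K is the commonly quoted value for CO2 (Sander's compilation of Henry's law constants), and the 25 °C baseline concentration is a rough 400-ppm figure; both are used illustratively:

```python
import math

# van 't Hoff form: c(T) = c(T0) * exp(B * (1/T - 1/T0)).
# B ~ 2400 K is the commonly quoted value for CO2 in water;
# c0 is a rough 400-ppm equilibrium concentration at 25 C.
B = 2400.0    # K
T0 = 298.15   # reference temperature, K
c0 = 1.36e-5  # mol/L

def solubility(T):
    return c0 * math.exp(B * (1.0 / T - 1.0 / T0))

# Water near 0 C holds roughly twice the CO2 of water near 20 C:
print(solubility(273.15) / solubility(293.15))
```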

    So, at present your model seems to be fundamentally deficient. The other problem is the near total lack of quantitative evidence cited in your presentation as to why so much CO2 can be absorbed simply due to ocean currents. All you mention is something about CFCs and "carbon 14" giving us an idea about ocean circulation: a very weak basis, it seems, on which to develop a quantitative model of oceanic carbon sequestration that can, inter alia, predict that if CO2 emissions are halved then they will be entirely absorbed by the ocean - a clear quantitative prediction. Or maybe you'd care to put some statement of uncertainty on that claim. For someone who sneers at others for their supposed lack of awareness of uncertainty it seems only reasonable for you to come up with such a statement.

    ReplyDelete
    Replies
    1. Bill: I don't have a model of CO2 transport between the atmosphere and various land and ocean reservoirs, the IPCC does. (The uptake of CFCs into the deep ocean provides information for this model, as does Henry's Law.) To stabilize atmospheric CO2, that model says that we need to cut CO2 emissions by 80% in the long run even though it is obvious that a 50% cut will produce stability today - at least temporarily. The difference between 50% and 80% is huge. My complaint is that the IPCC's SPMs don't warn policymakers about the importance of the uncertainty in their model. The "airborne fraction" (a term that oversimplifies a very complicated process) has been nearly constant for the last century and the IPCC has no observational evidence confirming that their model correctly predicts changes in airborne fraction. The IPCC needs one of those boxed sections entitled: "How Will Airborne Fraction Change?" and a bullet point in the SPM. (A quick search didn't uncover one, but my search wasn't exhaustive.)

      At one point, I wrote that uptake of CO2 was linearly dependent on how much CO2 was in the atmosphere. Henry's Law produces a linear relationship - as long as temperature doesn't change. Convection appears linear. Photosynthesis would produce a linear relationship if CO2 binding were rate-limiting, but Michaelis-Menten kinetics takes into account that the chemical step or the product-release step may be rate-limiting. I decided I didn't know enough to be comfortable using the word "linear". However, I'm sure all of these processes are driven by the amount of CO2 currently in the atmosphere, not the increases that can be attributed to man: 400-280 ppm and 4 ppm/yr.
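      Frank's Michaelis-Menten caveat can be sketched: uptake v = Vmax·C/(Km + C) is nearly linear only while C is well below the half-saturation constant Km, then saturates. Vmax and Km here are arbitrary placeholder values, not fitted to any real sink:

```python
# Michaelis-Menten uptake: nearly linear for C << Km, saturating
# toward Vmax for C >> Km. Vmax and Km are arbitrary placeholders.
Vmax, Km = 10.0, 2000.0

def uptake(c):
    return Vmax * c / (Km + c)

# Per-unit uptake falls as C grows - the relationship is sub-linear:
print(uptake(280) / 280, uptake(400) / 400, uptake(20000) / 20000)
```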

      Frank

      Delete
    2. Thanks, Frank, so there's no model behind your extraordinary predictions about a steady rise in CO2 dissolved in the ocean being compatible with no net rise in atmospheric CO2.

      Your prediction must be correct because "it's obvious". No further explanation required.

      That this contravenes Henry's Law can be dismissed by the mere mention of the word "non-equilibrium".

      Delete
    3. This comment has been removed by the author.

      Delete
    4. Bill complains: "there's no model behind your extraordinary predictions about a steady rise in CO2 dissolved in the ocean being compatible with no net rise in atmospheric CO2."

      I don't need a MODEL; there is observational DATA showing that 50% of today's emissions are disappearing into sinks when atmospheric CO2 is 400 ppm. This will remain true as long as those sinks don't saturate and continue to take up CO2 at the rate they do today.

      The IPCC has presented a plan for stabilizing temperature at 2 degC above pre-industrial. Their MODEL says that we can continue to emit 20% of today's CO2 emissions indefinitely, beginning about 50 years from now when the atmosphere will contain about 450 ppm CO2. They have observational data that fits their model, a constant airborne fraction of 50%. Their model projects a rise in AF to 80%.

      Why is it outrageous for me to claim we can emit CO2 for some indefinite period (until sinks saturate) without a rise in atmospheric CO2, when the IPCC projects the same thing without qualification?

      I have been complaining about the unpublicized uncertainty in the IPCC's projection that the airborne fraction will rise to 80% (and then stabilize).

      Bill also writes: "That this contravenes Henry's Law can be dismissed by the mere mention of the word 'non-equilibrium'."

      I agree that Henry's Law applies to the mixed layer that equilibrates quickly (monthly?) with the atmosphere because of turbulent mixing. That mixed layer is roughly 1% of the ocean. According to my reading, there is 50X more CO2 in the ocean than in the atmosphere. It takes about 1000 years for water to circulate from the surface to the deep ocean and back. Yes, the situation is very far from equilibrium.
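      These numbers can be put into a toy two-box sketch: a mixed layer in fast Henry's-law balance with the air, and a deep ocean holding roughly 50x the atmosphere's carbon, exchanged on a ~1000-year overturning timescale. All values are illustrative, not tuned to observations:

```python
# Toy two-box ocean: fast Henry's-law partition between air and mixed
# layer; slow water exchange with a large deep reservoir. A 0.5-unit
# CO2 pulse is added and its airborne remainder tracked over time.
H = 0.5            # mixed-layer carbon per unit of atmospheric carbon
TAU = 1000.0       # years to overturn the whole deep ocean
DEEP_VOL = 99.0    # deep-ocean water volume, mixed layer = 1

deep = 49.5                        # initial deep carbon (equilibrium)
total = 1.0 + 0.5 + deep + 0.5     # add a 0.5-unit pulse to the air

history = []
for year in range(2001):
    atm = (total - deep) / (1.0 + H)   # fast pool re-partitions at once
    mixed = H * atm
    # Exchanged water carries surface vs deep concentrations downward
    deep += (DEEP_VOL / TAU) * (mixed - deep / DEEP_VOL)
    history.append(atm)

print(f"start {history[0]:.3f}, year 100 {history[100]:.3f}, "
      f"year 2000 {history[-1]:.3f}")
```

      The pulse initially raises the airborne amount sharply, then nearly all of it ends up in the deep ocean, on a timescale set by the exchange rate rather than by Henry's law alone.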

      The IPCC projects a large amount of committed warming after CO2 stabilizes for the same reason. The surface (including the mixed layer of the ocean) and atmosphere are not in equilibrium with the deep ocean on a decadal time scale.

      Frank

      Delete