angech (Comment #140352)

angech (Comment #140352)

Carping,
All of you are choosing your definitions of Prediction and Projection as a means to an end.

lucia (Comment #140334) at least gives a dictionary reference

Predict, prophesy, foresee, forecast mean to know or tell (usually correctly) beforehand what will happen. To predict is usually to foretell with precision of calculation, knowledge, or shrewd inference from facts or experience: The astronomers can predict an eclipse; it may, however, be used without the implication of underlying knowledge or expertise:

So when Gavin says
“[Carrick (Comment #140333)] This is how Gavin distinguishes them:
Prediction: A much broader category of scientific statement, that might apply to undiscovered information about the past, present or future, but that implies a complete specification of the circumstances under which X would be expected. The anticipated result of a well designed lab experiment is a prediction, the prediction of general relativity concerning Mercury etc. Predictions are the mainstay of the scientific method.”

He is completely up the creek. A prediction is a forecast of the future [sorry Lucia] which may or may not be made with information; it may be made with a gut feeling, poor chickens.
It can have a varying degree of likelihood, based on the relevance of any information used and on the person making it, but of course it does not have to have any likelihood whatsoever,
and does not have to have any basis in reality whatsoever. Hence Nick Stokes gets it right when saying a committee, not the modelers, designs the input (sorry, elephant); after all, knowing is not a prerequisite. Has Nick been on any committees in the past?
So many wrong comments to dissect here, but I have a 60 km bike ride (x2) today.
Will get back.

Mal Adapted

Mal Adapted 2015/09/09
“Nothing wrong with a little payback. I should probably feel a little guilty for enjoying it so much, but somebody has to do it ;^)”
Not a notion that everyone shares, but if it is in your character, go for it, Mal.
Forget about Karma.
Eli, along with many others, argues that arguing with denialists only tosses fuel on the fire and should be discouraged.

“You may have noticed that Eli is not defending ET he is reminding RB that he threw a no ball.”
WC and ATTP both stand accused of acting honorably, to their own detriment and that of the climate change cause.
I can only, as a skeptic, agree with Eli [a rare event] but still applaud their decision.

Paul S 2015/09/09
“Arguably all the scenarios are “business as usual” in a sense”
No, RCP8.5 has been quoted extensively as the business as usual scenario. Worse, it is not even the upper limit.
We could well have had a worse than usual scenario, but this is never mentioned on the warmist side.
RCP8.5 is actually in the middle of all possible scenarios, something scary to think about. ET could well have been right under a scarier scenario.
RCP8.5 at least got the CO2 level right even if it got the warming wrong.

0.999 recurring answer

Perhaps, like the tortoise and the hare, it is a time problem.
Consider a number 0.99999 repeating having to exist at a certain point in time, say when you would like to do a maths problem with it.
While you are about to use it, it is short of 1 by an infinitesimal, whatever that is, but when you add it to 1 and get 2, in the moment that you have added it, it turns into a 1.
In other words, while you are considering it, it is always less than 1, but whenever you use it, it becomes 1.
Problem solved.
TM angech 2015
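
For reference, the standard textbook demonstration that the two notations name the same number runs as follows (this is the usual argument, not anything in the comment above):

```latex
\begin{aligned}
  x   &= 0.999\ldots \\
  10x &= 9.999\ldots \\
  9x  &= 10x - x = 9 \\
  x   &= 1
\end{aligned}
```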

truth

Feynman’s “method” of honesty, integrity and truthfulness applies to everything.
While this is technically and morally and ethically true it does not produce the greatest “happiness for everyone”.
Lots of lies produce good results for a society provided
a. they are not discovered
or
b. we agree to accept the slip in our moral compass for a greater good.
This occurs on both sides of the climate debate.
There is a big difference between those who use b., and may be given the excuse that they do not know they are participating in a lie, and those who use a., who are completely aware that they are not being honest.
Both have recourse to greater good arguments, the former for themselves, the latter for “others”.
copyright TM angech

“why infrared absorbing gases in the atmosphere warm the surface”

angech (Comment #136532)

SteveF ” I thought you had finally understood why infrared absorbing gases in the atmosphere warm the surface.”

We are talking about the temperature of the air at the surface, not the actual surface layer itself?
Greenhouse gases absorb and re-emit infrared, which other gases do to a much, much lesser extent.
The gas molecules therefore move faster, causing more collisions and heat.
The heat works its way back upstairs by convection and emission.
Once the molecules are in steady state, i.e. settled at the new average level, the amount of heat going back out to space is equal to that coming in.
There is no continuous build up of heat in the atmosphere, it plateaus at the new CO2 level.
The air is warmer than it was previously so the surface will warm up.
Your point.

My problem: the energy coming in must equal the energy going out at the new balance point.

Being hotter, the object should emit more energy.
Hence the outgoing long-wavelength flux should be increased.
Logic.

But the sun has not become hotter; the joules in are still the same.

The problem to me is that we are looking at an atmosphere layering problem.
In effect the addition of GHG is equivalent to lowering the albedo of the planet, making it more of a blackbody than it is and thus increasing its ability to absorb heat.
But black or white body it still has to emit the same amount of radiation back out.
This might manifest as a change in the TOA level if the emitting body has warmer air?
If one layer is hotter, other layers above and below must be colder. The deep sea is still ice cold despite 4 billion years of sunshine.
In any case the outgoing long-wavelength flux should not reduce but must be the same as previously.

My way of trying to say you are right but looking for any excuse.

SteveF (Comment #136533)

angech,
“My problem the energy coming in must equal the energy going out at the new balance point.”
.
Sure, but you are ignoring the heat capacities involved. The atmosphere has a relatively small heat capacity (maybe equal to 4 meters of water). The ocean surface layer is about 60 meters on average, so about 15 times more thermal mass than the atmosphere. The deeper ocean has vast heat capacity, but warms only slowly to very slowly (decades to many centuries, depending on location and depth). Land areas have considerable thermal mass below the surface, but like the deep oceans, warm only very slowly. There is not going to be a “new equilibrium” established in anything less than a millennium, so it is useless to think about the process as an equilibrium.

Instead, you need to think of it as an energy balance, where rising GHG levels slightly restrict heat loss to space, and so warm the surface…. but at a rate of warming which changes at all time scales as the various heat sinks, with different capacities and different rates of heat uptake, move toward equilibrium. It is a simple accounting of joules: less heat going to space (due to more GHG’s) means heat must accumulate in the various sinks; the size and rates of those sinks limit the rate of surface warming. Yes, given enough time, the system would establish an equilibrium, but since GHG forcing is constantly changing (and will continue to!), there is no possibility of equilibrium being established. All we can do is make reasonable estimates of future warming based on projections of GHG levels, how those GHG levels will restrict heat loss, our understanding of internal system feedbacks due to gradually changing temperatures (atmospheric moisture, clouds, etc), and our understanding of the sizes and speeds of the multiple thermal sinks.

The best estimate for the rate of heat uptake (usually called the ‘TOA imbalance’) is in the range of 0.5 – 0.6 watts/M^2 averaged over the last decade or so, while the current best estimate of total anthropogenic forcing is ~2.29 watts/M^2 (~3.1 watts/M^2 GHG forcing less aerosol effects; IPCC AR5). Average surface temperature has changed by ~0.85 C since the mid 19th century (before significant man-made GHG influence). So the current best estimate of sensitivity based on energy balance is ~ 0.85 / (2.29 – 0.55) = 0.49 degree/Watt/M^2, or equivalent to a 1.8 C equilibrium increase for a doubling of CO2. People will argue about the details (exactly how much heat is being accumulated, exactly how much warming has taken place, exactly how much aerosol effects have reduced GHG forcing, etc), but the basic accounting is pretty clear. Based on these figures, the warming realized to date (~0.85 C) represents only about (2.29 – 0.55)/(2.29) = ~76% of what it would have been save for heat accumulation, and only about (2.29 – 0.55)/(3.1) = ~56% of what it would have been in the absence of heat accumulation and aerosol effects.
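
A minimal sketch of the arithmetic above, using only the figures quoted in the comment (the variable names are mine; the 3.7 W/m^2 doubling forcing is the standard value implied by the quoted 1.8 C figure):

```python
# Back-of-envelope energy-balance sensitivity, using the numbers quoted above.
dT        = 0.85   # observed warming since the mid 19th century, degrees C
F_net     = 2.29   # net anthropogenic forcing, W/m^2 (GHG less aerosols, AR5)
F_ghg     = 3.1    # GHG-only forcing, W/m^2
imbalance = 0.55   # TOA imbalance, i.e. heat still flowing into the sinks, W/m^2
F_2xCO2   = 3.7    # radiative forcing for a doubling of CO2, W/m^2

sensitivity = dT / (F_net - imbalance)                # ~0.49 C per W/m^2
ecs = sensitivity * F_2xCO2                           # ~1.8 C per doubling
realized_fraction = (F_net - imbalance) / F_net       # ~0.76
realized_no_aerosols = (F_net - imbalance) / F_ghg    # ~0.56

print(f"sensitivity ~ {sensitivity:.2f} C per (W/m^2), ECS ~ {ecs:.1f} C per doubling")
print(f"fraction of equilibrium warming (for current net forcing) realized: {realized_fraction:.0%}")
print(f"fraction realized relative to the no-aerosol, no-accumulation case: {realized_no_aerosols:.0%}")
```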
.
The greatest uncertainty lies in aerosol effects (~ -0.9 watt/M^2, IPCC, AR5), and (no surprise!) this is one of the most contentious factual issues remaining in climate science. Those alarmed by warming and committed to draconian action to reduce fossil fuel use will almost always argue that the estimated aerosol effects are larger than in AR5, implying greater sensitivity, while those more alarmed by draconian measures than by warming will argue that estimated aerosol effects are even smaller than in AR5, implying lower sensitivity.
.
Nic Lewis used Bjorn Stephens’ revised (lower) estimates of aerosol effects to show that the best estimate of equilibrium sensitivity would be quite low (~1.5 – 1.6C per doubling), and that the probability of very high sensitivity (>3C per doubling, and a legitimate cause for alarm, if true) would be minuscule, based on the Stephens results. This naturally caused much gnashing of teeth and pulling of hair among the warming concerned, and led to the odd public disclaimer from Stephens that while he stands by his aerosol estimates, he completely disavows Nic’s analysis based on those estimates; one of the strangest documents I have read from an actual scientist in a while.
.
Of course, I don’t expect you will accept much of the above as correct. But I figured I would lay it out for you anyway.

angech (Comment #136534)

SteveF
“I don’t expect you will accept much of the above as correct”.

Very succinct, for the complexity of the matter. It covered a lot of the territory where I have had trouble working out the figures and ideas involved.
Thank you very much.

 

ATTP

ATTP
“my broad point was how do you have genuine dialog if people hold views that are not only discrepant, but in which one group appears to not even acknowledge the position that they really hold”
Genuine dialog needs the people involved to have respect for the people on the other side despite their holding views you disagree with.
It also requires the small possibility that some of their views may be valid which means some of “our” opposite views may be invalid and need to be recognized as such.
I see no questioning of views tolerated at most sites; if anyone steps out of line the views expressed are condemned without any leeway.
Inconsistencies are ignored and swept under the carpet.
Extreme views which start off agreeing with “our” views but magnify them are tolerated rather than challenged.
Steven said “I’m with hansen and Watts on that. strange bedfellows”.
Even stranger would be you and Joshua and BBD in with Mosher, Judy and Monckton.
[Covers eyes aghast].
But the sad fact is we are all working together to try and “improve the world” over the same issue.
Some of us [myself included] have [in the heat of the moment] said things hurtful to others in making our points of view, and trust in dialog has been broken almost beyond repair.
Keep trying is the only answer.

Isn’t it just using <u> to underline and </u> to end it?

(My superpower is having it work only when desired, and not work for explanatory examples. 😉)

oh dear

R. Gates | December 18, 2014 helps you, Michael: “I can only accept that theory as provisionally true until such time as enough new data would cause me to abandon or alter that stance. Even more so, rather than focus on data that supports your “truth”, you should focus on finding the exceptions or the data that does not.”

Increased precipitation is expected with rising temps.
Decreasing snow is expected with rising temps.

angech

December 22, 2014

Rob Honeycutt December 21, 2014
” The reliability of tree rings as temperature indicators is clearly in question”
“Can you please provide us the published research that makes such a claim? Or is this something you’re just making up?”
Is this good enough, Rob?
Greg Laden December 21, 2014
“The vast, vast majority of tree ring data can not be used to reconstruct temperature. Most of it simply does not carry that signal. It wasn’t collected to look at temperature, it has other uses, etc. Also, many tree ring sequences look at climate related data other than temperature, and carry virtually no temperature signal as well”
Clearly not made up.

angech

December 22, 2014

Greg
“The vast, vast majority of tree ring data can not be used to reconstruct temperature. Most of it simply does not carry that signal.”
I think you mean that the carried data is too hard to dissect out from the other conflating values in most trees.

“There is evidence for a climate-response threshold between approximately 60–80 vertical m below treeline, above which trees have shown a positive growth-response to temperature and below which they do not ”

So fortunately a small group of tree rings in an extremely specific location behaved in accordance with other proxies over an extremely specific time interval, and though they have not shown this behavior since, in fact behaving as the vast, vast majority of trees have done over time, we can use them as a proxy for temperature.
OK, it is Christmas. And they are bristlecones.

“Between the science and a hard place, Sunday, September 19, 2010: Probably the most famous “lukewarmer” is Lucia Liljegren, a mechanical engineer (surprise!) whose blog, The Blackboard, can be found on the blogroll here. The Blackboard entertains many lukewarmers, along with a bunch of deniers and a smattering of pro-consensus folks, including myself.” ?? Stewie Griffen at the idiot tracker. Interesting post on Climate Sensitivity in the past.
From the link by Eli, he must like you after all. Eli Rabett (Comment #134034)

The central estimate by itself is of minor interest for policy.
“Better estimates for climate sensitivity are not necessarily all that relevant”. ATTP.

And here I was being told Climate Sensitivity was an emergent property of models.

Lucia “Of course the central estimate is important. Ideally the central estimate should be unbiased based on information we have, we should know what it is ”

This information is already known. Mosher said anyone can calculate the climate sensitivity at the moment based on the data we have, but the different data streams available give a range of choices. He said if the pause continues, climate sensitivity intrinsically becomes lower.

There are different ways of estimating it. The generally accepted radiative forcing (RF) for a doubling of CO2 is 3.7 W/m2. One simply needs the change in global surface temperature (ΔTs), which can be estimated or measured in many ways (a quick numerical sketch follows the list below).

- Consensus estimates by committee, 1979: only two sets of models were available, giving 2 degrees and 4 degrees!
- IPCC models, which ignore observed climate change and use the known “feedbacks” simulated in general circulation models to calculate grossly inflated temperatures.
- Calculations of CO2 sensitivity from observational data,
e.g. industrial-age data since 1750 giving a sensitivity to CO2 doubling of approximately 3 °C, i.e. a value of λ of 0.8 K/(W/m2) (Rahmstorf 2008),
or Lewis and Curry (2014), whose equilibrium climate sensitivity was 1.64 °C based on the 1750-2011 time series.
- Other experimental estimates include
Idso (1998), who calculated a λ of 0.1 °C/(W/m2), resulting in a climate sensitivity of only 0.4 °C for a doubling of the concentration of CO2 in the atmosphere,
but his work is ignored as he only used 8 different methods to get this average.
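
The common thread in these observational estimates is the conversion ECS ≈ λ × 3.7 W/m2. A minimal sketch of that arithmetic, using only the λ values quoted above (the variable names and the script itself are mine, not from any of the cited papers):

```python
# Convert a feedback/sensitivity parameter lambda (K per W/m^2) into an equilibrium
# sensitivity to doubled CO2, using the standard forcing of ~3.7 W/m^2 quoted above.
F_2XCO2 = 3.7  # W/m^2, radiative forcing for a doubling of CO2

estimates = {
    "Rahmstorf (2008), industrial era since 1750": 0.8,    # K/(W/m^2) -> ~3.0 C
    "Lewis and Curry (2014), 1750-2011": 1.64 / F_2XCO2,   # quoted as ECS, lambda backed out
    "Idso (1998), average of 8 methods": 0.1,              # K/(W/m^2) -> ~0.4 C
}

for source, lam in estimates.items():
    print(f"{source}: lambda = {lam:.2f} K/(W/m^2), ECS ~ {lam * F_2XCO2:.1f} C per doubling")
```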

The same question in a different way, Lucia. Given that the emergent CS range of the climate models makes them so wrong on the observed temperatures, can the “true” climate sensitivity for each model be derived by putting in the temperature difference and back-working the emergent CS to give the one that was needed for that model to give the lower observed temperature?

Finally, the problem is that there is no one true climate sensitivity possible, due to forcing changes like contributions from solar activity, aerosols, ozone, and numerous other influences. But an estimate can be made by using simple measures of change in temperature over time with the radiative forcing of increasing CO2.
Note that when we have a pause in global temperature rise, the climate sensitivity for that period is actually zero by definition [sorry Lucia]. If the temperature fell, by definition the climate sensitivity would be negative.
While this is in essence a nonsense result, as in the short run it would be due purely to natural variation, the implication of a decade or three decades of paused or dropping temperature without other cause would alarm most scientists. It would say that the negative feedbacks are much greater than anticipated.
It would say that the modellers have been very crass in their modelling for the consensus, and it may be that a new, much lower CS would be emergent, if not, as the models’ failure shows, already emerged.

The paper by Lewandowsky, Gignac, and Oberauer that was published in PLOS ONE in 2013, entitled The Role of Conspiracist Ideation and Worldviews in Predicting Rejection of Science, can be referenced from a link at Bishop Hill’s.

It contains a large section on moderation policies which Judith and others would find interesting, including:

No ad hominem attacks. Attacking other users or anyone holding a different opinion to you is common in debates but gets us no closer to understanding the science. For example, comments containing the words ‘religion’ and ‘conspiracy’ tend to get deleted. Comments using labels like ‘alarmist’ and ‘denier’ are usually skating on thin ice.

One wonders how a paper on Conspiracist Ideation could ever be discussed when comments containing the word ‘conspiracy’ tend to get deleted.

Another beauty is ” The public has a right to be informed about the risks societies are facing, from issues such as climate change or the introduction of GM foods to often-fatal diseases that are preventable by childhood vaccinations. Sadly, the public is currently prevented from exercising that right,”
Yet
” Science is debate, but that debate takes place in the scientific literature and at scientific conferences. In the history of science, we are not aware of a case in which a serious scientific issue was adjudicated by tabloid journalists or their modern-day equivalents such as blog commenters.”

In other words, the public has a right to know but no right to debate the issues.

I love Lewandowsky. Could you discuss these attitudes, Judith?

energy

angech | December 5, 2014 at 5:07 am | Reply

Willis Eschenbach has an article up at WUWT, Argo And Ocean Heat Content: “The earth is closest to the sun in January, so the earth gains energy around that time, and loses it in the other half of the year. please QUOTE THE EXACT WORDS YOU DISAGREE WITH.”

Time for me to get on a hobby horse and get knocked off.

I understand what you are trying to say but disagree with the concept.
The energy in equals the energy out on a 24 hour basis.
Hence when the earth is closer to the sun in January yes there is more energy in but also more energy out to balance.
The atmosphere is naturally hotter as the sun is closer.
But the earth does not retain more energy stored in the sea. Any heat that has gone deep is balanced by colder water elsewhere as the earth has to give up all the energy it takes in over the 24 hour cycle.
If that heat went deep, somewhere else had to radiate the equivalent back to space.
Yes there are Kelvin waves, yes, there are pockets of down-welling hot water.
But these do not store extra heat, they only carry heat that has already been balanced by the outgoing radiation from the rest of the sea and land.
That is why “the net TOA imbalance generally only varies by something on the order of ± half a watt per square metre over the thirteen years of the record, with no statistically significant trend at all” is not astounding at all.
TOA is simply the heat in, heat out interface.
Hence so-called stored heat cannot come back to bite us. It has already gone back to space.

ENSO and stadium waves and El Ninos are simply descriptors of current weather patterns.

Yes, El Nino is real, the sea is warmer, but there is no more heat in the system because of it.

There must be more heat in the system causing El Nino.

The simplest explanation for this would be altered albedo due to cloud cover. This lets more heat into the atmosphere which then heats up.
More complex would be altered albedo due to atmospheric factors we have not taken into account.
Choppy surface water in storms, dust storms, forest fires.
or even factors in the sea which might cause increased reflectance off water.
The last would be simple variance in the amount of energy emitted by the sun which we are reluctant to consider.

angech, It is interesting how the deep ocean temperature varies. For there to be a reasonable steady state with the atmosphere, sea surface temperature has to vary with respect to the whole atmosphere not just the portion of the atmosphere above the actual ocean surface. The oceans have to “make up” for the land heat capacity short comings. So if you consider the ratio of ocean to the total surface area, the change in the 0-700 meter temperature looks a little bit different than the OHC for the tropical and sub-tropical regions.

There the temperature anomaly is scaled to the ratio of ocean surface to total surface area for the three regions.

The northern subtropics take about 10 to 20 years to catch up with changes in the rest of the oceans. There isn’t really enough 0-2000 meter data IMO, but that would take longer to “catch up”. Anyway, looking at just SST or just OHC changes without considering the surface ratios doesn’t paint a complete picture.

The heat transfer from the earth’s core is minuscule compared to the heat output from the sun, which is what warms our atmosphere.
If the sun were to go cold the earth would continue to lose its own intrinsic heat at its own intrinsic rate.
The amount of heat the earth itself emits would gradually diminish over time according to most scientists who assume that the core of the earth is cooling.
You are mixing up total energy (sun plus a very small earth contribution) with energy flow (the very small earth output), and with an imagined case of a cold sun plus the very small earth output. There is a larger differential between the surface temperature of the cooler earth and the temperature inside the earth, but the flow of endogenous energy (very small) from the earth’s core to the surface remains the same and will diminish in time as the earth’s core cools.

Sorry Steve.
Willis stuff is incidental to a concept I am trying to sort out.
Forget Willis.
I was trying to understand, given that the energy in equals the energy out on a daily basis, which should be the case, how everyone is missing the fact that we are not really warming or cooling on a daily basis as a planet.
Yes, we have hot and cold surface temperatures over a year, but in reality, if the sun input does not change and the volcanoes stay average, then the energy input equals the output, and the warm years are purely due to the atmosphere having more heat while other parts have less heat.
So air heat content plus sea heat content plus land heat content plus ice heat content equals x; if the air heat content is higher, then some of the other three must be lower.
Instead of looking for missing heat in the oceans we should be looking for missing cold.
CO2 and GHG help keep the air heat content at a certain level responsive to the solar energy passing through.
The message is that no extra energy can be stored in the sea, as energy in must equal energy out.
Your contention that CO2 is a prime driver of the air heat content is true. My contention is that we have not allowed for what amount of energy is getting through for the GHG effect to take place.
If albedo increases or the sun puts in less energy at times, then the CO2 effect cannot be the cause of a rise in sea surface temps/air heat content, when the amount of energy it has to react to is less.
*solar energy varies with elliptical orbit [proviso].
I think this concept that we are dealing with a chimera of climate change when the scientific reality is counter intuitive deserves a much bigger discussion.

 

  • Curious George

    angech – energy in does NOT equal energy out on an hourly basis; on a 12-hour basis; on a daily basis; on a weekly basis; or on a yearly basis.

    Your confusion stems from the fact that on a 24-hour basis there is an approximate balance, 12 hours of warming, and 12 hours of cooling (all taken locally; globally it approximately evens out at all times.) There is a similar smaller but global effect of half-year warming when the Earth is closest to the sun, followed by a half-year cooling. So we have a 24-hour cycle and a 365.25 day cycle. Climate change is any change above these basic cycles.

  • Curious George

    angech – P.S. Don’t confuse a half-year planetary warming and cooling (a global effect) with a summer-winter cycle, a local effect of a completely different origin.

    Curious George | December 6: “angech – energy in does NOT equal energy out on an hourly basis; on a 12-hour basis; on a daily basis; on a weekly basis; or on a yearly basis.”
    As one of the voices of reason on this blog, I appreciate your comment.
    The sun is the source of energy for the earth, basically.
    The sun’s energy is practically constant.
    The earth receives this energy which varies.
    A little depends on the wobble and inclination of the earth, and on its not being completely spherical; hence the amount of energy received is in slow flux, though this is never mentioned: 1-2% [guess].
    A lot depends on the elliptical orbit, which may vary the input by up to 6% [guess].
    A lot depends on the albedo, which can vary markedly for a number of factors you are aware of: -3 to +6% [guess].
    Conservation of energy and black body emissivity dictates that the energy we receive in 24 hours [the rotation period of the earth, +/-] should equal the energy we give out.
    I cannot be more insistent on that.
    As a corollary to that, it is true that the energy in on an hourly basis; on a 12-hour basis; on a daily basis; on a weekly basis; or on a yearly basis,
    for the earth as a whole must equal the energy out. I do not wish to nitpick on endogenous earth heat or energy trapped by chemical processes and photo-sensitivity, just the big picture.

    Curious George

    angech – I agree with you on most points. Where I disagree is that “Conservation of energy and black body emissivity dictates that the energy we receive in 24 hours [the rotation period of the earth, +/-] should equal the energy we give out”. No. Why pick a rotation period? It should be almost equal not just over 24 hours, but at all times, the surplus or deficit manifesting itself as a (slight) warming or cooling.

    You have a great point regarding the albedo; I believe this is the main mechanism Mother Nature uses to stabilize the planet’s temperature. But there may be effects in infrared or UV that we don’t see and therefore don’t consider.

     

  • Water can store heat, which means less current emissions on its way to the TOA. It can capture solar and hold it which moderates fluctuations.
    True, but energy in equals energy out.
    The total amount of energy in the sea/atmosphere/earth is a measure of the impedance of those mediums. The capacity to hold heat depends on the density of those mediums and their conductance and emissivity. At the end of 24 hours an ice block can only be an ice block if the local conditions are right. There is no extra energy ever stored in the system as a whole.

  • angech:
    I am an accountant so I see many things as they relate to accounting. The TOA primarily lets in solar energy. That is our revenue. Our spending is seen as TOA long wave emissions to space. The roughly 4 C the oceans have trapped is our savings account. It represents most of the past differences seen at the TOA. From time zero that’s all we’ve managed to hang on to. You are correct that everything that happens below the TOA basically zeros out, except for changes in our savings. Changes in savings are, though hard to measure, directly proportional to changes in incoming and outgoing at the TOA. Above I disregarded the atmosphere and ice due to their relatively low heat capacity. You are correct about heat diving into the North Atlantic or a monster El Nino. Such changes do not instantaneously change anything, well, maybe albedo I guess. The system has the same amount of energy, it’s just shuffling it. Long term, a new sorting of energy is likely to change the system’s total heat.

    The TOA primarily lets in solar energy.

    WRONG! The TOA also lets out reflected solar (shortwave) energy.

    Changes to how much is reflected are not directly correlated with changes to the energy of the system, but rather controlled, in a very complex way by changes to the details of its distribution. The assumption that those details will “cancel out” in any sort of averaging process is totally unwarranted.
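
    A toy version of this accounting, with the reflected shortwave term included as the correction above insists on (the round CERES-style numbers are purely illustrative assumptions, not values from this thread):

```python
# Toy TOA budget: stored energy changes only through the net flux at the top of the
# atmosphere = incoming solar - reflected shortwave - outgoing longwave.
# Round illustrative numbers, chosen only to give a ~0.5 W/m^2 imbalance.
solar_in  = 340.0   # W/m^2, incoming solar averaged over the whole sphere
reflected = 100.0   # W/m^2, shortwave reflected straight back out (albedo ~0.29)
olr       = 239.5   # W/m^2, outgoing longwave radiation

net = solar_in - reflected - olr        # W/m^2 flowing into the "savings account"
seconds_per_year = 3600 * 24 * 365.25
print(f"net TOA flux ~ {net:.1f} W/m^2 -> ~{net * seconds_per_year / 1e6:.0f} MJ per m^2 stored per year")
```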

     

    • The ocean temps can be graphed with the Global Argo Marine Atlas and CERES data at CERES data products.

      All the ocean warming is in the southern hemisphere.

      The annual peak in both net TOA flux (positive warming) and ocean warming occurs in January and February. These large changes in outgoing energy almost completely dominate changes in incoming energy. The change in outgoing energy is the net result of out-of-phase changes in shortwave and longwave variability. The out-of-phase changes are due to differences in land and ocean areas between the NH and SH.

      The ocean warming in the last couple of years is due in part to increase in solar intensity in the 11 year cycle.

      • Water can store heat, which means less current emissions on its way to the TOA. Here’s what it looks like.
        Rob, are these variations in heat content purely due to the closeness of the sun to the earth in January/February?

      • These are reflected shortwave (RSW) and outgoing longwave (OLR).

        RSW peaks in December or January

        OLR peaks July or August

        ‘Changes in orbital eccentricity affect the Earth-sun distance. Currently, a difference of only 3 percent (5 million kilometers) exists between closest approach (perihelion), which occurs on or about January 3, and furthest departure (aphelion), which occurs on or about July 4. This difference in distance amounts to about a 6 percent increase in incoming solar radiation (insolation) from July to January.’

        So it looks like it warms in the SH summer and cools in the NH summer. Mostly as a result of orbits.
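
        The quoted figure of about 6 percent follows directly from the inverse-square law; a quick check with round perihelion and aphelion distances (the distance values are my assumption, not from the quote):

```python
# Insolation scales as 1/r^2, so a ~3% swing in Earth-sun distance gives a ~6-7% swing
# in incoming sunlight. Round distance values assumed for illustration.
r_perihelion = 147.1e6   # km, around January 3
r_aphelion   = 152.1e6   # km, around July 4

ratio = (r_aphelion / r_perihelion) ** 2
print(f"insolation at perihelion is ~{(ratio - 1) * 100:.1f}% higher than at aphelion")
# prints roughly 6-7%, consistent with the quoted "about a 6 percent increase"
```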

  • Pierre, the earth is in energy balance whether the sun is near or far. It has more energy coming in and more going out, so technically it does not gain energy.

    Angech,

    Planets that are nearer to the Sun (e.g. Mercury) are in energy balance at a higher temperature than planets that are further away (e.g. Mars). It is precisely because they are warmer that they are in energy balance despite the larger solar power that they receive, as it enables them to radiate energy back to space also with a larger power. So, the Earth likewise can be expected to warm if it is nearer to the Sun (other things being equal), for the very same reason. Being nearer to the Sun makes the Earth (temporarily) more like Mercury and less like Mars.
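
    A minimal sketch of this point, using the standard no-greenhouse effective-temperature formula (the albedo values and the solar constant are textbook-style assumptions, not from the comment above):

```python
# Equilibrium (effective) temperature of a planet: it must radiate away what it absorbs,
# so a planet nearer the sun balances at a higher temperature.
sigma = 5.67e-8      # Stefan-Boltzmann constant, W/(m^2 K^4)
S_1AU = 1361.0       # solar constant at Earth's distance, W/m^2

def t_effective(distance_au, albedo):
    """Effective temperature of a rapidly rotating planet with no greenhouse effect."""
    s = S_1AU / distance_au ** 2            # insolation scales as 1/r^2
    return (s * (1 - albedo) / (4 * sigma)) ** 0.25

for name, dist, albedo in [("Mercury", 0.39, 0.12), ("Earth", 1.00, 0.30), ("Mars", 1.52, 0.25)]:
    print(f"{name}: T_eff ~ {t_effective(dist, albedo):.0f} K")
```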

    • True, it is warmer nearer the sun, but it is in energy balance. Its warmth is a function of its distance from the sun, not the fact that it is storing any new energy.
      Energy in equals energy out; the radiative temp is higher.

    • Pierre-Normand

      Yes, ‘energy in’ = ‘energy out’ *after* it has achieved the surface temperature (and atmospheric temperature profile) that enables this planetary energy balance to be maintained. The change in surface temperature, while it occurs, also has an effect on the energy fluxes below the surface. In the oceans, for instance, the top 100m or so (on average) constitutes a well mixed layer. If the average sea surface temperature changes for a sustained period, then it must also warm (or cool) by nearly the same amount throughout this whole well mixed layer, and this temperature change throughout the volume of the well mixed layer entails a large change in heat content.
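
      To put a rough number on that last point, here is a sketch of the heat content change implied by a small sustained warming of a nominal 100 m mixed layer (the layer depth, the 0.1 C figure and the 0.5 W/m^2 comparison are assumptions for illustration only):

```python
# Heat content change per square metre of ocean when the well-mixed layer warms by dT.
rho   = 1025.0    # sea water density, kg/m^3
c_p   = 3990.0    # specific heat of sea water, J/(kg K)
depth = 100.0     # nominal mixed-layer depth, m (assumed)
dT    = 0.1       # sustained surface warming, K (assumed)

dQ = rho * c_p * depth * dT                     # J/m^2 added to the mixed layer
imbalance = 0.5                                 # W/m^2, order of the TOA imbalance
years = dQ / imbalance / (3600 * 24 * 365.25)
print(f"heat content change ~ {dQ / 1e6:.0f} MJ/m^2, ~ {years:.1f} years of a 0.5 W/m^2 imbalance")
```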

    • As the energy in equals the energy out, and can only change from solar position or level of radiation, how could the sea store more energy? There is no extra energy to store.
      If GHG such as CO2 go up, it will have to heat the atmosphere a little first, then this will have to transfer to the top level of the sea, and over many thousands of years the temperature will go up incrementally, a hundredth of a degree, if the CO2 were to stay around this long.
      The sea surface temperature changes we see are short-term and ephemeral in nature, lasting 3-30 years, with even a 100 year length only a twinkle of natural variability.
      Look at the immense amount of water that has to be heated up by that incredibly small amount of atmosphere and think a little.

      Why would you think the earth is in radiative balance? Paleo data (sea level rise) suggest the earth is rarely in radiative balance.

      Anyway, it’s not just that the earth is closer to the sun, it’s that its lowest albedo regions are closest to the sun. The oceans in particular, which may not give up the heat immediately.

      • aaron | December 5, 2014
        Why would you think the earth is in radiative balance? Paleo data (sea level rise) suggest the earth is rarely in radiative balance.

        Sorry, settled science, the real type, says energy in equals energy out.
        Whether the oceans are rising or falling, whether the earth is hot or cold. The energy coming in from the sun daily equals the energy out.
        The earth is always in radiative balance.
        The amount of energy held by the atmosphere or oceans can vary immensely due to factors like cloud albedo and GHG while the earth remains in perfect radiative balance.

        DMI data on Arctic temperatures: Hide the Increase?

        What The Science Says:
        While summer maximums have shown little trend, the annual average Arctic temperature has risen sharply in recent decades. This is indicated by both the GISS and DMI data and by other high latitude data sets.

        Climate Myth: DMI show cooling Arctic
        From DMI we learn, that Arctic 80N-90N temperatures in the melt season this year is colder than average. This was the case last year too, while earlier years in the DMI analysis period (1958-2010) hardly ever shows Arctic melt season temperatures this cold (Frank Lansner)

        On Misleading Comparisons between the Danish Meteorological Institute (DMI) Arctic Air Temperatures and the Goddard Institute of Space Science (GISS) Arctic Surface Temperature Anomalies, and Implications for Arctic Warming.

        A recent WUWT article by Frank Lansner, August 5th 2010, has the heading “DMI polar data shows cooler Arctic temperature since 1958”. Peter Berenyi also posted a similar chart here on Skeptical Science. Frank Lansner goes on to show data from GISS July polar views (where individual grid cells show large variability) and compares this with graphics of DMI data for July 2010 to cast doubt on the validity of the GISS gridded values in the polar regions. This follows on from similar points made by Steve Goddard on WUWT, and another article by Harold Ambler which tries to show how DMI is based on more data measurements than GISS, followed by questions and implications about the reliability of GISS gridded values in the Arctic region.

        Similar claims that the DMI data shows Arctic “cooling” or highlights problems with other temperature data sets (eg from GISS, which mostly interpolates over the Arctic ice) appear on other websites. Many other articles have attempted to compare Arctic trends derived from satellite based microwave sensors with surface based (such as GISS) Arctic values, highlighting apparent differences.

        This post seeks to correct the public misunderstandings that these articles may cause, primarily about the claim of Arctic “cooling” but also about comparisons between the DMI 2m Arctic absolute temperature time series and GISS Arctic temperature anomaly data. In the following analysis all data is from original sources (such as DMI) rather than data “extracted” from web page graphics.

        The Danish Meteorological Institute (DMI) Arctic 2m air temperature data

        The Danish Meteorological Institute (DMI) Arctic temperature data is an output of the latest operational model as used for meteorological forecasting by the European Centre for Medium-Range Weather Forecasts (ECMWF). This output is an average of all model points at 2m height, currently on a 0.5 degree grid over the most northerly part of the Arctic, above 80N. Because the number of land stations in the World Meteorological Organization (WMO) list above 80 degrees North is very small indeed (a handful), measured data inputs for the model must be supplemented by other sources for high resolution meteorological work. The models assimilate inputs from weather stations, drifting stations and buoys, radiosondes, aircraft, vessels and more recently infra-red and microwave satellite based sensors.

        The data is used to create a full three dimensional deterministic model of the global atmosphere which can be run forwards in time so that dynamic atmospheric conditions (weather) and regional climate can be analysed and forecast (or reanalyzed using historical observations as inputs) to the resolution limits of the model.

        The resolution of these operational models has continually increased to take advantage of increased computing power and higher spatial resolution satellite data. This means that the DMI Arctic temperature data has had several changes in its history.

        Between 1958 and 2002, the output of the ECMWF 40 year reanalysis (ERA-40) is used, (approximately 120km grid horizontal resolution). The ERA-40 re-analysis itself has three main sections of assimilated data, using pre-satellite observations up to 1972, then assimilating some satellite sensor observations, starting with the Vertical Temperature Profile Radiometer (VTPR) on early NOAA satellites, up to 1987/1988, and then using more recent observation types and more satellite sensors, both Infra Red and Microwave, in combined sensor packages such as the Television Infrared Observation Satellite (TIROS) Operational Vertical Sounder (TOVS), Special Sensor Microwave/Imager (SSM/I), Advanced TIROS Operational Vertical Sounder (ATOVS) amongst others, on platforms like the ERS (European Remote Sensing Satellite) and later NOAA series of satellites from 1987 onwards. In 2002 the DMI data switches to a higher resolution T511 model (40km resolution), then in 2006 to T799, (25km) and from 26th Jan 2010, T1279 (16km). These changes could be linked to minor differences in the apparent Summer melt temperature, (there are small differences between the ERA-40 and the T511 outputs in the overlap period in 2002).

        Assimilation of Satellite Sensor Surface Temperature data in the Arctic

        In general the satellite data has almost full polar coverage and very high spatial resolution, but lower absolute accuracy over ice. For surface measurements infra-red satellite sensors are used, as they measure the infra-red energy (long wave radiation) emitted directly from the Earth’s surface (skin temperature).

        Large bias differences exist in uncalibrated absolute surface temperature estimates from some of the different satellite sensors, eg Lakshmi 2002 discusses global differences between TOVS and Advanced Very High Resolution Radiometer (AVHRR) data. These longer time series from equivalent satellite sensors are themselves made up of data from several (not necessarily overlapping) satellite missions, so care must be taken when analyzing long term trends (see analysis of calibration of TOVS data for some visual representations and context).

        Direct comparisons of measurements from surface based sensors and remote sensors like the more recent Moderate Resolution Imaging Spectroradiometer (MODIS) on the TERRA and AQUA satellite platforms show that cloud cover can bias infra red Arctic surface temperature measurements low, for example see Hall 2004 on MODIS Sea Ice Surface Temperature, Scambos 2006 on validation of AVHRR and MODIS ice surface temperature, Randriamampianina 2009 on assimilating ATOVs data in Polar regions, and Koenig 2010 on MODIS data compared with high accuracy surface-based thermochron sensor data on Greenland).

        However, although work is always ongoing to reduce errors, the models allow this high resolution data to be used to interpolate between the sparse but very accurate observations from land stations, which form a network of absolute temperature value “tie points” – allowing bias calibration of the fine scale overall satellite derived relative changes.

        In general, the skill of such high resolution models to match observations has improved incrementally with each step increase in resolution. Nevertheless DMI recommend that the 2m Arctic air temperature data should not be compared with overall Arctic temperature estimates from other data sets, which generally cover a wider area (usually above the Arctic Circle at approximately 66N) where more land station data is available. Despite this, the DMI data has been used as a reference for comparison.

        Surface Air Temperatures above the Melting Ice in the Arctic Summer

        Most of the area above 80N is (currently) still covered in permanent sea ice. In the Arctic Summer when the surface ice is melting, the air temperature close to the surface is limited by this ice melt temperature to just above zero degrees C (Rigor 2000). This is why the Summer air temperatures have not varied much over the entire instrumental period. This maximum temperature “clipping” effect is clearly seen on the following chart which shows measured 2m air temperature data from former USSR Polar Stations NP-6 and NP-30 from positions above 80N (see Lindsay 1998 for more detail). Data from these “North Pole” (NP) stations is considered to be the most accurate Surface Air Temperature (SAT) data from the Arctic Ocean.


        Figure 1: Former USSR Arctic Station 2m air temperature data from NP-6 and NP-30

        This behavior is also clear in all other polar data sets including those from the International Arctic Buoy Programme (IABP), individual stations (such as Eureka, see Lesins 2010), Surface data AVHRR infra red sensors (carried on various satellites from TIROS-N in late 1978 to the current NOAA series of satellites), see Comiso 2006 (figure 2). The zero degrees Summer melt maximum is consistent enough to be used to calibrate Arctic temperature sensors suspected of bias (see Rigor 2000 figure 4). It is therefore not surprising that this limiting effect is also apparent on the gridded DMI data, here monthly (30 day) rolling averages, averaged decadally for clarity (the zero limit is less “flat-topped” in this case partly due to the averaging of data over 10 degrees of latitude). This pattern is likely to prevail whilst substantial permanent sea ice remains above 80N.


        Figure 2: 30 day rolling average DMI temperatures, averaged over two decadal periods showing increasing average seasonal temperatures over the DMI record

        Clearly high Arctic Summer surface temperatures just above zero are not really an indication of anything except proximity to a melting ice surface. To claim that the Arctic is cooling based on Summer surface temperature values over sea ice is to misunderstand the data.

        Average Arctic Temperature Trends

        It is also evident in all of these Arctic data sets that the average temperatures in Winter, Spring and Autumn have generally increased over the measurement period. It appears that the overall seasonal cycle is riding on a gradually warming average value, but peak positive excursions are being limited by the ice melt temperature in Summer.

        It would be intuitive that these seasonal warming patterns would affect the overall DMI temperature anomaly trend, and this is the case. If we plot the entire daily DMI temperature data, and then a 365 day rolling average, and calculate a linear best fit, we end up with a positive trend of 0.383 degrees C per decade.
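
        A minimal sketch of the calculation just described, for anyone who wants to reproduce the number from the published DMI series (this assumes a pandas Series of daily 2m temperatures indexed by date; the function name is mine and the DMI file parsing is not shown):

```python
import numpy as np
import pandas as pd

def dmi_trend_per_decade(daily: pd.Series) -> float:
    """Linear trend of the 365-day rolling average of a daily series, in degrees C per decade."""
    smoothed = daily.rolling(window=365, center=True).mean().dropna()
    decades = (smoothed.index - smoothed.index[0]).days / 3652.5   # elapsed time in decades
    slope, _intercept = np.polyfit(decades, smoothed.values, 1)
    return slope

# e.g. dmi_trend_per_decade(dmi_daily_2m)  ->  about 0.38 C/decade for 1958-2010, per the text
```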


        Figure 3: DMI daily temperature values, annual average and linear trend over the entire record period

        Thus the reality is that the annual average temperature as indicated by DMI has risen at rates around twice the global average over the past 50 years, which is entirely consistent with other Arctic data sets, including the data from GISS. If we plot this average positive trend and five year running averages on the same vertical scale below (red) it gives clearer context to the Lansner chart of Summer melt average temperatures (green).


        Figure 4: DMI summer melt season temperatures and annual DMI temperature anomaly as well as five year running averages

        GISS data

        The Goddard Institute of Space Science (GISS) global surface temperature anomaly series is based on observations rather than models. Its primary usefulness is as a constantly updated indicator of global or large scale regional surface temperature changes. It uses a publicly available global surface temperature data set of over 6000 ground stations on land, from the Global Historical Climatology Network (GHCN), and Antarctic data from Scientific Committee on Antarctic Research (SCAR), and two different publicly available SST (Sea Surface Temperature) data sets, the Met Office Hadley Centre’s Sea Ice and Sea Surface Temperature data set (HadISST1) up to 1982 (vessel based) and the National Oceanographic and Atmospheric Agency (NOAA) Reynolds Optimum Interpolation Sea Surface Temperature analysis (OISSTv2) subsequently, the latter being mainly satellite based. These values are also cross checked against other available data sets and satellite data sets. The measurements used by GISS are gridded at either 1200km or 250km resolution, with appropriate weighting in grid cells containing both land and ocean. The SST data where available is used up to 100km from any coast, but data from any source is extended to a maximum radius of 1200km if no other measured data points are present within this range.

        For the high Arctic, we have already noted that there are relatively few land stations, (a handful above 80N), and most of this area is currently still covered by permanent sea ice. A study using data from stations and Polar drifting ice buoys showed that near surface air temperatures over the pack ice are relatively homogenous, with a CLS (correlation length scale) of 900-1000 km (Rigor 2000). Though this can reduce slightly in the Summer between the coast and ice covered areas. Obviously SST can not be used in this region, as the sea surface is frozen. In many data sets, models and re-analyses, it is standard to set SST to freezing point of sea water (around -1.8 degrees C) where there is >90% ice cover eg HadISST1 (described in Rayner 2003) or recent NOAA SST modeling which uses fixed and drifting buoys, ship and surface station and satellite data (currently from NOAA-18 AVHRR and the European Meteorological Operational Satellite Programme (METOP-A). In open Ocean SST tracks variations in Air temperature, but this is not the case near the transitional and mobile ice “boundaries”. This is one reason why GISS does not currently use SST data in the seasonal ice region above 75N, even when this data is seasonally available, – as is increasingly the case due to diminishing trends in ice extent, and better coverage due to satellite data. See current data from the National Snow and Ice Data Centre (NSIDC). However, the data is treated consistently over the longer term, which is very important for trend analysis.

        This means in the Arctic region, GISS data is relatively coarse grained, as individual grid cells above 80N may include station data interpolated out to as much as 1200km, and are likely to show the higher short term variability which is characteristic of data from individual Polar stations.

        However, given all of the above, and the significant differences in horizontal resolution, and methodology, and the caveats and cautions from both DMI and GISS, how do monthly or annual time series compare?


        Figure 5: Annual DMI and GISS Arctic temperature anomalies and trends

        Here the annual average values for each year have been plotted for both the polar “zonal” GISS data (64-90N) and the DMI Arctic data, and the trends calculated for both data sets for the full DMI period. Note the relatively high variability in both polar data sets compared with the “global” or “northern hemisphere” averages. At this resolution the correlation between DMI and GISS averages is reasonable, and the 50 year gradients are statistically indistinguishable. These trend figures are also consistent with those from a recent comprehensive surface data based study of the Arctic (Bekryaev 2010) which gives 0.364 degrees C/decade from 60-90N over this same period.


        Figure 6: DMI and GISS 50 year temperature anomalies and trends, from higher resolution data

        Zooming to higher resolution and using Monthly GISS average gridded values (light blue) above 80N to plot a 12 month rolling average (dark blue), and using the DMI daily values to plot an equivalent 365 day rolling average (pink), we can see that there is good correlation of the high resolution model based DMI values and the lower spatial resolution GISS data, and the significant positive temperature trends are again consistent.

        What Lansner (and others) have effectively done is to pick one of the peak monthly GISS values (light blue) and compared the average DMI value for that month. Clearly this is likely to give a misleading result (as it would with any monthly sample from any Arctic data set) when compared to the overall records.

        Is the GISS Arctic data consistent with other studies of Arctic temperature trends and other data sources?

        The former USSR polar ice station programme and IABP programme give 2m air temperature data from the high Arctic. There are also a few other high Arctic stations with limited data. Several attempts have been made to homogenize and assimilate these data sets into a common Arctic time series. One example is Polyakov 2002. The associated data is available from the International Arctic Research Centre (IARC) at the University of Alaska Fairbanks).

        It is no surprise that temperature data from before the ice buoy and drifting station programs should be consistent in both this data set and GISS, as they are based on identical station data. Comparing the results with GISS over the same grid area over dates where the recent ice station and buoy data is used, the correlation is just as high.

        What about the other observation based data sets, and other independent satellite data? Monthly averaged global data sets often cited and discussed are (HadCRUT3 from the UK Met Office Hadley Centre in conjunction with the Climate Research Unit (CRU) and the Lower Troposphere weighted values from combined satellite Microwave Sounder Unit (MSU) and Advanced MSU (AMSU) time series from the teams at Remote Sensing Systems (RSS) and University of Alabama in Huntsville (UAH).

        Microwave “sounders” (see Waters 1975) measure microwave radiation at multiple characteristic frequencies, each emitted from specific components (eg Oxygen molecules) of the atmosphere which are present in different relatively broad vertical “profiles” within the overall atmospheric column. They do not directly measure surface temperature. Absolute values show less seasonal variation than at the surface, whilst the annual average temperatures are a few degrees colder. The Lower Troposphere MSU time series are made up of data from several similar sensors on overlapping satellite missions. An excellent graphical overview of these missions and updated recalibrated results are given on the NOAA site for Microwave Sounding Calibration and Trend (MSCAT).

        A superficial analysis of publicly available “Arctic” subsets of these data series indicates that whilst all data sets show strong warming, there are differences in positive gradient between data sets over the satellite measurement period. The interpolated GISS 64-90N product gives 0.62 degrees C/decade over the satellite period (since 1979), UAH North Polar subset is 0.47 degrees C/decade, HadCRUT3 from 65-90N gives 0.45 degrees C/decade, and RSS from 60-82.5N gives 0.34 degrees C/decade. All of these data sets should be used with caution when making comparisons at high latitudes, as they do not have identical coverage. HadCRUT3 uses a similar gridded format to GISS, but a different methodology which does not include extrapolated Arctic grid cell values. Care is also advised in the use of MSU data at high latitudes due to MSU sensor scanning geometry, for example the standard zonal “polar” data from RSS covers from 60N to 82.5N. Data above this latitude is considered unreliable, and is not available. If the data from each global data set is gridded over approximately equal Arctic areas then the differences reduce to well within the error bars of the data.


        Figure 7: GISS, HadCRUT3 and MSU satellite sensor data gridded over similar areas, with linear trends and five year running averages shown

        From the above discussion we see that relatively small changes in latitudinal extent can significantly affect the trends of average gridded values in the polar region. This is because of the strong increase in temperature anomaly trends as we approach the poles, particularly at high Northern latitudes. This gradient (Arctic amplification) has increased in recent decades, for example see Vinnikov 2006, and more recently Zou 2009 on ongoing MSU intercalibration efforts, where figure 9 shows the significant warming trend gradient with latitude (as measured with MSU), with temperature anomaly trends between 0.7 and 0.8 degrees C/decade above 80N for the 1987 to 2006 period. To help visualize this, the trend values for 10 degree latitudinal zones for GISS Land and Ocean, RSS Lower Troposphere and the latest ERA-Interim re-analysis surface temperature over the last thirty (where available) and last twenty years are shown below.


        Figure 8: MSU Lower Troposphere, and GISS and ERA-Interim surface temperature trend variation with latitude for most recent 30 and 20 year periods

        For context the GISS trend values for 64 to 90N are given as 0.62 degrees/decade since 1979. The DMI trend for above 80N over the same period is 0.68 degrees C/decade. The comprehensive surface station based study (Bekryaev 2010) gives 0.639 degrees C/decade for 60-90N between 1979 and 2009. The full coverage AVHRR decadal trend from 1981 to 2005 is 0.61 degrees/decade above 60 degrees N, but 0.72 degrees C/decade inside the Arctic Circle (around 66 N), whilst the IABP polar buoy data set gives 0.88 degrees C/decade between 1979 and 1999 for measurements which are (mostly) well inside 80N. The newly available (2009) and updated high resolution ERA Interim re-analysis gives even higher trend values between 70-90N of around 1.0 degrees C/decade over the twenty year 1989 to 2010 period at the surface, where warming has been shown to be strongest (see Screen and Simmonds 2010). If the ERA-interim product is spliced with ERA-40 (to minimize differences in the overlap period) the result for the surface gives 0.78 degrees C/decade between 1979 and 2010, which is consistent with the values above over the same three decade period.

        Thus the conclusion from an objective analysis of available data, and recent peer reviewed work, whether based on updated independent observations at various resolutions or state of the art high resolution climate modeling using all available data sources, is that the Arctic is experiencing strong warming, roughly double the global average, and showing increasing surface temperature trends with higher latitude.

        Whilst some caution is advised by GISS in using the interpolated data in the Arctic region, it appears that GISS trends in the Arctic region as a whole are consistent with other high latitude data sets, and show similar trends (in terms of annual averaged values) with other data sets which cover the region above 80N.

        Comments on Miskolczi’s (2010) Controversial Greenhouse Theory (important)

 

“no warming since 1998”

But you might have heard claims like “there’s been no warming since 1998”, so let us have a look at temperatures starting in 1998 (the year sticking out most above the trend line in the previous graph).
I will be using the HadCRUT4 hybrid data, which have the most sophisticated method to fill data gaps in the Arctic with the help of satellites.

Keith Woollard said: “You lose me at the first graph. If you use the satellite data you cannot make any of the points you raise.”

[Response: Of course the same points about the meaning of significance and confidence intervals could have been made also with satellite data. But here we are interested in the evolution of global surface temperatures, which cannot be measured by satellites. – stefan]
I’m curious, why aren’t they using the satellite data?
[Response: Satellite data doesn’t measure the same thing and has different levels of signal and noise. – gavin]

The issue is that claims like “there’s been no warming since 1998” relate to the satellite data, which do show a pause for that time no matter how you try to deny it.
You cannot both incorporate satellite data and dismiss it.
