Arctic Ice

I find the trend in sea ice age over the last ten years or so a conceptually difficult metric.
One of the problems, as I have mentioned before, is that the less ice you have to start with, the lower the percentage of multi-year ice appears to be in a good recovery year.
Counter-intuitively, this means that years with a low percentage of multi-year ice are actually making good recoveries.
This might help explain the contradiction between a 10-year pause in ice volumes, which is a sort of recovery from the previous steep falls, and a 10-year downward trend in multi-year ice, which also fits with recovering, not diminishing, ice in the Arctic.

La Casa di Signor Mancini

This may prove to be a seminal work.
I intend to build a memory castle out of ordinary pieces of bricks and mortar that anyone can use, for purposes of education, memory training, entertainment and refreshment.
The house may yet change in its nature, a bit like the Japanese walking castle, and different people
may end up with different constructs with different uses altogether.
I envisage that some will be able to use algorithms and machine learning to join human and artificial intelligence ideas together.
Grandiose ideas, but as usual oaks are born from tiny acorns.
The name is fictional and related to my current study of Italian, where I have used a very basic form of this method to help people try to learn Italian.
It did not really work, though the potential was there, because the basic principles were too hard to instil in two lessons.

So, from scratch.
English being my native language, others will have to transfer it into their own styles, but it should be possible and logically consistent.
The idea of a memory palace varies a little, from being an artificial construct, as in this case,
to a much more practical use of one's own houses, schools, workplaces and towns.
To marry the two is not possible, but one can subsist within the other; the choice of which way you wish to do it is up to you.
Remember, all doors have two sides, and inside for one idea is outside for the other, in a binary world at least.
Other worlds, parallel universes and time issues etc can all be investigated here later if you wish.
My preference, knowing as I said that it is really only a matter of perspective, is to address it from a fixed point: Signor Mancini's Casa.

An entrance, a lobby, receptionists, and sit-down computer log-in screens and keyboards [or voice activated] await centrally.
An elevator shaft to floors above and below.
A circular construct with 24 doors.
Why 24?
It divides by 2, 3 and 4.
It is large enough to encompass most letters of most Western languages.
It provides enough sectors to cope with a large range of topics.
Now the tricky part: traveling from one room or one floor to another,
and recording where one has traveled.
The first is easy.
Enter the lobby and sit down at your desk.
One can either go and ask for help at reception, and use the elevators and aides there,
or, using Star Trek technology, select on your desk the location you wish,
press enter, and the house will phase you to the right floor.
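The layout described above can be sketched as a small data structure. This is only an illustrative sketch in Python; the class names, the travel log and the example topics are my own assumptions, not part of the Casa as described:

```python
# A minimal sketch of the Casa's layout: each floor is a circular ring of
# 24 doors (24 divides evenly by 2, 3 and 4, so a floor can be split into
# halves, thirds or quarters).  Names here are illustrative only.

class Floor:
    DOORS = 24

    def __init__(self, name, topics=None):
        self.name = name
        # one topic (or letter) per door, padded out to 24 slots
        self.rooms = (topics or [])[:self.DOORS]
        self.rooms += [None] * (self.DOORS - len(self.rooms))

    def door(self, n):
        """Doors are arranged in a circle, so numbering wraps around."""
        return self.rooms[n % self.DOORS]

class Casa:
    def __init__(self):
        self.floors = {}   # the elevator shaft: floor name -> Floor
        self.log = []      # recording where one has traveled

    def add_floor(self, floor):
        self.floors[floor.name] = floor

    def phase_to(self, floor_name):
        """'Star Trek' travel: select a floor and go straight there."""
        self.log.append(floor_name)
        return self.floors[floor_name]

# The 21 letters of the Italian alphabet fit comfortably in 24 doors.
casa = Casa()
casa.add_floor(Floor("Italian", topics=list("ABCDEFGHILMNOPQRSTUVZ")))
italian = casa.phase_to("Italian")
print(italian.door(0))    # first door
print(italian.door(24))   # wraps back around to the first door
print(casa.log)
```

The wrap-around lookup reflects the circular construct: door 24 is door 0 again.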

Let's try Italian again.
The Italian floor.
Languages or regions?
Go to the language floor for anglophones, English speakers.
Here the concierge is Signor Mancini.
Let’s get started.

Overview, product information, instructions.
First rule: you have to read the instructions, but after the first few times, on most floors, you can skip this step.
For Italian there are several provisos.
No one can speak Italian perfectly because there are thousands of dialects.
People in different regions do not even use the same tenses as in other areas.
Nonetheless, in 1861, with the unification of Italy [see Italian History], a standardised written Italian was introduced,
which most of the people use to communicate with each other.
Language is a multi-dimensional skill which is not all verbal.
Italians use a lot of gesturing to accompany and give meaning to what they are saying, and a gesture can transform a seeming
compliment into a maledetto [rude word] in an instant.

As mentioned in the introduction, everything has at least two sides.
Here a second side, or outside-inside concept, is the English speaker, yourself, trying to learn.
English is a bastardised language, which has led to great functionality.
All languages are constructs of previous languages and experiences.
Italian and English are not dissimilar in that respect.
Italian is a complete language, however, combining one basic root, Latin.
It is easy to speak, but lacks the range of expressions available in a language built from four cultures.
For people with a different perspective, Chinese or Indian say, the root language is missing.
Many more words and idioms have to be learnt to achieve proficiency.

One way to learn is to totally reprogram the language part of the brain into the new language.
Learn it from scratch and only understand the comparisons later.
Repetition is the second way.
Practicing speaking with a more fluent person is the third.
Unfortunately, in the house, there is only the room and what you can put in it.
A simulacrum is beyond most people at this stage.

Of the various courses available, I would approach Italian in this way.
The language is Latin-based, composed of subject/object and verbs [actions].
The function of all language is to ask a question and get a response.
Cause and effect as the Merovingian would say.

Language is always best learnt from the past to the present to the future.
"Ontogeny recapitulates phylogeny" (Ernst Haeckel; see also the Meckel–Serres law).
We can only speak truly about actions that have happened in the past or thoughts that we have had in the past.
One represents a reality of what has happened, and the other a reality of what we thought had happened.
The vast gap between thought and action, separated by only a spark of initiation,
drives all thought, language and action.
The classic question of what comes first, thought or action, is exemplified in Avere and Essere [To Have or To Be] by Erich Fromm.

Latin is a common root between English and Italian and must always be considered when studying and understanding the two languages.
It enters the English language in at least four different phases.
The Roman invaders, circa 55 BC.
Vallum Hadriani [Hadrian's Wall]: a Proto-Germanic borrowing from Latin.
Etymology: from vallus ("stake, palisade, point"), from Proto-Indo-European *wel- ("to turn, wind, roll").

vallo: in Italian also a compound of the imperative (tu form) of andare and lo.
Old Latin *moerus, *moiros, from Proto-Indo-European *mey- ("to strengthen").
muro m (plural muros):
wall (stone structure built for defense), synonym muralla; wall (stone structure built for delimitation), synonyms valado, valo.
Otherwise the masculine plural muri is used: I muri hanno orecchie ("The walls have ears").
The feminine plural mura denotes the walls of a town, castle or similar, viewed collectively:
Le mura di Roma hanno dodici porte ("Rome's walls have twelve gates").
murare: (transitive) to wall up; (transitive) to embed into a wall.
From *moiros, from Proto-Indo-European *mey- ("to fix, to build fortifications or fences"); see also Latin mūnīre ("to protect"), Old Norse -mæri ("border-land, boundary"), Old English mære ("landmark, border, boundary"). See also Sanskrit múr ("wall") and mura ("surrounding, encircling, enclosing").

In defense of Roger Pielke Jr

I get the drift that the scenario itself is not the outcome,
that the scenario does not have to be real,
and that, therefore, a scenario may not be a prediction, only a conditional prediction.

The problem is that you cannot usefully cleave [split] a scenario and a prediction in this way without losing the meaning of both words.

For your analogy I agree that one does not usually try to prove the precept is wrong to show that the outcome is wrong.
That is because a precept or scenario is not falsifiable; you determine the input.
If one uses a different input, one would have to put up a different output.
A scenario can only be a scenario if it is predicating [and hence predicting] a future outcome.

If the situation the scenario is attempting to mimic is shown by time to be different to the assumptions you used, that is not a failure of the scenario.
Reality is a different scenario and you cannot falsify either by comparing the outcomes.

RP and I have never tried "to claim a conditional projection failed, since they claim a predicted scenario didn't occur."
It is wrong to say that.
A more apt analogy would be that the child placed its hand on the hot stove and it did not burn.

In this case the fact that you claimed the stove was hot enough to burn the child’s hand is wrong.
You did not put enough wood in the fire [wrong assumptions] or did not light the match [check the starting conditions were as you said] or did not run it long enough [dodgy thermometers].

I do not mind people bagging my arguments but I do mind people bagging their opponents unjustly.
Fair enough with me; I make misunderstandings.
Roger Pielke is a true scientist, brought up in a scientific family and background, and does not make basic misunderstandings of scenarios; it is just plain wrong to say that he does.

“If there’s warming then I think you still need some kind of flux imbalance. My understanding is that quite soon after a perturbation (say, an increase in atmospheric CO2) the LW fluxes can return to balance, but the cloud feedback leads to an imbalance in the SW fluxes, which then dominates the subsequent warming.”
izen: "The increase in surface temperature is a result of the greater thermalisation of OLR from the surface in the lower layers of the atmosphere, not of an imbalance in the energy flux for the whole system."

"If there's warming then I think you still need some kind of flux imbalance." This bit is very true, but it emphasises the problem raised by izen.
If warming is occurring there must be a flux imbalance.
We see this every day when the sun comes up. The GHG concentration does not change **[much], but the atmosphere heats up and the radiating layer moves much further outward.
So some energy has been garnered from the sun and thermalised.

But what happens when the heat input stabilizes, say just after midday [** more provisos]?
For a short period of time the energy in equals the energy out, as everything is in balance.
Then the radiating layer contracts as the atmosphere cools.

Does the CO2 level affect this pattern? No [* more provisos].
What it does affect though is the amount of atmospheric thermalisation that day.
The atmosphere will be warmer with more CO2 in it.
Not in 100 years, but at that lovely moment of equilibration.
Which occurs every day, usually after midday, though it might occur several times around that time due to albedo cloud changes.
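The daily cycle described above, flux imbalance while warming, a moment of balance after midday, then cooling, can be illustrated with a toy model. This is a sketch only, not a real climate calculation: the heat capacity, peak absorbed sunshine and half-sine daylight curve are numbers I have assumed purely to make the point that the balance moment (the temperature peak) falls after midday.

```python
import math

# Toy diurnal model: a surface slab with heat capacity C absorbs a
# half-sine of sunshine and cools radiatively.  The moment when energy
# in equals energy out is the temperature peak, after midday.

SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W m^-2 K^-4
C = 2.0e5                # slab heat capacity, J m^-2 K^-1 (assumed)
S_PEAK = 1000.0          # peak absorbed sunshine at noon, W m^-2 (assumed)
DT = 60.0                # time step, s

def solar(t_hours):
    """Absorbed sunshine: zero at night, half-sine from 6:00 to 18:00."""
    if 6.0 <= t_hours <= 18.0:
        return S_PEAK * math.sin(math.pi * (t_hours - 6.0) / 12.0)
    return 0.0

def run_day(T0=280.0):
    """Euler-integrate one day; return a list of (hour, temperature)."""
    T, out = T0, []
    steps = int(24 * 3600 / DT)
    for i in range(steps):
        hour = i * DT / 3600.0
        dTdt = (solar(hour) - SIGMA * T**4) / C    # in minus out
        T += dTdt * DT
        out.append((hour, T))
    return out

day = run_day()
peak_hour, peak_T = max(day, key=lambda p: p[1])
print(f"temperature peaks at {peak_hour:.1f} h ({peak_T:.0f} K)")
```

With these assumed numbers the balance moment lands in mid-afternoon; changing the heat capacity moves it earlier or later, but it stays after midday, as the comment describes.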

ATTP “quite soon after a perturbation (say, an increase in atmospheric CO2) the LW fluxes can return to balance,”
” but the cloud feedback leads to an imbalance in the SW fluxes, which then dominates the subsequent warming.”

Not sure of this. Feedbacks occur, including clouds, which are more part of the expected imbalance due to the change in incoming heat.
The SW fluxes can only be variable due to the variable albedo? They temporarily alter the actual heat input, which is why you might have several moments of equilibrium, usually after midday. The long-term feedback amplification is due more to increased GHG [water vapour] in the air raising the ECS than to the SW effects.

What responsibility, Doc

Joshua says:
“At any rate – the point being to respect the uncertainty, until we have better data.”
Why start now?
It is a bit late.
Plus it is more than the data.
Every pandemic presents something novel, so past experience does not guide future results.
What was that story about the fellow with the lion on the loose?
Sometimes you have to respect the uncertainty,
Sometimes you have to run. April 28, 2020 at 3:31 am

Mal Adapted
What responsibility, Doc? What are your expectations of scientists? What do you expect from yourself, your family, your neighbors, your country? Just who is responsible for AGW?
IMO, your comment reveals how alien the culture of science is to you.

The line
"d) maintaining research practices that normalize careless use of scenarios in a vacuum of plausibility" came from my bête noire, R.P., who also said,
“As a consequence, the climate research community is presently off-track. Attempts to address scenario misuse within the community have thus far not worked.”

I added “Not to mention a vacuum of responsibility.”.
I fail to see the difference between what he is saying and what I appended: if something is used in a non-plausible fashion, it is being used irresponsibly.

Very difficult to answer questions about responsibility. It tends to get conflated with blame [responsibility for doing something wrong].
Do you want me to be responsible, i.e. do things the way you want me to do them?
Or do you want me to admit to being responsible, i.e. the cause of AGW [alone, or collectively with skeptics, or with humanity]?

I think you have asked a very important question epistemologically.
I do understand where you are coming from, a genuine care for the world.


In summary, we have basically learned or revised the game in the best way:
by playing hands with other people, making mistakes and testing the rules.
It is a great card game, a little complicated by those darn Jacks or Bowers, and by those people who insist on trying Misère.

I hope we can get back up and running in the near future.
We have had 30 people through who now know each other a little better. I would hope that some of you can contact others and have some games at home perhaps.

Today we are giving out some small prizes to those lucky people who first call and make a 10-of-a-suit contract, a misère, or a no-trumps bid at the 7 level. There will also be a prize for the first person to pick up both the Joker and a bower in their suit contract.

In summary, be bold if you can. Give other people a chance, and bid 6 spades in opening position if you have nothing to bid; at least your partner will not get carried away.
You need 4 top cards, e.g. AK, AK, with a joker, to bid 6 no trumps.
You need 5 in a suit, with a joker or bower and an outside ace, to call a trump suit at the 6 level.
You cannot bid misère after passing, or if the 7 level has not been reached.

Today's lesson was to be on "finessing", or trapping an opponent's king when you hold the ace and queen. You should lead another suit to partner, asking for a lead back in the next suit up if they ever get in.

Lead a spade if you want a club lead, lead a club for a diamond, a diamond for a heart, and a heart for a spade.
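The lead-back convention above is simply "ask for the next suit up", wrapping from hearts back to spades. A minimal sketch:

```python
# The suit-signalling convention: your lead asks partner to return the
# next suit up, wrapping from hearts back around to spades.

SUITS = ["spades", "clubs", "diamonds", "hearts"]

def asked_for(lead):
    """Suit your lead asks partner to return: the next suit up, wrapping."""
    i = SUITS.index(lead)
    return SUITS[(i + 1) % len(SUITS)]

for s in SUITS:
    print(f"lead a {s[:-1]} to ask for a {asked_for(s)[:-1]} back")
```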


  • angech says:

    Are you serious?

    Outgoing radiation is measured in W per square metre from the top of the atmosphere by satellite, on average 100 km out.

    You cannot measure it any other way.

    It is 240 W/m2 at the TOA.
    This figure would represent a lot more energy if converted to W/m2 at the earth's surface.
    One cannot balance an energy budget by balancing incompatible terms.

    TOA measurements are prone to very large SDs.
    My understanding is that what the earth would put out per square metre at the TOA surface equals what is put out, at a much higher temperature, at the earth's surface.
    They are equivalent.
    There should not be any loss of energy at the TOA.
    The amount of energy should be equal.
    The GHG back radiation is needed to build the earth surface up to the required radiating temp.
    Very disappointed that no one else can see this.

Would you mind explaining how and where the total outgoing IR to space is calculated? 240 W/m2, but where is the meter located?
The energy budget works on the amount of energy calculated on its way to the earth's surface and at the earth's surface.
But the outgoing IR to space is not coming from the surface.
That would have to be the emitted surface radiation, 395 W/m2.
The radiation emitted to space is the radiation out at the TOA.
This is a concern because any radiation measured or assumed to be 240 W/m2 at a level of 100 km [TOA surface area 526.2 million km², radius 6,471 km]
cannot be the same as radiation measured at the earth's surface [area 510.1 million km², radius 6,371 km].

The maths in the energy budget just becomes wrong.
If the outgoing energy has been adjusted to the earth's circumference, the true TOA figure must be higher, and then it would not agree with what your instruments measure in terms of outgoing flux.
So are the instruments you use measuring true outgoing radiation at the TOA,
which is per square metre of the TOA surface area, as I would expect?
If so, how can anyone use this figure of 240 W/m2 to balance an energy equation based on square metres at the earth's surface?

I am sorry to be a nuisance.
You have offered to try to get the message across on IR which I hopefully get.
Could you please point out where the basic error I am making is?

Enough already.
I do not care what religion anyone is.
That is their business, right or wrong.
I do not care how religion treats science.
Their gain or loss.
Discuss the science or lack of it only.
I do know that nearly all science has evolved within the religious frameworks of nearly all societies.
Zoroastrian, Hindu, Chinese, Aztec, and the Abrahamic faiths (Jewish, Christian and Moslem), in order of appearance. Multiple other religions, current and extinguished, have had scientists in their ranks.
Our greatest modern scientists in the western world have had strong religious beliefs, Galileo and Newton for starters.
I think we should treat our great scientists with the respect they deserve for the work they have done. Christy and Spencer should probably get a Nobel Prize for the more than 30 years of dedicated work and theorizing they have put in.
They won’t.
Thank you DG for your comments which were expressed much better.
Can we please drop the religious attacks and concentrate on the science?

Much of the first part of Dr. Ollila's article is just fine. His objection to the diagram is introduced with the following statement, which those who hold similar views to his will be triggered by:

“The obvious reason for the GH effect seems to be the downward infrared radiation from the atmosphere to the surface and its magnitude is 345 W/m2. Therefore, the surface absorbs totally 165 (solar) + 345 (downward infrared from the atmosphere) = 510 W/m2.“

But this is where the problem with ambiguous wording comes in. The atmosphere is not, strictly speaking, adding more [“New”] energy to the surface. It is merely returning a portion of the atmosphere-absorbed solar, infrared, and convective transport energy back to the surface in the form of infrared energy.

As shown in Fig. 2, the surface is still emitting more IR energy than the atmosphere is returning to the surface, resulting in net surface loss of [395 – 345 =] 50 W/m2 of infrared energy. And, as previously mentioned, all energy fluxes at the surface balance.

And this is what our intuition tells us should be happening: the surface is warmed by sunlight, and cooled by the loss of IR energy (plus moist and dry convective cooling of the surface of 91 and 24 W/m2, respectively.)”

"Therefore, the surface absorbs totally 165 (solar) + 345 (downward infrared from the atmosphere) = 510 W/m2." Yes.
"The surface is still emitting more IR energy than the atmosphere is returning to the surface, resulting in net surface loss of [395 – 345 =] 50 W/m2 of infrared energy." Yes.
Plus the "moist and dry convective cooling of the surface of 91 and 24 W/m2, respectively", which together with the 50 W/m2 net IR loss equals the 165 W/m2 of surface-absorbed solar radiation. Yes.

So far I agree with both of you.

But Ollila: "The difference between the radiation to the surface and the net solar radiation is 510 – 240 = 270 Wm-2. The real GH warming effect is right here: it is 270 Wm-2 because it is the extra energy warming the Earth's surface in addition to the net solar energy."

This is the magical energy-from-nowhere step you are referring to? Because, as you say, the atmosphere is not, strictly speaking, adding more ["new"] energy to the surface.
Ollila actually acknowledges this in his article ("According to the energy conservation law, energy cannot be created from the void. According to the same law, energy does not disappear, but it can change its form.") but ploughs ahead.

“The final step is that we must find out what is the mechanism creating this infrared radiation from the atmosphere. According to the IPCC’s definition, the GH effect is caused by the GH gases and clouds which absorb infrared radiation of 155 Wm-2 emitted by the surface and which they further radiate to the surface. This same figure has been applied by the research group of Gavin Schmidt calculating the contributions of GH gases and clouds. As we can see there is a problem – and a very big problem – in the IPCC’s GH effect definition: the absorbed energy of 155 Wm-2 cannot radiate to the surface 345 Wm-2 or even 270 Wm-2.”

Here I go off the rails.
“If we were to represent these infrared energy flows in Fig. 1 more completely, there would be a nearly infinite number of red arrows, both upward and downward, connecting every vanishingly-thin layer of atmosphere with every other vanishingly thin layer. Those are the flows that are happening continuously in the atmosphere.”

I presume that the "net surface loss of [395 – 345 =] 50 W/m2 of infrared energy", which is all that is left over when the "moist and dry convective cooling of the surface of 91 and 24 W/m2, respectively" is removed from the initial 165 W/m2 of surface-absorbed solar, is actually doing far more than rebounding just once and going off into space. There would be a limiting factor at 345 W/m2, which is how much energy bounces back repeatedly until it can escape?
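The "bounces back repeatedly" picture can be sketched as a converging series. This is a toy single-slab model with an absorption fraction I have assumed, not Dr Ollila's or Dr Spencer's accounting; it only illustrates that repeated exchange amplifies the downward flux to a finite limit without creating energy.

```python
# Toy "repeated bounce" model: a surface absorbing 165 W/m^2 of solar,
# under a single atmospheric slab that absorbs a fraction A of the surface
# IR and re-emits half of that downward.  The value of A is an assumption.

SOLAR = 165.0   # W/m^2 absorbed at the surface (figure from the article)
A = 0.9         # fraction of surface IR absorbed by the slab (assumed)

up = SOLAR      # first-pass surface emission
down = 0.0
for _ in range(100):                 # iterate the exchange to convergence
    down = A * up / 2.0              # half of the absorbed IR returns
    up = SOLAR + down                # surface re-emits solar plus the return

# The series converges to up = SOLAR / (1 - A/2): amplified, but finite.
print(f"surface emission ~ {up:.0f} W/m^2, back radiation ~ {down:.0f} W/m^2")
```

With A = 0.9 the series settles at 300 W/m^2 up and 135 W/m^2 down: the "bounces" amplify the fluxes, but they converge to a limit fixed by energy conservation.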

Dr Ollila's summary of heat sources,
"it is easy to name the two other energy sources which are needed for causing the GH effect namely latent heating 91 Wm-2 and sensible heating 24 Wm-2, which make 270 Wm-2 with the longwave absorption of 155 Wm-2. When the solar radiation absorption of 75 Wm-2 by the atmosphere will be added to these three GH effect sources, the sum is 345 Wm2.",
explains why it is a little more complicated than that, in that some of the IR comes from the effects of IR radiation from other parts of the atmosphere, but I am not sure where he gets the longwave absorption of 155 Wm-2.

It is the fact that the surface emission is higher than the TOA radiation to space, so energy [quite a lot] must somehow be being trapped in the atmosphere.

“Now, I have spent at least a couple of hours trying to follow his line of reasoning, and I cannot.”

Dr Ollila’s reasoning “Here is the point: the IPCC’s definition means that the LW absorption of 155 Wm-2 could create radiation of 270 Wm-2 which is impossible.”

Ramanathan, "The Role of Earth Radiation Budget Studies in Climate and General Circulation Research":
The greenhouse effect. The estimates of the outgoing longwave radiation also lead to quantitative inferences about the atmospheric greenhouse effect. At a globally averaged temperature of 15°C the surface emits about 390 W m -2, while according to satellites, the long-wave radiation escaping to space is only 237 W m -2. Thus the absorption and emission of long-wave radiation by the intervening atmospheric gases and clouds cause a net reduction of about 150 W m -2 in the radiation emitted to space. This trapping effect of radiation, referred to as the greenhouse effect, plays a dominant role in governing the temperature of the planet.”‘

Dr Ollila has a point. The surface emits about 390 W m-2; the long-wave radiation escaping to space is said to be only 237 W m-2 [measured where? TOA or earth's surface is vitally important].
How can anybody say this: "a net reduction of about 150 W m-2 in the radiation emitted to space"?
The earth has had greenhouse gases for over 2 billion years, possibly 4 billion.
How hot should we be if our planet could keep trapping 150 W m-2 for 2 billion years?
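Ramanathan's figures can be checked quickly with the Stefan-Boltzmann law. A sketch, noting that the "trapping" G is a standing difference between two fluxes in a steady state, not energy accumulating for billions of years:

```python
# Check of Ramanathan's figures: a surface at 15 C emits close to the
# quoted 390 W/m^2, and the greenhouse effect G is the difference between
# surface emission and the OLR escaping to space.  In equilibrium the
# absorbed solar roughly equals the OLR at the top, so nothing "builds up".

SIGMA = 5.67e-8           # Stefan-Boltzmann constant, W m^-2 K^-4

T_surface = 288.15        # 15 C in kelvin
surface_emission = SIGMA * T_surface**4
OLR = 237.0               # long-wave radiation escaping to space, W m^-2

G = surface_emission - OLR
print(f"surface emission ~ {surface_emission:.0f} W/m^2")   # about 390
print(f"greenhouse effect G ~ {G:.0f} W/m^2")               # about 150
```

The difference, roughly 154 W/m^2, matches Ramanathan's "about 150 W m-2"; being a flux difference rather than a store, it answers the 2-billion-year worry: nothing keeps accumulating.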

  • “Now, this is curious. On average the change at the surface is a little less than half the TOA greenhouse effect change. So an increase of 3.7 W/m2 at the TOA from a doubling of CO2 becomes a 1.8 W/m2 increase at the surface.”
    “The key is to realize that the atmosphere is not heated by just Ramanathan’s ~150 W/m2.”

    Hate that diagram.
    Trying to explain things
    There is a TOA emission of 237 W/m2.
    At this level, 100 km above the earth, the incoming energy that is not reflected exactly balances the outgoing energy of 237 W/m2.
    The surface of the earth is radiating at 392 W/m2.
    This is amazingly higher than the 342 W/m2 of total incoming solar radiation, reflected plus incident.
    The GHG effect is basically to add 321 W/m2 of back radiation to the 169 W/m2 of incident solar radiation that reaches and heats the earth's surface.
    Basically the surface should be at a temperature commensurate with 490 W/m2, i.e. hotter than it is.
    It emits, however, at 392 W/m2 [at, I presume, a temperature of 15 C], because the other 98 W/m2 is lost by sensible heat [10 W/m2] and latent heat [18 W/m2].

    Now there is no Ramanathan 150 W/m2 being absorbed all the time. Some energy has to be absorbed to raise the temperature of the air and surface, but this is almost instantaneous and trivial when considering all those hydrogen bombs' worth of energy going through the atmosphere every second. Air temperature changes very quickly from night to day. Once the atmosphere is warmed up, there is no 150 W/m2 being drained into an atmospheric greenhouse battery all the time.
    The energy in equals the energy out at the TOA.

    Now why do we have a seeming imbalance at the surface when there is not one at the TOA?
    Because we are not comparing oranges with oranges.
    The total energy absorbed at the surface is for a much smaller sphere.
    Earth Surface area: 510.1 million km² Radius: 6,371 km energy emitted 392.
    TOA surface area 526.2 million km² Radius: 6,471 km energy emitted 237.
    Is this enough to make these two figures equal? That is what I would like someone to answer.
    On the face of it, it does not look likely, but?

    One cannot take energy figures per square meter of a much larger sphere from energy figures for a much smaller sphere without doing a calibration for surface area and attenuation.
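The question just asked can be checked directly. A sketch in Python using the figures quoted above; since total power through nested spheres is conserved, flux scales with the inverse square of the radius:

```python
import math

# Does referencing fluxes to spheres of different radius (the surface vs.
# a TOA shell 100 km up) account for the 392 vs. 237 W/m^2 difference?

R_SURFACE = 6371.0e3      # earth radius, m
R_TOA = 6471.0e3          # a TOA shell 100 km up, m

area_surface = 4 * math.pi * R_SURFACE**2
area_toa = 4 * math.pi * R_TOA**2
area_ratio = area_toa / area_surface
print(f"area ratio: {area_ratio:.4f}")   # only about 3% larger

# Re-express 237 W/m^2 at the TOA shell per m^2 of the surface sphere:
flux_at_surface_sphere = 237.0 * (R_TOA / R_SURFACE) ** 2
print(f"237 W/m^2 at TOA ~ {flux_at_surface_sphere:.1f} W/m^2 at the surface")
```

The 100 km shell is only about 3.2% larger in area, so 237 W/m^2 there corresponds to roughly 244 W/m^2 referenced to the surface sphere: nowhere near 392 W/m^2. Geometry does not close the gap; the greenhouse trapping does.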

  • Hmm, it seems the outgoing IR is measured at the TOA, 100 km out, so spread over a bigger sphere surface area, but the energy going into the ground is measured at the earth's surface, a smaller sphere, so the energy budget diagrams are technically out of whack.
    Technically, the two have to balance for there to be a TOA in the first place.

    OLR is a critical component of the Earth's energy budget, and represents the total radiation going to space emitted by the atmosphere.[3] OLR contributes to the net all-wave radiation for a surface, which is equal to the sum of shortwave and long-wave down-welling radiation minus the sum of shortwave and long-wave up-welling radiation.[4] The net all-wave radiation balance is dominated by long-wave radiation during the night and during most times of the year in the polar regions.[5] Earth's radiation balance is quite closely achieved, since the OLR very nearly equals the shortwave absorbed radiation received at high energy from the sun. Thus, the Earth's average temperature is very nearly stable.

    • angech January 21, 2020 at 11:04 pm

      Hmm, it seems the outgoing IR is measured at the TOA, 100 km out, so spread over a bigger sphere surface area, but the energy going into the ground is measured at the earth's surface, a smaller sphere, so the energy budget diagrams are technically out of whack.

      Angtech, I would be shocked if this were not taken into consideration. Scientists are often foolish but rarely dumb. Hang on, let me run the numbers …

      … OK, the surface area of a sphere varies as R^2. The CERES satellites are actually at an altitude of about 500 km, not 100. That means that the area of the sphere where the satellites orbit is about 16.3% larger than the earth's surface. The idea that scientists wouldn't notice and adjust for a potential error of 16% is simply not reasonable.
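Willis's 16.3% figure checks out; a one-line verification under his stated assumptions (earth radius 6371 km, orbital altitude about 500 km):

```python
# A satellite orbit near 500 km altitude sits on a sphere whose area
# exceeds the earth's surface area by (R + h)^2 / R^2 - 1.

R = 6371.0       # earth radius, km
h = 500.0        # approximate satellite altitude, km (assumed)

excess = (R + h) ** 2 / R ** 2 - 1
print(f"orbital sphere is {excess:.1%} larger")   # about 16.3%
```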


  • angech January 21, 2020 at 10:14 pm

    “Now, this is curious. On average the change at the surface is a little less than half the TOA greenhouse effect change. So an increase of 3.7 W/m2 at the TOA from a doubling of CO2 becomes a 1.8 W/m2 increase at the surface.”

    “The key is to realize that the atmosphere is not heated by just Ramanathan’s ~150 W/m2.”

    Hate that diagram.

    Back up. Explain what it is that you hate about my diagram. It is a representation of the simplest possible layout of the energy flows. Just what is it that you “hate” about it?

    Now, I drew that up about 20 years ago because of the problems with the Trenberth version, which has lots of handwaving. Mine, on the other hand, obeys the physical laws—energy is conserved at all levels, and radiation up = radiation down.

    Now, the numbers are slightly out per CERES … but then two decades ago I didn’t have CERES data. But other than that … what’s wrong with it?

    Finally, the top layer is not 500 km out, or a hundred KM out. The bottom layer of the stratosphere is the effective radiating layer. We know this from the brightness temperature of the radiation. It’s at about 10 km. This difference in altitude introduces an error of 0.3% in the simplified energy diagram … lost in the noise.


    • Willis,
      I notice that you’ve got, (in your diagram), 321 watts/sq.m of “backradiation” from the “greenhouse” gases coming down from the atmosphere and absorbed by the surface.
      According to the diagram you only get 169 watts/sq.m impinging on the surface from the sun… the sun, Willis, in summer hot enough to melt tar on the roads.
      I was wondering if you leave your bacon and eggs out on the porch overnight and have them cooked for you in the morning by that backradiation from the atmosphere?

      • Mack January 22, 2020 at 2:47 am

        I notice that you’ve got, (in your diagram), 321 watts/sq.m of “backradiation” from the “greenhouse” gases coming down from the atmosphere and absorbed by the surface.
        According to the diagram you only get 169 watts/sq.m impinging on the surface from the sun… the sun, Willis, in summer hot enough to melt tar on the roads.
        I was wondering if you leave your bacon and eggs out on the porch overnight and have them cooked for you in the morning by that backradiation from the atmosphere?

        Summer roads are not heated by the average radiation of 169 W/m2. They’re heated by something like a kilowatt per square metre or so of sunshine, plus thermal radiation from the atmosphere.

        Next, it seems you think that the idea that the atmosphere emits thermal radiation to be somehow incredible or impossible. Not sure why. It’s been measured, not theorized but measured, thousands and thousands of times by scientists around the planet.


          • Well, I thought those numbers would have pricked up your ears, Willis. I would have thought that 321 watts/sq.m of "backradiation" belting down from the ATMOSPHERE 24/7 would have triggered some form of thought process in your head… particularly since it's nearly TWICE the amount of solar radiation impinging upon the surface! Is there nothing about that which really unsettles you? Is there nothing about that which says, "hang on, there could be some mistake in these diagrams"?

          • Thanks, Mack. You clearly think downwelling longwave infrared radiation is imaginary.

            Me, I know that it’s been measured all over the planet by scientists. It’s measured at all the SURFRAD sites. It’s measured by the TAO buoys. It’s measured at the ARM sites.

            Do you truly think that those hundreds of scientists are just making it up?

            Also, if the ~169 W/m2 of sunlight was the only thing heating the surface, it would be at about -40°C or so … is there nothing about that which really unsettles you?


          • Willis Eschenbach January 23, 2020 at 12:35 am

            Also, if the 169 W/m2 of sunlight was the only thing heating the surface, it would be at about -40°C or so …

            Why do you keep pushing this radiative balance temperature nonsense?
            If the surface temperatures on Earth were in radiative balance with incoming solar we would see temperature swings from ~3K during the night to 365K or higher during the day.
            It is not happening.
            169 W/m^2 is ~14.6 MJ/m^2 over 24 hrs. This seems close to the world average as shown in these charts:
            14.6 MJ/m^2 between sunrise and sunset is enough energy to INCREASE the temperature of the upper 4 m of ocean water by 1 K.
            It has nothing to do with RADIATIVE balance.
            Backradiation does not warm the surface; it reduces the energy loss from the surface to the atmosphere. Otherwise we would see your 321 W/m^2 + ~1000 W/m^2 at noon giving temperatures of ~390 K.
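The two figures traded in this exchange (the roughly -40°C radiative-balance temperature for 169 W/m2 alone, and the ~14.6 MJ/m2 per day warming the top few metres of ocean) are easy to check with back-of-envelope arithmetic. A minimal sketch; the water density and heat capacity are assumed round values, not from either commenter:

```python
# Back-of-envelope check of the figures quoted above.
SIGMA = 5.670374419e-8          # Stefan-Boltzmann constant, W m^-2 K^-4

# 1) If 169 W/m^2 were the only input, a black surface would equilibrate at:
t_eq = (169.0 / SIGMA) ** 0.25
print(f"radiative balance: {t_eq:.1f} K = {t_eq - 273.15:.1f} C")  # about -40 C

# 2) Energy delivered by 169 W/m^2 over one day, spread through 4 m of water
#    (1000 kg/m^3 and 4186 J/(kg K) are assumed round values for water).
energy = 169.0 * 86400.0                       # ~14.6 MJ per m^2 per day
delta_t = energy / (4.0 * 1000.0 * 4186.0)     # K per day, ignoring all losses
print(f"{energy / 1e6:.1f} MJ/m^2/day -> {delta_t:.2f} K in the top 4 m")
```

Both numbers come out as stated: about 234 K (roughly -40°C) for radiative balance, and a bit under 1 K per day for the 4 m water column.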

A 1 metre by 1 metre by 1 metre concrete block is floating in outer space.
The block is insulated on four sides by a perfect insulator; no heat at all is lost from the four insulated sides.
The emissivity (epsilon) and absorptivity across the spectrum are both 0.95.
The thermal conductivity (k) of the concrete is 0.8 watts per metre per kelvin (0.8 W m^-1 K^-1).
It gets full-time sunshine on the front side at a rate of 1360 watts per square metre (W/m2).
What will the temperature T_hot of the hot side and the temperature T_cold of the opposite cold side be?

At a thickness of one molecule the temperature on both sides would be equal, and the block would radiate from two faces instead of one. Each face then sheds half the absorbed flux, so by Stefan–Boltzmann the temperature is lower by a factor of 2^(1/4) than if the back surface were also insulated: about 330.9 K (57.8 C).
At a thickness of a million metres the back surface would be at a very low temperature, just above 62 K.
This would be enough to drain the minute amount of energy that makes it across the block.
The surface of the block receiving the radiation has to heat up to a higher temperature to force heat across the concrete gradient. The maximum it can heat to is double the energy it absorbs.
Someone referred to it, I think, as the soldering-iron effect.

129.77 C, or 402.92 K, is the temperature of the hot side.
A similar range to the surface of the moon, allowing for the different albedo.

The cold side is more difficult. The bulk of the thermal mass built up by absorption of energy is at the heated end, which radiates most of the radiation back out.
The small amount that conducts through, at 0.8 watts per metre per kelvin, finally reaches the other side, which immediately radiates it into space, giving it a temperature of -210.4 C, or 62.75 K.
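For what it is worth, the steady state of this thought experiment can also be solved numerically by coupling three relations: the front-face balance (absorbed = re-radiated + conducted), Fourier conduction through the metre of concrete, and radiation from the back face. This is a sketch under those assumptions (one-dimensional heat flow, grey body, surrounding space at 0 K), and because it couples conduction and back-face radiation its numbers differ from the one-sided estimates above:

```python
# Steady-state temperatures of the insulated concrete block: a numerical
# sketch assuming 1-D conduction, grey-body faces, and cold space at 0 K.
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
EPS = 0.95               # emissivity = absorptivity
K = 0.8                  # conductivity of concrete, W m^-1 K^-1
L = 1.0                  # block thickness, m
S = 1360.0               # incident sunshine, W m^-2

def front_balance(t_cold):
    """Absorbed minus (re-radiated + conducted), given the cold-face temp."""
    q = EPS * SIGMA * t_cold ** 4        # back face radiates what conduction delivers
    t_hot = t_cold + q * L / K           # Fourier: q = K * (t_hot - t_cold) / L
    return EPS * S - (EPS * SIGMA * t_hot ** 4 + q), t_hot

# Bisection on the cold-face temperature (the balance falls as t_cold rises).
lo, hi = 1.0, 400.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if front_balance(mid)[0] > 0:
        lo = mid
    else:
        hi = mid
t_cold = 0.5 * (lo + hi)
_, t_hot = front_balance(t_cold)
print(f"T_hot  ~ {t_hot:.1f} K ({t_hot - 273.15:.1f} C)")
print(f"T_cold ~ {t_cold:.1f} K ({t_cold - 273.15:.1f} C)")
```

Under these assumptions the hot face settles somewhere in the 380 K range and the cold face a little above 220 K; the exact figures depend on how the faces are assumed to couple.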


500 week 5

Dealing and suits.

No Trumps.

How to discard. Any questions?

Today's talk is on aspects of play.

How to assess your hand, communicate with your partner and what to lead and when.

You need 5 cards with one of the top 3 cards and an outside ace to open a suit at the 6 level.

Global Temperature Anomalies (Nick Stokes)

The problem with land- and ocean-based reconstructions of global temperature and global temperature anomaly maps is very basic: why do them at all?
This is a serious question, and it does have some complex answers which lead to more complex questions.

The main answer would be to act as an adjunct to, and backup checklist for, the satellite systems, which give much greater coverage, specificity and accuracy than a limited Stone Age measuring and recording system can. Satellite systems are the backbone of all weather estimation, reporting and prediction. They are the only reasonably accurate source of a true current global temperature, depending on what criteria you wish to choose. None of this can be done by land- and sea-based systems recording singular sites on a current event pattern, with poor real-time transmission of data from limited sites.

Satellite problems and strengths.
Satellite collection of data does have problems at the local level. A bit like quantum theory, when you have to drill down to the size of raindrops you lose the ability to measure the actual micro events, such as precipitation and surface temperature. This is more than made up for by the ability to reach inaccessible areas all around the globe, to chart, map and check cloud formation and coverage over land and sea, and to get temperatures from all parts of the visible ocean.
Satellites do drift, and adjustments do have to be made for the effects on the wavelengths they are recording. Satellites do need confirmation of the temperature readings at specified locations, which in turn helps in adjusting the settings to get the best match. What satellites also offer is GPS positioning of all places: heights, depths, elevation and distance from the centre of the earth, which affects the temperature and air flow. Mentioning air flow raises the fact that the jet streams, polar vortices and other important higher-up air flow patterns are assessed, as is the moisture content of the air masses. Similar data can be gathered, to some extent, on ocean currents at a superficial level.

Thermometers and a standard.
To come back to the question of assessing the global temperature, or more correctly a global temperature anomaly: how many properly functioning, well-sited thermometers do you need?
This raises the problem of a definition of an average global temperature.
The answer, strangely, is just two.
The resultant temperature would be a bit like the El Nino/La Nina temperature patterns. It would run in a trough between summer and winter conditions, and it would have times when weather patterns conspired to make seemingly ridiculous departures from the norm, but over time it would give a reasonable approximation of both anomaly change and average temperature at those sites.

Could you improve the overall accuracy by choice of site position? A site at the equator is different to one at the poles in the amount of variation possible in the temperature range seasonally, and in the actual average temperature. Two sites inland by 10 kilometres on a low-lying plain, say 30 metres elevation, on the tropics of Cancer and Capricorn would obviate the problems of polar and equatorial extremes and also allow for a balance between the more land-filled NH and the water-based SH. Two further pairs would allow backup and comparison to occur.

Six continents, six countries, state-of-the-art thermometers and restriction zones of a kilometre, and you could have an agreed, reliable, international standard global temperature for all time. This would still not be as accurate and reliable as the satellites can manage without the blink of an eye, but it would standardize the issue, as it would give a much longer-term reproducible result than satellites, with their limited life spans, can give.

The Global Temperature: what is it and why is it important?

Coming back to a global temperature and the fact that a reasonably accurate version can currently only be given by satellite data, the question is: why is it useful? The answer is that it is needed for all projections of earth weather, climate and climate change.
All GCMs must have an initial global temperature (GT) start-up point. This data point obviously should not change over time. It should be the same for all GCMs and all weather and climate predicting models. That is, it should have a chosen start date and level known by everyone and written down in black and white.
However, different data sets will have different GTs.
GCMs have been around since the '70s and are said to be quite good in their projections. New ones are being created all the time.
If we assume that Zeke is right, then there should be an original GT and date in the first model that matches or is linked to each subsequent model. If adjustments have been made to this parameter, then the models would no longer be working on the same scenario.
If we have a standard GT, then there must be a past 1850 GT for each model that would be different due to their different algorithms; no problem there.
Other than that, there can be no exact agreement on the amount of warming since 1850.

The problem for the models is that they must run on the input GT at the time of setting up the model. To achieve this they have to input the current level from whatever data set they choose to use. Different data sets may have different GTs.
As the new GCM may differ in its assumptions from the data set's assumptions, there will then exist an anomaly between the two models when they do a backwards run and comparison.
Further, the data sets change their past data daily by adjustments [Zeke]. When a comparison is made a year later, the current data set being compared will differ from the one used a year before. The GCM will not adjust the data in the past, so it will give a GT based on the old readings as it enters the new readings.

GCMs and data sets have a built-in bias for CO2 increase, also known as the ECS.

Data sets like GHCN register the correct current temperatures and therefore bias the past lower, to show the global warming that was expected to have occurred. There is no ability to move thermometers upwards. GCMs, on the other hand, suffer a double whammy. They have incorporated data sets, at the time of inception, cooling the past, which is fixed, and now make assumptions on future warming from that date. Consequently they add on warming at a rate predicted by CO2 levels.
Why does CMIP6 run much hotter than CMIP5? Wherever the newer models start from, the GMT was not much higher than in the older models, yet they had a much larger CO2 warming adjustment to work with.

Here is where the problem with gridding and adjustments and sites comes into full play.

There is an assumption that all sea levels are the same, yet due to the earth's shape, grids away from the equator have a lower air pressure due to lesser gravity, which affects the temperature but is not taken into account. Grids here may be determined in part by ones 1000 km north or south.
Gridding on land may not take elevation fully into account. Two sites on either side of a mountain range have a different temperature to the grid with the mountain between them. Elevation is taken into account in local site shifts, but it is very difficult to do properly when sites are hundreds of miles apart. It is certainly not taken into account by some of the people describing their grid attempts, which are only temperature infills, not elevation, forestation, mountains and deserts.
When the sites used become airports (airports do need accurate temperature levels), we can either have a Global Airport Temperature reading or an inaccurate Global Temperature outcome.

Note: this comment and/or post was inspired by Nick Stokes's Moyhu blog, where he gives an excellent summary of the main methods of assessing global and local temperatures: “US temperatures 27/2/2020”, averaging and graphics methods.



I have for many years been experimenting with methods for calculating average surface temperature anomalies from collections of station readings (including sea surface from grid, regarded as stations). I describe the history here. In the early days, I tried to calculate the averages for various countries and continents. With the coarse grids I was using at the time, and later meshes, I found the results unsatisfactory, and of limited scientific importance.

So I put more effort into developing global methods. There is a rather mathematical survey of these here. Briefly, there are six main types:

  • The conventional grid analysis, averaging the measurements within each cell, and then getting an area-weighted average of the cells. I think this is very unsatisfactory since either the grid is very coarse, or there are cells with no data.
  • Triangular mesh, with the sites as nodes. This has been my mainstay. But it is not very good at following national boundaries.
  • A method where spherical harmonics are fitted and then integrated. This is now implemented as a general method for improving otherwise weak methods. The structure is instructive, but again, intrinsically global.
  • A LOESS method, described here. This has the characteristic of gathering information from as wide a net as needed; useful when stations get sparse, but not a respecter of boundaries.
  • Most recently, a method using FEM shape functions. I think this may be best, in terms of getting the best representation on a relatively coarse grid. Again, not so good for boundaries, but it has as a special case, one of my earlier methods:
  • Grid with infill (e.g. an early version here). The weakness of conventional gridding is that it does not use local information to estimate missing cells, and so the grid must be kept fairly coarse. But I have worked out a way of doing that systematically, which then allows much finer gridding. And that does have the incidental benefit of tracking national and land boundaries. It also allows a good graphics scheme. I’ll say more about it in the next section. In this text, I’m using blue colors for the more technical text.

Infilling and diffusion – the Laplace equation.

The logical expression of using neighbour information is that empty cells should be assigned the average temperature of their neighbours. For an isolated cell, or maybe a few, you can do this directly. But with clumps of empty cells, some may have no known neighbours at all. But you can still maintain the requirement; a solution procedure is needed to make it happen.

The idealization of this is the Laplace differential equation, solved with Dirichlet boundary conditions expressing the known cells, and zero gradient (Neumann) conditions at the boundaries (not needed for the globe). That equation would describe a physical realisation in which a sheet of metal was kept insulated but held at the appropriate temperatures in specified locations. The temperatures in between would, at equilibrium, vary continuously according to the Laplace equation.

This is a very basic physical problem, and methods are well established for solving. You just write a huge set of linear equations linking the variables – basically, one for each cell saying that it should be the average of the neighbours. I used to solve that system just by letting it play out as a diffusion, but faster is to use conjugate gradients.

Once the data-free cells are filled, the whole grid can be integrated by taking the area-weighted average of the cell values.
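To make the infill-then-average recipe concrete, here is a toy sketch on a flat 5x5 grid of equal-area cells (so no spherical area weighting, and the slow diffusion iteration rather than conjugate gradients; the three "known" anomaly values are invented for the example). Empty cells relax toward the mean of their neighbours while known cells stay fixed:

```python
import numpy as np

grid = np.full((5, 5), np.nan)
grid[0, 0], grid[0, 4], grid[4, 2] = 1.0, 3.0, 2.0   # invented "station" cells
known = ~np.isnan(grid)

values = np.where(known, grid, np.nanmean(grid))     # initial guess everywhere
for _ in range(5000):                                # let the diffusion play out
    padded = np.pad(values, 1, mode="edge")          # zero-gradient (Neumann) edges
    nbr_avg = (padded[:-2, 1:-1] + padded[2:, 1:-1]
               + padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
    values = np.where(known, values, nbr_avg)        # known (Dirichlet) cells fixed

print(np.round(values, 2))
print(f"area-weighted mean (equal areas here): {values.mean():.2f}")
```

By the maximum principle of the Laplace equation, every infilled cell ends up between the smallest and largest known values, which is a handy sanity check on any implementation.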

In one dimension, requiring each unknown value to be the average of its neighbours would lead to linear interpolation. Solving the Laplace equation is thus the 2D analogue of linear interpolation.
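The 1D claim is easy to verify directly: fix the two endpoint values (invented here) and repeatedly replace each interior value with the average of its two neighbours. A minimal sketch:

```python
import numpy as np

x = np.zeros(11)
x[0], x[-1] = 0.0, 10.0               # known endpoint values (invented)
for _ in range(20000):                # Jacobi-style relaxation
    x[1:-1] = 0.5 * (x[:-2] + x[2:])  # each unknown = mean of its neighbours
print(np.round(x, 6))                 # converges to 0, 1, 2, ..., 10
```

The profile converges to the straight line between the endpoints, i.e. linear interpolation, as stated.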