Global Temperature Anomalies
The problem with land- and ocean-based reconstructions of global temperature, and with global temperature anomaly maps, is very basic. Why do them at all?
This is a serious question, and it has some complex answers which lead to more complex questions.
The main answer would be to act as an adjunct to, and a back-up check for, the satellite systems, which give much greater coverage, specificity and accuracy than a limited stone-age measuring and recording system can. Satellite systems are the backbone of all weather estimation, reporting and prediction. They are the only reasonably accurate source of a true current global temperature, depending on what criteria you wish to choose. None of this can be done by land- and sea-based systems recording singular sites on a current event pattern, with poor real-time transmission of data from limited sites.
Satellite problems and strengths.
Satellite collection of data does have problems at the local level. A bit like quantum theory, when you have to drill down to the size of raindrops you lose the ability to measure the actual micro events, like precipitation and surface temperature. This is more than made up for by the ability to reach inaccessible areas all around the globe, to chart, map and check cloud formation and coverage over land and sea, and to get temperatures from all parts of the visible ocean.
Satellites do drift, and adjustments have to be made for the effects on the wavelengths they are recording. Satellites also need confirmation of the temperature readings at specified locations, which in turn helps in adjusting the settings to get the best match. What satellites also offer is GPS positioning of all places: heights, depths, elevation and distance from the centre of the earth, which affects the temperature and air flow. Mentioning air flow raises the fact that the jet streams, polar vortices and other important upper-level air flow patterns are assessed, as is the moisture content of the air masses. Similar data can be gathered, to some extent, on ocean currents at a superficial level.
Thermometers and a standard.
To come back to the question of assessing the global temperature, or more correctly a global temperature anomaly: how many properly functioning, well-sited thermometers do you need?
This raises the problem of a definition of an average global temperature.
The answer, strangely, is just two.
The resultant temperature would be a bit like the El Niño/La Niña temperature patterns. It would run in a trough between summer and winter conditions. It would have times when weather patterns conspired to make seemingly ridiculous departures from the norm, but over time it would give a reasonable approximation of both anomaly change and average temperature at those sites.
Could you improve the overall accuracy by choice of site position? A site at the equator differs from one at the poles in the amount of seasonal variation possible in the temperature range and in the actual average temperature. Two sites 10 kilometres inland, on a low-lying plain at, say, 30 metres elevation, on the Tropics of Cancer and Capricorn, would obviate the problems of polar and equatorial extremes and also allow for a balance between the more land-filled NH and the water-based SH. Two further pairs would allow back-up and comparison.
Six continents, six countries, state-of-the-art thermometers and restriction zones of a kilometre, and you could have an agreed, reliable, international standard global temperature for all time. This would still not be as accurate and reliable as the satellites can manage without the blink of an eye, but it would standardize issues, as it would give a much longer-term reproducible result than satellites, with their limited life span, can give.
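The bookkeeping for such a paired-site standard is trivial; here is a minimal sketch of the anomaly arithmetic, with station names, baselines and readings invented purely for illustration.

```python
# Sketch of the two-thermometer idea: each site's anomaly is its reading
# minus that site's own long-term baseline; the "global" anomaly is the
# mean of the site anomalies. All numbers below are invented.

baselines = {"cancer_site": 24.1, "capricorn_site": 22.7}   # long-term means, °C
readings  = {"cancer_site": 24.6, "capricorn_site": 22.9}   # current month, °C

anomalies = {s: readings[s] - baselines[s] for s in baselines}
global_anomaly = sum(anomalies.values()) / len(anomalies)
print(round(global_anomaly, 2))  # → 0.35, the mean of +0.5 and +0.2
```

Because each site is compared only with its own history, the two sites never need to share an absolute calibration, which is the usual argument for anomalies over raw temperatures.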
The Global Temperature: what is it and why is it important?
Coming back to a global temperature, and the fact that a reasonably accurate version can currently only be given by satellite data, the question is: why is it useful? The answer is that it is needed for all projections of earth weather, climate and climate change.
All GCMs must have an initial global temperature start-up point. This data point obviously should not change over time. It should be the same for all GCMs and all weather and climate prediction models. That is, it should have a chosen start date and level, known by everyone and written down in black and white.
However, different data sets will have different GTs.
GCMs have been around since the 1970s and are said to be quite good in their projections. New ones are being created all the time.
If we assume that Zeke is right, then there should be an original GT and date in the first model that matches or is linked to each subsequent model. If adjustments have been made to this parameter, then the models would no longer be working on the same scenario.
If we have a standard GT, then there must be a past (1850) GT for each model, which would differ due to their different algorithms; no problem there.
Other than that there can then be no exact agreement on the amount of warming since 1850.
The problem for the models is that they must run on the input GT at the time of setting up the model. To achieve this they have to input the current level from whatever data set they choose to use. Different data sets may have different GTs.
As the new GCM may differ in its assumptions from the data set's assumptions, an anomaly will then exist between the two models when they do a backwards run and comparison.
Further, the data sets change their past data daily by adjustments [Zeke]. When a comparison is made a year later, the current data set model being compared will differ from the one used a year before. The GCM will not adjust the data in the past, so it will give a GT based on the old readings as it enters the new readings.
GCMs and data sets have a built-in bias for CO2 increase, also known as the ECS.
Data sets like GHCN register the correct current temperatures and therefore bias the past lower, to show the global warming that was expected to have occurred. There is no ability to move thermometers upwards. GCMs, on the other hand, suffer a double whammy. They have incorporated data sets that cool the past, fixed at the time of inception, and now make assumptions on future warming from that date. Consequently they add on warming at a rate predicted by CO2 levels.
Why does CMIP6 run much hotter than CMIP5? Wherever the newer models start from, the GMT was not much higher than in the older models, yet they had a much larger CO2 warming adjustment to work with.
Here is where the problem with gridding and adjustments and sites comes into full play.
There is an assumption that all sea-level grids are equivalent, yet due to the earth's shape, grids away from the equator have a lesser air pressure, due to lesser gravity, which affects the temperature but is not taken into account. Grids here may be determined in part by ones 1,000 km north or south.
Gridding on land may not take elevation fully into account. Two sites on either side of a mountain range have a different temperature to a grid cell with the mountain between them. Elevation is taken into account in local site shifts, but it is very difficult to handle properly when sites are hundreds of miles apart. It is certainly not taken into account by some of the people describing their grid attempts, which are only temperature infills, ignoring elevation, forestation, mountains and deserts.
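To sketch why elevation matters, readings can be reduced to a common height with the standard-atmosphere lapse rate of roughly 6.5 °C per km (real lapse rates vary with conditions); the stations, ridge height and temperatures below are invented for the example.

```python
# Two invented low-lying stations flanking an invented 2.5 km ridge.
# A plain average says nothing about the high ground between them;
# reducing each reading to ridge height with a lapse rate does better.

LAPSE = 6.5                    # °C per km, standard-atmosphere value

stations = [                   # (temperature °C, elevation km)
    (18.0, 0.2),
    (17.0, 0.4),
]

# naive grid value: plain average, ignoring the ridge in between
naive = sum(t for t, _ in stations) / len(stations)

# estimate for a cell centred on the ridge: reduce each station to 2.5 km
ridge_elev = 2.5
adjusted = sum(t - LAPSE * (ridge_elev - e) for t, e in stations) / len(stations)

print(round(naive, 1), round(adjusted, 1))  # → 17.5 3.2
```

The ~14 °C gap between the two answers is the size of error a pure temperature infill can make over mountainous terrain.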
When the sites used become airports (airports do need accurate temperature readings), we can either have a Global Airport Temperature reading or an inaccurate Global Temperature outcome.
Note: this comment/post was inspired by Nick Stokes' Moyhu blog, where he gives an excellent summary of the main methods of assessing global and local temperatures in “US temperatures 27/2/2020”, covering averaging and graphics methods.
I have for many years been experimenting with methods for calculating average surface temperature anomalies from collections of station readings (including sea surface from grid, regarded as stations). I describe the history here. In the early days, I tried to calculate the averages for various countries and continents. With the coarse grids I was using at the time, and later meshes, I found the results unsatisfactory, and of limited scientific importance.
So I put more effort into developing global methods. There is a rather mathematical survey of these here. Briefly, there are six main types:
- The conventional grid analysis, averaging the measurements within each cell, and then getting an area-weighted average of the cells. I think this is very unsatisfactory since either the grid is very coarse, or there are cells with no data.
- Triangular mesh, with the sites as nodes. This has been my mainstay. But it is not very good at following national boundaries.
- A method where spherical harmonics are fitted and then integrated. This is now implemented as a general method for improving otherwise weak methods. The structure is instructive, but again, intrinsically global.
- A LOESS method, described here. This has the characteristic of gathering information from as wide a net as needed; useful when stations get sparse, but not a respecter of boundaries.
- Most recently, a method using FEM shape functions. I think this may be best, in terms of getting the best representation on a relatively coarse grid. Again, not so good for boundaries, but it has as a special case, one of my earlier methods:
- Grid with infill (eg an early version here). The weakness of conventional gridding is that it does not use local information to estimate missing cells, and so the grid must be kept fairly coarse. But I have worked out a way of doing that systematically, which then allows much finer gridding. And that does have the incidental benefit of tracking national and land boundaries. It also allows a good graphics scheme. I’ll say more about it in the next section. In this text, I’m using blue colors for the more technical text.
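The first method in the list, the conventional grid analysis, can be sketched in a few lines. The cells, anomaly values and simple cosine-latitude area weighting below are invented for illustration and are not any particular group's implementation; the empty cell shows the weakness noted above.

```python
import math

# Conventional grid method: average the readings within each cell, then
# take an area-weighted mean of the cells. On a lat/lon grid a cell's
# area is proportional to cos(latitude). Cells with no data are simply
# dropped, which is the flaw the text describes. Values are invented.

# cell-centre latitude -> station anomalies falling in that cell
cells = {
    -45.0: [0.2, 0.4],
    0.0:   [0.1],
    45.0:  [],          # empty cell: silently excluded
    60.0:  [0.6, 0.8, 0.7],
}

num = den = 0.0
for lat, vals in cells.items():
    if not vals:
        continue
    w = math.cos(math.radians(lat))        # area weight for this band
    num += w * (sum(vals) / len(vals))     # cell average, weighted
    den += w
print(round(num / den, 3))  # → 0.3
```

Dropping the empty cell silently reweights the globe toward wherever data happens to exist, which is why the finer the grid, the worse this method behaves.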
Infilling and diffusion – the Laplace equation.
The logical expression of using neighbour information is that empty cells should be assigned the average temperature of their neighbours. For an isolated cell, or maybe a few, you can do this directly. With clumps of empty cells, though, some may have no known neighbours at all. You can still maintain the requirement; a solution procedure is just needed to make it happen.
The idealization of this is the Laplace differential equation, solved with Dirichlet boundary conditions expressing the known cells, and zero gradient (Neumann) conditions at the boundaries (not needed for the globe). That equation would describe a physical realisation in which a sheet of metal was kept insulated but held at the appropriate temperatures in specified locations. The temperatures in between would, at equilibrium, vary continuously according to the Laplace equation.
This is a very basic physical problem, and methods for solving it are well established. You just write a huge set of linear equations linking the variables – basically, one for each cell, saying that it should be the average of its neighbours. I used to solve that system just by letting it play out as a diffusion, but it is faster to use conjugate gradients.
Once the data-free cells are filled, the whole grid can be integrated by taking the area-weighted average of the cell values.
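A toy version of this fill-then-integrate procedure can be written directly. It assumes a tiny 3×3 grid with four invented known cells, uses the slower Jacobi/diffusion relaxation mentioned above in place of conjugate gradients, and takes all cell areas as equal for simplicity.

```python
# Infill by diffusion: each data-free cell is driven toward the average
# of its in-grid neighbours, while known cells are held fixed (Dirichlet
# values). Grid size and cell values are invented for the example.

KNOWN = {(0, 0): 1.0, (0, 2): 3.0, (2, 0): 3.0, (2, 2): 5.0}
N = 3
grid = [[KNOWN.get((i, j), 0.0) for j in range(N)] for i in range(N)]

for _ in range(500):                          # diffusion sweeps
    new = [row[:] for row in grid]
    for i in range(N):
        for j in range(N):
            if (i, j) in KNOWN:
                continue                      # held at its data value
            nbrs = [grid[i + di][j + dj]
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                    if 0 <= i + di < N and 0 <= j + dj < N]
            new[i][j] = sum(nbrs) / len(nbrs)
    grid = new

# every cell is now filled, so integrate (equal cell areas assumed here)
mean = sum(sum(row) for row in grid) / N ** 2
print(round(mean, 3))  # → 3.0
```

On this symmetric example the relaxed interior settles at the analytic fixed point (centre value 3.0), and the grid mean matches it; on a real sphere each cell would carry its own area weight.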
In one dimension, requiring each unknown value to be the average of its neighbours would lead to linear interpolation. Solving the Laplace equation is thus the 2D analogue of linear interpolation.
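That 1D claim is easy to verify numerically; the five-point array below is invented for the demonstration.

```python
# In 1D, forcing each unknown to equal the mean of its two neighbours
# reproduces linear interpolation between the known endpoint values.

vals = [10.0, None, None, None, 20.0]       # only the endpoints are known
x = [v if v is not None else 0.0 for v in vals]

for _ in range(2000):                        # relax to the fixed point
    for i in range(1, len(x) - 1):
        if vals[i] is None:
            x[i] = (x[i - 1] + x[i + 1]) / 2

print([round(v, 3) for v in x])  # → [10.0, 12.5, 15.0, 17.5, 20.0]
```

The interior converges to the straight line through the endpoints, which is exactly the sense in which the 2D Laplace solve generalises linear interpolation.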