mosher

2011-09-23 10:03:27
Zeke left a comment.  He’s trustworthy.  I’ll fix his link and thank him.

2011-10-08
Kevin Cowtan, from York, UK. I’ve got a PhD in computational physics, and am a long-standing post-doc with a fellowship in the pipeline, working on computational methods development in X-ray crystallography.
2010-08-15 Robert Way
I am a Master’s student at Memorial University of Newfoundland in Eastern Canada. I did my undergraduate degree at the University of Ottawa in Geography, with a minor in Geomatics and Spatial Analysis.
My primary interests lie in paleoclimatology, remote sensing techniques for glaciers and ice sheets, and ocean-atmospheric dynamics.
My 2 poster boys.

https://climateaudit.org/2012/07/31/surface-stations/
angech | July 7, 2014 at 10:59 pm

Judith, I and others, I’m sure, would like to do a more formal rebuttal of Zeke’s approach, if allowed, and only if well written and argued.
Mine would focus on 3 key points:
The first, the adjustment of past temperatures from current ones.
The second, a possible flaw in TOBS as used.
The third, the number of what are referred to as Zombie stations.
1. Zeke says this is incremental and unavoidable, using current temperatures as the best guide and adjusting backwards.
“NCDC assumes that the current set of instruments recording temperature is accurate, so any time-of-observation changes or PHA adjustments are done relative to current temperatures. Because breakpoints [TOBS] are detected through pair-wise comparisons, new data coming in may SLIGHTLY change the magnitude of recent adjustments by providing a more comprehensive difference series between neighboring stations.

When breakpoints are removed, the entire record prior to the breakpoint is adjusted up or down depending on the size and direction of the breakpoint. This means that slight modifications of recent breakpoints will impact all past temperatures at the station in question through a constant offset.”

The incremental changes add up to WHOPPING changes of over 1.5 degrees over 100 years to past records, and 1.0 degree to 1930 records. Zeke says the TOBS changes at the actual times are only in the range of 0.2 to 0.25 degrees. This would mean a cumulative change of 1.3 degrees colder in the distant past, on his figures, everywhere.
Note he is only technically right to say this “will impact all past temperatures at the station in question through a constant offset.”
But he is not changing the past by 0.2 degrees. It alters all the past TOBS changes, which is what causes the massive up-to-1.5-degree change in only 100 years.
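To make the compounding arithmetic concrete, here is a minimal sketch (my own illustration, not NCDC code; the breakpoint dates and sizes are invented, but chosen to match the 0.2 to 0.25 degree scale cited above):

```python
# Hypothetical station with five detected breakpoints, each a small step
# in the 0.2 to 0.25 degree range. Each correction shifts EVERY value
# before its breakpoint year, so corrections stack up in the distant past.
breakpoints = [(1935, -0.25), (1955, -0.20), (1975, -0.25), (1995, -0.20), (2005, -0.20)]

def total_offset(year: int) -> float:
    """Total shift applied to a value in `year`: the sum of all
    corrections whose breakpoint falls after that year."""
    return sum(size for bp_year, size in breakpoints if year < bp_year)

print(total_offset(1900))  # about -1.1: all five small corrections apply
print(total_offset(1990))  # -0.4: only the two most recent apply
```

Whether the real adjustments actually stack in one direction like this is the point in dispute; the sketch only shows why small per-breakpoint sizes and a large cumulative offset in 1900 are not mutually exclusive.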

angech (Comment #130582)

Zeke, if a record high temp was recorded in Death Valley, or Texas, or Alaska on one of the 1218 stations in 2010, it is, by your own admission, no longer a record on your system, because it has had to be adjusted down by the dropping out of the warmer stations you mention.
Nick currently denies this over at WUWT, even though he knows the record is being adjusted down.
In his eyes and yours, one needs to correct the past records to maintain the purity of the current records under your adjustment system, merely for mathematical predictions.
The folly of this perfectly correct mathematical approach is that we live in a real-life world, not a maths-and-graphs world.
We cling to truth in real past records at individual sites, not wanting your attempt at perfectly correct modelling of US and world temperatures at the cost of throwing out the past, which is what it does.
You are building the same giant clockwork device as those who believed the universe goes round the earth rather than the sun, instead of a simple model which incorporates the truth but is still usable for future projection.
The more you put in, the more cumbersome, impractical and divorced from reality it becomes.

angech (Comment #129994)

So, to be clear:
There were 1218 real stations (USHCN) in the late 1980s.
There are now [???] original real stations left: my guess, half, 609.
There are [???] total real stations: my guess, eyeballing, 870.
There are 161 new real stations, all in airports or cities, added to the graph.
There are 348 made-up stations and 161 selected new stations.
The number of the original 1218 has to be kept, like the stock exchange, to have a mythical representative temperature or temperature anomaly over this number of sites.
Nobody has put up a new thermometer in rural USA in the last 30 years, and no one has considered using any of the rural thermometers, of which there are possibly 3000 among the discarded 5782 cooperative network stations.
And all this is Steve Goddard’s fault.
Can someone confirm these figures are accurate, and if so, why any trust should be put in this Michael Mann-like ensemble of real stations, adjusted real stations and computer-infilled models?

Zeke (Comment #130058)

Mosh,

Actually, your explanation of adjusting distant past temperatures as a result of using reference stations is not correct. NCDC uses a common anomaly method, not RFM.

The reason why station values in the distant past end up getting adjusted is due to a choice by NCDC to assume that current values are the “true” values. Each month, as new station data come in, NCDC runs their pairwise homogenization algorithm which looks for non-climatic breakpoints by comparing each station to its surrounding stations. When these breakpoints are detected, they are removed. If a small step change is detected in a 100-year station record in the year 2006, for example, removing that step change will move all the values for that station prior to 2006 up or down by the amount of the breakpoint removed. As long as new data leads to new breakpoint detection, the past station temperatures will be raised or lowered by the size of the breakpoint.

An alternative approach would be to assume that the initial temperature reported by a station when it joins the network is “true”, and remove breakpoints relative to the start of the network rather than the end. It would have no effect at all on the trends over the period, of course, but it would lead to less complaining about distant past temperatures changing, at the expense of more changes to present temperatures.
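A minimal sketch of the mechanics Zeke describes (my own illustration; the function name and sign convention are invented, and this is not the actual PHA code). It shows both anchoring conventions: trust the present and shift the past, or trust the start and shift the present.

```python
def remove_breakpoint(values, years, bp_year, size, anchor="present"):
    """Remove a non-climatic step change from a station record.

    `size` is the step (mean after minus mean before the breakpoint).
    anchor="present": treat current values as true and shift the whole
    pre-breakpoint record (the NCDC choice Zeke describes).
    anchor="start":   treat the earliest values as true and shift the
    post-breakpoint record instead. Trends are identical either way.
    """
    if anchor == "present":
        return [v + size if y < bp_year else v for v, y in zip(values, years)]
    return [v if y < bp_year else v - size for v, y in zip(values, years)]

years = list(range(1900, 2015))
raw = [10.0] * len(years)                        # flat toy record
adj = remove_breakpoint(raw, years, 2006, -0.3)  # station dropped 0.3 in 2006
# With anchor="present", every value before 2006 is now 9.7: the whole past
# moved down by 0.3, exactly the behaviour described in the comment above.
```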
.
angech,

As I mentioned in the original post, about 300 of the 1218 stations originally assigned to the USHCN in the late 1980s have closed, mostly due to volunteer observers dying or otherwise stopping reporting. No stations have been added to the network to make up for this loss, so there are closer to 900 stations reporting on a monthly basis today.
.
To folks in general,

If you don’t like infilling, don’t use infilled values and create a temperature record only from the 900 stations that are still reporting, or from all the non-infilled stations in each month. As the first graph in the post shows, infilling has no effect on CONUS-average temperatures.
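For anyone who wants to act on that suggestion, here is a rough sketch. It is mine, not Zeke’s: the helper name and the column offsets assume the GHCN-M v3-style fixed-width layout, and the convention that estimated (infilled) values carry an ‘E’ data-measurement flag is taken from my reading of the v2.5 readme, so verify both before trusting it.

```python
# Sketch: simple unweighted mean for one month, skipping missing and
# infilled values. A real record would use anomalies and area weighting;
# this only demonstrates the filtering step.
def monthly_mean(path: str, year: int, month: int) -> float:
    vals = []
    with open(path) as f:
        for line in f:  # assumed layout: ID(11) YEAR(4) ELEM(4), then 12 x [VALUE(5) + 3 flags]
            if len(line) < 115:
                continue  # skip blank or truncated lines
            if int(line[11:15]) != year:
                continue
            off = 19 + 8 * (month - 1)
            raw, dmflag = line[off:off + 5], line[off + 5]
            if raw.strip() == "-9999" or dmflag == "E":
                continue  # -9999 = missing; 'E' = estimated/infilled (assumed convention)
            vals.append(int(raw) / 100.0)  # values assumed in hundredths of a degree C
    return sum(vals) / len(vals) if vals else float("nan")
```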

angech (Comment #130074)

Carrick, your link to Moyhu showed Nick Stokes attempting to discredit SG with 6 diagrams talking about a spike in 2014, but all 6 graphs only went to 2000. Why the heck is that?
Zeke has a post at SG where he admits that there are only 650 real stations out of 1218. This is a lot less than the 918 he alludes to above. Why would he say 650 to SG (May 12th, 3.00 pm) and instead, in #130058 at the Blackboard, that about 300 of the 1218 stations have closed down?
Can Zeke give clarity on the number of real stations (raw data) and the number of unreal stations using filled-in data among the 1218 stations?

Nick Stokes (Comment #130077)

angech (Comment #130074)
“6 diagrams talking about a spike in 2014, but all 6 graphs only went to 2000. Why the heck is that?”

You’re not very good at reading graphs. The x-axis is marked (by R) in multiples of 20 years. The data shown is up to date.

“Zeke has a post at SG where he admits that there are only 650 real stations out of 1218. This is a lot less than the 918 he alludes to above.”

When I last looked a few weeks ago, the 2014 numbers reporting were Jan 891, Feb 883, Mar 883, and 645 for April. Many are staffed by volunteers and some reports are late. So 918 sounds right.

angech (Comment #130078)

Nick, I cannot understand your post. It seems that you split your data into real and infilled subgroups.
There appear to be a large number of these infilled stations: 1218 − 650 = 568, according to Zeke at SG and here.
There are claims that the real data is not located in the right areas to be useful for graphing those areas, due to differences in latitude and elevation.
The artificial sites at the best locations give a “true grid” for the 1218 “stations”.
One knows what the true readings for these artificial sites “should be”, puts them in, and then adjusts the real sites to what the artificial sites say the temperature should be. Zeke says each month one takes the infilled data from the unreal stations. I guess it “comes in” from the computer programme, primed with a need to rise as CO2 goes up, otherwise known as Steven’s Climate Sensitivity factor, which is currently being adjusted downwards from 3.0 to 1.4 due to the satellite pause.
One then has to look for non-climatic breakpoints, AKA real data behaving badly, which has to be removed.
Fortunately, when you do this, the difference between the raw R1 data and the final F1 data is almost eliminated, as Nick so elegantly shows. Bravo for the shell trick.

angech (Comment #130314)

Thank you, Zeke, for putting this post up. Hopefully it will result in greater openness and sharing of information; though you may not be feeling it yet, you are trying, which a lot of your colleagues do not want to do. The level of vitriol reflects the extreme importance of doing the data collection and modelling openly, so all sides can feel confident that their arguments are on common ground. As you know, this is not the case at the moment, and has not been the case for skeptics for a long time.
Incidentally, Mosher described the principle that if site A is closer to site B than to site C, then site A is more likely to be similar to site B than to C, as a fundamental theorem of geostatistics (at JC, 10.50, 12/6/2014, “asymmetric responses of the Arctic and Antarctic”).
My question to you: Robert Way has stated at Skeptical Science that this is not true when calculating the Arctic infilling as used in Cowtan and Way, and my understanding is that this faulty principle may now be being used in your current Arctic infilling. Can you tell us whether you use Steven’s fundamental principle or Robert Way’s new improved principle?
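For what it is worth, the “fundamental principle” Mosher invokes is spatial autocorrelation: nearby stations are assumed to co-vary more than distant ones. Cowtan and Way’s actual method is kriging; the following is only a generic inverse-distance sketch of mine illustrating the assumption, not their code.

```python
import math

def idw_estimate(target, stations, power=2.0):
    """Inverse-distance-weighted anomaly estimate at `target` (lat, lon)
    from (lat, lon, anomaly) tuples: closer stations get more weight,
    which is exactly the closer-is-more-similar assumption in question."""
    num = den = 0.0
    for lat, lon, anom in stations:
        d = math.hypot(lat - target[0], lon - target[1])  # crude planar distance
        if d == 0.0:
            return anom  # exact hit: use that station's value directly
        w = 1.0 / d ** power
        num += w * anom
        den += w
    return num / den
```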

angech (Comment #130316)

See “How global warming broke the thermometer records” by Kevin Cowtan at Skeptical Science, 25/4/2014, speaking of his and Robert Way’s finding that the GISTEMP conundrum was due to actual GHCN Arctic data and infilling showing a cooling “bias” when compared to their model-only method.
This occurred, supposedly, by violating the assumption that neighbouring regions of the planet’s surface warm at a similar rate.

Zeke (Comment #130317)

angech,

The problem in the arctic is one of station density; Cowtan and Way actually discovered the problem with GHCN’s adjustments by comparing them to Berkeley’s results, which are more accurate for those stations given the denser network. There is always a challenge in very sparsely sampled areas of misclassifying abrupt changes (in this case an abrupt warming trend) as local biases rather than true regional effects. Larger station networks can help ameliorate this.
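A toy illustration of the failure mode Zeke describes (entirely my own construction, not Berkeley or GHCN code): in a sparse network, a genuine abrupt regional warming produces exactly the step signature in the station-minus-neighbor difference series that homogenization algorithms treat as a non-climatic bias.

```python
# Arctic-style toy case: the target station really warms by +1.0 deg C
# in 1995, but its only usable neighbor is too far away to share the
# regional event. The difference series then shows a clean step, which
# is the signature a PHA-style algorithm would flag and remove.
years = list(range(1980, 2011))
target   = [1.0 if y >= 1995 else 0.0 for y in years]  # real regional warming
neighbor = [0.0 for _ in years]                        # distant, unaffected

diffs = [t - n for t, n in zip(target, neighbor)]
pre  = sum(d for d, y in zip(diffs, years) if y < 1995) / 15
post = sum(d for d, y in zip(diffs, years) if y >= 1995) / 16
print(f"apparent non-climatic step: {post - pre:+.1f} deg C")  # +1.0, yet it is a real change
```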
.
Will Nitschke,

Some sort of automated homogenization is necessary. We’ve been working on ways to test to ensure that homogenization is not introducing bias. The Williams et al. paper makes a compelling case, for example: ftp://ftp.ncdc.noaa.gov/pub/da…..al2012.pdf

Our recent UHI paper also looks at this by redoing homogenization using only rural stations to detect breakpoints/trend biases.

The reason I suspect that the Amundsen-Scott results are a bug due to the extreme cold is that they are flagged as regional climatology outliers. I’ll suggest that the Berkeley team look into it in more detail next week.
.
JD Ohio,

Using that approach, observations are still within the 95% CI for models, though they are close to the bottom. As I mention in the article you reference, the next few years will be key in seeing if they become clearly inconsistent. I have an updated graph here: http://www.yaleclimatemediafor…..in-review/
.
For other folks: sorry for being slow in responding; other things in life have been pulling me away from blogging, and I’m about to head out on a camping trip with no internet access for a while.

Zeke (Comment #130839)

The exact number of real stations reporting each month in USHCN version 2 is shown in this figure: http://rankexploits.com/musing…..-Count.png

You can download all the raw station data yourself here: ftp://ftp.ncdc.noaa.gov/pub/data/ushcn/v2.5/

I really have no clue why people keep harping on this “exact number of active real stations” question when it’s trivial to answer…
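Taking Zeke at his word, “do a count” looks roughly like the sketch below. It is mine, not his: it assumes one fixed-width file per station (as the v2.5 tarball unpacks), the GHCN-M v3-style column layout, an ‘E’ = estimated flag, and the file suffixes shown, so verify all of those against the readme.

```python
import glob
from collections import defaultdict

def reporting_stations(data_dir: str, year: int) -> dict[int, int]:
    """Count stations with a real (non-missing, non-estimated) TAVG value
    in each month of `year`, scanning per-station files such as
    USH00011084.FLs.52j.tavg (the suffix is an assumption; for raw-only
    counts, point this at the .raw.tavg files instead)."""
    counts = defaultdict(set)
    for path in glob.glob(f"{data_dir}/*.tavg"):
        with open(path) as f:
            for line in f:
                if len(line) < 115 or int(line[11:15]) != year:
                    continue  # skip blank lines and other years
                for m in range(12):  # 12 x [VALUE(5) + 3 flags] from column 20
                    off = 19 + 8 * m
                    raw, dmflag = line[off:off + 5], line[off + 5]
                    if raw.strip() != "-9999" and dmflag != "E":
                        counts[m + 1].add(line[:11])
    return {m: len(s) for m, s in sorted(counts.items())}
```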

2. TOBS and break adjustments are made on stations which do not have data taken at the correct time.
The process is automated in the PHA.
Infilling is done on stations missing data, i.e. not at the correct time. Zombie stations have made-up data, i.e. not at the correct time.
This means that potentially half the 1218 stations, the zombie ones and the ones missing data, have an automatic cooling of the past done every day, with the result of compounding the alterations to past temperature levels.
This should not be allowed to happen.
Once a TOBS change has originally been made in the past, e.g. 1900 should have been 0.2 degrees warmer, then this altered estimate should stay forever and not be affected by future changes.
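angech’s rule could be stated as: adjustments are write-once. A minimal sketch of that policy (my illustration of his proposal; nothing like this exists in the actual PHA):

```python
# Once an offset has been recorded for a (station, breakpoint-year) pair,
# later PHA runs may add NEW breakpoints but can never revise an old one.
frozen: dict[tuple[str, int], float] = {}

def offset_for(station: str, bp_year: int, newly_estimated: float) -> float:
    """Return the offset to apply: the frozen value if one exists,
    otherwise freeze and return the newly estimated one."""
    return frozen.setdefault((station, bp_year), newly_estimated)
```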
Going for a bike ride.
3. I will comment on Zeke’s and others’ obfuscation on this vital issue when I return.
Samples. Ponder this:

Zeke (Comment #130839)   July 6th, 2014 at 12:22 pm
The exact number of real stations reporting each month in USHCN version 2 is shown in this figure: http://rankexploits.com/musing…..-Count.png
[HERE HE SHOWS AN OUT-OF-DATE GRAPH; check it out]

You can download all the raw station data yourself here: ftp://ftp.ncdc.noaa.gov/pub/data/ushcn/v2.5/
No, it is not labelled as raw data, and it certainly does not give a number for the raw stations.

I really have no clue why people keep harping on this “exact number of active real stations” question when it’s trivial to answer…

PLEASE, please answer, Zeke.
Trivial to answer, but you refuse point-blank to give an answer.
How many real original stations giving data are there since 1987, out of 1218?
How many new ones have been added?
How many do not report each month?
How many zombie stations were used in March 2014, now you have all the data in?
Judith, Chief, Fan, Joshua, Climate Etc.:
If Zeke is honestly presenting his case, which he is, why will he not answer a trivial question?

“angech: The TOBS adjustment would be done once.
The adjustment that would/could change on a daily basis is PHA.
Wait for Monday. The entire process will be explained.
But wait and read what is coming out on Monday.
Then, if you don’t like what NCDC does with USHCN, we could just dump all of USHCN, dump all of the US, and the answer wouldn’t change much.”

Steven Mosher (Comment #130831) July 6th, 2014 at 2:54 am

“Man, are you dense. Zeke has written a paper describing all this.
Personally, I don’t look at USHCN and I don’t use it; haven’t for years.
But I am able to read Zeke’s paper, and suggest that you read it on Monday.
Seriously, you are descending to Goddardian levels of argument.”

(Comment #130800) July 4th, 2014 at 9:48 pm
Still no count given [ever] of the exact number of active real stations, and it is obvious no one will be giving one.
The silence is deafening.

Go to the ftp.
Download the data.
Do a count.

Personally, in my own work, I don’t look at USHCN. My advice to NOAA is to drop GHCN-M and USHCN and just supply GHCN-Daily.

I am not interested in doing your homework.
You could find a million errors in USHCN and none of it matters to me, because I’m upstream. Get that. Not doing your homework. Don’t care what your issues are. They are moot.


Interesting that you could find a million errors in USHCN and not care; that you don’t care, but have 51 posts here; and that no one of the triumvirate Stokes/Zeke/Steven will give a count, having spent hours running away from it. Says something.