This post is an extended rebuttal to a portion of a much longer discussion. It responds to the cartoonist’s website, SKEPTICAL SCIENCE, which has been rebutted often enough that its validity should be in question. But alas, people keep citing it. So first let me post the graphic that caused the person to run to this propaganda site for a rebuttal:
The year 1934 was a very hot year in the United States, ranking fourth behind 2012, 2006, and 1998. However, global warming takes into account temperatures over the entire planet. The U.S.’s land area accounts for only 2% of the earth’s total surface area. Despite the U.S. heat in 1934, the year was not so hot over the rest of the planet, and is barely holding onto a place in the hottest 50 years in the global rankings (today it ranks 49th).
There are some preliminary thoughts on this before we get into the “nitty gritty.” First, in 1933-1936 (the span of the heat wave and wild weather), how many countries or persons were recording weather and temperature anomalies? Common sense says not nearly as many as today; we have only been measuring with satellites since 1979. Another head-tilter: did NOAA or NASA previously note other years as warmer? We will find this out as we go along.
As you read this material (follow the links to stories, and the like), note that some sites just use the basic data without correcting it. I will discuss this later in the post… but first and foremost, are there reports from those years showing temperature and weather anomalies from around the world? Yes. Here is some of the evidence:
Mind you, this is a small sampling from THE NO TRICKS ZONE; many more examples can be found at C3 HEADLINES, as well as at a site documenting the United Kingdom’s weather for some years. So it seems we are looking at a multi-year GLOBAL weather pattern. AGAIN, we did not have satellites then, so we must rely on eyewitness accounts such as these.
However, I wanted to comment on some of the wrong or fudged data being used to make the climate look worse now than in the past. I have to thank the person I was discussing the issue with, because he had posted a story linked to the journal Nature. All it had was an abstract, and while I am a member of the AAAS, the Nature subscription was too much. There was also a list of corrections, so I asked him for either the full article and/or some links to the corrections. Lo and behold, he gave me a link to a corrected graph that worked in my favor:
The red area is what concerns me… but even with it, it has been warmer in the past (note the bottom line). A professor at Berkeley notes that there is collusion among some to hide the data that counters the narrative of the “warmists.” Mind you, the following video is from 2014, but I will make a point afterwards:
MRCTV BLOG notes the following, which will throw a stone into the shoe of the left regarding 2015 being the hottest year. Remember, the internet never forgets!
…To explain where NOAA messed up, we have to start with 1997.
In that 1997 report, they say clearly that the Global Average Temperature (GAT) was 62.45°F, based on a 30-year average (1961-1990) of the combined land and sea surface temperatures. Since we know the 1997 El Nino caused a record high spike in temperature, that means that for that 30 year period, there was no warmer GAT than 62.45°F up until that time.
Now, back to 2015. The recently released NOAA report, claiming that 2015 was the hottest year ever, says:
During 2015, the average temperature across global land and ocean surfaces was 1.62°F (0.90°C) above the 20th century average. This was the highest among all 136 years in the 1880–2015 record, surpassing the previous record set last year by 0.29°F (0.16°C) and marking the fourth time a global temperature record has been set this century.
It was the blogger Wattsupwiththat who first noticed and explained the funky math. He explains that the recent report compares 2015 to the 20th century average – but doesn’t mention what that average temperature was.
However, it does give a 20th Century average in its November 2015 State of the Climate Report: 13.9 degrees Celsius (57 degrees Fahrenheit):
Now, it’s math time:
According to NOAA, the global average temperature for the 20th century was 57 degrees Fahrenheit.
And the 2015 average was 1.62 degrees Fahrenheit above that average.
In other words, according to this recent NOAA report, 2015 was the hottest year ever at 58.62 degrees Fahrenheit (57+1.62).
And, that 1997 NOAA report states 1997 had an average global temperature of 62.45 degrees.
….According to my calculator:
1997’s 62.45 degrees is 3.83 degrees Fahrenheit higher than 2015’s 58.62 degrees Fahrenheit, meaning that – using NOAA’s own numbers – 2015 cannot be the hottest year on record.
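The arithmetic in the excerpt above is simple enough to check directly. A minimal sketch, using only the figures quoted from the two NOAA reports:

```python
# Figures quoted from the NOAA reports cited above.
gat_1997 = 62.45          # 1997 Global Average Temperature, °F (1997 report)
avg_20th_century = 57.0   # 20th-century average, °F (Nov 2015 SOTC report)
anomaly_2015 = 1.62       # 2015 anomaly above the 20th-century average, °F

# 2015's implied absolute temperature, per the recent report's own numbers.
gat_2015 = avg_20th_century + anomaly_2015
print(f"2015 implied GAT: {gat_2015:.2f} °F")            # 58.62 °F
print(f"1997 minus 2015:  {gat_1997 - gat_2015:.2f} °F")  # 3.83 °F
```

This reproduces the 58.62°F and 3.83°F figures in the excerpt; whether the two reports' baselines are actually comparable is the point under dispute.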
As suggested monthly by the mainstream media since April 2015, NOAA and NASA officially declared that 2015 surpassed the 2014 record to become the new hottest year on record and that 2016 could be hotter than 2015! The average global temperatures calculated by NOAA and NASA, however, appear to be inaccurate and unreliable for the following reasons:
In Table 2, we’ve compiled the top five years when the most records were set. When multiple years tie for the high, each individual year gets a fraction of a “record”. So, for example, 1954 and 1933 each get a half of a record for Colorado.
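The tie-handling rule described above (each year in a tie gets an equal fraction of a “record”) can be sketched in a few lines. The per-state data here is illustrative, using only the Colorado example from the source:

```python
from collections import defaultdict

# Illustrative per-state record-high years; ties share credit equally,
# as in the Colorado example (1954 and 1933 each get half a record).
record_years_by_state = {
    "Colorado": [1954, 1933],  # tie: each year gets 0.5
    "Kansas":   [1936],        # hypothetical sole record: 1.0
}

credit = defaultdict(float)
for state, years in record_years_by_state.items():
    for year in years:
        credit[year] += 1.0 / len(years)

print(dict(credit))  # {1954: 0.5, 1933: 0.5, 1936: 1.0}
```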
According to data on the NOAA website, 1997 was truly the hottest year on record at 62.45°F. The average global temperature in 2015 was 58.62°F, or 3.83°F below the 1997 temperature.
According to data on the NOAA website, values that should be fixed, such as the 20th century average temperature and the annual temperature anomalies, take different values across the annual global analyses.
NOAA and NASA corrected historical temperature data and fabricated temperature data in areas without temperature records systematically, widely, and uni-directionally to “cool” the past in an attempt to prove a soaring temperature trend.
NOAA and NASA made efforts to discredit their own satellite data – which is consistent with the balloon data – because it suggests a global warming hiatus since 1998, contradicting the NOAA and NASA preferred narrative.
NOAA and NASA refused to give data and information requested by the US House of Representatives Science, Space and Technology Committee. There is no reason for them to withhold the data and information, which are in the public domain, unless they have something to hide.
The headline “last year was the hottest year on record, this year is hotter than last year, and next year could be hotter than this year” is likely to be repeated year after year until funding for climate change is stopped!
The National Center for Policy Analysis mentions a change from a few years back:
The warmest year on record is no longer 1998, and not because it has been overtaken by a recent heat wave. National Aeronautics and Space Administration (NASA) scientist James Hansen’s famous claim about 1998 being the warmest year on record in the United States was the result of a serious math error, according to H. Sterling Burnett, a senior fellow at the National Center for Policy Analysis (NCPA).
According to NASA’s newly published data:
The hottest year on record is 1934, not 1998.
The third hottest year on record was 1921, not 2006.
Three of the five hottest years on record occurred before 1940.
Six of the top 10 hottest years occurred before 90 percent of the growth in greenhouse gas emissions during the last century occurred.
Here we see raw data vs. massaged data run through a hopper. In 2007 NASA adjusted its U.S. findings to match the real data:
The measured USHCN daily temperature data shows a decline in US temperatures since the 1930s. But before they release it to the public, they put it through a series of adjustments which change it from a cooling trend to a warming trend.
[….]
The next blink comparator shows changes in the US temperature record from GISS. It alternates between their 1999 graph and the 2012 version of the same graph. The past is cooled and the present is warmed.
Here are some examples of fudged data because of sub-standard equipment:
You’d think the answer would be obvious, but here we have a NOAA-operated USHCN climate station of record providing a live experiment. It always helps to illustrate with photos. Today I surveyed a sewage treatment plant, one of 4 stations surveyed today (though I tried for 5), and found that, for convenience, they had made a nice concrete walkway to allow servicing the Fisher-Porter rain gauge, which needs a paper punch tape replaced once a month.
Here is what you see in visible light:
Here is what the infrared camera sees:
Note that the concrete surface is around 22-24°C, while the grassy areas are between 12-19°C.
“Class 5 (error >~= 5°C) – Temperature sensor located next to/above an artificial heating source, such as a building, rooftop, parking lot, or concrete surface.”
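Using the infrared readings above, a rough back-of-the-envelope sketch of the implied siting bias (taking the midpoints of the quoted ranges; the 5°C threshold is the Class 5 figure just quoted):

```python
# Midpoints of the temperature ranges quoted from the IR survey.
concrete_c = (22 + 24) / 2   # concrete walkway surface, °C
grass_c    = (12 + 19) / 2   # surrounding grassy areas, °C

bias = concrete_c - grass_c
print(f"approximate surface bias: {bias:.1f} °C")  # 7.5 °C

# A Class 5 station (sensor next to an artificial heat source) implies
# an expected error of >= 5 °C; this rough surface difference exceeds it.
print("meets Class 5 threshold:", bias >= 5)  # True
```

This is only a crude surface comparison, not a sensor-error calculation, but it illustrates why siting next to concrete matters.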
More than half of the stations NOAA uses are tainted by poorly sited or substandard equipment. Here is an example of a well-maintained station with its data sets:
Here are the data sets corresponding to a site where heat corrupts the data:
Here are even more corrupt examples of these stations, placed next to air-conditioner outlets, next to burners, next to lights that glow at night, surrounded by asphalt, and near buildings that radiate heat as well as block wind (NOTE the continually growing accumulation of these sites at WUWT):
CHANGING DATA-SETS
Another example of changing averages, noted by Steve Goddard and others (even NOAA has acknowledged it), is exemplified in Dr. Judith Curry’s post on the matter (from a larger post of mine):
Even the Wall Street Journal chose the higher temperature reading to say that July 2012 was the “hottest month in the contiguous U.S. since records began in 1895.” WUWT found this by accident, and it has led to quite a few other revelations, as we will see. Here is, in part, a description of what we are looking at:
Glaring inconsistencies found between State of the Climate (SOTC) reports sent to the press and public and the “official” climate database record for the United States. Using NCDC’s own data, July 2012 can no longer be claimed to be the “hottest month on record”.
[….]
I initially thought this was just some simple arithmetic error or reporting error, a one-off event, but then I began to find it in other months when I compared the output from the NCDC climate database plotter. Here is a table of the differences I found for the last two years between claims made in the SOTC report and the NCDC database output.
[….]
In almost every instance dating back to the inception of the CONUS Tavg value being reported in the SOTC report, there’s a difference. Some are quite significant. In most cases, the database value is cooler than the claim made in the SOTC report. Clearly, it is a systemic issue that spans over two years of reporting to the press and to the public.
It suggests that claims made by NCDC when they send out these SOTC reports aren’t credible because there are such differences between the data. Clearly, NCDC means for the plotter output they link to, to be an official representation to the public, so there cannot be a claim of me using some “not fit for purpose” method to get that data….
The Wall Street Journal made a graph showing this record setting month (below-left). The more accurate temperature for July likewise is shown in the same graph (below-right):
This is a look at which data sets are chosen, and what is used and isn’t used, to support an idea that fails in every way. Combine this obvious cherry-picking with the bias, collusion, and charges against the report the President used to rout Congress, and we have a problem, Houston! But this is only the tip of the proverbial iceberg. It seems NOAA has been skewing these temps for some time. Why? Because the left uses this as a way to promote ever-growing government, and the scientists get more-and-more funding. This data-fudging story is newer, and it is evolving quickly, including this newest post via Real Science, where Steve Goddard notes that More Than 40% Of USHCN Station Data Is Fabricated. Here is Dr. Judith Curry’s synopsis (excerpted), in which she critiques Goddard’s post a bit… but then bows to the evidence:
OK, acknowledging that Goddard made some analysis errors, I am still left with some uneasiness about the actual data, and why it keeps changing. For example, Jennifer Marohasy has been writing about Corrupting Australia’s temperature record.
In the midst of preparing this blog post, I received an email from Anthony Watts, suggesting that I hold off on my post since there is some breaking news. Watts pointed me to a post by Paul Homewood entitled Massive Temperature Adjustments At Luling, Texas. Excerpt:
So, I thought it might be worth looking in more detail at a few stations, to see what is going on. In Steve’s post, mentioned above, he links to the USHCN Final dataset for monthly temperatures, making the point that approx 40% of these monthly readings are “estimated”, as there is no raw data.
From this dataset, I picked the one at the top of the list, (which appears to be totally random), Station number 415429, which is Luling, Texas.
Taking last year as an example, we can see that ten of the twelve months are tagged as “E”, i.e. estimated. It is understandable that a station might be a month, or even two, late in reporting, but it is not conceivable that readings from last year are late. (The other two months, Jan/Feb, are marked “a”, indicating missing days).
But, the mystery thickens. Each state produces a monthly and annual State Climatological Report, which among other things includes a list of monthly mean temperatures by station. If we look at the 2013 annual report for Texas, we can see these monthly temperatures for Luling.
Where an “M” appears after the temperature, this indicates some days are missing, i.e. Jan, Feb, Oct and Nov. (Detailed daily data shows just one missing day’s minimum temperature for each of these months).
Yet, according to the USHCN dataset, all ten months from March to December are “Estimated”. Why, when there is full data available?
But it gets worse. The table below compares the actual station data with what USHCN describe as “the bias-adjusted temperature”. The results are shocking.
In other words, the adjustments have added an astonishing 1.35C to the annual temperature for 2013. Note also that I have included the same figures for 1934, which show that the adjustment has reduced temperatures that year by 0.91C. So, the net effect of the adjustments between 1934 and 2013 has been to add 2.26C of warming.
Note as well, that the largest adjustments are for the estimated months of March – December. This is something that Steve Goddard has been emphasising.
It is plain that the adjustments made are not justifiable in any way. It is also clear that the number of “Estimated” measurements is not justified either, as the real data is there, present and correct.
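Homewood’s net-warming figure follows directly from the two adjustment numbers he reports. A quick check, using only the figures quoted above:

```python
# Adjustment figures quoted from Homewood's Luling, Texas post.
adj_2013 = +1.35   # °C added to Luling's 2013 annual temperature
adj_1934 = -0.91   # °C removed from Luling's 1934 annual temperature

# Net warming introduced between 1934 and 2013 by the adjustments alone.
net_warming_added = adj_2013 - adj_1934
print(f"net adjustment effect: {net_warming_added:.2f} °C")  # 2.26 °C
```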
Watts appears in the comments, stating that he has contacted John Nielsen-Gammon (Texas State Climatologist) about this issue. Nick Stokes also appears in the comments, and one commenter finds a similar problem for another Texas station.
Homewood’s post sheds light on Goddard’s original claim regarding the data drop out (not just stations that are no longer reporting, but reporting stations that are ‘estimated’). I infer from this that there seems to be a real problem with the USHCN data set, or at least with some of the stations. Maybe it is a tempest in a teacup, but it looks like something that requires NOAA’s attention. As far as I can tell, NOAA has not responded to Goddard’s allegations. Now, with Homewood’s explanation/clarification, NOAA really needs to respond….
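The kind of check Homewood describes, tallying “E” (estimated) flags in a station’s monthly record, can be sketched simply. The flag values below are illustrative, mirroring the Luling example (ten “E” months, two “a” months), not real USHCN file contents:

```python
# Illustrative monthly flags for one station-year, in the spirit of the
# Luling example above: "E" = estimated, "a" = days missing.
flags_2013 = ["a", "a", "E", "E", "E", "E", "E", "E", "E", "E", "E", "E"]

estimated = sum(1 for f in flags_2013 if f == "E")
print(f"{estimated} of 12 months estimated "
      f"({100 * estimated / 12:.0f}%)")  # 10 of 12 months estimated (83%)
```

Run across every station-year in the dataset, this is the tally behind Goddard’s “approx 40% estimated” claim.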
So we see in the above that temperatures can be changed years later as the totality of the data is included. What was considered the hottest month falls to merely average in the heat index.
And this has — within the past few months — turned into a very large debate.
EQUIPMENT FAIL II
Here is another example of older/faulty equipment:
A Quick Note about the Difference between RSS and UAH TLT data
There is a noticeable difference between the RSS and UAH lower troposphere temperature anomaly data. Dr. Roy Spencer discussed this in his July 2011 blog post On the Divergence Between the UAH and RSS Global Temperature Records. In summary, John Christy and Roy Spencer believe the divergence is caused by the use of data from different satellites. UAH has used the NASA Aqua AMSU satellite in recent years, while, as Dr. Spencer writes:
…RSS is still using the old NOAA-15 satellite which has a decaying orbit, to which they are then applying a diurnal cycle drift correction based upon a climate model, which does not quite match reality.
While the two lower troposphere temperature datasets are different in recent years, UAH believes their data are correct, and, likewise, RSS believes their TLT data are correct. Does the UAH data have a warming bias in recent years or does the RSS data have cooling bias? Until the two suppliers can account for and agree on the differences, both are available for presentation.
The bottom line is that those wishing to expand regulations, laws, and taxes on the rest of us fudge the facts. But facts are facts. And the ship is sinking for these lefties.