Uncommon Descent Serving The Intelligent Design Community

“Phraud” that Dwarfs Climategate?


Yes, according to Anthony Watts.

Comments
Querius: The people involved admitted that they fudged the data.
They did?
Querius: They also refused to disclose their methods, taking offense that anyone should want to check up on them.
They did? Why don't we look at the original publication to make sure this is an accurate description of what went on? So, what is the exact publication in question? And where did they admit they fudged the data? And how did they refuse to disclose the methods? hrun0815
Querius: The people involved admitted that they fudged the data. The historical data has discontinuities. While the original studies spliced the data manually, today, modern statistical techniques are used to extract the trends from the raw data. Querius: They also refused to disclose their methods, taking offense that anyone should want to check up on them. The methods were published in peer reviewed journals. More important, the raw data has always been available to researchers who were willing to take the trouble to aggregate it. Today, the raw data is easily available. Querius: The scientific method is definitely NOT force-fitting data into curves and then dreaming up semi-plausible explanations for why the data was altered. Multiple independent studies have largely confirmed the original findings. Zachriel
No, Zachriel. That's not what I'm saying. The people involved admitted that they fudged the data. They also refused to disclose their methods, taking offense that anyone should want to check up on them. Thus, their method is by definition flawed, unscientific, and indefensible, and their "results" are as well. The scientific method is definitely NOT force-fitting data into curves and then dreaming up semi-plausible explanations for why the data was altered. -Q Querius
Querius: Again, I’m not familiar with the methods used in meteorology to try to correct data collection errors, but they have to be clearly justified in each case and the algorithms that were used freely available... Sorry, I’m not buying into their results, whatever they might be. You seem to be saying that rather than consider the findings or methods, you'll reject the findings in any case. Zachriel
Querius: Again, I’m not familiar with the methods used in meteorology to try to correct data collection errors, but they have to be clearly justified in each case and the algorithms that were used freely available.
Correct. And that's why we should look at the original paper and see if that's what they did.
Querius: However, I do have a problem with passing off interpolated values if they were raw weather station data, if that’s what they did.
Correct. And that's why we should look at the original paper and see if that's what they did.
Querius: For example, you can’t just draw a line between two weather stations and average the results in between (linear interpolation). You have to incorporate the rates of change (first and second derivatives). If multiple stations are involved, perhaps you would interpolate using RMS calculations. Again, I don’t know what the meteorologists used, but they shouldn’t be able to adjust raw data without scrupulous justification at every station.
Um, yeah. So let's take a look already.
Querius: I took only one upper division meteorology class in college and in the lab we did interpolate the data *between* weather stations to create isobaric and weather front surface maps. But we wouldn’t have dared to fill in any missing weather station data or change any data that were reported! And you can’t just discard data, unless you can justify it statistically.
Yes, yes, yes. Let's look already.
Querius: If it’s true that in the past some stations confused Fahrenheit and centigrade data (which I find hard to believe), and allowed air conditioners to be installed next to weather data collection stations (which can be seen in some of the photographs), it gives me even less confidence in the competence of the people involved and the raw data that they recorded. Just imagine having a heat source that close, and your “homogenization” adjustments dependent on the direction and velocity of the breeze and wind eddies around the building.
So then we should look maybe how many of the weather stations we are talking about. One out of a thousand? Nine hundred out of a thousand?
Querius: Sorry, I’m not buying into their results, whatever they might be.
Aehhhhhh, so you didn't look at the paper, you have no idea what they actually did or did not do, you don't know how they justified the process, how many data points were involved, but... you don't buy their results. Why don't we just look at the paper and actually see what they did and if it is actually problematic. hrun0815
Again, I'm not familiar with the methods used in meteorology to try to correct data collection errors, but they have to be clearly justified in each case and the algorithms that were used freely available. However, I do have a problem with passing off interpolated values if they were raw weather station data, if that's what they did. For example, you can't just draw a line between two weather stations and average the results in between (linear interpolation). You have to incorporate the rates of change (first and second derivatives). If multiple stations are involved, perhaps you would interpolate using RMS calculations. Again, I don't know what the meteorologists used, but they shouldn't be able to adjust raw data without scrupulous justification at every station. I took only one upper division meteorology class in college and in the lab we did interpolate the data *between* weather stations to create isobaric and weather front surface maps. But we wouldn't have dared to fill in any missing weather station data or change any data that were reported! And you can't just discard data, unless you can justify it statistically. If it's true that in the past some stations confused Fahrenheit and centigrade data (which I find hard to believe), and allowed air conditioners to be installed next to weather data collection stations (which can be seen in some of the photographs), it gives me even less confidence in the competence of the people involved and the raw data that they recorded. Just imagine having a heat source that close, and your "homogenization" adjustments dependent on the direction and velocity of the breeze and wind eddies around the building. Sorry, I'm not buying into their results, whatever they might be. -Q Querius
hrun0815: is that paper actually the original source for the discrepancy between raw and processed data in the link picked by Querius? The article about Australia cited by Querius seems more concerned with the homogenization algorithm than any misreporting of the raw data. http://joannenova.com.au/2014/08/the-heat-is-on-bureau-of-meteorology-altering-climate-figures-the-australian/ Zachriel
Zachriel, is that paper actually the original source for the discrepancy between raw and processed data in the link picked by Querius? Querius, do you know? I think it is pretty clear that ONLY if we compare the actual scientific publication of the processed data with the raw data can we actually make a decision in which of the listed categories this should fall (outright fraud, misleading, sloppy science, ...). Can you confirm that this is indeed the paper in question? hrun0815
Querius: Why don’t we start with the last link, the one from Australia? Trewin, A daily homogenized temperature data set for Australia, International Journal of Climatology 2013:
A new homogenized daily maximum and minimum temperature data set, the Australian Climate Observations Reference Network—Surface Air Temperature data set, has been developed for Australia. This data set contains data from 112 locations across Australia, and extends from 1910 to the present, with 60 locations having data for the full post-1910 period. These data have been comprehensively analysed for inhomogeneities and data errors ensuring a set of station temperature data which are suitable for the analysis of climate variability and trends. For the purposes of merging station series and correcting inhomogeneities, the data set has been developed using a technique, the percentile-matching (PM) algorithm, which applies differing adjustments to daily data depending on their position in the frequency distribution. This method is intended to produce data sets that are homogeneous for higher-order statistical properties, such as variance and the frequency of extremes, as well as for mean values. The PM algorithm is evaluated and found to have clear advantages over adjustments based on monthly means, particularly in the homogenization of temperature extremes.
Zachriel
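The percentile-matching idea described in that abstract can be illustrated with a toy quantile-mapping sketch. To be clear, this is not Trewin's actual PM algorithm, and every number below is invented; it only shows why a rank-dependent adjustment differs from applying a single mean offset:

```python
# Toy percentile-matching (quantile-mapping) sketch. The real PM algorithm
# in Trewin (2013) is considerably more involved; this only shows the idea:
# adjust each observation according to where it sits in the frequency
# distribution, rather than shifting everything by one mean offset.
reference = [10.0, 12.0, 15.0, 19.0, 25.0]   # overlapping "good" record (hypothetical)
candidate = [11.0, 12.5, 15.0, 18.0, 22.0]   # series to be homogenized (hypothetical)

# Per-rank adjustment: difference between the two sorted series at each rank.
ref_sorted = sorted(reference)
cand_sorted = sorted(candidate)
adjust = {v: r - v for v, r in zip(cand_sorted, ref_sorted)}

# Extremes get different adjustments than mid-range values, which is the
# advantage over monthly-mean adjustments noted in the abstract.
homogenized = [v + adjust[v] for v in candidate]
print(homogenized)  # → [10.0, 12.0, 15.0, 19.0, 25.0]
```

In this toy case with a full overlap, the candidate maps exactly onto the reference; in practice the adjustments are estimated from an overlap period and then applied to the rest of the record.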
Querius: The specific processing step(s) performed were apparently never revealed, so my contention is that because they cannot be independently validated, the results are not scientifically acceptable. That is simply incorrect. Homogenization is an important issue in climate science, and has been subject to intensive study. The problem is that older data was collected for local weather forecasting, and is of uneven quality. The protocols changed, personnel changed, the stations were moved, there was urban development around stations, the languages and cultures and government systems were different, the handwriting is often difficult to read, instrumentation changed without notice, some institutions won't release their data, some records are damaged or missing, the guy was sick that day, etc. The first modern studies were done by manually splicing data together where there were obvious discontinuities. Since then, statisticians have made the process rigorous. This has all been done in the open, in peer reviewed journals. The edges of science are often determined by very incomplete and tentative data. You will find scientists in many fields of study extracting usable information from very tenuous data. Zachriel
Querius, could you help me out and directly cite the original publication where the data processing is not or only inadequately described? hrun0815
Querius @ 50
Also, interpolated values are not measured data, since they depend on the algorithm used. Again, I’m not accusing them of fraud, just a very poor methodology.
If you have limited data and want finer measurement within any two points, or plot a curve, or integrate and find the area under a curve, say, to measure the total temperature difference within a certain period, you have to interpolate. Interpolation is standard practice not just in science, but engineering too. Interpolation is also a cool technique to 'zoom' into a particular curve to see finer trends. Me_Think
There's a profound difference between transferring a small amount of heat from the outdoors into a thermometer and observing the resulting change, and performing a mathematical operation on temperature data. The specific processing step(s) performed were apparently never revealed, so my contention is that because they cannot be independently validated, the results are not scientifically acceptable. Also, interpolated values are not measured data, since they depend on the algorithm used. Again, I'm not accusing them of fraud, just a very poor methodology. -Q Querius
Querius: Thinking back, if I “homogenized” my results in a quantitative analysis or a physics lab, I’d have probably failed the class, and possibly faced disciplinary action.
I'm pretty sure if what you did was defined and reported it would have been fine-- no matter whether the procedure was called reconstruction, homogenization, processing, normalizing, correcting, or some other term.
Querius: Depending on the experiment, I sometimes had to account for heat capacity, buoyancy, tare, or other significant source of error. But I don’t agree with you that all of science is as slipshod as indicated, nor do I feel obligated to defend this travesty.
Yep, depending on the instrument or experimental setup you have to somehow process the raw data to arrive at some form of useful output. And in virtually all cases you are not measuring the actual piece of information you are interested in, but rather some form of proxy that is somehow converted into what you actually want to know. Which means that as far as I can tell virtually all data is both derived and processed. The key question is: is this processing step described in the original publication of the data or not? If not, it's sloppy science; if it is, then you can criticize the process, but rather than showing general outrage you should probably explain why this particular processing step should not have been performed. hrun0815
Zachriel @ 39,
That is incorrect. Data and methods are published in peer reviewed journals. Automated statistical methods have replaced manual splicing, and include regression models, percentile-matching and kriging.
And fudging. -Q Querius
Me_Think,
Me_Think: What else did you think homogenization is? There are various methods too, as indicated in section 3.4
It sounded like blending a mixture, so I didn't know that this is what used to be called "fudging the results." If you had read the article, you would know that the people involved declined to reveal what corrections were applied. I'm not accusing them of fraud, but as far as scientific ethics goes, this is a fail. -Q Querius
hrun0815, Thinking back, if I "homogenized" my results in a quantitative analysis or a physics lab, I'd have probably failed the class, and possibly faced disciplinary action.
hrun0815: It turns out that virtually all data are derived estimates.
Depending on the experiment, I sometimes had to account for heat capacity, buoyancy, tare, or other significant source of error. But I don't agree with you that all of science is as slipshod as indicated, nor do I feel obligated to defend this travesty. Me_Think answered my question regarding "homogenization" of historical weather data. I believe you'll find most of the answers to your questions in the referenced article. -Q Querius
Querius: I read the complete paper. “Homogenization” apparently is a set of methods of trying to estimate and correct for the error in historical data.
What else did you think homogenization is? There are various methods too, as indicated in section 3.4 Me_Think
Querius: “Homogenization” apparently is a set of methods of trying to estimate and correct for the error in historical data. I won’t fault meteorologists for trying to reconstruct the data, but the result is a derived estimate. One wouldn’t even call it corrected, since not all the factors are known. I’m speechless.
Yes, sure. Data are derived estimates. It turns out that virtually all data are derived estimates. I just don't understand why you are speechless. The question still remains: In your example were the processing steps explained at the time the data was published or was it claimed that the data was unprocessed? Did anything nefarious go on? Was something hidden? Were mistakes made? Should they have used a different processing procedure? None at all? hrun0815
Thank you, Me_Think @ 41. I read the complete paper, which started out extremely basic:
If we measure rainfall, in order for the data to be useful for future users, we also need to document where and how the measurements were made. Station documentation is information about the data or data about the data: metadata. The word metadata is made up by the addition of the Greek “meta” (beyond) and the Latin “datum” (a given fact). Metadata should reflect how, where, when and by whom information was collected. Ideally, a complete metadata should register all the changes a station has undergone during its lifetime, composing what is called the station history.
Ya think? It ended pretty basic, too. Examples they cited of the problems that resulted from not keeping station histories included a period of time when one station apparently switched between Fahrenheit and centigrade without using a conversion formula! Are you kidding me??? "Homogenization" apparently is a set of methods of trying to estimate and correct for the error in historical data. I won't fault meteorologists for trying to reconstruct the data, but the result is a derived estimate. One wouldn't even call it corrected, since not all the factors are known. I'm speechless. -Q Querius
Hmmm, yeah, so this whole thing predictably fizzled out. All investigations found no fraud occurring at the so-called climategate. There was no 'phraud' or any impropriety in the paper on pH measurements. No data was hidden. There were a whole bunch of suggestions of massaging data, but so far we couldn't drill down into a single example to actually show that anything improper was done. And even though there is this great pause going on, the past decade saw the four hottest years of the instrumental temperature record. Maybe people would take this and rethink their position on human-caused climate change... :) Nah. Of course not. There's just another fraud or scandal or evil conspiracy plot waiting just around the corner. hrun0815
Querius @ 38, If you want to learn more - There are guidelines for homogenization given by World Meteorological Organization : Guidelines Me_Think
Querius, so we can dispense with slanted looks at this lets identify the sources of the raw data and the published (or altered) data. From your description it sounds to me that 'homogenized' refers to a form of data processing. The question is if this process is described in the paper where this 'homogenized' data was published? Depending on how this question is answered I guess this example could fall into categories 1, 2, or 4. hrun0815
Querius: The raw historical data was altered by a process called “homogenized” without any information as to what this process entailed. That is incorrect. Data and methods are published in peer reviewed journals. Automated statistical methods have replaced manual splicing, and include regression models, percentile-matching and kriging. Zachriel
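Of the methods Zachriel lists, kriging is the most involved, since it requires fitting a variogram to model spatial correlation. As a much simpler stand-in, here is an inverse-distance-weighting sketch showing the general shape of estimating a value from neighbouring stations; all coordinates and temperatures are hypothetical:

```python
# Estimating a value at an unmeasured location from neighbouring stations.
# Kriging derives optimal weights from a fitted variogram; this
# inverse-distance-weighting (IDW) sketch is a far simpler stand-in that
# shows the general shape. Coordinates and temperatures are hypothetical.
stations = [((0.0, 0.0), 20.0), ((10.0, 0.0), 22.0), ((0.0, 10.0), 18.0)]

def idw(x, y, power=2.0):
    """Weighted mean of neighbours, weight = 1 / distance**power."""
    num = den = 0.0
    for (sx, sy), temp in stations:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0.0:
            return temp           # exactly at a station: return its value
        w = d2 ** (-power / 2.0)
        num += w * temp
        den += w
    return num / den

print(round(idw(5.0, 5.0), 3))  # estimate between the three stations
```

Like any such scheme, the output is a derived estimate whose value depends on the chosen weighting, which is exactly why the method has to be stated alongside the data.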
Why don't we start with the last link, the one from Australia? I don't think this one fits into one of your categories. The raw historical data was altered by a process called "homogenized" without any information as to what this process entailed. I think the post processing needs to be clearly documented, and the resulting data distinguished from raw data. What do you think about it? -Q Querius
So into which of the categories I listed do these examples fall, Q? Where is the officially published data and where did the raw data come from? Maybe you can start with one example and we could start chasing the issue down. hrun0815
Sorry for spamming, but the issue of massaging historical data continues. Foolishly, I thought it had been done only once. http://www.principia-scientific.org/breaking-new-climate-data-rigging-scandal-rocks-us-government.html And now also in Australia . . . http://joannenova.com.au/2014/08/the-heat-is-on-bureau-of-meteorology-altering-climate-figures-the-australian/ Gosh, the past is changing faster than the present! Orwell was right! -Q Querius
And this one . . . http://wattsupwiththat.com/2012/09/26/nasa-giss-caught-changing-past-data-again-violates-data-quality-act/ This isn't right. -Q Querius
I'm still looking for the original article, but I found these two links that are interesting: http://stevengoddard.wordpress.com/2014/06/23/noaanasa-dramatically-altered-us-temperatures-after-the-year-2000/ http://wattsupwiththat.com/2014/06/28/the-scientific-method-is-at-work-on-the-ushcn-temperature-data-set/ I loved the photo of the MMTS sensor in the parking lot! LOL! ;-) -Q Querius
hrun0815, IIRC, the author plotted the deviations between the original and the current historical data. From the single-cycle sinusoidal shape of these deviations, I think it would be reasonable to conclude that a methodological correction was applied to the historical raw data to create new "raw" historical data. The author's issue was that the correction was not disclosed along with the revised data. If true, this would suggest that the revised historical data is not raw but derived. I'll look for the article. -Q Querius
Would this be considered fraud? Does anyone else remember the article?
I do not remember seeing this article, but here is my guess what happened: The data found on the hard drive is raw data while the online published data is processed. Since in general the processing steps are published as well, I would think that it is transparent how the raw data was altered to obtain the published data--and likely also why. 1) Now, if the fact that the data was processed prior to publication was not clearly stated, this was definitely sloppy science or an attempt to deceive. 2) If it was claimed the data was indeed unaltered but it was in fact altered, I would consider it fraud. 3) If there is a minor discrepancy between the data on the hard drive and the published data that cannot be accounted for by processing, then this might be either an error or an attempt to deceive. The ORI would need to investigate. 4) If the raw data was processed and then published along with the information on how the data was processed, then the writer of the article you are referring to was either ignorant or willfully attempting to deceive. Once you find the article I'm curious which one of my scenarios is true. Does anybody want to guess? hrun0815
Many years ago, I ran into an article that compared online historical climate data with the same data that was stored on somebody's hard drive. The author noticed that the online data had been altered. Would this be considered fraud? Does anyone else remember the article? -Q Querius
And then of course there was the Koch-funded Berkeley investigation headed by Richard Muller that was supposed to review comprehensively whether human activity is indeed to blame for global warming. Watts famously proclaimed “I’m prepared to accept whatever result they produce, even if it proves my premise wrong.” In the end the review panel produced multiple studies, several of which directly dealt with claims on Anthony Watts's blog. You can have two guesses how that turned out: http://thinkprogress.org/climate/2011/10/20/349544/berkeley-temperature-study-results-confirm-global-warming/ hrun0815
Collin, would you mind pointing me to the part of the Wikipedia page that says there are "insinuations against the people who broke into the computers to get the data" in the official investigations. I'd love to go back to the original documents of the investigation and check on this. hrun0815
Eight committees investigated the allegations and published reports, finding no evidence of fraud or scientific misconduct. The scientific consensus that global warming is occurring as a result of human activity remained unchanged by the end of the investigations. http://en.wikipedia.org/wiki/Climatic_Research_Unit_email_controversy#Inquiries_and_reports
Zachriel
Not according to wikipedia. You should correct it. Collin
Collin, I'm pretty sure that in all of the investigations there is hardly anything about the illegal activities used to obtain the emails. They address the scientists actions. hrun0815
hrun at 23, the wikipedia page is clear that the scientists were at least not being forthcoming and failed to comply with information requests. Collin
hrun, I wish I had the time to engage fully. So I will have to formally concede the argument, by default. I will say that I read the wikipedia article about it and came across nothing substantive. Only insinuations against the people who broke into the computers to get the data. It felt like a "brush aside" rather than a vindication. But I haven't read the reports themselves, so you win. Collin
Tjguy, are you serious with your post? This is such a simple matter and you are wondering who is right? And who do you think is hiding any data? How in the world do you think a scientist is able to hide old easily accessible data??? I don't even understand the charge. hrun0815
Hiding data does nothing to build trust. When will these guys learn? It only gives the appearance of them being anti-science and having a particular agenda. Scientists are not immune to bias. We are all humans and all have biases and these affect our interpretation and sometimes presentation of the data - unfortunately. Just goes to show that the idea that scientists are somehow immune to bias is a foolish old wives' tale. Data is trustworthy as long as it has not been fudged, but when it comes to the interpretation and presentation of it, that is where we need to realize that our biases can play a large role, even for scientists. Mark Frank points us to a blog that supposedly destroys and belittles this article. I don't know, but I do find it interesting that the Materialists are mostly pro-climate change and probably a majority of IDers tend to question the whole story a bit. So when an article like this comes out, certain people tend to jump on the bandwagon and receive the news positively, and others tend to question it and try and poke holes in it. Who takes what side is rather predictable. Mark, are you in agreement with what the blog says that you pointed us to? Is this guy trustworthy and knowledgeable? Why do you believe him as opposed to the writer who documented his story so well? Is it just your personal biases or have you really researched it? If the story was about an IDer who was withholding data, imagine the uproar we would hear from the Materialists? As I said, I don't know enough to be able to really say which side I believe is right, but I find the data suppression quite troubling and it raises red flags in my mind. Show us the data so we can all see. I think even you would agree with that, right Mark? tjguy
Aurelio:
I asked about evidence for a gigantic fraud.
I was under the impression you were responding to what I posted: The entire “human emissions of CO2 is driving climate change” is a fraud. Your comment seemed as if it was addressed to me and what I said. Also, warm waters release CO2 and that is why we see an increase in atmospheric CO2 follow the warming. Joe
Hold on, the "skeptic" made a "global average pH" by taking the mean of many records spread across the globe, in different seasons and at different depths? Can you imagine what the so-called skeptics would say if a climate scientist tried to make a global average temperature in the same slipshod way? wd400
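wd400's point about averaging unevenly distributed records can be made concrete with a toy sketch (all numbers below are invented): a naive mean of the raw records is pulled toward whichever region happens to have more measurements, while an area-weighted mean is not:

```python
# Why a raw mean of unevenly distributed records is not a "global average".
# Hypothetical data: 4 records from one region, 1 from another, even though
# the two regions cover equal areas of the globe.
records = [(8.2, "warm"), (8.2, "warm"), (8.2, "warm"), (8.2, "warm"),
           (8.0, "cold")]
area = {"warm": 0.5, "cold": 0.5}   # equal-area regions (hypothetical)

# Naive mean: every record counts equally, regardless of where it came from.
naive = sum(v for v, _ in records) / len(records)

# Area-weighted mean: average within each region first, then weight by area.
by_region = {}
for v, r in records:
    by_region.setdefault(r, []).append(v)
weighted = sum(area[r] * sum(vs) / len(vs) for r, vs in by_region.items())

# The naive mean is biased toward the oversampled region.
print(naive, weighted)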
Collin, I am not sure what you want me to explain that goes beyond the official results of the investigations. One summary of a large chunk of the results can be found here: http://www.skepticalscience.com/Climategate-CRU-emails-hacked.htm I believe that for all investigations there is a link to the full PDF document of the investigation results together with the information of who led the investigation. The investigating entities are universities and government agencies both in the US and in the UK. Now, you can, of course, still believe that there is a giant cover-up. But you do have to now make the case that this is not just about a few climate researchers with an agenda, but whole universities and government agencies in at least two countries. In addition, there is nothing secretive going on. These are all officials that are publishing their investigation results-- openly and for the record. So, who to believe: Some bloggers that received hacked emails and based on a few snippets jumped to conclusions or the official investigations. And if you come down on the side of the bloggers, I really would like to know why. hrun0815
Ok, hrun, your sarcasm has piqued my interest. Why don't you educate me about climategate and why those 10 investigations felt that it was appropriate for researchers to "hide the decline" in temperatures. Collin
LoL! @ Aurelio- You don't seem to be able to follow along- sad but typical. YOU asked about the fraud I mentioned and I answered. Please TRY to keep up Joe
@News:
I sometimes wonder whether naturalist science is approximately where the Catholic Church was before the Reformation and the Council of Trent. That is, so many grand claims, such poor performance. Thoughts?
Yes. What is "naturalist science"?! JWTruthInLove
You guys might want to look at some of the responses to this, e.g. http://blog.hotwhopper.com/2014/12/where-has-all-co2-gone-wuwt-fails.html Mark Frank
Sorry if I'm doubtful that there are 100 years of pH measurements that the Darwinist-atheist-general bad guy cabal is hiding. 1) A proper pH meter dates to about 1934: Beckman's vacuum tube amplifier and electrodes. Electrodes without saline error (and the oceans are salty!) come later. NIST standards? Even later. Temperature corrections? 2) The dissociation constant of carbonic acid depends on pressure (think soda--and test it if you want). So unless scientists have been using the exact same protocol since 1914 (testing pH at depth with a wood box filled with vacuum tubes?)... REC
Collin, don't sell the scope of the scandal short. It was not a single investigation by a single institution. I believe it was nearly ten! And each one as big a scandal as climategate-- or even bigger!!! And now this. Yet an even bigger scandal. And if this new scandal gets investigated there will be an even bigger one still. I have a feeling it'll just keep on growing and growing. hrun0815
Ah, hrun, I think that the investigation after climategate was the second scandal. Collin
Yes Aurelio, lack of credible evidence that human contributions of CO2 are affecting the climate. Joe
Yup, Aurelio. Same one. hrun0815
The entire "human emissions of CO2 is driving climate change", is a fraud Joe
“Phraud” that Dwarfs Climategate?
Considering what came out of the official investigations into all of 'climategate' or specific aspects thereof... that doesn't seem very hard. hrun0815
I sometimes wonder whether naturalist science is approximately where the Catholic Church was before the Reformation and the Council of Trent. That is, so many grand claims, such poor performance. Thoughts? News
