The FAA and the Ideals of Science


Today, it's not unusual to see researchers publishing seemingly important findings in journals, accompanied by a global news release at the time the article appears.  At this point, such research has perhaps been reviewed prior to journal publication by only a few individuals.

However, it has become fairly common for researchers associated with globally impactful findings to withhold the methodologies that led to them.  The natural result, particularly in the climate change domain, is a firestorm, since critics in that research domain are SURE that there are misdeeds or errors due to "confirmation bias."

A case in point is the important "Hockey Stick" paper by Mann et al. 1998 (Nature), one that was to have tremendous influence before outsiders could check exactly how it came about.  This paper showed a sharp rise in global temperatures during the past 30-40 years, commensurate with the thought that rising CO2 concentrations were already having a noticeable effect on global temperatures.  Eventually, a number of errors were found in this paper, and the Hockey Stick, as presented, was thrown into doubt.  It should be kept in mind, however, that just because the original paper was flawed does not mean there will be no such rise; this writer tends to agree with the proposition that global temperatures will gradually rise in the future.

This long, tortured chapter involving the "Hockey Stick" should not have happened.  It was clearly due to the original researchers believing that their results were too important for others to learn how they got them.  Sadly, in this writer's opinion, it is the position of the National Science Foundation as well that researchers can hide their methodologies and the exact data they used under a "proprietary" umbrella.  A scientific horror story concerning the Hockey Stick has been laid out in detail by A. W. Montford in "The Hockey Stick Illusion", a book I highly recommend.

Withholding methodologies and data suggests that something is wrong with the outcome of the research, and, furthermore, is anti-science.  Imagine: a lab announces that it has cured cancer but can't tell us exactly how it did so, and so no one can replicate its results!  In the domain of medicine, this would be a ludicrous, surreal example; it wouldn't happen.  It should not happen in the important climate change domain, either.

On the other hand, numerous scientists share the view that opening the door to skeptics of your work can lessen conflict in research domains and, even more likely, improve the robustness of the original work.

Who among us, as science workers, is so arrogant as to think our work cannot be improved upon?

While we depend on peer review to catch errors, it has been this writer's experience that hundreds of pages of peer-reviewed literature in the domain of cloud seeding research can reach the journals and stand untouched, uncritiqued, for years at a time.  This is because peer review in contentious environments can easily fail through soft reviews by advocates of the conclusions being reached in a manuscript.  No scientist reading this doubts it.

The Federal Aviation Administration is fully aware of the hazards of "soft reviews."  The attached statement at left, concerning work on the writer's former research aircraft at the University of Washington, might well serve as a metaphor for our science environment: "….there will be a paper trail."  "…there will be an inspection by someone other than the person doing the work."

We all know that these kinds of rules established by the FAA are meant to protect us from plane crashes.  But imagine that there were no "paper trail," no documentation of what's on and what's off the aircraft!  That's how we get journal "plane crashes."

It is the same with journal articles on scientific results.  Documenting how results were arrived at is mandatory for purposes of replication, the first and most basic step of which is to use exactly the methodologies and data that the original researcher(s) claimed they used and see whether you get the same result.